WorldWideScience

Sample records for dynamic signature verification

  1. FIR signature verification system characterizing dynamics of handwriting features

    Science.gov (United States)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on a finite impulse response (FIR) system characterizing the time-frequency characteristics of dynamic handwriting features. First, the barycenter, determined from both the center point of the signature and two adjacent pen-point positions in the signing process instead of a single pen-point position, is used to reduce the fluctuation of handwriting motion. Among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. The stable dynamic handwriting features can thus be described by the relation between the time-frequency characteristics of the dynamic handwriting features. In this study, that relation is represented by an FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system is used as the individual feature for a particular signature. In short, a signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database, consisting of 5,000 signatures from 100 signers. The proposed method yielded an equal error rate (EER) of 3.21% on skilled forgeries.
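
    The core verification step, estimating an FIR impulse response from paired input/output sequences and comparing responses by their distance, can be sketched as below. This is an illustrative least-squares identification on synthetic data, not the paper's wavelet-domain implementation; the tap count, test signals, and Euclidean distance measure are assumptions.

```python
import numpy as np

def fir_response(x, y, taps=8):
    """Estimate an FIR impulse response h (length `taps`) such that
    y[n] ~ sum_k h[k] * x[n-k], via least squares."""
    n = len(x)
    X = np.zeros((n, taps))
    for k in range(taps):
        X[k:, k] = x[:n - k]      # column k holds the input delayed by k
    h, *_ = np.linalg.lstsq(X, y, rcond=None)
    return h

def dissimilarity(h_ref, h_test):
    """Euclidean distance between two impulse responses."""
    return float(np.linalg.norm(h_ref - h_test))

# Toy demo: recover a known FIR system from input/output data.
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
h_true = np.array([0.5, -0.3, 0.2, 0.1, 0.0, 0.0, 0.0, 0.0])
y = np.convolve(x, h_true)[:200]       # exact FIR output
h_est = fir_response(x, y)
# h_est is numerically indistinguishable from h_true, so a genuine
# "reference vs. probe" comparison yields a near-zero distance.
```

A real verifier would threshold this distance to accept or reject the probe signature.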

  2. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on the pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  3. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping for matching. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures for reference and 5 signatures each from the original user, simple impostors, and trained impostors as test signatures. The final system was tested with 50 participants and 3 references. Without impostors, system accuracy is 90.44897959% at threshold 44, with a rejection error (FNMR) of 5.2% and an acceptance error (FMR) of 4.35102%; with impostors, system accuracy is 80.1361% at threshold 27, with a rejection error (FNMR) of 15.6% and an average acceptance error (FMR) of 4.263946%, with details as follows: an acceptance error of 0.391837%, an acceptance error for simple impostors of 3.2%, and an acceptance error for trained impostors of 9.2%.
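
    The dynamic time warping (DTW) matching used here can be implemented in a few lines. A minimal sketch over 1-D sequences; a real system would warp multidimensional pen trajectories and normalize the resulting distance:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

reference = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0]
candidate = [0.0, 1.0, 1.0, 2.0, 3.0, 2.0, 1.0]   # same shape, stretched
print(dtw_distance(reference, candidate))  # 0.0: warping absorbs the stretch
print(dtw_distance(reference, [3.0] * 6))  # 9.0: a dissimilar sequence
```

The accept/reject decision then reduces to comparing this distance against a threshold, as in the thresholds reported above.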

  4. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    The signature of a person is one of the most popular and legally accepted behavioral biometrics, providing a secure means of verification and personal identification in many applications such as financial, commercial, and legal transactions. The objective of a signature verification system is to classify between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features; it contains diacritics, ligatures, and overlapping. Because offline signatures lack any form of dynamic information from the writing process, achieving high verification accuracy is more difficult. This paper addresses this difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple-feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve verification accuracy. State-of-the-art techniques adopt fuzzy sets to describe the properties of the extracted features to handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features, to deal with the ambiguity of a questioned document examiner's judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).
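
    The idea of describing feature similarity with fuzzy variables can be illustrated with a toy membership function. This is a generic sketch, assuming triangular memberships over feature differences and a min (fuzzy AND) aggregation; the paper's actual features and fuzzy model are not reproduced here.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def similarity_degree(ref_features, test_features, tol=0.3):
    """Fuzzy degree to which each test feature is 'similar' to the
    reference; the overall score is the minimum (fuzzy AND)."""
    degrees = [triangular(t - r, -tol, 0.0, tol)
               for r, t in zip(ref_features, test_features)]
    return min(degrees)

ref  = [0.52, 0.81, 0.33]   # e.g. normalized aspect ratio, density, slant
good = [0.50, 0.80, 0.35]   # small deviations: high similarity degree
bad  = [0.10, 0.80, 0.35]   # one feature far off: degree drops to zero
print(similarity_degree(ref, good))
print(similarity_degree(ref, bad))
```

Fusing several such degrees, rather than hard-thresholding each feature, is what lets the system tolerate intrapersonal variability while still rejecting forgeries.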

  5. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    With the development of wireless technology, much data communication and processing is conducted on mobile devices with wireless connections. Mobile devices will always be resource-poor relative to static ones, even as they improve in absolute ability, and therefore cannot process some expensive computational tasks with their constrained computational resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices can outsource expensive computation to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair to be verified, which is undesirable, especially when the message includes secret information. In this paper, we study server-aided verification signatures with privacy, in which the message-signature pair to be verified is protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed, both of which are proved secure.

  6. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    The broad use of handwritten signatures for personal verification in financial institutions creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud across all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which to judge a signature verification tool in banking and other financial institutions.
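
    Once SIFT/SURF descriptors are extracted, comparison reduces to matching the two descriptor sets. A minimal sketch of Lowe-style nearest-neighbor ratio matching on toy 2-D "descriptors" (real SIFT/SURF descriptors are 128- or 64-dimensional and are typically matched with a library such as OpenCV; the points and ratio here are illustrative):

```python
import math

def ratio_match(desc_a, desc_b, ratio=0.75):
    """Lowe-style ratio test: keep a match only when the nearest
    descriptor in desc_b is clearly closer than the second nearest."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = sorted((math.dist(d, e), j) for j, e in enumerate(desc_b))
        (d1, j1), (d2, _) = dists[0], dists[1]
        if d1 < ratio * d2:
            matches.append((i, j1))
    return matches

# The first two reference descriptors have clear counterparts in the
# query; the third is ambiguous and is correctly discarded.
ref   = [(0.0, 0.0), (5.0, 5.0), (2.5, 2.5)]
query = [(0.1, 0.0), (5.0, 4.9), (9.0, 9.0)]
print(ratio_match(ref, query))  # [(0, 0), (1, 1)]
```

The count (or geometric consistency) of such matches between the captured signature and the stored template then drives the accept/reject decision.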

  7. Cubic Bezier Curve Approach for Automated Offline Signature Verification with Intrusion Identification

    Directory of Open Access Journals (Sweden)

    Arun Vijayaragavan

    2014-01-01

    Authentication is a process of identifying a person's rights over a system. Many authentication types are used in various systems, among which biometric authentication systems are of special concern. Signature verification is a widely used basic biometric authentication technique. Signature matching algorithms based on image correlation and graph matching can produce false rejections or acceptances. We propose a model to compare knowledge derived from the signature. Intrusion into the signature repository system can yield a copy of the signature that leads to false acceptance. Our approach uses a Bezier curve algorithm to identify the curve points and uses the behaviors of the signature for verification. An analyzing mobile agent identifies the input signature parameters and compares them with the reference signature repository. It identifies duplication of a signature arising from intrusion and rejects it. Experiments are conducted on a database with thousands of signature images from various sources, and the results are favorable.
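
    Cubic Bezier evaluation, on which the curve-point identification rests, is done with repeated linear interpolation (de Casteljau's algorithm). A minimal sketch; the control points are illustrative, not taken from the paper:

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t via de Casteljau."""
    def lerp(a, b, t):
        return tuple((1 - t) * ai + t * bi for ai, bi in zip(a, b))
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# Control points approximating one signature stroke.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(cubic_bezier(*ctrl, 0.0))  # (0.0, 0.0): the curve starts at p0
print(cubic_bezier(*ctrl, 1.0))  # (4.0, 0.0): and ends at p3
print(cubic_bezier(*ctrl, 0.5))  # (2.0, 1.5): midpoint of this symmetric stroke
```

Fitting such control points to strokes gives a compact, comparable representation of the signature's shape.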

  8. Online Signature Verification on MOBISIG Finger-Drawn Signature Corpus

    Directory of Open Access Journals (Sweden)

    Margit Antal

    2018-01-01

    We present MOBISIG, a pseudosignature dataset containing finger-drawn signatures from 83 users captured with a capacitive touchscreen-based mobile device. The database was captured in three sessions, resulting in 45 genuine signatures and 20 skilled forgeries for each user. The database was evaluated by two state-of-the-art methods: a function-based system using local features and a feature-based system using global features. Two types of equal error rate computation are performed: one using a global threshold and the other using user-specific thresholds. The lowest equal error rate was 0.01% against random forgeries and 5.81% against skilled forgeries, using user-specific thresholds that were computed a posteriori. However, these equal error rates rose significantly to 1.68% (random forgeries) and 14.31% (skilled forgeries) using global thresholds. The same evaluation protocol was performed on the publicly available DooDB dataset. Besides the verification performance evaluations conducted on the two finger-drawn datasets, we evaluated the quality of the samples and the users of the two datasets using basic quality measures. The results show that finger-drawn signatures can be used by biometric systems with reasonable accuracy.
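
    A global-threshold equal error rate of the kind reported above can be computed by sweeping one threshold over all scores; user-specific thresholds simply repeat the sweep per user. A minimal sketch, assuming higher scores mean "more genuine" (the scores are made up for the demo):

```python
def equal_error_rate(genuine, impostor):
    """Sweep a global threshold over all observed scores and return the
    operating point where FAR and FRR are closest."""
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)  # accepted impostors
        frr = sum(s < t for s in genuine) / len(genuine)     # rejected genuines
        if best is None or abs(far - frr) < abs(best[1] - best[2]):
            best = (t, far, frr)
    return best

genuine  = [0.9, 0.85, 0.8, 0.6, 0.95]
impostor = [0.1, 0.3, 0.5, 0.7, 0.2]
t, far, frr = equal_error_rate(genuine, impostor)
print(t, far, frr)  # threshold 0.7, where FAR = FRR = 0.2
```

With real data the FAR/FRR curves cross between observed scores, so the EER is usually reported by interpolation rather than at an exact score value.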

  9. Online Signature Verification using Recurrent Neural Network and Length-normalized Path Signature

    OpenAIRE

    Lai, Songxuan; Jin, Lianwen; Yang, Weixin

    2017-01-01

    Inspired by the great success of recurrent neural networks (RNNs) in sequential modeling, we introduce a novel RNN system to improve the performance of online signature verification. The training objective is to directly minimize intra-class variations and to push the distances between skilled forgeries and genuine samples above a given threshold. By back-propagating the training signals, our RNN produces discriminative features with the desired metrics. Additionally, we propose a novel d...

  10. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Science.gov (United States)

    2010-07-22

    ... Electronic Signature and Storage of Form I-9, Employment Eligibility Verification AGENCY: U.S. Immigration... published an interim final rule to permit electronic signature and storage of the Form I-9. 71 FR 34510... because electronic signature and storage technologies are optional, DHS expects that small entities will...

  11. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" means matching a user's fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  12. Forged Signature Distinction Using Convolutional Neural Network for Feature Extraction

    Directory of Open Access Journals (Sweden)

    Seungsoo Nam

    2018-01-01

    This paper proposes a dynamic verification scheme for finger-drawn signatures on smartphones. As a dynamic feature, the movement of the smartphone is recorded with its accelerometer sensors, in addition to the moving coordinates of the signature. To extract high-level longitudinal and topological features, the proposed scheme uses a convolutional neural network (CNN) for feature extraction, not as a conventional classifier. We assume that a CNN trained with forged signatures can extract effective features (called the S-vector) that are common to forging activities, such as hesitation and delay before drawing the complicated part. The proposed scheme also exploits an autoencoder (AE) as a classifier, with the S-vector as the input vector to the AE. An AE has high accuracy for one-class distinction problems such as signature verification, and is also greatly dependent on the accuracy of its input data. The S-vector is valuable as AE input and, consequently, leads to improved verification accuracy, especially for distinguishing forged signatures. Compared to previous work, i.e., an MLP-based finger-drawn signature verification scheme, the proposed scheme decreases the equal error rate for discriminating forged signatures by 13.7 percentage points, from 18.1% to 4.4%.
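
    The AE-as-one-class-classifier idea can be sketched with a linear autoencoder (principal components) standing in for the trained network: genuine samples reconstruct well, off-manifold samples do not. The data, dimensions, and threshold rule below are all illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "S-vectors": genuine samples lie near a 2-D subspace of R^8,
# standing in for features a trained CNN would produce.
basis = rng.standard_normal((2, 8))
genuine = rng.standard_normal((100, 2)) @ basis + 0.01 * rng.standard_normal((100, 8))

# "Train" a linear autoencoder: the principal components of the genuine
# data act as tied encoder/decoder weights.
mean = genuine.mean(axis=0)
_, _, vt = np.linalg.svd(genuine - mean, full_matrices=False)
components = vt[:2]

def reconstruction_error(x):
    """Distance between x and its reconstruction from the learned subspace."""
    code = (x - mean) @ components.T      # encode
    recon = code @ components + mean      # decode
    return float(np.linalg.norm(x - recon))

# Accept anything reconstructing no worse than the worst genuine sample.
threshold = 1.1 * max(reconstruction_error(g) for g in genuine)
forged = 3.0 * rng.standard_normal(8)     # an off-subspace sample
print(reconstruction_error(forged) > threshold)  # True: flagged as forged
```

A nonlinear AE generalizes this by learning a curved manifold of genuine samples instead of a flat subspace, but the decision rule (threshold on reconstruction error) is the same.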

  13. Analysis of an indirect neutron signature for enhanced UF_6 cylinder verification

    International Nuclear Information System (INIS)

    Kulisek, J.A.; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-01-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF_6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF_6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA_NT). HEVA_NT enables full-volume assay of UF_6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF_6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA_NT in terms of the individual contributions to HEVA_NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA_NT signature to manipulation by the nearby placement of neutron-conversion materials.

  14. Analysis of an indirect neutron signature for enhanced UF_6 cylinder verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, J.A., E-mail: Jonathan.Kulisek@pnnl.gov; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF_6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF_6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA_NT). HEVA_NT enables full-volume assay of UF_6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF_6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA_NT in terms of the individual contributions to HEVA_NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA_NT signature to manipulation by the nearby placement of neutron-conversion materials.

  15. On-power verification of the dynamic response of self-powered in-core detectors

    International Nuclear Information System (INIS)

    Serdula, K.; Beaudet, M.

    1996-01-01

    Self-powered in-core detectors are used for on-line safety and regulation purposes in CANDU reactors. Such applications require detectors whose response to flux changes is primarily prompt. In-service verification of the detectors' response is required to ensure that significant degradation in performance has not occurred during long-term operation. Changes in the detector characteristics occur due to nuclear interactions and failures. Present verification requires significant station resources and disrupts power production. Use of the 'noise' in the detector signal is being investigated as an alternative means to assess the dynamic response of the detectors during long-term operation. Reference 'signatures' were measured on replacement shutdown-system detectors. Results show that 'noise' measurements are a promising alternative to the current verification method. Identification of changes in the detector response function assists in accurate diagnosis and prognosis of changes in detector signals due to process changes. (author)

  16. The electronic identification, signature and security of information systems

    Directory of Open Access Journals (Sweden)

    Horovčák Pavel

    2002-12-01

    The contribution deals with current methods and technologies for the security of information and communication systems. It introduces an overview of electronic identification elements such as static passwords, dynamic passwords, and single sign-on; this category also includes biometric and dynamic characteristics of the verified person. Authentication based on ownership of identification elements, such as various cards and authentication calculators, is widespread. The next part defines and characterizes the electronic signature, its basic functions, and certificate categories. Practical utilization of the electronic signature consists of acquiring an electronic signature, signing an outgoing email message, receiving an electronically signed message, and verifying the electronic signature. The use of the electronic signature is growing continuously and, in connection with the development of legislation, it is being applied across all sectors.
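
    The sign/verify flow described above can be sketched in a few lines. Note that the HMAC below is a symmetric stand-in for a real electronic signature, which would use an asymmetric algorithm with a certificate-bound key pair; the key and messages are illustrative.

```python
import hashlib
import hmac

# Secret key material held by the signer (stand-in for a private key).
key = b"holder-private-key-material"

def sign(message: bytes) -> bytes:
    """Produce an authentication tag over the message (HMAC-SHA256)."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, signature: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(message), signature)

msg = b"outgoing email body"
sig = sign(msg)
print(verify(msg, sig))               # True: signature checks out
print(verify(b"tampered body", sig))  # False: any change invalidates it
```

A public-key scheme separates the two roles: anyone holding the certificate can verify, but only the holder of the private key can sign.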

  17. Online Signature Verification: To What Extent Should a Classifier be Trusted in?

    Directory of Open Access Journals (Sweden)

    Marianela Parodi

    2017-08-01

    Selecting the best features to model signatures is one of the major challenges in the field of online signature verification. Combining different feature sets, selected by different criteria, is a useful technique for addressing this problem. Along this line, the analysis of different features and their discriminative power has been researchers' main concern, with less attention paid to the way in which the different kinds of features are combined. Moreover, the fact that conflicting results may appear when several classifiers are used has rarely been taken into account. In this paper, a score-level fusion scheme is proposed to combine three different and meaningful feature sets, viz., an automatically selected feature set, a feature set relevant to Forensic Handwriting Experts (FHEs), and a global feature set. The score-level fusion is performed within the framework of the Belief Function Theory (BFT), in order to address the problem of conflicting results when multiple classifiers are used. Two different models, namely the Denoeux and the Appriou models, are used to embed the problem within this framework, where the fusion is performed using two well-known combination rules, namely the Dempster-Shafer (DS) and the Proportional Conflict Redistribution (PCR5) rules. In order to analyze the robustness of the proposed score-level fusion approach, the combination is performed for the same verification system using two different classification techniques, namely Random Forests (RF) and Support Vector Machines (SVM). Experimental results on a publicly available database show that the proposed score-level fusion approach allows the system to achieve a very good trade-off between verification results and reliability.
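
    Dempster's rule of combination, the first of the two fusion rules named above, can be sketched over the two-class frame {genuine, forged}. The belief masses assigned to the two classifiers here are made up for the demo; a real system would derive them from classifier scores via the Denoeux or Appriou models.

```python
def dempster(m1, m2):
    """Dempster's rule of combination over the frame {G, F}
    (G = genuine, F = forged, "GF" = ignorance / whole frame)."""
    def inter(a, b):
        s = set(a) & set(b)
        return "".join(c for c in "GF" if c in s)  # canonical key
    combined = {"G": 0.0, "F": 0.0, "GF": 0.0}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            key = inter(a, b)
            if key:
                combined[key] += pa * pb
            else:
                conflict += pa * pb        # mass on empty intersection
    # Normalize by 1 - K, redistributing the conflicting mass.
    return {k: v / (1 - conflict) for k, v in combined.items()}

# Two classifiers' belief masses for the same questioned signature.
rf  = {"G": 0.6, "F": 0.1, "GF": 0.3}   # e.g. the RF-based matcher
svm = {"G": 0.5, "F": 0.2, "GF": 0.3}   # e.g. the SVM-based matcher
fused = dempster(rf, svm)
print(fused)  # mass on G rises to ~0.759; agreement reinforces belief
```

The PCR5 rule differs only in how the conflicting mass K is redistributed: proportionally back to the conflicting hypotheses rather than uniformly via normalization.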

  18. Nonlinear analysis of dynamic signature

    Science.gov (United States)

    Rashidi, S.; Fallah, A.; Towhidkhah, F.

    2013-12-01

    A signature is a long-trained motor skill resulting in a well-practiced combination of segments such as strokes and loops. It is a physical manifestation of complex motor processes. The problem, generally stated, is how relative simplicity in behavior emerges from the considerable complexity of the perception-action system that produces behavior within an infinitely variable biomechanical and environmental context. To address this problem, we present evidence indicating that the motor control dynamic in the signing process is chaotic. This chaotic dynamic may explain a richer array of time-series behavior in the motor skill of signature. Nonlinear analysis is a powerful approach and a suitable tool for characterizing dynamical systems through concepts such as fractal dimension and the Lyapunov exponent. Time series of position and velocity can thus be analyzed in both the horizontal and vertical directions. We observed from the results that non-integer values of the correlation dimension indicate low-dimensional deterministic dynamics. This result was confirmed using surrogate data tests. We also used the time series to calculate the largest Lyapunov exponent and obtained a positive value. These results constitute significant evidence that signature data are the outcome of chaos in a nonlinear dynamical system of motor control.
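
    The key diagnostic, a positive largest Lyapunov exponent, can be illustrated on a system where the answer is known. This sketch estimates the exponent for the logistic map (not for signature data, where it must be reconstructed from the measured time series, e.g. by the Rosenstein or Wolf method):

```python
import math

def largest_lyapunov(r, x0=0.1, n=100_000, discard=1000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the average log of the local stretching rate |f'(x)|."""
    x, total = x0, 0.0
    for i in range(n + discard):
        if i >= discard:
            # guard against log(0) should an orbit point land exactly on 0.5
            total += math.log(max(abs(r * (1 - 2 * x)), 1e-12))
        x = r * x * (1 - x)
    return total / n

# At r = 4.0 the map is fully chaotic; the exact exponent is ln 2 ~ 0.6931.
lam = largest_lyapunov(4.0)
print(round(lam, 3))
```

A positive value, as here, means nearby trajectories diverge exponentially, which is the signature of deterministic chaos the paper reports for signing motion.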

  19. A New Approach for High Pressure Pixel Polar Distribution on Off-line Signature Verification

    Directory of Open Access Journals (Sweden)

    Jesús F. Vargas

    2010-06-01

    Features representing information about High Pressure Points in a static image of a handwritten signature are analyzed for an offline verification system. From grayscale images, a new approach for estimating the High Pressure threshold is proposed. Two images, one containing the extracted High Pressure Points and the other a binary version of the original signature, are transformed to polar coordinates, where a pixel density ratio between them is calculated. The polar space is divided into angular and radial segments, which permit a local analysis of the high pressure distribution. Finally, two vectors containing the density distribution ratio are calculated for the nearest and farthest points from the geometric center of the original signature image. Experiments were carried out using a database containing signatures from 160 individuals. The robustness of the analyzed system against simple forgeries is tested with Support Vector Machine models. For the sake of completeness, a comparison of the results obtained by the proposed approach with similar published works is presented.
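
    The polar segmentation and density-ratio step can be sketched as follows. The pixels here are synthetic and the segment counts are arbitrary assumptions; the paper's threshold estimation and nearest/farthest split are not reproduced.

```python
import math
from collections import Counter

def polar_density_ratio(high_pressure_px, signature_px, n_ang=4, n_rad=2):
    """Ratio of high-pressure pixels to all signature pixels per polar
    segment around the signature's geometric center. Assumes the
    high-pressure pixels are a subset of the signature pixels."""
    cx = sum(x for x, _ in signature_px) / len(signature_px)
    cy = sum(y for _, y in signature_px) / len(signature_px)
    rmax = max(math.hypot(x - cx, y - cy) for x, y in signature_px) or 1.0

    def segment(x, y):
        ang = math.atan2(y - cy, x - cx) % (2 * math.pi)
        rad = math.hypot(x - cx, y - cy)
        return (min(int(ang / (2 * math.pi) * n_ang), n_ang - 1),
                min(int(rad / rmax * n_rad), n_rad - 1))

    total = Counter(segment(x, y) for x, y in signature_px)
    hp = Counter(segment(x, y) for x, y in high_pressure_px)
    return {seg: hp[seg] / cnt for seg, cnt in total.items()}

# Toy binary signature: a 4x4 block with two high-pressure pixels.
sig = [(x, y) for x in range(4) for y in range(4)]
hp = [(3, 3), (3, 2)]
print(polar_density_ratio(hp, sig))
```

Flattening the per-segment ratios into a vector gives the feature representation fed to the SVM.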

  20. On the pinned field image binarization for signature generation in image ownership verification method

    Directory of Open Access Journals (Sweden)

    Chang Hsuan

    2011-01-01

    The issue of pinned field image binarization for signature generation in the ownership verification of a protected image is investigated. The pinned field captures the texture information of the protected image and can be employed to enhance watermark robustness. In the proposed method, four optimization schemes are utilized to determine the threshold values for transforming the pinned field into a binary feature image, which is then utilized to generate an effective signature image. Experimental results show that the utilization of optimization schemes can significantly improve the signature robustness over the previous method (Lee and Chang, Opt. Eng. 49(9), 097005, 2010). Considering both the watermark retrieval rate and computation speed, the genetic algorithm is strongly recommended. In addition, compared with Chang and Lin's scheme (J. Syst. Softw. 81(7), 1118-1129, 2008), the proposed scheme also has better performance.

  1. A Signature Comparing Android Mobile Application Utilizing Feature Extracting Algorithms

    Directory of Open Access Journals (Sweden)

    Paul Grafilon

    2017-08-01

    The paper presents one of the applications that can be built on the smartphone camera. Nowadays, forgery is one of the most undetected crimes. With the forensic technology used today, it is still difficult for authorities to compare and determine what is a real signature and what is a forged one. A signature is a legal representation of a person, and all transactions are based on it. Forgers may use a forged signature to sign illegal contracts and withdraw from bank accounts undetected. A signature can also be forged during election periods for repeated voting. Given these issues, a signature should always be secure. Signature verification is a reduced problem that still poses a real challenge for researchers. The literature on signature verification is quite extensive and shows two main areas of research: off-line and on-line systems. Off-line systems deal with a static image of the signature, i.e., the result of the action of signing, while on-line systems work on the dynamic process of generating the signature, i.e., the action of signing itself. The researchers have found a way to address these concerns: a mobile application that uses the camera to take a picture of a signature, analyzes it, and compares it to other signatures for verification. It exists to help citizens be more cautious and aware of issues regarding signatures. It may also help organizations and institutions such as banks and insurance companies verify signatures, avoiding unwanted transactions and identity theft. Furthermore, it might help the authorities in the never-ending battle against crime, especially against forgers and thieves. The project aimed to design and develop a mobile application that integrates the smartphone camera for verifying and comparing signatures for security using the best algorithm possible. As a result of the development, the smartphone camera application is functional and reliable.

  2. Analysis of signature wrapping attacks and countermeasures

    DEFF Research Database (Denmark)

    Gajek, Sebastian; Jensen, Meiko; Liao, Lijun

    2009-01-01

    In recent research it turned out that Boolean verification of digital signatures in the context of WS-Security is likely to fail: if parts of a SOAP message are signed and the signature verification applied to the whole document returns true, then the document may nevertheless have been...

  3. A Directed Signature Scheme and its Applications

    OpenAIRE

    Lal, Sunder; Kumar, Manoj

    2004-01-01

    This paper presents a directed signature scheme with the property that the signature can be verified only with the help of the signer or the signature receiver. We also propose its applications to shared verification of signatures and to threshold cryptosystems.

  4. Early signatures of regime shifts in gene expression dynamics

    Science.gov (United States)

    Pal, Mainak; Pal, Amit Kumar; Ghosh, Sayantari; Bose, Indrani

    2013-06-01

    Recently, a large number of studies have been carried out on the early signatures of sudden regime shifts in systems as diverse as ecosystems, financial markets, population biology and complex diseases. The signatures of regime shifts in gene expression dynamics are less systematically investigated. In this paper, we consider sudden regime shifts in the gene expression dynamics described by a fold-bifurcation model involving bistability and hysteresis. We consider two alternative models, models 1 and 2, of competence development in the bacterial population B. subtilis and determine some early signatures of the regime shifts between competence and noncompetence. We use both deterministic and stochastic formalisms for the purpose of our study. The early signatures studied include the critical slowing down as a transition point is approached, rising variance and the lag-1 autocorrelation function, skewness and a ratio of two mean first passage times. Some of the signatures could provide the experimental basis for distinguishing between bistability and excitability as the correct mechanism for the development of competence.
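
    Two of the early-warning indicators named above, rising variance and rising lag-1 autocorrelation, can be sketched on a surrogate time series. The AR(1) process here is a generic stand-in for noisy gene-expression dynamics near a fold bifurcation, not the paper's B. subtilis models; the parameters are illustrative.

```python
import random
import statistics

def lag1_autocorrelation(series):
    """Lag-1 autocorrelation: a classic early-warning indicator that
    rises as a system slows down on approach to a fold bifurcation."""
    mean = statistics.fmean(series)
    num = sum((a - mean) * (b - mean) for a, b in zip(series, series[1:]))
    den = sum((a - mean) ** 2 for a in series)
    return num / den

def ar1(phi, n=5000, seed=7):
    """AR(1) surrogate x[t+1] = phi*x[t] + noise; larger phi mimics the
    growing memory (critical slowing down) near a regime shift."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0, 1)
        out.append(x)
    return out

far_from_shift = ar1(phi=0.2)    # fast-relaxing, far from the transition
near_shift = ar1(phi=0.95)       # slow-relaxing, close to the transition
ac_far = lag1_autocorrelation(far_from_shift)   # ~0.2
ac_near = lag1_autocorrelation(near_shift)      # ~0.95
print(ac_far < ac_near)  # True: autocorrelation rises near the shift
print(statistics.pvariance(far_from_shift) < statistics.pvariance(near_shift))  # True: variance rises too
```

Tracking these statistics in a sliding window over expression data is the standard way such signatures are monitored in practice.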

  5. Early signatures of regime shifts in gene expression dynamics

    International Nuclear Information System (INIS)

    Pal, Mainak; Pal, Amit Kumar; Ghosh, Sayantari; Bose, Indrani

    2013-01-01

    Recently, a large number of studies have been carried out on the early signatures of sudden regime shifts in systems as diverse as ecosystems, financial markets, population biology and complex diseases. The signatures of regime shifts in gene expression dynamics are less systematically investigated. In this paper, we consider sudden regime shifts in the gene expression dynamics described by a fold-bifurcation model involving bistability and hysteresis. We consider two alternative models, models 1 and 2, of competence development in the bacterial population B. subtilis and determine some early signatures of the regime shifts between competence and noncompetence. We use both deterministic and stochastic formalisms for the purpose of our study. The early signatures studied include the critical slowing down as a transition point is approached, rising variance and the lag-1 autocorrelation function, skewness and a ratio of two mean first passage times. Some of the signatures could provide the experimental basis for distinguishing between bistability and excitability as the correct mechanism for the development of competence. (paper)

  6. Parallel verification of dynamic systems with rich configurations

    OpenAIRE

    Pessoa, Eduardo José Dias

    2016-01-01

    Master's dissertation in Informatics Engineering (specialization in Informatics). Model checking is a technique used to automatically verify a model which represents the specification of a system. To ensure the correctness of the system, verification of both static and dynamic properties is often needed. The specification of a system is made through modeling languages, while the corresponding verification is made by its model checker. Most modeling frameworks are not...

  7. Revocable identity-based proxy re-signature against signing key exposure.

    Science.gov (United States)

    Yang, Xiaodong; Chen, Chunlin; Ma, Tingchun; Wang, Jinli; Wang, Caifen

    2018-01-01

    Identity-based proxy re-signature (IDPRS) is a novel cryptographic primitive that allows a semi-trusted proxy to convert a signature under one identity into another signature under another identity on the same message by using a re-signature key. Due to this transformation function, IDPRS is very useful in constructing privacy-preserving schemes for various information systems. Key revocation functionality is important in practical IDPRS for managing users dynamically; however, the existing IDPRS schemes do not provide revocation mechanisms that allow the removal of misbehaving or compromised users from the system. In this paper, we first introduce a notion called revocable identity-based proxy re-signature (RIDPRS) to achieve the revocation functionality. We provide a formal definition of RIDPRS as well as its security model. Then, we present a concrete RIDPRS scheme that can resist signing key exposure and prove that the proposed scheme is existentially unforgeable against adaptive chosen identity and message attacks in the standard model. To further improve the performance of signature verification in RIDPRS, we introduce a notion called server-aided revocable identity-based proxy re-signature (SA-RIDPRS). Moreover, we extend the proposed RIDPRS scheme to the SA-RIDPRS scheme and prove that this extended scheme is secure against adaptive chosen message and collusion attacks. The analysis results show that our two schemes remain efficient in terms of computational complexity when implementing user revocation procedures. In particular, in the SA-RIDPRS scheme, the verifier needs to perform only a bilinear pairing and four exponentiation operations to verify the validity of the signature. Compared with other IDPRS schemes in the standard model, our SA-RIDPRS scheme greatly reduces the computation overhead of verification.

  8. Quantum blind dual-signature scheme without arbitrator

    International Nuclear Information System (INIS)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-01-01

    Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., the initial phase, signing phase and verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. It does not rely much on an arbitrator in the verification phase, as previous quantum signature schemes usually do. The security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. It provides a potential application for e-commerce or e-payment systems with current technology. (paper)

  9. Quantum blind dual-signature scheme without arbitrator

    Science.gov (United States)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-03-01

    Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., the initial phase, signing phase and verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. It does not rely much on an arbitrator in the verification phase, as previous quantum signature schemes usually do. The security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. It provides a potential application for e-commerce or e-payment systems with current technology.

  10. A Rational Threshold Signature Model and Protocol Based on Different Permissions

    Directory of Open Access Journals (Sweden)

    Bojun Wang

    2014-01-01

    This paper develops a novel model and protocol for specific scenarios in which participants from multiple groups with different permissions jointly produce a signature. We apply a secret sharing scheme based on difference equations to the private-key distribution phase and the secret reconstruction phase of our threshold signature scheme. In addition, our scheme guarantees successful signing through the punishment strategy of repeated rational secret sharing. Besides, the bit commitment and verification method used to detect players' cheating behavior acts as a deterrent to internal fraud. Using bit commitments, verifiable parameters, and time sequences, this paper constructs a dynamic game model featuring threshold signature management with different permissions, cheat-proofing, and forward security.
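
The bit-commitment-and-verification step used to detect cheating players can be illustrated with a simple hash-based commitment. This is an illustrative stand-in only; the paper's actual commitment construction may differ:

```python
import hashlib
import secrets

def commit(bit: int):
    """Hash-based bit commitment: publish the digest now, reveal
    (bit, nonce) later.  Illustrative stand-in for the scheme's
    bit-commitment step used to detect cheating."""
    nonce = secrets.token_bytes(16)
    digest = hashlib.sha256(bytes([bit]) + nonce).digest()
    return digest, nonce

def verify(digest: bytes, bit: int, nonce: bytes) -> bool:
    """A player cannot open the commitment to a different bit without
    finding a SHA-256 collision."""
    return hashlib.sha256(bytes([bit]) + nonce).digest() == digest

d, n = commit(1)
print(verify(d, 1, n), verify(d, 0, n))  # True False
```
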

  11. Un système de vérification de signature manuscrite en ligne basé ...

    African Journals Online (AJOL)

    Administrateur

    systems. The problem of cursive handwritten signature verification can be approached in two main ways, one probabilistic (analytical) and the other structural. So, two methodologies ... do the classification is presented in [11]. The online signature ..... Automatic on-line signature verification based on multiple models ...

  12. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  13. Quantum multi-signature protocol based on teleportation

    International Nuclear Information System (INIS)

    Wen Xiao-jun; Liu Yun; Sun Yu

    2007-01-01

    In this paper, a protocol which can be used for multi-user quantum signature is proposed. The scheme of signature and verification is based on the correlation of Greenberger-Horne-Zeilinger (GHZ) states and controlled quantum teleportation. Unlike digital signatures, which are based on computational complexity, the proposed protocol has perfect security in noiseless quantum channels. Compared to previous quantum signature schemes, this protocol can verify the signature independently of an arbitrator as well as realize multi-user signing together. (orig.)

  14. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    This work is devoted to an investigation of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were examined. Different methods of generating and verifying threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency was shown. Topics for further investigation were given, which could reduce the level of counterfeit electronic documents signed by a group of users.
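
The Lagrange-interpolation construction these schemes build on can be sketched with Shamir-style (t, n) sharing over a prime field. Toy parameters for illustration only; real threshold-signature schemes share a signing key and never reconstruct it in one place:

```python
import random

P = 2**61 - 1  # a Mersenne prime modulus (any sufficiently large prime works)

def make_shares(secret, t, n):
    """Split `secret` into n shares with threshold t: each share is a point
    on a random degree-(t-1) polynomial over GF(P) whose constant term
    is the secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):   # Horner evaluation mod P
            y = (y * x + c) % P
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789)  # any 3 of the 5 shares suffice
```

Any subset of fewer than t shares reveals nothing about the secret, which is what makes the construction suitable for distributing signing authority across a group.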

  15. Signature-based store checking buffer

    Science.gov (United States)

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify that the content is error-free.
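
The idea can be sketched in software: instead of comparing every store from redundant computations, each instance folds its stores into a running hash, and only the compact digests are compared at a barrier. Class and method names, and the choice of SHA-256, are illustrative assumptions, not the patented hardware design:

```python
import hashlib

class StoreFingerprintBuffer:
    """Software sketch of a signature-based store checking buffer:
    each redundant computation records its stores here, and only the
    final digests are compared at a barrier."""
    def __init__(self):
        self._h = hashlib.sha256()

    def record_store(self, address: int, value: int):
        # Fold (address, value) into the running signature.
        self._h.update(address.to_bytes(8, "little"))
        self._h.update(value.to_bytes(8, "little", signed=True))

    def signature(self) -> bytes:
        return self._h.digest()

a, b = StoreFingerprintBuffer(), StoreFingerprintBuffer()
for buf in (a, b):            # two redundant instances, identical stores
    buf.record_store(0x1000, 42)
    buf.record_store(0x1008, -7)
print(a.signature() == b.signature())  # True: matching digests -> error-free
```
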

  16. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    The article presents the results of verification of a dynamic model of a drive system with a gear. Tests were carried out on the real object under different operating conditions. Simulation studies were also carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research with simulations using the dynamic model.

  17. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  18. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    Science.gov (United States)

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristic coefficients is difficult to guarantee because of errors in the measurement system itself. A novel dynamic calibration method for such measurement systems is proposed in this paper to eliminate these errors. Unlike calibration against a suspended mass, this method uses a verification device that is a spring-mass system, which can simulate the dynamic characteristics of a sliding bearing. The verification device was built, and the calibration experiment was implemented over a wide frequency range, with the bearing stiffness simulated by disc springs. The experimental results show that the amplitude errors of this measurement system are small in the frequency range of 10 Hz–100 Hz, and the phase errors increase with frequency. A simulated experiment on identification of dynamic characteristic coefficients in the range of 10 Hz–30 Hz preliminarily verified that the calibration data in this range can well support dynamic characteristics tests of sliding bearings in the same range. Bearing experiments over greater frequency ranges require higher manufacturing and installation precision of the calibration device, and the calibration experiment procedures should be improved. PMID:27483283
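
Because the spring-mass rig has a known stiffness, its theoretical frequency response provides a reference curve against which the measurement system's amplitude and phase errors can be judged. A hedged sketch of that reference; the single-degree-of-freedom model and all parameter values are our illustrative assumptions, not the paper's calibration data:

```python
import math

def frf(k, m, c, freq_hz):
    """Magnitude and phase of the force-to-displacement frequency response
    H(w) = 1 / (k - m*w^2 + i*c*w) of a spring-mass-damper rig.
    Illustrative SDOF model; parameters are assumptions."""
    w = 2.0 * math.pi * freq_hz
    re, im = k - m * w * w, c * w
    return 1.0 / math.hypot(re, im), -math.atan2(im, re)

# Quasi-static check: well below resonance the rig behaves like the
# spring alone, so displacement-per-force magnitude approaches 1/k.
mag, phase = frf(k=1.0e6, m=10.0, c=50.0, freq_hz=0.001)
print(abs(mag - 1.0e-6) < 1e-9)  # True
```
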

  19. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    Science.gov (United States)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP's views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP's graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  20. Leaf trajectory verification during dynamic intensity modulated radiotherapy using an amorphous silicon flat panel imager

    International Nuclear Information System (INIS)

    Sonke, Jan-Jakob; Ploeger, Lennert S.; Brand, Bob; Smitsmans, Monique H.P.; Herk, Marcel van

    2004-01-01

    An independent verification of the leaf trajectories during each treatment fraction improves the safety of IMRT delivery. In order to verify dynamic IMRT with an electronic portal imaging device (EPID), the EPID response should be accurate and fast such that the effect of motion blurring on the detected moving field edge position is limited. In the past, it was shown that the errors in the detected position of a moving field edge determined by a scanning liquid-filled ionization chamber (SLIC) EPID are negligible in clinical practice. Furthermore, a method for leaf trajectory verification during dynamic IMRT was successfully applied using such an EPID. EPIDs based on amorphous silicon (a-Si) arrays are now widely available. Such a-Si flat panel imagers (FPIs) produce portal images with superior image quality compared to other portal imaging systems, but they have not yet been used for leaf trajectory verification during dynamic IMRT. The aim of this study is to quantify the effect of motion distortion and motion blurring on the detection accuracy of a moving field edge for an Elekta iViewGT a-Si FPI and to investigate its applicability to leaf trajectory verification during dynamic IMRT. We found the detection error for a moving field edge to be smaller than 0.025 cm at a speed of 0.8 cm/s. Hence, the effect of motion blurring on the detection accuracy of a moving field edge is negligible in clinical practice. Furthermore, the a-Si FPI was successfully applied for the verification of dynamic IMRT. The verification method revealed a delay in the control system of the experimental DMLC that was also found using a SLIC EPID, resulting in leaf positional errors of 0.7 cm at a leaf speed of 0.8 cm/s.

  1. Signature detection and matching for document image retrieval.

    Science.gov (United States)

    Zhu, Guangyu; Zheng, Yefeng; Doermann, David; Jaeger, Stefan

    2009-11-01

    As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from cluttered background is currently an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as query in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches.

  2. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  3. A feature based comparison of pen and swipe based signature characteristics.

    Science.gov (United States)

    Robertson, Joshua; Guest, Richard

    2015-10-01

    Dynamic Signature Verification (DSV) is a biometric modality that identifies anatomical and behavioral characteristics when an individual signs their name. Conventionally, signature data has been captured using pen/tablet apparatus. However, the use of other devices such as touch-screen tablets has expanded in recent years, affording the possibility of assessing biometric interaction on this new technology. To explore the potential of employing DSV techniques when a user signs or swipes with their finger, we report a study to correlate pen and finger generated features. Investigating the stability and correlation between a set of characteristic features recorded in participants' signatures and touch-based swipe gestures, a statistical analysis was conducted to assess consistency between capture scenarios. The results indicate that a range of static and dynamic features, such as the rate of jerk, size, duration and the distance the pen traveled, can support interoperability between the two input methods within a potential biometric context. This suggests, as a general principle, that the same underlying constructional mechanisms are evident in both. Copyright © 2015 Elsevier B.V. All rights reserved.
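
Dynamic features of this kind fall out of finite differences on the sampled trajectory. A sketch computing duration, path length and mean rate of jerk from (x, y, t) samples; the feature names and formulas here are our illustrative choices, not the paper's exact feature set:

```python
import numpy as np

def swipe_features(x, y, t):
    """Duration, path length and mean rate of jerk of a sampled pen or
    finger trajectory, via successive finite differences."""
    x, y, t = (np.asarray(a, dtype=float) for a in (x, y, t))
    dt = np.diff(t)
    vx, vy = np.diff(x) / dt, np.diff(y) / dt            # velocity
    ax, ay = np.diff(vx) / dt[1:], np.diff(vy) / dt[1:]  # acceleration
    jx, jy = np.diff(ax) / dt[2:], np.diff(ay) / dt[2:]  # jerk
    return {
        "duration": float(t[-1] - t[0]),
        "path_length": float(np.hypot(np.diff(x), np.diff(y)).sum()),
        "mean_jerk": float(np.hypot(jx, jy).mean()),
    }

# One full turn around the unit circle traced in one second.
t = np.linspace(0.0, 1.0, 101)
f = swipe_features(np.cos(2 * np.pi * t), np.sin(2 * np.pi * t), t)
print(round(f["path_length"], 2))  # 6.28, the circle's circumference
```

Comparing such feature vectors across pen and finger captures is the kind of per-feature correlation analysis the study reports.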

  4. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: The SoftCare tool for the SFMEA and SFTA

  5. Solar wind dynamic pressure variations and transient magnetospheric signatures

    International Nuclear Information System (INIS)

    Sibeck, D.G.; Baumjohann, W.

    1989-01-01

    Contrary to the prevailing popular view, we find some transient ground events with bipolar north-south signatures are related to variations in solar wind dynamic pressure and not necessarily to magnetic merging. We present simultaneous solar wind plasma observations for two previously reported transient ground events observed at dayside auroral latitudes. During the first event, originally reported by Lanzerotti et al. [1987], conjugate ground magnetometers recorded north-south magnetic field deflections in the east-west and vertical directions. The second event was reported by Todd et al. [1986]; we noted ground radar observations indicating strong northward then southward ionospheric flows. The events were associated with the postulated signatures of patchy, sporadic merging of magnetosheath and magnetospheric magnetic field lines at the dayside magnetopause, known as flux transfer events. Conversely, we demonstrate that the event reported by Lanzerotti et al. was accompanied by a sharp increase in solar wind dynamic pressure, a magnetospheric compression, and a consequent ringing of the magnetospheric magnetic field. The event reported by Todd et al. was associated with a brief but sharp increase in the solar wind dynamic pressure. copyright American Geophysical Union 1989

  6. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  7. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered the most widely used extrinsic security features in paper documents. In contrast to signatures, however, little work has been done to automatically verify the authenticity of stamps. In this paper, an approach for verification of color stamps ... and copied stamps. Sensitivity and specificity of up to 95% could be obtained on a data set that is publicly available...

  8. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
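
A standard solution-verification tool in this literature is the grid-refinement study: comparing solutions on three systematically refined grids yields the observed order of accuracy. A generic sketch (the manufactured example below is our own, not from the paper):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from solutions on three grids with a
    constant refinement ratio r -- a standard code-verification check."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Manufactured example: a quantity converging as f(h) = f_exact + C*h**2
# should exhibit an observed order of about 2.
f_exact, C = 1.0, 0.5
f = lambda h: f_exact + C * h**2
p = observed_order(f(0.4), f(0.2), f(0.1), r=2)
print(round(p, 6))  # 2.0
```

If the observed order falls short of the scheme's formal order, that discrepancy flags a coding error or a solution outside the asymptotic range, which is precisely the kind of error quantification verification is after.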

  9. Research on Signature Verification Method Based on Discrete Fréchet Distance

    Science.gov (United States)

    Fang, J. L.; Wu, W.

    2018-05-01

    This paper proposes a multi-feature signature template based on the discrete Fréchet distance, which breaks through the limitation of traditional signature authentication using a single signature feature. It addresses the heavy computational workload of extracting global-feature templates and the unreasonable selection of signature features in online handwritten signature authentication. In this experiment, the false acceptance rate (FAR) and false rejection rate (FRR) of the signatures are measured, and the average equal error rate (AEER) is calculated. The feasibility of the combined-template scheme is verified by comparing the average equal error rate of the combined template with that of the original template.
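
The discrete Fréchet distance itself is a small dynamic program (the classic Eiter-Mannila formulation). A self-contained sketch of how a candidate trace could be compared against a reference template; the tiny example traces are illustrative, not the paper's data:

```python
from functools import lru_cache
from math import hypot

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between two polylines (lists of (x, y)
    points), via the Eiter-Mannila dynamic program: the minimal, over all
    monotone couplings, of the maximal pointwise distance."""
    @lru_cache(maxsize=None)
    def c(i, j):
        d = hypot(P[i][0] - Q[j][0], P[i][1] - Q[j][1])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    return c(len(P) - 1, len(Q) - 1)

ref = [(0, 0), (1, 0), (2, 0)]    # reference signature trace (toy data)
cand = [(0, 1), (1, 1), (2, 1)]   # candidate trace, shifted up by 1
print(discrete_frechet(ref, cand))  # 1.0
```

A small distance means the candidate trace stays close to the reference under an order-preserving alignment, which is why the measure suits stroke trajectories better than a pointwise comparison.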

  10. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power system with an organic working fluid. The purposes of the technology verification phase are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported.

  11. Extending the similarity-based XML multicast approach with digital signatures

    DEFF Research Database (Denmark)

    Azzini, Antonia; Marrara, Stefania; Jensen, Meiko

    2009-01-01

    This paper investigates the interplay between similarity-based SOAP message aggregation and digital signature application. An overview of the approaches resulting from the different orderings of the tasks of signature application, verification, similarity aggregation and splitting is provided. Depending on the intersection between similarity-aggregated and signed SOAP message parts, the paper discusses three different cases of signature application and sketches their applicability and performance implications.

  12. The research for the design verification of nuclear power plant based on VR dynamic plant

    International Nuclear Information System (INIS)

    Wang Yong; Yu Xiao

    2015-01-01

    This paper studies a new method of design verification through a VR plant, in order to verify and validate that the plant design conforms to the requirements of accident emergency response. The VR dynamic plant is established from the 3D design model and digital maps composed of a GIS system and indoor maps, and is driven by the analysis data of the design analyzer. The VR plant can present the operating conditions and accident conditions of the power plant. Based on the VR dynamic plant, this paper simulates the execution of accident procedures, the development of accidents, the evacuation planning of people and so on, to ensure that the plant design will not cause adverse effects. Besides design verification, the simulated results can also be used for optimization of the accident emergency plan, training on the accident plan, and emergency accident treatment. (author)

  13. Far-IR transparency and dynamic infrared signature control with novel conducting polymer systems

    Science.gov (United States)

    Chandrasekhar, Prasanna; Dooley, T. J.

    1995-09-01

    Materials which possess transparency, coupled with active controllability of this transparency in the infrared (IR), are today an increasingly important requirement for varied applications. These applications include windows for IR sensors, IR-region flat panel displays used in camouflage as well as in communication and sight through night-vision goggles, coatings with dynamically controllable IR-emissivity, and thermal conservation coatings. Among stringent requirements for these applications are large dynamic ranges (color contrast), 'multi-color' or broad-band characteristics, extended cyclability, long memory retention, matrix addressability, small area fabricability, low power consumption, and environmental stability. Among materials possessing the requirements for variation of IR signature, conducting polymers (CPs) appear to be the only materials with dynamic, actively controllable signature and acceptable dynamic range. Conventional CPs such as poly(alkyl thiophene), poly(pyrrole) or poly(aniline) show very limited dynamic range, especially in the far-IR, while also showing poor transparency. We have developed a number of novel CP systems ('system' implying the CP, the selected dopant, the synthesis method, and the electrolyte) with very wide dynamic range (up to 90% in both important IR regions, 3–5 μm and 8–12 μm), high cyclability (to 10⁵ cycles with less than 10% optical degradation), nearly indefinite optical memory retention, matrix addressability of multi-pixel displays, very wide operating temperature and excellent environmental stability, low charge capacity, and processability into areas from less than 1 mm² to more than 100 cm². The criteria used to design and arrive at these CP systems, together with representative IR signature data, are presented in this paper.

  14. Improvement of a Quantum Proxy Blind Signature Scheme

    Science.gov (United States)

    Zhang, Jia-Lei; Zhang, Jian-Zhong; Xie, Shu-Cui

    2018-06-01

    An improvement of a quantum proxy blind signature scheme is proposed in this paper. A six-qubit entangled state functions as the quantum channel. In our scheme, a trusted party, Trent, is introduced to guard against David's dishonest behavior. The receiver David verifies the signature with the help of Trent. The scheme uses the physical characteristics of quantum mechanics to implement message blinding, delegation, signature and verification. Security analysis proves that our scheme has the properties of undeniability, unforgeability and anonymity, and can resist some common attacks.

  15. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in a mobile multicloud computing environment, an MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and an sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified by lightweight computing and low data transmission. The scheme compensates for the limited communication and computing power of mobile devices, supports dynamic data operations in a mobile multicloud environment, and verifies data integrity without direct access to the source file blocks. Experimental results also demonstrate that this scheme achieves a lower cost of computing and communications.
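
    The sequence-enforced Merkle hash tree at the heart of the scheme extends the ordinary Merkle construction, in which a single root hash commits to all file blocks and a logarithmic-size proof authenticates any one block. A minimal sketch of the plain (non-sequence-enforced) construction, with illustrative function names not taken from the paper:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """Compute the Merkle root committing to a list of data blocks."""
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2 == 1:          # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(blocks, index):
    """Sibling hashes needed to recompute the root for one block."""
    level = [h(b) for b in blocks]
    proof = []
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])
        sib = index ^ 1                   # sibling is the paired node
        proof.append((level[sib], sib < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify_block(block, proof, root):
    """Check one block against the root without the other blocks."""
    node = h(block)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root
```

    The verifier holds only the root; a prover supplies the block and its proof, which is the sense in which integrity is checked "without using direct source file blocks".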

  16. Dosimetric verification for primary focal hypermetabolism of nasopharyngeal carcinoma patients treated with dynamic intensity-modulated radiation therapy.

    Science.gov (United States)

    Xin, Yong; Wang, Jia-Yang; Li, Liang; Tang, Tian-You; Liu, Gui-Hong; Wang, Jian-She; Xu, Yu-Mei; Chen, Yong; Zhang, Long-Zhen

    2012-01-01

    To assess the feasibility of (18F)FDG PET/CT-guided dynamic intensity-modulated radiation therapy (IMRT) for nasopharyngeal carcinoma patients, by dosimetric verification before treatment. Eleven patients with stage III~IVA nasopharyngeal carcinoma were treated with functional image-guided IMRT; absolute and relative dosimetric verification was performed with a Varian 23EX LA, an ionization chamber, the 2DICA of I'mRT Matrixx, and an IBA detachable phantom. Outlines were drawn and treatment plans made using different imaging techniques (CT and (18F)FDG PET/CT). The dose distributions of the various regions were realized by SMART. The absolute mean errors of the areas of interest were 2.39%±0.66 using a 0.6 cc ionization chamber. Using the DTA method, the average relative dose measurements within our protocol (3%, 3 mm) were 87.64% at 300 MU/min in all fields. Dosimetric verification before IMRT is obligatory and necessary. The ionization chamber and the 2DICA of I'mRT Matrixx were effective dosimetric verification tools for primary focal hypermetabolism in functional image-guided dynamic IMRT for nasopharyngeal carcinoma. Our preliminary evidence indicates that functional image-guided dynamic IMRT is feasible.

  17. MERGER SIGNATURES IN THE DYNAMICS OF STAR-FORMING GAS

    International Nuclear Information System (INIS)

    Hung, Chao-Ling; Sanders, D. B.; Hayward, Christopher C.; Smith, Howard A.; Ashby, Matthew L. N.; Martínez-Galarza, Juan R.; Zezas, Andreas; Lanz, Lauranne

    2016-01-01

    The recent advent of integral field spectrographs and millimeter interferometers has revealed the internal dynamics of many hundreds of star-forming galaxies. Spatially resolved kinematics have been used to determine the dynamical status of star-forming galaxies with ambiguous morphologies and to constrain the importance of galaxy interactions during the assembly of galaxies. However, measuring the importance of interactions or galaxy merger rates requires knowledge of the systematics in kinematic diagnostics and of the time over which merger indicators remain visible. We analyze the dynamics of star-forming gas in a set of binary merger hydrodynamic simulations with stellar mass ratios of 1:1 and 1:4. We find that the evolution of kinematic asymmetries traced by star-forming gas mirrors morphological asymmetries derived from mock optical images, in which both merger indicators show the largest deviation from isolated disks during strong interaction phases. Based on a series of simulations with various initial disk orientations, orbital parameters, gas fractions, and mass ratios, we find that the merger signatures are visible for ∼0.2–0.4 Gyr with kinematic merger indicators but can be approximately twice as long for equal-mass mergers of massive gas-rich disk galaxies designed to be analogs of z ∼ 2–3 submillimeter galaxies. Merger signatures are most apparent after the second passage and before the black holes coalesce, but in some cases they persist up to several hundred Myr after coalescence. About 20%–60% of the simulated galaxies are not identified as mergers during the strong interaction phase, implying that galaxies undergoing a violent merging process do not necessarily exhibit highly asymmetric kinematics in their star-forming gas. The lack of identifiable merger signatures in this population can lead to an underestimation of merger abundances in star-forming galaxies, and including them in samples of star-forming disks may bias the measurements of disk

  18. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representative of many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database1 (SigDB) are proposed. The numerical data comprising a sample material's spectrum is validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum is qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to uncover local patterns and trends within the spectral data that are indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security2 (text encryption/decryption), biomedical3, and marketing4 applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high and low quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need of an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
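
    The spectral angle mapper comparison used above has a standard closed form: the angle between the test and reference spectra viewed as vectors. A minimal sketch (illustrative, not the SigDB implementation):

```python
import numpy as np

def spectral_angle(test, reference):
    """Spectral angle mapper (SAM): angle in radians between two spectra."""
    t = np.asarray(test, dtype=float)
    r = np.asarray(reference, dtype=float)
    cos_angle = np.dot(t, r) / (np.linalg.norm(t) * np.linalg.norm(r))
    # Clip to guard against round-off pushing |cos| slightly above 1.
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```

    Because SAM depends only on the direction of the spectrum vector, it is insensitive to overall illumination or gain scaling, which is why it suits cross-instrument comparisons.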

  19. An online handwritten signature verification system based on ...

    African Journals Online (AJOL)

    Administrateur

    online handwritten signature verification system. We model the handwritten signature by an analytical approach based on Empirical Mode Decomposition (EMD). The proposed system is provided with a training module and a signature database. The implemented evaluation protocol points out the interest of the adopted ...

  20. Review and Analysis of Cryptographic Schemes Implementing Threshold Signature

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-03-01

    Full Text Available This work is devoted to the study of threshold signature schemes. A systematization of threshold signature schemes was carried out; cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were investigated. Different methods of generation and verification of threshold signatures were explored, e.g. those used in mobile agents, Internet banking and e-currency. The significance of the work is determined by the reduction of the level of counterfeit electronic documents signed by a certain group of users.
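
    The Lagrange-interpolation construction underlying many of the surveyed schemes is Shamir secret sharing: the signing key is the constant term of a random polynomial of degree t − 1, and any t shares recover it by interpolation at zero. A toy sketch over a small prime field (illustration only; real threshold signatures work over large group orders and never reassemble the key in one place):

```python
import random

P = 2**31 - 1  # a small Mersenne prime; real schemes use much larger moduli

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
        shares.append((x, y))
    return shares

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # Modular inverse via Fermat's little theorem (P is prime).
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

    In an actual threshold signature each share holder contributes a partial signature and the Lagrange coefficients combine those contributions, so the key itself is never reconstructed.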

  1. Quantum signature scheme for known quantum messages

    International Nuclear Information System (INIS)

    Kim, Taewan; Lee, Hyang-Sook

    2015-01-01

    When we want to sign a quantum message that we create, we can use arbitrated quantum signature schemes, which can sign not only known but also unknown quantum messages. However, since arbitrated quantum signature schemes need the help of a trusted arbitrator in each verification of the signature, they are known to be inconvenient in practical use. If we consider only known quantum messages, as in the situation above, there can exist a quantum signature scheme with a more efficient structure. In this paper, we present a new quantum signature scheme for known quantum messages without the help of an arbitrator. Differing from arbitrated quantum signature schemes based on the quantum one-time pad with a symmetric key, our scheme is based on quantum public-key cryptosystems, so the validity of the signature can be verified by a receiver without the help of an arbitrator. Moreover, we show that our scheme provides the functions of quantum message integrity, user authentication and non-repudiation of origin, as in digital signature schemes. (paper)

  2. A Prototype of Mathematical Treatment of Pen Pressure Data for Signature Verification.

    Science.gov (United States)

    Li, Chi-Keung; Wong, Siu-Kay; Chim, Lai-Chu Joyce

    2018-01-01

    A prototype using simple mathematical treatment of the pen pressure data recorded by a digital pen movement recording device was derived. In this study, a total of 48 sets of signature and initial specimens were collected. Pearson's correlation coefficient was used to compare the data of the pen pressure patterns. From the 820 pair comparisons of the 48 sets of genuine signatures, a high degree of matching was found, in which 95.4% (782 pairs) and 80% (656 pairs) had rPA > 0.7 and rPA > 0.8, respectively. In the comparison of the 23 forged signatures with their corresponding control signatures, 20 of them (89.2% of pairs) had distinctly lower rPA values. The prototype could be used as a complementary technique to improve the objectivity of signature examination and also has good potential to be developed as a tool for automated signature identification. © 2017 American Academy of Forensic Sciences.
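
    The pair comparisons above rest on Pearson's correlation coefficient applied to aligned pen-pressure sequences. A minimal sketch, assuming the two sequences have already been resampled to equal length (the paper's exact preprocessing is not reproduced here):

```python
import numpy as np

def pressure_correlation(p1, p2):
    """Pearson r between two equal-length pen-pressure sequences."""
    a = np.asarray(p1, dtype=float)
    b = np.asarray(p2, dtype=float)
    a = a - a.mean()                     # center both sequences
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```

    A threshold such as rPA > 0.7, as used in the study for genuine pairs, then turns the score into a match/non-match decision.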

  3. Dosimetric verification of the dynamic intensity modulated radiotherapy (IMR) of 21 patients

    International Nuclear Information System (INIS)

    Tsai, J.-S.; Engler, Mark J.; Ling, Marilyn N.; Wu, Julian; Kramer, Bradley; Fagundes, Marcio; Dipetrillo, Thomas; Wazer, David E.

    1996-01-01

    Purpose: To verify the accuracy of conformal isodose distributions and absolute doses delivered with a dynamic IMR system. Methods and materials: 21 patients with advanced or recurrent disease were treated with a dynamic IMR system; 13 were immobilized with head screws and 8 with non-invasive plastic masks. The system included immobilization techniques, computerized tomography (CT), a dynamic pencil beam multileaf collimator (MLC), a collimator controller computer, collimator safety interlocks, a simulated annealing optimization implemented on a dedicated quad-processing computer system, phantoms embedded with dosemeters, patient setup and dose delivery techniques, in vivo dose verification, and a comprehensive quality assurance program. The collimator consisted of a 2 x 20 array of tungsten leaves, each programmable to be either fully open or shut, thus offering 2⁴⁰ beam patterns with cross-sectional areas of up to 4 x 20 cm at the linear accelerator (linac) gantry rotational axis. Any of these patterns could be changed dynamically per degree of gantry rotation. An anthropomorphic phantom composed of transverse anatomic slabs helped simulate patient geometry relative to immobilization devices, fiducial markers, CT and treatment room lasers, and the linac rotational axis. Before each treatment regimen, the compliance of measured to planned doses was tested in phantom irradiations using each patient's fiducial markers, immobilization system, anatomic positioning, and collimator sequencing. Films and thermoluminescent dosemeters (TLD) were embedded in the phantom to measure absolute doses and dose distributions. Because the planner did not account for variable electron density distributions in head and neck targets, the air cavities of the anthropomorphic phantom were filled with tissue-equivalent bolus. Optical density distributions of films exposed to the dynamic IMR of each patient were obtained with a Hurter-Driffield calibration curve based on films

  4. The Importance of Hydrological Signature and Its Recurring Dynamics

    Science.gov (United States)

    Wendi, D.; Marwan, N.; Merz, B.

    2017-12-01

    Temporal changes in hydrology are known to be challenging to detect and attribute due to multiple drivers that include complex processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defense, river training, and land use change, can act variably across space-time scales and influence or mask each other. Moreover, data depicting these drivers are often unavailable. One conventional approach analyzes change at discrete points of magnitude (e.g. the frequency of recurring extreme discharge), often quantified linearly, and hence does not reveal potential change in the hydrological process itself. Moreover, discharge series are often subject to measurement errors, such as rating curve error, especially in the case of flood peaks, where observations are derived through extrapolation. In this study, the system dynamics inferred from the hydrological signature (i.e. the shape of the hydrograph) is emphasized. One example is to see whether certain flood dynamics (instead of flood peaks) in recent years had also occurred in the past (or whether they are extraordinary), and if so what their recurrence rate is and whether there has been a shift in their occurrence in time or seasonality (e.g. earlier snowmelt-dominant floods). The utilization of the hydrological signature here extends beyond the classical hydrological indices such as base flow index, recession and rising limb slope, and time to peak: it is in fact all these characteristics combined, i.e. from the start until the end of the hydrograph. A recurrence plot is used as a method to quantify and visualize the recurring hydrological signature through its phase-space trajectories, usually with embedding dimension above 2. Such phase-space trajectories are constructed by embedding the time series into a set of delayed copies of itself (the embedding dimension) separated by a time delay. Since the method is rather novel in
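
    The time-delay embedding and recurrence plot described above can be sketched in a few lines of numpy; the parameter choices below are illustrative, not those of the study:

```python
import numpy as np

def delay_embed(x, dim, delay):
    """Reconstruct phase-space states from a scalar series by time delays."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

def recurrence_plot(x, dim=3, delay=2, eps=0.1):
    """Binary matrix: True where two embedded states lie within eps."""
    states = delay_embed(np.asarray(x, dtype=float), dim, delay)
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    return dists < eps
```

    Diagonal line structures in the resulting matrix indicate stretches of the hydrograph whose dynamics recur, which is the feature exploited for comparing flood events.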

  5. Signatures of discrete breathers in coherent state quantum dynamics

    International Nuclear Information System (INIS)

    Igumenshchev, Kirill; Ovchinnikov, Misha; Prezhdo, Oleg; Maniadis, Panagiotis

    2013-01-01

    In classical mechanics, discrete breathers (DBs) – a spatial time-periodic localization of energy – are predicted in a large variety of nonlinear systems. Motivated by a conceptual bridging of the DB phenomena in classical and quantum mechanical representations, we study their signatures in the dynamics of a quantum equivalent of a classical mechanical point in phase space – a coherent state. In contrast to the classical point, which exhibits either delocalized or localized motion, the coherent state shows signatures of both localized and delocalized behavior. The transition from normal to local modes has different characteristics in the quantum and classical perspectives. Here, we gain insight into the connection between classical and quantum perspectives by analyzing the decomposition of the coherent state into the system's eigenstates and analyzing the spatial distribution of the wave-function density within these eigenstates. We find that the delocalized and localized eigenvalue components of the coherent state are separated by a mixed region, where both kinds of behavior can be observed. Further analysis leads to the following observations. Considered as a function of coupling, energy eigenstates go through avoided crossings between tunneling and non-tunneling modes. The dominance of tunneling modes in the high-nonlinearity region is compromised by the appearance of new types of modes – high-order tunneling modes – that are similar to the tunneling modes but have attributes of non-tunneling modes. Certain types of excitations preferentially excite higher-order tunneling modes, allowing one to study their properties. Since auto-correlation functions decrease quickly in highly nonlinear systems, short-time dynamics are sufficient for modeling quantum DBs. This work provides a foundation for implementing modern semi-classical methods to model quantum DBs, bridging classical and quantum mechanical signatures of DBs, and understanding spectroscopic experiments

  6. Towards the Verification of Safety-critical Autonomous Systems in Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Adina Aniculaesei

    2016-12-01

    Full Text Available There is an increasing necessity to deploy autonomous systems in highly heterogeneous, dynamic environments, e.g. service robots in hospitals or autonomous cars on highways. Due to the uncertainty in these environments, the verification results obtained with respect to the system and environment models at design time might not be transferable to the system behavior at run time. For autonomous systems operating in dynamic environments, safety of motion and collision avoidance are critical requirements. With regard to these requirements, Macek et al. [6] define the passive safety property, which requires that no collision can occur while the autonomous system is moving. To verify this property, we adopt a two-phase process which combines static verification methods, used at design time, with dynamic ones, used at run time. In the design phase, we exploit UPPAAL to formalize the autonomous system and its environment as timed automata and the safety property as a TCTL formula, and to verify the correctness of these models with respect to this property. For the run-time phase, we build a monitor to check whether the assumptions made at design time also hold at run time. If the current system observations of the environment do not correspond to the initial system assumptions, the monitor sends feedback to the system and the system enters a passive safe state.
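
    The run-time phase can be pictured as a monitor that checks each observation of the environment against the design-time assumptions and requests the passive safe state on a violation. A schematic sketch with illustrative assumption names (not taken from the paper):

```python
class AssumptionMonitor:
    """Checks run-time observations against design-time model assumptions."""

    def __init__(self, max_obstacle_speed, min_sensor_range):
        # Bounds assumed by the design-time (e.g. UPPAAL) verification.
        self.max_obstacle_speed = max_obstacle_speed
        self.min_sensor_range = min_sensor_range
        self.safe_state_requested = False

    def observe(self, obstacle_speed, sensor_range):
        """Return True while the design-time assumptions still hold."""
        ok = (obstacle_speed <= self.max_obstacle_speed
              and sensor_range >= self.min_sensor_range)
        if not ok:
            # Feedback to the system: enter the passive safe state.
            self.safe_state_requested = True
        return ok
```

    The design-time proof then covers every run in which the monitor never fires, while a firing monitor forces the system into the state the proof designates as safe.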

  7. Modeling the lexical morphology of Western handwritten signatures.

    Directory of Open Access Journals (Sweden)

    Moises Diaz-Cabrera

    Full Text Available A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a Generalized Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwritten signatures.
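
    The Generalized Extreme Value model has a closed-form CDF, with the Gumbel case arising as the ξ → 0 limit. A small sketch evaluating it in pure numpy (fitting to data would normally use a library routine such as scipy.stats.genextreme.fit; the parameterization below is the standard one, not necessarily the paper's):

```python
import numpy as np

def gev_cdf(x, mu=0.0, sigma=1.0, xi=0.0):
    """CDF of the Generalized Extreme Value distribution.

    xi > 0 gives a Frechet-type tail, xi < 0 a Weibull-type tail,
    and xi = 0 the Gumbel limit.
    """
    z = (np.asarray(x, dtype=float) - mu) / sigma
    if xi == 0.0:
        return np.exp(-np.exp(-z))       # Gumbel limit
    t = 1.0 + xi * z
    t = np.where(t > 0, t, np.nan)       # outside the support
    return np.exp(-t ** (-1.0 / xi))
```
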

  8. Possibilities of dynamic biometrics for authentication and the circumstances for using dynamic biometric signature

    Directory of Open Access Journals (Sweden)

    Frantisek Hortai

    2018-01-01

    Full Text Available New information technologies bring new dangers alongside their benefits. It is difficult to decide which authentication tool to use and implement in information systems and electronic documents. The final decision has to compromise among several conflicting requirements: a highly secure tool, a user-friendly and simple method, protection against user errors and failures, fast authentication, and a reasonable price. Even when a compromise solution is found, it has to fulfill the given technology standards. For these reasons the paper argues for one of the most natural biometric authentication methods, the dynamic biometric signature, and lists its related standards. The paper also includes a measurement evaluation which examines the independence of a person's signature from the device on which it was created.

  9. Continuous-variable quantum homomorphic signature

    Science.gov (United States)

    Li, Ke; Shang, Tao; Liu, Jian-wei

    2017-10-01

    Quantum cryptography is believed to be unconditionally secure because its security is ensured by physical laws rather than computational complexity. According to their spectral characteristics, quantum information carriers can be classified into two categories, namely discrete variables and continuous variables. Continuous-variable quantum protocols have gained much attention for their ability to transmit more information at lower cost. To verify the identities of different data sources in a quantum network, we propose a continuous-variable quantum homomorphic signature scheme. It is based on continuous-variable entanglement swapping and provides additive and subtractive homomorphism. Security analysis shows the proposed scheme is secure against replay, forgery and repudiation. Even under nonideal conditions, it supports effective verification within a certain verification threshold.

  10. Secure Hashing of Dynamic Hand Signatures Using Wavelet-Fourier Compression with BioPhasor Mixing and Discretization

    Directory of Open Access Journals (Sweden)

    Wai Kuan Yip

    2007-01-01

    Full Text Available We introduce a novel method for secure computation of a biometric hash on dynamic hand signatures using BioPhasor mixing and discretization. The use of BioPhasor as the mixing process provides a one-way transformation that precludes exact recovery of the biometric vector from compromised hashes and stolen tokens. In addition, our user-specific discretization acts both as an error-correction step and as a real-to-binary space converter. We also propose a new method of extracting a compressed representation of dynamic hand signatures using the discrete wavelet transform (DWT) and discrete Fourier transform (DFT). By avoiding the conventional use of dynamic time warping, the proposed method avoids storage of the user's hand signature template. This is an important consideration for protecting the privacy of the biometric owner. Our results show that the proposed method produces stable and distinguishable bit strings, with equal error rates (EERs) for random and skilled forgeries under the stolen-token (worst-case) scenario and for both forgeries under the genuine-token (optimal) scenario.
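
    The compressed representation combines a wavelet stage with a Fourier stage. A toy sketch using a one-level Haar DWT followed by truncation of the DFT magnitude spectrum (the paper's actual wavelet family, decomposition depth, and coefficient counts are not specified here):

```python
import numpy as np

def haar_approx(x):
    """One-level Haar DWT: keep the approximation (low-pass) coefficients."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:                       # pad to even length
        x = np.append(x, x[-1])
    return (x[0::2] + x[1::2]) / np.sqrt(2)

def compress(signal, n_coeffs=8):
    """Haar low-pass, then keep the first n_coeffs DFT magnitude bins."""
    approx = haar_approx(signal)
    spectrum = np.fft.rfft(approx)
    return np.abs(spectrum[:n_coeffs])   # magnitudes only, phase discarded
```

    Discarding phase and high-frequency bins yields a short, relatively stable feature vector, which is the kind of input the BioPhasor mixing and discretization stages then operate on.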

  11. A Quantum Multi-Proxy Weak Blind Signature Scheme Based on Entanglement Swapping

    Science.gov (United States)

    Yan, LiLi; Chang, Yan; Zhang, ShiBin; Han, GuiHua; Sheng, ZhiWei

    2017-02-01

    In this paper, we present a multi-proxy weak blind signature scheme based on quantum entanglement swapping of Bell states. In the scheme, proxy signers can complete the signature on behalf of the original signer with his/her authority. It can be applied to electronic voting systems, electronic payment systems, etc. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. It guarantees not only unconditional security but also the anonymity of the message owner. The security analysis shows the scheme satisfies the security features of a multi-proxy weak blind signature: signers cannot disavow their signatures, the signature cannot be forged by others, and the message owner can be traced.

  12. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    Science.gov (United States)

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
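
    The gamma criteria quoted above (e.g. 3%, 3 mm) combine a dose-difference tolerance with a distance-to-agreement tolerance: a measured point passes if some nearby predicted point agrees within both. A simplified 1D global-gamma sketch conveys the idea (clinical implementations are 2D/3D and interpolate between sample points):

```python
import numpy as np

def gamma_pass_rate(measured, predicted, spacing=1.0,
                    dose_tol=0.03, dist_tol=3.0):
    """Simplified 1D global gamma analysis (e.g. 3%, 3 mm criteria).

    `spacing` is the sample spacing in mm; the dose criterion is taken
    relative to the global maximum of the predicted profile.
    """
    meas = np.asarray(measured, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    x = np.arange(len(pred)) * spacing
    d_norm = dose_tol * pred.max()       # global dose criterion
    passed = 0
    for xi, di in zip(np.arange(len(meas)) * spacing, meas):
        gamma_sq = ((di - pred) / d_norm) ** 2 + ((xi - x) / dist_tol) ** 2
        if gamma_sq.min() <= 1.0:        # gamma <= 1 means the point passes
            passed += 1
    return passed / len(meas)
```

    A pass rate such as the 86.6% reported above is then simply the fraction of measured points with gamma at most 1.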

  13. Constrained structural dynamic model verification using free vehicle suspension testing methods

    Science.gov (United States)

    Blair, Mark A.; Vadlamudi, Nagarjuna

    1988-01-01

    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads, such as the 12-ton Hubble Space Telescope. The development of equations in the paper will show that by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced that will have minima that are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.

  14. Dynamic Gesture Recognition with a Terahertz Radar Based on Range Profile Sequences and Doppler Signatures.

    Science.gov (United States)

    Zhou, Zhi; Cao, Zongjie; Pi, Yiming

    2017-12-21

    The frequency of terahertz radar ranges from 0.1 THz to 10 THz, which is higher than that of microwaves. Multi-modal signals, including high-resolution range profile (HRRP) and Doppler signatures, can be acquired by a terahertz radar system. These two kinds of information are commonly used in automatic target recognition; however, dynamic gesture recognition is rarely discussed in the terahertz regime. In this paper, a dynamic gesture recognition system using a terahertz radar is proposed, based on multi-modal signals. The HRRP sequences and Doppler signatures are first obtained from the radar echoes. Considering the electromagnetic scattering characteristics, a feature extraction model is designed using location parameter estimation of scattering centers. Dynamic Time Warping (DTW), extended to multi-modal signals, is used to accomplish the classification. Ten types of gesture signals, collected from a terahertz radar, are applied to validate the analysis and the recognition system. The results of the experiment indicate that the recognition rate reaches more than 91%. This research verifies the potential of dynamic gesture recognition using a terahertz radar.
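
    Dynamic Time Warping, which the paper extends to multi-modal signals, reduces in the single-sequence case to a classic dynamic program over a cumulative cost matrix; a minimal sketch:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic DTW distance between two 1D sequences of any lengths."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]
```

    Because the warping path may stretch or compress time, gestures performed at different speeds still compare well; a nearest-neighbor classifier over DTW distances is a common baseline for this kind of recognition task.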

  15. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Energy Technology Data Exchange (ETDEWEB)

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  16. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    International Nuclear Information System (INIS)

    Fuangrod, Todsaporn; Woodruff, Henry C.; O’Connor, Daryl J.; Uytven, Eric van; McCurdy, Boyd M. C.; Kuncic, Zdenka; Greer, Peter B.

    2013-01-01

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
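The frame-by-frame comparison above relies on gamma analysis (3%, 3 mm and 4%, 4 mm criteria). The following is a minimal 1D global-gamma sketch to illustrate what a gamma pass rate measures; the function name, flat test profiles, and global normalisation to the reference maximum are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def gamma_pass_rate(ref, meas, positions, dose_tol=0.03, dist_tol=3.0):
    """1D global gamma analysis: percentage of points with gamma <= 1.
    dose_tol is fractional (e.g. 3% of the global max), dist_tol in mm."""
    ref = np.asarray(ref, float)
    meas = np.asarray(meas, float)
    pos = np.asarray(positions, float)
    dose_norm = dose_tol * ref.max()  # global dose normalisation
    gammas = []
    for i, x in enumerate(pos):
        dd = (meas[i] - ref) / dose_norm  # dose-difference term vs. every reference point
        dx = (x - pos) / dist_tol         # distance-to-agreement term
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return 100.0 * np.mean(np.array(gammas) <= 1.0)
```

A measured frame identical to the prediction passes at 100%; a uniform 50% dose error fails every point under a 3%, 3 mm criterion.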

  17. Dynamic simulator for nuclear power plants (DSNP): development, verification, and expansion of modules

    International Nuclear Information System (INIS)

    Larson, H.A.; Dean, E.M.; Koenig, J.F.; Gale, J.G.; Lehto, W.K.

    1984-01-01

    The DSNP Simulation Language facilitates whole reactor plant simulation and design. Verification includes DSNP dynamic modeling of Experimental Breeder Reactor No. 2 (EBR-II) plant experiments as well as comparisons with verified simulation programs. Great flexibility is allowed in expanding the DSNP language and in accommodating other computer languages. The component modules of DSNP, contained in libraries, are continually updated with new, improved, and verified modules. The modules are used to simulate the dynamic response of LMFBR reactor systems to upset and transient conditions, with special emphasis on investigations of inherent shutdown mechanisms.

  18. Streaming-based verification of XML signatures in SOAP messages

    DEFF Research Database (Denmark)

    Somorovsky, Juraj; Jensen, Meiko; Schwenk, Jörg

    2010-01-01

    approach for XML processing, the Web Services servers easily become a target of Denial-of-Service attacks. We present a solution for these problems: an external streaming-based WS-Security Gateway. Our implementation is capable of processing XML Signatures in SOAP messages using a streaming-based approach...

  19. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  20. Approaches to determining the reliability of a multimodal three-dimensional dynamic signature

    Directory of Open Access Journals (Sweden)

    Yury E. Kozlov

    2018-03-01

    Full Text Available The market of modern mobile applications has increasingly strict requirements for the authentication system reliability. This article examines an authentication method using a multimodal three-dimensional dynamic signature (MTDS), which can be used both as a primary and as an additional method of user authentication in mobile applications. It is based on the use of an in-air gesture, performed with two independent mobile devices, as an identifier. The MTDS method has certain advantages over currently used biometric methods, including fingerprint authentication, face recognition and voice recognition. A multimodal three-dimensional dynamic signature allows quickly changing an authentication gesture, as well as concealing the authentication procedure by using gestures that do not attract attention. Despite all its advantages, the MTDS method has certain limitations, the main one being the functionally dynamic complex (FDC) skills required for accurately repeating an authentication gesture. To correctly create an MTDS, one needs a system for assessing the reliability of gestures. Approaches to the solution of this task are grouped in this article according to the methods of their implementation. Two of the approaches can be implemented only with the use of a server as a centralized MTDS processing center, and one can be implemented using the smartphone's own computing resources. The final part of the article provides data from testing one of these methods on a template performing MTDS authentication.
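The reliability assessment discussed above amounts to measuring how consistently a user repeats the same in-air gesture. A minimal sketch under stated assumptions (motion traces as 3-axis samples, linear resampling to a fixed length, mean pairwise RMS distance as the consistency score; none of these choices come from the article):

```python
import numpy as np

def resample(trace, n=64):
    """Linearly resample a (len x 3) motion trace to n samples per axis."""
    trace = np.asarray(trace, float)
    t_old = np.linspace(0.0, 1.0, len(trace))
    t_new = np.linspace(0.0, 1.0, n)
    return np.column_stack(
        [np.interp(t_new, t_old, trace[:, k]) for k in range(trace.shape[1])]
    )

def gesture_reliability(samples, n=64):
    """Mean pairwise RMS distance among enrollment samples of one gesture;
    lower values mean the user repeats the gesture more consistently."""
    rs = [resample(s, n) for s in samples]
    dists = [np.sqrt(np.mean((rs[i] - rs[j]) ** 2))
             for i in range(len(rs)) for j in range(i + 1, len(rs))]
    return float(np.mean(dists))
```

A gesture whose enrollment repetitions score below some empirically chosen threshold would be accepted as reliable enough to serve as an authentication template.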

  1. Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation

    Science.gov (United States)

    Shi, Ronghua; Ding, Wanting; Shi, Jinjing

    2018-03-01

    A novel arbitrated quantum signature (AQS) scheme is proposed motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithms are designed based on HA, which enables the scheme to rely less on computational complexity. It is unnecessary to recover original messages when verifying signatures since blind quantum computation is applied, which can improve the simplicity and operability of our scheme. It is proved that the scheme can be deployed securely, and the extended AQS has some extensive applications in E-payment systems, E-government, E-business, etc.

  2. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, SJ.

    2005-01-01

    Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. This paper describes the development of an independent MU program, concentrating on the implementation of the Enhanced Dynamic Wedge (EDW) component. The difficult case of non centre of field (COF) calculation points under the EDW was studied in some detail. Results of a survey of Australasian centres regarding the use of independent MU check systems are also presented. The system was developed with reference to MU calculations made by the Pinnacle 3D Radiotherapy Treatment Planning (RTP) system (ADAC - Philips) for 4MV, 6MV and 18MV X-ray beams used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. A small systematic error was detected in the equation used for the EDW calculations. Results indicate that COF equations may be used in the non COF situation with similar accuracy to that achieved with profile corrected methods. Further collaborative work with other centres is planned to extend these findings.
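An independent MU check of the kind described above typically recomputes the monitor units from the prescribed point dose and a chain of dosimetric factors, then compares against the planning system. The sketch below uses the standard point-dose form MU = D / (DR_ref × output factor × TPR × wedge factor); the function names, default values, and 3% tolerance are illustrative assumptions, not the paper's actual formalism:

```python
def independent_mu(dose_cGy, dose_rate_cGy_per_MU=1.0, output_factor=1.0,
                   tpr=1.0, wedge_factor=1.0):
    """Independent point-dose MU estimate:
    MU = D / (reference dose rate * output factor * TPR * wedge factor)."""
    return dose_cGy / (dose_rate_cGy_per_MU * output_factor * tpr * wedge_factor)

def within_tolerance(mu_check, mu_tps, tol=0.03):
    """Flag agreement between the independent check and the TPS value."""
    return abs(mu_check - mu_tps) / mu_tps <= tol
```

For an EDW field, the wedge factor would itself depend on the calculation point's off-axis position, which is precisely the non-COF complication the paper investigates.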

  3. Independent tube verification and dynamic tracking in et inspection of nuclear steam generator

    International Nuclear Information System (INIS)

    Xiongzi, Li; Zhongxue, Gan; Lance, Fitzgibbons

    2001-01-01

    The full text follows. In the examination of pressure boundary tubes in steam generators of commercial pressurized water nuclear power plants (PWR's), it is critical to know exactly which particular tube is being accessed. There are no definitive landmarks or markings on the individual tubes. Today this is done manually; it is tedious, interrupts the normal inspection work, and is difficult due to the presence of water on the tube surface, plug ends instead of tube openings in the field of view, and varying lighting quality. In order to eliminate the human error and increase the efficiency of operation, there is a need to identify tube position during the inspection process, independent of robot encoder position and motion. A process based on a Cognex MVS-8200 system and its application function package has been developed to independently identify tube locations. ABB Combustion Engineering Nuclear Power's Outage Services group, USPPL, in collaboration with ABB Power Plant Laboratories' Advanced Computers and Controls department, has developed a new vision-based Independent Tube Verification system (GENESIS-ITVS-TM ). The system employs a model-based tube-shape detection algorithm and dynamic tracking methodology to detect the true tool position and its offsets from identified tube location. GENESIS-ITVS-TM is an automatic Independent Tube Verification System (ITVS). Independent tube verification is a tube validation technique using computer vision, and not using any robot position parameters. This process independently counts the tubes in the horizontal and vertical axes of the plane of the steam generator tube sheet as the work tool is moved. Thus it knows the true position in the steam generator, given a known starting point. This is analogous to the operator's method of counting tubes for verification, but it is automated. GENESIS-ITVS-TM works independently of the robot position, velocity, or acceleration. The tube position information is solely obtained from

  4. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive-by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine the expected reaction rates, using transport theory, in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection to evaluate moving source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)

  5. Electronic health records: what does your signature signify?

    Directory of Open Access Journals (Sweden)

    Victoroff MD Michael S

    2012-08-01

    Full Text Available Electronic health records serve multiple purposes, including clinical communication, legal documentation, financial transaction capture, research and analytics. Electronic signatures attached to entries in EHRs have different logical and legal meanings for different users. Some of these are vestiges from historic paper formats that require reconsideration. Traditionally accepted functions of signatures, such as identity verification, attestation, consent, authorization and non-repudiation can become ambiguous in the context of computer-assisted workflow processes that incorporate functions like logins, auto-fill and audit trails. This article exposes the incompatibility of expectations among typical users of electronically signed information.

  6. Single Molecule Cluster Analysis Identifies Signature Dynamic Conformations along the Splicing Pathway

    Science.gov (United States)

    Blanco, Mario R.; Martin, Joshua S.; Kahlscheuer, Matthew L.; Krishnan, Ramya; Abelson, John; Laederach, Alain; Walter, Nils G.

    2016-01-01

    The spliceosome is the dynamic RNA-protein machine responsible for faithfully splicing introns from precursor messenger RNAs (pre-mRNAs). Many of the dynamic processes required for the proper assembly, catalytic activation, and disassembly of the spliceosome as it acts on its pre-mRNA substrate remain poorly understood, a challenge that persists for many biomolecular machines. Here, we developed a fluorescence-based Single Molecule Cluster Analysis (SiMCAn) tool to dissect the manifold conformational dynamics of a pre-mRNA through the splicing cycle. By clustering common dynamic behaviors derived from selectively blocked splicing reactions, SiMCAn was able to identify signature conformations and dynamic behaviors of multiple ATP-dependent intermediates. In addition, it identified a conformation adopted late in splicing by a 3′ splice site mutant, invoking a mechanism for substrate proofreading. SiMCAn presents a novel framework for interpreting complex single molecule behaviors that should prove widely useful for the comprehensive analysis of a plethora of dynamic cellular machines. PMID:26414013
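SiMCAn groups single molecules by their common dynamic behaviors. As a stand-in illustration of that clustering step (not SiMCAn's actual algorithm, which is described in the paper), the sketch below applies a minimal k-means to per-molecule feature vectors, e.g. the fraction of time each molecule spends in each FRET state; the feature encoding and k are assumptions:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means for clustering per-molecule dynamic feature vectors
    (rows of X, e.g. fractional occupancy of each FRET state)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]  # random initial centers
    for _ in range(iters):
        # assign each molecule to its nearest center
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # recompute each center as the mean of its assigned molecules
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(0)
    return labels, centers
```

Molecules landing in the same cluster would then be interpreted as sharing a signature conformation or dynamic behavior.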

  7. The dynamic flowgraph methodology as a safety analysis tool : programmable electronic system design and verification

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2002-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM, and

  8. Quantum Digital Signatures for Unconditional Safe Authenticity Protection of Medical Documentation

    Directory of Open Access Journals (Sweden)

    Arkadiusz Liber

    2015-12-01

    Full Text Available Modern medical documentation appears most often in an online form which requires some digital methods to ensure its confidentiality, integrity and authenticity. The document authenticity may be secured with the use of a signature. A classical handwritten signature is directly related to its owner by his/her psychomotor character traits. Such a signature is also connected with the material it is written on, and a writing tool. Because of these properties, a handwritten signature reflects certain close material bonds between the owner and the document. In case of modern digital signatures, the document authentication has a mathematical nature. The verification of the authenticity becomes the verification of a key instead of a human. Since 1994 it has been known that classical digital signature algorithms may not be safe because of Shor’s factorization algorithm. To implement the modern authenticity protection of medical data, some new types of algorithms should be used. One of the groups of such algorithms is based on quantum computations. In this paper, the analysis of the current knowledge status of Quantum Digital Signature protocols, with their basic principles, phases and common elements such as transmission, comparison and encryption, was outlined. Some of the most promising protocols for signing digital medical documentation, that fulfill the requirements for QDS, were also briefly described. We showed that a QDS protocol with QKD components requires, for its implementation, equipment similar to that used for QKD, which is already commercially available. If it is properly implemented, it provides the shortest lifetime of qubits in comparison to other protocols. It can be used not only to sign classical messages, but it could probably also be adapted to implement unconditionally safe protection of medical documentation in the near future.

  9. Massive Black Hole Binaries: Dynamical Evolution and Observational Signatures

    Directory of Open Access Journals (Sweden)

    M. Dotti

    2012-01-01

    Full Text Available The study of the dynamical evolution of massive black hole pairs in mergers is crucial in the context of a hierarchical galaxy formation scenario. The timescales for the formation and the coalescence of black hole binaries are still poorly constrained, resulting in large uncertainties in the expected rate of massive black hole binaries detectable in the electromagnetic and gravitational wave spectra. Here, we review the current theoretical understanding of the black hole pairing in galaxy mergers, with a particular attention to recent developments and open issues. We conclude with a review of the expected observational signatures of massive binaries and of the candidates discussed in literature to date.

  10. Research on verification and validation strategy of detonation fluid dynamics code of LAD2D

    Science.gov (United States)

    Wang, R. L.; Liang, X.; Liu, X. Z.

    2017-07-01

    The verification and validation (V&V) is an important approach in the software quality assurance of codes in complex engineering applications. A reasonable and efficient V&V strategy can achieve twice the result with half the effort. This article introduces the software LAD2D (Lagrangian adaptive hydrodynamics code in 2D space), a self-developed detonation CFD code with an elastic-plastic structure model. The V&V strategy of this detonation CFD code is presented based on the foundation of V&V methodology for scientific software. The basic framework of module verification and function validation is proposed, together composing the V&V strategy for the LAD2D detonation fluid dynamics model.

  11. Thin accretion disk signatures in dynamical Chern-Simons-modified gravity

    International Nuclear Information System (INIS)

    Harko, Tiberiu; Kovacs, Zoltan; Lobo, Francisco S N

    2010-01-01

    A promising extension of general relativity is Chern-Simons (CS)-modified gravity, in which the Einstein-Hilbert action is modified by adding a parity-violating CS term, which couples to gravity via a scalar field. In this work, we consider the interesting, yet relatively unexplored, dynamical formulation of CS-modified gravity, where the CS coupling field is treated as a dynamical field, endowed with its own stress-energy tensor and evolution equation. We consider the possibility of observationally testing dynamical CS-modified gravity by using the accretion disk properties around slowly rotating black holes. The energy flux, temperature distribution, the emission spectrum as well as the energy conversion efficiency are obtained, and compared to the standard general relativistic Kerr solution. It is shown that the Kerr black hole provides a more efficient engine for the transformation of the energy of the accreting mass into radiation than their slowly rotating counterparts in CS-modified gravity. Specific signatures appear in the electromagnetic spectrum, thus leading to the possibility of directly testing CS-modified gravity by using astrophysical observations of the emission spectra from accretion disks.

  12. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  13. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  14. Multimodal three-dimensional dynamic signature

    Directory of Open Access Journals (Sweden)

    Yury E. Kozlov

    2017-11-01

    Full Text Available Reliable authentication in mobile applications is among the most important information security challenges. Today, we can hardly imagine a person who would not own a mobile device that connects to the Internet. Mobile devices are being used to store large amounts of confidential information, ranging from personal photos to electronic banking tools. In 2009, colleagues from Rice University, together with their collaborators from Motorola, proposed authentication through in-air gestures. This and subsequent work contributing to the development of the method are reviewed in our introduction. At the moment, there exists a version of the gesture-based authentication software available for Android mobile devices. This software has not become widespread yet. One of the likely reasons is the insufficient reliability of the method, which, like its earlier analogs, involves the use of only one device. Here we discuss authentication based on the multimodal three-dimensional dynamic signature (MTDS) performed by two independent mobile devices. The MTDS-based authentication technique is an advanced version of in-air gesture authentication. We describe the operation of a prototype of MTDS-based authentication, including the main implemented algorithms, as well as some preliminary results of testing the software. We expect that our method can be used in any mobile application, provided a number of additional improvements discussed in the conclusion are made.

  15. Atomic-scale structural signature of dynamic heterogeneities in metallic liquids

    Science.gov (United States)

    Pasturel, Alain; Jakse, Noel

    2017-08-01

    With sufficiently high cooling rates, liquids will cross their equilibrium melting temperatures and can be maintained in a metastable undercooled state before solidifying. Studies of undercooled liquids reveal several intriguing dynamic phenomena, and because explicit connections between liquid structure and liquid dynamics are difficult to identify, it remains a major challenge to capture the underlying structural link to these phenomena. Ab initio molecular dynamics (AIMD) simulations are especially powerful in providing atomic-scale details otherwise not accessible in experiments. Through the AIMD-based study of Cr additions in Al-based liquids, we demonstrate for the first time a close relationship between the decoupling of component diffusion and the emergence of dynamic heterogeneities in the undercooling regime. In addition, we demonstrate that the origin of both phenomena is related to a structural heterogeneity caused by a strong interplay between chemical short-range order (CSRO) and local fivefold topology (ISRO) at the short-range scale in the liquid phase that develops into an icosahedral-based medium-range order (IMRO) upon undercooling. Finally, our findings reveal that this structural signature is also captured in the temperature dependence of partial pair-distribution functions, which opens up the route to more elaborate experimental studies.

  16. Network-based Arbitrated Quantum Signature Scheme with Graph State

    Science.gov (United States)

    Ma, Hongling; Li, Fei; Mao, Ningyi; Wang, Yijun; Guo, Ying

    2017-08-01

    Implementing an arbitrated quantum signature (AQS) through complex networks is an interesting cryptographic technology in the literature. In this paper, we propose an arbitrated quantum signature for multi-user-involved networks, whose topological structures are established by the encoded graph state. The determinative transmission of the shared keys is enabled by the appropriate stabilizers performed on the graph state. The implementation of this scheme depends on the deterministic distribution of the multi-user-shared graph state on which the encoded message can be processed in the signing and verifying phases. There are four parties involved: the signatory Alice, the verifier Bob, the arbitrator Trent, and the Dealer, who assists the legal participants in the signature generation and verification. The security is guaranteed by the entanglement of the encoded graph state, which is cooperatively prepared by legal participants in complex quantum networks.

  17. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  18. Verification of experimental modal modeling using HDR (Heissdampfreaktor) dynamic test data

    International Nuclear Information System (INIS)

    Srinivasan, M.G.; Kot, C.A.; Hsieh, B.J.

    1983-01-01

    Experimental modal modeling involves the determination of the modal parameters of the model of a structure from recorded input-output data from dynamic tests. Though commercial modal analysis algorithms are widely used in many industries, their ability to identify a set of reliable modal parameters of an as-built nuclear power plant structure has not been systematically verified. This paper describes the effort to verify MODAL-PLUS, a widely used modal analysis code, using recorded data from the dynamic tests performed on the reactor building of the Heissdampfreaktor, situated near Frankfurt, Federal Republic of Germany. In the series of dynamic tests on HDR in 1979, the reactor building was subjected to forced vibrations from different types and levels of dynamic excitations. Two sets of HDR containment building input-output data were chosen for MODAL-PLUS analyses. To reduce the influence of nonlinear behavior on the results, these sets were chosen so that the levels of excitation are relatively low and about the same in the two sets. The attempted verification was only partially successful, in that only one modal model, with a limited range of validity, could be synthesized, and in that the goodness of fit could be verified only in this limited range.
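The modal parameters extracted in such analyses are, at their simplest, a natural frequency and a damping ratio per mode. As a minimal illustration of parameter identification from test data (the classical half-power bandwidth method on a frequency response function magnitude, not MODAL-PLUS's actual algorithm):

```python
import numpy as np

def half_power_modal(freqs, mag):
    """Estimate natural frequency and damping ratio of the dominant mode
    from an FRF magnitude using the half-power (-3 dB) bandwidth method."""
    k = int(np.argmax(mag))
    fn, peak = freqs[k], mag[k]
    hp = peak / np.sqrt(2.0)          # half-power level
    lo = k
    while lo > 0 and mag[lo] > hp:    # walk left to the lower half-power point
        lo -= 1
    hi = k
    while hi < len(mag) - 1 and mag[hi] > hp:  # walk right to the upper point
        hi += 1
    zeta = (freqs[hi] - freqs[lo]) / (2.0 * fn)  # zeta ~ bandwidth / (2 fn)
    return fn, zeta
```

Applied to a single-degree-of-freedom FRF, the estimates recover the underlying frequency and damping to within the grid resolution, which is why the method serves as a quick sanity check on more elaborate curve-fitting codes.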

  19. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database.

    Science.gov (United States)

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data is acquired from 5 different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data is collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data was collected in two sessions for 65 subjects, and includes dynamic information of the signature, the full name and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing-tool. We have evaluated the proposed benchmark using the main existing approaches for signature verification: feature-based and time functions-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely: the importance of specific methods for dealing with device interoperability, and the necessity of a deeper analysis of signatures acquired using the finger as the writing tool. This e-BioSign public database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) work towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions.
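Time functions-based signature verification, one of the two approach families benchmarked above, commonly compares time series of pen coordinates (and, where available, pressure) with dynamic time warping. A minimal DTW sketch under stated assumptions (Euclidean local cost, mean distance to enrolled references, an externally chosen threshold; not the paper's exact system):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two multivariate sequences,
    each shaped (n_samples x n_features), with Euclidean local cost."""
    a, b = np.atleast_2d(a), np.atleast_2d(b)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def verify(sig, references, threshold):
    """Accept if the mean DTW distance to the enrolled references is
    below a threshold (tuned on development data in practice)."""
    score = np.mean([dtw_distance(sig, r) for r in references])
    return score <= threshold
```

In an interoperability study like e-BioSign, the interesting question is how such a score distribution shifts when enrollment and test signatures come from different devices or writing tools.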

  20. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    for states that have traditionally had 'less transparency' in their military sectors. As case studies, we first investigate how verification measures including remote sensing, off-site environmental sampling and on-site inspections could be applied to monitor the shutdown status of plutonium production facilities, and what measures could be taken to prevent the disclosure of sensitive information at the site. We find the most effective verification measure to monitor the status of the reprocessing plant would be on-site environmental sampling. Some countries may worry that sample analysis could disclose sensitive information about their past plutonium production activities. However, we find that sample analysis at the reprocessing site need not reveal such information, as long as inspectors are not able to measure total quantities of Cs-137 and Sr-90 from HLW produced at former military plutonium production facilities. Secondly, we consider verification measures for shutdown gaseous diffusion uranium-enrichment plants (GDPs). The GDPs could be monitored effectively by satellite imagery, as one telltale operational signature of a GDP would be the water-vapor plume from the cooling tower, which should be easy to detect in satellite images. Furthermore, the hot roof of the enrichment building could be detectable using satellite thermal-infrared images. In addition, some on-site verification measures should be allowed, such as visual observation, surveillance and tamper-indicating seals. Finally, an FMCT verification regime would have to be designed to detect undeclared fissile material production activities and facilities. These verification measures could include something like special or challenge inspections or complementary access. There would need to be provisions to prevent the abuse of such inspections, especially at sensitive and non-proscribed military and nuclear activities. In particular, to protect sensitive

  1. Implementation of QR Code and Digital Signature to Determine the Validity of KRS and KHS Documents

    Directory of Open Access Journals (Sweden)

    Fatich Fazlur Rochman

    2017-05-01

    Full Text Available Universitas Airlangga students often find it difficult to verify the marks reported in the Kartu Hasil Studi (KHS, Study Result Card) or the courses taken in the Kartu Rencana Studi (KRS, Study Plan Card) if there are changes to the data in the system used by Universitas Airlangga. This complicated KRS and KHS verification process arises because the KRS and KHS documents held by students are easier to counterfeit than the data in the system. This work implements digital signature and QR Code technology as a solution that can prove the validity of a KRS or KHS. The KRS and KHS validation system was developed with a Digital Signature and a QR Code. A QR Code is a type of matrix code developed to allow its contents to be decoded at high speed, while a Digital Signature functions as a marker on the data to ensure that the data are the original data. The verification process is divided into two types: reading the Digital Signature, and checking the printed document by scanning the data from the QR Code. Applying the system requires adding the QR Code to the KRS and KHS, as well as a readiness of human resources.
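
    The sign-then-encode flow behind such a system can be sketched as follows. An HMAC stands in here for the asymmetric digital signature the paper implies, the base64 blob stands in for the QR Code payload, and the key and field names are hypothetical:

```python
import base64
import hashlib
import hmac
import json

# Sketch of a document-validation flow: the issuer signs the document
# data, and the signature travels with it (in practice rendered as a
# QR Code on the printed document). An HMAC stands in for a real
# asymmetric digital signature; all names/values are illustrative.
SECRET_KEY = b"university-signing-key"  # hypothetical issuer key

def sign_document(record: dict) -> str:
    payload = json.dumps(record, sort_keys=True).encode()  # canonical form
    sig = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    # The base64 payload plus signature is what the QR Code would carry.
    return base64.b64encode(payload).decode() + "." + sig

def verify_document(token: str) -> bool:
    encoded, sig = token.rsplit(".", 1)
    payload = base64.b64decode(encoded)
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

token = sign_document({"student": "X", "course": "CS101", "grade": "A"})
```

    Any tampering with the encoded payload invalidates the signature, which is what makes the printed document harder to counterfeit than before.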

  2. The SCEC/USGS dynamic earthquake rupture code verification exercise

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed—a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished. Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but hereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches. The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events. To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous

  3. Privacy in wireless sensor networks using ring signature

    Directory of Open Access Journals (Sweden)

    Ashmita Debnath

    2014-07-01

    Full Text Available The veracity of a message from a sensor node must be verified in order to avoid a false reaction by the sink. This verification requires the authentication of the source node. The authentication process must also preserve the privacy such that the node and the sensed object are not endangered. In this work, a ring signature was proposed to authenticate the source node while preserving its spatial privacy. However, other nodes as signers and their numbers must be chosen to preclude the possibility of a traffic analysis attack by an adversary. The spatial uncertainty increases with the number of signers but requires larger memory size and communication overhead. This requirement can breach the privacy of the sensed object. To determine the effectiveness of the proposed scheme, the location estimate of a sensor node by an adversary and enhancement in the location uncertainty with a ring signature was evaluated. Using simulation studies, the ring signature was estimated to require approximately four members from the same neighbor region of the source node to sustain the privacy of the node. Furthermore, the ring signature was also determined to have a small overhead and not to adversely affect the performance of the sensor network.

  4. Characterizing the anthropogenic signature in the LCLU dynamics in the Central Asia region

    Science.gov (United States)

    Tatarskii, V.; Sokolik, I. N.; de Beurs, K.; Shiklomanov, A. I.

    2017-12-01

    Humans have been changing LCLU dynamics over time throughout the world. In the Central Asia region, these changes have been especially pronounced due to the political and economic transformation. We present a detailed analysis focused on identifying and quantifying the anthropogenic signature in water and land use across the region. We have characterized anthropogenic dust emission by combining modeling and observations. The model is a fully coupled model called WRF-Chem-DuMo that explicitly takes the vegetation treatment into account in modeling dust emission. We have reconstructed the anthropogenic dust sources in the region, such as the retreat of the Aral Sea, changes in agricultural fields, etc. In addition, we characterize the anthropogenic water-use dynamics, including changes in water use for agricultural production. Furthermore, we perform an analysis to identify the anthropogenic signature in the NDVI pattern. The NDVI data were analyzed in conjunction with meteorological fields simulated at high spatial resolution using the WRF model. Meteorological fields of precipitation and temperature were used in a correlation analysis to separate natural from anthropogenic changes. In this manner, we were able to identify the regions that have been affected by human activities. We will present a quantitative assessment of the anthropogenic changes, and the diverse consequences for the economy of the region, as well as the environment, will be addressed.

  5. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability and compatibility. The methods differ both in how they operate and in how they reach a result. The article describes static and dynamic methods of software verification, with particular attention to symbolic execution. In the review of static analysis, the deductive method and model-checking methods are discussed, and the pros and cons of each method are emphasized. A classification of testing techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependence analysis, as well as its variants, which can reduce the number of false positives in situations where the current program state combines two or more states obtained either along different execution paths or while working with multiple object values. Dependences connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), sizes of heap areas, lengths of strings, and the number of array elements initialized in the code verified by static methods. Attention is paid to identifying dependences within the framework of abstract interpretation, and an overview and analysis of inference tools is given. Dynamic analysis methods such as testing, monitoring and profiling are also presented and analyzed, together with the kinds of tools that can be applied when using them. The paper concludes by describing the most relevant problems of these analysis techniques and methods of their solution and
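
    As a concrete illustration of the static-analysis ideas surveyed here, a toy abstract interpretation over the interval domain can prove bounds on a variable by joining the states obtained along different execution paths. Everything below is an invented sketch, not code from the article:

```python
# Toy abstract interpretation over the interval domain, in the spirit of
# the static-analysis methods surveyed above. The analysed "program" and
# all values are illustrative inventions.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Interval addition: [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def join(self, other):
        # Least upper bound: merge states from two execution paths
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

# Abstract program: x is an input in [0, 10]; one branch computes
# y = x + 1, the other y = x + 5; the analyser joins both path states.
x = Interval(0, 10)
y = (x + Interval(1, 1)).join(x + Interval(5, 5))
# The analyser proves y always lies within [1, 15] without running the code.
```

    Joining path states this coarsely is exactly where the false positives discussed above come from: the join loses the correlation between the branch taken and the resulting value.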

  6. Hybrid Enrichment Verification Array: Module Characterization Studies

    Energy Technology Data Exchange (ETDEWEB)

    Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mace, Emily K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-03-01

    The work presented in this report is focused on the characterization and refinement of the Hybrid Enrichment Verification Array (HEVA) approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL, and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. The core of the HEVA methodology, the high-energy prompt gamma-ray signature, serves as an indirect method for the measurement of total neutron emission from the cylinder. A method for measuring the intrinsic efficiency of this “non-traditional” neutron signature and the results from a benchmark experiment are presented. Also discussed are potential perturbing effects on the non-traditional signature, including short-lived activation of materials in the HEVA module. Modeling and empirical results are presented to demonstrate that such effects are expected to be negligible for the envisioned implementation scenario. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  7. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, S.J.; University of Newcastle, NSW

    2004-01-01

    Full text: Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed, the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world; it is mandatory in the USA but not yet in Australia. The enhanced dynamic wedge factor (EDWF) presents some significant problems for accurate MU calculation, particularly for points away from the centre-of-field (COF) position. This paper describes the development of an independent MU program, concentrating on the implementation of the EDW component. The difficult case of non-COF points under the EDW was studied in detail, and a survey of Australasian centres regarding the use of independent MU check systems was conducted. The MUCalculator was developed with reference to MU calculations made by the Pinnacle 3D RTP system (Philips) for 4 MV, 6 MV and 18 MV X-ray beams from Varian machines used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. Ionisation chamber measurements in Solid Water™ and liquid water were performed based on a published test data set. Published algorithms combined with a depth-dependent profile correction were applied in an attempt to match measured data with maximum accuracy. The survey results are presented. Substantial data are presented in tabular form, with extensive comparison against published data. Several different methods for calculating the EDWF are examined. A small systematic error was detected in the Gibbon equation used for the EDW calculations. Generally, calculations were within ±2% of measured values, although some setups exceeded this variation. Results indicate that COF

  8. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)

  9. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  10. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  11. Optimal sensitometric curves of Kodak EDR2 film for dynamic intensity modulated radiation therapy verification.

    Science.gov (United States)

    Suriyapee, S; Pitaxtarnin, N; Oonsiri, S; Jumpangern, C; Israngkul Na Ayuthaya, I

    2008-01-01

    To investigate the optimal sensitometric curves of extended dose range (EDR2) radiographic film in terms of depth, field size, dose range and processing conditions for dynamic intensity modulated radiation therapy (IMRT) dosimetry verification with 6 MV X-ray beams. A Varian Clinac 23 EX linear accelerator with 6 MV X-ray beam was used to study the response of Kodak EDR2 film. Measurements were performed at depths of 5, 10 and 15 cm in MedTec virtual water phantom and with field sizes of 2x2, 3x3, 10x10 and 15x15 cm(2). Doses ranging from 20 to 450 cGy were used. The film was developed with the Kodak RP X-OMAT Model M6B automatic film processor. Film response was measured with the Vidar model VXR-16 scanner. Sensitometric curves were applied to the dose profiles measured with film at 5 cm in the virtual water phantom with field sizes of 2x2 and 10x10 cm(2) and compared with ion chamber data. Scanditronix/Wellhofer OmniPro(TM) IMRT software was used for the evaluation of the IMRT plan calculated by Eclipse treatment planning. Investigation of the reproducibility and accuracy of the film responses, which depend mainly on the film processor, was carried out by irradiating one film nine times with doses of 20 to 450 cGy. A maximum standard deviation of 4.9% was found which decreased to 1.9% for doses between 20 and 200 cGy. The sensitometric curves for various field sizes at fixed depth showed a maximum difference of 4.2% between 2x2 and 15x15 cm(2) at 5 cm depth with a dose of 450 cGy. The shallow depth tended to show a greater effect of field size responses than the deeper depths. The sensitometric curves for various depths at fixed field size showed slightly different film responses; the difference due to depth was within 1.8% for all field sizes studied. Both field size and depth effect were reduced when the doses were lower than 450 cGy. The difference was within 2.5% in the dose range from 20 to 300 cGy for all field sizes and depths studied. Dose profiles
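
    In practice, a sensitometric curve of this kind is a calibration mapping from net optical density back to delivered dose. A minimal sketch with invented calibration points (not the study's measured data), using piecewise-linear interpolation:

```python
# Converting film response (net optical density, OD) to dose via a
# measured sensitometric curve, using piecewise-linear interpolation.
# The calibration points below are made up for illustration only.
calibration = [  # (optical density, dose in cGy), monotonically increasing
    (0.00, 0), (0.35, 50), (0.80, 150), (1.30, 300), (1.60, 450),
]

def od_to_dose(od: float) -> float:
    # Find the bracketing calibration segment and interpolate linearly.
    for (od0, d0), (od1, d1) in zip(calibration, calibration[1:]):
        if od0 <= od <= od1:
            frac = (od - od0) / (od1 - od0)
            return d0 + frac * (d1 - d0)
    raise ValueError("optical density outside calibrated range")

dose = od_to_dose(0.80)  # lands exactly on a calibration point
```

    Separate curves per depth and field size, as the study investigates, would simply be separate calibration tables selected to match the measurement geometry.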

  12. Experimental verification of dynamic radioecological models established after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Voigt, G.; Mueller, H.; Proehl, G.; Stocke, H.; Paretzke, H.G.

    1991-01-01

    The experiments reported were carried out to verify existing dynamic radioecological models, especially the ECOSYS model. The database used for the verification covers the radioactivity concentrations of Cs-134, Cs-137 and I-131 measured in foodstuffs and environmental samples after the Chernobyl reactor accident, and the results of field experiments on radionuclide translocation after foliar uptake or absorption by the roots of edible plants. The measured data were compared with the model predictions for the radionuclides under review. The Cs-134 and Cs-137 translocation factors, which describe the redistribution of these radionuclides in the plant after foliar uptake, were experimentally determined by a single sprinkling with Chernobyl rainwater, and were measured to be the following as a function of sprinkling time: winter wheat, 0.002-0.13; spring wheat, 0.003-0.09; winter rye, 0.002-0.27; barley, 0.002-0.04; potatoes, 0.05-0.35; carrots, 0.02-0.07; bush beans, 0.04-0.3; cabbage, 0.1-0.5. The weathering half-life of the radionuclides in lettuce was determined to be ten days. Transfer factors for root absorption of Cs-137 averaged 0.002 for grains, 0.002 for potatoes, 0.004 for white cabbage, 0.003 for bush beans and carrots, and 0.007 for lettuce. There was agreement between the ECOSYS model predictions and the measured radioactivity concentrations of the corresponding radionuclides. (orig./HP) [de
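
    The ten-day weathering half-life reported for lettuce corresponds to the standard exponential decline of foliar activity. A brief sketch of that relationship, with illustrative numbers (the model form is standard; this is not code from the study):

```python
import math

# Exponential weathering loss of foliar activity: after one half-life
# the remaining surface activity is halved. Initial activity and times
# below are illustrative values, not measurements from the study.
def activity(c0_bq_per_kg: float, days: float,
             half_life_days: float = 10.0) -> float:
    return c0_bq_per_kg * math.exp(-math.log(2) * days / half_life_days)

remaining = activity(1000.0, 10.0)  # one half-life halves the activity
```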

  13. Design and Implementation of a Mobile Voting System Using a Novel Oblivious and Proxy Signature

    Directory of Open Access Journals (Sweden)

    Shin-Yan Chiou

    2017-01-01

    Full Text Available Electronic voting systems can make the voting process much more convenient. However, in such systems, if a server signs blank votes before users vote, it may cause undue multivoting. Furthermore, if users vote before the signing of the server, voting information will be leaked to the server and may be compromised. Blind signatures could be used to prevent leaking voting information from the server; however, malicious users could produce noncandidate signatures for illegal usage at that time or in the future. To overcome these problems, this paper proposes a novel oblivious signature scheme with a proxy signature function to satisfy security requirements such as information protection, personal privacy, and message verification and to ensure that no one can cheat other users (including the server. We propose an electronic voting system based on the proposed oblivious and proxy signature scheme and implement this scheme in a smartphone application to allow users to vote securely and conveniently. Security analyses and performance comparisons are provided to show the capability and efficiency of the proposed scheme.

  14. Modeling ground vehicle acoustic signatures for analysis and synthesis

    International Nuclear Information System (INIS)

    Haschke, G.; Stanfield, R.

    1995-01-01

    Security and weapon systems use acoustic sensor signals to classify and identify moving ground vehicles. Developing robust signal processing algorithms for this is expensive, particularly in the presence of acoustic clutter or countermeasures. This paper proposes a parametric ground vehicle acoustic signature model to aid the system designer in understanding which signature features are important, developing corresponding feature extraction algorithms, and generating low-cost, high-fidelity synthetic signatures for testing. The authors have proposed computer-generated acoustic signatures of armored, tracked ground vehicles to deceive smart munitions guided by acoustic sensors. They have developed quantitative measures of how accurately a synthetic acoustic signature matches those produced by actual vehicles. This paper describes the parameters of the model used to generate these synthetic signatures and suggests methods for extracting these parameters from signatures of valid vehicle encounters. The model incorporates wide-bandwidth and narrow-bandwidth components that are modulated in a pseudo-random fashion to mimic the time dynamics of valid vehicle signatures. Narrow-bandwidth feature extraction techniques estimate the frequency, amplitude and phase information contained in a single set of narrow frequency-band harmonics. Wide-bandwidth feature extraction techniques estimate the parameters of a correlated-noise-floor model. Finally, the authors propose a method of modeling the time dynamics of the harmonic amplitudes as a means of adding the necessary time-varying features to the narrow-bandwidth signal components. The authors present results of applying this modeling technique to acoustic signatures recorded during encounters with one armored, tracked vehicle. Similar modeling techniques can be applied to security systems
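
    The harmonic-plus-noise structure described above can be sketched in a few lines: a narrow-bandwidth sum of harmonics of an engine fundamental with slowly varying amplitudes, on top of a wide-bandwidth noise floor. All parameter values are invented for illustration and do not come from the paper:

```python
import math
import random

# Sketch of a harmonic-plus-noise synthetic acoustic signature:
# harmonics of a fundamental f0 with slow amplitude modulation
# (narrow-bandwidth part) plus a random noise floor (wide-bandwidth
# part). Every numeric parameter here is an illustrative invention.
def synthetic_signature(f0=60.0, harmonics=4, fs=8000, seconds=0.1):
    rng = random.Random(42)  # fixed seed: reproducible "pseudo-random" run
    samples = []
    for n in range(int(fs * seconds)):
        t = n / fs
        tone = sum(
            (1.0 / k)                                    # 1/k harmonic rolloff
            * (1 + 0.2 * math.sin(2 * math.pi * 0.5 * k * t))  # slow AM
            * math.sin(2 * math.pi * k * f0 * t)
            for k in range(1, harmonics + 1)
        )
        noise = 0.05 * rng.gauss(0, 1)  # wide-bandwidth noise floor
        samples.append(tone + noise)
    return samples

sig = synthetic_signature()
```

    A real model would also modulate the noise floor and correlate it across frequency, as the correlated-noise-floor component described above requires.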

  15. Nonintrusive verification attributes for excess fissile materials

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Eccleston, G.W.; Fearey, B.L.

    1997-10-01

    Under US initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The US has expressed the intent to place these materials under international inspections as soon as practicable. To support these commitments, members of the US technical community are examining a variety of nonintrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of the work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.), especially those related to classified materials and items. The authors are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status

  16. Uncertainty in hydrological signatures for gauged and ungauged catchments

    Science.gov (United States)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed assessing the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
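
    The Monte Carlo propagation described above can be sketched in a few lines: sample feasible rating-curve parameters, convert the stage record to discharge with each sample, and collect the resulting signature values. The power-law rating-curve form Q = a·h^b and all numbers are illustrative, not taken from the study:

```python
import random
import statistics

# Sketch of Monte Carlo propagation of rating-curve uncertainty into a
# hydrological signature (here: mean flow). Stage series, parameter
# distributions and the rating-curve form are illustrative inventions.
stage = [0.4, 0.6, 0.9, 1.2, 0.8, 0.5]  # observed stage time series (m)
rng = random.Random(1)

signatures = []
for _ in range(1000):
    a = rng.gauss(10.0, 1.0)   # sampled rating-curve coefficient
    b = rng.gauss(1.8, 0.1)    # sampled rating-curve exponent
    discharge = [a * h ** b for h in stage]      # Q = a * h**b
    signatures.append(statistics.mean(discharge))  # one signature sample

ordered = sorted(signatures)
lo, hi = ordered[25], ordered[975]  # ~95% uncertainty interval bounds
```

    The spread between `lo` and `hi` is the gauged signature uncertainty that the study then carries forward into the regionalization step.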

  17. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.

  18. SIMMER-III code-verification. Phase 1

    International Nuclear Information System (INIS)

    Maschek, W.

    1996-05-01

    SIMMER-III is a computer code to investigate core disruptive accidents in liquid-metal fast reactors; it is also intended for investigating safety-related problems in other types of advanced reactors. The code is developed by PNC in cooperation with the European partners FZK, CEA and AEA-T. SIMMER-III is a two-dimensional, three-velocity-field, multiphase, multicomponent, Eulerian fluid-dynamics code coupled with a space-, time-, and energy-dependent neutron dynamics model. In order to model complex flow situations in a postulated disrupting core, mass and energy conservation equations are solved for 27 density components and 16 energy components, respectively. Three velocity fields (two liquid and one vapor) are modeled to simulate the relative motion of different fluid components. An additional static field takes into account the structures present in a reactor (pins, hexcans, vessel structures, internal structures, etc.). The neutronics is based on the discrete ordinate method (S_N method) coupled into a quasistatic dynamic model. The assessment and verification of the fluid-dynamic/thermohydraulic parts of the code are performed in several steps in a joint effort of all partners. The results of the FZK contributions to the first assessment and verification phase are reported. (orig.) [de

  19. Verification of classified fissile material using unclassified attributes

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-01-01

    This paper reports on the most recent efforts of US technical experts to explore verification by the IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. The authors believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine the range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated

  20. Real-time scene and signature generation for ladar and imaging sensors

    Science.gov (United States)

    Swierkowski, Leszek; Christie, Chad L.; Antanovskii, Leonid; Gouthas, Efthimios

    2014-05-01

    This paper describes the development of two key functionalities within the VIRSuite scene simulation program, broadening its scene generation capabilities and increasing the accuracy of its thermal signatures. Firstly, a new LADAR scene generation module has been designed. It is capable of simulating range imagery for Geiger-mode LADAR, in addition to the already existing functionality for linear-mode systems. Furthermore, a new 3D heat diffusion solver has been developed within the VIRSuite signature prediction module. It is capable of calculating the temperature distribution in complex three-dimensional objects for enhanced dynamic prediction of thermal signatures. With these enhancements, VIRSuite is now a robust tool for conducting dynamic simulations of missiles with multi-mode seekers.

  1. Dynamic oscillatory signatures of central neuropathic pain in spinal cord injury.

    Science.gov (United States)

    Vuckovic, Aleksandra; Hasan, Muhammad A; Fraser, Matthew; Conway, Bernard A; Nasseroleslami, Bahman; Allan, David B

    2014-06-01

    Central neuropathic pain (CNP) is believed to be accompanied by increased activation of the sensorimotor cortex. Our knowledge of this interaction is based mainly on functional magnetic resonance imaging studies, but there is little direct evidence on how these changes manifest in terms of dynamic neuronal activity. This study reports on the presence of transient electroencephalography (EEG)-based measures of brain activity during motor imagery in spinal cord-injured patients with CNP. We analyzed dynamic EEG responses during imaginary movements of arms and legs in 3 groups of 10 volunteers each, comprising able-bodied people, paraplegic patients with CNP (lower abdomen and legs), and paraplegic patients without CNP. Paraplegic patients with CNP had increased event-related desynchronization in the theta, alpha, and beta bands (16-24 Hz) during imagination of movement of both nonpainful (arms) and painful limbs (legs). Compared to patients with CNP, paraplegics with no pain showed a much reduced power in relaxed state and reduced event-related desynchronization during imagination of movement. Understanding these complex dynamic, frequency-specific activations in CNP in the absence of nociceptive stimuli could inform the design of interventional therapies for patients with CNP and possibly further understanding of the mechanisms involved. This study compares the EEG activity of spinal cord-injured patients with CNP to that of spinal cord-injured patients with no pain and also to that of able-bodied people. The study shows that the presence of CNP itself leads to frequency-specific EEG signatures that could be used to monitor CNP and inform neuromodulatory treatments of this type of pain. Copyright © 2014 American Pain Society. Published by Elsevier Inc. All rights reserved.
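Event-related desynchronization of the kind reported here is conventionally quantified as a percentage band-power change relative to a baseline interval, ERD% = (A - R) / R * 100. A small numpy sketch using synthetic single-channel "trials" whose 20 Hz rhythm halves in amplitude during the imagery window (all signal parameters are illustrative, not the study's data):

```python
import numpy as np

def band_power(x, fs, band):
    """Mean power of a 1-D signal within a frequency band, via the FFT."""
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    spec = np.abs(np.fft.rfft(x)) ** 2 / x.size
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(spec[mask].sum())

def erd_percent(trials, fs, band, baseline, window):
    """ERD% = (A - R) / R * 100, with R the trial-averaged band power in
    the baseline (reference) interval and A that in the analysis window
    (the classic Pfurtscheller-style band-power ERD measure)."""
    R = np.mean([band_power(tr[baseline[0]:baseline[1]], fs, band) for tr in trials])
    A = np.mean([band_power(tr[window[0]:window[1]], fs, band) for tr in trials])
    return float((A - R) / R * 100.0)

# Synthetic trials: a 20 Hz (beta-band) rhythm whose amplitude halves
# during the "imagery" period, so band power drops to 25% (ERD = -75%).
fs = 250
t = np.arange(500) / fs
amp = np.where(t < 1.0, 1.0, 0.5)
trials = np.stack([amp * np.sin(2 * np.pi * 20.0 * t) for _ in range(5)])
erd = erd_percent(trials, fs, band=(16, 24), baseline=(0, 200), window=(300, 500))
```

Here `erd` comes out near -75%, i.e. a strong desynchronization; in real EEG the trials would be bandpass-filtered epochs aligned to the imagery cue.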

  2. Modeling the dynamics of internal flooding - verification analysis

    International Nuclear Information System (INIS)

    Filipov, K.

    2011-01-01

    The results of a verification analysis of the WATERFOW software, developed for reactor building internal flooding analysis, are presented. The integrated code MELCOR was selected for benchmarking. Considering the complex structure of the reactor building, sample tests were chosen to cover the characteristic points of the internal flooding analysis. The inapplicability of MELCOR to the internal flooding study has been demonstrated

  3. Criticality meets learning: Criticality signatures in a self-organizing recurrent neural network.

    Science.gov (United States)

    Del Papa, Bruno; Priesemann, Viola; Triesch, Jochen

    2017-01-01

    Many experiments have suggested that the brain operates close to a critical state, based on signatures of criticality such as power-law distributed neuronal avalanches. In neural network models, criticality is a dynamical state that maximizes information processing capacities, e.g. sensitivity to input, dynamical range and storage capacity, which makes it a favorable candidate state for brain function. Although models that self-organize towards a critical state have been proposed, the relation between criticality signatures and learning is still unclear. Here, we investigate signatures of criticality in a self-organizing recurrent neural network (SORN). Investigating criticality in the SORN is of particular interest because it has not been developed to show criticality. Instead, the SORN has been shown to exhibit spatio-temporal pattern learning through a combination of neural plasticity mechanisms and it reproduces a number of biological findings on neural variability and the statistics and fluctuations of synaptic efficacies. We show that, after a transient, the SORN spontaneously self-organizes into a dynamical state that shows criticality signatures comparable to those found in experiments. The plasticity mechanisms are necessary to attain that dynamical state, but not to maintain it. Furthermore, onset of external input transiently changes the slope of the avalanche distributions - matching recent experimental findings. Interestingly, the membrane noise level necessary for the occurrence of the criticality signatures reduces the model's performance in simple learning tasks. Overall, our work shows that the biologically inspired plasticity and homeostasis mechanisms responsible for the SORN's spatio-temporal learning abilities can give rise to criticality signatures in its activity when driven by random input, but these break down under the structured input of short repeating sequences.
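Power-law distributed avalanche sizes, the criticality signature discussed above, are typically extracted from binned activity counts. A minimal sketch; the activity series is a toy example, and the exponent estimator uses the common Clauset-style continuous maximum-likelihood approximation:

```python
import numpy as np

def avalanche_sizes(activity):
    """Avalanche sizes from a binned event-count series: an avalanche is
    a maximal run of nonzero bins; its size is the total number of
    events inside the run."""
    sizes, current = [], 0
    for count in activity:
        if count > 0:
            current += int(count)
        elif current > 0:
            sizes.append(current)
            current = 0
    if current > 0:
        sizes.append(current)
    return np.array(sizes)

def powerlaw_exponent(sizes, s_min=1):
    """Maximum-likelihood exponent for P(s) ~ s^(-alpha), using the
    continuous approximation alpha = 1 + n / sum(ln(s / (s_min - 0.5)))."""
    s = sizes[sizes >= s_min].astype(float)
    return float(1.0 + s.size / np.log(s / (s_min - 0.5)).sum())

activity = np.array([0, 3, 1, 0, 0, 2, 0, 5, 2, 1, 0])  # toy spike counts
sizes = avalanche_sizes(activity)    # runs (3,1), (2), (5,2,1) -> [4, 2, 8]
alpha = powerlaw_exponent(sizes)
```

In practice one would also test the power-law fit against alternatives and track how `alpha` shifts when structured input is applied, as the abstract describes.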

  4. Studies on plant dynamics of sodium-cooled fast breeder reactors - verification of a plant model

    International Nuclear Information System (INIS)

    Schubert, B.

    1988-01-01

    For the analysis of sodium-cooled FBR safety and dynamics, theoretical models are used which have to be verified. In this report the plant model SSC-L is verified by comparing calculated data with measurements from the experimental reactors KNK II and RAPSODIE. For this purpose the plant model is extended and adapted. In general, only small differences between calculated and measured data are observed. The results are used to improve and complete the plant model. The extended applicability of the plant model is exploited to calculate a loss-of-heat-sink transient with reactor scram, considering pipes as passive heat sinks. (orig./HP) With 69 figs., 10 tabs [de

  5. Entanglement as a signature of quantum chaos.

    Science.gov (United States)

    Wang, Xiaoguang; Ghose, Shohini; Sanders, Barry C; Hu, Bambi

    2004-01-01

    We explore the dynamics of entanglement in classically chaotic systems by considering a multiqubit system that behaves collectively as a spin system obeying the dynamics of the quantum kicked top. In the classical limit, the kicked top exhibits both regular and chaotic dynamics depending on the strength of the chaoticity parameter kappa in the Hamiltonian. We show that the entanglement of the multiqubit system, considered for both the bipartite and the pairwise entanglement, yields a signature of quantum chaos. Whereas bipartite entanglement is enhanced in the chaotic region, pairwise entanglement is suppressed. Furthermore, we define a time-averaged entangling power and show that this entangling power changes markedly as kappa moves the system from being predominantly regular to being predominantly chaotic, thus sharply identifying the edge of chaos. When this entangling power is averaged over all states, it yields a signature of global chaos. The qualitative behavior of this global entangling power is similar to that of the classical Lyapunov exponent.
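Bipartite entanglement of the kind used here as a chaos signature is often quantified by the linear entropy of the reduced state, S = 1 - Tr(rho_A^2). A two-qubit sketch; the states below are standard textbook examples, not kicked-top states:

```python
import numpy as np

def linear_entropy(psi, dims=(2, 2)):
    """Bipartite entanglement of a pure state |psi> via the linear
    entropy of the reduced density matrix, S = 1 - Tr(rho_A^2).
    S = 0 for product states; S = 0.5 is maximal for two qubits."""
    m = np.asarray(psi, dtype=complex).reshape(dims)  # coefficient matrix
    rho_a = m @ m.conj().T                            # reduced state of subsystem A
    return float(np.real(1.0 - np.trace(rho_a @ rho_a)))

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)  # maximally entangled
product = np.kron([1.0, 0.0], [0.0, 1.0])             # |0>|1>, unentangled

s_bell = linear_entropy(bell)        # 0.5 (up to float rounding)
s_product = linear_entropy(product)  # 0.0
```

In a kicked-top study one would evolve the collective state under the kicked-top Floquet map and track such entanglement measures as the chaoticity parameter is varied.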

  6. Signature change events: a challenge for quantum gravity?

    International Nuclear Information System (INIS)

    White, Angela; Weinfurtner, Silke; Visser, Matt

    2010-01-01

    Within the framework of either Euclidean (functional integral) quantum gravity or canonical general relativity the signature of the manifold is a priori unconstrained. Furthermore, recent developments in the emergent spacetime programme have led to a physically feasible implementation of (analogue) signature change events. This suggests that it is time to revisit the sometimes controversial topic of signature change in general relativity. Specifically, we shall focus on the behaviour of a quantum field defined on a manifold containing regions of different signature. We emphasize that regardless of the underlying classical theory, there are severe problems associated with any quantum field theory residing on a signature-changing background. (Such as the production of what is naively an infinite number of particles, with an infinite energy density.) We show how the problem of quantum fields exposed to finite regions of Euclidean-signature (Riemannian) geometry has similarities with the quantum barrier penetration problem. Finally we raise the question as to whether signature change transitions could be fully understood and dynamically generated within (modified) classical general relativity, or whether they require the knowledge of a theory of quantum gravity.

  7. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.

  8. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    Science.gov (United States)

    Nomaguchi, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, which records a design snapshot over design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels enables automatic and efficient capture and management of iterative hypothesis and verification processes through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. Further, it concludes with a discussion of some future issues.

  9. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to

  10. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation

  11. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation. Refs, figs, tabs

  12. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOV) are widely used in many applications due to their fast dynamic responses, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient design verification method for SOVs to design engineers, who otherwise depend on experience and experiment during the design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and the detailed design procedure is then presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method for verifying this design using theoretical relationships, which enables optimal design of the SOV from the viewpoint of the safety factor of the design attraction force. Last, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe that this verification process is novel and useful for saving time and expense during development of SOVs, because verification tests with manufactured specimens may be partly replaced by this verification methodology.
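An attraction-force check of the kind at the heart of such a verification can be sketched with the standard air-gap energy approximation F = (N*I)^2 * mu0 * A / (2 * g^2). The numbers below are purely illustrative, and the formula neglects core reluctance, fringing and leakage flux, so this is only an early screening estimate, not the paper's full method:

```python
import math

MU0 = 4.0 * math.pi * 1e-7  # vacuum permeability (H/m)

def attraction_force(n_turns, current, gap, area):
    """First-order solenoid plunger force from the air-gap field energy:
    F = (N*I)^2 * mu0 * A / (2 * g^2), with g the air gap (m) and A the
    pole face area (m^2). Ignores core reluctance, fringing and leakage,
    so it tends to overestimate the real force."""
    return (n_turns * current) ** 2 * MU0 * area / (2.0 * gap ** 2)

def safety_factor(available_force, required_force):
    """Design safety factor on the attraction force."""
    return available_force / required_force

# Hypothetical figures for a small valve actuator (illustration only)
F = attraction_force(n_turns=800, current=0.5, gap=1e-3, area=1e-4)  # ~10 N
sf = safety_factor(F, required_force=5.0)                            # ~2
```

A safety factor comfortably above 1 across the full stroke (largest gap) is the kind of criterion such a theoretical verification would check before committing to prototypes.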

  13. Remote sensing and geoinformation technologies in support of nuclear non-proliferation and arms control verification regimes

    Energy Technology Data Exchange (ETDEWEB)

    Niemeyer, Irmgard [Forschungszentrum Juelich GmbH, Institut fuer Energie- und Klimaforschung, IEK-6: Nukleare Entsorgung und Reaktorsicherheit (Germany)

    2013-07-01

    A number of international agreements and export control regimes have been concluded in order to reduce the risk of proliferation of weapons of mass destruction. In order to provide confidence that Member States are complying with the agreed commitments, most of the treaties and agreements include verification provisions. Different types of verification measures exist, e.g. cooperative measures; national technical means; technical monitoring or measurement devices placed at or near sites; on-site inspections; intelligence information; and open-source information, such as commercial internet data and satellite imagery. The study reviews the technical progress in the field of satellite imaging sensors and explores the recent advances in satellite imagery processing and geoinformation technologies for the extraction of significant observables and signatures. Moreover, it discusses how satellite data and geoinformation technologies could be used in a complementary way to confirm information gathered from other systems or sources. The study also aims at presenting the legal and political aspects and the cost benefits of using imagery from both national and commercial satellites in the verification procedure. The study concludes that satellite imagery and geoinformation technologies are expected to enhance verification efficiency and effectiveness.

  14. The Temporal Signature of Memories: Identification of a General Mechanism for Dynamic Memory Replay in Humans

    Science.gov (United States)

    Michelmann, Sebastian; Bowman, Howard; Hanslmayr, Simon

    2016-01-01

    Reinstatement of dynamic memories requires the replay of neural patterns that unfold over time in a similar manner as during perception. However, little is known about the mechanisms that guide such a temporally structured replay in humans, because previous studies used either unsuitable methods or paradigms to address this question. Here, we overcome these limitations by developing a new analysis method to detect the replay of temporal patterns in a paradigm that requires participants to mentally replay short sound or video clips. We show that memory reinstatement is accompanied by a decrease of low-frequency (8 Hz) power, which carries a temporal phase signature of the replayed stimulus. These replay effects were evident in the visual as well as in the auditory domain and were localized to sensory-specific regions. These results suggest low-frequency phase to be a domain-general mechanism that orchestrates dynamic memory replay in humans. PMID:27494601

  15. On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature

    Science.gov (United States)

    Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar

    Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. Recent developments in the biometrics area have led to smaller, faster and cheaper systems, which in turn has increased the number of possible application areas for biometric identity verification. Biometric data, being derived from human bodies (and especially when used to identify or verify those bodies), are considered personally identifiable information (PII). The collection, use and disclosure of biometric data, whether image or template, invokes rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance and profiling. Biometric data transmitted across networks and stored in various databases by others can also be stolen, copied, or otherwise misused in ways that can materially affect the individual involved. As biometric systems are vulnerable to replay, database and brute-force attacks, such potential attacks must be analysed before the systems are massively deployed in security systems. Along with security, the privacy of users is an important factor, as the constructions of lines in palmprints contain personal characteristics, a person can be recognised from face images, and fake signatures can be practised by carefully watching the signature images available in the database. We propose a cryptographic approach to encrypt the images of palmprints, faces, and signatures by an advanced Hill cipher technique for hiding the information in the images. It also provides security to these images against the above-mentioned attacks. So, during the feature extraction, the
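The advanced Hill cipher variant the paper uses is not detailed in this abstract, but the basic Hill construction it builds on operates on pixel blocks modulo 256: C = K @ P (mod 256), with a key whose determinant is odd so that an inverse modulo 256 exists. A minimal 2x2 sketch; the key and pixel values are arbitrary illustrations:

```python
import numpy as np

def hill_apply(pixels, key):
    """Apply a Hill-cipher key to a flat uint8 pixel array in blocks:
    each length-n block P maps to key @ P (mod 256)."""
    n = key.shape[0]
    assert pixels.size % n == 0, "pad the image to a multiple of the block size"
    blocks = pixels.reshape(-1, n).astype(np.int64)
    return (blocks @ key.T % 256).astype(np.uint8).ravel()

def hill_inverse_2x2(key):
    """Inverse of a 2x2 key modulo 256; requires an odd determinant."""
    a, b, c, d = int(key[0, 0]), int(key[0, 1]), int(key[1, 0]), int(key[1, 1])
    det = (a * d - b * c) % 256
    det_inv = pow(det, -1, 256)          # raises unless gcd(det, 256) == 1
    adj = np.array([[d, -b], [-c, a]])   # adjugate matrix
    return (det_inv * adj) % 256

key = np.array([[3, 3], [2, 5]])          # det = 9, odd -> invertible mod 256
pixels = np.array([12, 200, 7, 99], dtype=np.uint8)
cipher = hill_apply(pixels, key)
recovered = hill_apply(cipher, hill_inverse_2x2(key))  # round-trips to pixels
```

Decryption is just the same block transform with the inverse key, which is why key invertibility modulo 256 is the central requirement of the scheme.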

  16. Simulation and Experimental Validation of Electromagnetic Signatures for Monitoring of Nuclear Material Storage Containers

    International Nuclear Information System (INIS)

    Aker, Pamela M.; Bunch, Kyle J.; Jones, Anthony M.

    2013-01-01

    Previous research at the Pacific Northwest National Laboratory (PNNL) has demonstrated that the low frequency electromagnetic (EM) response of a sealed metallic container interrogated with an encircling coil is a strong function of its contents and can be used to form a distinct signature which can confirm the presence of specific components without revealing hidden geometry or classified design information. Finite element simulations have recently been performed to further investigate this response for a variety of configurations composed of an encircling coil and a typical nuclear material storage container. Excellent agreement was obtained between simulated and measured impedance signatures of electrically conducting spheres placed inside an AT-400R nuclear container. Simulations were used to determine the effects of excitation frequency and the geometry of the encircling coil, nuclear container, and internal contents. The results show that it is possible to use electromagnetic models to evaluate the application of the EM signature technique to proposed versions of nuclear weapons containers which can accommodate restrictions imposed by international arms control and treaty verification legislation

  17. Signatures of correlated excitonic dynamics in two-dimensional spectroscopy of the Fenna-Matthews-Olson photosynthetic complex

    International Nuclear Information System (INIS)

    Caram, Justin R.; Lewis, Nicholas H. C.; Fidler, Andrew F.; Engel, Gregory S.

    2012-01-01

    Long-lived excitonic coherence in photosynthetic proteins has become an exciting area of research because it may provide design principles for enhancing the efficiency of energy transfer in a broad range of materials. In this publication, we provide new evidence that long-lived excitonic coherence in the Fenna-Matthews-Olson pigment-protein (FMO) complex is consistent with the assumption of cross correlation in the site basis, indicating that each site shares bath fluctuations. We analyze the structure and character of the beating crosspeak between the two lowest energy excitons in two-dimensional (2D) electronic spectra of the FMO complex. To isolate this dynamic signature, we use the two-dimensional linear prediction Z-transform as a platform for filtering coherent beating signatures within 2D spectra. By separating signals into components in frequency and decay rate representations, we are able to improve resolution and isolate specific coherences. This strategy permits analysis of the shape, position, character, and phase of these features. Simulations of the crosspeak between excitons 1 and 2 in FMO under different regimes of cross correlation verify that statistically independent site fluctuations do not account for the elongation and persistence of the dynamic crosspeak. To reproduce the experimental results, we invoke near complete correlation in the fluctuations experienced by the sites associated with excitons 1 and 2. This model contradicts ab initio quantum mechanics/molecular mechanics simulations that observe no correlation between the energies of individual sites. This contradiction suggests that a new physical model for long-lived coherence may be necessary. The data presented here detail experimental results that must be reproduced for a physical model of quantum coherence in photosynthetic energy transfer.

  18. The Temporal Signature of Memories: Identification of a General Mechanism for Dynamic Memory Replay in Humans.

    Directory of Open Access Journals (Sweden)

    Sebastian Michelmann

    2016-08-01

    Full Text Available Reinstatement of dynamic memories requires the replay of neural patterns that unfold over time in a similar manner as during perception. However, little is known about the mechanisms that guide such a temporally structured replay in humans, because previous studies used either unsuitable methods or paradigms to address this question. Here, we overcome these limitations by developing a new analysis method to detect the replay of temporal patterns in a paradigm that requires participants to mentally replay short sound or video clips. We show that memory reinstatement is accompanied by a decrease of low-frequency (8 Hz) power, which carries a temporal phase signature of the replayed stimulus. These replay effects were evident in the visual as well as in the auditory domain and were localized to sensory-specific regions. These results suggest low-frequency phase to be a domain-general mechanism that orchestrates dynamic memory replay in humans.

  19. Impact of seaweed beachings on dynamics of δ(15)N isotopic signatures in marine macroalgae.

    Science.gov (United States)

    Lemesle, Stéphanie; Mussio, Isabelle; Rusig, Anne-Marie; Menet-Nédélec, Florence; Claquin, Pascal

    2015-08-15

    A fine-scale survey of δ(15)N, δ(13)C, and tissue-N in seaweeds was conducted using samples from 17 sampling points at two sites (Grandcamp-Maisy (GM), Courseulles/Mer (COU)) along the French coast of the English Channel in 2012 and 2013. Partial triadic analysis was performed on the parameter data sets and revealed the functioning of three areas: one estuary (EstA) and two rocky areas (GM(∗), COU(∗)). In contrast to oceanic and anthropogenic reference points, similar temporal dynamics characterized the δ(15)N signatures and N contents at GM(∗) and COU(∗). Nutrient dynamics were also similar: the N-concentrations in seawater originated from the River Seine and local coastal rivers, while P-concentrations came mainly from these local rivers. δ(15)N at GM(∗) was linked to turbidity, suggesting inputs of autochthonous organic matter from large-scale summer seaweed beachings made up of a mixture of Rhodophyta, Phaeophyta and Chlorophyta species. This study highlights the coupling between seaweed beachings and nitrogen sources of intertidal macroalgae. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Keystroke dynamics in the pre-touchscreen era.

    Science.gov (United States)

    Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A

    2013-12-19

    Biometric authentication seeks to measure an individual's unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprint, signature, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals' typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts.
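The timing variables described above (dwell times between press and release, flight times between releases and subsequent presses) can feed a simple statistical verifier. The sketch below uses synthetic press/release timestamps and a plain z-score matcher, a deliberately basic stand-in for the statistical, neural-network, and fuzzy-logic methods the review surveys:

```python
import numpy as np

def timing_features(events):
    """Dwell and flight times from (key, press_t, release_t) events
    ordered by press time: dwell_i = release_i - press_i,
    flight_i = press_{i+1} - release_i (negative if keys overlap)."""
    press = np.array([t for _, t, _ in events])
    release = np.array([t for _, _, t in events])
    return np.concatenate([release - press, press[1:] - release[:-1]])

def verify(sample, template_mean, template_std, threshold=3.0):
    """Accept when the mean absolute z-score against the enrolled
    template is below the threshold (a minimal statistical matcher)."""
    z = np.abs((sample - template_mean) / template_std)
    return bool(z.mean() < threshold)

# Enrolment: repeated typings of a 3-key passphrase with synthetic jitter
rng = np.random.default_rng(1)
nominal = [("p", 0.00, 0.09), ("a", 0.15, 0.23), ("s", 0.30, 0.38)]
enroll = np.stack([
    timing_features([(k, p + rng.normal(0, 0.005), r + rng.normal(0, 0.005))
                     for k, p, r in nominal])
    for _ in range(20)
])
mu, sigma = enroll.mean(axis=0), enroll.std(axis=0) + 1e-9

genuine_ok = verify(enroll[0], mu, sigma)                 # same typist -> accept
impostor_ok = verify(timing_features([("p", 0.00, 0.30),  # much slower typist
                                      ("a", 0.50, 0.80),
                                      ("s", 1.10, 1.40)]), mu, sigma)
```

Real systems replace the z-score matcher with trained classifiers and use far longer feature vectors, but the enrolment/verification structure is the same.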

  1. Keystroke Dynamics in the pre-Touchscreen Era

    Directory of Open Access Journals (Sweden)

    Nasir eAhmad

    2013-12-01

    Full Text Available Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realised via analyses of fingerprints, signatures, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches, for example, their substantial infrastructure costs, and intrusive nature, make them undesirable, and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilise multiple variables, and a range of mathematical techniques, in order to extract individuals' typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view towards indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts.

  2. Keystroke dynamics in the pre-touchscreen era

    Science.gov (United States)

    Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.

    2013-01-01

    Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints, signatures, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches, for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals’ typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568

  3. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to capture and preserve digital evidence securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one hand, it is very challenging for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring its integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, no previous work has proposed a systematic model with a holistic view of all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
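The integrity half of the problem described above can be sketched with a content digest. This is a minimal illustration only: the PKIDEV model additionally signs the digest with the collector's private key and applies secure time-stamping, and the record fields used here are hypothetical:

```python
import hashlib

# Hedged sketch of the integrity step only: fingerprint captured evidence
# with SHA-256 and bundle the digest in a custody record. In the actual
# PKIDEV model this digest would additionally be signed under a PKI and
# secure-timestamped; those steps are omitted here.
def make_evidence_record(evidence: bytes, collector: str, timestamp: float):
    return {
        "collector": collector,
        "timestamp": timestamp,
        "sha256": hashlib.sha256(evidence).hexdigest(),
    }

def verify_evidence(evidence: bytes, record: dict) -> bool:
    # Recompute the digest and compare: any bit flip is detected.
    return hashlib.sha256(evidence).hexdigest() == record["sha256"]

data = b"disk image bytes ..."
record = make_evidence_record(data, "officer-42", 1700000000.0)
```

Hashing alone detects tampering but does not prove *who* captured the evidence or *when*; that is exactly the gap the paper's digital-signature and time-stamping components fill.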

  4. Computer-aided classification of lesions by means of their kinetic signatures in dynamic contrast-enhanced MR images

    Science.gov (United States)

    Twellmann, Thorsten; ter Haar Romeny, Bart

    2008-03-01

    The kinetic characteristics of tissue in dynamic contrast-enhanced magnetic resonance imaging data are an important source of information for the differentiation of benign and malignant lesions. Kinetic curves measured for each lesion voxel allow one to infer information about the state of the local tissue. As a whole, they reflect the heterogeneity of the vascular structure within a lesion, an important criterion for the preoperative classification of lesions. Current clinical practice in the analysis of tissue kinetics, however, is mainly based on the evaluation of the "most-suspect curve", which is related only to a small, manually or semi-automatically selected region-of-interest within a lesion and does not reflect any information about tissue heterogeneity. We propose a new method which exploits the full range of kinetic information for the automatic classification of lesions. Instead of breaking down the large amount of kinetic information to a single curve, each lesion is considered as a probability distribution in a space of kinetic features, efficiently represented by its kinetic signature obtained by adaptive vector quantization of the corresponding kinetic curves. The dissimilarity of two signatures can be objectively measured using the Mallows distance, a metric defined on probability distributions. Embedding this metric in a suitable kernel function enables us to employ modern kernel-based machine learning techniques for the classification of signatures. In a study of 81 breast lesions, the proposed method yielded an Az value of 0.89 ± 0.01 for the discrimination of benign and malignant lesions in a nested leave-one-lesion-out evaluation setting.
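In one dimension the Mallows distance used above has a simple closed form: for equal-size empirical distributions it is the ℓp distance between sorted samples, since optimal transport pairs them in order. A minimal sketch (the toy voxel values are invented, not from the study):

```python
def mallows_distance(xs, ys, p=2):
    """Mallows (Wasserstein-p) distance between two equal-size 1-D
    empirical distributions: optimal transport pairs sorted samples."""
    assert len(xs) == len(ys)
    xs, ys = sorted(xs), sorted(ys)
    return (sum(abs(x - y) ** p for x, y in zip(xs, ys)) / len(xs)) ** (1 / p)

# Two toy "kinetic signatures" (e.g. per-voxel relative enhancement values):
benign_like    = [0.2, 0.4, 0.5, 0.6]
malignant_like = [0.9, 1.1, 1.3, 1.5]
d = mallows_distance(benign_like, malignant_like)
```

Such a distance can then be embedded in a kernel, e.g. k(A, B) = exp(−d(A, B)²/σ²), so that kernel-based classifiers can operate on whole signatures, as the record describes. (The paper's signatures are vector-quantized and weighted; this equal-weight 1-D case is a simplification.)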

  5. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  6. Detection of hail signatures from single-polarization C-band radar reflectivity

    Science.gov (United States)

    Kunz, Michael; Kugel, Petra I. S.

    2015-02-01

    Five different criteria that estimate hail signatures from single-polarization radar data are statistically evaluated over a 15-year period by categorical verification against loss data provided by a building insurance company. The criteria consider different levels or thresholds of radar reflectivity, some of them complemented by estimates of the 0 °C level or cloud top temperature. Applied to reflectivity data from a single C-band radar in southwest Germany, it is found that all criteria are able to reproduce most of the past damage-causing hail events. However, the criteria substantially overestimate hail occurrence, by up to 80%, mainly due to the verification process using damage data. The best results in terms of the highest Heidke Skill Score (HSS) or Critical Success Index (CSI) are obtained for the Hail Detection Algorithm (HDA) and the Probability of Severe Hail (POSH). Radar-derived hail probability shows a high spatial variability, with a maximum on the lee side of the Black Forest mountains and a minimum in the broad Rhine valley.
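The two categorical scores named in this record are computed from a 2×2 contingency table of forecast versus observed events. A sketch with invented counts (hits a, false alarms b, misses c, correct negatives d; not the study's actual numbers):

```python
def csi(a, b, c):
    """Critical Success Index: hits / (hits + false alarms + misses)."""
    return a / (a + b + c)

def hss(a, b, c, d):
    """Heidke Skill Score: categorical skill relative to random chance."""
    return 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))

# Invented contingency table: radar criterion vs insurance loss days.
a, b, c, d = 30, 24, 5, 300   # hits, false alarms, misses, correct negatives
print(round(csi(a, b, c), 3), round(hss(a, b, c, d), 3))   # → 0.508 0.630
```

Note how the false-alarm count b (the overestimation the record discusses) drags down both scores, while the correct negatives d enter only the HSS.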

  7. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  8. Pen and platen, piezo-electric (21 Aug 1978) (Engineering Materials). [Signature verification

    Energy Technology Data Exchange (ETDEWEB)

    The set of five drawings defines a writing instrument system that will reliably verify signatures, thus providing a method useful in screening persons seeking entrance to restricted areas or access to computer programs. Using a conventional ballpoint pen refill, the instrument's input derives from signals generated in its writing tip and from pressure exerted by a person writing his name or a code word on the platen (tablet). The basic principle is that accelerations of the writing tip and pressures exerted by the person writing are recorded in three axes. This combination of signals can be processed by a computer and compared with a record in the computer's memory, or a graphic transcription may be compared visually with an earlier record.

  9. Dynamical signatures of isometric force control as a function of age, expertise, and task constraints.

    Science.gov (United States)

    Vieluf, Solveig; Sleimen-Malkoun, Rita; Voelcker-Rehage, Claudia; Jirsa, Viktor; Reuter, Eva-Maria; Godde, Ben; Temprado, Jean-Jacques; Huys, Raoul

    2017-07-01

    From the conceptual and methodological framework of the dynamical systems approach, force control results from complex interactions of various subsystems yielding observable behavioral fluctuations, which comprise both deterministic (predictable) and stochastic (noise-like) dynamical components. Here, we investigated these components contributing to the observed variability in force control in groups of participants differing in age and expertise level. To this aim, young (18-25 yr) as well as late middle-aged (55-65 yr) novices and experts (precision mechanics) performed a force maintenance and a force modulation task. Results showed that whereas the amplitude of force variability did not differ across groups in the maintenance tasks, in the modulation task it was higher for late middle-aged novices than for experts and higher for both these groups than for young participants. Within both tasks and for all groups, stochastic fluctuations were lowest where the deterministic influence was smallest. However, although all groups showed similar dynamics underlying force control in the maintenance task, a group effect was found for deterministic and stochastic fluctuations in the modulation task. The latter findings imply that both components were involved in the observed group differences in the variability of force fluctuations in the modulation task. These findings suggest that between groups the general characteristics of the dynamics do not differ in either task and that force control is more affected by age than by expertise. However, expertise seems to counteract some of the age effects. NEW & NOTEWORTHY Stochastic and deterministic dynamical components contribute to force production. Dynamical signatures differ between force maintenance and cyclic force modulation tasks but hardly between age and expertise groups. 
Differences in both stochastic and deterministic components are associated with group differences in behavioral variability, and observed behavioral
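The separation of behavioral fluctuations into deterministic (drift) and stochastic (noise) components described in this record is commonly done by estimating conditional moments of the signal's increments. The sketch below assumes a generic Langevin-type analysis, not necessarily the authors' exact pipeline, and uses a simulated force-like series with known parameters:

```python
import random

# Sketch: recover the deterministic (drift) and stochastic (noise) parts of
# a Langevin-type process  x' = -theta*x + sigma*noise  from a sampled
# series, via the first and second conditional moments of the increments.
random.seed(1)
theta, sigma, dt, n = 1.0, 0.5, 0.01, 100_000

x, xs = 0.0, []
for _ in range(n):
    xs.append(x)
    x += -theta * x * dt + sigma * (dt ** 0.5) * random.gauss(0.0, 1.0)

dx = [xs[i + 1] - xs[i] for i in range(n - 1)]
xbar = sum(xs[:-1]) / (n - 1)
var = sum((v - xbar) ** 2 for v in xs[:-1]) / (n - 1)
cov = sum((xs[i] - xbar) * dx[i] for i in range(n - 1)) / (n - 1)

theta_hat = -cov / (var * dt)                                # drift strength
sigma_hat = (sum(d * d for d in dx) / (n - 1) / dt) ** 0.5   # noise amplitude
```

Group differences like those the record reports would show up as different recovered drift and noise magnitudes between, say, young participants and late middle-aged novices.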

  10. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  11. Delay signatures in the chaotic intensity output of a quantum dot ...

    Indian Academy of Sciences (India)

    Journal of Physics, May 2016, pp. 1021–1030. Delay signatures in the chaotic intensity output ... Research in complex systems requires quantitative predictions of their dynamics, even ... used methods for estimating delay in complex dynamics are autocorrelation function ... Authors thank BRNS for its financial support.

  12. Impact of seaweed beachings on dynamics of δ15N isotopic signatures in marine macroalgae

    International Nuclear Information System (INIS)

    Lemesle, Stéphanie; Mussio, Isabelle; Rusig, Anne-Marie; Menet-Nédélec, Florence; Claquin, Pascal

    2015-01-01

    Highlights: • Two coastal sites (COU, GM) in the Bay of Seine affected by summer seaweed beachings. • The same temporal dynamics of the algal δ15N at the two sites. • N and P concentrations in seawater of the two sites dominated by riverine sources. • A coupling between seaweed beachings and N sources of intertidal macroalgae. - Abstract: A fine-scale survey of δ15N, δ13C and tissue-N in seaweeds was conducted using samples from 17 sampling points at two sites (Grandcamp-Maisy (GM), Courseulles/Mer (COU)) along the French coast of the English Channel in 2012 and 2013. Partial triadic analysis was performed on the parameter data sets and revealed the functioning of three areas: one estuary (EstA) and two rocky areas (GM∗, COU∗). In contrast to oceanic and anthropogenic reference points, similar temporal dynamics characterized δ15N signatures and N contents at GM∗ and COU∗. Nutrient dynamics were similar: the N-concentrations in seawater originated from the River Seine and local coastal rivers, while P-concentrations originated mainly from these local rivers. δ15N signatures at GM∗ were linked to turbidity, suggesting inputs of autochthonous organic matter from large-scale summer seaweed beachings made up of a mixture of Rhodophyta, Phaeophyta and Chlorophyta species. This study highlights the coupling between seaweed beachings and the nitrogen sources of intertidal macroalgae.

  13. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    Science.gov (United States)

    Kabir, Muhammad N.; Alginahi, Yasser M.

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover media to achieve its goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247
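The zero-watermarking idea in this record, deriving an authentication mark from the text rather than embedding anything in it, can be sketched as follows. The block size and use of SHA-256 are illustrative assumptions, not the paper's actual scheme:

```python
import hashlib

# Hedged sketch of zero-watermarking: nothing is embedded in the cover
# text; instead a signature-like watermark is *derived* from it. Keeping
# one digest per fixed-size block allows locating (not just detecting)
# later modifications.
def generate_watermark(text: str, block_size: int = 16):
    blocks = [text[i:i + block_size] for i in range(0, len(text), block_size)]
    return [hashlib.sha256(b.encode()).hexdigest() for b in blocks]

def locate_tampering(text: str, watermark, block_size: int = 16):
    current = generate_watermark(text, block_size)
    return [i for i, (a, b) in enumerate(zip(current, watermark)) if a != b]

original = "The quick brown fox jumps over the lazy dog."
wm = generate_watermark(original)           # registered, text left untouched
tampered = original.replace("lazy", "hazy")
```

In the paper's setting the derived watermark would be registered with a trusted party at publication time; verification then recomputes it from the received text and compares, flagging the mismatching blocks.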

  14. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    Directory of Open Access Journals (Sweden)

    Omar Tayan

    2014-01-01

    Full Text Available This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover media to achieve its goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints.

  15. Identification of host response signatures of infection.

    Energy Technology Data Exchange (ETDEWEB)

    Branda, Steven S.; Sinha, Anupama; Bent, Zachary

    2013-02-01

    Biological weapons of mass destruction and emerging infectious diseases represent a serious and growing threat to our national security. Effective response to a bioattack or disease outbreak critically depends upon efficiently and reliably distinguishing infected from healthy individuals, to enable rational use of scarce, invasive, and/or costly countermeasures (diagnostics, therapies, quarantine). Screening based on direct detection of the causative pathogen can be problematic, because culture- and probe-based assays are confounded by unanticipated pathogens (e.g., deeply diverged, engineered), and readily-accessible specimens (e.g., blood) often contain little or no pathogen, particularly at pre-symptomatic stages of disease. Thus, in addition to the pathogen itself, one would like to detect infection-specific host response signatures in the specimen, preferably ones comprised of nucleic acids (NA), which can be recovered and amplified from tiny specimens (e.g., fingerstick draws). Proof-of-concept studies have not been definitive, however, largely due to use of sub-optimal sample preparation and detection technologies. For purposes of pathogen detection, Sandia has developed novel molecular biology methods that enable selective isolation of NA unique to, or shared between, complex samples, followed by identification and quantitation via Second Generation Sequencing (SGS). The central hypothesis of the current study is that variations on this approach will support efficient identification and verification of NA-based host response signatures of infectious disease. To test this hypothesis, we re-engineered Sandia's sophisticated sample preparation pipelines, and developed new SGS data analysis tools and strategies, in order to pioneer use of SGS for identification of host NA correlating with infection. Proof-of-concept studies were carried out using specimens drawn from pathogen-infected non-human primates (NHP). This work provides a strong foundation for

  16. SecurePhone: a mobile phone with biometric authentication and e-signature support for dealing secure transactions on the fly

    Science.gov (United States)

    Ricci, R.; Chollet, G.; Crispino, M. V.; Jassim, S.; Koreman, J.; Olivar-Dimas, M.; Garcia-Salicetti, S.; Soria-Rodriguez, P.

    2006-05-01

    This article presents an overview of the SecurePhone project, with an account of the first results obtained. SecurePhone's primary aim is to realise a mobile phone prototype - the 'SecurePhone' - in which biometric authentication enables users to carry out secure, dependable transactions over a mobile network. The SecurePhone is based on a commercial PDA-phone, supplemented with specific software modules and a customised SIM card. It integrates in a single environment a number of advanced features: access to cryptographic keys through strong multimodal biometric authentication; appending and verification of digital signatures; real-time exchange and interactive modification of (e-signed) documents and voice recordings. SecurePhone's 'biometric recogniser' is based on original research. A fused combination of three different biometric methods - speaker, face and handwritten signature verification - is exploited, with no need for dedicated hardware components. The adoption of non-intrusive, psychologically neutral biometric techniques is expected to mitigate rejection problems that often inhibit the social use of biometrics, and speed up the spread of e-signature technology. Successful biometric authentication grants access to SecurePhone's built-in e-signature services through a user-friendly interface. Special emphasis is accorded to the definition of a trustworthy security chain model covering all aspects of system operation. The SecurePhone is expected to boost m-commerce and open new scenarios for m-business and m-work, by changing the way people interact and by improving trust and confidence in information technologies, often considered intimidating and difficult to use. Exploitation plans will also explore other application domains (physical and logical access control, secured mobile communications).

  17. Signature-based User Authentication

    OpenAIRE

    Hámorník, Juraj

    2015-01-01

    This work addresses the absence of handwritten signature authentication in Windows. The result of this work is standalone software that allows users to log into Windows by writing their signature. We focus on the security of signature authentication and the best overall user experience. We implemented a signature authentication service that accepts a signature and returns a user access token if the signature is genuine. Signature authentication is done by comparing the given signature to signature patterns by their similarity. Si...
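The thesis compares a presented signature against stored patterns by their similarity. One common way to score similarity between pen trajectories of different lengths (assumed here for illustration; the abstract does not name the thesis's actual measure) is dynamic time warping:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences, allowing
    one sample of either sequence to align with several of the other."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# Toy pen-coordinate traces: same stroke shape, slightly stretched in time,
# versus a trace with a clearly different shape.
genuine = [0, 1, 2, 3, 2, 1, 0]
attempt = [0, 1, 1, 2, 3, 2, 1, 0]
forgery = [0, 3, 0, 3, 0, 3, 0]
```

A genuine/forgery decision would then threshold this distance (per axis, or over multichannel features such as pressure), with the threshold tuned on enrollment samples.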

  18. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

    Full Text Available A new analytical model for the prediction of concrete response under uniaxial compression and its experimental verification is presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines rheological-dynamical analogy and damage mechanics. Within the framework of this approach the key continuum parameters such as the creep coefficient, Poisson’s ratio and damage variable are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of uniformly accelerated motion of load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Project of the Ministry of Science of the Republic of Serbia, No. ON 174027: Computational Mechanics in Structural Engineering, and No. TR 36017: Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: Investigation and environmental assessment of possible applications

  19. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  20. Design And Implementation of Low Area/Power Elliptic Curve Digital Signature Hardware Core

    Directory of Open Access Journals (Sweden)

    Anissa Sghaier

    2017-06-01

    Full Text Available The Elliptic Curve Digital Signature Algorithm (ECDSA) is the analog of the Digital Signature Algorithm (DSA). Based on elliptic curves, which use small keys compared to other public-key algorithms, ECDSA is the most suitable scheme for environments where processor power and storage are limited. This paper focuses on the hardware implementation of the ECDSA over elliptic curves with the 163-bit key length recommended by the NIST (National Institute of Standards and Technology). It offers two services: signature generation and signature verification. The proposed processor integrates an ECC IP, a Secure Hash Standard 2 IP (SHA-2 IP) and a Random Number Generator IP (RNG IP). Thus, all IPs will be optimized, and different types of RNG will be implemented in order to choose the most appropriate one. A co-simulation was done to verify the ECDSA processor using MATLAB software. All modules were implemented on a Xilinx Virtex 5 ML 50 FPGA platform; they require respectively 9670 slices, 2530 slices and 18,504 slices. FPGA implementations generally represent the first step toward faster ASIC implementations. Further, the proposed design was also implemented on an ASIC CMOS 45-nm technology; it requires a 0.257 mm2 cell area, achieves a maximum frequency of 532 MHz and consumes 63.444 mW. Furthermore, in this paper, we analyze the security of our proposed ECDSA processor against the lack of a correctness check for input points and against restart attacks.
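For illustration, the two services this processor offers, signature generation and verification, can be run end-to-end on a tiny textbook curve (y² = x³ + 2x + 2 over F17, generator (5, 1) of order 19). These parameters and the fixed nonce are for readability only; they are cryptographically worthless, and a real deployment uses a NIST 163-bit or larger curve as in the paper:

```python
# Toy ECDSA on the textbook curve y^2 = x^3 + 2x + 2 mod 17, with generator
# G = (5, 1) of prime order 19 -- small enough to check by hand, in no way
# secure. A fixed nonce k is used for reproducibility; real ECDSA requires
# a fresh unpredictable k per signature.
P, A, N = 17, 2, 19          # field prime, curve coefficient a, group order
G = (5, 1)

def inv(x, m):
    return pow(x, m - 2, m)  # modular inverse; m is prime here

def add(p1, p2):             # elliptic-curve point addition
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None          # point at infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * inv(2 * y1, P) % P
    else:
        lam = (y2 - y1) * inv(x2 - x1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, pt):              # double-and-add scalar multiplication
    out = None
    while k:
        if k & 1: out = add(out, pt)
        pt, k = add(pt, pt), k >> 1
    return out

def sign(d, h, k):           # h: message hash reduced mod N
    r = mul(k, G)[0] % N
    s = inv(k, N) * (h + d * r) % N
    return (r, s)

def verify(Q, h, sig):
    r, s = sig
    w = inv(s, N)
    X = add(mul(h * w % N, G), mul(r * w % N, Q))
    return X is not None and X[0] % N == r

d = 7                        # private key
Q = mul(d, G)                # public key
sig = sign(d, h=13, k=10)
```

Reusing or leaking the nonce k reveals the private key, which is one reason hardware designs like the one in this record dedicate an RNG IP to nonce generation.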

  1. UTEX modeling of xenon signature sensitivity to geology and explosion cavity characteristics following an underground nuclear explosion

    Science.gov (United States)

    Lowrey, J. D.; Haas, D.

    2013-12-01

    Underground nuclear explosions (UNEs) produce anthropogenic isotopes that can potentially be used in the verification component of the Comprehensive Nuclear-Test-Ban Treaty. Several isotopes of radioactive xenon gas have been identified as radionuclides of interest within the International Monitoring System (IMS) and in an On-Site Inspection (OSI). Substantial research has previously been undertaken to characterize the geologic and atmospheric mechanisms that can drive the movement of radionuclide gas from a well-contained UNE, considering both sensitivities on gas arrival time and signature variability of xenon due to the nature of subsurface transport. This work further considers sensitivities of radioxenon gas arrival time and signatures to large variability in geologic stratification and generalized explosion cavity characteristics, and compares this influence to variability in the shallow subsurface.

  2. Independent Verification Survey Report For Zone 1 Of The East Tennessee Technology Park In Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    King, David A.

    2012-01-01

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs)

  3. Real Traceable Signatures

    Science.gov (United States)

    Chow, Sherman S. M.

    Traceable signature schemes extend group signature schemes with an enhanced anonymity management mechanism. The group manager can compute a tracing trapdoor which enables anyone to test if a signature is signed by a given misbehaving user, while the only way to do so for group signatures requires revealing the signer of all signatures. Nevertheless, it is not tracing in a strict sense. For all existing schemes, T tracing agents need to re-collect all N' signatures ever produced and perform R·N' “checks” for R revoked users. This involves a high volume of transfer and computations. Increasing T increases the degree of parallelism for tracing but also the probability of “missing” some signatures in case some of the agents are dishonest.

  4. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is generally painstakingly slow, but from time to time 'windows of opportunity' - moments when ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  5. Signature Balancing

    NARCIS (Netherlands)

    Noordkamp, H.W.; Brink, M. van den

    2006-01-01

    Signatures are an important part of the design of a ship. In an ideal situation, signatures must be as low as possible. However, due to budget constraints it is most unlikely to reach this ideal situation. The arising question is which levels of signatures are optimal given the different scenarios

  6. Major urinary protein (MUP) profiles show dynamic changes rather than individual 'barcode' signatures.

    Science.gov (United States)

    Thoß, M; Luzynski, K C; Ante, M; Miller, I; Penn, D J

    2015-06-30

    House mice (Mus musculus) produce a variable number of major urinary proteins (MUPs), and studies suggest that each individual produces a unique MUP profile that provides a distinctive odor signature controlling individual and kin recognition. This 'barcode hypothesis' requires that MUP urinary profiles show high individual variability within populations and also high individual consistency over time, but tests of these assumptions are lacking. We analyzed urinary MUP profiles of 66 wild-caught house mice from eight populations using isoelectric focusing. We found that MUP profiles of wild male house mice are not individually unique, and though they were highly variable, closer inspection revealed that the variation strongly depended on MUP band type. The prominent ('major') bands were surprisingly homogenous (and hence most MUPs are not polymorphic), but we also found inconspicuous ('minor') bands that were highly variable and therefore potential candidates for individual fingerprints. We also examined changes in urinary MUP profiles of 58 males over time (from 6 to 24 weeks of age), and found that individual MUP profiles and MUP concentration were surprisingly dynamic, and showed significant changes after puberty and during adulthood. Contrary to what we expected, however, the minor bands were the most variable over time, and thus were not good candidates for individual fingerprints. Although MUP profiles do not provide individual fingerprints, we found that MUP profiles were more similar among siblings than non-kin despite considerable fluctuation. Our findings show that MUP profiles are not highly stable over time, they do not show strong individual clustering, and thus challenge the barcode hypothesis. Within-individual dynamics of MUP profiles indicate a different function of MUPs in individual recognition than previously assumed and advocate an alternative hypothesis ('dynamic changes' hypothesis).

  7. Influence of fuel composition on the spent fuel verification by Self‑Interrogation Neutron Resonance Densitometry

    International Nuclear Information System (INIS)

    Rossa, Riccardo; Borella, Alessandro; Van der Meer, Klaas; Labeau, Pierre‑Etienne; Pauly, Nicolas

    2015-01-01

    The Self-Interrogation Neutron Resonance Densitometry (SINRD) is a passive Non-Destructive Assay (NDA) technique that is developed for the safeguards verification of spent nuclear fuel. The main goal of SINRD is the direct quantification of 239Pu by estimating the SINRD signature, which is the ratio between the neutron flux in the fast energy region and in the region close to the 0.3 eV resonance of 239Pu. The resonance region was chosen because the reduction of the neutron flux within 0.2-0.4 eV is due mainly to neutron absorption by 239Pu, and therefore the SINRD signature can be correlated to the 239Pu mass in the fuel assembly. This work provides an estimate of the influence of 239Pu and other nuclides on the SINRD signature. This assessment is performed by Monte Carlo simulations by introducing several nuclides in the fuel material composition and by calculating the SINRD signature for each case. The reference spent fuel library developed by SCK CEN was used for the detailed fuel compositions of PWR 17x17 fuel assemblies with different initial enrichments, burnups, and cooling times. The results from the simulations show that the SINRD signature is mainly correlated to the 239Pu mass, with a significant influence from 235U. Moreover, the SINRD technique is largely insensitive to the cooling time of the assembly, while it is affected by the burnup and initial enrichment of the fuel. Apart from 239Pu and 235U, many other nuclides give minor contributions to the SINRD signature, especially at burnups higher than 20 GWd/tHM.
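
    The SINRD signature described above is just a ratio of flux integrals over two energy windows. A minimal sketch, assuming a tabulated flux spectrum; the fast-window bounds and function names are illustrative, not taken from the paper:

```python
def band_integral(energies, flux, lo, hi):
    """Trapezoidal integral of a tabulated flux over the band [lo, hi] eV."""
    total = 0.0
    pts = list(zip(energies, flux))
    for (e1, f1), (e2, f2) in zip(pts, pts[1:]):
        if e2 <= lo or e1 >= hi:
            continue                      # segment entirely outside the band
        a, b = max(e1, lo), min(e2, hi)   # clip the segment to the band
        fa = f1 + (f2 - f1) * (a - e1) / (e2 - e1)
        fb = f1 + (f2 - f1) * (b - e1) / (e2 - e1)
        total += 0.5 * (fa + fb) * (b - a)
    return total

def sinrd_signature(energies, flux):
    """Ratio of fast flux to flux near the 0.3 eV resonance of 239Pu."""
    fast = band_integral(energies, flux, 1e5, 1e7)       # illustrative fast window
    resonance = band_integral(energies, flux, 0.2, 0.4)  # 0.2-0.4 eV window
    return fast / resonance
```

    With a flat unit flux the ratio reduces to the ratio of the band widths; absorption by 239Pu depresses the 0.2-0.4 eV denominator and so raises the signature.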

  8. Distinguishing signatures of determinism and stochasticity in spiking complex systems

    Science.gov (United States)

    Aragoneses, Andrés; Rubido, Nicolás; Tiana-Alsina, Jordi; Torrent, M. C.; Masoller, Cristina

    2013-01-01

    We describe a method to infer signatures of determinism and stochasticity in the sequence of apparently random intensity dropouts emitted by a semiconductor laser with optical feedback. The method uses ordinal time-series analysis to classify experimental data of inter-dropout-intervals (IDIs) in two categories that display statistically significant different features. Despite the apparent randomness of the dropout events, one IDI category is consistent with waiting times in a resting state until noise triggers a dropout, and the other is consistent with dropouts occurring during the return to the resting state, which have a clear deterministic component. The method we describe can be a powerful tool for inferring signatures of determinism in the dynamics of complex systems in noisy environments, at an event-level description of their dynamics.
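
    The ordinal time-series analysis underlying such a method can be sketched as follows. This minimal Bandt-Pompe-style example (function names and the test signals are mine, not the laser data) classifies windows by their rank-order pattern; deterministic dynamics forbid some patterns that purely stochastic data permit:

```python
from itertools import permutations
from collections import Counter
import math
import random

def ordinal_pattern(window):
    # Rank-order pattern of a window, e.g. (1.2, 3.4, 0.5) -> (2, 0, 1):
    # indices of the values sorted in ascending order.
    return tuple(sorted(range(len(window)), key=lambda i: window[i]))

def ordinal_distribution(series, d=3):
    # Relative frequency of every length-d ordinal pattern.
    counts = Counter(ordinal_pattern(series[i:i + d])
                     for i in range(len(series) - d + 1))
    total = sum(counts.values())
    return {p: counts.get(p, 0) / total for p in permutations(range(d))}

def permutation_entropy(series, d=3):
    # Normalized Shannon entropy of the ordinal distribution:
    # ~1 for fully stochastic data, lower when deterministic structure
    # forbids or disfavors some patterns.
    dist = ordinal_distribution(series, d)
    h = -sum(p * math.log(p) for p in dist.values() if p > 0)
    return h / math.log(math.factorial(d))

random.seed(1)
noise = [random.random() for _ in range(5000)]   # i.i.d.: all patterns equally likely
x, logistic = 0.4, []
for _ in range(5000):                            # chaotic but deterministic map
    x = 4.0 * x * (1.0 - x)
    logistic.append(x)
```

    The fully chaotic logistic map never produces three strictly decreasing consecutive values, so one of the six length-3 patterns is forbidden and its permutation entropy falls measurably below that of white noise.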

  9. Aging in biometrics: an experimental analysis on on-line signature.

    Directory of Open Access Journals (Sweden)

    Javier Galbally

    The first consistent and reproducible evaluation of the effect of aging on dynamic signature is reported. Experiments are carried out on a database generated from two previous datasets which were acquired, under very similar conditions, in 6 sessions distributed in a 15-month time span. Three different systems, representing the current most popular approaches in signature recognition, are used in the experiments, proving the degradation suffered by this trait with the passing of time. Several template update strategies are also studied as possible measures to reduce the impact of aging on the system's performance. Different results regarding the way in which signatures tend to change with time, and their most and least stable features, are also given.

  10. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The necessity of verifying software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  11. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  12. Design of Service Net based Correctness Verification Approach for Multimedia Conferencing Service Orchestration

    Directory of Open Access Journals (Sweden)

    Cheng Bo

    2012-02-01

    Multimedia conferencing is increasingly becoming a very important and popular application over the Internet. The complexity of asynchronous communications and the need to handle large, dynamically concurrent processes in multimedia conferencing pose a significant challenge to achieving sufficient correctness guarantees, and supporting effective verification methods for multimedia conferencing service orchestration is an extremely difficult and challenging problem. In this paper, we first present the Business Process Execution Language (BPEL) based conferencing service orchestration, and mainly focus on the service net based correctness verification approach for multimedia conferencing service orchestrations, which automatically translates the BPEL-based service orchestration into a corresponding Petri net model with the Petri Net Markup Language (PNML). We also present the BPEL service net reduction rules and multimedia conferencing service orchestration correctness verification algorithms. We perform the correctness analysis and verification using the service net properties of safeness, reachability and deadlocks, and also provide an automated support tool for the formal analysis and soundness verification of multimedia conferencing service orchestration scenarios. Finally, we give the comparison and evaluations.
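
    The properties checked in the paper (safeness, reachability, deadlocks) can be explored on a toy Petri net by exhaustive search of the marking graph. A minimal sketch, with an invented "invite/join" net standing in for a translated BPEL orchestration:

```python
from collections import deque

class PetriNet:
    """Minimal place/transition net; a marking is a tuple of token counts."""
    def __init__(self, places, transitions):
        self.places = places              # ordered list of place names
        self.transitions = transitions    # {name: (consume_dict, produce_dict)}

    def enabled(self, marking, t):
        consume, _ = self.transitions[t]
        return all(marking[self.places.index(p)] >= n for p, n in consume.items())

    def fire(self, marking, t):
        consume, produce = self.transitions[t]
        m = list(marking)
        for p, n in consume.items():
            m[self.places.index(p)] -= n
        for p, n in produce.items():
            m[self.places.index(p)] += n
        return tuple(m)

    def reachable(self, initial):
        # BFS over the marking graph (terminates for bounded nets).
        seen, queue = {initial}, deque([initial])
        while queue:
            m = queue.popleft()
            for t in self.transitions:
                if self.enabled(m, t):
                    m2 = self.fire(m, t)
                    if m2 not in seen:
                        seen.add(m2)
                        queue.append(m2)
        return seen

    def deadlocks(self, initial):
        # Reachable markings in which no transition is enabled.
        return {m for m in self.reachable(initial)
                if not any(self.enabled(m, t) for t in self.transitions)}

# Toy "invite then join" conferencing flow: start -> invited -> joined.
net = PetriNet(
    places=['start', 'invited', 'joined'],
    transitions={
        'invite': ({'start': 1}, {'invited': 1}),
        'join':   ({'invited': 1}, {'joined': 1}),
    })
init = (1, 0, 0)
```

    Note that the legitimate final marking also shows up as the only "deadlock" here; soundness analyses of the kind the paper performs distinguish such terminal markings from genuinely stuck intermediate states.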

  13. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  14. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  15. Wave function of the Universe, preferred reference frame effects and metric signature transition

    International Nuclear Information System (INIS)

    Ghaffarnejad, Hossein

    2015-01-01

    A gravitational model of a non-minimally coupled Brans-Dicke (BD) scalar field ϕ with a dynamical unit time-like four-vector field is used to study flat Robertson-Walker (RW) cosmology in the presence of a variable cosmological parameter V(ϕ) = Λϕ. The aim of the paper is to seek cosmological models which exhibit metric signature transition. The problem is studied in both the classical and the quantum cosmological approach with large values of the BD parameter ω >> 1. The scale factor of the RW metric is obtained, which describes a nonsingular inflationary universe in the Lorentzian signature sector. The Euclidean signature sector of our solution describes a re-collapsing universe and is obtained from analytic continuation of the Lorentzian sector. The dynamical vector field together with the BD scalar field are treated as a fluid with a time-dependent barotropic index. They have regular (dark) matter dominance in the Euclidean (Lorentzian) sector. We solved the Wheeler-DeWitt (WD) quantum wave equation of the cosmological system. Assuming a discrete non-zero ADM mass, we obtained solutions of the WD equation as simple harmonic quantum oscillator eigenfunctionals described by Hermite polynomials. Absolute values of these eigenfunctionals have nonzero values on the hypersurface on which the metric field has signature degeneracy. Our eigenfunctionals describe a nonzero probability of the space time with Lorentzian (Euclidean) signature. Maximal probability corresponds to the ground state j = 0. (paper)

  16. Some Proxy Signature and Designated verifier Signature Schemes over Braid Groups

    OpenAIRE

    Lal, Sunder; Verma, Vandani

    2009-01-01

    Braid groups provide an alternative to number-theoretic public-key cryptography and can be implemented quite efficiently. The paper proposes five signature schemes based on braid groups: a proxy signature, a designated verifier signature, a bi-designated verifier signature, a designated verifier proxy signature and a bi-designated verifier proxy signature scheme. We also discuss the security aspects of each of the proposed schemes.

  17. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  18. Dynamic response signatures of a scaled model platform for floating wind turbines in an ocean wave basin.

    Science.gov (United States)

    Jaksic, V; O'Shea, R; Cahill, P; Murphy, J; Mandic, D P; Pakrashi, V

    2015-02-28

    Understanding the dynamic behaviour of offshore wind floating substructures is extremely important in relation to the design, operation, maintenance and management of floating wind farms. This paper presents an assessment of nonlinear signatures of dynamic responses of a scaled tension-leg platform (TLP) in a wave tank exposed to different regular wave conditions and sea states characterized by the Bretschneider, the Pierson-Moskowitz and the JONSWAP spectra. Dynamic responses of the TLP were monitored at different locations using load cells, a camera-based motion recognition system and a laser Doppler vibrometer. The analysis of variability of the TLP responses and statistical quantification of their linearity or nonlinearity, as non-destructive means of structural monitoring from the output-only condition, remains a challenging problem. In this study, the delay vector variance (DVV) method is used to statistically study the degree of nonlinearity of measured response signals from a TLP. DVV is observed to create a marker estimating the degree to which a change in signal nonlinearity reflects real-time behaviour of the structure and also to establish the sensitivity of the instruments employed to these changes. The findings can be helpful in establishing monitoring strategies and control strategies for undesirable levels or types of dynamic response and can help to better estimate changes in system characteristics over the life cycle of the structure. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  19. Identification of Signatures to Detect Undeclared Nuclear Activities at the Front-end of the Fuel Cycle

    International Nuclear Information System (INIS)

    Varga, Z.; Mayer, K.; Krajko, J.; Ho, D.; Wallenius, M.; )

    2015-01-01

    Several parameters of nuclear materials can be used to verify their sources and declared origin for safeguards purposes, such as chemical composition, nuclear material content, impurities or the isotopic compositions of major or trace-level constituents. Combining these parameters (also known as signatures) enables the verification of safeguarded materials with high confidence, and also allows detecting the use of undeclared nuclear materials. Moreover, several signatures can be used not only as comparative indicators against other samples or datasets, but also to reveal the possible origin of undeclared feed material without any prior knowledge of its provenance. The measurable signatures, however, have different strengths and require diverse analytical techniques; thus, knowledge of their variations throughout the complex production processes is of vital importance for using them in safeguards. The aim of the present study is to investigate the behaviour and relevance of as many signatures for safeguards as possible in a representative uranium ore concentrate production process. Within the framework of the European Commission Support Programme A 1753, the production of uranium ore concentrate from uranium ore was followed and sampled at each stage. By comprehensive analysis of the samples (major and minor constituents, molecular structure, morphology, rare-earth elemental pattern, trace-level organic residues, age measurement, isotopic study of S, Pb, Sr, Nd and Th), together with the process information, the role and applicability of the various signatures can be assessed. By this means the appropriate and relevant safeguards parameters can be identified, and their advantages and limitations can be revealed. (author)

  20. STUDIES OF ACOUSTIC EMISSION SIGNATURES FOR QUALITY ASSURANCE OF SS 316L WELDED SAMPLES UNDER DYNAMIC LOAD CONDITIONS

    Directory of Open Access Journals (Sweden)

    S. V. RANGANAYAKULU

    2016-10-01

    Acoustic Emission (AE) signatures of various weld defects in stainless steel 316L nuclear grade weld material are investigated. The samples were fabricated by the Tungsten Inert Gas (TIG) welding method and have final dimensions of 140 mm x 15 mm x 10 mm. AE signals from weld defects such as pinhole, porosity, lack of penetration, lack of side fusion and slag are recorded under dynamic load conditions using a specially designed mechanical jig. AE features of the weld defects were obtained using the Linear Location Technique (LLT). The results of this study indicate that stress release and structural deformation between the sections in the weld area under load conditions account for the major part of acoustic emission activity during loading.

  1. SIGNATURE: A workbench for gene expression signature analysis

    Directory of Open Access Journals (Sweden)

    Chang Jeffrey T

    2011-11-01

    Background: The biological phenotype of a cell, such as a characteristic visual image or behavior, reflects activities derived from the expression of collections of genes. As such, an ability to measure the expression of these genes provides an opportunity to develop more precise and varied sets of phenotypes. However, to use this approach requires computational methods that are difficult to implement and apply, and thus there is a critical need for intelligent software tools that can reduce the technical burden of the analysis. Tools for gene expression analyses are unusually difficult to implement in a user-friendly way because their application requires a combination of biological data curation, statistical computational methods, and database expertise. Results: We have developed SIGNATURE, a web-based resource that simplifies gene expression signature analysis by providing software, data, and protocols to perform the analysis successfully. This resource uses Bayesian methods for processing gene expression data coupled with a curated database of gene expression signatures, all carried out within a GenePattern web interface for easy use and access. Conclusions: SIGNATURE is available for public use at http://genepattern.genome.duke.edu/signature/.

  2. Integrated Aero–Vibroacoustics: The Design Verification Process of Vega-C Launcher

    Directory of Open Access Journals (Sweden)

    Davide Bianco

    2018-01-01

    The verification of a space launcher at the design level is a complex issue because of (i) the lack of a detailed modeling capability of the acoustic pressure produced by the rocket; and (ii) the difficulties in applying deterministic methods to the large-scale metallic structures. In this paper, an innovative integrated design verification process is described, based on the bridging between a new semiempirical jet noise model and a hybrid finite-element method/statistical energy analysis (FEM/SEA) approach for calculating the acceleration produced at the payload and equipment level within the structure, vibrating under the external acoustic forcing field. The result is a verification method allowing for accurate prediction of the vibroacoustics in the launcher interior, using limited computational resources and without resorting to computational fluid dynamics (CFD) data. Some examples concerning the Vega-C launcher design are shown.

  3. Major urinary protein (MUP) profiles show dynamic changes rather than individual ‘barcode’ signatures

    Science.gov (United States)

    Thoß, M.; Luzynski, K.C.; Ante, M.; Miller, I.; Penn, D.J.

    2016-01-01

    House mice (Mus musculus) produce a variable number of major urinary proteins (MUPs), and studies suggest that each individual produces a unique MUP profile that provides a distinctive odor signature controlling individual and kin recognition. This ‘barcode hypothesis’ requires that MUP urinary profiles show high individual variability within populations and also high individual consistency over time, but tests of these assumptions are lacking. We analyzed urinary MUP profiles of 66 wild-caught house mice from eight populations using isoelectric focusing. We found that MUP profiles of wild male house mice are not individually unique, and though they were highly variable, closer inspection revealed that the variation strongly depended on MUP band type. The prominent (‘major’) bands were surprisingly homogenous (and hence most MUPs are not polymorphic), but we also found inconspicuous (‘minor’) bands that were highly variable and therefore potential candidates for individual fingerprints. We also examined changes in urinary MUP profiles of 58 males over time (from 6 to 24 weeks of age), and found that individual MUP profiles and MUP concentration were surprisingly dynamic, and showed significant changes after puberty and during adulthood. Contrary to what we expected, however, the minor bands were the most variable over time, and thus were not good candidates for individual fingerprints. Although MUP profiles do not provide individual fingerprints, we found that MUP profiles were more similar among siblings than non-kin despite considerable fluctuation. Our findings show that MUP profiles are not highly stable over time, they do not show strong individual clustering, and thus challenge the barcode hypothesis. Within-individual dynamics of MUP profiles indicate a different function of MUPs in individual recognition than previously assumed and advocate an alternative hypothesis (‘dynamic changes’ hypothesis). PMID:26973837

  4. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  5. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  6. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  7. Dosimetric properties of an amorphous silicon electronic portal imaging device for verification of dynamic intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Greer, Peter B.; Popescu, Carmen C.

    2003-01-01

    Dosimetric properties of an amorphous silicon electronic portal imaging device (EPID) for verification of dynamic intensity modulated radiation therapy (IMRT) delivery were investigated. The EPID was utilized with continuous frame-averaging during the beam delivery. Properties studied included effect of buildup, dose linearity, field size response, sampling of rapid multileaf collimator (MLC) leaf speeds, response to dose-rate fluctuations, memory effect, and reproducibility. The dependence of response on EPID calibration and a dead time in image frame acquisition occurring every 64 frames were measured. EPID measurements were also compared to ion chamber and film for open and wedged static fields and IMRT fields. The EPID was linear with dose and dose rate, and response to MLC leaf speeds up to 2.5 cm s-1 was found to be linear. A field size dependent response of up to 5% relative to dmax ion-chamber measurement was found. Reproducibility was within 0.8% (1 standard deviation) for an IMRT delivery recorded at intervals over a period of one month. The dead time in frame acquisition resulted in errors in the EPID that increased with leaf speed and were over 20% for a 1 cm leaf gap moving at 1.0 cm s-1. The EPID measurements were also found to depend on the input beam profile utilized for EPID flood-field calibration. The EPID shows promise as a device for verification of IMRT, the major limitation currently being due to dead-time in frame acquisition

  8. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    We present a methodology for reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of the reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.
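
    A minimal sketch of reliability-weighted decision-level fusion; the scores, reliability values and threshold below are invented for illustration, and the paper's reliability estimator is a trained classifier rather than a given number:

```python
def fuse(scores, reliabilities):
    # Weighted sum: each modality's match score is weighted by the
    # estimated reliability of its classifier for this sample.
    total = sum(reliabilities)
    return sum(s * r for s, r in zip(scores, reliabilities)) / total

def accept(scores, reliabilities, threshold=0.5):
    # Fused decision: accept the identity claim iff the fused score
    # clears the (illustrative) verification threshold.
    return fuse(scores, reliabilities) >= threshold

# Face score is weak (0.30) but its classifier is judged unreliable
# here (0.2), while the speech score is strong (0.90) and reliable
# (0.8): fusion follows the trustworthy modality instead of a plain
# average.
fused = fuse([0.30, 0.90], [0.2, 0.8])   # 0.30*0.2 + 0.90*0.8 = 0.78
```

    Swapping the reliability estimates (trusting the weak face score instead) drags the fused score below the threshold, which is the failure mode reliability estimation is meant to avoid.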

  9. Cosmological transitions with changes in the signature of the metric

    International Nuclear Information System (INIS)

    Sakharov, A.D.

    1984-01-01

    It is conjectured that there exist states of the physical continuum which include regions with different signatures of the metric, and that the observed Universe and an infinite number of other Universes arose as a result of quantum transitions with a change in the signature of the metric. The Lagrangian in such a theory must satisfy conditions of non-negativity in the regions with even signature. Signature here means the number of time coordinates. The induced gravitational Lagrangian in a conformally invariant theory of Kaluza-Klein type evidently satisfies this requirement and leads to effective equations of the gravitational theory of macroscopic space identical to the equations of the general theory of relativity. It is suggested that in our Universe there exist, in addition to the observable (macroscopic) time dimension, two or some other even number of compactified time dimensions. It is suggested that the formation of a Euclidean region in the center of a black hole or in the cosmological contraction of the Universe (if it is predetermined by the dynamics) is a possible outcome of gravitational collapse.

  10. Nitrate denitrification with nitrite or nitrous oxide as intermediate products: Stoichiometry, kinetics and dynamics of stable isotope signatures.

    Science.gov (United States)

    Vavilin, V A; Rytov, S V

    2015-09-01

    A kinetic analysis of nitrate denitrification by one or two species of denitrifying bacteria, with glucose or ethanol as the carbon source and nitrite or nitrous oxide as intermediate products, was performed using experimental data published earlier (Menyailo and Hungate, 2006; Vidal-Gavilan et al., 2013). Modified Monod kinetics was used in the dynamic biological model, and special equations were added to the common dynamic biological model to describe how isotopic fractionation between N species changes. In contrast to the generally assumed first-order kinetics, in this paper the traditional Rayleigh equation describing stable nitrogen and oxygen isotope fractionation in nitrate was derived from the dynamic isotopic equations for any type of kinetics. In accordance with the model, in Vidal-Gavilan's experiments the maximum specific rate of nitrate reduction was lower for ethanol than for glucose, whereas the maximum specific rate of nitrite reduction was much lower for glucose than for ethanol. Thus, the intermediate nitrite concentration was negligible in the ethanol experiment, while it was significant in the glucose experiment. In Menyailo and Hungate's experiments, the low maximum specific rate of nitrous oxide reduction gave a high intermediate nitrous oxide concentration. The model showed that the dynamics of the nitrogen and oxygen isotope signatures respond to the biological dynamics. Two microbial species, rather than a single denitrifying bacterium, proved more adequate for describing the total process of nitrate denitrification to dinitrogen. Copyright © 2015 Elsevier Ltd. All rights reserved.
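The Rayleigh equation mentioned above relates the isotopic signature of the residual substrate to the fraction remaining, delta = delta_0 + eps * ln(f). A minimal sketch with illustrative parameter values (not the paper's fitted constants):

```python
import math

def rayleigh_delta(delta0, eps, f):
    """Isotopic signature (per mil) of the residual substrate when a
    fraction f of it remains, per the Rayleigh equation
    delta = delta0 + eps * ln(f), with enrichment factor eps (per mil)."""
    return delta0 + eps * math.log(f)

# Hypothetical values: initial d15N of 5 per mil, enrichment factor
# of -20 per mil, 30% of the nitrate remaining. A negative eps means
# the residual pool becomes isotopically heavier as denitrification
# proceeds.
print(rayleigh_delta(5.0, -20.0, 0.3))
```

With no consumption (f = 1) the signature equals its initial value, and it grows as f shrinks, which is the enrichment trend the paper's dynamic model reproduces.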

  11. New possibilities of digital luminescence radiography (DLR) and digital image processing for verification and portal imaging

    International Nuclear Information System (INIS)

    Zimmermann, J.S.; Blume, J.; Wendhausen, H.; Hebbinghaus, D.; Kovacs, G.; Eilf, K.; Schultze, J.; Kimmig, B.N.

    1995-01-01

    We developed a method, using digital luminescence radiography (DLR), not only for portal imaging of photon beams in excellent quality, but also for verification of electron beams. Furthermore, DLR was used as the basic instrument for image fusion of portal and verification films with the simulation film, and for image processing in ''beam's-eye-view'' verification (BEVV) of rotating beams or conformation therapy. Digital radiographs of excellent quality are obtained for verification of photon and electron beams. For photon beams, the quality improvement over conventional portal imaging can be dramatic, even more so for high-energy beams (e.g. 15-MV photon beams) than for Co-60. For electron beams, excellent results are easily obtained. By digital image fusion of one or more verification films with the simulation film or MRI planning film, more precise judgement even of small differences between simulation and verification films becomes possible. Using BEVV, it is possible to compare computer-aided simulation of rotating beams or conformation therapy with the actually delivered treatment. The basic principle of BEVV is also suitable for dynamic multileaf collimation.

  12. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the software verification and validation (SVV) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), developed within the Korea Nuclear Instrumentation and Control System (KNICS) project. The SVV criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028 and BTP-14, and they form the acceptance framework provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), formal verification and theorem proving, and automated testing, are applied to the safety software, and automated SVV tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; the New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA) and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase.

  13. Detection of Damage in Operating Wind Turbines by Signature Distances

    Directory of Open Access Journals (Sweden)

    James F. Manwell

    2013-01-01

    Wind turbines operate in the atmospheric boundary layer and are subject to complex random loading. This precludes using a deterministic response of healthy turbines as the baseline for identifying the effect of damage on the measured response of operating turbines. In the absence of such a deterministic response, the stochastic dynamic response of the tower to a shutdown maneuver is found to be affected distinctively by damage, in contrast to wind. Such a dynamic response, however, cannot be established for the blades. As an alternative, the estimate of blade damage is sought through its effect on the third or fourth modal frequency, each found to be mostly unaffected by wind. To discern the effect of damage from the wind effect on these responses, a unified method of damage detection is introduced that accommodates different responses. In this method, the dynamic responses are transformed to surfaces via continuous wavelet transforms to accentuate the effect of wind or damage on the dynamic response. Regions of significant deviation between these surfaces are then isolated in their corresponding planes to capture the change signatures. The image distances between these change signatures are shown to produce consistent estimates of damage for both the tower and the blades in the presence of varying wind field profiles.
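The transform-and-compare idea can be sketched in a few lines. The following is a simplified numpy-only stand-in, not the authors' implementation: the Ricker wavelet, the window lengths, and the two-sigma "significant deviation" threshold are all illustrative assumptions.

```python
import numpy as np

def ricker(points, a):
    # Ricker ("Mexican hat") wavelet, a common CWT analysis wavelet.
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

def cwt_surface(x, widths):
    # Stack convolutions of x with wavelets of increasing width,
    # producing a time-scale "surface" for the response.
    return np.array([np.convolve(x, ricker(min(10 * w, len(x)), w), mode="same")
                     for w in widths])

def signature_distance(x, y, widths):
    # Euclidean (image) distance between two time-scale surfaces,
    # restricted to regions of significant deviation (change signatures).
    sx, sy = cwt_surface(x, widths), cwt_surface(y, widths)
    diff = np.abs(sx - sy)
    mask = diff > diff.mean() + 2 * diff.std()  # significant-deviation regions
    return float(np.sqrt((diff[mask] ** 2).sum())) if mask.any() else 0.0
```

Identical responses yield a zero distance; a localized change in one response produces an isolated region in the time-scale plane whose image distance grows with the severity of the change.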

  14. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Verification calculations were therefore focused on the catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. Verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects.

  15. Data-driven property verification of grey-box systems by Bayesian experiment design

    NARCIS (Netherlands)

    Haesaert, S.; Van den Hof, P.M.J.; Abate, A.

    2015-01-01

    A measurement-based statistical verification approach is developed for systems with partly unknown dynamics. These grey-box systems are subject to identification experiments which, new in this contribution, enable accepting or rejecting system properties expressed in a linear-time logic. We employ a

  16. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  17. Pen and platen, piezo-electric (Engineering Materials). [Signature verification for access to restricted areas

    Energy Technology Data Exchange (ETDEWEB)

    The set of five drawings defines a writing instrument system that will reliably verify signatures, thus providing a method useful in screening persons seeking entrance to restricted areas or access to computer programs. Using a conventional ballpoint pen refill, the instrument's input derives from signals generated in its writing tip and from pressure exerted by a person writing his name or a code word on the platen (tablet). The basic principle is that accelerations of the writing tip and pressures exerted by the person writing are recorded in three axes. This combination of signals can be processed by a computer and compared with a record in the computer's memory, or a graphic transcription may be compared visually with an earlier record.
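A common way to compare such recorded acceleration and pressure traces against a stored reference is dynamic time warping (DTW), which tolerates local variations in writing speed. This is a generic sketch of that comparison, not the instrument's original matching method; the acceptance threshold is a hypothetical placeholder that would need calibration on enrollment data.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two feature sequences,
    e.g. per-sample [accel_x, accel_y, pressure] vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def verify(candidate, reference, threshold=5.0):
    # Accept when the warped distance to the enrolled reference falls
    # below a calibrated threshold (placeholder value here).
    return dtw_distance(candidate, reference) <= threshold
```

In practice the threshold is set from genuine/forgery score distributions so that false-accept and false-reject rates meet the access-control requirement.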

  18. Unconditionally Secure Quantum Signatures

    Directory of Open Access Journals (Sweden)

    Ryan Amiri

    2015-08-01

    Signature schemes, proposed in 1976 by Diffie and Hellman, have become ubiquitous across modern communications. They allow for the exchange of messages from one sender to multiple recipients, with the guarantees that messages cannot be forged or tampered with and that messages can also be forwarded from one recipient to another without compromising their validity. Signatures are different from, but no less important than, encryption, which ensures the privacy of a message. Commonly used signature protocols—signatures based on the Rivest–Shamir–Adleman (RSA) algorithm, the digital signature algorithm (DSA), and the elliptic curve digital signature algorithm (ECDSA)—are only computationally secure, similar to public key encryption methods. In fact, since these rely on the difficulty of finding discrete logarithms or factoring large primes, it is known that they will become completely insecure with the emergence of quantum computers. We may therefore see a shift towards signature protocols that will remain secure even in a post-quantum world. Ideally, such schemes would provide unconditional or information-theoretic security. In this paper, we aim to provide an accessible and comprehensive review of existing unconditionally secure signature schemes for signing classical messages, with a focus on unconditionally secure quantum signature schemes.

  19. Incipient-signature identification of mechanical anomalies in a ship-borne satellite antenna system using an ensemble multiwavelet

    International Nuclear Information System (INIS)

    He, Shuilong; Zi, Yanyang; Chen, Jinglong; Chen, Binqiang; He, Zhengjia; Zhao, Chenlu; Yuan, Jing

    2014-01-01

    The instrumented tracking and telemetry ship with a ship-borne satellite antenna (SSA) is the critical device to ensure high quality of space exploration work. To effectively detect mechanical anomalies that can lead to unexpected downtime of the SSA, an ensemble multiwavelet (EM) is presented for identifying the anomaly related incipient-signatures within the measured dynamic signals. Rather than using a predetermined basis as in a conventional multiwavelet, an EM optimizes the matching basis which satisfactorily adapts to the anomaly related incipient-signatures. The construction technique of an EM is based on the conjunction of a two-scale similarity transform (TST) and lifting scheme (LS). For the technique above, the TST improves the regularity by increasing the approximation order of multiscaling functions, while subsequently the LS enhances the smoothness and localizability via utilizing the vanishing moment of multiwavelet functions. Moreover, combining the Hilbert transform with EM decomposition, we identify the incipient-signatures induced by the mechanical anomalies from the measured dynamic signals. A numerical simulation and two successful applications of diagnosis cases (a planetary gearbox and a roller bearing) demonstrate that the proposed technique is capable of dealing with the challenging incipient-signature identification task even though spectral complexity, as well as the strong amplitude/frequency modulation effect, is present in the dynamic signals.

  20. Signatures of quantum radiation reaction in laser-electron-beam collisions

    International Nuclear Information System (INIS)

    Wang, H. Y.; Yan, X. Q.; Zepf, M.

    2015-01-01

    Electron dynamics in the collision of an electron beam with a high-intensity focused ultrashort laser pulse are investigated using three-dimensional QED particle-in-cell (PIC) simulations, and the results are compared with those calculated by classical Landau and Lifshitz PIC simulations. Significant differences are observed in the angular dependence of the electron energy distribution patterns for the two approaches, because photon emission is no longer well approximated by a continuous process in the quantum radiation-dominated regime. The stochastic nature of photon emission results in strong signatures of quantum radiation-reaction effects under certain conditions. We show that the laser spot size and duration greatly influence these signatures due to the competition between QED effects and the ponderomotive force, which is well described in the classical approximation. The clearest signatures of quantum radiation reaction are found in the limit of large laser spots and few-cycle pulse durations.

  1. Dynamic signature of molecular association in methanol

    International Nuclear Information System (INIS)

    Bertrand, C. E.; Copley, J. R. D.; Faraone, A.; Self, J. L.

    2016-01-01

    Quasielastic neutron scattering measurements and molecular dynamics simulations were combined to investigate the collective dynamics of deuterated methanol, CD 3 OD. In the experimentally determined dynamic structure factor, a slow, non-Fickian mode was observed in addition to the standard density-fluctuation heat mode. The simulation results indicate that the slow dynamical process originates from the hydrogen bonding of methanol molecules. The qualitative behavior of this mode is similar to the previously observed α-relaxation in supercooled water [M. C. Bellissent-Funel et al., Phys. Rev. Lett. 85, 3644 (2000)] which also originates from the formation and dissolution of hydrogen-bonded associates (supramolecular clusters). In methanol, however, this mode is distinguishable well above the freezing transition. This finding indicates that an emergent slow mode is not unique to supercooled water, but may instead be a general feature of hydrogen-bonding liquids and associating molecular liquids.

  2. Assuring image authenticity within a data grid using lossless digital signature embedding and a HIPAA-compliant auditing system

    Science.gov (United States)

    Lee, Jasper C.; Ma, Kevin C.; Liu, Brent J.

    2008-03-01

    A Data Grid for medical images has been developed at the Image Processing and Informatics Laboratory, USC to provide distribution and fault-tolerant storage of medical imaging studies across Internet2 and public domain. Although back-up policies and grid certificates guarantee privacy and authenticity of grid-access-points, there still lacks a method to guarantee the sensitive DICOM images have not been altered or corrupted during transmission across a public domain. This paper takes steps toward achieving full image transfer security within the Data Grid by utilizing DICOM image authentication and a HIPAA-compliant auditing system. The 3-D lossless digital signature embedding procedure involves a private 64 byte signature that is embedded into each original DICOM image volume, whereby on the receiving end the signature can be extracted and verified following the DICOM transmission. This digital signature method has also been developed at the IPILab. The HIPAA-Compliant Auditing System (H-CAS) is required to monitor embedding and verification events, and allows monitoring of other grid activity as well. The H-CAS system federates the logs of transmission and authentication events at each grid-access-point and stores it into a HIPAA-compliant database. The auditing toolkit is installed at the local grid-access-point and utilizes Syslog [1], a client-server standard for log messaging over an IP network, to send messages to the H-CAS centralized database. By integrating digital image signatures and centralized logging capabilities, DICOM image integrity within the Medical Imaging and Informatics Data Grid can be monitored and guaranteed without loss to any image quality.
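The integrity check at each grid-access-point amounts to signing the pixel data before transmission and re-verifying on receipt. As a much-simplified stand-in for the paper's lossless embedding procedure (which additionally hides the signature inside the image volume itself), here is a sketch using an HMAC-SHA-512 digest, whose 64-byte length happens to match the signature size mentioned; the key name is illustrative.

```python
import hashlib
import hmac

def sign_pixels(pixel_bytes, private_key):
    # Stand-in for the 64-byte signature: an HMAC-SHA-512 digest
    # (64 bytes) computed over the image volume's pixel data.
    return hmac.new(private_key, pixel_bytes, hashlib.sha512).digest()

def verify_pixels(pixel_bytes, private_key, signature):
    # On the receiving end: recompute the digest and compare in
    # constant time; any altered or corrupted byte fails the check.
    return hmac.compare_digest(sign_pixels(pixel_bytes, private_key), signature)
```

A true digital signature scheme would use an asymmetric key pair so that verification does not require the signing secret; the symmetric sketch above only illustrates the sign-transmit-verify flow.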

  3. Physical description of nuclear materials identification system (NMIS) signatures

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; Mullens, J.A.; Mattingly, J.K.; Valentine, T.E.

    2000-01-01

    This paper describes all time and frequency analysis parameters measured with a new correlation processor (capability up to 1 GHz sampling rates and up to five input data channels) for three input channels: (1) the 252Cf source ionization chamber; (2) a detection channel; and (3) a second detection channel. An intuitive and physical description of the various measured quantities is given, as well as a brief mathematical description and a brief description of how the data are acquired. If the full five-channel capability is used, the measured quantities increase in number but not in type. The parameters provided by this new processor can be divided into two general classes: time analysis signatures and their related frequency analysis signatures. The time analysis signatures include the number of times m pulses occur in a time interval that is triggered randomly, upon a detection event, or upon a source fission event. From the number of pulses in a time interval, the moments, factorial moments, and Feynman variance can be obtained. Recent implementations of third- and fourth-order time and frequency analysis signatures in this processor are also briefly described. Thus, this processor used with a timed source of input neutrons contains all of the information from a pulsed neutron measurement, one and two detector Rossi-α measurements, multiplicity measurements, and third- and fourth-order correlation functions. This processor, although originally designed for active measurements with a 252Cf interrogating source, has been successfully used passively (without 252Cf source) for systems with inherent neutron sources such as fissile systems of plutonium. Data from active measurements with an 18.75 kg highly enriched uranium (93.2 wt% 235U) metal casting for storage are presented to illustrate some of the various time and frequency analysis parameters. This processor, which is a five-channel time correlation analyzer with time channel widths
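From the interval counts described above, the factorial moments and the Feynman variance-to-mean statistic follow directly. A generic sketch of these estimators (not the processor's own code; the function names are illustrative):

```python
import numpy as np

def feynman_y(counts):
    """Feynman excess variance: variance-to-mean ratio minus one, from
    the number of pulses counted in each of many equal time intervals.
    Zero for Poisson counts; positive when counts are correlated
    (e.g. fission chains)."""
    counts = np.asarray(counts, float)
    return counts.var(ddof=1) / counts.mean() - 1.0

def factorial_moment(counts, r):
    """r-th factorial moment <n (n-1) ... (n-r+1)> of the count
    distribution, the quantity underlying multiplicity analysis."""
    counts = np.asarray(counts, float)
    prod = np.ones_like(counts)
    for k in range(r):
        prod *= counts - k
    return prod.mean()
```

For a purely random (Poisson) source the Feynman-Y statistic tends to zero as the gate statistics converge, while multiplying chains in fissile material drive it positive.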

  4. Abstraction of Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Wisniewski, Rafael; Sloth, Christoffer

    2011-01-01

    To enable formal verification of a dynamical system, given by a set of differential equations, it is abstracted by a finite state model. This allows for application of methods for model checking. Consequently, it opens the possibility of carrying out the verification of reachability and timing re...

  5. Verification of the ECMWF ensemble forecasts of wind speed against analyses and observations

    DEFF Research Database (Denmark)

    Pinson, Pierre; Hagedorn, Renate

    2012-01-01

    A framework for the verification of ensemble forecasts of near-surface wind speed is described. It is based on existing scores and diagnostic tools, though considering observations from synoptic stations as reference instead of the analysis. This approach is motivated by the idea of having a user-oriented view of verification, for instance with wind power applications in mind. The verification framework is specifically applied to the case of ECMWF ensemble forecasts over Europe. Dynamic climatologies are derived at the various stations, serving as a benchmark. The impact of observational uncertainty on scores and diagnostic tools is also considered. The interest of this framework is demonstrated through its application to the routine evaluation of ensemble forecasts and to the assessment of the quality improvements brought by the recent change in horizontal resolution of the ECMWF ensemble...
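One of the existing scores typically used in such an ensemble-verification framework is the continuous ranked probability score (CRPS). Its standard sample estimator for an M-member ensemble against a scalar observation is sketched below (a generic implementation, not the paper's code):

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample CRPS of an ensemble forecast against one observation:
    mean |x_i - y| - 0.5 * mean |x_i - x_j|.
    Lower is better; it reduces to absolute error for one member."""
    x = np.asarray(members, float)
    term1 = np.abs(x - obs).mean()
    term2 = 0.5 * np.abs(x[:, None] - x[None, :]).mean()
    return term1 - term2
```

Averaged over many forecast-observation pairs (here, synoptic-station wind speeds), this score rewards ensembles that are both sharp and well calibrated, which is why it is a natural benchmark against station-based dynamic climatologies.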

  6. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.
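For scalar similarity scores with Gaussian genuine-user and impostor models, the likelihood-ratio test reduces to a few lines. A hedged sketch under those toy modelling assumptions (the models and threshold are illustrative, not the paper's):

```python
import math

def gaussian_loglik(x, mean, var):
    # Log-density of a univariate Gaussian.
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def log_likelihood_ratio(x, user_mean, user_var, bg_mean, bg_var):
    """log p(x | genuine user) - log p(x | impostor/background)."""
    return (gaussian_loglik(x, user_mean, user_var)
            - gaussian_loglik(x, bg_mean, bg_var))

def accept(x, user_model, bg_model, log_threshold=0.0):
    # Accept when the log-likelihood ratio exceeds a threshold chosen
    # for the desired false-accept / false-reject trade-off.
    return log_likelihood_ratio(x, *user_model, *bg_model) >= log_threshold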

  7. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  8. Radiation signatures

    International Nuclear Information System (INIS)

    McGlynn, S.P.; Varma, M.N.

    1992-01-01

    A new concept for modelling radiation risk is proposed. This concept is based on the proposal that the spectrum of molecular lesions, which we dub ''the radiation signature'', can be used to identify the quality of the causal radiation. If the proposal concerning radiation signatures can be established then, in principle, both prospective and retrospective risk determination can be assessed on an individual basis. A major goal of biophysical modelling is to relate physical events such as ionization, excitation, etc. to the production of radiation carcinogenesis. A description of the physical events is provided by track structure. The track structure is determined by radiation quality, and it can be considered to be the ''physical signature'' of the radiation. Unfortunately, the uniqueness characteristics of this signature are dissipated in biological systems in ∼10^-9 s. Nonetheless, it is our contention that this physical disturbance of the biological system eventuates later, at ∼10^0 s, in molecular lesion spectra which also characterize the causal radiation.

  9. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Andersen, Morten T.; Wendt, Fabian; Robertson, Amy; Jonkman, Jason; Hall, Matthew

    2016-08-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  10. Trans-Planckian physics and signature change events in Bose gas hydrodynamics

    International Nuclear Information System (INIS)

    Weinfurtner, Silke; White, Angela; Visser, Matt

    2007-01-01

    We present an example of emergent spacetime as the hydrodynamic limit of a more fundamental microscopic theory. The low-energy, long-wavelength limit in our model is dominated by collective variables that generate an effective Lorentzian metric. This system naturally exhibits a microscopic mechanism allowing us to perform controlled signature change between Lorentzian and Riemannian geometries. We calculate the number of quasiparticles produced from a finite-duration Euclidean-signature event, where we take the position that to a good approximation the dynamics is dominated by the evolution of the linearized perturbations, as suggested by Calzetta and Hu [Phys. Rev. A 68, 043625 (2003)]. We adapt the ideas presented by Dray et al. [Gen. Relativ. Gravit. 23, 967 (1991)], such that the field and its canonical momentum are continuous at the signature-change event. We investigate the interplay between the underlying microscopic structure and the emergent gravitational field, focussing on its impact on quasiparticle production in the ultraviolet regime. In general, this can be thought of as the combination of trans-Planckian physics and signature-change physics. Further we investigate the possibility of using the proposed signature-change event as an amplifier for analogue 'cosmological particle production' in condensed matter experiments

  11. Petroleum Pumps’ Current and Vibration Signatures Analysis Using Wavelet Coherence Technique

    Directory of Open Access Journals (Sweden)

    Rmdan Shnibha

    2013-01-01

    Vibration analysis is widely used for rotating machinery diagnostics; however, measuring the vibration of operational oil well pumps is not possible. The pump driver's current signatures may provide condition-related information without the need for access to the pump itself. This paper investigates the degree of relationship between the pump driver's current signatures and its induced vibration. The relationship between the driver's current signatures (DCS) and its vibration signatures (DVS) is studied by calculating magnitude-squared coherence and phase coherence parameters in a certain frequency band using the continuous wavelet transform (CWT). The CWT coherence-based technique allows better analysis of the temporal evolution of the frequency content of dynamic signals, and of areas in the time-frequency plane where the two signals exhibit common power or consistent phase behaviour indicating a relationship between the signals. This novel approach is validated by experimental data acquired from a 3 kW petroleum pump's driver. Both vibration and current signatures were acquired under different speed and load conditions. The outcomes of this research suggest the use of DCS analysis as a reliable and inexpensive condition monitoring tool, which could be implemented for real-time monitoring of oil pumps as part of a condition-based maintenance (CBM) program.
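Magnitude-squared coherence itself (without the wavelet-based time localization the paper adds) can be estimated with a Welch-style average over signal segments. A simplified numpy-only sketch, omitting windowing and overlap:

```python
import numpy as np

def msc(x, y, nperseg=256):
    """Magnitude-squared coherence |Pxy|^2 / (Pxx * Pyy), averaged over
    non-overlapping segments. Values near 1 indicate a strong linear
    relationship between the two signals at that frequency."""
    segs = len(x) // nperseg
    X = np.fft.rfft(np.reshape(x[:segs * nperseg], (segs, nperseg)), axis=1)
    Y = np.fft.rfft(np.reshape(y[:segs * nperseg], (segs, nperseg)), axis=1)
    Pxx = (np.abs(X) ** 2).mean(axis=0)     # auto-spectra
    Pyy = (np.abs(Y) ** 2).mean(axis=0)
    Pxy = (X * np.conj(Y)).mean(axis=0)     # cross-spectrum
    return np.abs(Pxy) ** 2 / (Pxx * Pyy)
```

Applied to a current signature and a vibration signature, frequency bands where this quantity stays high across operating conditions are candidates for current-only condition monitoring.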

  12. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues.

  13. Dynamic thermal signature prediction for real-time scene generation

    Science.gov (United States)

    Christie, Chad L.; Gouthas, Efthimios (Themie); Williams, Owen M.; Swierkowski, Leszek

    2013-05-01

    At DSTO, a real-time scene generation framework, VIRSuite, has been developed in recent years, within which trials data are predominantly used for modelling the radiometric properties of the simulated objects. Since in many cases the data are insufficient, a physics-based simulator capable of predicting the infrared signatures of objects and their backgrounds has been developed as a new VIRSuite module. It includes transient heat conduction within the materials, and boundary conditions that take into account the heat fluxes due to solar radiation, wind convection and radiative transfer. In this paper, an overview is presented, covering both the steady-state and transient performance.
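The transient-conduction core of such a signature predictor is, in one dimension, a standard explicit finite-difference update. A much-simplified sketch under illustrative assumptions (VIRSuite's actual solver and boundary treatment are not described here; the flux term stands in for the solar/convective/radiative loads):

```python
import numpy as np

def step_heat_1d(T, alpha, dx, dt, q_left=0.0):
    """One explicit finite-difference step of 1-D transient conduction
    dT/dt = alpha * d2T/dx2, with an imposed flux term (e.g. solar
    load, scaled by conductivity) on the left face and a fixed
    temperature on the right. Stable when alpha * dt / dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
    Tn[0] = Tn[1] + q_left * dx  # simplified flux boundary condition
    return Tn
```

Stepping this update through a diurnal cycle of boundary fluxes yields the surface-temperature history from which an infrared signature would be rendered.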

  14. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  15. Electronic Signature Policy

    Science.gov (United States)

Establishes the United States Environmental Protection Agency's approach to adopting electronic signature technology and best practices to ensure electronic signatures applied to official Agency documents are legally valid and enforceable.

  16. Blinding for unanticipated signatures

    NARCIS (Netherlands)

    D. Chaum (David)

    1987-01-01

    textabstractPreviously known blind signature systems require an amount of computation at least proportional to the number of signature types, and also that the number of such types be fixed in advance. These requirements are not practical in some applications. Here, a new blind signature technique

  17. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions.
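The "gold standard" practice of verifying code against analytical solutions can be sketched as a minimal, hypothetical regression test (not part of Fluidity's actual test suite): an explicit 1D diffusion solver is compared against the exact decaying sine-mode solution, and the test fails if the error exceeds a tolerance.

```python
import numpy as np

def solve_diffusion(u0, dx, dt, nu, steps):
    """Explicit finite-difference solver for u_t = nu * u_xx with fixed ends."""
    u = u0.copy()
    for _ in range(steps):
        u[1:-1] += nu * dt / dx**2 * (u[2:] - 2*u[1:-1] + u[:-2])
    return u

def analytical(x, t, nu):
    # exact decaying sine mode: u(x, t) = exp(-nu * pi^2 * t) * sin(pi * x)
    return np.exp(-nu * np.pi**2 * t) * np.sin(np.pi * x)

nx, nu, dt, steps = 101, 0.1, 1e-4, 1000
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
u = solve_diffusion(analytical(x, 0.0, nu), dx, dt, nu, steps)
err = np.max(np.abs(u - analytical(x, dt * steps, nu)))
assert err < 1e-3, f"regression test failed: error {err:.2e}"
```

Run automatically on every commit, an assertion like this turns verification into the continuous process described above rather than a one-off event.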

  18. Dynamical Signatures of Living Systems

    Science.gov (United States)

    Zak, M.

    1999-01-01

One of the main challenges in modeling living systems is to distinguish random walks of physical origin (for instance, Brownian motions) from those of biological origin, and that distinction constitutes the starting point of the proposed approach. As conjectured, the biological random walk must be nonlinear. Indeed, any stochastic Markov process can be described by a linear Fokker-Planck equation (or its discretized version), and only that type of process has been observed in the inanimate world. However, all such processes always converge to a stable (ergodic or periodic) state, i.e., to states of lower complexity and high entropy. At the same time, the evolution of living systems is directed toward a higher level of complexity, if complexity is associated with the number of structural variations. The simplest way to mimic such a tendency is to incorporate a nonlinearity into the random walk; then the probability evolution attains the features of a nonlinear diffusion equation: the formation and dissipation of shock waves initiated by small shallow wave disturbances. As a result, the evolution never "dies:" it produces new, different configurations which are accompanied by an increase or decrease of entropy (the decrease takes place during the formation of shock waves, the increase during their dissipation). In other words, the evolution can be directed "against the second law of thermodynamics" by forming patterns outside of equilibrium in the probability space. Due to that, a species is not locked into a certain pattern of behavior: it can still perform a variety of motions, and only the statistics of these motions is constrained by this pattern. It should be emphasized that such a "twist" is based upon the concept of reflection, i.e., the existence of the self-image (adopted from psychology). The model consists of a generator of stochastic processes which represents the motor dynamics in the form of nonlinear random walks, and a simulator of the nonlinear version of the diffusion

  19. 1 CFR 18.7 - Signature.

    Science.gov (United States)

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Signature. 18.7 Section 18.7 General Provisions... PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.7 Signature. The original and each duplicate original... stamped beneath the signature. Initialed or impressed signatures will not be accepted. Documents submitted...

  20. Attribute-Based Digital Signature System

    NARCIS (Netherlands)

    Ibraimi, L.; Asim, Muhammad; Petkovic, M.

    2011-01-01

    An attribute-based digital signature system comprises a signature generation unit (1) for signing a message (m) by generating a signature (s) based on a user secret key (SK) associated with a set of user attributes, wherein the signature generation unit (1) is arranged for combining the user secret

  1. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  2. A New Adaptive Structural Signature for Symbol Recognition by Using a Galois Lattice as a Classifier.

    Science.gov (United States)

    Coustaty, M; Bertet, K; Visani, M; Ogier, J

    2011-08-01

In this paper, we propose a new approach for symbol recognition using structural signatures and a Galois lattice as a classifier. The structural signatures are based on topological graphs computed from segments which are extracted from the symbol images by using an adapted Hough transform. These structural signatures, which can be seen as dynamic paths carrying high-level information, are robust toward various transformations. They are classified by using a Galois lattice as a classifier. The performance of the proposed approach is evaluated on the GREC'03 symbol database, and the experimental results we obtain are encouraging.

  3. Verification Image of The Veins on The Back Palm with Modified Local Line Binary Pattern (MLLBP) and Histogram

    Science.gov (United States)

    Prijono, Agus; Darmawan Hangkawidjaja, Aan; Ratnadewi; Saleh Ahmar, Ansari

    2018-01-01

The verification methods used today, such as fingerprints, signatures and personal identification numbers (PINs) in banking systems, identity cards and attendance systems, are easily copied and forged. This makes such systems insecure and vulnerable to access by unauthorized persons. In this research, a verification system is implemented using images of the blood vessels on the back of the palm; this pattern is more difficult to imitate because it lies inside the human body, making it safer to use. The blood-vessel pattern on the back of the human hand is unique: even twins have different vein images. Moreover, the vein image does not depend on a person's age, so it can be used over the long term, except in cases of accident or disease. Because the vein pattern is unique, it can be used to recognize a person. In this paper, we use a modified method, Modified Local Line Binary Pattern (MLLBP), to recognize a person from the blood-vessel image, and match the extracted vein features using the Hamming distance. One verification test computes the acceptance percentage for the same person; a rejection error occurs when a person is not matched by the system against his or her own data. Comparing 15 probe images against 5 enrolled vein images for each of 10 persons yielded an 80.67% success rate. A second test verifies forgery by presenting images from a different person; verification is correct when the system rejects the forged image. For ten different persons, 94% of forgeries were rejected.
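The matching stage described above can be sketched roughly as follows. The MLLBP feature-extraction step is omitted; the template length, noise level, and acceptance threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

def hamming_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of positions at which two binary feature vectors differ."""
    return float(np.count_nonzero(a != b)) / a.size

def verify(enrolled: np.ndarray, probe: np.ndarray, threshold: float = 0.25) -> bool:
    """Accept the claimed identity when the distance falls below the threshold."""
    return hamming_distance(enrolled, probe) <= threshold

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 512)            # enrolled binary vein template
genuine = enrolled.copy()
flip = rng.choice(512, 40, replace=False)     # mild acquisition noise: flip 40 bits
genuine[flip] ^= 1
impostor = rng.integers(0, 2, 512)            # unrelated template (forgery attempt)

print(verify(enrolled, genuine))              # genuine attempt: small distance, accepted
print(verify(enrolled, impostor))             # forgery: distance near 0.5, rejected
```

A genuine probe differs in only 40/512 ≈ 7.8% of bits and passes, while an unrelated template differs in roughly half its bits and is rejected; tuning the threshold trades off the acceptance and rejection error rates the abstract reports.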

  4. Fair quantum blind signatures

    International Nuclear Information System (INIS)

    Tian-Yin, Wang; Qiao-Yan, Wen

    2010-01-01

    We present a new fair blind signature scheme based on the fundamental properties of quantum mechanics. In addition, we analyse the security of this scheme, and show that it is not possible to forge valid blind signatures. Moreover, comparisons between this scheme and public key blind signature schemes are also discussed. (general)

  5. Five Guidelines for Selecting Hydrological Signatures

    Science.gov (United States)

    McMillan, H. K.; Westerberg, I.; Branger, F.

    2017-12-01

Hydrological signatures are index values derived from observed or modeled series of hydrological data such as rainfall, flow or soil moisture. They are designed to extract relevant information about hydrological behavior, such as identifying dominant processes and determining the strength, speed and spatiotemporal variability of the rainfall-runoff response. Hydrological signatures play an important role in model evaluation. They allow us to test whether particular model structures or parameter sets accurately reproduce the runoff generation processes within the watershed of interest. Most modeling studies use a selection of different signatures to capture different aspects of the catchment response, for example evaluating overall flow distribution as well as high and low flow extremes and flow timing. Such studies often choose their own set of signatures, or may borrow subsets of signatures used in multiple other works. The link between signature values and hydrological processes is not always straightforward, leading to uncertainty and variability in hydrologists' signature choices. In this presentation, we aim to encourage a more rigorous approach to hydrological signature selection, which considers the ability of signatures to represent hydrological behavior and underlying processes for the catchment and application in question. To this end, we propose a set of guidelines for selecting hydrological signatures. We describe five criteria that any hydrological signature should conform to: Identifiability, Robustness, Consistency, Representativeness, and Discriminatory Power. We describe an example of the design process for a signature, assessing possible signature designs against the guidelines above. Due to the ubiquity of the Flow Duration Curve, we chose a signature related to it, selecting the FDC mid-section slope as a proposed signature to quantify catchment overall behavior and flashiness.
We demonstrate how assessment against each guideline could be used to
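As a concrete illustration of the proposed signature, here is a minimal sketch of the FDC mid-section slope, assuming the common definition as the slope of the log-transformed flow duration curve between the 33% and 66% exceedance percentiles (a widely used convention; the authors' exact formulation may differ):

```python
import numpy as np

def fdc_midslope(flows: np.ndarray, lo: float = 0.33, hi: float = 0.66) -> float:
    """Slope of the log flow duration curve between two exceedance probabilities."""
    q_lo = np.quantile(flows, 1.0 - lo)   # flow exceeded 33% of the time
    q_hi = np.quantile(flows, 1.0 - hi)   # flow exceeded 66% of the time
    return (np.log(q_lo) - np.log(q_hi)) / (hi - lo)

# synthetic example: a flashier catchment has more variable flows,
# hence a steeper mid-section slope (distributions are illustrative)
rng = np.random.default_rng(42)
flashy = rng.lognormal(mean=0.0, sigma=1.5, size=3650)   # ~10 years of daily flows
damped = rng.lognormal(mean=0.0, sigma=0.3, size=3650)
print(fdc_midslope(flashy) > fdc_midslope(damped))
```

Because the signature depends only on two quantiles of the flow record, it is robust to data errors in the extremes, which is the sort of property the five guidelines are designed to assess.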

  6. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  7. Characterization of the non-uniqueness of used nuclear fuel burnup signatures through a Mesh-Adaptive Direct Search

    Energy Technology Data Exchange (ETDEWEB)

    Skutnik, Steven E., E-mail: sskutnik@utk.edu; Davis, David R.

    2016-05-01

The use of passive gamma and neutron signatures from fission indicators is a common means of estimating used fuel burnup, enrichment, and cooling time. However, while characteristic fission product signatures such as ¹³⁴Cs, ¹³⁷Cs, ¹⁵⁴Eu, and others are generally reliable estimators for used fuel burnup in contexts where the assembly initial enrichment and the discharge time are known, in the absence of initial enrichment and/or cooling time information (such as when applying NDA measurements in a safeguards/verification context), these fission product indicators no longer yield a unique solution for assembly enrichment, burnup, and cooling time after discharge. Through the use of a new Mesh-Adaptive Direct Search (MADS) algorithm, it is possible to directly probe the shape of this “degeneracy space” characteristic of individual nuclides (and combinations thereof), both as a function of constrained parameters (such as the assembly irradiation history) and unconstrained parameters (e.g., the cooling time before measurement and the measurement precision for particular indicator nuclides). In doing so, this affords the identification of potential means of narrowing the uncertainty space of possible assembly enrichment, burnup, and cooling time combinations, thereby bounding estimates of assembly plutonium content. In particular, combinations of gamma-emitting nuclides with distinct half-lives (e.g., ¹³⁴Cs with ¹³⁷Cs and ¹⁵⁴Eu) in conjunction with gross neutron counting (via ²⁴⁴Cm) can reasonably constrain the degeneracy space of possible solutions to one small enough to perform useful discrimination and verification of fuel assemblies based on their irradiation history.
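The non-uniqueness at the heart of the abstract can be illustrated with a deliberately simplified toy model (not the MADS algorithm itself): if the ¹³⁷Cs signal is treated as proportional to burnup and decaying with its roughly 30-year half-life, then many (burnup, cooling time) pairs reproduce the same measured intensity. The linear yield model is an illustrative assumption:

```python
import numpy as np

HALF_LIFE_CS137 = 30.08            # years
LAMBDA = np.log(2) / HALF_LIFE_CS137

def cs137_signal(burnup_gwd_t: float, cooling_years: float) -> float:
    # toy model: fission-product inventory proportional to burnup,
    # decayed over the cooling time (illustrative, not a depletion calculation)
    return burnup_gwd_t * np.exp(-LAMBDA * cooling_years)

measured = cs137_signal(40.0, 5.0)   # "true" assembly: 40 GWd/t, cooled 5 years

# assemblies with higher burnup but longer cooling give the same signal
for cooling in (5.0, 10.0, 20.0):
    burnup = measured / np.exp(-LAMBDA * cooling)
    print(f"burnup {burnup:5.1f} GWd/t, cooled {cooling:4.1f} y "
          f"-> signal {cs137_signal(burnup, cooling):.3f}")
```

Each printed combination yields an identical signal, which is why a single indicator cannot pin down burnup without cooling-time information; combining nuclides with different half-lives, as the abstract describes, breaks this degeneracy.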

  8. Biological signatures of dynamic river networks from a coupled landscape evolution and neutral community model

    Science.gov (United States)

    Stokes, M.; Perron, J. T.

    2017-12-01

    Freshwater systems host exceptionally species-rich communities whose spatial structure is dictated by the topology of the river networks they inhabit. Over geologic time, river networks are dynamic; drainage basins shrink and grow, and river capture establishes new connections between previously separated regions. It has been hypothesized that these changes in river network structure influence the evolution of life by exchanging and isolating species, perhaps boosting biodiversity in the process. However, no general model exists to predict the evolutionary consequences of landscape change. We couple a neutral community model of freshwater organisms to a landscape evolution model in which the river network undergoes drainage divide migration and repeated river capture. Neutral community models are macro-ecological models that include stochastic speciation and dispersal to produce realistic patterns of biodiversity. We explore the consequences of three modes of speciation - point mutation, time-protracted, and vicariant (geographic) speciation - by tracking patterns of diversity in time and comparing the final result to an equilibrium solution of the neutral model on the final landscape. Under point mutation, a simple model of stochastic and instantaneous speciation, the results are identical to the equilibrium solution and indicate the dominance of the species-area relationship in forming patterns of diversity. The number of species in a basin is proportional to its area, and regional species richness reaches its maximum when drainage area is evenly distributed among sub-basins. Time-protracted speciation is also modeled as a stochastic process, but in order to produce more realistic rates of diversification, speciation is not assumed to be instantaneous. Rather, each new species must persist for a certain amount of time before it is considered to be established. When vicariance (geographic speciation) is included, there is a transient signature of increased

  9. Different collectivity in the two signatures of the i13/2 stemming band in 167Yb

    International Nuclear Information System (INIS)

    Petkov, P; Gladnishki, K A; Dewald, A; Fransen, C; Hackstein, M; Jolie, J; Pissulla, Th; Rother, W; Zell, K O; Möller, O; Reese, M; Deloncle, I

    2014-01-01

Six lifetimes have been determined in the 5/2⁺[642] band of νi₁₃/₂ parentage in ¹⁶⁷Yb by means of Recoil-distance Doppler-shift (RDDS) measurements carried out at the Cologne FN tandem. The deduced transition strengths and the level scheme are reasonably described by particle-plus-triaxial-rotor model (PTRM) calculations, except for the behavior of the quadrupole collectivity in the two signatures of the 5/2⁺[642] band. In that band, the quadrupole collectivity of the favored signature is appreciably larger than that of the unfavored signature, and the effect increases with spin. Naturally, the rigid PTRM cannot explain these features, but the structure of its wave functions suggests a possible solution, associated with the enhanced contribution of low-Ω orbitals of νi₁₃/₂ parentage in the favored signature compared to the unfavored one. This could selectively increase the deformation of the favored-signature band members and give rise to a dynamic shape coexistence between the two signatures, which needs quantitative explanation by future theoretical work.

  10. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  11. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  12. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  13. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  14. Hybrid Control and Verification of a Pulsed Welding Process

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Larsen, Jesper Abildgaard; Izadi-Zamanabadi, Roozbeh

Systems that we wish to control are becoming more and more complex, and classical control-theoretic objectives, such as stability or sensitivity, are often not sufficient to cover the control objectives of such systems. In this paper it is shown how the dynamics of a pulsed welding process can be reformulated into a timed-automaton hybrid setting, and subsequently properties such as reachability and deadlock absence are verified with the simulation and verification tool UPPAAL.

  15. Analysis of Radar Doppler Signature from Human Data

    Directory of Open Access Journals (Sweden)

    M. ANDRIĆ

    2014-04-01

Full Text Available This paper presents the results of time-domain (autocorrelation) and time-frequency (spectrogram) analyses of radar signals returned from moving human targets. When a radar signal falls on a human target moving toward or away from the radar, the signals reflected from different parts of the body produce a Doppler shift that is proportional to the velocity of those parts. The moving parts of the body cause a characteristic Doppler signature. The main contribution comes from the torso, which causes the central Doppler frequency of the target. The motion of arms and legs induces modulation on the returned radar signal and generates sidebands around the central Doppler frequency, referred to as micro-Doppler signatures. Analyses of experimental data demonstrated that extraction of the human motion signature is better with the spectrogram. While the central Doppler frequency can be determined using either the autocorrelation or the spectrogram, extraction of the fundamental cadence frequency using the autocorrelation is unreliable when the target is in the presence of clutter. It was shown that the fundamental cadence frequency increases with increasingly dynamic movement of people, and that the possibility of its extraction is proportional to the degree of synchronization of movements of persons in the group.
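The micro-Doppler structure described above can be sketched with a synthetic return: a constant torso Doppler line plus a sinusoidal limb modulation at the gait cadence, analyzed with a short-time Fourier transform. All signal parameters (sampling rate, Doppler frequencies, cadence) are illustrative assumptions, not values from the measurements:

```python
import numpy as np

fs = 1000.0                                  # sampling rate [Hz]
t = np.arange(0.0, 2.0, 1.0/fs)
f_torso, f_md, cadence = 100.0, 30.0, 2.0    # torso Doppler, micro-Doppler swing, cadence [Hz]

# complex return: torso line with sinusoidal frequency modulation from limb motion,
# so the instantaneous frequency swings between 70 and 130 Hz at the gait cadence
phase = 2*np.pi*f_torso*t + (f_md/cadence)*np.sin(2*np.pi*cadence*t)
sig = np.exp(1j * phase)

def spectrogram(x, nwin=128, hop=32):
    """Power spectrogram via a Hann-windowed short-time Fourier transform."""
    win = np.hanning(nwin)
    frames = [x[i:i+nwin] * win for i in range(0, len(x) - nwin, hop)]
    return np.abs(np.fft.fft(frames, axis=1))**2   # shape: (frames, freq bins)

S = spectrogram(sig)
freqs = np.fft.fftfreq(128, 1.0/fs)
peak = freqs[np.argmax(S.mean(axis=0))]   # dominant line within the 70-130 Hz band
```

Plotting `S` over time shows the sidebands oscillating around the central Doppler line at the cadence frequency, which is exactly the structure the spectrogram resolves and the autocorrelation struggles to extract in clutter.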

  16. EG-07CELL CYCLE SIGNATURE AND TUMOR PHYLOGENY ARE ENCODED IN THE EVOLUTIONARY DYNAMICS OF DNA METHYLATION IN GLIOMA

    Science.gov (United States)

    Mazor, Tali; Pankov, Aleksandr; Johnson, Brett E.; Hong, Chibo; Bell, Robert J.A.; Smirnov, Ivan V.; Reis, Gerald F.; Phillips, Joanna J.; Barnes, Michael; Bollen, Andrew W.; Taylor, Barry S.; Molinaro, Annette M.; Olshen, Adam B.; Song, Jun S.; Berger, Mitchel S.; Chang, Susan M.; Costello, Joseph F.

    2014-01-01

    The clonal evolution of tumor cell populations can be reconstructed from patterns of genetic alterations. In contrast, tumor epigenetic states, including DNA methylation, are reversible and sensitive to the tumor microenvironment, presumably precluding the use of epigenetics to discover tumor phylogeny. Here we examined the spatial and temporal dynamics of DNA methylation in a clinically and genetically characterized cohort of IDH1-mutant low-grade gliomas and their patient-matched recurrences. WHO grade II gliomas are diffuse, infiltrative tumors that frequently recur and may undergo malignant progression to a higher grade with a worse prognosis. The extent to which epigenetic alterations contribute to the evolution of low-grade gliomas, including malignant progression, is unknown. While all gliomas in the cohort exhibited the hypermethylation signature associated with IDH1 mutation, low-grade gliomas that underwent malignant progression to high-grade glioblastoma (GBM) had a unique signature of DNA hypomethylation enriched for active enhancers, as well as sites of age-related hypermethylation in the brain. Genes with promoter hypomethylation and concordant transcriptional upregulation during evolution to GBM were enriched in cell cycle function, evolving in concert with genetic alterations that deregulate the G1/S cell cycle checkpoint. Despite the plasticity of tumor epigenetic states, phyloepigenetic trees robustly recapitulated phylogenetic trees derived from somatic mutations in the same patients. These findings highlight widespread co-dependency of genetic and epigenetic events throughout the clonal evolution of initial and recurrent glioma.

  17. Quantum messages with signatures forgeable in arbitrated quantum signature schemes

    International Nuclear Information System (INIS)

    Kim, Taewan; Choi, Jeong Woon; Jho, Nam-Su; Lee, Soojoon

    2015-01-01

    Even though a method to perfectly sign quantum messages has not been known, the arbitrated quantum signature scheme has been considered as one of the good candidates. However, its forgery problem has been an obstacle to the scheme becoming a successful method. In this paper, we consider one situation, which is slightly different from the forgery problem, that we use to check whether at least one quantum message with signature can be forged in a given scheme, although all the messages cannot be forged. If there are only a finite number of forgeable quantum messages in the scheme, then the scheme can be secured against the forgery attack by not sending forgeable quantum messages, and so our situation does not directly imply that we check whether the scheme is secure against the attack. However, if users run a given scheme without any consideration of forgeable quantum messages, then a sender might transmit such forgeable messages to a receiver and in such a case an attacker can forge the messages if the attacker knows them. Thus it is important and necessary to look into forgeable quantum messages. We show here that there always exists such a forgeable quantum message-signature pair for every known scheme with quantum encryption and rotation, and numerically show that there are no forgeable quantum message-signature pairs that exist in an arbitrated quantum signature scheme. (paper)

  18. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  19. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  20. Quantitative Evaluation of Serum Proteins Uncovers a Protein Signature Related to Maturity-Onset Diabetes of the Young (MODY).

    Science.gov (United States)

    Tuerxunyiming, Muhadasi; Xian, Feng; Zi, Jin; Yimamu, Yilihamujiang; Abuduwayite, Reshalaiti; Ren, Yan; Li, Qidan; Abudula, Abulizi; Liu, SiQi; Mohemaiti, Patamu

    2018-01-05

Maturity-onset diabetes of the young (MODY) is an inherited monogenic type of diabetes. Genetic mutations in MODY often cause nonsynonymous changes that directly lead to functional distortion of proteins and to pathological consequences. Herein, we proposed that the inherited mutations found in a MODY family could cause a disturbance of protein abundance, specifically in serum. Serum samples were collected from a Uyghur MODY family through three generations, and the serum proteins, after depletion treatment, were examined by quantitative proteomics to characterize MODY-related serum proteins, followed by verification using targeted quantitative proteomics. A total of 32 serum proteins were preliminarily identified as MODY-related. A further verification test on the individual samples identified 12 candidates with significantly different abundance in the MODY patients. A comparison of the 12 proteins among the sera of type 1 diabetes, type 2 diabetes, MODY, and healthy subjects revealed a MODY-related protein signature composed of serum proteins such as SERPINA7, APOC4, LPA, C6, and F5.

  1. Dynamic knowledge representation using agent-based modeling: ontology instantiation and verification of conceptual models.

    Science.gov (United States)

    An, Gary

    2009-01-01

    The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally qualitatively expressed using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused at determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.

  2. Towards Dynamic Updates in Service Composition

    Directory of Open Access Journals (Sweden)

    Mario Bravetti

    2015-12-01

    We survey our results on the verification of adaptable processes. We present adaptable processes as a way of overcoming the limitations that process calculi have in describing patterns of dynamic process evolution. Such patterns rely on direct ways of controlling the behavior and location of running processes, and so they are at the heart of the adaptation capabilities present in many modern concurrent systems. Adaptable processes have named scopes and are sensitive to actions of dynamic update at runtime; this allows us to express dynamic and static topologies of adaptable processes as well as different evolvability patterns for concurrent processes. We introduce a core calculus of adaptable processes and consider verification problems for them: first based on specific properties related to error occurrence, which we call bounded and eventual adaptation, and then by considering a simple yet expressive temporal logic over adaptable processes. We provide (un)decidability results for such verification problems over adaptable processes, considering the spectrum of topologies/evolvability patterns introduced. We then consider distributed adaptability, where a process can update part of a protocol by performing dynamic distributed updates over a set of protocol participants. Dynamic updates in this context are presented as an extension of our work on choreographies and behavioural contracts in multiparty interactions. We show how the update mechanisms considered for adaptable processes can be used to extend the theory of choreography and orchestration/contracts, allowing them to be modified at run-time by internal (self-adaptation) or external intervention.

  3. Characterizing Resident Space Object Earthshine Signature Variability

    Science.gov (United States)

    Van Cor, Jared D.

    There are three major sources of illumination on objects in the near Earth space environment: Sunshine, Moonshine, and Earthshine. For objects in this environment (satellites, orbital debris, etc.) known as Resident Space Objects (RSOs), the sun and the moon have consistently small illuminating solid angles and can be treated as point sources; this makes their incident illumination easily modeled. The Earth, on the other hand, has a large illuminating solid angle, is heterogeneous, and is in a constant state of change. The objective of this thesis was to characterize the impact and variability of observed RSO Earthshine on apparent magnitude signatures in the visible optical spectral region. A key component of this research was creating Earth object models incorporating the reflectance properties of the Earth. Two Earth objects were created: a homogeneous diffuse Earth object and a time-sensitive heterogeneous Earth object. The homogeneous diffuse Earth object has a reflectance equal to the average global albedo, a standard model used when modeling Earthshine. The time-sensitive heterogeneous Earth object was created with two material maps representative of the dynamic reflectance of the surface of the Earth, and a shell representative of the atmosphere. NASA's Moderate-resolution Imaging Spectroradiometer (MODIS) Earth observing satellite product libraries, the MCD43C1 global surface BRDF map and the MOD06 global fractional cloud map, were utilized to create the material maps, and a hybridized version of the Empirical Line Method (ELM) was used to create the atmosphere. This dynamic Earth object was validated by comparing simulated color imagery of the Earth to that taken by NASA's Earth Polychromatic Imaging Camera (EPIC), located on the Deep Space Climate Observatory (DSCOVR), and by MODIS, located on the Terra satellite. The time-sensitive heterogeneous Earth object deviated from MODIS imagery by a spectral radiance root mean square error (RMSE) of ±14.86 W/(m²·sr).

  4. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that translates the bytecode into static single assignment (SSA) form.
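    The traditional iterative data-flow verification that the SSA-based approach replaces can be sketched for a toy stack machine. The instruction set, type lattice, and control-flow-graph representation below are invented for illustration; real JVM verification is far richer:

```python
def exec_instr(instr, stack):
    """Abstract execution of one instruction over a stack of type names."""
    op = instr[0]
    stack = list(stack)
    if op == "push_int":
        stack.append("int")
    elif op == "push_ref":
        stack.append("ref")
    elif op == "add":                      # requires two ints, pushes an int
        a, b = stack.pop(), stack.pop()
        if a != "int" or b != "int":
            raise TypeError("add applied to non-int operands")
        stack.append("int")
    elif op == "pop":
        stack.pop()
    else:
        raise ValueError("unknown opcode: " + op)
    return stack

def merge(old, new):
    """Join-point merge: depths must agree; conflicting slots become 'top'."""
    if old is None:
        return new
    if len(old) != len(new):
        raise TypeError("stack depth mismatch at join point")
    return [a if a == b else "top" for a, b in zip(old, new)]

def verify(blocks, edges):
    """Iterate abstract execution to a fixpoint over the control-flow graph."""
    in_state = {name: None for name in blocks}
    in_state["entry"] = []
    work = ["entry"]
    while work:
        name = work.pop()
        state = in_state[name]
        for instr in blocks[name]:
            state = exec_instr(instr, state)
        for succ in edges.get(name, []):
            merged = merge(in_state[succ], state)
            if merged != in_state[succ]:
                in_state[succ] = merged
                work.append(succ)
    return in_state

# A diamond CFG: both branches push an int, so 'add' at the join is type-safe.
blocks = {
    "entry": [("push_int",)],
    "then": [("push_int",)],
    "else": [("push_int",)],
    "join": [("add",)],
}
edges = {"entry": ["then", "else"], "then": ["join"], "else": ["join"]}
print(verify(blocks, edges)["join"])   # ['int', 'int'] flows into the join
```

    The fixpoint iteration over merge points is exactly what makes the classical analysis iterative; translating to SSA instead makes each value's definition explicit, so types can be checked without re-running the data flow.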

  5. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. 
This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use cases.
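    As a flavour of the quantitative evaluation that MET automates, two of the most basic verification statistics can be computed as follows. This is a generic sketch, not MET's actual interface or output format, and the forecast/observation values are invented:

```python
import math

def bias_and_rmse(forecast, observed):
    """Two basic forecast verification statistics: mean error (bias) and
    root-mean-square error, computed over matched forecast/observation pairs."""
    n = len(forecast)
    errors = [f - o for f, o in zip(forecast, observed)]
    bias = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    return bias, rmse

fcst = [21.0, 18.5, 25.0, 17.0]   # hypothetical forecast 2-m temperatures (C)
obs = [20.0, 19.0, 23.5, 16.5]    # matched observations
b, r = bias_and_rmse(fcst, obs)
print(round(b, 3), round(r, 3))   # warm bias; RMSE slightly larger than bias
```

    MET's Point-Stat and Grid-Stat tools compute these and many more statistics over gridded model output; the scripts described above wrap that machinery so forecasters need not configure it by hand.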

  6. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing the responses of different catchments, for understanding catchment organisation and similarity, and for many other modelling and water-management applications. Such information, derived as index values from observed data, is known as hydrological signatures; these can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), flow variability, the flow duration curve, and the runoff ratio. Because hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitudes and methods for their assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was carried out for two data-rich catchments, the 50 km² Mahurangi catchment in New Zealand and the 135 km² Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculating the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments to quantify uncertainty.
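    The Monte Carlo approach described above can be sketched for one simple signature, the runoff ratio. The multiplicative Gaussian error model and all numbers below are illustrative assumptions, not the rating-curve and gauge error models of the study:

```python
import random
import statistics

def runoff_ratio(flow, rain):
    """Hydrological signature: total streamflow divided by total rainfall."""
    return sum(flow) / sum(rain)

def signature_uncertainty(flow, rain, rel_err_q=0.10, rel_err_p=0.08,
                          n_samples=2000, seed=42):
    """Propagate data uncertainty into the signature by Monte Carlo:
    perturb each series with multiplicative Gaussian noise (an invented
    error model used only for this sketch), recompute the signature for
    every realization, and summarize the resulting distribution."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        q = [v * rng.gauss(1.0, rel_err_q) for v in flow]
        p = [v * rng.gauss(1.0, rel_err_p) for v in rain]
        samples.append(runoff_ratio(q, p))
    samples.sort()
    return (statistics.median(samples),
            (samples[int(0.05 * n_samples)], samples[int(0.95 * n_samples)]))

flow = [1.2, 0.9, 2.5, 1.1, 0.7]   # synthetic daily flow (mm)
rain = [3.0, 2.0, 6.0, 2.5, 1.5]   # synthetic daily rainfall (mm)
med, (lo, hi) = signature_uncertainty(flow, rain)
print(f"runoff ratio ~{med:.2f}, 90% interval [{lo:.2f}, {hi:.2f}]")
```

    The same pattern applies to any signature: substitute the signature function and an error model appropriate to the data source, and the sample distribution gives the signature's uncertainty bounds.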

  7. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next, followed by a Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration. A multi-fluid formulation is under development; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems, and due to the flexibility of the underlying MOOSE framework it is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite is to provide baseline comparison data demonstrating the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using BIGHORN.
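    For readers unfamiliar with finite volume methods, the cell-average update they are built on can be shown with a minimal first-order upwind scheme for 1-D linear advection. This is purely an illustrative sketch; BIGHORN's second-order compressible-flow schemes are far more elaborate:

```python
def upwind_advect(u, c, dt, dx, steps):
    """First-order upwind finite-volume update for 1-D linear advection on a
    periodic domain: each cell average changes only through face fluxes."""
    nu = c * dt / dx              # CFL number; stability requires nu <= 1
    assert 0.0 <= nu <= 1.0, "unstable: CFL condition violated"
    u = list(u)
    for _ in range(steps):
        # u[i - 1] with i == 0 wraps to u[-1]: periodic boundary condition.
        u = [u[i] - nu * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

u0 = [1.0 if 2 <= i <= 4 else 0.0 for i in range(10)]   # square pulse
u1 = upwind_advect(u0, c=1.0, dt=0.1, dx=0.1, steps=10)
print(u1 == u0)   # True: with nu = 1 the pulse returns exactly after a period
```

    With nu = 1 the update reduces to an exact one-cell shift per step, which is why ten steps on a ten-cell ring reproduce the initial condition; at nu < 1 the scheme transports the pulse but diffuses it, the classic first-order behavior that higher-order methods like BIGHORN's are designed to reduce.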

  8. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

    Numerical simulation of the temperature field of cooled turbine blades is a required element of the gas turbine engine design process. Verification is usually performed on the basis of test results for a full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten metal thermostat for verification of the thermal model of a cooled blade is proposed in this paper. The method allows obtaining local values of heat flux at each point of the blade surface within a single experiment. The error in determining local heat transfer coefficients using this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals the zinc melting point. The verification of the thermal-hydraulic model of a high-pressure turbine blade, with cooling allowing asymmetrical heat removal from the pressure and suction sides, was carried out using the developed method. An analysis of the heat transfer coefficients confirmed the high level of heat transfer at the leading edge, comparable with that of jet impingement. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge towards the pressure side.

  9. Identification of uranium signatures in swipe samples on verification of nuclear activities for nuclear safeguards purposes

    International Nuclear Information System (INIS)

    Pestana, Rafael Cardoso Baptistini

    2013-01-01

    The use of environmental sampling for safeguards purposes has been applied by the International Atomic Energy Agency (IAEA) since 1996 and is routinely used as a complementary measure to strengthen the traditional nuclear safeguards procedures. The aim is to verify that states signatory to the safeguards agreements are not diverting their peaceful nuclear activities to undeclared nuclear activities. This work describes a new protocol for the collection and analysis of swipe samples for the identification of nuclear signatures that may be related to the nuclear activities developed in the inspected facility. A real uranium conversion plant of the nuclear fuel cycle of IPEN was used as a case study. The proposed strategy uses different analytical techniques, such as alpha radiation measurement, SEM-EDX and ICP-MS, to identify signatures of uranium adhered to the swipe samples. In the swipe sample analysis, it was possible to identify particles of UO2F2 and UF4 through morphological comparison and the semi-quantitative analyses performed by the SEM-EDX technique. Methods yielding the average isotopic composition of the sample were used, in which the enrichment ranged from 1.453 % ± 0.023 % to 18.24 % ± 0.15 % in the 235U isotope. Through these external, non-intrusive collections, it was possible to identify enriched material handling activities with enrichment from 1.453 % ± 0.023 % to 6.331 % ± 0.055 % in the 235U isotope, as well as the use of reprocessed material, through the identification of the 236U isotope. The uncertainties obtained for the n(235U)/n(238U) ratio varied from 0.40 % to 0.86 % for the internal swipe samples. (author)

  10. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
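    As an illustration of the kind of algorithm being verified, a Welch-Lynch style fault-tolerant convergence function can be sketched as follows. This is a severe simplification for illustration; the algorithm analyzed in the paper and its EHDM verification are much more involved:

```python
def fault_tolerant_midpoint(readings, f):
    """Welch-Lynch style convergence function: discard the f lowest and the
    f highest clock readings, then take the midpoint of the surviving
    extremes. With n > 3f clocks this bounds the influence of up to f
    Byzantine (arbitrarily faulty) clocks on the synchronized value."""
    assert len(readings) > 3 * f, "need n > 3f clocks"
    trimmed = sorted(readings)[f:len(readings) - f]
    return (trimmed[0] + trimmed[-1]) / 2.0

# Four clocks, one of them Byzantine (wildly wrong), f = 1:
synced = fault_tolerant_midpoint([100.2, 99.8, 100.1, 57.0], f=1)
print(round(synced, 2))   # the faulty reading cannot drag the result far
```

    The correctness argument, that trimming f values from each end guarantees the surviving extremes are bracketed by non-faulty readings, is exactly the kind of subtle claim where machine-checked proof catches errors that published hand proofs miss.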

  11. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies - the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW) - share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States - India, Pakistan and Israel - from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty.
The world is

  12. Off-fault plasticity in three-dimensional dynamic rupture simulations using a modal Discontinuous Galerkin method on unstructured meshes: Implementation, verification, and application

    Science.gov (United States)

    Wollherr, Stephanie; Gabriel, Alice-Agnes; Uphoff, Carsten

    2018-05-01

    The dynamics and potential size of earthquakes depend crucially on rupture transfers between adjacent fault segments. To accurately describe earthquake source dynamics, numerical models can account for realistic fault geometries and rheologies such as nonlinear inelastic processes off the slip interface. We present the implementation, verification, and application of off-fault Drucker-Prager plasticity in the open source software SeisSol (www.seissol.org). SeisSol is based on an arbitrary high-order derivative modal Discontinuous Galerkin (ADER-DG) method using unstructured tetrahedral meshes specifically suited for complex geometries. Two implementation approaches are detailed, modelling plastic failure either by employing sub-elemental quadrature points or by switching to nodal basis coefficients. At fine fault discretizations the nodal basis approach is up to 6 times more efficient in terms of computational cost while yielding comparable accuracy. Both methods are verified in community benchmark problems and by three-dimensional numerical h- and p-refinement studies with heterogeneous initial stresses. We observe no spectral convergence for on-fault quantities with respect to a given reference solution, but rather discuss a limitation to low-order convergence for heterogeneous 3D dynamic rupture problems. For simulations including plasticity, a high fault resolution may be less crucial than commonly assumed, due to the regularization of peak slip rate and an increase of the minimum cohesive zone width. In large-scale dynamic rupture simulations based on the 1992 Landers earthquake, we observe high rupture complexity including reverse slip, direct branching, and dynamic triggering. The spatio-temporal distribution of rupture transfers is altered distinctly by plastic energy absorption, correlated with locations of geometrical fault complexity. Computational cost increases by 7% when accounting for off-fault plasticity in this demonstration application. Our results
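    The Drucker-Prager criterion used for off-fault plasticity can be evaluated pointwise as follows. Sign conventions and parameter values here are illustrative choices; SeisSol's implementation operates on the ADER-DG solution representation rather than a single stress tensor:

```python
import math

def drucker_prager(stress, alpha, k):
    """Drucker-Prager yield function F = sqrt(J2) + alpha * I1 - k for a
    symmetric 3x3 stress tensor (compression taken negative). F < 0 means
    the point responds elastically; F > 0 means plastic yielding, which in
    a dynamic rupture code triggers relaxation of the deviatoric stress."""
    i1 = stress[0][0] + stress[1][1] + stress[2][2]     # first invariant
    mean = i1 / 3.0
    dev = [[stress[i][j] - (mean if i == j else 0.0)    # deviatoric part
            for j in range(3)] for i in range(3)]
    j2 = 0.5 * sum(dev[i][j] ** 2 for i in range(3) for j in range(3))
    return math.sqrt(j2) + alpha * i1 - k

# Compressive stress state with modest shear (arbitrary units):
s = [[-10.0,   2.0,   0.0],
     [  2.0, -10.0,   0.0],
     [  0.0,   0.0, -10.0]]
f = drucker_prager(s, alpha=0.3, k=5.0)
print(f)    # negative: below yield, the material responds elastically
```

    Because the alpha * I1 term lowers F under compression, confining pressure strengthens the material, which is why plastic energy absorption in the simulations concentrates where geometrical fault complexity produces stress concentrations.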

  13. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  14. Implementation of RSA 2048-bit and AES 256-bit with Digital Signature for Secure Electronic Health Record Application

    Directory of Open Access Journals (Sweden)

    Mohamad Ali Sadikin

    2016-10-01

    This research addresses the implementation of encryption and digital signature techniques for electronic health records to prevent cybercrime such as theft, modification and unauthorised access. In this research, the RSA 2048-bit algorithm, AES 256-bit and SHA-256 will be implemented in the Java programming language. The Secure Electronic Health Record Information (SEHR) application design is intended to combine the given services: confidentiality, integrity, authentication, and non-repudiation. Cryptography is used to ensure that file records and electronic documents with detailed information on the medical past, present and future forecasts are given only to the intended patients. The documents will be encrypted using an encryption algorithm based on a NIST standard. In the application, there are two schemes, namely the protection and verification schemes. This research uses black-box testing and white-box testing to test the software input, output, and code without testing the process and design that occurs in the system. We demonstrate the implementation of cryptography in SEHR; the implementation of encryption and digital signatures in this research can prevent record theft.
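    The hash-then-sign scheme combining RSA with SHA-256 can be sketched with textbook RSA and deliberately tiny primes. This is illustration only: the application described uses 2048-bit keys and a standards-compliant, padded signature scheme, both of which this sketch omits:

```python
import hashlib

# Toy RSA key pair. The primes are tiny so the arithmetic stays visible;
# real signatures need 2048-bit moduli and padding (e.g. PSS).
p, q = 61, 53
n = p * q                       # public modulus
phi = (p - 1) * (q - 1)
e = 17                          # public exponent
d = pow(e, -1, phi)             # private exponent (Python 3.8+ modular inverse)

def sign(message: bytes) -> int:
    """SHA-256 digest of the message, 'encrypted' with the private key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Recompute the digest and compare it with the public decryption."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

record = b"patient 123: diagnosis ..."
sig = sign(record)
print(verify(record, sig))      # True: signature matches the record
```

    Any modification of the record changes its SHA-256 digest, so verification fails with overwhelming probability; that is the integrity and non-repudiation service the abstract describes, with AES handling confidentiality separately.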

  15. Negative branes, supergroups and the signature of spacetime

    Science.gov (United States)

    Dijkgraaf, Robbert; Heidenreich, Ben; Jefferson, Patrick; Vafa, Cumrun

    2018-02-01

    We study the realization of supergroup gauge theories using negative branes in string theory. We show that negative branes are intimately connected with the possibility of timelike compactification and exotic spacetime signatures previously studied by Hull. Isolated negative branes dynamically generate a change in spacetime signature near their worldvolumes, and are related by string dualities to a smooth M-theory geometry with closed timelike curves. Using negative D3-branes, we show that SU(0|N) supergroup theories are holographically dual to an exotic variant of type IIB string theory on dS_{3,2} × S̄^5, for which the emergent dimensions are timelike. Using branes, mirror symmetry and Nekrasov's instanton calculus, all of which agree, we derive the Seiberg-Witten curve for N = 2 SU(N|M) gauge theories. Together with our exploration of holography and string dualities for negative branes, this suggests that supergroup gauge theories may be non-perturbatively well-defined objects, though several puzzles remain.

  16. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees transparent access to formal verification technology for modelers of genetic regulatory networks.
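    The model-checking core behind such a verification module can be illustrated on a toy Boolean regulatory network. The two-gene network and the property below are invented for this sketch; GNA's qualitative models and the NUSMV/CADP checkers are far more general:

```python
# Toy Boolean regulatory network with two hypothetical genes:
# gene a is repressed by b, gene b is activated by a.
rules = {
    "a": lambda s: not s["b"],
    "b": lambda s: s["a"],
}

def step(state):
    """Synchronous update: every gene is re-evaluated on the previous state."""
    s = dict(state)
    return tuple(sorted((g, bool(rule(s))) for g, rule in rules.items()))

def reachable(initial):
    """Model-checking core: exhaustive exploration of the finite state space."""
    seen, frontier = set(), [tuple(sorted(initial.items()))]
    while frontier:
        state = frontier.pop()
        if state in seen:
            continue
        seen.add(state)
        frontier.append(step(state))
    return seen

# Temporal property EF(a and b): can both genes ever be on at the same time?
states = reachable({"a": True, "b": False})
has_both = any(dict(s)["a"] and dict(s)["b"] for s in states)
print(has_both, len(states))    # the property holds; 4 states are reachable
```

    A model checker automates exactly this kind of exhaustive state-space question, but for rich temporal logic formulas over much larger qualitative models; the web service described above hides this machinery behind a unified interface.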

  17. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
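    The manufactured-solution approach to code verification recommended above can be demonstrated on a minimal example: apply a discrete operator to a known analytical solution and confirm the expected order of accuracy under grid refinement. The operator and solution below are illustrative choices, not a benchmark from the paper:

```python
import math

def max_error(n):
    """Max nodal error of the centred second difference applied to the
    manufactured solution u(x) = sin(x) on [0, pi], whose exact second
    derivative is -sin(x)."""
    h = math.pi / n
    worst = 0.0
    for i in range(1, n):
        x = i * h
        approx = (math.sin(x - h) - 2.0 * math.sin(x) + math.sin(x + h)) / h**2
        worst = max(worst, abs(approx - (-math.sin(x))))
    return worst

# Code verification: halving h should divide the error by ~4 for a
# second-order discretization.
e_coarse, e_fine = max_error(32), max_error(64)
print(round(e_coarse / e_fine, 1))   # observed order ~2 (error ratio ~4)
```

    An observed convergence rate that matches the scheme's formal order is strong evidence that the coding is correct; a shortfall flags a bug or an inconsistency, which is precisely the role of code verification benchmarks as distinguished from validation.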

  18. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, cases of use and implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
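    The Weibull treatment of brittle failure mentioned above can be sketched in its two-parameter form with classical volume scaling. The parameter values are illustrative, not taken from the guideline:

```python
import math

def weibull_failure_prob(stress, sigma0, m):
    """Two-parameter Weibull probability of failure of a brittle part:
    P_f = 1 - exp(-(stress / sigma0)^m), where m is the Weibull modulus
    (scatter) and sigma0 the characteristic strength."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def transfer_sigma0(sigma0_specimen, volume_ratio, m):
    """Classical Weibull size effect: a part with volume_ratio times the
    specimen volume has its characteristic strength reduced by the factor
    volume_ratio^(-1/m), since a larger volume samples more flaws."""
    return sigma0_specimen * volume_ratio ** (-1.0 / m)

m = 10.0          # Weibull modulus (illustrative)
sigma0 = 200.0    # characteristic strength of test specimens, MPa (illustrative)
sigma0_part = transfer_sigma0(sigma0, volume_ratio=8.0, m=m)
print(round(weibull_failure_prob(sigma0, sigma0, m), 3))   # 0.632 = 1 - 1/e
print(round(sigma0_part, 1))   # the full-scale part is weaker than the specimen
```

    This elementary-data-to-full-scale transfer is the statistical step the guideline formalizes: specimen test statistics fix m and sigma0, and the size effect then predicts the allowable stress for the flight structure.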

  19. Is flow verification necessary?

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
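    The accountancy statistics discussed here are built from simple balance quantities. A minimal sketch follows; the quantities and numbers are invented for illustration and the paper's comprehensive statistic sets are algebraic constructions well beyond this:

```python
def muf(beginning_inventory, receipts, shipments, ending_inventory):
    """Material Unaccounted For, the basic material-balance statistic:
    MUF = BI + R - S - EI. Nonzero values flag possible diversion or
    measurement/falsification effects within a material balance area."""
    return beginning_inventory + receipts - shipments - ending_inventory

def shipper_receiver_difference(operator_receipts, inspector_receipts):
    """A flow-verification statistic: the total discrepancy between operator
    declarations and inspector measurements of the same transfers."""
    return sum(o - i for o, i in zip(operator_receipts, inspector_receipts))

# One material balance period (quantities in kg, invented):
balance = muf(100.0, 40.0, 35.0, 104.2)
print(round(balance, 3))   # 0.8 kg unaccounted for
print(round(shipper_receiver_difference([10.0, 15.0], [10.1, 14.8]), 3))
```

    The paper's argument is that a suitably chosen set of such accountancy statistics can be comprehensive, i.e. able to detect any diversion under the stated assumptions, without the expensive shipper/receiver comparisons that physical flow verification requires.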

  20. 76 FR 30542 - Adult Signature Services

    Science.gov (United States)

    2011-05-26

    ... POSTAL SERVICE 39 CFR Part 111 Adult Signature Services AGENCY: Postal Service™. ACTION: Final..., Domestic Mail Manual (DMM®) 503.8, to add a new extra service called Adult Signature. This new service has two available options: Adult Signature Required and Adult Signature Restricted Delivery. DATES...

  1. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  2. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  3. Lesson 6: Signature Validation

    Science.gov (United States)

    Checklist items 13 through 17 are grouped under the Signature Validation Process, and represent CROMERR requirements that the system must satisfy as part of ensuring that electronic signatures it receives are valid.

  4. 21 CFR 11.50 - Signature manifestations.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Signature manifestations. 11.50 Section 11.50 Food... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.50 Signature manifestations. (a) Signed electronic...: (1) The printed name of the signer; (2) The date and time when the signature was executed; and (3...

  5. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A Vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and their changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core that traditional physics testing programs require. It also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  6. Toward an Improved Representation of Middle Atmospheric Dynamics Thanks to the ARISE Project

    Science.gov (United States)

    Blanc, E.; Ceranna, L.; Hauchecorne, A.; Charlton-Perez, A.; Marchetti, E.; Evers, L. G.; Kvaerna, T.; Lastovicka, J.; Eliasson, L.; Crosby, N. B.; Blanc-Benon, P.; Le Pichon, A.; Brachet, N.; Pilger, C.; Keckhut, P.; Assink, J. D.; Smets, P. S. M.; Lee, C. F.; Kero, J.; Sindelarova, T.; Kämpfer, N.; Rüfenacht, R.; Farges, T.; Millet, C.; Näsholm, S. P.; Gibbons, S. J.; Espy, P. J.; Hibbins, R. E.; Heinrich, P.; Ripepe, M.; Khaykin, S.; Mze, N.; Chum, J.

    2018-03-01

    This paper reviews recent progress toward understanding the dynamics of the middle atmosphere in the framework of the Atmospheric Dynamics Research InfraStructure in Europe (ARISE) initiative. The middle atmosphere, integrating the stratosphere and mesosphere, is a crucial region which influences tropospheric weather and climate. Enhancing the understanding of middle atmosphere dynamics requires improved measurement of the propagation and breaking of planetary and gravity waves originating in the lowest levels of the atmosphere. Inter-comparison studies have shown large discrepancies between observations and models, especially during unresolved disturbances such as sudden stratospheric warmings, for which model accuracy is poorer due to a lack of observational constraints. Correctly predicting the variability of the middle atmosphere can lead to improvements in tropospheric weather forecasts on timescales of weeks to seasons. The ARISE project integrates different station networks providing observations from the ground to the lower thermosphere, including the infrasound system developed for Comprehensive Nuclear-Test-Ban Treaty verification, the Lidar Network for the Detection of Atmospheric Composition Change, complementary meteor radars, wind radiometers, ionospheric sounders and satellites. This paper presents several examples which show how multi-instrument observations can provide a better description of the vertical dynamics structure of the middle atmosphere, especially during large disturbances such as gravity wave activity and stratospheric warming events. The paper then demonstrates the value of ARISE data for data assimilation in weather forecasting and re-analyses, for determining the evolution of middle atmosphere dynamics under climate change, and for monitoring extreme events that have an atmospheric signature, such as thunderstorms or volcanic eruptions.

  7. Design of an Active Multispectral SWIR Camera System for Skin Detection and Face Verification

    Directory of Open Access Journals (Sweden)

    Holger Steiner

    2016-01-01

    Full Text Available Biometric face recognition is becoming more frequently used in different application scenarios. However, spoofing attacks with facial disguises are still a serious problem for state-of-the-art face recognition algorithms. This work proposes an approach to face verification based on spectral signatures of material surfaces in the short-wave infrared (SWIR) range. They allow distinguishing authentic human skin reliably from other materials, independent of the skin type. We present the design of an active SWIR imaging system that acquires four-band multispectral image stacks in real time. The system uses pulsed small-band illumination, which allows for fast image acquisition and high spectral resolution and renders it largely independent of ambient light. After extracting the spectral signatures from the acquired images, detected faces can be verified or rejected by classifying the material as “skin” or “no-skin.” The approach is extensively evaluated with respect to both acquisition and classification performance. In addition, we present a database containing RGB and multispectral SWIR face images, as well as spectrometer measurements of a variety of subjects, which is used to evaluate our approach and will be made available to the research community by the time this work is published.
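
The final classification step can be sketched as a nearest-centroid decision on a four-band spectral signature. All numeric values below are invented for illustration; the actual system's bands, calibration and classifier are described in the paper:

```python
import math

# Hypothetical 4-band SWIR reflectance centroids (values invented):
SKIN_CENTROID = (0.45, 0.30, 0.12, 0.08)  # skin reflectance drops sharply at longer SWIR bands
FAKE_CENTROID = (0.40, 0.38, 0.35, 0.33)  # disguise material: flat spectral response

def classify(signature, threshold=0.15):
    """Label a 4-band signature as skin / no-skin via nearest centroid."""
    d_skin = math.dist(signature, SKIN_CENTROID)
    d_fake = math.dist(signature, FAKE_CENTROID)
    if min(d_skin, d_fake) > threshold:
        return "unknown"   # too far from both references
    return "skin" if d_skin < d_fake else "no-skin"

print(classify((0.47, 0.29, 0.13, 0.09)))  # near the skin centroid -> "skin"
print(classify((0.41, 0.37, 0.34, 0.32)))  # flat spectrum -> "no-skin"
```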

  8. 21 CFR 11.70 - Signature/record linking.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Signature/record linking. 11.70 Section 11.70 Food... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.70 Signature/record linking. Electronic signatures and handwritten signatures executed to electronic records shall be linked to their respective...

  9. Expressiveness considerations of XML signatures

    DEFF Research Database (Denmark)

    Jensen, Meiko; Meyer, Christopher

    2011-01-01

    XML Signatures are used to protect XML-based Web Service communication against a broad range of attacks related to man-in-the-middle scenarios. However, due to the complexity of the Web Services specification landscape, the task of applying XML Signatures in a robust and reliable manner becomes more and more challenging. In this paper, we investigate this issue, describing how an attacker can still interfere with Web Services communication even in the presence of XML Signatures. Additionally, we discuss the interrelation of XML Signatures and XML Encryption, focussing on their security...

  10. Digital Signature Schemes with Complementary Functionality and Applications

    OpenAIRE

    S. N. Kyazhin

    2012-01-01

    Digital signature schemes with additional functionality (an undeniable signature, a designated-confirmer signature, a blind signature, a group signature, and a signature with additional protection) and examples of their application are considered. These schemes are more practical, effective and useful than ordinary digital signature schemes.

  11. 17 CFR 12.12 - Signature.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Signature. 12.12 Section 12.12... General Information and Preliminary Consideration of Pleadings § 12.12 Signature. (a) By whom. All... document on behalf of another person. (b) Effect. The signature on any document of any person acting either...

  12. High-speed high-security signatures

    NARCIS (Netherlands)

    Bernstein, D.J.; Duif, N.; Lange, T.; Schwabe, P.; Yang, B.Y.

    2011-01-01

    This paper shows that a $390 mass-market quad-core 2.4GHz Intel Westmere (Xeon E5620) CPU can create 108000 signatures per second and verify 71000 signatures per second on an elliptic curve at a 2^128 security level. Public keys are 32 bytes, and signatures are 64 bytes. These performance figures
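
The scheme benchmarked here (Ed25519) cannot be reproduced in a few lines, but the Schnorr-style sign/verify structure behind such high-speed signatures can be sketched with deliberately tiny, insecure parameters. Everything below is a toy: the real scheme works on a ~2^252-order elliptic curve group:

```python
import hashlib
import secrets

# Insecure toy parameters: G = 2 generates a subgroup of prime order Q = 11 in Z_23*.
P, Q, G = 23, 11, 2

def H(r, msg):
    """Full-width hash of (commitment, message) as an integer challenge."""
    return int(hashlib.sha256(f"{r}|{msg}".encode()).hexdigest(), 16)

def keygen():
    x = secrets.randbelow(Q - 1) + 1        # secret key
    return x, pow(G, x, P)                  # (secret, public)

def sign(x, msg):
    k = secrets.randbelow(Q - 1) + 1        # per-signature nonce
    r = pow(G, k, P)                        # commitment
    e = H(r, msg)                           # challenge
    s = (k + x * e) % Q                     # response
    return e, s

def verify(y, msg, sig):
    e, s = sig
    # g^s * y^(-e) = g^(k + x*e) * g^(-x*e) = g^k, recovering the commitment.
    r = (pow(G, s, P) * pow(y, -e, P)) % P
    return H(r, msg) == e

x, y = keygen()
sig = sign(x, "hello")
assert verify(y, "hello", sig)
assert not verify(y, "tampered", sig)
```

Note the per-signature cost is a couple of modular exponentiations, which is why a fast curve implementation translates directly into the signing/verification rates reported above.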

  13. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Fingerprint changes associated with hand dermatitis are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression, and validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
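
The derived model reduces to a simple decision rule. A sketch (the function name and risk labels are my paraphrase of the abstract, not the paper's wording):

```python
def fingerprint_verification_risk(dystrophy_pct, long_horizontal, long_vertical):
    """Risk stratification from the abstract's criteria: one major criterion
    (dystrophy area >= 25%) and two minor criteria (long horizontal lines,
    long vertical lines)."""
    if dystrophy_pct >= 25:
        return "almost always fails"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:
        return "high risk of failure"
    if minors == 1:
        return "low risk of failure"
    return "almost always passes"

print(fingerprint_verification_risk(30, False, False))  # major criterion met
print(fingerprint_verification_risk(10, True, True))    # both minor criteria
print(fingerprint_verification_risk(5, False, False))   # no criteria met
```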

  14. New Approaches and New Technologies for the Verification of Nuclear Disarmament

    International Nuclear Information System (INIS)

    Keir, David

    2013-01-01

    ESARDA’s New Approaches/Novel Technologies Working Group has recently begun to take a great interest in technology for use in arms control verification, in parallel with a focus on Nuclear Safeguards technology. A topic-based meeting of members of the NA/NT Subgroup was hosted at the Joint Research Centre (JRC), ITU-Nuclear Security Unit in Ispra (Italy), to further explore the technical issues and opportunities presented by the need for new approaches and technologies in a future verified nuclear weapons dismantlement regime. Nuclear warheads must contain radioactive material and, by their nature, gamma rays and neutrons are likely to penetrate to the outside of the warhead casing and even metal containers. Therefore radiation signatures should be detectable by appropriate pieces of equipment. For this reason, researchers in the field of technical verification of nuclear warhead dismantlement have studied and developed technologies for Non-Destructive Assay (NDA). This paper presents a generic dismantlement pathway for verified nuclear warhead dismantlement, based on the scenario employed by the UK-Norway initiative for their exercise in 2008/9. Using this as a framework, the types of measurement challenge likely to be presented to a verifying inspector are discussed. The problem of intrusiveness of measurements, in relation to the issue of proliferative release of classified information about the warhead attributes, is discussed, and the concept of ‘information barriers’ is introduced as a possible solution to this issue. A list of candidate technologies for use in verification activities, with or without information barriers, is then presented and, since most of these are new or novel approaches to the issue, an already-established system for classifying them – in terms of state of development and complexity of use in this context – is proposed. Finally, the concept of capturing this information as a library of ‘data sheets’, designed for periodic review as

  15. The verification basis of the ESPROSE.m code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Freeman, K.; Chen, X. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    An overall verification approach for the ESPROSE.m code is presented and implemented. The approach consists of a stepwise testing procedure from wave dynamics aspects to explosion coupling at the local level, and culminates with the consideration of propagating explosive events. Each step in turn consists of an array of analytical and experimental tests. The results indicate that, given the premixture composition, the prediction of energetics of large scale explosions in multidimensional geometries is within reach. The main need identified is for constitutive laws for microinteractions with reactor materials; however, reasonably conservative assessments are presently possible. (author)

  16. H–J–B Equations of Optimal Consumption-Investment and Verification Theorems

    Energy Technology Data Exchange (ETDEWEB)

    Nagai, Hideo, E-mail: nagaih@kansai-u.ac.jp [Kansai University, Department of Mathematics, Faculty of Engineering Science (Japan)

    2015-04-15

    We consider a consumption-investment problem on infinite time horizon maximizing discounted expected HARA utility for a general incomplete market model. Based on dynamic programming approach we derive the relevant H–J–B equation and study the existence and uniqueness of the solution to the nonlinear partial differential equation. By using the smooth solution we construct the optimal consumption rate and portfolio strategy and then prove the verification theorems under certain general settings.
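
For orientation, the classical complete-market (Merton) special case of such an H–J–B equation can be written down explicitly; the paper treats a general incomplete market, so this is only the simplest instance. Here x is wealth, c the consumption rate, π the amount invested in the risky asset, r the risk-free rate, μ and σ the risky asset's drift and volatility, β the discount rate, and U a CRRA utility (a HARA member):

```latex
\beta V(x) \;=\; \sup_{c \ge 0,\; \pi \in \mathbb{R}}
\Big[\, U(c) \;+\; \big( r x + \pi(\mu - r) - c \big)\, V'(x)
      \;+\; \tfrac{1}{2}\, \pi^2 \sigma^2\, V''(x) \Big],
\qquad U(c) = \frac{c^{1-\gamma}}{1-\gamma},\quad \gamma > 0,\ \gamma \neq 1.
```

The verification theorems referenced in the abstract then show that a sufficiently smooth solution V of this equation, together with the maximizing (c, π), is indeed the value function and optimal strategy.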

  17. H–J–B Equations of Optimal Consumption-Investment and Verification Theorems

    International Nuclear Information System (INIS)

    Nagai, Hideo

    2015-01-01

    We consider a consumption-investment problem on infinite time horizon maximizing discounted expected HARA utility for a general incomplete market model. Based on dynamic programming approach we derive the relevant H–J–B equation and study the existence and uniqueness of the solution to the nonlinear partial differential equation. By using the smooth solution we construct the optimal consumption rate and portfolio strategy and then prove the verification theorems under certain general settings

  18. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the
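
One of the recommended code-verification ingredients, the method of manufactured solutions, can be sketched in a few lines: pick an exact solution, derive the matching source term, and confirm the discrete solver reproduces the expected convergence order. The 1-D Poisson example below is illustrative and not taken from the paper:

```python
import math

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, by central differences, with the
    manufactured solution u(x) = sin(pi x), hence f(x) = pi^2 sin(pi x).
    Returns the max-norm error against the manufactured solution."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]
    # Thomas algorithm for the tridiagonal system (-1, 2, -1) u = h^2 f
    a, b, c = -1.0, 2.0, -1.0
    cp = [0.0] * (n + 1)
    dp = [0.0] * (n + 1)
    for i in range(1, n):
        m = b - (a * cp[i - 1] if i > 1 else 0.0)
        cp[i] = c / m
        dp[i] = (h * h * f[i] - (a * dp[i - 1] if i > 1 else 0.0)) / m
    u = [0.0] * (n + 1)  # boundary values stay zero
    for i in range(n - 1, 0, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))

# Halving h should cut the error by ~4x for a second-order scheme:
e1, e2 = solve_poisson(32), solve_poisson(64)
order = math.log(e1 / e2, 2)
print(f"observed order ~ {order:.2f}")
```

The observed order matching the scheme's formal order (here, 2) is the verification evidence; a mismatch indicates a coding or discretization error.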

  19. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  20. Dynamic simulation of LMFBR systems

    International Nuclear Information System (INIS)

    Agrawal, A.K.; Khatib-Rahbar, M.

    1980-01-01

    This review article focuses on the dynamic analysis of liquid-metal-cooled fast breeder reactor systems in the context of protected transients. Following a brief discussion on various design and simulation approaches, a critical review of various models for in-reactor components, intermediate heat exchangers, heat transport systems and the steam generating system is presented. A brief discussion on choice of fuels as well as core and blanket system designs is also included. Numerical considerations for obtaining system-wide steady-state and transient solutions are discussed, and examples of various system transients are presented. Another area of major interest is verification of phenomenological models. Various steps involved in the code and model verification are briefly outlined. The review concludes by posing some further areas of interest in fast reactor dynamics and safety. (author)

  1. The effects of extrinsic motivation on signature authorship opinions in forensic signature blind trials.

    Science.gov (United States)

    Dewhurst, Tahnee N; Found, Bryan; Ballantyne, Kaye N; Rogers, Doug

    2014-03-01

    Expertise studies in forensic handwriting examination involve comparisons of Forensic Handwriting Examiners' (FHEs) opinions with lay-persons on blind tests. All published studies of this type have reported real and demonstrable skill differences between the specialist and lay groups. However, critics have proposed that any difference shown may be indicative of a lack of motivation on the part of lay participants, rather than a real difference in skill. It has been suggested that qualified FHEs would be inherently more motivated to succeed in blinded validation trials, as their professional reputations could be at risk, should they perform poorly on the task provided. Furthermore, critics suggest that lay-persons would be unlikely to be highly motivated to succeed, as they would have no fear of negative consequences should they perform badly. In an effort to investigate this concern, a blind signature trial was designed and administered to forty lay-persons. Participants were required to compare known (exemplar) signatures of an individual to questioned signatures and asked to express an opinion regarding whether the writer of the known signatures wrote each of the questioned signatures. The questioned signatures comprised a mixture of genuine, disguised and simulated signatures. The forty participants were divided into two separate groupings. Group 'A' were requested to complete the trial as directed and were advised that for each correct answer they would be financially rewarded, for each incorrect answer they would be financially penalized, and for each inconclusive opinion they would receive neither penalty nor reward. Group 'B' was requested to complete the trial as directed, with no mention of financial recompense or penalty. The results of this study do not support the proposition that motivation rather than skill difference is the source of the statistical difference in opinions between individuals' results in blinded signature proficiency trials. Crown

  2. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
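
The dose/distance-to-agreement criterion recommended above can be sketched as a simplified 1-D gamma analysis. This is only an illustration (clinical gamma analysis is 2-D/3-D with interpolation, and the dose profiles below are invented, normalized to a maximum of 1.0):

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm=1.0, dta_mm=3.0, dd_pct=3.0):
    """Simplified 1-D gamma analysis: a measured point passes if some
    reference point is jointly within 3 mm / 3% (gamma <= 1).
    Doses are assumed normalized to a maximum of 1.0."""
    passed = 0
    for i, dm in enumerate(meas):
        gamma = min(
            math.sqrt(((i - j) * spacing_mm / dta_mm) ** 2
                      + (100.0 * (dm - dr) / dd_pct) ** 2)
            for j, dr in enumerate(ref)
        )
        passed += gamma <= 1.0
    return 100.0 * passed / len(meas)

# Invented penumbra-like edge profile, 1 mm spacing:
ref = [1 / (1 + math.exp(-(i - 10))) for i in range(21)]
near = [ref[max(i - 2, 0)] for i in range(21)]  # 2 mm misalignment
far = [ref[max(i - 5, 0)] for i in range(21)]   # 5 mm misalignment

print(gamma_pass_rate(ref, near))  # within 3 mm/3% everywhere: 100.0
print(gamma_pass_rate(ref, far))   # fails in the steep-gradient region
```

A pure dose-difference display would flag the whole penumbra of the 2 mm case; the combined dose/distance criterion correctly accepts it while still rejecting the larger misalignment.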

  3. On wave-packet dynamics in a decaying quadratic potential

    DEFF Research Database (Denmark)

    Møller, Klaus Braagaard; Henriksen, Niels Engholm

    1997-01-01

    We consider the time-dependent Schrodinger equation for a quadratic potential with an exponentially decaying force constant. General analytical solutions are presented and we highlight, in particular, the signatures of classical mechanics in the wave packet dynamics.
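
The classical counterpart of this system can be sketched numerically: a particle in V(x, t) = (1/2) k0 exp(-t/tau) x^2, whose motion crosses over from oscillation to free drift as the force constant decays. Parameter values are invented for illustration:

```python
import math

K0, TAU, M = 1.0, 5.0, 1.0  # illustrative force constant, decay time, mass

def accel(x, t):
    """Acceleration for m x'' = -k0 exp(-t/tau) x."""
    return -K0 * math.exp(-t / TAU) * x / M

def trajectory(x0, v0, dt=0.001, t_end=50.0):
    """Velocity-Verlet integration of the decaying oscillator."""
    x, v, t = x0, v0, 0.0
    a = accel(x, t)
    while t < t_end:
        x += v * dt + 0.5 * a * dt * dt
        a_new = accel(x, t + dt)
        v += 0.5 * (a + a_new) * dt
        a, t = a_new, t + dt
    return x, v

x, v = trajectory(1.0, 0.0)
# Since dE/dt = (1/2) k'(t) x^2 <= 0, the energy only decreases; at late
# times the velocity freezes and the particle drifts freely -- the classical
# signature mirrored in the spreading quantum wave packet.
print(x, v)
```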

  4. Exotic signatures from supersymmetry

    International Nuclear Information System (INIS)

    Hall, L.J.

    1989-08-01

    Minor changes to the standard supersymmetric model, such as soft flavor violation and R parity violation, cause large changes in the signatures. The origin of these changes and the resulting signatures are discussed. 15 refs., 7 figs., 2 tabs

  5. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  6. Research on a New Signature Scheme on Blockchain

    Directory of Open Access Journals (Sweden)

    Chao Yuan

    2017-01-01

    Full Text Available With the rise of Bitcoin, blockchain, the core technology of Bitcoin, has received increasing attention. Privacy preservation and performance on blockchain are two research points in academia and business, but there are still unresolved issues in both respects. An aggregate signature scheme is a digital signature scheme that supports signing many different messages generated by many different users; using aggregation, multiple signatures can be compressed into a single signature of reduced size. In this paper, a new signature scheme for transactions on blockchain, based on aggregate signatures, is proposed. It is worth noting that the elliptic curve discrete logarithm problem and bilinear maps play major roles in our signature scheme, and its security properties are proved. In our scheme, transaction amounts are hidden, especially in transactions which contain multiple inputs and outputs. Additionally, the size of the signature on a transaction is constant regardless of the number of inputs and outputs the transaction contains, which improves signature performance. Finally, we give an application scenario for our signature scheme which aims to support transactions of big data on blockchain.
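
The aggregation property can be illustrated with a deliberately insecure toy over a small multiplicative group. This does not reproduce the paper's pairing-based construction or its security; it only shows how many per-input signatures compress into one constant-size value that verifies against all public keys at once:

```python
import hashlib

# Insecure toy group: G = 4 generates a subgroup of prime order Q = 1019 in Z_2039*.
P, Q, G = 2039, 1019, 4

def H(msg):
    return int(hashlib.sha256(msg.encode()).hexdigest(), 16)

def keygen(seed):
    x = H(f"sk-{seed}") % Q            # toy deterministic secret key
    return x, pow(G, x, P)             # (secret, public)

def sign(x, msg):
    return (x * H(msg)) % Q            # toy linear signature (forgeable!)

def aggregate(sigs):
    return sum(sigs) % Q               # one group element, however many inputs

def verify_aggregate(pks, msgs, agg):
    # G^agg must equal the product of y_i^H(m_i), since
    # agg = sum(x_i * H(m_i)) and y_i = G^x_i.
    expected = 1
    for y, m in zip(pks, msgs):
        expected = (expected * pow(y, H(m), P)) % P
    return pow(G, agg, P) == expected

keys = [keygen(i) for i in range(3)]
msgs = ["tx-1", "tx-2", "tx-3"]
agg = aggregate(sign(x, m) for (x, _), m in zip(keys, msgs))
assert verify_aggregate([y for _, y in keys], msgs, agg)
```

The aggregate stays a single element no matter how many inputs and outputs the transaction carries, which is the size saving the abstract refers to.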

  7. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA's experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear-weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development, before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and has to provide them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  8. Dynamic signatures of driven vortex motion.

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, G. W.; Kwok, W. K.; Lopez, D.; Olsson, R. J.; Paulius, L. M.; Petrean, A. M.; Safar, H.

    1999-09-16

    We probe the dynamic nature of driven vortex motion in superconductors with a new type of transport experiment. An inhomogeneous Lorentz driving force is applied to the sample, inducing vortex velocity gradients that distinguish the hydrodynamic motion of the vortex liquid from the elastic and plastic motion of the vortex solid. We observe elastic depinning of the vortex lattice at the critical current, and shear-induced plastic slip of the lattice at high Lorentz force gradients.

  9. On reliable discovery of molecular signatures

    Directory of Open Access Journals (Sweden)

    Björkegren Johan

    2009-01-01

    Full Text Available Abstract Background Molecular signatures are sets of genes, proteins, genetic variants or other variables that can be used as markers for a particular phenotype. Reliable signature discovery methods could yield valuable insight into cell biology and mechanisms of human disease. However, it is currently not clear how to control error rates such as the false discovery rate (FDR in signature discovery. Moreover, signatures for cancer gene expression have been shown to be unstable, that is, difficult to replicate in independent studies, casting doubts on their reliability. Results We demonstrate that with modern prediction methods, signatures that yield accurate predictions may still have a high FDR. Further, we show that even signatures with low FDR may fail to replicate in independent studies due to limited statistical power. Thus, neither stability nor predictive accuracy are relevant when FDR control is the primary goal. We therefore develop a general statistical hypothesis testing framework that for the first time provides FDR control for signature discovery. Our method is demonstrated to be correct in simulation studies. When applied to five cancer data sets, the method was able to discover molecular signatures with 5% FDR in three cases, while two data sets yielded no significant findings. Conclusion Our approach enables reliable discovery of molecular signatures from genome-wide data with current sample sizes. The statistical framework developed herein is potentially applicable to a wide range of prediction problems in bioinformatics.
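
The paper develops its own FDR-controlling framework; as a reference point, the standard Benjamini-Hochberg step-up procedure against which FDR control is usually benchmarked can be sketched as follows (p-values invented for illustration):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Indices of hypotheses rejected at FDR level alpha (BH step-up):
    reject the k smallest p-values, where k is the largest rank with
    p_(k) <= k * alpha / m."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])

# Toy gene-level p-values: three strong candidates among ten genes.
pvals = [0.001, 0.008, 0.012, 0.20, 0.35, 0.41, 0.55, 0.62, 0.74, 0.90]
print(benjamini_hochberg(pvals))  # -> [0, 1, 2]
```

With small samples the same machinery can simply return an empty signature, which matches the abstract's observation that two of the five cancer data sets yielded no significant findings.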

  10. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.
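
The external-verification pattern described here, recomputing a code result independently and comparing within a tolerance, can be sketched as follows. The decay formula, function names, and tolerance are illustrative stand-ins, not RESRAD-BUILD's actual pathway equations:

```python
import math

def code_under_test_activity(a0, half_life_y, t_y):
    # Stand-in for an internal code result (hypothetical): A(t) = A0 * exp(-lambda * t)
    return a0 * math.exp(-math.log(2.0) / half_life_y * t_y)

def hand_calc_activity(a0, half_life_y, t_y):
    # Independent "spreadsheet" formulation of the same law: A(t) = A0 * 0.5**(t / T_half)
    return a0 * 0.5 ** (t_y / half_life_y)

# Tritium example: T1/2 = 12.32 y, 5 years of decay
a_code = code_under_test_activity(1.0, 12.32, 5.0)
a_hand = hand_calc_activity(1.0, 12.32, 5.0)
rel_diff = abs(a_code - a_hand) / a_hand
assert rel_diff < 1e-12, "verification failed: code and hand calculation disagree"
print(a_code)
```

The same compare-within-tolerance loop, repeated per pathway and radionuclide, is the essence of the step-by-step verification the abstract describes.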

  11. Persistence of social signatures in human communication.

    Science.gov (United States)

    Saramäki, Jari; Leicht, E A; López, Eduardo; Roberts, Sam G B; Reed-Tsochas, Felix; Dunbar, Robin I M

    2014-01-21

    The social network maintained by a focal individual, or ego, is intrinsically dynamic and typically exhibits some turnover in membership over time as personal circumstances change. However, the consequences of such changes on the distribution of an ego's network ties are not well understood. Here we use a unique 18-mo dataset that combines mobile phone calls and survey data to track changes in the ego networks and communication patterns of students making the transition from school to university or work. Our analysis reveals that individuals display a distinctive and robust social signature, captured by how interactions are distributed across different alters. Notably, for a given ego, these social signatures tend to persist over time, despite considerable turnover in the identity of alters in the ego network. Thus, as new network members are added, some old network members either are replaced or receive fewer calls, preserving the overall distribution of calls across network members. This is likely to reflect the consequences of finite resources such as the time available for communication, the cognitive and emotional effort required to sustain close relationships, and the ability to make emotional investments.
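
The notion of a social signature, the rank-ordered distribution of an ego's calls across alters, can be made concrete with a small sketch; the call records and names below are hypothetical:

```python
from collections import Counter

def social_signature(calls, top_n=5):
    """Fraction of calls going to each alter, ordered by rank (most-called first)."""
    counts = Counter(calls)
    total = sum(counts.values())
    ranked = sorted(counts.values(), reverse=True)[:top_n]
    return [c / total for c in ranked]

# Hypothetical call records for one ego at two times: the alters turn over
# almost completely, yet the rank-ordered signature is preserved.
t1 = ["anna"] * 10 + ["ben"] * 5 + ["carl"] * 3 + ["dee"] * 2
t2 = ["ben"] * 10 + ["eva"] * 5 + ["finn"] * 3 + ["gus"] * 2
print(social_signature(t1))  # [0.5, 0.25, 0.15, 0.1]
print(social_signature(t2))  # same shape despite membership turnover
```

Comparing an ego's own signature across time windows against signatures of other egos is how the persistence claim in the abstract can be quantified.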

  12. Dynamic Behavior of a SCARA Robot by using N-E Method for a Straight Line and Simulation of Motion by using Solidworks and Verification by Matlab/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2014-05-01

    Full Text Available SCARA (Selective Compliant Assembly Robot Arm) robots of serial architecture are widely used in assembly and "pick-place" operations; it has been shown that the use of robots improves assembly accuracy and saves assembly time and cost as well. The most important condition for the choice of this kind of robot is its dynamic behavior for a given path, yet no closed-form solution for the dynamics of this important robot has been reported. This paper presents a study of the kinematics (forward and inverse) using D-H notation and of the dynamics of the SCARA robot using the N-E method. A computer code is developed for trajectory generation using inverse kinematics, and it calculates the variations of the link torques for a straight-line (rest-to-rest) path between two positions in a "pick-place" operation. The SCARA robot is constructed to achieve the "pick-place" operation using SolidWorks software, with verification by Matlab/Simulink. The results of the simulations are discussed, and an agreement between the two software packages is obtained herein
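
The forward/inverse kinematics round trip that trajectory generation relies on can be illustrated for the two revolute planar joints of a SCARA arm. Link lengths and joint angles below are arbitrary examples, and the full D-H/N-E treatment is beyond this sketch:

```python
import math

def fk(l1, l2, th1, th2):
    """Planar forward kinematics of the first two SCARA revolute joints."""
    x = l1 * math.cos(th1) + l2 * math.cos(th1 + th2)
    y = l1 * math.sin(th1) + l2 * math.sin(th1 + th2)
    return x, y

def ik(l1, l2, x, y):
    """One inverse-kinematics branch (elbow angle taken in [0, pi])."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    th2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp against rounding
    th1 = math.atan2(y, x) - math.atan2(l2 * math.sin(th2), l1 + l2 * math.cos(th2))
    return th1, th2

# Round trip: inverse kinematics of a forward-kinematics pose recovers the pose
x, y = fk(0.4, 0.3, 0.5, 0.8)
th1, th2 = ik(0.4, 0.3, x, y)
print(abs(th1 - 0.5) < 1e-9 and abs(th2 - 0.8) < 1e-9)  # True
```

Sampling such IK solutions along a straight line in Cartesian space is exactly the kind of rest-to-rest path generation the paper's code performs before the N-E torque computation.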

  13. Estimation of numerical uncertainty in computational fluid dynamics simulations of a passively controlled wave energy converter

    DEFF Research Database (Denmark)

    Wang, Weizhi; Wu, Minghao; Palm, Johannes

    2018-01-01

    The wave loads and the resulting motions of floating wave energy converters are traditionally computed using linear radiation–diffraction methods. Yet for certain cases such as survival conditions, phase control and wave energy converters operating in the resonance region, more complete ... dynamics simulations have largely been overlooked in the wave energy sector. In this article, we apply formal verification and validation techniques to computational fluid dynamics simulations of a passively controlled point absorber. The phase control causes the motion response to be highly nonlinear even ... for almost linear incident waves. First, we show that the computational fluid dynamics simulations have acceptable agreement to experimental data. We then present a verification and validation study focusing on the solution verification covering spatial and temporal discretization, iterative and domain ...
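
Solution verification of spatial and temporal discretization typically proceeds by systematic grid refinement: from three solutions at a constant refinement ratio one estimates the observed order of convergence and a Richardson-extrapolated grid-independent value. A sketch with invented numbers (not the article's data):

```python
import math

def observed_order(f_fine, f_med, f_coarse, r):
    """Observed convergence order from three grid solutions, refinement ratio r."""
    return math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)

def richardson(f_fine, f_med, p, r):
    """Richardson-extrapolated estimate of the grid-independent solution."""
    return f_fine + (f_fine - f_med) / (r ** p - 1.0)

# Hypothetical motion-amplitude results on three grids, refinement ratio r = 2
f1, f2, f3 = 1.0125, 1.0500, 1.2000   # fine, medium, coarse
p = observed_order(f1, f2, f3, r=2.0)
print(round(p, 3))                          # 2.0 (second-order convergence)
print(round(richardson(f1, f2, p, 2.0), 4))  # 1.0 (extrapolated value)
```

The gap between the fine-grid result and the extrapolated value then feeds a discretization-uncertainty estimate such as the grid convergence index.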

  14. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for final disposal; e.g., criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. Crucial viewpoints are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  15. Securing optical code-division multiple-access networks with a postswitching coding scheme of signature reconfiguration

    Science.gov (United States)

    Huang, Jen-Fa; Meng, Sheng-Hui; Lin, Ying-Chen

    2014-11-01

    The optical code-division multiple-access (OCDMA) technique is considered a good candidate for providing optical-layer security. An enhanced OCDMA network security mechanism is presented in which pseudonoise (PN) maximal-length sequence (M-sequence) signature codes are switched to protect against eavesdropping. Signature codes unique to individual OCDMA-network users are reconfigured according to the register state of the controlling electrical shift registers. Examples of signature reconfiguration following state switching of the controlling shift register are numerically illustrated for both the network user and the eavesdropper. Dynamically changing the PN state of the shift register to reconfigure the user signature sequence is shown to hinder eavesdroppers' efforts to decode correct data sequences. The proposed scheme increases the probability of eavesdroppers committing decoding errors and thereby substantially enhances the confidentiality of an OCDMA network.
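
Shift-register-driven signature reconfiguration can be sketched with a maximal-length sequence generated by a linear-feedback shift register (LFSR). The taps and register size below are a textbook 3-stage example, not the scheme's actual code parameters:

```python
def lfsr_msequence(taps, state, nbits, n):
    """Generate n output bits from a Fibonacci LFSR.
    taps: bit positions XORed for feedback (0 = LSB);
    state: nonzero initial register contents; nbits: register length."""
    out = []
    for _ in range(n):
        out.append(state & 1)                       # output the LSB
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1                  # feedback bit
        state = (state >> 1) | (fb << (nbits - 1))  # shift right, insert feedback
    return out

# 3-stage LFSR (x^3 + x + 1): period 2**3 - 1 = 7, an M-sequence.
# Changing the initial register state cyclically shifts the sequence,
# which is the "signature reconfiguration" idea in miniature.
seq = lfsr_msequence(taps=[0, 1], state=0b001, nbits=3, n=14)
print(seq[:7])               # [1, 0, 0, 1, 0, 1, 1]
print(seq[:7] == seq[7:14])  # True: the sequence repeats with period 7
```

An eavesdropper who correlates against a stale register state sees a shifted M-sequence with near-zero correlation, which is what degrades unauthorized decoding.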

  16. Vibrational signatures of cation-anion hydrogen bonding in ionic liquids: a periodic density functional theory and molecular dynamics study.

    Science.gov (United States)

    Mondal, Anirban; Balasubramanian, Sundaram

    2015-02-05

    Hydrogen bonding in alkylammonium based protic ionic liquids was studied using density functional theory (DFT) and ab initio molecular dynamics (AIMD) simulations. Normal-mode analysis within the harmonic approximation and power spectra of velocity autocorrelation functions were used as tools to obtain the vibrational spectra in both the gas phase and the crystalline phases of these protic ionic liquids. The hydrogen bond vibrational modes were identified in the 150-240 cm(-1) region of the far-infrared (far-IR) spectra. A blue shift in the far-IR mode was observed with an increasing number of hydrogen-bonding sites on the cation; the exact peak position is modulated by the cation-anion hydrogen bond strength. Sub-100 cm(-1) bands in the far-IR spectrum are assigned to the rattling motion of the anions. Calculated NMR chemical shifts of the acidic protons in the crystalline phase of these salts also exhibit the signature of cation-anion hydrogen bonding.
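
The power-spectrum-of-VACF technique mentioned above can be sketched as a discrete cosine transform of a synthetic autocorrelation function; the damped-cosine input is invented so that the spectral peak lands at a known frequency:

```python
import math

def vacf_power_spectrum(vacf, dt, nfreq):
    """Discrete cosine transform of a velocity autocorrelation function:
    I(omega_k) = sum_t vacf[t] * cos(omega_k * t * dt) * dt."""
    n = len(vacf)
    spectrum = []
    for k in range(nfreq):
        omega = k * 2.0 * math.pi / (n * dt)
        s = sum(c * math.cos(omega * i * dt) for i, c in enumerate(vacf)) * dt
        spectrum.append(s)
    return spectrum

# Synthetic VACF: damped cosine at frequency f0 -> spectral peak near f0
dt, f0, n = 0.01, 5.0, 400
vacf = [math.exp(-0.5 * i * dt) * math.cos(2 * math.pi * f0 * i * dt) for i in range(n)]
spec = vacf_power_spectrum(vacf, dt, nfreq=100)
peak_bin = max(range(1, 100), key=lambda k: spec[k])
print(peak_bin / (n * dt))  # peak frequency, approximately 5.0 cycles per unit time
```

In an AIMD analysis the same transform is applied to atom velocities from the trajectory, and peaks in the far-IR region are assigned to modes such as the cation-anion hydrogen-bond stretch.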

  17. Real time gamma-ray signature identifier

    Science.gov (United States)

    Rowland, Mark [Alamo, CA; Gosnell, Tom B [Moraga, CA; Ham, Cheryl [Livermore, CA; Perkins, Dwight [Livermore, CA; Wong, James [Dublin, CA

    2012-05-15

    A real-time gamma-ray signature/source identification method and system that uses principal components analysis (PCA) to transform and substantially reduce one or more comprehensive spectral libraries of nuclear material types and configurations into concise representations/signatures, representing and indexing each individual predetermined spectrum in principal component (PC) space. An unknown gamma-ray signature may then be compared against the representative signatures to find a match, or at least be characterized from among all the entries in the library, with a single regression or simple projection into the PC space. This substantially reduces processing time and computing resources and enables real-time characterization and/or identification.
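
The match-by-projection idea can be sketched as follows; the principal components, spectra, and isotope labels are invented, and a real system would derive the PCs from the spectral library rather than hardcode them:

```python
import math

def project(spectrum, components):
    """Project a spectrum onto (assumed precomputed, orthonormal) principal components."""
    return [sum(s * c for s, c in zip(spectrum, comp)) for comp in components]

def nearest(signature, library):
    """Nearest library entry in PC space: a single projection, no full regression."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(library, key=lambda name: dist(signature, library[name]))

# Hypothetical: two orthonormal PCs over four energy bins
pcs = [[0.5, 0.5, 0.5, 0.5], [0.5, 0.5, -0.5, -0.5]]
library_spectra = {"Cs-137": [9, 8, 1, 1], "Co-60": [1, 2, 8, 9]}
library = {name: project(s, pcs) for name, s in library_spectra.items()}

unknown = [8.5, 8.2, 1.3, 0.9]  # noisy measurement of one of the sources
print(nearest(project(unknown, pcs), library))  # Cs-137
```

Because each library spectrum is stored only as a few PC coordinates, the comparison is a handful of arithmetic operations, which is what makes the real-time claim plausible.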

  18. Supervised Multi-Authority Scheme with Blind Signature for IoT with Attribute Based Encryption

    Science.gov (United States)

    Nissenbaum, O. V.; Ponomarov, K. Y.; Zaharov, A. A.

    2018-04-01

    This article proposes a three-party cryptographic scheme for verifying device attributes with a Supervisor and a Certification Authority (CA) for attribute-based encryption. Two options are suggested: using a message authentication code and using a digital signature. The first version is suitable for networks with one CA, and the second for networks with several CAs, including dynamic systems. The addition of a blind signature to this scheme is also proposed, to preserve the confidentiality of the device attributes from the CA. The introduction gives a definition and a brief historical overview of attribute-based encryption (ABE) and addresses the use of ABE in the Internet of Things.
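
The blind-signature addition can be illustrated with the classic Chaum RSA blind signature, which is not necessarily the article's construction; the parameters below are tiny and insecure, for illustration only:

```python
# Textbook RSA blind signature (Chaum-style) with toy parameters.
p, q, e = 61, 53, 17
n = p * q                           # RSA modulus
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+ modular inverse)

r = 50   # blinding factor chosen by the user, gcd(r, n) = 1
m = 42   # message (in practice a hash of the device attributes)

blinded = (m * pow(r, e, n)) % n        # user blinds the message
blind_sig = pow(blinded, d, n)          # signer (CA) signs without seeing m
sig = (blind_sig * pow(r, -1, n)) % n   # user unblinds: sig = m**d mod n
print(pow(sig, e, n) == m)              # True: valid signature on m
```

The signer never learns `m`, which is exactly the property used to hide device attributes from the CA while still obtaining a verifiable signature on them.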

  19. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3, and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy the new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3, and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3, and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  20. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  1. ''Electron Conic'' Signatures observed in the nightside auroral zone and over the polar cap

    International Nuclear Information System (INIS)

    Menietti, J.D.; Burch, J.L.

    1985-01-01

    A preliminary search of the Dynamics Explorer 1 high-altitude plasma instrument data base has yielded examples of ''electron conic'' signatures. The three example passes show an association with regions of downward electron acceleration and upward ion beams, but this is not true of all the electron conic events. The electron conic signatures are clearly discernible on energy-flux-versus-time color spectrograms as pairs of discrete vertical bands which are symmetric about a pitch angle of approximately 180°. One of the examples is a polar cap pass with electron conic signatures observed at invariant latitudes from 84° to 75°. The other two cases are nightside auroral zone passes in which the regions of detectable electron conics are spatially more confined, covering only about 1° in invariant latitude. The conic signatures have been found at energies ranging upward from 50 eV, and their angular separation is larger than expected for a loss cone feature. If the electrons conserve the first adiabatic invariant in a dipole magnetic field, and in some cases a parallel electric field, the mirroring altitude varies between about 500 km and 8000 km, which is above the atmospheric loss region. For this reason, and in analogy with the formation of ion conics, we suggest that the conic signatures are produced by heating of the electrons perpendicular to the magnetic field

  2. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
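
The combination of statistical sampling with NDA verification rests on the detection probability of random sampling: the chance that at least one of D defective items appears in a sample of n items drawn from N (hypergeometric). A sketch with illustrative numbers, not the report's actual inspection parameters:

```python
from math import comb

def detection_probability(N, D, n):
    """Probability that a random sample of n items from N contains at least
    one of D defective items (assuming any sampled defect is detected)."""
    return 1.0 - comb(N - D, n) / comb(N, n)

def sample_size(N, D, goal=0.95):
    """Smallest sample size giving detection probability >= goal."""
    for n in range(1, N + 1):
        if detection_probability(N, D, n) >= goal:
            return n
    return N

# e.g. 100 in-core fuel elements, 10 diverted/substituted, 95% detection goal
print(sample_size(100, 10, 0.95))  # 25
```

This shows why sampling alone becomes expensive for small diversions, and why combining it with integral reactivity measurements sensitive to the total fissile inventory is more effective than either technique by itself.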

  3. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  4. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.

  5. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  6. 42 CFR 424.36 - Signature requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Signature requirements. 424.36 Section 424.36... (CONTINUED) MEDICARE PROGRAM CONDITIONS FOR MEDICARE PAYMENT Claims for Payment § 424.36 Signature requirements. (a) General rule. The beneficiary's own signature is required on the claim unless the beneficiary...

  7. Unsupervised signature extraction from forensic logs

    NARCIS (Netherlands)

    Thaler, S.M.; Menkovski, V.; Petkovic, M.; Altun, Y.; Das, K.; Mielikäinen, T.; Malerba, D.; Stefanowski, J.; Read, J.; Žitnik, M.; Ceci, M.

    2017-01-01

    Signature extraction is a key part of forensic log analysis. It involves recognizing patterns in log lines such that log lines that originated from the same line of code are grouped together. A log signature consists of immutable parts and mutable parts. The immutable parts define the signature, and
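
The grouping step can be sketched with a simple heuristic that masks likely-mutable tokens to recover a template; this illustrates the signature concept only, not the paper's unsupervised method:

```python
import re
from collections import defaultdict

def signature(line):
    """Mask likely-mutable tokens (IPs, hex values, numbers) so that log lines
    from the same line of code collapse to the same immutable template."""
    line = re.sub(r"\b\d+\.\d+\.\d+\.\d+\b", "<IP>", line)
    line = re.sub(r"\b0x[0-9a-fA-F]+\b", "<HEX>", line)
    line = re.sub(r"\b\d+\b", "<NUM>", line)
    return line

logs = [
    "connection from 10.0.0.1 port 4422",
    "connection from 192.168.1.9 port 80",
    "checksum mismatch at 0xdeadbeef",
]
groups = defaultdict(list)
for entry in logs:
    groups[signature(entry)].append(entry)

print(len(groups))          # 2 distinct templates
print(sorted(groups)[1])    # connection from <IP> port <NUM>
```

The first two lines share one signature despite different mutable parts, which is exactly the grouping behavior forensic log analysis needs.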

  8. Verification of Continuous Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Wisniewski, Rafael

    2011-01-01

    This paper presents a method for abstracting continuous dynamical systems by timed automata. The abstraction is based on partitioning the state space of a dynamical system using positive invariant sets, which form cells that represent locations of a timed automaton. The abstraction is intended......, which is generated utilizing sub-level sets of Lyapunov functions, as they are positive invariant sets. It is shown that this partition generates sound and complete abstractions. Furthermore, the complete abstractions can be composed of multiple timed automata, allowing parallelization...

  9. Development of Neutron Energy Spectral Signatures for Passive Monitoring of Spent Nuclear Fuels in Dry Cask Storage

    Science.gov (United States)

    Harkness, Ira; Zhu, Ting; Liang, Yinong; Rauch, Eric; Enqvist, Andreas; Jordan, Kelly A.

    2018-01-01

    Demand for spent nuclear fuel dry casks as an interim storage solution has increased globally and the IAEA has expressed a need for robust safeguards and verification technologies for ensuring the continuity of knowledge and the integrity of radioactive materials inside spent fuel casks. Existing research has been focusing on "fingerprinting" casks based on count rate statistics to represent radiation emission signatures. The current research aims to expand to include neutron energy spectral information as part of the fuel characteristics. First, spent fuel composition data are taken from the Next Generation Safeguards Initiative Spent Fuel Libraries, representative for Westinghouse 17×17 PWR assemblies. The ORIGEN-S code then calculates the spontaneous fission and (α,n) emissions for individual fuel rods, followed by detailed MCNP simulations of neutrons transported through the fuel assemblies. A comprehensive database of neutron energy spectral profiles is to be constructed, with different enrichment, burn-up, and cooling time conditions. The end goal is to utilize the computational spent fuel library, predictive algorithm, and a pressurized 4He scintillator to verify the spent fuel assemblies inside a cask. This work identifies neutron spectral signatures that correlate with the cooling time of spent fuel. Both the total and relative contributions from spontaneous fission and (α,n) change noticeably with respect to cooling time, due to the relatively short half-life (18 years) of the major neutron source 244Cm. Identification of this and other neutron spectral signatures allows the characterization of spent nuclear fuels in dry cask storage.
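
The cooling-time signature driven by 244Cm decay can be sketched with a bare decay-law model; treating 244Cm as the sole neutron source is a deliberate simplification, and the rates below are invented:

```python
import math

CM244_HALF_LIFE = 18.1  # years (approximate)

def neutron_rate(rate_at_discharge, t_years, half_life=CM244_HALF_LIFE):
    """Total neutron emission assuming 244Cm dominates the source term
    (an illustrative simplification of the spectral-signature idea)."""
    return rate_at_discharge * 0.5 ** (t_years / half_life)

def cooling_time(rate_now, rate_at_discharge, half_life=CM244_HALF_LIFE):
    """Invert the decay law to estimate cooling time from the rate ratio."""
    return half_life * math.log(rate_at_discharge / rate_now) / math.log(2.0)

r0 = 1.0e8                  # hypothetical discharge-time neutron rate (n/s)
r = neutron_rate(r0, 20.0)  # rate after 20 years of cooling
print(round(cooling_time(r, r0), 6))  # 20.0
```

The actual work goes further, using how the spontaneous-fission and (α,n) spectral shapes evolve relative to each other, rather than the total rate alone.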

  10. 7 CFR 718.9 - Signature requirements.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Signature requirements. 718.9 Section 718.9... MULTIPLE PROGRAMS General Provisions § 718.9 Signature requirements. (a) When a program authorized by this chapter or Chapter XIV of this title requires the signature of a producer; landowner; landlord; or tenant...

  11. 27 CFR 17.6 - Signature authority.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Signature authority. 17.6... PRODUCTS General Provisions § 17.6 Signature authority. No claim, bond, tax return, or other required... other proper notification of signature authority has been filed with the TTB office where the required...

  12. 48 CFR 804.101 - Contracting officer's signature.

    Science.gov (United States)

    2010-10-01

    ... signature. 804.101 Section 804.101 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.101 Contracting officer's signature. (a) If a... signature. ...

  13. Observational Signatures of Transverse Magnetohydrodynamic Waves and Associated Dynamic Instabilities in Coronal Flux Tubes

    Energy Technology Data Exchange (ETDEWEB)

    Antolin, P.; Moortel, I. De [School of Mathematics and Statistics, University of St. Andrews, St. Andrews, Fife KY16 9SS (United Kingdom); Doorsselaere, T. Van [Centre for mathematical Plasma Astrophysics, Mathematics Department, KU Leuven, Celestijnenlaan 200B bus 2400, B-3001 Leuven (Belgium); Yokoyama, T., E-mail: patrick.antolin@st-andrews.ac.uk [Department of Earth and Planetary Science, The University of Tokyo, Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)

    2017-02-20

    Magnetohydrodynamic (MHD) waves permeate the solar atmosphere and constitute potential coronal heating agents. Yet, the waves detected so far may be but a small subset of the true existing wave power. Detection is limited by instrumental constraints but also by wave processes that localize the wave power in undetectable spatial scales. In this study, we conduct 3D MHD simulations and forward modeling of standing transverse MHD waves in coronal loops with uniform and non-uniform temperature variation in the perpendicular cross-section. The observed signatures are largely dominated by the combination of the Kelvin–Helmholtz instability (KHI), resonant absorption, and phase mixing. In the presence of a cross-loop temperature gradient, we find that emission lines sensitive to the loop core catch different signatures compared to those that are more sensitive to the loop boundary and the surrounding corona, leading to an out-of-phase intensity and Doppler velocity modulation produced by KHI mixing. In all of the considered models, common signatures include an intensity and loop width modulation at half the kink period, a fine strand-like structure, a characteristic arrow-shaped structure in the Doppler maps, and overall line broadening in time but particularly at the loop edges. For our model, most of these features can be captured with a spatial resolution of 0.″33 and a spectral resolution of 25 km s⁻¹, although we do obtain severe over-estimation of the line width. Resonant absorption leads to a significant decrease of the observed kinetic energy from Doppler motions over time, which is not recovered by a corresponding increase in the line width from phase mixing and KHI motions. We estimate this hidden wave energy to be a factor of 5–10 of the observed value.

  14. Status on development and verification of reactivity initiated accident analysis code for PWR (NODAL3)

    International Nuclear Information System (INIS)

    Peng Hong Liem; Surian Pinem; Tagor Malem Sembiring; Tran Hoai Nam

    2015-01-01

    A coupled neutronics thermal-hydraulics code NODAL3 has been developed based on the nodal few-group neutron diffusion theory in 3-dimensional Cartesian geometry for typical pressurized water reactor (PWR) static and transient analyses, especially for reactivity initiated accidents (RIA). The spatial variables are treated by using a polynomial nodal method (PNM), while for the neutron dynamics solver the adiabatic and improved quasi-static methods are adopted. A simple single-channel thermal-hydraulics module and its steam table are implemented into the code. Verification work on static and transient benchmarks is being conducted to assess the accuracy of the code. For the static benchmark verification, the IAEA-2D, IAEA-3D, BIBLIS and KOEBERG light water reactor (LWR) benchmark problems were selected, while for the transient benchmark verification, the OECD NEACRP 3-D LWR Core Transient Benchmark and NEA-NSC 3-D/1-D PWR Core Transient Benchmark (Uncontrolled Withdrawal of Control Rods at Zero Power) were chosen. Excellent agreement of the NODAL3 results with the reference solutions and other validated nodal codes was confirmed. (author)

  15. European Train Control System: A Case Study in Formal Verification

    Science.gov (United States)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
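The kind of parameter constraint the authors derive can be illustrated with the textbook braking-distance condition: a hedged sketch only, since the actual KeYmaera model also accounts for reaction delays and disturbance bounds.

```python
def safe_to_proceed(z, v, m, b, eps):
    """Controllability constraint in the spirit of the ETCS case study:
    a train at position z with speed v can still stop before the end of
    its movement authority m under maximum braking deceleration b iff the
    remaining distance covers the braking distance v^2/(2b) plus a safety
    margin eps. All parameter names here are illustrative."""
    return m - z >= v * v / (2.0 * b) + eps
```

A supervisor built on such a check would permit acceleration only while the predicate holds and otherwise command braking, which is the shape of the cooperation protocol verified in the paper.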

  16. Perancangan Aplikasi Undeniable Digital Signature Dengan Algoritma Chaum’s Blind Signature

    OpenAIRE

    Simanjuntak, Martin Dennain

    2012-01-01

    There is a desperate need for a security system in the exchange of information via computer media, so that information cannot be accessed by unauthorized parties. One such security system is the use of digital signatures as a means of authenticating the authenticity of the digital documents being exchanged. By using an undeniable digital signature system, a secure digital document exchange can be achieved, in which the system is free from the form of rejection...

  17. Practical quantum digital signature

    Science.gov (United States)

    Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing

    2016-03-01

    Guaranteeing nonrepudiation, unforgeability as well as transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signature (QDS) aims to provide information-theoretic security for this cryptographic task. However, to date, the previously proposed QDS protocols have been impractical due to various challenging problems, most importantly the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Besides, our QDS protocol can be practically implemented over more than 100 km under current mature technology as used in quantum key distribution.

  18. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  19. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  20. 25 CFR 213.10 - Lessor's signature.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Lessor's signature. 213.10 Section 213.10 Indians BUREAU... MEMBERS OF FIVE CIVILIZED TRIBES, OKLAHOMA, FOR MINING How to Acquire Leases § 213.10 Lessor's signature... thumbprint which shall be designated as “right” or “left” thumbmark. Such signatures must be witnessed by two...

  1. Initial Semantics for Strengthened Signatures

    Directory of Open Access Journals (Sweden)

    André Hirschowitz

    2012-02-01

    Full Text Available We give a new general definition of arity, yielding the companion notions of signature and associated syntax. This setting is modular in the sense requested by Ghani and Uustalu: merging two extensions of syntax corresponds to building an amalgamated sum. These signatures are too general in the sense that we are not able to prove the existence of an associated syntax in this general context. So we have to select arities and signatures for which there exists the desired initial monad. For this, we follow a track opened by Matthes and Uustalu: we introduce a notion of strengthened arity and prove that the corresponding signatures have initial semantics (i.e., associated syntax). Our strengthened arities admit colimits, which allows the treatment of the λ-calculus with explicit substitution.

  2. Spectral signature selection for mapping unvegetated soils

    Science.gov (United States)

    May, G. A.; Petersen, G. W.

    1975-01-01

    Airborne multispectral scanner data covering the wavelength interval from 0.40-2.60 microns were collected at an altitude of 1000 m above the terrain in southeastern Pennsylvania. Uniform training areas were selected within three sites from this flightline. Soil samples were collected from each site and a procedure developed to allow assignment of scan line and element number from the multispectral scanner data to each sampling location. These soil samples were analyzed on a spectrophotometer and laboratory spectral signatures were derived. After correcting for solar radiation and atmospheric attenuation, the laboratory signatures were compared to the spectral signatures derived from these same soils using multispectral scanner data. Both signatures were used in supervised and unsupervised classification routines. Computer-generated maps using the laboratory and multispectral scanner derived signatures resulted in maps that were similar to maps resulting from field surveys. Approximately 90% agreement was obtained between classification maps produced using multispectral scanner derived signatures and laboratory derived signatures.
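The supervised classification step can be sketched as a minimum-distance-to-mean classifier against training signatures. This is a generic illustration: the band values and class names below are invented, and the study's actual classification routines are not specified in the abstract.

```python
import numpy as np

def classify_spectra(pixels, class_means):
    """Minimum-distance-to-mean supervised classification: each pixel
    spectrum is assigned to the class whose training signature (mean
    spectrum) is nearest in Euclidean distance."""
    names = list(class_means)
    means = np.array([class_means[k] for k in names])     # (classes, bands)
    # Pairwise distances between every pixel and every class mean
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return [names[i] for i in dists.argmin(axis=1)]
```

Running laboratory-derived and scanner-derived signatures through the same classifier is what allows the map-to-map agreement comparison reported above.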

  3. Genome signature analysis of thermal virus metagenomes reveals Archaea and thermophilic signatures.

    Science.gov (United States)

    Pride, David T; Schoenfeld, Thomas

    2008-09-17

    Metagenomic analysis provides a rich source of biological information for otherwise intractable viral communities. However, study of viral metagenomes has been hampered by its nearly complete reliance on BLAST algorithms for identification of DNA sequences. We sought to develop algorithms for examination of viral metagenomes to identify the origin of sequences independent of BLAST algorithms. We chose viral metagenomes obtained from two hot springs, Bear Paw and Octopus, in Yellowstone National Park, as they represent simple microbial populations where comparatively large contigs were obtained. Thermal spring metagenomes have high proportions of sequences without significant Genbank homology, which has hampered identification of viruses and their linkage with hosts. To analyze each metagenome, we developed a method to classify DNA fragments using genome signature-based phylogenetic classification (GSPC), where metagenomic fragments are compared to a database of oligonucleotide signatures for all previously sequenced Bacteria, Archaea, and viruses. From both Bear Paw and Octopus hot springs, each assembled contig had more similarity to other metagenome contigs than to any sequenced microbial genome based on GSPC analysis, suggesting a genome signature common to each of these extreme environments. While viral metagenomes from Bear Paw and Octopus share some similarity, the genome signatures from each locale are largely unique. GSPC using a microbial database predicts most of the Octopus metagenome has archaeal signatures, while bacterial signatures predominate in Bear Paw; a finding consistent with those of Genbank BLAST. When using a viral database, the majority of the Octopus metagenome is predicted to belong to archaeal virus Families Globuloviridae and Fuselloviridae, while none of the Bear Paw metagenome is predicted to belong to archaeal viruses. 
As expected, when microbial and viral databases are combined, each of the Octopus and Bear Paw metagenomic contigs
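The core idea of genome-signature classification, comparing oligonucleotide usage profiles rather than BLAST alignments, can be sketched as follows. This shows the profile computation only, not the published GSPC classifier:

```python
from itertools import product

def genome_signature(seq, k=4):
    """k-mer (tetranucleotide by default) usage profile of a DNA sequence,
    returned as a frequency vector over all 4**k possible k-mers."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = dict.fromkeys(kmers, 0)
    for i in range(len(seq) - k + 1):
        window = seq[i : i + k]
        if window in counts:            # skips windows containing N, etc.
            counts[window] += 1
    total = sum(counts.values()) or 1
    return [counts[m] / total for m in kmers]

def signature_distance(p, q):
    """L1 distance between two usage profiles; smaller means more similar."""
    return sum(abs(a - b) for a, b in zip(p, q))
```

A metagenomic fragment would be assigned to whichever reference genome's profile lies closest, which is the BLAST-independent classification principle the study builds on.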

  4. Signatures de l'invisible

    CERN Multimedia

    CERN Press Office. Geneva

    2000-01-01

    "Signatures of the Invisible" is a unique collaboration between contemporary artists and contemporary physicists which has the potential to help redefine the relationship between science and art. "Signatures of the Invisible" is jointly organised by the London Institute - the world's largest college of art and design - and CERN*, the world's leading particle physics laboratory. 12 leading visual artists:

  5. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. An introduction of two level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  6. Six years of experience in the planning and verification of the IMRT dynamics with portal dosimetry

    International Nuclear Information System (INIS)

    Molina Lopez, M. Y.; Pardo Perez, E.; Ruiz Maqueda, S.; Castro Novais, J.; Diaz Gavela, A. A.

    2013-01-01

    The objective of this study is to review the IMRT verification method used throughout the 6 years of operation of the radiophysics and radiation protection service, analyzing the evaluation parameters of each field for the 718 IMRT treatments performed during this period. (Author)
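Portal-dosimetry field evaluation commonly rests on the gamma index of Low et al.; below is a 1-D sketch with hypothetical 3%/3 mm tolerances, since the abstract does not spell out the paper's exact evaluation parameters.

```python
import math

def gamma_index(ref, ev, dx, dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma index: a reference point passes (gamma <= 1) if some
    evaluated point agrees within dose_tol of the maximum dose and within
    dist_tol millimetres. ref/ev are dose profiles sampled every dx mm."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(ev):
            dose_term = (de - dr) / (dose_tol * d_max)
            dist_term = (j - i) * dx / dist_tol
            best = min(best, math.hypot(dose_term, dist_term))
        gammas.append(best)
    return gammas
```

A field would typically be reported by its gamma pass rate, the fraction of points with gamma at or below 1.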

  7. A group signature scheme based on quantum teleportation

    International Nuclear Information System (INIS)

    Wen Xiaojun; Tian Yuan; Ji Liping; Niu Xiamu

    2010-01-01

    In this paper, we present a group signature scheme using quantum teleportation. Different from classical group signature and current quantum signature schemes, which could only deliver either group signature or unconditional security, our scheme guarantees both by adopting quantum key preparation, quantum encryption algorithm and quantum teleportation. Security analysis proved that our scheme has the characteristics of group signature, non-counterfeit, non-disavowal, blindness and traceability. Our quantum group signature scheme has a foreseeable application in the e-payment system, e-government, e-business, etc.

  8. A group signature scheme based on quantum teleportation

    Energy Technology Data Exchange (ETDEWEB)

    Wen Xiaojun; Tian Yuan; Ji Liping; Niu Xiamu, E-mail: wxjun36@gmail.co [Information Countermeasure Technique Research Institute, Harbin Institute of Technology, Harbin 150001 (China)

    2010-05-01

    In this paper, we present a group signature scheme using quantum teleportation. Different from classical group signature and current quantum signature schemes, which could only deliver either group signature or unconditional security, our scheme guarantees both by adopting quantum key preparation, quantum encryption algorithm and quantum teleportation. Security analysis proved that our scheme has the characteristics of group signature, non-counterfeit, non-disavowal, blindness and traceability. Our quantum group signature scheme has a foreseeable application in the e-payment system, e-government, e-business, etc.

  9. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  10. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  11. Maximizing biomarker discovery by minimizing gene signatures

    Directory of Open Access Journals (Sweden)

    Chang Chang

    2011-12-01

    Full Text Available Abstract Background The use of gene signatures can potentially be of considerable value in the field of clinical diagnosis. However, gene signatures defined with different methods can differ considerably even when applied to the same disease and the same endpoint. Previous studies have shown that the correct selection of subsets of genes from microarray data is key for the accurate classification of disease phenotypes, and a number of methods have been proposed for the purpose. However, these methods refine the subsets by considering each feature individually, and they do not confirm the association between the genes identified in each gene signature and the phenotype of the disease. We propose an innovative method termed Minimize Feature's Size (MFS) based on multiple-level similarity analyses and the association between the genes and the disease for breast cancer endpoints, comparing classifier models generated in the second phase of MicroArray Quality Control (MAQC-II) and trying to develop effective meta-analysis strategies to transform the MAQC-II signatures into a robust and reliable set of biomarkers for clinical applications. Results We analyzed the similarity of the multiple gene signatures within an endpoint and between the two endpoints of breast cancer at the probe and gene levels. The results indicate that disease-related genes can preferentially be selected as components of a gene signature, and that the gene signatures for the two endpoints could be interchangeable. The minimized signatures were built at the probe level by using MFS for each endpoint. By applying the approach, we generated a much smaller gene signature with similar predictive power compared with the gene signatures from MAQC-II. Conclusions Our results indicate that gene signatures of both large and small sizes could perform equally well in clinical applications. 
In addition, consistency and biological significance can be detected among different gene signatures, reflecting the

  12. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  13. Lattice-Based Revocable Certificateless Signature

    Directory of Open Access Journals (Sweden)

    Ying-Hao Hung

    2017-10-01

    Full Text Available Certificateless signatures (CLS) are notable because they resolve the key escrow problem of ID-based signatures and avoid the certificate management problem of conventional signatures. However, the security of most previous CLS schemes relies on the difficulty of solving the discrete logarithm or large-integer factorization problems. These two problems could be solved by quantum computers in the future, so the signature schemes based on them will then become insecure. For post-quantum cryptography, lattice-based cryptography is significant due to its efficiency and security. However, no study has addressed the revocation problem in the existing lattice-based CLS schemes. In this paper, we focus on the revocation issue and present the first revocable CLS (RCLS) scheme over lattices. Based on the short integer solution (SIS) assumption over lattices, the proposed lattice-based RCLS scheme is shown to be existentially unforgeable against adaptive chosen-message attacks. By performance analysis and comparisons, the proposed lattice-based RCLS scheme outperforms the previously proposed lattice-based CLS scheme in terms of private key size, signature length and the revocation mechanism.
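The SIS assumption the scheme rests on asks for a short nonzero integer vector in the kernel of a random matrix modulo q; checking a candidate solution is straightforward. Toy dimensions for illustration only, since real schemes use large random matrices:

```python
import math

def is_sis_solution(A, z, q, beta):
    """Verify a candidate short integer solution: A z = 0 (mod q) with
    0 < ||z||_2 <= beta. Finding such a z for random A and suitable
    parameters is believed hard even for quantum computers, which is what
    the RCLS scheme's unforgeability reduces to."""
    Az = [sum(row[j] * z[j] for j in range(len(z))) % q for row in A]
    norm = math.sqrt(sum(x * x for x in z))
    return all(v == 0 for v in Az) and 0.0 < norm <= beta
```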

  14. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  15. Corticosteroid receptors adopt distinct cyclical transcriptional signatures.

    Science.gov (United States)

    Le Billan, Florian; Amazit, Larbi; Bleakley, Kevin; Xue, Qiong-Yao; Pussard, Eric; Lhadj, Christophe; Kolkhof, Peter; Viengchareun, Say; Fagart, Jérôme; Lombès, Marc

    2018-05-07

    Mineralocorticoid receptors (MRs) and glucocorticoid receptors (GRs) are two closely related hormone-activated transcription factors that regulate major pathophysiologic functions. High homology between these receptors accounts for the crossbinding of their corresponding ligands, MR being activated by both aldosterone and cortisol and GR essentially activated by cortisol. Their coexpression and ability to bind similar DNA motifs highlight the need to investigate their respective contributions to overall corticosteroid signaling. Here, we decipher the transcriptional regulatory mechanisms that underlie selective effects of MRs and GRs on shared genomic targets in a human renal cellular model. Kinetic, serial, and sequential chromatin immunoprecipitation approaches were performed on the period circadian protein 1 ( PER1) target gene, providing evidence that both receptors dynamically and cyclically interact at the same target promoter in a specific and distinct transcriptional signature. During this process, both receptors regulate PER1 gene by binding as homo- or heterodimers to the same promoter region. Our results suggest a novel level of MR-GR target gene regulation, which should be considered for a better and integrated understanding of corticosteroid-related pathophysiology.-Le Billan, F., Amazit, L., Bleakley, K., Xue, Q.-Y., Pussard, E., Lhadj, C., Kolkhof, P., Viengchareun, S., Fagart, J., Lombès, M. Corticosteroid receptors adopt distinct cyclical transcriptional signatures.

  16. The impact of gyre dynamics on the mid-depth salinity signature of the eastern North Atlantic

    Science.gov (United States)

    Burkholder, K. C.; Lozier, M. S.

    2009-04-01

    The Mediterranean Overflow Water (MOW) is widely recognized for its role in establishing the mid-depth salinity signature of the subtropical North Atlantic. However, recent work has revealed an intermittent impact of MOW on the salinity signature of the eastern subpolar basin. This impact results from a temporally variable penetration of the northward flowing branch of the MOW past Porcupine Bank into the eastern subpolar basin. It has been shown that the salinity signature of the eastern subpolar basin, in particular the Rockall Trough, varies with the state of the North Atlantic Oscillation (NAO): during persistent periods of strong winds (high NAO index), when the subpolar front moves eastward, waters in the subpolar gyre block the northward flowing MOW, preventing its entry into the subpolar gyre. Conversely, during persistent periods of weak winds (low NAO index), the front moves westward, allowing MOW to penetrate north of Porcupine Bank and into the subpolar gyre. Here, we investigate the manner in which the spatial and temporal variability in the northward penetration of the MOW and the position of the eastern limb of the subpolar front affect the mid-depth property fields not only in the subpolar gyre, but in the subtropical gyre as well. Using approximately 55 years of historical hydrographic data and output from the 1/12° FLAME model, we analyze the temporal variability of salinity along the eastern boundary and compare this variability to the position of the subpolar front in both the observational record and the FLAME model. We conclude that when the zonal position of the subpolar front moves relatively far offshore and the MOW is able to penetrate to the north, high salinity anomalies are observed at high latitudes and low salinity anomalies are observed at low latitudes. Conversely, when the frontal position shifts to the east, the MOW (and thus, the high salinity signature) is blocked, resulting in a drop in salinity anomalies at high latitudes

  17. A Signature of Inflation from Dynamical Supersymmetry Breaking

    CERN Document Server

    Kinney, W H; Kinney, William H.; Riotto, Antonio

    1998-01-01

    In models of cosmological inflation motivated by dynamical supersymmetry breaking, the potential driving inflation may be characterized by inverse powers of a scalar field. These models produce observables similar to those typical of the hybrid inflation scenario: negligible production of tensor (gravitational wave) modes, and a blue scalar spectral index. In this short note, we show that, unlike standard hybrid inflation models, dynamical supersymmetric inflation (DSI) predicts a measurable deviation from a power-law spectrum of fluctuations, with a variation in the scalar spectral index $|dn/d(\ln k)|$ that may be as large as 0.05. DSI can be observationally distinguished from other hybrid models with cosmic microwave background measurements of the planned sensitivity of ESA's Planck Surveyor.
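The quoted running enters through the standard parametrization of the primordial spectrum (textbook convention, not specific to the DSI model):

```latex
n_s(k) = n_s(k_0) + \frac{dn}{d\ln k}\,\ln\frac{k}{k_0},
\qquad
P(k) \propto \left(\frac{k}{k_0}\right)^{\,n_s(k_0) - 1 + \frac{1}{2}\frac{dn}{d\ln k}\ln(k/k_0)}
```

A running of order 0.05 thus tilts the spectrum measurably across the decades of scale probed by cosmic microwave background experiments.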

  18. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.
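Generating verification conditions for straight-line code reduces to Hoare's assignment rule, wp(x := e, Q) = Q[e/x], applied right to left. A toy string-based sketch over a hypothetical mini-language, not the HDM/Pascal tool's actual VCG:

```python
import re

def wp(assignments, post):
    """Weakest-precondition computation for a list of assignments, each a
    (variable, expression) pair, against a postcondition string: substitute
    each expression for its variable, last assignment first. Whole-word
    substitution keeps 'x' from matching inside longer identifiers."""
    for var, expr in reversed(assignments):
        post = re.sub(rf"\b{re.escape(var)}\b", f"({expr})", post)
    return post
```

The resulting formula is the verification condition: proving it valid (e.g. with a theorem prover such as STP in the HDM toolchain) establishes the Hoare triple.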

  19. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views, i.e., views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  20. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: Reason(s) why a posteriori verification is to be performed; Scope and objectives for the level of verification selected; Development products to be used for the review; Availability and use of user experience; and Actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. 
The development products that were used

  1. General Conversion for Obtaining Strongly Existentially Unforgeable Signatures

    Science.gov (United States)

    Teranishi, Isamu; Oyama, Takuro; Ogata, Wakaha

    We say that a signature scheme is strongly existentially unforgeable (SEU) if no adversary, given message/signature pairs adaptively, can generate a signature on a new message or a new signature on a previously signed message. We propose a general and efficient conversion in the standard model that transforms a secure signature scheme into an SEU signature scheme. In order to construct that conversion, we use a chameleon commitment scheme. Here a chameleon commitment scheme is a variant of commitment scheme such that one can change the committed value after publishing the commitment if one knows the secret key. We define the chosen-message security notion for the chameleon commitment scheme, and show that the signature scheme transformed by our proposed conversion satisfies the SEU property if the chameleon commitment scheme is chosen-message secure. By modifying the proposed conversion, we also give a general and efficient conversion in the random oracle model that transforms a secure signature scheme into an SEU signature scheme. This second conversion also uses a chameleon commitment scheme but requires only key-only attack security for it.
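The trapdoor property of a chameleon commitment, that the secret-key holder can reopen a commitment to any message, can be sketched with the classic discrete-log construction in the style of Krawczyk and Rabin. The parameters below are tiny and insecure, purely for illustration, and the paper's own scheme and security notions are not reproduced here:

```python
# Toy discrete-log chameleon commitment over the order-q subgroup of Z_p*.
p, q, g = 467, 233, 4      # p = 2q + 1 (both prime); g generates the subgroup
x = 57                      # trapdoor (secret key)
h = pow(g, x, p)            # public key

def commit(m, r):
    """CH(m, r) = g^m * h^r mod p, with message m and randomness r in Z_q."""
    return pow(g, m, p) * pow(h, r, p) % p

def collide(m, r, m_new):
    """With the trapdoor x, find r' such that CH(m_new, r') = CH(m, r):
    solve m + x*r = m_new + x*r' (mod q) for r'."""
    return (r + (m - m_new) * pow(x, -1, q)) % q
```

Without x, finding such a collision amounts to computing a discrete logarithm, which is why only the committer can exercise this ability.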

  2. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    Science.gov (United States)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model for the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed loop systems. For these systems the dynamic stability and function over long duration are essential. However, the only evaluation and rating methods for ECLSS up to now are either expensive trial and error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand water has a pivotal role in the human organism. Water accounts for about 60% of the total body mass and is educt and product of numerous metabolic reactions. It is a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a system engineering approach the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those
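The compartment bookkeeping described above can be caricatured as a three-state balance integrated with explicit Euler. All rate constants below are invented for illustration; V-HAB's validated model is far more detailed:

```python
def simulate_water(hours=48.0, dt=0.01):
    """Toy three-compartment water balance (plasma, interstitial,
    intracellular; litres). Intake enters plasma, urine leaves it, and
    inter-compartment flows relax the volumes toward nominal ratios."""
    plasma, inter, cell = 3.2, 11.0, 26.0    # starting volumes (L)
    intake = 0.10                            # drinking rate (L/h)
    k_urine = intake / 3.0                   # chosen so equilibrium plasma = 3 L
    k_pi, k_ic = 0.5, 0.2                    # exchange rate constants (1/h)
    for _ in range(int(hours / dt)):
        j_pi = k_pi * (plasma / 3.0 - inter / 11.0)   # plasma -> interstitial
        j_ic = k_ic * (inter / 11.0 - cell / 26.0)    # interstitial -> cell
        urine = k_urine * plasma
        plasma += (intake - urine - j_pi) * dt
        inter += (j_pi - j_ic) * dt
        cell += j_ic * dt
    return plasma, inter, cell
```

Even this caricature shows the system-engineering point: over- or under-sizing the intake/removal loop shifts every compartment's steady state, which is why the human water model must be dynamic for ECLSS optimization.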

  3. Radar micro-doppler signatures processing and applications

    CERN Document Server

    Chen, Victor C; Miceli, William J

    2014-01-01

    Radar Micro-Doppler Signatures: Processing and applications concentrates on the processing and application of radar micro-Doppler signatures in real world situations, providing readers with a good working knowledge on a variety of applications of radar micro-Doppler signatures.
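A minimal micro-Doppler example: a point scatterer vibrating along the radar line of sight phase-modulates the echo, and the phase derivative is the micro-Doppler signature. The X-band numbers below are hypothetical, and the book covers far richer processing (e.g. time-frequency analysis) than this sketch:

```python
import numpy as np

def peak_micro_doppler(fc=10e9, fv=4.0, amp=0.01, fs=4096.0, T=1.0):
    """Instantaneous Doppler of a scatterer vibrating sinusoidally along
    the line of sight: the two-way echo phase is
    (4*pi/lam) * amp * sin(2*pi*fv*t), and its time derivative (in Hz)
    peaks at 4*pi*amp*fv/lam."""
    lam = 3e8 / fc                                   # carrier wavelength (m)
    t = np.arange(0.0, T, 1.0 / fs)
    phase = (4.0 * np.pi / lam) * amp * np.sin(2.0 * np.pi * fv * t)
    inst_doppler = np.diff(phase) * fs / (2.0 * np.pi)   # Hz
    return np.abs(inst_doppler).max()
```

The same phase history fed to a short-time Fourier transform yields the familiar sinusoidal micro-Doppler trace used to recognize vibrating or rotating targets.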

  4. Electroweak and flavor dynamics at hadron colliders - I

    International Nuclear Information System (INIS)

    Eichten, E.; Lane, K.

    1998-02-01

    This is the first of two reports cataloging the principal signatures of electroweak and flavor dynamics at anti pp and pp colliders. Here, we discuss some of the signatures of dynamical electroweak and flavor symmetry breaking. The framework for dynamical symmetry breaking we assume is technicolor, with a walking coupling α_TC, and extended technicolor. The reactions discussed occur mainly at subprocess energies √s ≲ 1 TeV. They include production of color-singlet and octet technirhos and their decay into pairs of technipions, longitudinal weak bosons, or jets. Technipions, in turn, decay predominantly into heavy fermions. This report will appear in the Proceedings of the 1996 DPF/DPB Summer Study on New Directions for High Energy Physics (Snowmass 96)

  5. From Anosov dynamics to hyperbolic attractors

    Indian Academy of Sciences (India)

    the dynamics on the attractive sets of the self-oscillatory systems and for the original Anosov geodesic flow. The hyperbolic nature ... Hyperbolic theory is a branch of the theory of dynami- ..... Figure 5. Verification of the hyperbolicity criterion for.

  6. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  7. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminative auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
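
    The phase-only correlation at the heart of such verification schemes can be sketched in a few lines. This toy 1-D digital version is a generic illustration of the correlation kernel, not the authors' optical two-step algorithm: it keeps only the phase of the cross-spectrum, so a genuine key/lock pair produces a sharp, delta-like correlation peak while mismatched pairs do not.

```python
import cmath

def dft(x, inverse=False):
    # Naive discrete Fourier transform (O(N^2), fine for a demonstration).
    N = len(x)
    s = 1 if inverse else -1
    out = [sum(x[n] * cmath.exp(s * 2j * cmath.pi * k * n / N)
               for n in range(N)) for k in range(N)]
    return [v / N for v in out] if inverse else out

def phase_only_correlation(a, b):
    # Keep only the phase of the cross-spectrum A * conj(B); the inverse
    # transform then concentrates a matching pair into a single sharp peak.
    A, B = dft(a), dft(b)
    cross = []
    for fa, fb in zip(A, B):
        c = fa * fb.conjugate()
        cross.append(c / abs(c) if abs(c) > 1e-12 else 0j)
    return [v.real for v in dft(cross, inverse=True)]
```

    For a signal and a circularly shifted copy of it, the output peaks at the lag corresponding to the shift (index (N - shift) mod N in this convention), with peak height close to 1; uncorrelated inputs spread that energy over all lags.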

  8. Physics Signatures at CLIC

    CERN Document Server

    Battaglia, Marco

    2001-01-01

    A set of signatures for physics processes of potential interest for the CLIC programme at √s = 1 - 5 TeV is discussed. These signatures, which may correspond to the manifestation of different scenarios of new physics as well as to Standard Model precision tests, are proposed as benchmarks for the optimisation of the CLIC accelerator parameters and for a first definition of the required detector response.

  9. Experimental verification of the new RISOe-A1 airfoil family for wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Dahl, K S; Fuglsang, P; Antoniou, I [Risoe National Lab., Roskilde (Denmark)

    1999-03-01

    This paper concerns the experimental verification of a new airfoil family for wind turbines. The family consists of airfoils in the relative thickness range from 15% to 30%. Three airfoils, Risoe-A1-18, Risoe-A1-21, and Risoe-A1-24, were tested in a wind tunnel. The verification consisted of both static and dynamic measurements. Here, the static results are presented for a Reynolds number of 1.6×10⁶ for the following airfoil configurations: smooth surface (all three airfoils) and Risoe-A1-24 mounted with leading edge roughness, vortex generators, and Gurney-flaps, respectively. All three airfoils have a constant lift curve slope and almost constant drag coefficient until the maximum lift coefficient of about 1.4 is reached. The experimental results are compared with corresponding computational results from the general-purpose flow solver, EllipSys2D, showing good agreement. (au)

  10. Additional signature of the dynamical Casimir effect in a superconducting circuit

    International Nuclear Information System (INIS)

    Rego, Andreson L.C.; Farina, C.; Silva, Hector O.; Alves, Danilo T.

    2013-01-01

    Full text: The dynamical Casimir effect (DCE) is one of the most fascinating quantum vacuum effects; it consists, essentially, in particle creation as a result of the interaction between a quantized field and a moving mirror. In this sense, particle creation due to external time-dependent potentials or backgrounds, or even time-dependent electromagnetic properties of a material medium, can also be included in a general definition of the DCE. For simplicity, this interaction is in general simulated by means of idealized boundary conditions (BC). As a consequence of the particle creation, the moving mirror experiences a dissipative radiation reaction force acting on it. In order to generate an appreciable number of photons to be observed, the DCE was investigated in other contexts, for example in circuit quantum electrodynamics. This theory predicted a high photon creation rate from modulation of the length of an open transmission line coupled to a superconducting quantum interference device (SQUID), an extremely sensitive magnetometer (J.R. Johansson et al, 2009/2010). A time-dependent magnetic flux can be applied to the SQUID, changing its inductance and leading to a time-dependent BC which simulates a moving boundary. It was in this scenario that the first observation of the DCE was announced by Wilson and collaborators (Wilson et al, 2011). Taking as motivation the experiment that observed the DCE, we investigate the influence on particle creation via the DCE of the generalized time-dependent Robin BC, which adds an extra term involving the second-order time derivative of the field. This kind of BC may appear quite naturally in the context of circuit quantum electrodynamics, and the extra term was neglected in the theoretical analysis underlying the first observation of the DCE. Appropriate adjustments of this new parameter can not only enhance the total number of created particles but also give rise to a non-parabolic shape of the particle creation spectral

  11. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  12. DYNAMIC SUFFICIENCY OF THE MAGNETICALLY SUSPENDED TRAIN

    Directory of Open Access Journals (Sweden)

    V. A. Polyakov

    2013-11-01

    Full Text Available Purpose. The basic criterion for the consumer evaluation of a magnetically suspended train is the quality of its mechanical motion. This motion is realized in unpredictable conditions and, to preserve its purposefulness, should adapt to them. Such adaptation is possible only within the limits of the system's dynamic sufficiency. Sufficiency is understood as the presence in the system of resources that allow it to realize the required motions without violating actual constraints. The presence of such resources is therefore a necessary condition for preserving the required purposefulness of the train's dynamics, and verification of this sufficiency is a major component of its dynamic research. Methodology. Methods of set theory are used in this work. The desired and actual reachability spaces of the train are found. The train is considered dynamically sufficient in the zones where the specified spaces overlap. Findings. Within the limits of the accepted treatment of the train's dynamic sufficiency, verification of its presence, as well as of its margin (or deficiency), can be performed by searching for and then estimating such overlap zones. Operatively (directly during motion), this can be realized on the train's ODC with the use, for example, of the computer mathematics system Mathematica, which offers highly efficient information processing while demanding comparatively small resources. The efficiency of the created technique is illustrated by an example of vehicle acceleration research. The calculation is executed using the constructed computer model of the interaction of the vehicle's independent traction electromagnetic subsystem with its mechanical subsystem. Originality. A technique for verifying the dynamic sufficiency of a high-speed magnetically suspended train is developed. The technique is highly efficient; it provides sufficient presentation and demands an expense of the

  13. A signature of attractor dynamics in the CA3 region of the hippocampus.

    Directory of Open Access Journals (Sweden)

    César Rennó-Costa

    2014-05-01

    Full Text Available The notion of attractor networks is the leading hypothesis for how associative memories are stored and recalled. A defining anatomical feature of such networks is excitatory recurrent connections. These "attract" the firing pattern of the network to a stored pattern, even when the external input is incomplete (pattern completion). The CA3 region of the hippocampus has been postulated to be such an attractor network; however, the experimental evidence has been ambiguous, leading to the suggestion that CA3 is not an attractor network. In order to resolve this controversy and to better understand how CA3 functions, we simulated CA3 and its input structures. In our simulation, we could reproduce critical experimental results and establish the criteria for identifying attractor properties. Notably, under conditions in which there is continuous input, the output should be "attracted" to a stored pattern. However, contrary to previous expectations, as a pattern is gradually "morphed" from one stored pattern to another, a sharp transition between output patterns is not expected. The observed firing patterns of CA3 meet these criteria and can be quantitatively accounted for by our model. Notably, as morphing proceeds, the activity pattern in the dentate gyrus changes; in contrast, the activity pattern in the downstream CA3 network is attracted to a stored pattern and thus undergoes little change. We furthermore show that other aspects of the observed firing patterns can be explained by learning that occurs during behavioral testing. The CA3 thus displays both the learning and recall signatures of an attractor network. These observations, taken together with existing anatomical and behavioral evidence, make the strong case that CA3 constructs associative memories based on attractor dynamics.
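
    The attractor behaviour discussed here can be illustrated with the classic Hopfield-style network, a deliberately simplified stand-in for the authors' CA3 model: Hebbian weights store patterns, and iterating the threshold update "attracts" a degraded cue back to the nearest stored pattern (pattern completion). Note that for an autonomous network of this kind, morphing the cue between two stored patterns does produce a sharp output transition; the paper's point is that continuous dentate input changes this expectation.

```python
# Minimal Hopfield-style attractor network (illustrative, not the authors'
# CA3 simulation): Hebbian storage of +/-1 patterns and iterated recall.

def train(patterns):
    # Hebbian outer-product weights, zero diagonal.
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=5):
    # Sequential threshold updates pull the state toward a stored pattern.
    n = len(state)
    s = list(state)
    for _ in range(steps):
        for i in range(n):
            h = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s
```

    With two orthogonal 32-unit patterns stored, a cue with a few flipped bits is completed back to the original pattern, and a cue morphed most of the way toward the second pattern snaps to that one instead.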

  14. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, identifying limitations and vulnerabilities along the way. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  15. Genome signature analysis of thermal virus metagenomes reveals Archaea and thermophilic signatures

    Directory of Open Access Journals (Sweden)

    Pride David T

    2008-09-01

    Full Text Available Abstract Background Metagenomic analysis provides a rich source of biological information for otherwise intractable viral communities. However, study of viral metagenomes has been hampered by its nearly complete reliance on BLAST algorithms for identification of DNA sequences. We sought to develop algorithms for examination of viral metagenomes to identify the origin of sequences independent of BLAST algorithms. We chose viral metagenomes obtained from two hot springs, Bear Paw and Octopus, in Yellowstone National Park, as they represent simple microbial populations where comparatively large contigs were obtained. Thermal spring metagenomes have high proportions of sequences without significant Genbank homology, which has hampered identification of viruses and their linkage with hosts. To analyze each metagenome, we developed a method to classify DNA fragments using genome signature-based phylogenetic classification (GSPC, where metagenomic fragments are compared to a database of oligonucleotide signatures for all previously sequenced Bacteria, Archaea, and viruses. Results From both Bear Paw and Octopus hot springs, each assembled contig had more similarity to other metagenome contigs than to any sequenced microbial genome based on GSPC analysis, suggesting a genome signature common to each of these extreme environments. While viral metagenomes from Bear Paw and Octopus share some similarity, the genome signatures from each locale are largely unique. GSPC using a microbial database predicts most of the Octopus metagenome has archaeal signatures, while bacterial signatures predominate in Bear Paw; a finding consistent with those of Genbank BLAST. When using a viral database, the majority of the Octopus metagenome is predicted to belong to archaeal virus Families Globuloviridae and Fuselloviridae, while none of the Bear Paw metagenome is predicted to belong to archaeal viruses. 
As expected, when microbial and viral databases are combined, each of

  16. Dosimetric and qualitative analysis of kinetic properties of millennium 80 multileaf collimator system for dynamic intensity modulated radiotherapy treatments

    Directory of Open Access Journals (Sweden)

    Bhardwaj Anup

    2007-01-01

    Full Text Available The aim of this paper is to analyze the positional accuracy and kinetic properties of the dynamic multileaf collimator (MLC) and the dosimetric evaluation of fractional dose delivery for intensity modulated radiotherapy (IMRT) for step-and-shoot and sliding window (dynamic) techniques of the Varian multileaf collimator Millennium 80. Various quality assurance tests such as accuracy in leaf positioning and speed, stability of dynamic MLC output, inter- and intra-leaf transmission, dosimetric leaf separation and multiple carriage field verification were performed. Evaluation of standard field patterns such as pyramid, peaks, wedge, chair, garden fence test, picket fence test and sweeping gap output was done. The patient dose quality assurance procedure consists of an absolute dose measurement for all fields at 5 cm depth on a solid water phantom using a 0.6 cc waterproof ion chamber and relative dose verification using Kodak EDR-2 films for all treatment fields along the transverse and coronal directions using an IMRT phantom. The relative dose verification was performed using Omni Pro IMRT film verification software. The tests performed showed acceptable results for commissioning the Millennium 80 MLC and Clinac DHX for dynamic and step-and-shoot IMRT treatments.

  17. 48 CFR 204.101 - Contracting officer's signature.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Contracting officer's signature. 204.101 Section 204.101 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS... officer's signature. Follow the procedures at PGI 204.101 for signature of contract documents. [71 FR 9268...

  18. Dosimetric parameters of enhanced dynamic wedge for treatment planning and verification

    International Nuclear Information System (INIS)

    Leavitt, Dennis D.; Lee, Wing Lok; Gaffney, David K.

    1996-01-01

    Purpose/Objective: Enhanced Dynamic Wedge (EDW) is an intensity-modulated radiotherapy technique in which one collimating jaw sweeps across the field to define a desired wedge dose distribution while dose rate is modified according to jaw position. This tool enables discrete or continuous wedge angles from zero to sixty degrees for field widths from three cm to 30 cm in the direction of the wedge, and up to 40 cm perpendicular to the wedge direction. Additionally, asymmetric wedge fields not centered on the line through isocenter can be created for applications such as tangential breast irradiation. The unique range of field shapes and wedge angles introduces a new set of dosimetric challenges to be resolved before routine clinical use of EDW, and especially requires that a simple set of independent dose calculation and verification techniques be developed to check computerized treatment planning results. Using terminology in common use in treatment planning, this work defines the effective wedge factor vs. field width and wedge angle, evaluates the depth dose vs. open field values, defines primary intensity functions from which specific dynamic wedges can be calculated in treatment planning systems, and describes the technique for independent calculation of Monitor Units for EDW fields. Materials and Methods: Using 6- and 18-MV beams from a Clinac 2100C, EDW beam profiles were measured in a water phantom for depths from near-surface to 30 cm for the full range of field widths and wedge angles using a linear detector array of 25 energy-compensated diodes. Asymmetric wedge field profiles were likewise measured. Depth doses were measured in a water phantom using an ionization chamber sequentially positioned to depths of 30 cm. Effective wedge factors for the full range of field widths and wedge angles were measured using an ionization chamber in water-equivalent plastic at a depth of 10 cm on central axis.
Dose profiles were calculated by computer as the summation of a series
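
    An independent monitor-unit check of the kind called for here typically reduces to a point-dose product formalism. The sketch below is a generic illustration only: the factor names and the simple product form are assumptions for demonstration, not the authors' exact formalism, which would include additional field-size and geometry factors.

```python
def monitor_units(dose_cgy, output_cgy_per_mu, pdd, effective_wedge_factor):
    # Generic point-dose MU check for a wedged field (illustrative):
    #   MU = prescribed dose / (calibration output * PDD * wedge factor).
    # All parameter names are assumed for this sketch.
    return dose_cgy / (output_cgy_per_mu * pdd * effective_wedge_factor)
```

    For example, 200 cGy prescribed with a 1 cGy/MU calibration, a percent depth dose of 0.8 and an effective wedge factor of 0.5 requires 500 MU; the planning system's value can be compared against this hand calculation.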

  19. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  20. Verification of the Wind Response of a Stack Structure

    Directory of Open Access Journals (Sweden)

    D. Makovička

    2003-01-01

    Full Text Available This paper deals with verification analysis of the wind response of a power plant stack structure. Over a period of two weeks, the actual history of the dynamic response of the structure and the direction and intensity of the actual wind load were measured, recorded and processed with the use of a computer. The resulting data were used to verify the design-stage data of the structure, in particular the natural frequencies and modes assumed by the design and the dominant effect of other sources on the site. In conclusion, the standard requirements are compared with the actual results of the measurements and their extrapolation to the design load.

  1. Three plasma metabolite signatures for diagnosing high altitude pulmonary edema

    Science.gov (United States)

    Guo, Li; Tan, Guangguo; Liu, Ping; Li, Huijie; Tang, Lulu; Huang, Lan; Ren, Qian

    2015-10-01

    High-altitude pulmonary edema (HAPE) is a potentially fatal condition, occurring at altitudes greater than 3,000 m and affecting rapidly ascending, non-acclimatized healthy individuals. However, the lack of biomarkers for this disease still constitutes a bottleneck in clinical diagnosis. Here, ultra-high performance liquid chromatography coupled with Q-TOF mass spectrometry was applied to profile plasma metabolites from 57 HAPE and 57 control subjects. Fourteen differential plasma metabolites responsible for the discrimination between the two groups were identified in the discovery set (35 HAPE subjects and 35 healthy controls). Furthermore, 3 of the 14 metabolites (C8-ceramide, sphingosine and glutamine) were selected as candidate diagnostic biomarkers for HAPE using metabolic pathway impact analysis. The feasibility of using the combination of these three biomarkers for HAPE diagnosis was evaluated, where the area under the receiver operating characteristic curve (AUC) was 0.981 and 0.942 in the discovery set and the validation set (22 HAPE subjects and 22 healthy controls), respectively. Taken together, these results suggest that this composite plasma metabolite signature may be used in HAPE diagnosis, especially after further investigation and verification with larger samples.
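
    The reported AUC values follow from the rank-statistic definition of the area under the ROC curve: the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch with made-up scores (not the study's data):

```python
def auc(scores_pos, scores_neg):
    # Rank-based AUC (Mann-Whitney statistic): fraction of case/control
    # pairs in which the case scores higher, counting ties as one half.
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

    A composite biomarker score (for example, one combining the three metabolites) would be evaluated by feeding its values for cases and controls into this function; perfect separation gives 1.0 and chance performance gives 0.5.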

  2. STAR-CCM+ Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-30

    The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light water reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.

  3. Can specific transcriptional regulators assemble a universal cancer signature?

    Science.gov (United States)

    Roy, Janine; Isik, Zerrin; Pilarsky, Christian; Schroeder, Michael

    2013-10-01

    Recently, there has been a lot of interest in using biomarker signatures derived from gene expression data to predict cancer progression. We assembled signatures from 25 published datasets covering 13 types of cancer. How do these signatures compare with each other? On the one hand, signatures answering the same biological question should overlap, whereas signatures predicting different cancer types should differ. On the other hand, there could also be a Universal Cancer Signature that is predictive independently of the cancer type. Initially, we generate signatures for all datasets using classical approaches such as the t-test and fold change; then we explore signatures resulting from a network-based method that applies the random surfer model of Google's PageRank algorithm. We show that the signatures as published by the authors and the signatures generated with classical methods do not overlap - not even for the same cancer type - whereas the network-based signatures strongly overlap. Selecting 10 out of 37 universal cancer genes gives the optimal prediction for all cancers, thus taking a first step towards a Universal Cancer Signature. We furthermore analyze and discuss the involved genes in terms of the Hallmarks of Cancer and in particular single out SP1, JUN/FOS and NFKB1, examining their specific role in cancer progression.
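
    The network-based method rests on the random-surfer model behind Google's PageRank, with the restart distribution biased toward differentially expressed genes. The sketch below is a minimal personalized-PageRank power iteration on a made-up three-gene network, an illustration of the principle rather than the authors' pipeline:

```python
def pagerank(graph, personalization, damping=0.85, iters=100):
    # Personalized PageRank by power iteration. `graph` maps each node to
    # its out-neighbors; `personalization` biases the restart distribution
    # (here: toward differentially expressed genes).
    nodes = list(graph)
    total = sum(personalization.get(n, 0.0) for n in nodes)
    restart = {n: personalization.get(n, 0.0) / total for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) * restart[n] for n in nodes}
        for n in nodes:
            out = graph[n]
            if out:
                share = damping * rank[n] / len(out)
                for m in out:
                    new[m] += share
            else:
                # Dangling node: redistribute its mass via the restart vector.
                for m in nodes:
                    new[m] += damping * rank[n] * restart[m]
        rank = new
    return rank
```

    Genes that are both differentially expressed and well connected to other implicated genes accumulate the highest scores; ranking by score and taking the top genes yields the network-based signature.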

  4. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  5. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  6. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification, or to evaluate, using independent software, the performance of specific functions in the code. Here, we will describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We will also discuss some of the lessons we learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  7. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  8. 36 CFR 1150.22 - Signature of documents.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Signature of documents. 1150.22 Section 1150.22 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS... Documents for Proceedings on Citations § 1150.22 Signature of documents. The signature of a party...

  9. Universal Nonequilibrium Signatures of Majorana Zero Modes in Quench Dynamics

    Directory of Open Access Journals (Sweden)

    R. Vasseur

    2014-10-01

    Full Text Available The quantum evolution that occurs after a metallic lead is suddenly connected to an electron system contains information about the excitation spectrum of the combined system. We exploit this type of “quantum quench” to probe the presence of Majorana fermions at the ends of a topological superconducting wire. We obtain an algebraically decaying overlap (Loschmidt echo) L(t) = |⟨ψ(0)|ψ(t)⟩|^{2} ∼ t^{-α} for large times after the quench, with a universal critical exponent α = 1/4 that is found to be remarkably robust against details of the setup, such as interactions in the normal lead, the existence of additional lead channels, or the presence of bound levels between the lead and the superconductor. As in recent quantum-dot experiments, this exponent could be measured by optical absorption, offering a new signature of Majorana zero modes that is distinct from interferometry and tunneling spectroscopy.

  10. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    Science.gov (United States)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.

  11. CPN Tools-Assisted Simulation and Verification of Nested Petri Nets

    Directory of Open Access Journals (Sweden)

    L. W. Dworzański

    2012-01-01

    Full Text Available Nested Petri nets (NP-nets) are an extension of the Petri net formalism within the “nets-within-nets” approach, in which the tokens in a marking are themselves Petri nets that have autonomous behavior and are synchronized with the system net. The formalism of NP-nets allows modeling multi-level multi-agent systems with dynamic structure in a natural way. Currently, there is no tool supporting NP-net simulation and analysis. The paper proposes a translation of NP-nets into Colored Petri nets and the use of CPN Tools as a virtual machine for NP-net modeling, simulation and automatic verification.

  12. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and of the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d′, and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is used to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as the output of the learned model, and a multi-layer neural network is utilized to verify kinship accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  13. Verificación de firma y gráficos manuscritos: características discriminantes y nuevos escenarios de aplicación biométrica

    OpenAIRE

    Martínez Díaz, Marcos

    2016-01-01

    Unpublished doctoral thesis, defended at the Escuela Politécnica Superior, Departamento de Tecnología Electrónica y de las Comunicaciones. Defense date: February 2015. The proliferation of handheld devices such as smartphones and tablets brings a new scenario for biometric authentication, and in particular for automatic signature verification. Research on signature verification has traditionally been carried out using signatures acquired on digitizing tablets or Tablet-PCs. This PhD Th...

  14. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined, namely: the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  15. 17 CFR 201.65 - Identity and signature.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false Identity and signature. 201.65... of 1934 § 201.65 Identity and signature. Applications pursuant to this subpart may omit the identity, mailing address, and signature of the applicant; provided, that such identity, mailing address and...

  16. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  17. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  18. 15 CFR 908.16 - Signature.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Signature. 908.16 Section 908.16 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National...

  19. Fast film dosimetry calibration method for IMRT treatment plan verification

    International Nuclear Information System (INIS)

    Schwob, N.; Wygoda, A.

    2004-01-01

    Intensity-Modulated Radiation Therapy (IMRT) treatments are delivered dynamically and therefore require routinely performed verification measurements [1]. Radiographic film dosimetry is a well-adapted method for integral measurements of dynamic treatment fields, with some drawbacks related to the known problems of dose calibration of films. Classically, several films are exposed to increasing doses, and a Net Optical Density (N.O.D.) vs. dose sensitometric curve (S.C.) is generated. To speed up the process, some authors have developed a method based on the irradiation of a single film with a non-uniform pattern of O.D. delivered with a dynamic MLC. However, this curve still needs to be calibrated to dose by means of measurements in a water phantom. It is recommended to make a new calibration for every series of measurements, in order to avoid the dependence of film response on processing quality. These frequent measurements are very time consuming. We developed a simple method for quick dose calibration of films, including a check of the accuracy of the calibration curve obtained.
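    The last step of any such calibration, converting a measured net optical density back to dose through the sensitometric curve, can be sketched as a table lookup with linear interpolation. The calibration points below are made up for illustration; a real curve comes from measured film data:

```python
def od_to_dose(od, od_table, dose_table):
    """Convert a measured net optical density to dose (cGy) by linear
    interpolation in a monotone calibration (sensitometric) table."""
    if not od_table[0] <= od <= od_table[-1]:
        raise ValueError("OD outside calibrated range")
    for (od0, od1), (d0, d1) in zip(zip(od_table, od_table[1:]),
                                    zip(dose_table, dose_table[1:])):
        if od0 <= od <= od1:
            frac = (od - od0) / (od1 - od0)
            return d0 + frac * (d1 - d0)

# Hypothetical calibration points: net OD vs. delivered dose (cGy)
od_table = [0.00, 0.12, 0.22, 0.40, 0.55, 0.68]
dose_table = [0.0, 50.0, 100.0, 200.0, 300.0, 400.0]

print(round(od_to_dose(0.31, od_table, dose_table), 1))  # 150.0
```

    In practice a smooth fit (e.g. a low-order polynomial in net OD) is often preferred over piecewise interpolation, but the lookup above shows the direction of the mapping.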

  20. 34 CFR 101.32 - Signature of documents.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Signature of documents. 101.32 Section 101.32 Education Regulations of the Offices of the Department of Education OFFICE FOR CIVIL RIGHTS, DEPARTMENT OF EDUCATION... Documents § 101.32 Signature of documents. The signature of a party, authorized officer, employee or...

  1. 29 CFR 102.116 - Signature of orders.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 2 2010-07-01 2010-07-01 false Signature of orders. 102.116 Section 102.116 Labor Regulations Relating to Labor NATIONAL LABOR RELATIONS BOARD RULES AND REGULATIONS, SERIES 8 Certification and Signature of Documents § 102.116 Signature of orders. The executive secretary or the associate executive...

  2. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  3. On signature change in p-adic space-times

    International Nuclear Information System (INIS)

    Dragovic, B.G.

    1991-01-01

    Change of signature by linear coordinate transformations in p-adic space-times is considered. In this paper it is shown that an arbitrary change of the trivial signature in Q_p^n is possible for all n ≥ 1 if p ≡ 1 (mod 4). In other cases it is possible to change only an even number of the signs of the signature. The authors suggest a new concept of signature with respect to distinct quadratic extensions of Q_p. If the space-time dimension is restricted to four, there is no signature change.
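    The p ≡ 1 (mod 4) case rests on a standard fact about p-adic squares, which can be made explicit (a sketch of the underlying step, not the paper's full argument):

```latex
% For an odd prime $p \equiv 1 \pmod 4$, $-1$ is a quadratic residue mod $p$
% (Euler's criterion), and by Hensel's lemma it lifts to a square in $\mathbb{Q}_p$:
% there exists $u \in \mathbb{Q}_p$ with $u^2 = -1$. The linear coordinate change
% $x_i \mapsto u\,x_i$ then flips a single sign of the quadratic form:
\[
  \eta_{ii}\,x_i^2 \;\longmapsto\; \eta_{ii}\,(u x_i)^2 \;=\; -\,\eta_{ii}\,x_i^2 ,
\]
% so any sign pattern is reachable from the trivial signature. When $-1$ is not a
% square in $\mathbb{Q}_p$ (e.g. $p \equiv 3 \pmod 4$), such single-sign flips are
% unavailable, consistent with the abstract's statement that only an even number
% of signs can then be changed.
```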

  4. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  5. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium the soil cover on top of the rock; the flows in mind are hence groundwater flows. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent development. The present report focuses on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases are reported in Appendix A (verification cases) and Appendix B (validation cases).

  6. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
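    Two of the listed verification items reduce to simple statistics. A hedged sketch (the carryover formula below follows a common three-high/three-low protocol; check it against your laboratory's own guideline before use, and the counts are invented):

```python
import statistics

def cv_percent(replicates):
    """Within-run precision expressed as coefficient of variation (%)."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def carryover_percent(high_runs, low_runs):
    """Carryover from three consecutive high-sample runs followed by three
    low-sample runs: (low1 - low3) / (high3 - low3) * 100."""
    return 100.0 * (low_runs[0] - low_runs[2]) / (high_runs[2] - low_runs[2])

# Illustrative WBC counts (x10^9/L)
print(round(cv_percent([6.1, 6.3, 6.2, 6.0, 6.2]), 2))
print(round(carryover_percent([98.0, 97.5, 98.2], [2.6, 2.5, 2.5]), 2))
```

    The verification limit each value is compared against (e.g. a maximum allowed CV per parameter) remains, as the abstract notes, at the discretion of the laboratory specialist.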

  7. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on the top of the rock; it is hence groundwater flows, which is the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups, thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)

  8. Public Auditing and Data Dynamics in Cloud with Performance Assessment on Third Party Auditor

    DEFF Research Database (Denmark)

    P. Sawant, Snehal; Deshmukh, Aaradhana A.; Mihovska, Albena Dimitrova

    2016-01-01

    presented in this paper uses the concept of an external Third Party Auditor (TPA). TPA is an external party who is going to perform integrity verification of the user’s data on behalf of the user. The proposed scheme assures integrity verification with a dynamic data support, to ensure that changes made...

  9. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...
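    The verification mechanism being incorporated can be shown in miniature. For a nonnegative signal measured through a binary (LDPC-like) matrix, two classic rules apply: a zero measurement verifies all of its neighbors as zero, and a check with a single unverified neighbor determines that neighbor. A toy sketch of these rules (not the Interval-Passing algorithm itself):

```python
def verify_decode(A, y):
    """Recover a nonnegative sparse x from y = A x (A binary) using the
    zero-check and degree-one-check verification rules, iterated to a
    fixed point. Unrecovered entries stay None."""
    n = len(A[0])
    x = [None] * n
    changed = True
    while changed:
        changed = False
        for row, yi in zip(A, y):
            unknown = [j for j in range(n) if row[j] and x[j] is None]
            if not unknown:
                continue
            if yi == 0:               # zero measurement: all neighbors are 0
                for j in unknown:
                    x[j] = 0
                changed = True
            elif len(unknown) == 1:   # one unknown left: solve for it
                j = unknown[0]
                x[j] = yi - sum(x[k] for k in range(n) if row[k] and k != j)
                changed = True
    return x

A = [[1, 1, 0, 0],
     [0, 1, 1, 0],
     [0, 0, 1, 1],
     [1, 0, 0, 1]]
x_true = [0, 0, 3, 0]
y = [sum(a * b for a, b in zip(row, x_true)) for row in A]
print(verify_decode(A, y))  # [0, 0, 3, 0]
```

    The zero-check rule is what makes nonnegativity essential: a zero measurement can only arise if every participating signal entry is zero.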

  10. Signature Curves Statistics of DNA Supercoils

    OpenAIRE

    Shakiban, Cheri; Lloyd, Peter

    2004-01-01

    In this paper we describe the Euclidean signature curves for two dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...
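    A discrete signature computation for a planar closed curve starts from curvature estimates at sample points. A minimal sketch using the circumscribed circle of consecutive point triples (an illustrative discretization, not necessarily the authors' exact scheme):

```python
import math

def discrete_curvature(pts):
    """Estimate curvature at each vertex of a closed polyline from the
    circumscribed circle of (prev, current, next): kappa = 4*Area/(a*b*c)."""
    n = len(pts)
    kappas = []
    for i in range(n):
        p0, p1, p2 = pts[i - 1], pts[i], pts[(i + 1) % n]
        a = math.dist(p0, p1)
        b = math.dist(p1, p2)
        c = math.dist(p2, p0)
        # Twice the triangle area via the cross product
        area2 = abs((p1[0] - p0[0]) * (p2[1] - p0[1])
                    - (p2[0] - p0[0]) * (p1[1] - p0[1]))
        kappas.append(2.0 * area2 / (a * b * c))
    return kappas

# Sanity check: points on a circle of radius 2 give curvature ~ 1/2 everywhere
pts = [(2 * math.cos(2 * math.pi * k / 60), 2 * math.sin(2 * math.pi * k / 60))
       for k in range(60)]
ks = discrete_curvature(pts)
print(round(ks[0], 3))  # 0.5
```

    The Euclidean signature curve then plots curvature against its arc-length derivative, making the signature invariant under rotations and translations of the original curve.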

  11. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software must be compiled specifically for the machine on which it is to be used. Therefore, to ease the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison of the new output files against the old output files. Any difference between the files causes a verification error. Because of the manner in which the verification is performed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
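    The install-time check described, comparing fresh sample-problem output against stored reference output, can be sketched generically (hypothetical file layout and naming; the real package's comparison logic may differ):

```python
import difflib
from pathlib import Path

def verify_outputs(new_dir, ref_dir, pattern="*.out"):
    """Compare each newly generated output file against its stored reference.
    Returns a dict mapping file name to a list of differing lines; an empty
    list means the file matched exactly."""
    report = {}
    for ref in sorted(Path(ref_dir).glob(pattern)):
        new = Path(new_dir) / ref.name
        diff = list(difflib.unified_diff(
            ref.read_text().splitlines(),
            new.read_text().splitlines(),
            fromfile=str(ref), tofile=str(new), lineterm=""))
        report[ref.name] = diff
    return report

# A run "verifies" when every sample problem's diff list is empty; a non-empty
# diff flags the file for a closer look (the difference may be benign, e.g. a
# timestamp or run-date line, which matches the caveat in the abstract).
```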

  12. The prognostic value of temporal in vitro and in vivo derived hypoxia gene-expression signatures in breast cancer

    International Nuclear Information System (INIS)

    Starmans, Maud H.W.; Chu, Kenneth C.; Haider, Syed; Nguyen, Francis; Seigneuric, Renaud; Magagnin, Michael G.; Koritzinsky, Marianne; Kasprzyk, Arek; Boutros, Paul C.; Wouters, Bradly G.

    2012-01-01

    Background and purpose: Recent data suggest that in vitro and in vivo derived hypoxia gene-expression signatures have prognostic power in breast and possibly other cancers. However, both tumour hypoxia and the biological adaptation to this stress are highly dynamic. Assessment of time-dependent gene-expression changes in response to hypoxia may thus provide additional biological insights and assist in predicting the impact of hypoxia on patient prognosis. Materials and methods: Transcriptome profiling was performed for three cell lines derived from diverse tumour-types after hypoxic exposure at eight time-points, which include a normoxic time-point. Time-dependent sets of co-regulated genes were identified from these data. Subsequently, gene ontology (GO) and pathway analyses were performed. The prognostic power of these novel signatures was assessed in parallel with previous in vitro and in vivo derived hypoxia signatures in a large breast cancer microarray meta-dataset (n = 2312). Results: We identified seven recurrent temporal and two general hypoxia signatures. GO and pathway analyses revealed regulation of both common and unique underlying biological processes within these signatures. None of the new or previously published in vitro signatures consisting of hypoxia-induced genes were prognostic in the large breast cancer dataset. In contrast, signatures of repressed genes, as well as the in vivo derived signatures of hypoxia-induced genes showed clear prognostic power. Conclusions: Only a subset of hypoxia-induced genes in vitro demonstrates prognostic value when evaluated in a large clinical dataset. Despite clear evidence of temporal patterns of gene-expression in vitro, the subset of prognostic hypoxia regulated genes cannot be identified based on temporal pattern alone. In vivo derived signatures appear to identify the prognostic hypoxia induced genes. The prognostic value of hypoxia-repressed genes is likely a surrogate for the known importance of

  13. 45 CFR 81.32 - Signature of documents.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Signature of documents. 81.32 Section 81.32 Public... UNDER PART 80 OF THIS TITLE Form, Execution, Service and Filing of Documents § 81.32 Signature of documents. The signature of a party, authorized officer, employee or attorney constitutes a certificate that...

  14. Technology verification phase. Dynamic isotope power system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  15. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    Halsey, D.G.

    1982-01-01

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance

  16. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 
The

  17. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 
The

  18. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  19. Signature molecular descriptor: advanced applications.

    Energy Technology Data Exchange (ETDEWEB)

    Visco, Donald Patrick, Jr. (Tennessee Technological University, Cookeville, TN)

    2010-04-01

    In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in high-throughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Also notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area, and serendipity was the mechanism for solution. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge into a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the molecule which is designed might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest on a given problem as well as how to go from an output of the algorithm to an actual chemical structure. This report

  20. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated-circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan, setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment, so that non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces UVM briefly and presents a set of tips and advice applicable at different stages of the verification process cycle.

  1. Ship Signature Management System : Functionality

    NARCIS (Netherlands)

    Arciszewski, H.F.R.; Lier, L. van; Meijer, Y.G.S.; Noordkamp, H.W.; Wassenaar, A.S.

    2010-01-01

    A signature of a platform is the manner in which the platform manifests itself to a certain type of sensor and how observable it is when such a sensor is used to detect the platform. Because many military platforms use sensors in different media, it is the total of its different signatures that

  2. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  3. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity-modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, of the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (multilamellar intensity-modulated collimator), and of the overall process of verifying the plans created. The aim of verification is in particular good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  4. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling-analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate verification and validation of on-board satellite software. The process is implemented in the software control unit of the energetic particle detector, a payload of the Solar Orbiter mission. From the design model, a scheduling-analysis model and its verification model are generated. The verification conditions are defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
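The constraint-checking step — feeding timestamped evidence from instrumented points into scheduling constraints — can be sketched as a simple trace checker. The trace format, task names, and numeric bounds below are invented for illustration; the paper's actual toolchain is model-driven and expresses constraints as timed automata rather than this hand-written check.

```python
# Execution evidence: (task, activation_time, completion_time) in milliseconds.
trace = [
    ("acquire", 0.0, 1.8),
    ("process", 2.0, 6.5),
    ("acquire", 10.0, 11.9),
    ("process", 12.0, 18.5),
]

# Constraints derived from the scheduling model: worst-case response time per
# task, and a minimum inter-arrival time between activations of a task.
wcrt = {"acquire": 2.0, "process": 6.0}
min_period = {"acquire": 10.0}

def verify(trace, wcrt, min_period):
    violations = []
    last_activation = {}
    for task, start, end in trace:
        if end - start > wcrt.get(task, float("inf")):
            violations.append((task, "response time", end - start))
        prev = last_activation.get(task)
        if prev is not None and start - prev < min_period.get(task, 0.0):
            violations.append((task, "inter-arrival", start - prev))
        last_activation[task] = start
    return violations

print(verify(trace, wcrt, min_period))  # → [('process', 'response time', 6.5)]
```

An empty violation list would mean the on-target evidence is consistent with the scheduling analysis; here the second `process` job overruns its assumed worst-case response time, invalidating the analysis.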

  5. Radiochromic film in the dosimetric verification of intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Zhou Yingjuan; Huang Shaomin; Deng Xiaowu

    2007-01-01

    Objective: To investigate the dose-response behavior of a new type of radiochromic film (GAFCHROMIC EBT) and explore its clinical application and the precision of dose measurement, in order to: (1) perform plan-specific dosimetric verification for intensity-modulated radiation therapy (IMRT); (2) simplify the quality assurance process of the traditional radiographic film dosimetry system; and (3) establish a more reliable, more efficient dosimetric verification system for IMRT. Methods: (1) The step-wedge calibration technique was used to calibrate EBT radiochromic film and EDR2 radiographic film, and the dose characteristics, measurement consistency and quality assurance process of the two methods were compared. (2) An in-phantom, dose-measurement based verification technique was adopted: EBT film and EDR2 film were each used to measure the same dose plane of IMRT treatment plans, and the resulting dose maps, dose profiles and isodose curves were compared with those calculated by the CORVUS treatment planning system to evaluate EBT film for IMRT dosimetric verification. Results: (1) Over the external-beam dose range of 0-500 cGy, the EBT/VXR-16 and EDR2/VXR-16 film dosimetry systems showed comparable measurement consistency, with measurement variability less than 0.70%; the mean measurement variability of the two systems was 0.37% and 0.68%, respectively. The former proved superior in measurement consistency, reliability and efficiency over the dynamic clinical dose range, and required a simpler quality assurance process than the latter. (2) The dosimetric verification of IMRT plans measured with EBT film was quite similar to that with EDR2 film processed under strict quality control. 
In a plane of the phantom, the maximal dose deviation off axis between EBT film measurement and the TPS calculation was

  6. Integrated verification and testing system (IVTS) for HAL/S programs

    Science.gov (United States)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  7. Cell short circuit, preshort signature

    Science.gov (United States)

    Lurie, C.

    1980-01-01

    Short-circuit events observed in ground test simulations of DSCS-3 battery in-orbit operations are analyzed. Voltage signatures appearing in the data preceding the short-circuit event are evaluated. The ground test simulation is briefly described along with performance during reconditioning discharges. Results suggest that a characteristic signature develops prior to a shorting event.

  8. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  9. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Verification is the task of checking whether a given quantum state is close to an ideal state. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not assume that independent and identically distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of quantum computational supremacy demonstrations with IQP models, and verifiable blind quantum computing.

  10. EPID-based verification of the MLC performance for dynamic IMRT and VMAT

    International Nuclear Information System (INIS)

    Rowshanfarzad, Pejman; Sabet, Mahsheed; Barnes, Michael P.; O’Connor, Daryl J.; Greer, Peter B.

    2012-01-01

    Purpose: In advanced radiotherapy treatments such as intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), verification of the performance of the multileaf collimator (MLC) is an essential part of the linac QA program. The purpose of this study is to use the existing measurement methods for geometric QA of the MLCs and extend them to more comprehensive evaluation techniques, and to develop dedicated robust algorithms to quantitatively investigate the MLC performance in a fast, accurate, and efficient manner. Methods: The behavior of leaves was investigated in the step-and-shoot mode by the analysis of integrated electronic portal imaging device (EPID) images acquired during picket fence tests at fixed gantry angles and arc delivery. The MLC was also studied in dynamic mode by the analysis of cine EPID images of a sliding gap pattern delivered in a variety of conditions including different leaf speeds, deliveries at fixed gantry angles or in arc mode, and changing the direction of leaf motion. The accuracy of the method was tested by detection of the intentionally inserted errors in the delivery patterns. Results: The algorithm developed for the picket fence analysis was able to find each individual leaf position, gap width, and leaf bank skewness in addition to the deviations from expected leaf positions with respect to the beam central axis with sub-pixel accuracy. For the three tested linacs over a period of 5 months, the maximum change in the gap width was 0.5 mm, the maximum deviation from the expected leaf positions was 0.1 mm and the MLC skewness was up to 0.2°. The algorithm developed for the sliding gap analysis could determine the velocity and acceleration/deceleration of each individual leaf as well as the gap width. There was a slight decrease in the accuracy of leaf performance with increasing leaf speeds. The analysis results were presented through several graphs. The accuracy of the method was assessed as 0.01 mm
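The sub-pixel leaf-position analysis described above can be hinted at with a toy example: threshold a one-dimensional EPID profile and take the intensity-weighted centroid of each connected run above threshold. The synthetic profile and threshold are invented, and the study's actual algorithms (skewness, gap width, leaf velocity) are considerably more elaborate.

```python
# Hedged sketch: find leaf-gap centers in a 1-D profile with sub-pixel accuracy
# via intensity-weighted centroids of runs above a threshold.
def gap_centers(profile, threshold):
    centers, run = [], []
    for i, v in enumerate(profile + [0.0]):   # sentinel closes the last run
        if v > threshold:
            run.append((i, v))
        elif run:
            total = sum(v for _, v in run)
            centers.append(sum(i * v for i, v in run) / total)
            run = []
    return centers

# Synthetic profile with two bright gaps centered at pixels 3 and 9.
profile = [0, 0, 2, 9, 2, 0, 0, 0, 3, 8, 3, 0, 0]
print(gap_centers(profile, 1.0))  # → [3.0, 9.0]
```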

  11. Formal Verification

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ... scenario under which a similar error could occur. There are various other ...

  12. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office', carrying out verification tests of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. In the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking and dense-graded asphalt pavement when applied to roads. When applied to rubble, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off from the decontamination object, and the sludge could be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less that of the sludge. This result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  13. Interpreting stream sediment fingerprints against primary and secondary source signatures in agricultural catchments

    Science.gov (United States)

    Blake, Will H.; Haley, Steve; Smith, Hugh G.; Taylor, Alex; Goddard, Rupert; Lewin, Sean; Fraser, David

    2013-04-01

    sediment, the potential role of dissolved metal leaching and subsequent sediment-water interaction within the channel on signature modification remained unclear. Consideration of sediment signature modification en route from primary source to stream elucidated important information regarding sediment transfer pathways and dynamics relevant to sediment management decisions. Further work on sediment-water interactions and potential for signature transformation in the channel environment is required.

  14. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  15. COMPUTER-IMPLEMENTED METHOD OF PERFORMING A SEARCH USING SIGNATURES

    DEFF Research Database (Denmark)

    2017-01-01

    A computer-implemented method of processing a query vector and a data vector, comprising: generating a set of masks and a first set of multiple signatures and a second set of multiple signatures by applying the set of masks to the query vector and the data vector, respectively, and generating candidate pairs, of a first signature and a second signature, by identifying matches of a first signature and a second signature. The set of masks comprises a configuration of the elements that is a Hadamard code; a permutation of a Hadamard code; or a code that deviates from a Hadamard code...
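A hedged sketch of the idea: build a mask set from a Sylvester-Hadamard pattern, derive one signature per mask from each vector, and flag a candidate pair whenever any pair of corresponding signatures matches. The mask construction, dimensions, and example vectors below are assumptions for illustration, not the claimed method itself.

```python
DIM = 16

def parity_even(x):
    return bin(x).count("1") % 2 == 0

# Mask i selects coordinate j where the Sylvester-Hadamard entry H[i][j] is +1
# (entry is +1 when popcount(i & j) is even).
masks = [[1 if parity_even(i & j) else 0 for j in range(DIM)]
         for i in range(DIM)]

def signatures(vec):
    # One signature per mask: the selected coordinates, packed into a tuple.
    return [tuple(v for v, m in zip(vec, mask) if m) for mask in masks]

def candidate_pair(query, data):
    return any(q == d for q, d in zip(signatures(query), signatures(data)))

query = [j % 2 for j in range(DIM)]
near = query[:]; near[1] ^= 1            # differs only in coordinate 1
far = [1 - v for v in query]             # differs in every coordinate

print(candidate_pair(query, near), candidate_pair(query, far))  # → True False
```

The `near` vector is caught because some masks exclude the one differing coordinate, while `far` fails every mask: this selective blindness of the masks is what lets signature matches act as a cheap candidate filter.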

  16. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level, at which the visual verification was performed and documented

  17. The Los Alamos Science Pillars The Science of Signatures

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Joshua E. [Los Alamos National Laboratory; Peterson, Eugene J. [Los Alamos National Laboratory

    2012-09-13

    imagination of many LANL staff and managers and resulted in a strategy which focuses on our strengths while recognizing that the science of signatures is dynamic. This report highlights the interdependence between SoS, advances in materials science, and advances in information technology. The intent is that SoS shape and inform Los Alamos investments in nuclear forensics, nuclear diagnostics, climate, space, energy, and biosurveillance; the areas of leadership that you will read about in this strategy document. The Science of Signatures is still a relatively new strategic direction for the Laboratory. The primary purpose of this document is to tell Laboratory staff how SoS is being managed and give them a chance to get involved. A second important purpose is to inform the Department of Energy and our customers of our capability growth in this important scientific area. Questions concerning the SoS strategy and input to it are welcomed and may be directed to any member of the SoS Leadership Council or to the Chemistry, Life, and Earth Science Directorate Office.

  18. Signature effects in 2-qp rotational bands

    International Nuclear Information System (INIS)

    Jain, A.K.; Goel, A.

    1992-01-01

    The authors briefly review the progress in understanding the 2-qp rotational bands in odd-odd nuclei. Signature effects and the phenomenon of signature inversion are discussed. The Coriolis coupling appears to have all the ingredients to explain the inversion. Some recent work on signature dependence in 2-qp bands of even-even nuclei is also discussed; interesting features are pointed out

  19. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  20. Mapping {sup 15}O Production Rate for Proton Therapy Verification

    Energy Technology Data Exchange (ETDEWEB)

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States); Min, Chul Hee [Department of Radiological Science, College of Health Science, Yonsei University, Wonju, Kangwon (Korea, Republic of); Testa, Mauro; Winey, Brian [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts (United States); Normandin, Marc D. [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States); Shih, Helen A.; Paganetti, Harald; Bortfeld, Thomas [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts (United States); El Fakhri, Georges, E-mail: elfakhri@pet.mgh.harvard.edu [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States)

    2015-06-01

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 ({sup 15}O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of {sup 15}O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of {sup 15}O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using {sup 15}O decay constant, whereas the live thigh activity decayed faster. Most importantly, the {sup 15}O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of {sup 15}O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of {sup 15}O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, {sup 15}O clearance rates may be useful in monitoring permeability changes due to therapy.
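The rate-fitting idea — separating ¹⁵O physical decay from perfusion-driven clearance in a time-activity curve — can be sketched with a log-linear least-squares fit on synthetic data. The numbers are fabricated, and the study fitted a differential-equation model to dynamic PET data rather than this simplified mono-exponential.

```python
import math

# 15O physical decay constant (half-life ~122.24 s), in 1/s.
K_DECAY = math.log(2) / 122.24

def fit_rate(times, activities):
    # Least-squares slope of ln(A) vs t for A(t) = A0 * exp(-k t); returns k.
    n = len(times)
    ys = [math.log(a) for a in activities]
    xbar, ybar = sum(times) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(times, ys))
             / sum((x - xbar) ** 2 for x in times))
    return -slope

times = [0, 30, 60, 90, 120, 150]
# "Dead tissue": pure physical decay; "live tissue": decay plus clearance.
k_perf = 0.002
dead = [100 * math.exp(-K_DECAY * t) for t in times]
live = [100 * math.exp(-(K_DECAY + k_perf) * t) for t in times]

print(fit_rate(times, dead), fit_rate(times, live))
```

On noiseless data the fitted rate for the dead-tissue curve recovers the physical decay constant, while the live-tissue curve decays faster by the perfusion term — the same live-versus-dead contrast the study exploited.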

  1. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  2. Time-dependent delayed signatures from energetic photon interrogations

    International Nuclear Information System (INIS)

    Norman, Daren R.; Jones, James L.; Blackburn, Brandon W.; Haskell, Kevin J.; Johnson, James T.; Watson, Scott M.; Hunt, Alan W.; Spaulding, Randy; Harmon, Frank

    2007-01-01

    Pulsed photonuclear interrogation environments generated by an 8-24 MeV electron linac are rich with time-dependent, material-specific radiation signatures. Nitrogen-based explosives and nuclear materials can be detected by exploiting these signatures in different delayed-time regions. Numerical and experimental results presented in this paper show the unique time and energy dependence of these signatures. It is shown that appropriate delayed-time windows are essential to acquire material-specific signatures in pulsed photonuclear assessment environments. These developments demonstrate that pulsed, high-energy, photon-inspection environments can be exploited for time-dependent, material-specific signatures through the proper operation of specialized detectors and detection methods

  3. Peripheral blood signatures of lead exposure.

    Directory of Open Access Journals (Sweden)

    Heather G LaBreche

    BACKGROUND: Current evidence indicates that even low-level lead (Pb) exposure can have detrimental effects, especially in children. We tested the hypothesis that Pb exposure alters gene expression patterns in peripheral blood cells and that these changes reflect dose-specific alterations in the activity of particular pathways. METHODOLOGY/PRINCIPAL FINDINGS: Using Affymetrix Mouse Genome 430 2.0 arrays, we examined gene expression changes in the peripheral blood of female Balb/c mice following per os exposure to lead acetate trihydrate or plain drinking water for two weeks and after a two-week recovery period. Data sets were RMA-normalized and dose-specific signatures were generated using established methods of supervised classification and binary regression. Pathway activity was analyzed using the ScoreSignatures module from GenePattern. CONCLUSIONS/SIGNIFICANCE: The low-level Pb signature was 93% sensitive and 100% specific in classifying samples in a leave-one-out cross-validation. The high-level Pb signature demonstrated 100% sensitivity and specificity in the leave-one-out cross-validation. These two signatures exhibited dose specificity in their ability to predict Pb exposure and had little overlap in terms of constituent genes. The signatures also seemed to reflect current rather than past Pb exposure. Finally, the two doses showed differential activation of cellular pathways: low-level Pb exposure increased activity of the interferon-gamma pathway, whereas high-level Pb exposure increased activity of the E2F1 pathway.
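Leave-one-out cross-validation, the evaluation used above to estimate signature sensitivity and specificity, can be illustrated with a toy nearest-centroid classifier. The two-"gene" profiles are fabricated, and the study's actual signatures came from supervised classification and binary regression on microarray data, not this sketch.

```python
def centroid(rows):
    # Component-wise mean of a list of equal-length profiles.
    return [sum(r[i] for r in rows) / len(rows) for i in range(len(rows[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(sample, train):
    # Nearest-centroid classification over the labels present in the training set.
    labels = {lab for _, lab in train}
    cents = {lab: centroid([r for r, l in train if l == lab]) for lab in labels}
    return min(cents, key=lambda lab: dist2(sample, cents[lab]))

samples = [([1.0, 0.1], "exposed"), ([1.2, 0.0], "exposed"), ([0.9, 0.2], "exposed"),
           ([0.1, 1.1], "control"), ([0.0, 0.9], "control"), ([0.2, 1.0], "control")]

# Leave-one-out: train on all samples but one, predict the held-out sample.
correct = sum(predict(s, samples[:i] + samples[i + 1:]) == lab
              for i, (s, lab) in enumerate(samples))
accuracy = correct / len(samples)
print(f"LOOCV accuracy: {accuracy:.2f}")  # → LOOCV accuracy: 1.00
```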

  4. Magnetic signature surveillance of nuclear fuel

    International Nuclear Information System (INIS)

    Bernatowicz, H.; Schoenig, F.C.

    1981-01-01

    Typical nuclear fuel material contains tramp ferromagnetic particles of random size and distribution. Also, selected amounts of paramagnetic or ferromagnetic material can be added at random or at known positions in the fuel material. The fuel material in its non-magnetic container is scanned along its length by magnetic susceptibility detecting apparatus whereby susceptibility changes along its length are obtained and provide a unique signal waveform of the container of fuel material as a signature thereof. The output signature is stored. At subsequent times in its life the container is again scanned and respective signatures obtained which are compared with the initially obtained signature, any differences indicating alteration or tampering with the fuel material. If the fuel material includes a paramagnetic additive by taking two measurements along the container the effects thereof can be cancelled out. (author)
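The surveillance logic — store a baseline susceptibility waveform as the container's signature, re-scan later in life, and flag deviations beyond measurement tolerance as possible tampering — can be sketched as follows. The waveform values and the tolerance are invented for illustration.

```python
TOLERANCE = 0.05  # assumed measurement tolerance (arbitrary units)

def tampered(reference, rescan, tol=TOLERANCE):
    # Flag the container if any point of the re-scan deviates beyond tolerance.
    return max(abs(a - b) for a, b in zip(reference, rescan)) > tol

# Baseline signature: peaks from tramp ferromagnetic particles along the length.
reference = [0.00, 0.02, 0.35, 0.03, 0.01, 0.28, 0.02, 0.00]
intact = [v + 0.01 for v in reference]       # later scan, measurement noise only
altered = reference[:]; altered[2] = 0.05    # a signature peak has vanished

print(tampered(reference, intact), tampered(reference, altered))  # → False True
```

A differential variant (comparing two scans to cancel a paramagnetic additive, as the abstract suggests) would follow the same comparison structure.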

  5. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete ''design verification'' strategy comprises: informing the Agency of any changes in the plant system which are defined as ''safeguards relevant''; and reverification by the Agency, upon receiving notice from the operator of any changes, of the ''design information''. 13 refs

  6. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  7. Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics

    Science.gov (United States)

    Camati, Patrice A.; Serra, Roberto M.

    2018-04-01

    Discrete quantum feedback control consists of dynamics managed according to the information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far from equilibrium. Due to the practical challenge of assessing energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback-controlled quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at a microscale with current technology in a variety of experimental platforms.

  8. An Arbitrated Quantum Signature Scheme without Entanglement*

    International Nuclear Information System (INIS)

    Li Hui-Ran; Luo Ming-Xing; Peng Dai-Yuan; Wang Xiao-Jun

    2017-01-01

    Several quantum signature schemes have recently been proposed to realize secure signatures of quantum or classical messages. Arbitrated quantum signature, as one nontrivial scheme, has attracted great interest because of its usefulness and efficiency. Unfortunately, previous schemes cannot resist Trojan horse and DoS attacks, and lack unforgeability and non-repudiation. In this paper, we propose an improved arbitrated quantum signature to address these security issues with an honest arbitrator. Our scheme uses qubit states rather than entangled states. More importantly, the qubit scheme can achieve unforgeability and non-repudiation. Our scheme is also secure against other known quantum attacks. (paper)

  9. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  10. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  11. Neutral signature Walker-VSI metrics

    International Nuclear Information System (INIS)

    Coley, A; McNutt, D; Musoke, N; Brooks, D; Hervik, S

    2014-01-01

    We will construct explicit examples of four-dimensional neutral signature Walker (but not necessarily degenerate Kundt) spaces for which all of the polynomial scalar curvature invariants vanish. We then investigate the properties of some particular subclasses of Ricci flat spaces. We also briefly describe some four-dimensional neutral signature Einstein spaces for which all of the polynomial scalar curvature invariants are constant. (paper)

  12. An interpretation of signature inversion

    International Nuclear Information System (INIS)

    Onishi, Naoki; Tajima, Naoki

    1988-01-01

    An interpretation in terms of the cranking model is presented to explain why signature inversion occurs for positive values of the axially asymmetric deformation parameter γ and why it emerges in specific orbitals. By introducing a continuous variable, the eigenvalue equation can be reduced to a one-dimensional Schroedinger equation, by means of which one can easily understand the cause of signature inversion. (author)

  13. Acoustic Signature Monitoring and Management of Naval Platforms

    NARCIS (Netherlands)

    Basten, T.G.H.; Jong, C.A.F. de; Graafland, F.; Hof, J. van 't

    2015-01-01

    Acoustic signatures make naval platforms susceptible to detection by threat sensors. The variable operational conditions and lifespan of a platform cause variations in the acoustic signature. To deal with these variations, a real time signature monitoring capability is being developed, with advisory

  14. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  15. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  16. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
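    The rule-firing idea the abstract describes can be illustrated with a toy monitor. This is a hedged sketch, not the paper's FLTL system: the property, event names, and single-counter encoding of obligations are assumptions made for illustration.

```python
# Toy forward-chaining monitor (illustrative sketch, not the paper's tool).
# Property, FLTL-style, over a finite but expanding trace: every "req" event
# must eventually be followed by an "ack" before the trace ends. Rules fire
# once per incoming event; obligations are the monitor's only state.

class Monitor:
    def __init__(self):
        self.pending = 0  # outstanding obligations: acks still owed

    def step(self, event):
        # Rule 1 (Horn clause, implication form): req => obligation(ack)
        if event == "req":
            self.pending += 1
        # Rule 2: ack => discharge one outstanding obligation, if any
        elif event == "ack" and self.pending > 0:
            self.pending -= 1

    def verdict_at_end(self):
        # On a finite trace, any undischarged obligation is a violation.
        return "pass" if self.pending == 0 else "fail"

m = Monitor()
for e in ["req", "tick", "ack", "req"]:
    m.step(e)
print(m.verdict_at_end())  # fail: the second "req" is never acknowledged
```

    Because each event only updates a counter, monitoring stays linear in the trace length, in the spirit of the single-state, fixed-rule monitors the abstract claims.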

  17. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
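    As a rough illustration of the kind of logical algorithm the survey covers, a minimal autoverification rule combining verification limits with a delta check might look like the sketch below. All analyte names, limits, and thresholds are hypothetical, not the surveyed laboratories' actual settings.

```python
# Minimal autoverification rule sketch: reference limits plus a delta check
# against the patient's previous result. All analyte names, verification
# limits, and delta thresholds below are hypothetical illustrations.

REFERENCE_LIMITS = {"glucose": (70, 110), "potassium": (3.5, 5.1)}
DELTA_LIMITS = {"glucose": 0.50, "potassium": 0.25}  # max relative change

def autoverify(test, value, previous=None):
    lo, hi = REFERENCE_LIMITS[test]
    if not lo <= value <= hi:
        return "hold: outside verification limits"  # route to manual review
    if previous is not None and abs(value - previous) / previous > DELTA_LIMITS[test]:
        return "hold: delta check failed"
    return "release"

print(autoverify("potassium", 4.2, previous=4.0))  # release
print(autoverify("potassium", 4.9, previous=3.6))  # hold: delta check failed
```

    Real systems layer further criteria on top of this (internal quality control status, instrument flags, concordance between analytes), as the survey's list of seven criteria suggests.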

  18. Verification and implementation of microburst day potential index (MDPI) and wind INDEX (WINDEX) forecasting tools at Cape Canaveral Air Station

    Science.gov (United States)

    Wheeler, Mark

    1996-01-01

    This report details the research, development, utility, verification, and transition work on wet microburst forecasting and detection that the Applied Meteorology Unit (AMU) performed in support of ground and launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS). The unforecasted wind event of 33.5 m/s (65 knots) at the Shuttle Landing Facility on 16 August 1994 raised the issue of wet microburst detection and forecasting. The AMU researched and analyzed the downburst wind event and determined it was a wet microburst event. A program was developed for operational use on the Meteorological Interactive Data Display System (MIDDS) weather system to analyze, compute, and display equivalent potential temperature (theta-e) profiles, the microburst day potential index (MDPI), and the wind index (WINDEX) maximum wind gust value. Key microburst nowcasting signatures using the WSR-88D data were highlighted. Verification of the data sets indicated that the MDPI has good potential for alerting the duty forecaster to the possibility of wet microbursts, and the WINDEX values computed from the hourly surface data show a useful trend for the maximum gust potential. WINDEX should help fill the temporal gap between the MDPI computed from the last Cape Canaveral rawinsonde and the nowcasting radar data tools.

  19. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime

  20. Targeted Metabolomics Reveals Early Dominant Optic Atrophy Signature in Optic Nerves of Opa1delTTAG/+ Mice.

    Science.gov (United States)

    Chao de la Barca, Juan Manuel; Simard, Gilles; Sarzi, Emmanuelle; Chaumette, Tanguy; Rousseau, Guillaume; Chupin, Stéphanie; Gadras, Cédric; Tessier, Lydie; Ferré, Marc; Chevrollier, Arnaud; Desquiret-Dumas, Valérie; Gueguen, Naïg; Leruez, Stéphanie; Verny, Christophe; Miléa, Dan; Bonneau, Dominique; Amati-Bonneau, Patrizia; Procaccio, Vincent; Hamel, Christian; Lenaers, Guy; Reynier, Pascal; Prunier-Mirebeau, Delphine

    2017-02-01

    Dominant optic atrophy (MIM No. 165500) is a blinding condition related to mutations in OPA1, a gene encoding a large GTPase involved in mitochondrial inner membrane dynamics. Although several mouse models mimicking the disease have been developed, the pathophysiological mechanisms responsible for retinal ganglion cell degeneration remain poorly understood. Using a targeted metabolomic approach, we measured the concentrations of 188 metabolites in nine tissues, that is, brain, three types of skeletal muscle, heart, liver, retina, optic nerve, and plasma in symptomatic 11-month-old Opa1delTTAG/+ mice. Significant metabolic signatures were found only in the optic nerve and plasma of female mice. The optic nerve signature was characterized by altered concentrations of phospholipids, amino acids, acylcarnitines, and carnosine, whereas the plasma signature showed decreased concentrations of amino acids and sarcosine associated with increased concentrations of several phospholipids. In contrast, the investigation of 3-month-old presymptomatic Opa1delTTAG/+ mice showed no specific plasma signature but revealed a significant optic nerve signature in both sexes, although with a sex effect. The Opa1delTTAG/+ versus wild-type optic nerve signature was characterized by decreased concentrations of 10 sphingomyelins and 10 lysophosphatidylcholines, suggestive of myelin sheath alteration, and by alterations in the concentrations of metabolites involved in neuroprotection, such as dimethylarginine, carnitine, spermine, spermidine, carnosine, and glutamate, suggesting a concomitant axonal metabolic dysfunction. Our comprehensive metabolomic investigations revealed, in symptomatic as well as in presymptomatic Opa1delTTAG/+ mice, a specific sensitivity of the optic nerve to Opa1 insufficiency, opening new routes for protective therapeutic strategies.

  1. Structure-dynamic model verification calculation of PWR 5 tests

    International Nuclear Information System (INIS)

    Engel, R.

    1980-02-01

    Within reactor safety research project RS 16 B of the German Federal Ministry of Research and Technology (BMFT), blowdown experiments are conducted at Battelle Institut e.V. Frankfurt/Main using a model reactor pressure vessel with a height of 11.2 m and internals corresponding to those in a PWR. In the present report the dynamic loading on the pressure vessel internals (upper perforated plate and barrel suspension) during the DWR 5 experiment is calculated by means of vertical and horizontal dynamic models using the CESHOCK code. The equations of motion are resolved by direct integration. (orig./RW) [de

  2. 48 CFR 4.102 - Contractor's signature.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Contractor's signature. 4.102 Section 4.102 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Contract Execution 4.102 Contractor's signature. (a) Individuals. A contract with an...

  3. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure involves many inherent misconceptions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps are identified, and possible improvements are proposed.

  4. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations used in nuclear power plant quality assurance programs in other countries, was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed than at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  5. DIGITAL SIGNATURE IN THE WAY OF LAW

    Directory of Open Access Journals (Sweden)

    Ruya Samlı

    2013-01-01

    Full Text Available A signature can be defined as a person's name or the special marks a person writes to indicate authorship of, or to confirm, a piece of writing. A person signs many times in his/her life. A person's signature, used thousands of times on everything from formal documents to exams, is important to that person. In particular, signing in legal operations can produce significant consequences. If a person's signature is imitated by another person, he/she can become indebted, donate his/her whole wealth, commit offences, or be bound by certain judicial operations. Today, because many operations can be carried out in digital environments and over the Internet, the signature operation that provides identity validation must also be carried over to the digital environment. In this paper, the digital signature concept adopted for this reason, and its status in international contexts and in Turkish law, are investigated.

  6. Starry messages: Searching for signatures of interstellar archaeology

    Energy Technology Data Exchange (ETDEWEB)

    Carrigan, Richard A., Jr.; /Fermilab

    2009-12-01

    Searching for signatures of cosmic-scale archaeological artifacts such as Dyson spheres or Kardashev civilizations is an interesting alternative to conventional SETI. Uncovering such an artifact does not require the intentional transmission of a signal on the part of the original civilization. This type of search is called interstellar archaeology or sometimes cosmic archaeology. The detection of intelligence elsewhere in the Universe with interstellar archaeology or SETI would have broad implications for science. For example, the constraints of the anthropic principle would have to be loosened if a different type of intelligence was discovered elsewhere. A variety of interstellar archaeology signatures are discussed including non-natural planetary atmospheric constituents, stellar doping with isotopes of nuclear wastes, Dyson spheres, as well as signatures of stellar and galactic-scale engineering. The concept of a Fermi bubble due to interstellar migration is introduced in the discussion of galactic signatures. These potential interstellar archaeological signatures are classified using the Kardashev scale. A modified Drake equation is used to evaluate the relative challenges of finding various sources. With few exceptions interstellar archaeological signatures are clouded and beyond current technological capabilities. However SETI for so-called cultural transmissions and planetary atmosphere signatures are within reach.

  7. Signature Pedagogies in Support of Teachers' Professional Learning

    Science.gov (United States)

    Parker, Melissa; Patton, Kevin; O'Sullivan, Mary

    2016-01-01

    Signature pedagogies [Shulman, L. 2005. "Signature pedagogies in the professions." "Daedalus" 134 (3): 52--59.] are a focus of teacher educators seeking to improve teaching and teacher education. The purpose of this paper is to present a preliminary common language of signature pedagogies for teacher professional development…

  8. Characteristics and Validation Techniques for PCA-Based Gene-Expression Signatures

    Directory of Open Access Journals (Sweden)

    Anders E. Berglund

    2017-01-01

    Full Text Available Background. Many gene-expression signatures exist for describing the biological state of profiled tumors. Principal Component Analysis (PCA can be used to summarize a gene signature into a single score. Our hypothesis is that gene signatures can be validated when applied to new datasets, using inherent properties of PCA. Results. This validation is based on four key concepts. Coherence: elements of a gene signature should be correlated beyond chance. Uniqueness: the general direction of the data being examined can drive most of the observed signal. Robustness: if a gene signature is designed to measure a single biological effect, then this signal should be sufficiently strong and distinct compared to other signals within the signature. Transferability: the derived PCA gene signature score should describe the same biology in the target dataset as it does in the training dataset. Conclusions. The proposed validation procedure ensures that PCA-based gene signatures perform as expected when applied to datasets other than those that the signatures were trained upon. Complex signatures, describing multiple independent biological components, are also easily identified.
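    The core scoring step, summarizing a gene signature into a single PCA score, can be sketched on simulated data with a plain SVD. This is a hedged illustration with hypothetical sizes and noise level, not the authors' pipeline.

```python
import numpy as np

# Sketch: summarize a gene signature into one PCA score per sample.
rng = np.random.default_rng(0)
n_samples, n_genes = 50, 20

# A "coherent" signature: all genes share one underlying biological factor.
factor = rng.normal(size=(n_samples, 1))
loadings = rng.normal(size=(1, n_genes))
expr = factor @ loadings + 0.3 * rng.normal(size=(n_samples, n_genes))

# Center each gene, then take the first principal component as the score.
X = expr - expr.mean(axis=0)
u, s, vt = np.linalg.svd(X, full_matrices=False)
score = X @ vt[0]                       # one signature score per sample
explained = s[0] ** 2 / (s ** 2).sum()  # PC1 variance fraction ("coherence")
print("coherent" if explained > 0.5 else "incoherent")
```

    A signature whose first component explained little variance, or split into several comparable components, would fail the coherence and robustness checks the abstract describes.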

  9. Many-body localization dynamics from a one-particle perspective

    Energy Technology Data Exchange (ETDEWEB)

    Lezama Mergold Love, Talia; Bera, Soumya; Bardarson, Jens Hjorleifur [Max Planck Institute for the Physics of Complex Systems, Dresden (Germany)

    2016-07-01

    Systems exhibiting many-body localization (Anderson insulators in the presence of interactions) present a novel class of nonergodic phases of matter. The study of entanglement, in terms of both exact eigenstates and its time evolution after quenches, has been useful to reveal the salient signatures of these systems. Similarly to the entanglement entropy of exact eigenstates, the one-particle density matrix can be used as a tool to characterize the many-body localization transition with its eigenvalues showing a Fermi-liquid like step discontinuity in the localized phase. However, this analysis distinguishes the Fock-space structure of the eigenstates from the real space. Here, we present numerical evidence for dynamical signatures of the many-body localized phase for a closed fermionic system, using the one-particle density matrix and its time evolution after a global quench. We discuss and compare our results with the well-known logarithmic spreading of entanglement (a dynamical signature of this phase, absent in the Anderson insulator).
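    The occupation-spectrum idea can be illustrated in the simplest noninteracting limit, where the one-particle density matrix of a Slater determinant is a projector with a perfectly sharp 0/1 step. This is a toy sketch with an illustrative model and sizes; the interacting, disordered system of the abstract is not reproduced here.

```python
import numpy as np

# Toy sketch: for a noninteracting Slater-determinant state, the one-particle
# density matrix (OPDM) is a projector, so its occupation spectrum is a sharp
# 0/1 step. Interactions soften this step; the abstract's localized phase
# retains a Fermi-liquid-like discontinuity.
L, N = 8, 4                              # lattice sites, fermions
H = -(np.eye(L, k=1) + np.eye(L, k=-1))  # tight-binding chain Hamiltonian
_, phi = np.linalg.eigh(H)               # single-particle orbitals
occ = phi[:, :N]                         # fill the N lowest orbitals
rho = occ @ occ.T                        # OPDM of the ground state
evals = np.sort(np.linalg.eigvalsh(rho))[::-1]
print(np.round(evals, 6))                # N ones followed by L - N zeros
```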

  10. Machine learning techniques for the verification of refueling activities in CANDU-type nuclear power plants (NPPs) with direct applications in nuclear safeguards

    International Nuclear Information System (INIS)

    Budzinski, J.

    2006-06-01

    the Viterbi algorithm is applied for the state sequence decoding, which allows for physical constraints derived from simulation models to be efficiently incorporated into the overall recognition scheme by appropriately rescoring the resulting n-best hypothesis lists. The physical constraints are derived from the local balance equation for precursors produced in fuel during discharge and the corresponding log-likelihood scores on the physical model are given by the negative chi-squared between the model predictions and data measurements. The proposed hybrid recognition algorithm is finally implemented in an automated fuel-handling verification system (AVER), the parameters of which are then optimized for maximum performance. This thesis also gives a dynamic, functional validation framework for such verification systems in nuclear safeguards, including the developed AVER system. The system's conclusions on random test data from both historical and synthesized data sets are compared against the relevant expert knowledge, and various metrics and risk measures are computed to judge the system's performance and reliability of its conclusions. The validation tests have shown that the developed verification system meets the desired competency requirements on its entire prespecified input domain of CANDU-6 data. Results on simulated data have also demonstrated the ability of the proposed system to detect radiation signatures corresponding to abnormal and rare events that normally do not appear in data. Throughout all the tests, AVER greatly outperformed an existing rudimentary verification system, consistently producing a reduction in the misclassification rate of about 170 %. The expected risk of undetected fuel discharge was estimated to be a hundred thousand times less than one significant quantity (SQ) of irradiated direct-use material (IDU) per unit per year, which is much below the acceptable limits and hence may be subject to no or only less intrusive safeguards
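    The state-sequence decoding step rests on standard Viterbi decoding of a hidden Markov model, which can be sketched generically. The states, probabilities, and observation symbols below are invented for illustration; the thesis's detector models and physics-based rescoring of n-best hypotheses are not reproduced.

```python
import math

# Textbook Viterbi decoding for a small hidden Markov model.
states = ["idle", "discharge"]
start = {"idle": 0.9, "discharge": 0.1}
trans = {"idle": {"idle": 0.8, "discharge": 0.2},
         "discharge": {"idle": 0.3, "discharge": 0.7}}
emit = {"idle": {"low": 0.9, "high": 0.1},
        "discharge": {"low": 0.2, "high": 0.8}}

def viterbi(obs):
    # best[s] = (log-probability of the best path ending in state s, that path)
    best = {s: (math.log(start[s] * emit[s][obs[0]]), [s]) for s in states}
    for o in obs[1:]:
        best = {s: max(((lp + math.log(trans[prev][s] * emit[s][o]), path + [s])
                        for prev, (lp, path) in best.items()),
                       key=lambda t: t[0])
                for s in states}
    return max(best.values(), key=lambda t: t[0])[1]

print(viterbi(["low", "high", "high", "low"]))
# -> ['idle', 'discharge', 'discharge', 'idle']
```

    Keeping the full log-probabilities, rather than only the argmax, is what allows an n-best list to be rescored with external constraints, as in the physics-based rescoring described above.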

  11. Infrared ship signature analysis and optimisation

    NARCIS (Netherlands)

    Neele, F.P.

    2005-01-01

    The last decade has seen an increase in the awareness of the infrared signature of naval ships. New ship designs show that infrared signature reduction measures are being incorporated, such as exhaust gas cooling systems, relocation of the exhausts and surface cooling systems. Hull and

  12. Dynamic shape transitions in the sdg boson model

    Science.gov (United States)

    Kuyucak, S.

    The dynamic evolution of shapes in the sdg interacting boson model is investigated using the angular momentum projected mean field theory. Deformed nuclei are found to be quite stable against shape changes but transitional nuclei could exhibit dynamic shape transitions in the region L = 10-20. Conditions of existence and experimental signatures for dynamic shape transitions are discussed together with a likely candidate, 192Os.

  13. Dynamic shape transitions in the sdg boson model

    Energy Technology Data Exchange (ETDEWEB)

    Kuyucak, S. (Melbourne Univ., Parkville (Australia). School of Physics)

    1992-01-01

    The dynamic evolution of shapes in the sdg interacting boson model is investigated using the angular momentum projected mean field theory. Deformed nuclei are found to be quite stable against shape changes but transitional nuclei could exhibit dynamic shape transitions in the region L = 10-20. Conditions of existence and experimental signatures for dynamic shape transitions are discussed together with a likely candidate, 192Os. (author).

  14. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  15. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  16. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  17. Application of Dynamic Analysis in Semi-Analytical Finite Element Method.

    Science.gov (United States)

    Liu, Pengfei; Xing, Qinyan; Wang, Dawei; Oeser, Markus

    2017-08-30

    Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional but requires only a two-dimensional FE discretization, by incorporating a Fourier series in the third dimension. In this paper, the algorithm applying dynamic analysis in SAFEM is introduced in detail. Asphalt pavement models under moving loads were built in SAFEM and in the commercial finite element software ABAQUS to verify the accuracy and efficiency of SAFEM. The verification shows that the computational accuracy of SAFEM is sufficiently high and its computational time is much shorter than that of ABAQUS. Moreover, experimental verification was carried out, and the prediction derived from SAFEM is consistent with the measurement. Therefore, SAFEM can reliably predict the dynamic response of asphalt pavement under moving loads, which is beneficial to road administrations in assessing pavement condition.

  18. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  19. Regression testing Ajax applications : Coping with dynamism

    NARCIS (Netherlands)

    Roest, D.; Mesbah, A.; Van Deursen, A.

    2009-01-01

    Note: This paper is a pre-print of: Danny Roest, Ali Mesbah and Arie van Deursen. Regression Testing AJAX Applications: Coping with Dynamism. In Proceedings of the 3rd International Conference on Software Testing, Verification and Validation (ICST’10), Paris, France. IEEE Computer Society, 2010.

  20. Fine-Grained Forward-Secure Signature Schemes without Random Oracles

    DEFF Research Database (Denmark)

    Camenisch, Jan; Koprowski, Maciej

    2006-01-01

    We propose the concept of fine-grained forward-secure signature schemes. Such signature schemes not only provide nonrepudiation w.r.t. past time periods the way ordinary forward-secure signature schemes do but, in addition, allow the signer to specify which signatures of the current time period...... remain valid when revoking the public key. This is an important advantage if the signer produces many signatures per time period as otherwise the signer would have to re-issue those signatures (and possibly re-negotiate the respective messages) with a new key.Apart from a formal model for fine......-grained forward-secure signature schemes, we present practical schemes and prove them secure under the strong RSA assumption only, i.e., we do not resort to the random oracle model to prove security. As a side-result, we provide an ordinary forward-secure scheme whose key-update time is significantly smaller than...

  1. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  2. Practical Formal Verification of MPI and Thread Programs

    Science.gov (United States)

    Gopalakrishnan, Ganesh; Kirby, Robert M.

    Large-scale simulation codes in science and engineering are written using the Message Passing Interface (MPI). Shared memory threads are widely used directly, or to implement higher level programming abstractions. Traditional debugging methods for MPI or thread programs are incapable of providing useful formal guarantees about coverage. They get bogged down in the sheer number of interleavings (schedules), often missing shallow bugs. In this tutorial we will introduce two practical formal verification tools: ISP (for MPI C programs) and Inspect (for Pthread C programs). Unlike other formal verification tools, ISP and Inspect run directly on user source codes (much like a debugger). They pursue only the relevant set of process interleavings, using our own customized Dynamic Partial Order Reduction algorithms. For a given test harness, DPOR allows these tools to guarantee the absence of deadlocks, instrumented MPI object leaks and communication races (using ISP), and shared memory races (using Inspect). ISP and Inspect have been used to verify large pieces of code: in excess of 10,000 lines of MPI/C for ISP in under 5 seconds, and about 5,000 lines of Pthread/C code in a few hours (and much faster with the use of a cluster or by exploiting special cases such as symmetry) for Inspect. We will also demonstrate the Microsoft Visual Studio and Eclipse Parallel Tools Platform integrations of ISP (these will be available on the LiveCD).
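    The schedule explosion that motivates tools like ISP and Inspect can be made concrete with a short illustration (a toy Python sketch, not part of either tool): the number of interleavings of two thread traces with m and n steps is the binomial coefficient C(m+n, n), which grows far too fast for naive enumeration.

    ```python
    from itertools import combinations
    from math import comb

    def interleavings(a, b):
        """Enumerate every interleaving (schedule) of two thread traces a and b."""
        n, m = len(a), len(b)
        result = []
        # Choose which positions of the merged trace come from thread a.
        for pos in combinations(range(n + m), n):
            merged, ia, ib = [], iter(a), iter(b)
            chosen = set(pos)
            for i in range(n + m):
                merged.append(next(ia) if i in chosen else next(ib))
            result.append(tuple(merged))
        return result

    # Two threads with two steps each: C(4, 2) = 6 schedules.
    scheds = interleavings(["a1", "a2"], ["b1", "b2"])
    assert len(scheds) == comb(4, 2) == 6
    # Ten steps per thread already yields 184,756 schedules -- the blow-up
    # that dynamic partial order reduction prunes to a relevant subset.
    assert comb(20, 10) == 184756
    ```

    DPOR exploits the observation that most of these schedules are equivalent up to reordering of independent steps, so only one representative per equivalence class needs to be executed.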

  3. Thermodynamic signatures of fragment binding: Validation of direct versus displacement ITC titrations.

    Science.gov (United States)

    Rühmann, Eggert; Betz, Michael; Fricke, Marie; Heine, Andreas; Schäfer, Martina; Klebe, Gerhard

    2015-04-01

    Detailed characterization of the thermodynamic signature of weak-binding fragments to proteins is essential to support the decision which fragments to take forward for hit-to-lead optimization. Isothermal titration calorimetry (ITC) is the method of choice for recording thermodynamic data; however, weakly binding ligands such as fragments require meaningful and reliable measuring protocols, as sigmoidal titration curves are usually impossible to record due to limited solubility. Fragments can be titrated either directly under low c-value conditions (no sigmoidal curve) or indirectly, by using a strongly binding ligand to displace the pre-incubated weak fragment from the protein. The determination of Gibbs free energy is reliable and rather independent of the applied titration protocol. Even though the displacement method achieves higher accuracy, the obtained enthalpy-entropy profile depends on the properties of the displacement ligand used. The relative enthalpy differences across different displacement experiments reveal a constant signature and can serve as a thermodynamic fingerprint for fragments. Low c-value titrations are only reliable if the final concentration of the fragment in the sample cell exceeds its K(D) value 2-10 fold; limited solubility often prevents this strategy. The present study suggests an applicable protocol for characterizing the thermodynamic signature of protein-fragment binding. It shows, however, that such measurements are limited by protein and fragment solubility. Deviating profiles obtained with different displacement ligands indicate that changes in the solvation pattern and protein dynamics most likely influence the resulting overall binding signature. Copyright © 2014 Elsevier B.V. All rights reserved.
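    The claim that the Gibbs free energy is robust across titration protocols rests on the standard relation ΔG = RT ln K(D); a minimal sketch with illustrative values (not data from the study):

    ```python
    import math

    R = 8.314  # gas constant, J/(mol*K)

    def delta_g(kd_molar, temp_k=298.15):
        """Binding free energy from a dissociation constant: dG = RT ln(KD).
        Negative for KD < 1 M, i.e. for any real binder."""
        return R * temp_k * math.log(kd_molar)  # J/mol

    # A weak fragment with KD = 1 mM binds at roughly -17 kJ/mol at 25 C.
    dg = delta_g(1e-3)
    assert -17200 < dg < -17000
    ```

    The enthalpy-entropy split of this ΔG (via ΔG = ΔH - TΔS) is exactly the part the abstract reports as protocol-dependent, which is why the fingerprint is built from relative enthalpy differences rather than absolute values.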

  4. Studying the potential of point detectors in time-resolved dose verification of dynamic radiotherapy

    International Nuclear Information System (INIS)

    Beierholm, A.R.; Behrens, C.F.; Andersen, C.E.

    2015-01-01

    Modern megavoltage x-ray radiotherapy with high spatial and temporal dose gradients puts high demands on the entire delivery system, including not just the linear accelerator and the multi-leaf collimator, but also the algorithms used for optimization and dose calculation and the detectors used for quality assurance and dose verification. In this context, traceable in-phantom dosimetry using a well-characterized point detector is often an important supplement to 2D quality assurance methods based on radiochromic film or detector arrays. In this study, an in-house developed dosimetry system based on fiber-coupled plastic scintillator detectors was evaluated and compared with a Farmer-type ionization chamber and a small-volume ionization chamber. An important feature of scintillator detectors is that the sensitive volume can easily be scaled, and five scintillator detectors of different scintillator length were thus employed to quantify volume averaging effects by direct measurement. The dosimetric evaluation comprised several complex-shape static fields as well as simplified dynamic deliveries using RapidArc, a volumetric-modulated arc therapy modality often used at the participating clinic. The static field experiments showed that the smallest scintillator detectors were in the best agreement with dose calculations while needing the smallest volume averaging corrections. Concerning total dose measured during RapidArc, all detectors agreed with dose calculations within 1.1 ± 0.7% when positioned in regions of high, homogeneous dose. Larger differences were observed for high-dose-gradient and organ-at-risk locations, where differences between measured and calculated dose were as large as 8.0 ± 5.5%. The smallest differences were generally seen for the small-volume ionization chamber and the smallest scintillators. The time-resolved RapidArc dose profiles revealed volume-dependent discrepancies between scintillator and ionization chamber response.

  5. Operational verification of a blow out preventer utilizing fiber Bragg grating based strain gauges

    Science.gov (United States)

    Turner, Alan L.; Loustau, Philippe; Thibodeau, Dan

    2015-05-01

    Ultra-deep water BOP (blowout preventer) operation poses numerous challenges in obtaining accurate knowledge of current system integrity and component condition; a salient example is the difficulty of verifying closure of the pipe and shearing rams during and after well control events. Ascertaining the integrity of these functions is currently based on a manual volume measurement performed with a stopwatch. Advances in sensor technology now permit more accurate methods of BOP condition monitoring. Fiber optic sensing technology, and particularly fiber optic strain gauges, have evolved to a point where we can derive a good representation of what is happening inside a BOP by installing sensors on the outside shell. Function signatures can be baselined to establish thresholds that indicate successful function activation. Based on this knowledge base, signal variation over time can then be utilized to assess degradation of these functions and subsequent failure to function. Monitoring the BOP from the outside has the advantage of gathering data through a system that can be interfaced with risk-based integrity management software and/or a smart monitoring system that analyzes BOP control redundancies without the requirement of interfacing with OEM control systems. The paper will present the results of ongoing work on a fully instrumented 13-½" 10,000 psi pipe ram. Instrumentation includes commonly used pressure transducers, accelerometers, flow meters, and optical strain gauges. Correlation will be presented between flow, pressure, and acceleration signatures and the fiber optic strain gauges' response as it relates to functional verification and component-level degradation trending.

  6. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical exercise targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
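    The Richardson extrapolation mentioned for solution verification can be sketched generically: from solutions on three grids with a constant refinement ratio one estimates the observed order of convergence and a grid-independent value (a textbook illustration, not GBS-specific code):

    ```python
    import math

    def richardson(f_coarse, f_mid, f_fine, r):
        """Estimate the observed order of convergence p and the extrapolated
        grid-independent solution from three grids with refinement ratio r."""
        # Observed order: p = ln((f_coarse - f_mid)/(f_mid - f_fine)) / ln(r)
        p = math.log((f_coarse - f_mid) / (f_mid - f_fine)) / math.log(r)
        # Richardson-extrapolated estimate of the exact solution.
        f_exact = f_fine + (f_fine - f_mid) / (r**p - 1)
        return p, f_exact

    # Check on a manufactured second-order method with f(h) = 1 + 0.5*h**2.
    f = lambda h: 1.0 + 0.5 * h**2
    p, f_exact = richardson(f(0.4), f(0.2), f(0.1), r=2)
    # Recovers p ~= 2 and f_exact ~= 1.0, the h -> 0 limit.
    ```

    The difference f_fine - f_exact then serves as the solution-verification error estimate for the finest grid.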

  7. 21 CFR 1309.32 - Application forms; contents; signature.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Application forms; contents; signature. 1309.32... Application forms; contents; signature. (a) Any person who is required to be registered pursuant to § 1309.21... this paragraph and shall contain the signature of the individual being authorized to sign the...

  8. 38 CFR 18b.21 - Signature of documents.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Signature of documents. 18b.21 Section 18b.21 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS... Documents § 18b.21 Signature of documents. The signature of a party, authorized officer, employee, or...

  9. Detection of chemical explosives using multiple photon signatures

    International Nuclear Information System (INIS)

    Loschke, K.W.; Dunn, W.L.

    2008-01-01

    Full text: A template-matching procedure to aid in the rapid detection of improvised explosive devices (IEDs) is being investigated. Multiple photon-scattered and photon-induced positron annihilation radiation responses are being used as part of a photon-neutron signature-based radiation scanning (SBRS) approach (see the companion reference for a description of the neutron component), in an attempt to detect chemical explosives at safe standoff distances. Many past and present photon interrogation methods are based on imaging. Imaging techniques seek to determine at high spatial resolution the internal structure of a target of interest. Our technique simply seeks to determine whether an unknown target contains a detectable amount of chemical explosives by comparing multiple responses (signatures) that depend on both the density and composition of portions of a target. In the photon component, beams of photons are used to create back-streaming signatures, which are dependent on the density and composition of the part of the target being interrogated. These signatures are compared to templates, which are collections of the same signatures as if the interrogated volume contained a significant amount of explosives. The signature analysis produces a figure-of-merit and a standard deviation of the figure-of-merit. These two metrics are used to filter safe from dangerous targets. Experiments have been conducted showing that explosive surrogates (fertilizers) can be distinguished from several inert materials using these photon signatures, demonstrating that the signatures can be used effectively to help detect IEDs

  10. Co-simulation for real time safety verification of nuclear power plants

    International Nuclear Information System (INIS)

    Boafo, E.K.; Zhang, L.; Nasimi, E.; Gabbar, H.A.

    2015-01-01

    Small and major accidents and near misses still occur in nuclear power plants (NPPs). Risk levels have increased with the degradation of NPP equipment and instrumentation. In order to achieve NPP safety, it is important to continuously evaluate risk for all potential hazard and fault propagation scenarios and to map protection layers to fault/failure/hazard propagation scenarios, so that safety levels can be evaluated and verified during NPP operation. Current real-time safety verification tools have major limitations: they are mainly offline and are not integrated with NPP simulation tools. The main goal of this research is to develop a real-time safety verification and co-simulation tool integrated with plant operation support systems. This includes the development of static and dynamic fault semantic networks (FSNs) to model all possible fault propagation scenarios and the interrelationships among associated process variables. Safety and protection layers, along with their reliability, are mapped to the FSN so that safety levels can be verified during plant operation. Errors between multiphysics models and real-time data are modeled to accurately and dynamically tune the FSN for each fault propagation scenario. The detailed methodology shows how to integrate process models, construct the static FSN with fault propagation scenarios, and evaluate and tune the dynamic FSN with probabilistic and process-variable interaction values. Principal Component Analysis is used to reduce the dimensionality of the process variables associated with each fault scenario. Independent protection layers (IPLs) are then mapped to the FSN with estimated reliability measures for each protection layer to accurately verify safety for different operational scenarios. Intelligent algorithms combined with multivariate techniques are used to accurately define the interrelations among process variables, in terms of signal strength and time delay, using Genetic Programming (GP), which will provide a basis
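    The Principal Component Analysis step described above, reducing many correlated process variables to a few components per fault scenario, can be sketched with plain NumPy (a generic illustration, not the authors' FSN tooling; the variable names are invented):

    ```python
    import numpy as np

    def pca_reduce(X, n_components):
        """Project samples (rows of X) onto their leading principal components."""
        Xc = X - X.mean(axis=0)            # center each process variable
        # SVD of the centered data: rows of Vt are the principal directions.
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        explained = s**2 / np.sum(s**2)    # fraction of variance per component
        return Xc @ Vt[:n_components].T, explained

    # Toy "process variables": two independent signals plus correlated copies.
    rng = np.random.default_rng(0)
    base = rng.normal(size=(200, 2))
    X = np.hstack([base, base @ rng.normal(size=(2, 4))])  # 6 correlated variables
    scores, explained = pca_reduce(X, n_components=2)
    # Two components capture essentially all the variance of the six variables.
    assert explained[:2].sum() > 0.99
    ```

    In the setting of the abstract, the retained component scores would stand in for the full set of process variables attached to each fault scenario in the FSN.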

  11. Temporal dynamics of online petitions.

    Science.gov (United States)

    Böttcher, Lucas; Woolley-Meza, Olivia; Brockmann, Dirk

    2017-01-01

    Online petitions are an important avenue for direct political action, yet the dynamics that determine when a petition will be successful are not well understood. Here we analyze the temporal characteristics of online petition-signing behavior in order to identify systematic differences between popular petitions, which receive a high volume of signatures, and unpopular ones. We find that, in line with other temporal characterizations of human activity, the signing process is typically non-Poissonian and non-homogeneous in time. However, this process exhibits anomalously high memory for human activity, possibly indicating that synchronized external influence or contagion plays an important role. More interestingly, we find clear differences in the characteristics of the inter-event time distributions depending on the total number of signatures that petitions receive, independently of the total duration of the petitions. Specifically, popular petitions that attract a large volume of signatures exhibit more variance in the distribution of inter-event times than unpopular petitions with only a few signatures, which could be considered an indication that the former are more bursty. However, petitions with large signature volume are less bursty according to measures that consider the time ordering of inter-event times. Our results, therefore, emphasize the importance of accounting for time ordering when characterizing human activity.
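    The burstiness and time-ordering (memory) measures referred to here are commonly quantified with the Goh-Barabási coefficients; a minimal sketch over a list of inter-event times (an illustration of the standard measures, not the authors' exact pipeline):

    ```python
    import statistics

    def burstiness(taus):
        """Goh-Barabasi burstiness B = (sigma - mu) / (sigma + mu):
        B = 0 for a Poisson process, B -> 1 for highly bursty activity,
        B = -1 for perfectly regular events."""
        mu = statistics.mean(taus)
        sigma = statistics.pstdev(taus)
        return (sigma - mu) / (sigma + mu)

    def memory(taus):
        """Memory coefficient M: Pearson correlation between consecutive
        inter-event times, which is sensitive to their time ordering."""
        x, y = taus[:-1], taus[1:]
        mx, my = statistics.mean(x), statistics.mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
        return cov / (statistics.pstdev(x) * statistics.pstdev(y))

    regular = [1.0, 1.0, 1.0, 1.0, 1.0]        # perfectly regular signing
    bursty = [0.1, 0.1, 0.1, 10.0, 0.1, 0.1]   # a burst, a long gap, a burst
    assert burstiness(regular) == -1.0
    assert burstiness(bursty) > burstiness(regular)
    ```

    The distinction drawn in the abstract corresponds to B (distribution shape only) versus M (time ordering): reshuffling the inter-event times leaves B unchanged but can change M completely.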

  12. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  13. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defined...

  14. Enhanced arbitrated quantum signature scheme using Bell states

    International Nuclear Information System (INIS)

    Wang Chao; Liu Jian-Wei; Shang Tao

    2014-01-01

    We investigate existing arbitrated quantum signature schemes and their cryptanalysis, including intercept-resend and denial-of-service attacks. By exploiting the loopholes of these schemes, a malicious signatory may successfully disavow signed messages, or the receiver may actively negate the signature from the signatory without being detected. By modifying the existing schemes, we develop countermeasures to these attacks using Bell states. The newly proposed scheme strengthens the security of arbitrated quantum signatures. Furthermore, several valuable topics are presented for further research on quantum signature schemes

  15. Dynamic characterizers of spatiotemporal intermittency

    OpenAIRE

    Gupte, Neelima; Jabeen, Zahera

    2006-01-01

    Systems of coupled sine circle maps show regimes of spatiotemporally intermittent behaviour with associated scaling exponents which belong to the DP class, as well as regimes of spatially intermittent behaviour (with associated regular dynamical behaviour) which do not belong to the DP class. Both types of behaviour are seen along the bifurcation boundaries of the synchronized solutions, and contribute distinct signatures to the dynamical characterizers of the system, viz. the distribution of...

  16. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  17. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Science.gov (United States)

    Javidi, Bahram

    1997-01-01

    Document fraud including unauthorized duplication of identification cards and credit cards is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially-available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured with a CCD camera) and a new hologram synthesized using commercially-available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. 
The proposed optical processing device is designed to identify both the random phase mask and the

  18. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that the LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records and to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  19. Quantum signature scheme based on a quantum search algorithm

    International Nuclear Information System (INIS)

    Yoon, Chun Seok; Kang, Min Sung; Lim, Jong In; Yang, Hyung Jin

    2015-01-01

    We present a quantum signature scheme based on a two-qubit quantum search algorithm. For secure transmission of signatures, we use a quantum search algorithm that has not been used in previous quantum signature schemes. A two-step protocol secures the quantum channel, and a trusted center guarantees non-repudiation that is similar to other quantum signature schemes. We discuss the security of our protocol. (paper)

  20. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Barahona, B.; Jonkman, J.; Damiani, R.; Robertson, A.; Hayman, G.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects-the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.

  1. Combustion Stability Verification for the Thrust Chamber Assembly of J-2X Developmental Engines 10001, 10002, and 10003

    Science.gov (United States)

    Morgan, C. J.; Hulka, J. R.; Casiano, M. J.; Kenny, R. J.; Hinerman, T. D.; Scholten, N.

    2015-01-01

    The J-2X engine, a liquid oxygen/liquid hydrogen propellant rocket engine available for future use on the upper stage of the Space Launch System vehicle, has completed testing of three developmental engines at NASA Stennis Space Center. Twenty-one tests of engine E10001 were conducted from June 2011 through September 2012, thirteen tests of the engine E10002 were conducted from February 2013 through September 2013, and twelve tests of engine E10003 were conducted from November 2013 to April 2014. Verification of combustion stability of the thrust chamber assembly was conducted by perturbing each of the three developmental engines. The primary mechanism for combustion stability verification was examining the response caused by an artificial perturbation (bomb) in the main combustion chamber, i.e., dynamic combustion stability rating. No dynamic instabilities were observed in the TCA, although a few conditions were not bombed. Additional requirements, included to guard against spontaneous instability or rough combustion, were also investigated. Under certain conditions, discrete responses were observed in the dynamic pressure data. The discrete responses were of low amplitude and posed minimal risk to safe engine operability. Rough combustion analyses showed that all three engines met requirements for broad-banded frequency oscillations. Start and shutdown transient chug oscillations were also examined to assess the overall stability characteristics, with no major issues observed.

  2. Magnetotail processes and their ionospheric signatures

    Science.gov (United States)

    Ferdousi, B.; Raeder, J.; Zesta, E.; Murphy, K. R.; Cramer, W. D.

    2017-12-01

    In-situ observations in the magnetotail are sparse and limited to single point measurements. In the ionosphere, on the other hand, there is a broad range of observations, including magnetometers, auroral imagers, and various radars. Since the ionosphere is to some extent a mirror of plasmasheet processes, it can be used as a monitor of magnetotail dynamics. Thus, it is of great importance to understand the coupling between the ionosphere and the magnetosphere in order to properly interpret ionosphere and ground observations in terms of magnetotail dynamics. For this purpose, the global magnetohydrodynamic model OpenGGCM is used to investigate magnetosphere-ionosphere coupling. One of the key processes in magnetotail dynamics are bursty bulk flows (BBFs), which are the major means by which momentum and energy get transferred through the magnetotail and down to the ionosphere. BBFs often manifest in the ionosphere as auroral streamers. This study focuses on mapping such flow bursts from the magnetotail to the ionosphere along the magnetic field lines for three states of the magnetotail: pre-substorm onset, through substorm expansion, and during steady magnetospheric convection (SMC) following the substorm. We find that the orientation of streamers in the ionosphere differs for different local times, and that, for both tail and ionospheric signatures, activity increases during the SMC configuration compared to the pre-onset and quiet times. We also find that the background convection in the tail impacts the direction and deflection of the BBFs and the subsequent orientation of the auroral streamers in the ionosphere.

  3. Molecular signatures of thyroid follicular neoplasia

    DEFF Research Database (Denmark)

    Borup, R.; Rossing, M.; Henao, Ricardo

    2010-01-01

    The molecular pathways leading to thyroid follicular neoplasia are incompletely understood, and the diagnosis of follicular tumors is a clinical challenge. To provide leads to the pathogenesis and diagnosis of the tumors, we examined the global transcriptome signatures of follicular thyroid... a mechanism for cancer progression, which is why we exploited the results in order to generate a molecular classifier that could identify 95% of all carcinomas. Validation employing public domain and cross-platform data demonstrated that the signature was robust and could diagnose follicular nodules... and robust genetic signature for the diagnosis of FA and FC. Endocrine-Related Cancer (2010) 17 691-708

  4. Motif signatures of transcribed enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios

    2017-09-14

    In mammalian cells, transcribed enhancers (TrEn) play important roles in the initiation of gene expression and the maintenance of gene expression levels in a spatiotemporal manner. One of the most challenging questions in biology today is how the genomic characteristics of enhancers relate to enhancer activities. This is particularly critical, as several recent studies have linked enhancer sequence motifs to specific functional roles. To date, only a limited number of enhancer sequence characteristics have been investigated, leaving space for exploring the enhancers' genomic code in a more systematic way. To address this problem, we developed a novel computational method, TELS, aimed at identifying predictive cell type/tissue specific motif signatures. We used TELS to compile a comprehensive catalog of motif signatures for all known TrEn identified by the FANTOM5 consortium across 112 human primary cells and tissues. Our results confirm that distinct cell type/tissue specific motif signatures characterize TrEn. These signatures successfully discriminate (a) TrEn from random controls, a proxy of non-enhancer activity, and (b) cell type/tissue specific TrEn from enhancers expressed and transcribed in different cell types/tissues. TELS codes and datasets are publicly available at http://www.cbrc.kaust.edu.sa/TELS.

  5. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  6. On verification of a theory in dislocation plasticity

    International Nuclear Information System (INIS)

    Ng, D.H.Y.; Lee, L.H.N.

    1981-01-01

    In the past twenty years, many attempts have been made to unify the theories of macroplasticity and microplasticity in polycrystalline materials. Several major approaches have been suggested, namely: the geometrical approach, the analytical approach, the phenomenological approach, and the internal variables approach. To verify a plasticity theory based on any one of the above models, detailed experimental data, including microstructural quantities such as dislocation density, dislocation speed, etc., are required. Unfortunately, there were difficulties in evaluating dislocation speed and in dealing with the 'mobile fraction' of dislocation density. Therefore, an experimental verification of such a plasticity theory had not been made. A dislocation velocity equation based on a thermally activated model is used. A set of plastic strain rate equations for polycrystalline materials, formulated by analyzing dislocation dynamics in a statistical approach, is presented. In order to evaluate the activation free energy, Gibbs' modified tetragonal distortion model is used together with some measurements obtained from electron micrographs. Experimental results on the dynamic yielding and fracture of 6061-T6 aluminum alloy tubings under biaxial loadings, obtained by Ng, Delich and Lee, are used. In dealing with the 'mobile fraction', Gilman's suggestion is adopted. (orig./HP)

  7. Experimental study on design verification of new concept for integral reactor safety system

    International Nuclear Information System (INIS)

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized, light-water-cooled, medium-power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. In the SMART design verification phase, various separate effects tests and comprehensive integral effect tests were conducted. The high-temperature/high-pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents) was constructed by KAERI to simulate SMART-P (the one-fifth-scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the Steam Generator (SG) primary side, the secondary system, and the PRHRS. The heat transfers through the PRHRS heat exchanger and ECT are sufficient to enable the natural circulation of the coolant.

  8. Dynamic shape transitions in the sdg boson model

    International Nuclear Information System (INIS)

    Kuyucak, S.

    1992-01-01

    The dynamic evolution of shapes in the sdg interacting boson model is investigated using the angular momentum projected mean field theory. Deformed nuclei are found to be quite stable against shape changes, but transitional nuclei could exhibit dynamic shape transitions in the region L = 10-20. Conditions for the existence of dynamic shape transitions and their experimental signatures are discussed, together with a likely candidate, ¹⁹²Os. 13 refs., 3 figs.

  9. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). The plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision 1. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files.

  10. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  11. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  12. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  13. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project 'Verification of the thermal design of electronic equipment' studied the methodology to be followed in the verification of the thermal design of electronic equipment. The project forms part of the 'Cool Electronics' research programme funded by TEKES, the Finnish Technology Development Centre, and was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance of different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment, from the system level to the electronic component level, and can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process; some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish: VTT Julkaisuja 824. 22 refs.

  14. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan,Earthquake

    Directory of Open Access Journals (Sweden)

    Chien-Chih Chen

    2006-01-01

    Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the M 7.3 1999 Chi-Chi, Taiwan, earthquake. We show that a previously proposed forecast method based on evaluating changes in seismic intensity on a regional basis is superior to a forecast based only on the magnitude of seismic intensity in the same region. Our results confirm earlier suggestions that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous activation or quiescence, and that signatures of these processes can be detected in seismicity data using appropriate methods.
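The contingency-table machinery used in such binary forecast verification is simple to state concretely. Below is a minimal sketch (function and variable names are mine, not from the paper) that tabulates binary forecasts against binary outcomes and returns the hit rate and false-alarm rate defining one point on an ROC diagram:

```python
def contingency(forecasts, observations):
    """2x2 contingency counts: hits, false alarms, misses, correct negatives."""
    pairs = list(zip(forecasts, observations))
    a = sum(1 for f, o in pairs if f and o)          # hits
    b = sum(1 for f, o in pairs if f and not o)      # false alarms
    c = sum(1 for f, o in pairs if not f and o)      # misses
    d = sum(1 for f, o in pairs if not f and not o)  # correct negatives
    return a, b, c, d


def roc_point(a, b, c, d):
    """(hit rate, false-alarm rate): one point on the ROC diagram."""
    return a / (a + c), b / (b + d)


forecasts = [True, True, False, False, True, False]
observations = [True, False, False, True, True, False]
a, b, c, d = contingency(forecasts, observations)
print((a, b, c, d))           # (2, 1, 1, 2)
print(roc_point(a, b, c, d))  # hit rate 2/3, false-alarm rate 1/3
```

Sweeping a decision threshold over a continuous forecast variable and plotting the resulting (false-alarm rate, hit rate) pairs traces out the full ROC curve used to compare competing forecasts.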

  15. Verification of experimental dynamic strength methods with atomistic ramp-release simulations

    Science.gov (United States)

    Moore, Alexander P.; Brown, Justin L.; Lim, Hojun; Lane, J. Matthew D.

    2018-05-01

    Material strength and moduli can be determined from dynamic high-pressure ramp-release experiments using an indirect method of Lagrangian wave profile analysis of surface velocities. This method, termed self-consistent Lagrangian analysis (SCLA), has been difficult to calibrate and corroborate with other experimental methods. Using nonequilibrium molecular dynamics, we validate the SCLA technique by demonstrating that it accurately predicts the same bulk modulus, shear modulus, and strength as those calculated from the full stress tensor data, especially where strain-rate-induced relaxation effects and wave attenuation are small. We show here that introducing a hold in the loading profile at peak pressure gives improved accuracy in the shear moduli and relaxation-adjusted strength by reducing the effect of wave attenuation. When rate-dependent effects coupled with wave attenuation are large, we find that Lagrangian analysis overpredicts the maximum unload wavespeed, leading to increased error in the measured dynamic shear modulus. These simulations provide insight into the definition of dynamic strength, as well as a plausible explanation for experimental disagreement in reported dynamic strength values.
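In the elastic limit, the moduli extracted by such wave-profile analyses reduce to textbook relations between wavespeeds and moduli. The sketch below shows only that simplified, ambient-density relation (the illustrative aluminum numbers are approximate handbook values, not data from the paper), not the full SCLA analysis at compressed states:

```python
def shear_modulus(rho0, c_long, c_bulk):
    """Elastic shear modulus (Pa) from wavespeeds (m/s) and density (kg/m^3):
    rho0 * c_long**2 = K + 4G/3 and rho0 * c_bulk**2 = K, hence
    G = 0.75 * rho0 * (c_long**2 - c_bulk**2)."""
    return 0.75 * rho0 * (c_long ** 2 - c_bulk ** 2)


# Aluminum-like illustrative values: density 2700 kg/m^3,
# longitudinal wavespeed 6400 m/s, bulk wavespeed 5350 m/s.
G = shear_modulus(2700.0, 6400.0, 5350.0)
print(G / 1e9)  # ~25 GPa, close to aluminum's handbook shear modulus
```

The point of the SCLA method is that the unload wavespeeds are measured at the compressed state, so these relations are applied along the release path rather than at ambient conditions.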

  16. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  17. Dynamic CFD Simulations of the Supersonic Inflatable Aerodynamic Decelerator (SIAD) Ballistic Range Tests

    Science.gov (United States)

    Brock, Joseph M; Stern, Eric

    2016-01-01

    Dynamic CFD simulations of the SIAD ballistic test model were performed using the US3D flow solver. These simulations were performed to validate and verify the US3D flow solver as a viable computational tool for predicting dynamic coefficients.

  18. Forgery Detection by Local Correspondence

    National Research Council Canada - National Science Library

    Guo, Jinhong K

    2000-01-01

    .... Currently, signatures are verified only informally in many environments, but the rapid development of computer technology has stimulated great interest in research on automated signature verification...

  19. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION...

  20. 12 CFR 269b.731 - Signature.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Signature. 269b.731 Section 269b.731 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM CHARGES OF UNFAIR LABOR PRACTICES General Rules § 269b.731 Signature. The original of each document filed shall be...