WorldWideScience

Sample records for dynamic signature verification

  1. Piezoelectric sensor pen for dynamic signature verification

    Energy Technology Data Exchange (ETDEWEB)

    EerNisse, E.P.; Land, C.E.; Snelling, J.B.

    1977-01-01

    The concept of using handwriting dynamics for electronic identification is discussed. A piezoelectric sensor pen for obtaining the pen point dynamics during writing is described. Design equations are derived and details of an operating device are presented. Typical output waveforms are shown to demonstrate the operation of the pen and to show the dissimilarities between dynamics of a genuine signature and an attempted forgery.

  2. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on the pen-position data available from commercial point-of-sale hardware, achieving equal error rates of 1.5% to 4% on skilled forgeries and authentic signatures.

  3. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping (DTW) for matching. The system was tested on 900 signatures belonging to 50 participants: 3 reference signatures per user, plus 5 test signatures each from the original user, from simple impostors, and from trained impostors. The final system was tested with 50 participants and 3 references. Without impostors, system accuracy is 90.449% at threshold 44, with a rejection error (FNMR) of 5.2% and an acceptance error (FMR) of 4.351%; with impostors, system accuracy is 80.136% at threshold 27, with a rejection error (FNMR) of 15.6% and an average acceptance error (FMR) of 4.264%, broken down as follows: an acceptance error of 0.392%, an acceptance error for simple impostors of 3.2%, and an acceptance error for trained impostors of 9.2%.
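The DTW matching step the abstract describes can be sketched in a few lines. This is an illustrative, minimal dynamic-programming implementation assuming 1-D feature sequences; the paper's actual features, distance measure, and thresholds are not reproduced here.

```python
# Minimal dynamic time warping (DTW) sketch for matching two 1-D feature
# sequences (e.g. pen x-coordinates over time). Illustrative only.

def dtw_distance(a, b):
    """Return the DTW alignment cost between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# A tested signature is accepted when its DTW distance to the reference
# falls below a per-user threshold (values here are invented).
reference = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0]
tested    = [0.0, 1.0, 1.9, 3.1, 2.0, 1.0]
print(dtw_distance(reference, tested))  # small distance: likely genuine
```

DTW tolerates local timing differences between two writings of the same signature, which is why it suits online (time-series) verification.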

  4. Signature verification with writing posture analysis

    Science.gov (United States)

    Cheng, Hsu-Yung; Yu, Chih-Chang

    2013-07-01

    A video-based handwritten signature verification framework is proposed in this paper. Using a camera as the sensor has the advantage that the entire writing process can be captured along with the signature. The main contribution of this work is that writing postures are analyzed for verification, because writing postures cannot be easily imitated or forged. The proposed system achieves low false rejection rates while maintaining low false acceptance rates on databases containing both unskilled and skilled imitation signatures.

  5. State of the Art: Signature Biometrics Verification

    Directory of Open Access Journals (Sweden)

    Nourddine Guersi

    2010-04-01

    This paper presents a comparative analysis of the performance of three estimation algorithms based on Gaussian mixture models (GMMs) for signature biometrics verification: Expectation Maximization (EM), the Greedy EM algorithm (GEM), and the Figueiredo-Jain algorithm (FJ). The simulation results show significant performance achievements. Test performance of EER = 5.49% for EM, 5.04% for GEM, and 5.00% for FJ shows that the behavioral information scheme of signature biometrics is robust and has discriminating power, which can be exploited for identity authentication.

  6. Online Signature Verification on MOBISIG Finger-Drawn Signature Corpus

    Directory of Open Access Journals (Sweden)

    Margit Antal

    2018-01-01

    We present MOBISIG, a pseudosignature dataset containing finger-drawn signatures from 83 users, captured with a capacitive-touchscreen mobile device. The database was captured in three sessions, resulting in 45 genuine signatures and 20 skilled forgeries per user. It was evaluated with two state-of-the-art methods: a function-based system using local features and a feature-based system using global features. Two types of equal error rate (EER) computation are performed: one using a global threshold and the other using user-specific thresholds. The lowest EERs were 0.01% against random forgeries and 5.81% against skilled forgeries, using user-specific thresholds computed a posteriori. With global thresholds, however, these rose to 1.68% (random forgeries) and 14.31% (skilled forgeries). The same evaluation protocol was performed on the publicly available DooDB dataset. Besides the verification performance evaluations conducted on the two finger-drawn datasets, we evaluated the quality of the samples and the users of both datasets using basic quality measures. The results show that finger-drawn signatures can be used by biometric systems with reasonable accuracy.
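A global-threshold equal error rate of the kind reported above can be approximated as follows. The scores and the threshold sweep are invented for illustration; MOBISIG's actual evaluation protocol is more involved.

```python
# Sketch of equal-error-rate (EER) computation with one global threshold,
# as contrasted in the abstract with user-specific thresholds.

def far_frr(genuine, forgery, threshold):
    """False acceptance / false rejection rates at a score threshold.
    Convention: higher score = more similar to the enrolled user."""
    far = sum(s >= threshold for s in forgery) / len(forgery)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

def global_eer(genuine, forgery):
    """Sweep thresholds over all observed scores; return the point where
    FAR and FRR are closest (an approximation of the EER)."""
    best = None
    for t in sorted(set(genuine + forgery)):
        far, frr = far_frr(genuine, forgery, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2, t)
    return best[1], best[2]  # (approximate EER, threshold)

genuine = [0.9, 0.85, 0.8, 0.75, 0.6]   # invented similarity scores
forgery = [0.7, 0.5, 0.4, 0.3, 0.2]
eer, thr = global_eer(genuine, forgery)
print(eer, thr)
```

User-specific thresholds simply repeat this sweep per user over that user's own score distributions, which is why they typically yield lower EERs than one global cut-off.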

  7. An Optimized Signature Verification System for Vehicle Ad hoc NETwork

    OpenAIRE

    Mamun, Mohammad Saiful Islam; Miyaji, Atsuko

    2012-01-01

    This paper presents an efficient approach to an existing batch verification system on identity-based group signatures (IBGS), which can be applied to any mobile ad hoc network device, including Vehicle Ad hoc Networks (VANET). We propose an optimized way to batch signatures in order to get maximum throughput from a device in a runtime environment. In addition, we minimize the number of pairing computations in the batch verification proposed by B. Qin et al. for large-scale VANET. We introduce a batch...

  8. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    The signature of a person is one of the most popular and legally accepted behavioral biometrics, providing a secure means of verification and personal identification in applications such as financial, commercial, and legal transactions. The objective of a signature verification system is to discriminate between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features: it contains diacritics, ligatures, and overlapping. Because no dynamic information from the writing process is available offline, achieving high verification accuracy for Arabic signatures is more difficult. This paper addresses this difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple-feature fusion with fuzzy modeling to capture different aspects of a signature individually, in order to improve verification accuracy. State-of-the-art techniques adopt fuzzy sets to describe the properties of the extracted features and handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features, dealing with the ambiguity of a questioned-document examiner's judgment of signature similarity. The experimental results show that the verification system performs well and is able to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).

  9. Neural network signature verification using Haar wavelet and Fourier transforms

    Science.gov (United States)

    McCormack, Daniel K. R.; Brown, B. M.; Pedersen, John F.

    1993-08-01

    This paper discusses the use of neural networks for handwritten signature verification, using the Fourier and Haar wavelet transforms as methods of encoding signature images. Results are presented on a network's ability to generalize to unseen signatures from wavelet-encoded training data. These results are discussed with reference to both Backpropagation and Cascade-Correlation networks, which are used to compare and contrast the generalization ability of Haar wavelet and Fourier transform encoded signature data.
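As a rough illustration of the Haar encoding idea, one level of the 1-D Haar wavelet transform looks like this. The paper encodes 2-D signature images; this sketch only shows the averaging/differencing principle on an invented 1-D signal.

```python
# One level of the 1-D Haar wavelet transform: pairwise averages capture
# the coarse shape, pairwise differences (details) capture local change.
import math

def haar_step(signal):
    """Split a length-2n signal into n averages and n details."""
    assert len(signal) % 2 == 0
    averages = [(signal[i] + signal[i + 1]) / math.sqrt(2)
                for i in range(0, len(signal), 2)]
    details  = [(signal[i] - signal[i + 1]) / math.sqrt(2)
                for i in range(0, len(signal), 2)]
    return averages, details

avg, det = haar_step([4.0, 4.0, 2.0, 0.0])
print(avg, det)  # details are zero wherever neighbours are equal
```

Recursing on the averages yields the multi-level transform; the resulting coefficients form a compact encoding that can be fed to a network in place of raw pixels.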

  10. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    The broad use of handwritten signatures for personal verification in financial institutions creates the need for a robust automatic signature verification tool, aimed at reducing fraud in all related financial-transaction sectors. This paper proposes an online, robust, automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a customer's handwritten signature is captured, several pre-processing steps are performed, including filtering and detection of the signature edges. Afterwards, a feature extraction process is applied to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process compares the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and speed of the developed technique, which are the main criteria for judging a signature verification tool in banking and other financial institutions.

  11. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, a user's fingerprint is matched against the single fingerprint associated with the identity the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  12. Streaming-based verification of XML signatures in SOAP messages

    DEFF Research Database (Denmark)

    Somorovsky, Juraj; Jensen, Meiko; Schwenk, Jörg

    2010-01-01

    …approach for XML processing, the Web Services servers easily become a target of Denial-of-Service attacks. We present a solution to these problems: an external streaming-based WS-Security Gateway. Our implementation is capable of processing XML Signatures in SOAP messages using a streaming-based approach...

  13. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures; this has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum is validated based on statistical properties derived from an ideal population set; the quality of the test spectrum is ranked by a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining, which examines the syntax of the meta-data to find local learning patterns and trends indicative of the test spectrum's quality. Text mining has been successfully applied in security (text encryption/decryption), biomedical, and marketing applications. The text-mining lexical-analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum in a database without the need for an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
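The SAM comparison mentioned above reduces to the angle between two spectra treated as vectors: a small angle means similar spectral shape regardless of overall brightness. A minimal sketch (the spectra are invented values):

```python
# Spectral angle mapper (SAM): angle between a test spectrum and a
# reference spectrum (here, a hypothetical population mean).
import math

def spectral_angle(s1, s2):
    """Angle in radians between two spectra treated as vectors."""
    dot = sum(a * b for a, b in zip(s1, s2))
    n1 = math.sqrt(sum(a * a for a in s1))
    n2 = math.sqrt(sum(b * b for b in s2))
    # clamp guards against floating-point overshoot outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

mean_spectrum = [0.2, 0.4, 0.6, 0.8]
test_spectrum = [0.1, 0.2, 0.3, 0.4]   # same shape, half the intensity
print(spectral_angle(mean_spectrum, test_spectrum))  # ~0: good match
```

Because SAM depends only on direction, it is insensitive to uniform illumination or gain differences, which is why it is a common quality metric for spectral libraries.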

  14. Hill-Climbing Attacks and Robust Online Signature Verification Algorithm against Hill-Climbing Attacks

    Science.gov (United States)

    Muramatsu, Daigo

    Attacks using hill-climbing methods have been reported as a vulnerability of biometric authentication systems. In this paper, we propose a robust online signature verification algorithm against such attacks. Specifically, the attack considered in this paper is a hill-climbing forged data attack. Artificial forgeries are generated offline by using the hill-climbing method, and the forgeries are input to a target system to be attacked. In this paper, we analyze the menace of hill-climbing forged data attacks using six types of hill-climbing forged data and propose a robust algorithm by incorporating the hill-climbing method into an online signature verification algorithm. Experiments to evaluate the proposed system were performed using a public online signature database. The proposed algorithm showed improved performance against this kind of attack.
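A toy version of a hill-climbing attack makes the threat concrete: repeatedly perturb a candidate input and keep only the changes that raise the matcher's score. The matcher below is a stand-in similarity function, not the verification algorithm studied in the paper.

```python
# Toy hill-climbing forged-data attack against a score-emitting matcher.
import random

def matcher_score(candidate, enrolled):
    """Stand-in similarity score (negative squared distance)."""
    return -sum((c - e) ** 2 for c, e in zip(candidate, enrolled))

def hill_climb(enrolled, dims, steps=500, seed=0):
    rng = random.Random(seed)
    candidate = [0.0] * dims            # attacker starts from nothing
    score = matcher_score(candidate, enrolled)
    for _ in range(steps):
        i = rng.randrange(dims)
        trial = list(candidate)
        trial[i] += rng.uniform(-0.5, 0.5)
        trial_score = matcher_score(trial, enrolled)
        if trial_score > score:         # keep only improving changes
            candidate, score = trial, trial_score
    return candidate, score

enrolled = [1.0, -0.5, 0.3]             # invented template
forged, final = hill_climb(enrolled, dims=3)
print(final)  # score rises toward 0 as the forgery nears the template
```

The defense proposed in the paper works by making the reported score less informative to such an attacker; the sketch shows why a smooth, freely queryable score is dangerous.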

  15. Dynamical Signatures of Living Systems

    Science.gov (United States)

    Zak, M.

    1999-01-01

    One of the main challenges in modeling living systems is to distinguish a random walk of physical origin (for instance, Brownian motion) from one of biological origin, and that distinction constitutes the starting point of the proposed approach. As conjectured, the biological random walk must be nonlinear. Indeed, any stochastic Markov process can be described by a linear Fokker-Planck equation (or its discretized version), and only that type of process has been observed in the inanimate world. However, all such processes always converge to a stable (ergodic or periodic) state, i.e., to states of lower complexity and high entropy. At the same time, the evolution of living systems is directed toward a higher level of complexity, if complexity is associated with the number of structural variations. The simplest way to mimic such a tendency is to incorporate a nonlinearity into the random walk; the probability evolution then attains the features of a nonlinear diffusion equation: the formation and dissipation of shock waves initiated by small shallow-wave disturbances. As a result, the evolution never "dies": it produces new, different configurations accompanied by an increase or decrease of entropy (the decrease takes place during the formation of shock waves, the increase during their dissipation). In other words, the evolution can be directed "against the second law of thermodynamics" by forming patterns outside of equilibrium in the probability space. Due to that, a species is not locked into a certain pattern of behavior: it can still perform a variety of motions, and only the statistics of those motions is constrained by the pattern. It should be emphasized that such a "twist" is based upon the concept of reflection, i.e., the existence of a self-image (adopted from psychology). The model consists of a generator of stochastic processes, which represents the motor dynamics in the form of nonlinear random walks, and a simulator of the nonlinear version of the diffusion

  16. Blockwise Binary Pattern: a Robust and AN Efficient Approach for Offline Signature Verification

    Science.gov (United States)

    Shekar, B. H.; Pilar, B.; Sunil, K. D. S.

    2017-05-01

    This paper presents a variant of the local binary pattern called the Blockwise Binary Pattern (BBP) for offline signature verification. The proposed approach has three major phases: preprocessing, feature extraction, and classification. In the feature extraction phase, the signature is divided into 3 × 3 neighborhood blocks. A BBP value for the central pixel of each block is computed from its 8 neighboring pixels, and the 3 × 3 block is replaced by this central pixel. To compute the BBP value of a block, a binary sequence is formed from the 8 neighbors of the central pixel, following the pixels in an anti-clockwise direction; the minimum decimal equivalent of this binary sequence is then computed and assigned to the central pixel, which thereafter represents the 3 × 3 neighborhood block. This method is found to be invariant to rotation, scaling, and shift of the signature. The features are stored as a normalized histogram, and an SVM classifier is used for verification. Experiments have been performed on the standard, publicly available English signature datasets CEDAR and GPDS, and on MUKOS, a regional-language (Kannada) dataset, and the results are compared with well-known approaches to exhibit the performance of the proposed approach.
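The per-block computation described above can be sketched as follows. The anti-clockwise neighbour ordering, the ≥ comparison, and taking the minimum over cyclic rotations are assumptions where the abstract is not fully specific; they follow standard rotation-invariant LBP practice.

```python
# Sketch of the Blockwise Binary Pattern idea: threshold the 8 neighbours
# of a central pixel against it, read the bits anti-clockwise, and take the
# minimum decimal value over all rotations (rotation invariance).

def bbp_value(block):
    """block: 3x3 list of lists; returns the rotation-minimal pattern."""
    c = block[1][1]
    # 8 neighbours in an anti-clockwise order starting at the top-left
    neighbours = [block[0][0], block[1][0], block[2][0], block[2][1],
                  block[2][2], block[1][2], block[0][2], block[0][1]]
    bits = [1 if p >= c else 0 for p in neighbours]
    # minimum decimal equivalent over all 8 cyclic rotations
    values = []
    for r in range(8):
        rotated = bits[r:] + bits[:r]
        values.append(sum(b << i for i, b in enumerate(reversed(rotated))))
    return min(values)

block = [[9, 1, 1],
         [9, 5, 1],
         [1, 1, 1]]
print(bbp_value(block))
```

Collecting these values over all blocks of a signature image and histogramming them yields the feature vector fed to the SVM.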

  17. A New Approach for High Pressure Pixel Polar Distribution on Off-line Signature Verification

    Directory of Open Access Journals (Sweden)

    Jesús F. Vargas

    2010-06-01

    Features representing the information of high-pressure points in a static image of a handwritten signature are analyzed for an offline verification system. A new approach for estimating the high-pressure threshold from grayscale images is proposed. Two images, one containing the extracted high-pressure points and the other a binary version of the original signature, are transformed to polar coordinates, where a pixel-density ratio between them is calculated. The polar space is divided into angular and radial segments, which permit a local analysis of the high-pressure distribution. Finally, two vectors containing the density-distribution ratio are calculated for the nearest and farthest points from the geometric center of the original signature image. Experiments were carried out using a database containing signatures from 160 individuals. The robustness of the analyzed system against simple forgeries is tested with Support Vector Machine models. For the sake of completeness, a comparison of the results obtained by the proposed approach with similar published works is presented.

  18. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. 
A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different

  19. On the pinned field image binarization for signature generation in image ownership verification method

    Directory of Open Access Journals (Sweden)

    Chang Hsuan

    2011-01-01

    The issue of pinned field image binarization for signature generation in the ownership verification of a protected image is investigated. The pinned field exploits the texture information of the protected image and can be employed to enhance watermark robustness. In the proposed method, four optimization schemes are utilized to determine the threshold values for transforming the pinned field into a binary feature image, which is then utilized to generate an effective signature image. Experimental results show that the optimization schemes significantly improve the signature robustness over the previous method (Lee and Chang, Opt. Eng. 49(9), 097005, 2010). Considering both the watermark retrieval rate and the computation speed, the genetic algorithm is strongly recommended. In addition, compared with Chang and Lin's scheme (J. Syst. Softw. 81(7), 1118-1129, 2008), the proposed scheme also has better performance.

  20. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers.

    Science.gov (United States)

    Rosso, Osvaldo A; Ospina, Raydonal; Frery, Alejandro C

    2016-01-01

    We present a new approach for handwritten signature classification and verification based on descriptors stemming from time-causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a one-class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces, which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups.
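One of the descriptors named above, the Shannon entropy over the Bandt-Pompe symbolization (permutation entropy), can be sketched like this. The window length and the normalization choice are illustrative, not taken from the paper.

```python
# Bandt-Pompe symbolization plus normalized Shannon entropy: each window of
# length d is replaced by the permutation that sorts it, and entropy is
# computed over the frequencies of those ordinal patterns.
import math
from collections import Counter

def permutation_entropy(series, d=3):
    """Normalized Shannon entropy of ordinal patterns of length d."""
    patterns = Counter()
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        # the ordinal pattern: indices that sort the window
        pattern = tuple(sorted(range(d), key=lambda k: window[k]))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((n / total) * math.log(n / total) for n in patterns.values())
    return h / math.log(math.factorial(d))  # normalize to [0, 1]

monotone = [1, 2, 3, 4, 5, 6, 7, 8]  # a single ordinal pattern
print(permutation_entropy(monotone))
```

Applied to the x(t) and y(t) coordinate series of a signature, this yields two of the six cheap, low-dimensional features the classifier consumes.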

  1. The Comparison of Signature Verification Result Using 2DPCA Method and SSE Method

    Directory of Open Access Journals (Sweden)

    Anita Sindar R M Sinaga

    2018-02-01

    Verification speed and validity serve as benchmarks of information quality and reliable results. Everyone's signature has distinctive characteristics, but it is difficult to match an original signature against a copy. The Two-Dimensional Principal Component Analysis (2DPCA) method and the Sum Equal Error (SSE) method can provide data verification accuracy of 90%-98%. Scanned signatures are converted from RGB to grayscale to black-and-white (binary) images. The extraction process of each method requires experimental data, in pixels, as its source: a digital image consists of a collection of pixels, and each image is converted into a matrix. In the 2DPCA preprocessing, the data are divided into training data and testing data; in the SSE extraction, the histogram value and total black-pixel count are computed for each sample. This study compares the suitability of the extraction results of the two methods. Both achieve a data accuracy rate of 97%-98%; comparing image verification accuracy, 2DPCA achieves 97% against 96% for SSE. The same data source is used to test the 2DPCA method against the SSE method.

  2. Online Signature Verification: To What Extent Should a Classifier be Trusted in?

    Directory of Open Access Journals (Sweden)

    Marianela Parodi

    2017-08-01

    Selecting the best features to model signatures is one of the major challenges in online signature verification, and combining feature sets selected by different criteria is a useful technique for addressing it. Research has mostly focused on the discriminative power of individual features, paying less attention to how the different kinds of features are combined; moreover, the conflicting results that may appear when several classifiers are used have rarely been taken into account. In this paper, a score-level fusion scheme is proposed to combine three different and meaningful feature sets: an automatically selected feature set, a feature set relevant to Forensic Handwriting Experts (FHEs), and a global feature set. The score-level fusion is performed within the framework of Belief Function Theory (BFT), in order to address the conflicting results appearing when multiple classifiers are used. Two models, the Denoeux and the Appriou models, are used to embed the problem within this framework, where the fusion is performed using two well-known combination rules: Dempster-Shafer (DS) and Proportional Conflict Redistribution (PCR5). To analyze the robustness of the proposed score-level fusion approach, the combination is performed for the same verification system using two different classification techniques, Random Forests (RF) and Support Vector Machines (SVM). Experimental results on a publicly available database show that the proposed approach gives the system a very good trade-off between verification results and reliability.
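The Dempster-Shafer combination rule used in the fusion above can be illustrated on a two-hypothesis frame {genuine, forged}. The mass values below are invented, and the sketch omits the Denoeux/Appriou modeling step that turns classifier scores into masses.

```python
# Dempster's rule on the frame {genuine, forged}: masses live on subsets
# (frozensets), conjunctive combination, conflict renormalized away.

G, F = frozenset({"genuine"}), frozenset({"forged"})
EITHER = G | F  # total ignorance

def combine(m1, m2):
    """Dempster's rule: conjunctive combination + conflict normalization."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass on contradictory hypotheses
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

# two classifiers, each leaving some mass on "either" (uncertainty)
m_rf  = {G: 0.6, F: 0.1, EITHER: 0.3}
m_svm = {G: 0.5, F: 0.2, EITHER: 0.3}
fused = combine(m_rf, m_svm)
print(fused[G], fused[F])  # agreement on "genuine" is reinforced
```

Keeping explicit mass on EITHER is what lets the fused decision report reliability, not just a score, which is the trade-off the abstract highlights.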

  3. Multimodal three-dimensional dynamic signature

    Directory of Open Access Journals (Sweden)

    Yury E. Kozlov

    2017-11-01

    Reliable authentication in mobile applications is among the most important information security challenges. Today, we can hardly imagine a person who does not own a mobile device that connects to the Internet. Mobile devices are used to store large amounts of confidential information, ranging from personal photos to electronic banking tools. In 2009, colleagues from Rice University, together with collaborators from Motorola, proposed authentication through in-air gestures. This and subsequent work contributing to the development of the method are reviewed in our introduction. At the moment, a version of gesture-based authentication software is available for Android mobile devices, but it has not become widespread; one likely reason is the insufficient reliability of the method, which, like its earlier analogs, relies on only one device. Here we discuss authentication based on a multimodal three-dimensional dynamic signature (MTDS) performed with two independent mobile devices. The MTDS-based authentication technique is an advanced version of in-air gesture authentication. We describe the operation of a prototype of MTDS-based authentication, including the main implemented algorithms, as well as some preliminary results of testing the software. We expect that our method can be used in any mobile application, provided a number of additional improvements discussed in the conclusion are made.

  4. A Behavioral Handwriting Model for Static and Dynamic Signature Synthesis.

    Science.gov (United States)

    Ferrer, Miguel A; Diaz, Moises; Carmona-Duarte, Cristina; Morales, Aythami

    2017-06-01

    The synthetic generation of static handwritten signatures based on motor equivalence theory has been recently proposed for biometric applications. Motor equivalence divides the human handwriting action into an effector dependent cognitive level and an effector independent motor level. The first level has been suggested by others as an engram, generated through a spatial grid, and the second has been emulated with kinematic filters. Our paper proposes a development of this methodology in which we generate dynamic information and provide a unified comprehensive synthesizer for both static and dynamic signature synthesis. The dynamics are calculated by lognormal sampling of the 8-connected continuous signature trajectory, which includes, as a novelty, the pen-ups. The forgery generation imitates a signature by extracting the most perceptually relevant points of the given genuine signature and interpolating them. The capacity to synthesize both static and dynamic signatures using a unique model is evaluated according to its ability to adapt to the static and dynamic signature inter- and intra-personal variability. Our highly promising results suggest the possibility of using the synthesizer in different areas beyond the generation of unlimited databases for biometric training.

  5. A Dynamic Continuous Signature Monitoring Technique for Reliable Microprocessors

    OpenAIRE

    Sugihara, Makoto; 杉原, 真

    2011-01-01

    Reliability issues such as soft errors and NBTI (negative bias temperature instability) have become a matter of concern as integrated circuits continue to shrink. It is increasingly important to take reliability requirements into account even for consumer products. This paper presents a dynamic continuous signature monitoring (DCSM) technique for highly reliable computer systems. The DCSM technique dynamically generates reference signatures as well as runtime ones during executing a p...

  6. Temporal logic runtime verification of dynamic systems

    CSIR Research Space (South Africa)

    Seotsanyana, M

    2010-07-01

    Full Text Available Robotic computer systems are increasingly ubiquitous in everyday life and this has led to a need to develop safe and reliable systems. To ensure safety and reliability of these systems, the following three main verification techniques are usually...

  7. Structural signatures of dynamic heterogeneities in monolayers of colloidal ellipsoids

    NARCIS (Netherlands)

    Zheng, Z.Y.; Ni, R.; Wang, F.; Dijkstra, M.; Wang, Y.R.; Han, Y.L.

    2014-01-01

    When a liquid is supercooled towards the glass transition, its dynamics drastically slows down, whereas its static structure remains relatively unchanged. Finding a structural signature of the dynamic slowing down is a major challenge, yet it is often too subtle to be uncovered. Here we discover the

  8. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  9. Dynamics Verification Experiment of the Stewart Parallel Manipulator

    Directory of Open Access Journals (Sweden)

    Zhu-Feng Shao

    2015-10-01

    Full Text Available As the basis of dynamic analysis and driving force calculation, dynamic models and dynamic parameters are important issues in mechanical design and control. In this paper, a dynamics verification experiment, which covers both dynamic models and dynamic parameters as a whole, is carried out on the typical Stewart parallel manipulator. First, the complete dynamic model of the Stewart manipulator is derived, considering the force sensors. The Newton-Euler method, with its clear physical meaning, is adopted to facilitate understanding and parameter definitions. The dynamic parameters are deduced from the established three-dimensional virtual prototype and adjusted with actual measurements. The recorded trajectory, instead of the theoretical trajectory, is adopted to calculate the theoretical limb forces. The practical limb forces are measured using pull pressure sensors. Finally, the dynamic model and identified parameters are verified by comparing the limb forces obtained using the above two approaches. Experiment results show that theoretical and practical limb forces coincide well, with a small maximum RMS (root mean square) error of 1.516 N for forces ranging from 10 N to 40 N. Additionally, the established dynamics verification procedure, which involves dynamic modelling, a parameter identification approach and a data analysis method, is generic and practical, and can be flexibly applied to the dynamic analysis of other parallel manipulators.
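    The comparison of theoretical and measured limb forces above reduces to a root-mean-square error over the recorded trajectory. A minimal sketch of that comparison (the function and the sample force values are illustrative, not the paper's data):

    ```python
    import math

    def rms_error(theoretical, measured):
        """Root-mean-square error between two equally long force series (in newtons)."""
        if len(theoretical) != len(measured):
            raise ValueError("force series must have equal length")
        n = len(theoretical)
        return math.sqrt(sum((t - m) ** 2 for t, m in zip(theoretical, measured)) / n)

    # Illustrative limb-force samples in the paper's 10 N - 40 N range:
    theory = [12.0, 25.5, 38.0, 21.0]
    sensor = [12.4, 24.9, 39.1, 20.2]
    print(round(rms_error(theory, sensor), 3))  # → 0.77
    ```

    In the paper the "theoretical" series comes from the dynamic model driven by the recorded trajectory and the "measured" series from the pull pressure sensors; the RMS error then summarizes how well model and hardware agree.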

  10. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    Full Text Available The article presents the results of verification of the dynamic model of a drive system with gear. Tests were carried out on the real object in different operating conditions. Simulation studies were also carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research with simulations using the dynamic model.

  11. Understanding the dynamic ionospheric signature of the plasmapause (Invited)

    Science.gov (United States)

    Moldwin, M.; Sibanda, P.; Zou, S.; Yizengaw, E.

    2010-12-01

    The equatorial edge of the mid-latitude trough has been shown to be an ionospheric signature of the plasmapause from both ground-based and space-based observations. However, identifying the trough is not always possible due to broad latitudinal density gradients, local time and seasonal effects, and storm and substorm dynamics. We review the current methods of identifying the trough from ground and space-based observations and describe the main deficiencies in these methods especially for tracking the trough/plasmapause during storms and substorms. We discuss the ionospheric signature of plasmaspheric plumes and their relationship to trough/plasmapause signatures. We conclude with some new multi-instrument observations that help clarify the ionospheric trough signature during geomagnetically active periods.

  12. Early signatures of regime shifts in complex dynamical systems

    Indian Academy of Sciences (India)

    2015-02-05

    Feb 5, 2015 ... A large number of studies have recently been carried out on the early signatures of regime shifts in a number of dynamical systems, e.g., ecosystems, the climate, fish and wildlife populations, ... Noise-induced regime shifts are also possible for which the vicinity of the bifurcation point is not essential. In this ...

  13. Early signatures of regime shifts in complex dynamical systems

    Indian Academy of Sciences (India)

    2015-02-05

    Feb 5, 2015 ... Abstract. A large number of studies have recently been carried out on the early signatures of regime shifts in a number of dynamical systems, e.g., ecosystems, the climate, fish and wildlife populations, financial markets, complex diseases and gene circuits. The underlying model in most cases is that of the ...

  14. Early signatures of regime shifts in gene expression dynamics

    International Nuclear Information System (INIS)

    Pal, Mainak; Pal, Amit Kumar; Ghosh, Sayantari; Bose, Indrani

    2013-01-01

    Recently, a large number of studies have been carried out on the early signatures of sudden regime shifts in systems as diverse as ecosystems, financial markets, population biology and complex diseases. The signatures of regime shifts in gene expression dynamics are less systematically investigated. In this paper, we consider sudden regime shifts in the gene expression dynamics described by a fold-bifurcation model involving bistability and hysteresis. We consider two alternative models, models 1 and 2, of competence development in the bacterial population B. subtilis and determine some early signatures of the regime shifts between competence and noncompetence. We use both deterministic and stochastic formalisms for the purpose of our study. The early signatures studied include the critical slowing down as a transition point is approached, rising variance and the lag-1 autocorrelation function, skewness and a ratio of two mean first passage times. Some of the signatures could provide the experimental basis for distinguishing between bistability and excitability as the correct mechanism for the development of competence. (paper)
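    Two of the early signatures named above, rising variance and the lag-1 autocorrelation, can be computed directly from a time series, typically within a sliding window as the transition point is approached. A minimal sketch (illustrative, not the authors' code):

    ```python
    def variance(x):
        """Population variance of a time series."""
        mean = sum(x) / len(x)
        return sum((v - mean) ** 2 for v in x) / len(x)

    def lag1_autocorrelation(x):
        """Lag-1 autocorrelation: covariance of consecutive values over the variance."""
        n = len(x)
        mean = sum(x) / n
        denom = sum((v - mean) ** 2 for v in x)
        num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
        return num / denom

    # Critical slowing down makes consecutive values track each other,
    # so the lag-1 autocorrelation of a window rises toward 1:
    window = [1.0, 2.0, 3.0, 4.0, 5.0]   # strongly trending, hence correlated
    print(lag1_autocorrelation(window))  # → 0.4
    ```

    Applied over successive windows of, e.g., gene expression levels, a sustained rise in both indicators would flag the approach of a regime shift.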

  15. A Prototype of Mathematical Treatment of Pen Pressure Data for Signature Verification.

    Science.gov (United States)

    Li, Chi-Keung; Wong, Siu-Kay; Chim, Lai-Chu Joyce

    2018-01-01

    A prototype using simple mathematical treatment of the pen pressure data recorded by a digital pen movement recording device was derived. In this study, a total of 48 sets of signature and initial specimens were collected. Pearson's correlation coefficient was used to compare the data of the pen pressure patterns. From the 820 pair comparisons of the 48 sets of genuine signatures, a high degree of matching was found, in which 95.4% (782 pairs) and 80% (656 pairs) had rPA > 0.7 and rPA > 0.8, respectively. In the comparison of the 23 forged signatures with their corresponding control signatures, 20 of them (89.2% of pairs) had rPA values … The prototype could be used as a complementary technique to improve the objectivity of signature examination and also has good potential to be developed as a tool for automated signature identification. © 2017 American Academy of Forensic Sciences.
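    The pairwise comparison described above rests on Pearson's correlation coefficient between two pen-pressure series. A minimal sketch, assuming the two series have already been aligned and resampled to equal length (an assumption; the paper's preprocessing is not detailed here):

    ```python
    import math

    def pearson_r(a, b):
        """Pearson correlation between two equal-length pen-pressure series."""
        n = len(a)
        mean_a, mean_b = sum(a) / n, sum(b) / n
        cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
        sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
        sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
        return cov / (sd_a * sd_b)

    # Two pressure traces that rise and fall together correlate strongly,
    # which is the behavior the rPA > 0.7 and rPA > 0.8 thresholds capture:
    genuine_1 = [0.1, 0.5, 0.9, 0.6, 0.2]
    genuine_2 = [0.2, 0.6, 0.8, 0.5, 0.1]
    print(pearson_r(genuine_1, genuine_2) > 0.8)  # → True
    ```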

  16. Offline Signature Verification Using the Discrete Radon Transform and a Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    J. Coetzer

    2004-04-01

    Full Text Available We developed a system that automatically authenticates offline handwritten signatures using the discrete Radon transform (DRT and a hidden Markov model (HMM. Given the robustness of our algorithm and the fact that only global features are considered, satisfactory results are obtained. Using a database of 924 signatures from 22 writers, our system achieves an equal error rate (EER of 18% when only high-quality forgeries (skilled forgeries are considered and an EER of 4.5% in the case of only casual forgeries. These signatures were originally captured offline. Using another database of 4800 signatures from 51 writers, our system achieves an EER of 12.2% when only skilled forgeries are considered. These signatures were originally captured online and then digitally converted into static signature images. These results compare well with the results of other algorithms that consider only global features.
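    The equal error rate (EER) quoted above is the operating point at which the false acceptance and false rejection rates coincide. A minimal sketch of estimating it by threshold sweep, assuming higher scores indicate a more genuine-looking signature (the scoring convention is an assumption, not taken from the paper):

    ```python
    def equal_error_rate(genuine_scores, forgery_scores):
        """Estimate the EER by sweeping a decision threshold over all observed scores."""
        best_gap, eer = None, None
        for t in sorted(set(genuine_scores) | set(forgery_scores)):
            frr = sum(s < t for s in genuine_scores) / len(genuine_scores)   # false rejections
            far = sum(s >= t for s in forgery_scores) / len(forgery_scores)  # false acceptances
            gap = abs(far - frr)
            if best_gap is None or gap < best_gap:
                best_gap, eer = gap, (far + frr) / 2
        return eer

    # Hypothetical HMM match scores; skilled forgeries overlap the genuine range:
    genuine = [0.9, 0.8, 0.7, 0.6]
    skilled_forgeries = [0.65, 0.5, 0.4, 0.3]
    print(equal_error_rate(genuine, skilled_forgeries))  # → 0.25
    ```

    Reporting the EER lets systems with different score scales, such as the DRT/HMM system above, be compared on a single number.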

  17. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  18. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    Energy Technology Data Exchange (ETDEWEB)

    Van Buren, Kendra L. [Los Alamos National Laboratory; Canfield, Jesse M. [Los Alamos National Laboratory; Hemez, Francois M. [Los Alamos National Laboratory; Sauer, Jeremy A. [Los Alamos National Laboratory

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
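    The rate-of-convergence estimation mentioned above is commonly done by comparing the discretization error against the analytical solution on two meshes. A minimal sketch, assuming a fixed refinement ratio (the details of HIGRAD's own procedure are not given here):

    ```python
    import math

    def observed_order(error_coarse, error_fine, refinement_ratio=2.0):
        """Observed order of accuracy p from errors on two meshes:
        p = log(e_coarse / e_fine) / log(r), where r = h_coarse / h_fine."""
        return math.log(error_coarse / error_fine) / math.log(refinement_ratio)

    # A second-order scheme cuts the error by ~4x when the mesh spacing is halved:
    print(round(observed_order(4.0e-3, 1.0e-3), 2))  # → 2.0
    ```

    Agreement between the observed order and the scheme's formal order is the standard evidence that the discretization is implemented correctly.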

  19. Antisense transcription-dependent chromatin signature modulates sense transcript dynamics.

    Science.gov (United States)

    Brown, Thomas; Howe, Françoise S; Murray, Struan C; Wouters, Meredith; Lorenz, Philipp; Seward, Emily; Rata, Scott; Angel, Andrew; Mellor, Jane

    2018-02-12

    Antisense transcription is widespread in genomes. Despite large differences in gene size and architecture, we find that yeast and human genes share a unique, antisense transcription-associated chromatin signature. We asked whether this signature is related to a biological function for antisense transcription. Using quantitative RNA-FISH, we observed changes in sense transcript distributions in nuclei and cytoplasm as antisense transcript levels were altered. To determine the mechanistic differences underlying these distributions, we developed a mathematical framework describing transcription from initiation to transcript degradation. At GAL1 , high levels of antisense transcription alter sense transcription dynamics, reducing rates of transcript production and processing, while increasing transcript stability. This relationship with transcript stability is also observed as a genome-wide association. Establishing the antisense transcription-associated chromatin signature through disruption of the Set3C histone deacetylase activity is sufficient to similarly change these rates even in the absence of antisense transcription. Thus, antisense transcription alters sense transcription dynamics in a chromatin-dependent manner. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.

  20. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error-free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field with respect to its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: The SoftCare tool for the SFMEA and SFTA

  1. MERGER SIGNATURES IN THE DYNAMICS OF STAR-FORMING GAS

    International Nuclear Information System (INIS)

    Hung, Chao-Ling; Sanders, D. B.; Hayward, Christopher C.; Smith, Howard A.; Ashby, Matthew L. N.; Martínez-Galarza, Juan R.; Zezas, Andreas; Lanz, Lauranne

    2016-01-01

    The recent advent of integral field spectrographs and millimeter interferometers has revealed the internal dynamics of many hundreds of star-forming galaxies. Spatially resolved kinematics have been used to determine the dynamical status of star-forming galaxies with ambiguous morphologies, and to constrain the importance of galaxy interactions during the assembly of galaxies. However, measuring the importance of interactions or galaxy merger rates requires knowledge of the systematics in kinematic diagnostics and of the time during which merger indicators are visible. We analyze the dynamics of star-forming gas in a set of binary merger hydrodynamic simulations with stellar mass ratios of 1:1 and 1:4. We find that the evolution of kinematic asymmetries traced by star-forming gas mirrors morphological asymmetries derived from mock optical images, in which both merger indicators show the largest deviation from isolated disks during strong interaction phases. Based on a series of simulations with various initial disk orientations, orbital parameters, gas fractions, and mass ratios, we find that the merger signatures are visible for ∼0.2–0.4 Gyr with kinematic merger indicators but can be approximately twice as long for equal-mass mergers of massive gas-rich disk galaxies designed to be analogs of z ∼ 2–3 submillimeter galaxies. Merger signatures are most apparent after the second passage and before the black holes coalesce, but in some cases they persist up to several hundred Myr after coalescence. About 20%–60% of the simulated galaxies are not identified as mergers during the strong interaction phase, implying that galaxies undergoing a violent merging process do not necessarily exhibit highly asymmetric kinematics in their star-forming gas. The lack of identifiable merger signatures in this population can lead to an underestimation of merger abundances in star-forming galaxies, and including them in samples of star-forming disks may bias the measurements of disk

  2. Massive Black Hole Binaries: Dynamical Evolution and Observational Signatures

    Directory of Open Access Journals (Sweden)

    M. Dotti

    2012-01-01

    Full Text Available The study of the dynamical evolution of massive black hole pairs in mergers is crucial in the context of a hierarchical galaxy formation scenario. The timescales for the formation and the coalescence of black hole binaries are still poorly constrained, resulting in large uncertainties in the expected rate of massive black hole binaries detectable in the electromagnetic and gravitational wave spectra. Here, we review the current theoretical understanding of black hole pairing in galaxy mergers, with particular attention to recent developments and open issues. We conclude with a review of the expected observational signatures of massive binaries and of the candidates discussed in the literature to date.

  3. Inferring dynamic signatures of microbes in complex host ecosystems.

    Directory of Open Access Journals (Sweden)

    Georg K Gerber

    Full Text Available The human gut microbiota comprise a complex and dynamic ecosystem that profoundly affects host development and physiology. Standard approaches for analyzing time-series data of the microbiota involve computation of measures of ecological community diversity at each time-point, or measures of dissimilarity between pairs of time-points. Although these approaches, which treat data as static snapshots of microbial communities, can identify shifts in overall community structure, they fail to capture the dynamic properties of individual members of the microbiota and their contributions to the underlying time-varying behavior of host ecosystems. To address the limitations of current methods, we present a computational framework that uses continuous-time dynamical models coupled with Bayesian dimensionality adaptation methods to identify time-dependent signatures of individual microbial taxa within a host as well as across multiple hosts. We apply our framework to a publicly available dataset of 16S rRNA gene sequences from stool samples collected over ten months from multiple human subjects, each of whom received repeated courses of oral antibiotics. Using new diversity measures enabled by our framework, we discover groups of both phylogenetically close and distant bacterial taxa that exhibit consensus responses to antibiotic exposure across multiple human subjects. These consensus responses reveal a timeline for equilibration of sub-communities of micro-organisms with distinct physiologies, yielding insights into the successive changes that occur in microbial populations in the human gut after antibiotic treatments. Additionally, our framework leverages microbial signatures shared among human subjects to automatically design optimal experiments to interrogate dynamic properties of the microbiota in new studies. 
Overall, our approach provides a powerful, general-purpose framework for understanding the dynamic behaviors of complex microbial ecosystems

  4. The Importance of Hydrological Signature and Its Recurring Dynamics

    Science.gov (United States)

    Wendi, D.; Marwan, N.; Merz, B.

    2017-12-01

    Temporal changes in hydrology are known to be challenging to detect and attribute, owing to multiple drivers that involve complex, non-stationary and highly variable processes. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defenses, river training, and land use change, can act variably across space-time scales and influence or mask each other. Moreover, data depicting these drivers are often not available. One conventional approach to analyzing change is based on discrete points of magnitude (e.g. the frequency of recurring extreme discharge); it is often quantified linearly and hence does not reveal potential change in the hydrological process. Moreover, discharge series are often subject to measurement errors, such as rating curve error, especially in the case of flood peaks, where observations are derived through extrapolation. In this study, the system dynamics inferred from the hydrological signature (i.e. the shape of the hydrograph) is emphasized. One example is to see whether certain flood dynamics (instead of flood peaks) in recent years also occurred in the past (or are rather extraordinary), and if so, what their recurrence rate is and whether there has been a shift in their occurrence in time or seasonality (e.g. an earlier snowmelt-dominated flood). The utilization of the hydrological signature here extends beyond the classical measures such as the base flow index, recession and rising limb slopes, and time to peak; it is in fact all these characteristics combined, i.e. from the start until the end of the hydrograph. A recurrence plot is used as a method to quantify and visualize the recurring hydrological signature through its phase space trajectories, usually of dimension above 2. Such phase space trajectories are constructed by embedding the time series into a series of variables (i.e. the number of dimensions) corresponding to the time delay. Since the method is rather novel in
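    The time-delay embedding and recurrence plot construction described above can be sketched in a few lines. A minimal illustration (the embedding dimension, delay, and threshold here are arbitrary, not the study's settings):

    ```python
    def embed(series, dim, delay):
        """Time-delay embedding: map a scalar series into dim-dimensional phase space."""
        n = len(series) - (dim - 1) * delay
        return [tuple(series[i + j * delay] for j in range(dim)) for i in range(n)]

    def recurrence_matrix(points, eps):
        """Binary recurrence matrix: entry (i, j) is 1 when points i and j lie within eps."""
        def chebyshev(p, q):
            return max(abs(a - b) for a, b in zip(p, q))
        return [[1 if chebyshev(p, q) <= eps else 0 for q in points] for p in points]

    # A periodic series revisits the same phase-space states, so its recurrence
    # matrix shows the diagonal banding typical of recurring dynamics:
    states = embed([0.0, 1.0, 0.0, 1.0, 0.0, 1.0], dim=2, delay=1)
    rp = recurrence_matrix(states, eps=0.5)
    print(states[0], rp[0])  # → (0.0, 1.0) [1, 0, 1, 0, 1]
    ```

    Applied to a discharge series, rows of the matrix indicate when a given hydrograph shape recurred, which is the recurrence rate the study is after.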

  5. The SCEC/USGS dynamic earthquake rupture code verification exercise

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, in which codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed, a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished. Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but thereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches. The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events. To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous

  6. Pen and platen, piezo-electric (Engineering Materials). [Signature verification for access to restricted areas

    Energy Technology Data Exchange (ETDEWEB)

    The set of five drawings defines a writing instrument system that will reliably verify signatures, thus providing a method useful in screening persons seeking entrance to restricted areas or access to computer programs. Using a conventional ballpoint pen refill, the instrument's input derives from signals generated in its writing tip and from pressure exerted by a person writing his name or a code word on the platen (tablet). The basic principle is that accelerations of the writing tip and pressures exerted by the person writing are recorded in three axes. This combination of signals can be processed by a computer and compared with a record in the computer's memory, or a graphic transcription may be compared visually with an earlier record.

  7. Pen and platen, piezo-electric (21 Aug 1978) (Engineering Materials). [Signature verification

    Energy Technology Data Exchange (ETDEWEB)

    The set of five drawings defines a writing instrument system that will reliably verify signatures, thus providing a method useful in screening persons seeking entrance to restricted areas or access to computer programs. Using a conventional ballpoint pen refill, the instrument's input derives from signals generated in its writing tip and from pressure exerted by a person writing his name or a code word on the platen (tablet). The basic principle is that accelerations of the writing tip and pressures exerted by the person writing are recorded in three axes. This combination of signals can be processed by a computer and compared with a record in the computer's memory, or a graphic transcription may be compared visually with an earlier record.

  8. Identification of uranium signatures in swipe samples on verification of nuclear activities for nuclear safeguards purposes

    International Nuclear Information System (INIS)

    Pestana, Rafael Cardoso Baptistini

    2013-01-01

    The use of environmental sampling for safeguards purposes has been applied by the International Atomic Energy Agency (IAEA) since 1996, and it is routinely used as a complementary measure to strengthen the traditional nuclear safeguards procedures. The aim is to verify that states signatory to the safeguards agreements are not diverting their peaceful nuclear activities to undeclared nuclear activities. This work describes a new protocol for the collection and analysis of swipe samples for identification of nuclear signatures that may be related to the nuclear activities developed in the inspected facility. A real uranium conversion plant of the IPEN nuclear fuel cycle was used as a case study. The proposed strategy uses different analytical techniques, such as alpha radiation counting, SEM-EDX and ICP-MS, to identify signatures of uranium adhered to the swipe samples. In the swipe sample analysis, it was possible to identify particles of UO2F2 and UF4 through morphological comparison and the semi-quantitative analyses performed by the SEM-EDX technique. Methods were used that yield the average isotopic composition of the sample, in which the enrichment ranged from 1.453 % ± 0.023 % to 18.24 % ± 0.15 % in the 235U isotope. Through these external collections, a non-intrusive form of sampling, it was possible to identify the handling of enriched material with enrichments of 1.453 % ± 0.023 % to 6.331 % ± 0.055 % in the 235U isotope, as well as the use of reprocessed material, through the identification of the 236U isotope. The uncertainties obtained for the n(235U)/n(238U) ratio varied from 0.40 % to 0.86 % for the internal swipe samples. (author)

  9. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    Science.gov (United States)

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristic coefficients is difficult to guarantee because of the errors of the measurement system itself. A novel dynamic calibration method for the measurement system for dynamic characteristic coefficients is proposed in this paper to eliminate these errors. Compared with the suspension-mass calibration method, this novel calibration method differs in that the verification device is a spring-mass system, which can simulate the dynamic characteristics of a sliding bearing. The verification device is built, and the calibration experiment is implemented over a wide frequency range, in which the bearing stiffness is simulated by disc springs. The experimental results show that the amplitude errors of this measurement system are small in the frequency range of 10 Hz–100 Hz, and the phase errors increase with increasing frequency. A simulated experiment of dynamic characteristic coefficient identification in the frequency range of 10 Hz–30 Hz preliminarily verifies that the calibration data can support the dynamic characteristics test of a sliding bearing in this frequency range. Bearing experiments over greater frequency ranges require higher manufacturing and installation precision of the calibration device. Besides, the processes of the calibration experiments should be improved. PMID:27483283

  10. Static and dynamic thermal infrared signatures measured during the FESTER experiment: first results

    Science.gov (United States)

    Gunter, W. H.; February, F.; Seiffer, D. P.; Eisele, C.

    2016-10-01

    The First European South African Experiment (FESTER) was conducted over a period of about 10 months at the Institute of Maritime Technology (IMT) in False Bay, South Africa. One of the principal goals was the recording of static and dynamic thermal infrared signatures under different environmental conditions, both for validation of existing thermal-equilibrium signature prediction codes and to aid development of dynamic thermal signature models. A small scientific work boat (called Sea Lab) was used as the principal target and sensor platform. Painted metal plates of different thicknesses were also used as infrared targets on board Sea Lab to study static/dynamic thermal signatures, and were fitted with pyrgeometers, pyrometers and iButton temperature sensors/loggers. First results focus on the variability of thermal signatures as a function of environmental conditions and on the accuracy of source temperatures calculated from measured radiometric temperatures, compared with the physical temperature measurements of the plates.

  11. Possibilities of dynamic biometrics for authentication and the circumstances for using dynamic biometric signature

    Directory of Open Access Journals (Sweden)

    Frantisek Hortai

    2018-01-01

    Full Text Available New information technologies bring new dangers alongside their benefits. It is difficult to decide which authentication tool to use and implement in information systems and electronic documents. The final decision has to strike a compromise among several conflicting requirements: a highly secure tool that is nonetheless user-friendly and simple, protection against user errors and failures, speed of authentication, and a reasonable price. Even when a compromise solution is found, it has to fulfill the relevant technology standards. For these reasons the paper argues for one of the most natural biometric authentication methods, the dynamic biometric signature, and lists its related standards. The paper also includes a measurement evaluation which examines the independence between a person's signature and the device on which it was created

  12. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  13. The research for the design verification of nuclear power plant based on VR dynamic plant

    International Nuclear Information System (INIS)

    Wang Yong; Yu Xiao

    2015-01-01

    This paper studies a new method of design verification through a VR plant, in order to verify and validate that the plant design conforms to accident emergency requirements. The VR dynamic plant is established from the 3D design model and digital maps composed of a GIS system and indoor maps, and is driven by analysis data from the design analyzer. The VR plant can present both the normal operating conditions and the accident conditions of the power plant. Based on the VR dynamic plant, this paper simulates the execution of accident procedures, the development of accidents, the evacuation planning of people and so on, to ensure that the plant design will not cause adverse effects. Besides design verification, the simulated results can also be used for optimization of the accident emergency plan, for training on the accident plan, and for emergency accident treatment. (author)

  14. The research of a new test method about dynamic target infrared spectral signature

    Science.gov (United States)

    Wu, Jiang-hui; Gao, Jiao-bo; Chen, Qing; Luo, Yan-ling; Li, Jiang-jun; Gao, Ze-dong; Wang, Nan; Gao, Meng

    2014-11-01

    Research on infrared spectral target signatures is of great military importance in the domains of IR detection and recognition, IR countermeasures (IRCM), IR image guidance, IR stealth, etc. The measurement of infrared spectra of tactical targets has for many years been a direct yet effective technique for providing signatures, for both analysis and simulation, to missile seeker designers. To address the problem of dynamic target infrared spectral signatures, this paper presents a new method for acquiring and testing the IR spectral radiation signatures of dynamic objects. It is based on an IR imager that tracks the target while acquiring the scene, combined with a field-of-view chopping-scan infrared spectral radiometer that alternately measures the target and its surrounding background; the IR imager and the spectral radiometer share the same optical axis. The raw test data were processed with a new reduction method. Principles and data processing methods are described in detail, and the test error is also analyzed. Field test results showed that the method is sound: the test error was reduced, and the method can better satisfy the needs of acquiring dynamic target IR spectral signatures.

  15. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power system with an organic working fluid. The purposes of the technology verification phase are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported

  16. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  17. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    Halsey, D.G.

    1982-01-01

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance

  18. Technology verification phase. Dynamic isotope power system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  19. Experimental verification of WWER dynamical response to vertical seismic motion

    International Nuclear Information System (INIS)

    Pecinka, L.; Konstantinidis, S.

    1985-01-01

    In calculations of the dynamic response of PWR internals to seismic excitation, the well-known double pendulum model is used. In order to verify the nature of this phenomenon, simulated vertical seismic events have been performed at the NPS Dukovany (CSSR). As the excitation source, three mobile VIBROSEIS units manufactured by Texas Instruments were used. The dynamic response of the RPV was measured using 4 accelerometers, and the results are presented in the usual form of power spectral densities. (orig.)

  20. A Signature Comparing Android Mobile Application Utilizing Feature Extracting Algorithms

    Directory of Open Access Journals (Sweden)

    Paul Grafilon

    2017-08-01

    Full Text Available The paper presents one of the applications that can be realized using a smartphone camera. Nowadays forgery is one of the most undetected crimes. Even with the forensic technology used today, it is still difficult for authorities to compare signatures and determine which is genuine and which is forged. A signature is a legal representation of a person; all transactions are based on it. Forgers may use a signature to sign illegal contracts and withdraw from bank accounts undetected. A signature can also be forged during election periods for repeated voting. Given these issues, a signature should always be secure. Signature verification is a difficult problem that still poses a real challenge for researchers. The literature on signature verification is quite extensive and shows two main areas of research: off-line and on-line systems. Off-line systems deal with a static image of the signature, i.e. the result of the action of signing, while on-line systems work on the dynamic process of generating the signature, i.e. the action of signing itself. The researchers have found a way to address these concerns: a mobile application that uses the camera to take a picture of a signature, analyzes it, and compares it to other signatures for verification. It exists to help citizens be more cautious and aware of issues regarding signatures. It may also help organizations and institutions such as banks and insurance companies verify signatures, avoiding unwanted transactions and identity theft, and may assist the authorities in the never-ending battle against crime, especially against forgers and thieves. The project aimed to design and develop a mobile application that integrates the smartphone camera for verifying and comparing signatures using the best algorithm possible. As the result of the development, the said smartphone camera application is functional and reliable.

  1. Host–pathogen evolutionary signatures reveal dynamics and future invasions of vampire bat rabies

    OpenAIRE

    Streicker, Daniel G.; Winternitz, Jamie C.; Satterfield, Dara A.; Condori-Condori, Rene Edgar; Broos, Alice; Tello, Carlos; Recuenco, Sergio; Velasco-Villa, Andrés; Altizer, Sonia; Valderrama, William

    2016-01-01

    Anticipating how epidemics will spread across landscapes requires understanding host dispersal events that are notoriously difficult to measure. Here, we contrast host and virus genetic signatures to resolve the spatiotemporal dynamics underlying geographic expansions of vampire bat rabies virus (VBRV) in Peru. Phylogenetic analysis revealed recent viral spread between populations that, according to extreme geographic structure in maternally inherited host mitochondrial DNA, appeared complete...

  2. Structure-dynamic model verification calculation of PWR 5 tests

    International Nuclear Information System (INIS)

    Engel, R.

    1980-02-01

    Within reactor safety research project RS 16 B of the German Federal Ministry of Research and Technology (BMFT), blowdown experiments are conducted at Battelle Institut e.V. Frankfurt/Main using a model reactor pressure vessel with a height of 11.2 m and internals corresponding to those in a PWR. In the present report the dynamic loading on the pressure vessel internals (upper perforated plate and barrel suspension) during the DWR 5 experiment is calculated by means of vertical and horizontal dynamic models using the CESHOCK code. The equations of motion are resolved by direct integration. (orig./RW) [de

  3. Verification of Continuous Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Wisniewski, Rafael

    2011-01-01

    This paper presents a method for abstracting continuous dynamical systems by timed automata. The abstraction is based on partitioning the state space of a dynamical system using positive invariant sets, which form cells that represent locations of a timed automaton. The abstraction is intended......, which is generated utilizing sub-level sets of Lyapunov functions, as they are positive invariant sets. It is shown that this partition generates sound and complete abstractions. Furthermore, the complete abstractions can be composed of multiple timed automata, allowing parallelization...
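
    The sublevel-set construction can be illustrated on a scalar example (a toy system, not the paper's general construction): for dx/dt = -lam*x with Lyapunov function V(x) = x^2, V decays exponentially, so the transit time between two sublevel sets {V <= c1} and {V <= c2} is exactly ln(c1/c2)/(2*lam); these exact times are what the guards of a timed-automaton abstraction would over-approximate.

    ```python
    import math

    def transit_time(c1, c2, lam):
        """Time for V(x) = x^2 to decay from level c1 to level c2 under dx/dt = -lam*x."""
        return math.log(c1 / c2) / (2 * lam)

    def simulate_crossing(x0, c2, lam, dt=1e-5):
        """Euler-integrate dx/dt = -lam*x and report when V(x) first drops below c2."""
        x, t = x0, 0.0
        while x * x > c2:
            x += -lam * x * dt
            t += dt
        return t

    lam = 1.0
    c1, c2 = 1.0, 0.25                       # two cells of the sublevel-set partition
    t_exact = transit_time(c1, c2, lam)      # analytic transit time, ln(4)/2
    t_sim = simulate_crossing(1.0, c2, lam)  # simulated crossing time, should agree
    ```

    In the abstraction, each sublevel-set cell becomes a location of the timed automaton and the transit-time bounds become clock guards on its transitions.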

  4. Forged Signature Distinction Using Convolutional Neural Network for Feature Extraction

    Directory of Open Access Journals (Sweden)

    Seungsoo Nam

    2018-01-01

    Full Text Available This paper proposes a dynamic verification scheme for finger-drawn signatures on smartphones. As a dynamic feature, the movement of the smartphone is recorded with its accelerometer sensors, in addition to the moving coordinates of the signature. To extract high-level longitudinal and topological features, the proposed scheme uses a convolutional neural network (CNN) for feature extraction, not as a conventional classifier. We assume that a CNN trained with forged signatures can extract effective features (called the S-vector), which are common in forging activities, such as hesitation and delay before drawing the complicated part. The proposed scheme also exploits an autoencoder (AE) as a classifier, with the S-vector as its input vector. An AE has high accuracy for one-class distinction problems such as signature verification, but is also greatly dependent on the accuracy of the input data. The S-vector is valuable as the input to the AE and, consequently, can lead to improved verification accuracy, especially for distinguishing forged signatures. Compared to previous work, i.e., the MLP-based finger-drawn signature verification scheme, the proposed scheme decreases the equal error rate by 13.7 percentage points, specifically from 18.1% to 4.4%, for discriminating forged signatures.
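
    The autoencoder-as-one-class-classifier idea can be sketched with a linear stand-in (PCA reconstruction error replaces the trained CNN+AE, and all data are synthetic, not the paper's S-vectors): feature vectors fitted on genuine samples reconstruct well, while off-distribution "forged" vectors show larger reconstruction error and are rejected by a threshold.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic feature vectors: genuine signatures lie near a 2-D subspace of R^10.
    basis = rng.normal(size=(2, 10))
    genuine = rng.normal(size=(200, 2)) @ basis + 0.05 * rng.normal(size=(200, 10))
    forged = rng.normal(size=(50, 10))       # off-subspace impostor features

    # "Train" the linear autoencoder: principal components of genuine data only.
    mean = genuine.mean(axis=0)
    _, _, vt = np.linalg.svd(genuine - mean, full_matrices=False)
    components = vt[:2]                      # shared encoder/decoder weights

    def recon_error(x):
        z = (x - mean) @ components.T        # encode
        xhat = z @ components + mean         # decode
        return np.linalg.norm(x - xhat, axis=-1)

    # Accept only inputs whose reconstruction error matches the genuine class.
    threshold = np.percentile(recon_error(genuine), 95)
    false_accepts = recon_error(forged) <= threshold
    ```

    A trained nonlinear AE plays the same role: low reconstruction error signals membership in the one class it was fitted to.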

  5. A Signature of Inflation from Dynamical Supersymmetry Breaking

    CERN Document Server

    Kinney, W H; Kinney, William H.; Riotto, Antonio

    1998-01-01

    In models of cosmological inflation motivated by dynamical supersymmetry breaking, the potential driving inflation may be characterized by inverse powers of a scalar field. These models produce observables similar to those typical of the hybrid inflation scenario: negligible production of tensor (gravitational wave) modes, and a blue scalar spectral index. In this short note, we show that, unlike standard hybrid inflation models, dynamical supersymmetric inflation (DSI) predicts a measurable deviation from a power-law spectrum of fluctuations, with a variation in the scalar spectral index $|dn / d(\ln k)|$ that may be as large as 0.05. DSI can be observationally distinguished from other hybrid models with cosmic microwave background measurements at the planned sensitivity of ESA's Planck Surveyor.

  6. The electronic identification, signature and security of information systems

    Directory of Open Access Journals (Sweden)

    Horovčák Pavel

    2002-12-01

    Full Text Available The contribution deals with current methods and technologies for the security of information and communication systems. It introduces an overview of electronic identification elements such as the static password, the dynamic password and single sign-on; biometric and dynamic characteristics of the verified person also belong in this category. Authentication based on ownership of identification elements, such as various cards and authentication calculators, is widespread. The next part specifies a definition and characterization of the electronic signature, its basic functions and certificate categories. Practical utilization of the electronic signature consists of acquiring an electronic signature, signing an outgoing email message, receiving an electronically signed message and verifying the electronic signature. The use of the electronic signature is continuously growing and, in connection with the development of legislation, it is being applied across all sectors.
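
    The sign-send-verify flow described above can be sketched with Python's standard library, using an HMAC as a simple stand-in for a true certificate-based asymmetric signature (a real electronic signature uses public-key cryptography; the key and messages here are purely illustrative):

    ```python
    import hashlib
    import hmac

    def sign(message: bytes, key: bytes) -> bytes:
        """Produce a tag for an outgoing message (stand-in for signing)."""
        return hmac.new(key, message, hashlib.sha256).digest()

    def verify(message: bytes, tag: bytes, key: bytes) -> bool:
        """Check a received message against its tag, in constant time."""
        return hmac.compare_digest(sign(message, key), tag)

    key = b"shared-secret"                   # hypothetical key material
    msg = b"outgoing email body"
    tag = sign(msg, key)                     # sender signs the outgoing message

    ok = verify(msg, tag, key)               # recipient verifies: intact message
    tampered = verify(msg + b"!", tag, key)  # any alteration fails verification
    ```

    The asymmetric case differs in that signing uses a private key and verification a certificate-bound public key, so the verifier needs no secret.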

  7. Issues in computational fluid dynamics code verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.; Blottner, F.G.

    1997-09-01

    A broad range of mathematical modeling errors of fluid flow physics and numerical approximation errors are addressed in computational fluid dynamics (CFD). It is strongly believed that if CFD is to have a major impact on the design of engineering hardware and flight systems, the level of confidence in complex simulations must substantially improve. To better understand the present limitations of CFD simulations, a wide variety of physical modeling, discretization, and solution errors are identified and discussed. Here, discretization and solution errors refer to all errors caused by conversion of the original partial differential, or integral, conservation equations representing the physical process, to algebraic equations and their solution on a computer. The impact of boundary conditions on the solution of the partial differential equations and their discrete representation will also be discussed. Throughout the article, clear distinctions are made between the analytical mathematical models of fluid dynamics and the numerical models. Lax's Equivalence Theorem and its frailties in practical CFD solutions are pointed out. Distinctions are also made between the existence and uniqueness of solutions to the partial differential equations as opposed to the discrete equations. Two techniques are briefly discussed for the detection and quantification of certain types of discretization and grid resolution errors.
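
    Grid-resolution error of the kind discussed here is routinely quantified with Richardson extrapolation; a minimal sketch (using manufactured data from a known second-order scheme, not values from the article) computes the observed order of accuracy from solutions on three systematically refined grids:

    ```python
    import math

    def observed_order(f_coarse, f_medium, f_fine, r):
        """Observed order of accuracy p from three grid solutions, refinement ratio r."""
        return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

    def richardson_extrapolate(f_medium, f_fine, r, p):
        """Estimate of the zero-grid-spacing solution from the two finest grids."""
        return f_fine + (f_fine - f_medium) / (r**p - 1)

    # Manufactured solutions: f(h) = 1.0 + 0.5*h^2 (exact value 1.0, order 2).
    exact, C = 1.0, 0.5
    h = [0.4, 0.2, 0.1]                      # grid spacings, refinement ratio r = 2
    f = [exact + C * hi**2 for hi in h]

    p = observed_order(f[0], f[1], f[2], r=2)            # recovers the scheme order
    f_extrap = richardson_extrapolate(f[1], f[2], 2, p)  # recovers the exact value
    ```

    When the observed order disagrees with the scheme's formal order, the grids are not yet in the asymptotic range and the discretization-error estimate is unreliable.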

  8. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    Directory of Open Access Journals (Sweden)

    Shrirang Ambaji KULKARNI

    2017-04-01

    Full Text Available Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth. Existing shortest-path algorithms fail to converge to better solutions under dynamic network conditions, whereas reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply a model-based Q-Routing technique for routing in dynamic networks. To analyze the correctness of the Q-Routing algorithm mathematically, we provide a proof and also implement a SPIN-based verification model. We also perform a simulation-based analysis of Q-Routing for given metrics.
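
    The Q-Routing rule referred to above can be sketched generically (a plain implementation of the Boyan-Littman update on a toy static network, not the paper's exact model; queueing delay is omitted for simplicity): each node x keeps an estimate Q_x(d, y) of the delivery time to destination d via neighbour y and updates it from that neighbour's best estimate.

    ```python
    # Tiny network: per-link delays; nodes learn routes to destination 'D'.
    links = {
        "A": {"B": 1.0, "C": 5.0},
        "B": {"A": 1.0, "D": 1.0},
        "C": {"A": 5.0, "D": 1.0},
    }

    # Q[x][y]: node x's estimated delivery time to 'D' when forwarding via y.
    Q = {x: {y: 0.0 for y in nbrs} for x, nbrs in links.items()}
    ALPHA = 0.5  # learning rate

    def update(x, y):
        """One Q-Routing step: link delay s plus the neighbour's best estimate."""
        s = links[x][y]
        best_next = 0.0 if y == "D" else min(Q[y].values())
        Q[x][y] += ALPHA * (s + best_next - Q[x][y])

    # Sweep every link repeatedly, as repeated packet deliveries would.
    for _ in range(200):
        for x, nbrs in links.items():
            for y in nbrs:
                update(x, y)

    best_hop = min(Q["A"], key=Q["A"].get)   # route choice at node A
    ```

    On a static network the estimates settle to the shortest-path delays (A routes via B, cost 2); under dynamic link costs, the same update keeps re-adapting the estimates, which is the property the paper verifies.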

  9. Research on verification and validation strategy of detonation fluid dynamics code of LAD2D

    Science.gov (United States)

    Wang, R. L.; Liang, X.; Liu, X. Z.

    2017-07-01

    Verification and validation (V&V) is an important approach to software quality assurance for codes in complex engineering applications. A reasonable and efficient V&V strategy can achieve twice the result with half the effort. This article introduces the software LAD2D (Lagrangian adaptive hydrodynamics code in 2D space), a self-developed detonation CFD code with elastic-plastic structure modeling. The V&V strategy of this detonation CFD code is presented on the foundation of V&V methodology for scientific software. A basic framework of module verification and function validation is proposed, composing the V&V strategy for the LAD2D detonation fluid dynamics model.

  10. Far-IR transparency and dynamic infrared signature control with novel conducting polymer systems

    Science.gov (United States)

    Chandrasekhar, Prasanna; Dooley, T. J.

    1995-09-01

    Materials which possess transparency in the infrared (IR), coupled with active controllability of that transparency, are today an increasingly important requirement for varied applications. These applications include windows for IR sensors, IR-region flat panel displays used in camouflage as well as in communication and sight through night-vision goggles, coatings with dynamically controllable IR emissivity, and thermal conservation coatings. Among the stringent requirements for these applications are large dynamic ranges (color contrast), 'multi-color' or broad-band characteristics, extended cyclability, long memory retention, matrix addressability, small-area fabricability, low power consumption, and environmental stability. Among materials possessing the requirements for variation of IR signature, conducting polymers (CPs) appear to be the only materials with a dynamic, actively controllable signature and acceptable dynamic range. Conventional CPs such as poly(alkyl thiophene), poly(pyrrole) or poly(aniline) show very limited dynamic range, especially in the far-IR, while also showing poor transparency. We have developed a number of novel CP systems ('system' implying the CP, the selected dopant, the synthesis method, and the electrolyte) with very wide dynamic range (up to 90% in both important IR regions, 3-5 μm and 8-12 μm), high cyclability (to 10^5 cycles with less than 10% optical degradation), nearly indefinite optical memory retention, matrix addressability of multi-pixel displays, very wide operating temperature range and excellent environmental stability, low charge capacity, and processability into areas from less than 1 mm^2 to more than 100 cm^2. The criteria used to design and arrive at these CP systems, together with representative IR signature data, are presented in this paper.

  11. Nonlinear dynamic analysis of multi-base seismically isolated structures with uplift potential II: verification examples

    Science.gov (United States)

    Roussis, Panayiotis C.; Tsopelas, Panos C.; Constantinou, Michael C.

    2010-03-01

    The work presented in this paper serves as numerical verification of the analytical model developed in the companion paper for nonlinear dynamic analysis of multi-base seismically isolated structures. To this end, two numerical examples have been analyzed using the computational algorithm incorporated into program 3D-BASIS-ME-MB, developed on the basis of the newly-formulated analytical model. The first example concerns a seven-story model structure that was tested on the earthquake simulator at the University at Buffalo and was also used as a verification example for program SAP2000. The second example concerns a two-tower, multi-story structure with a split-level seismic-isolation system. For purposes of verification, key results produced by 3D-BASIS-ME-MB are compared to experimental results, or results obtained from other structural/finite element programs. In both examples, the analyzed structure is excited under conditions of bearing uplift, thus yielding a case of much interest in verifying the capabilities of the developed analysis tool.

  12. Towards the Verification of Safety-critical Autonomous Systems in Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Adina Aniculaesei

    2016-12-01

    Full Text Available There is an increasing necessity to deploy autonomous systems in highly heterogeneous, dynamic environments, e.g. service robots in hospitals or autonomous cars on highways. Due to the uncertainty in these environments, the verification results obtained with respect to the system and environment models at design-time might not be transferable to the system behavior at run time. For autonomous systems operating in dynamic environments, safety of motion and collision avoidance are critical requirements. With regard to these requirements, Macek et al. [6] define the passive safety property, which requires that no collision can occur while the autonomous system is moving. To verify this property, we adopt a two phase process which combines static verification methods, used at design time, with dynamic ones, used at run time. In the design phase, we exploit UPPAAL to formalize the autonomous system and its environment as timed automata and the safety property as TCTL formula and to verify the correctness of these models with respect to this property. For the runtime phase, we build a monitor to check whether the assumptions made at design time are also correct at run time. If the current system observations of the environment do not correspond to the initial system assumptions, the monitor sends feedback to the system and the system enters a passive safe state.

  13. Dynamic Gesture Recognition with a Terahertz Radar Based on Range Profile Sequences and Doppler Signatures.

    Science.gov (United States)

    Zhou, Zhi; Cao, Zongjie; Pi, Yiming

    2017-12-21

    The frequency of terahertz radar ranges from 0.1 THz to 10 THz, which is higher than that of microwaves. Multi-modal signals, including high-resolution range profiles (HRRP) and Doppler signatures, can be acquired by a terahertz radar system. These two kinds of information are commonly used in automatic target recognition; however, dynamic gesture recognition is rarely discussed in the terahertz regime. In this paper, a dynamic gesture recognition system using a terahertz radar is proposed, based on multi-modal signals. The HRRP sequences and Doppler signatures are first obtained from the radar echoes. Considering the electromagnetic scattering characteristics, a feature extraction model is designed using location parameter estimation of scattering centers. Dynamic Time Warping (DTW), extended to multi-modal signals, is used to accomplish the classification. Ten types of gesture signals, collected with a terahertz radar, are applied to validate the analysis and the recognition system. The experimental results indicate that the recognition rate reaches more than 91%. This research verifies the potential of dynamic gesture recognition using a terahertz radar.
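
    The multi-modal DTW classification can be sketched generically (textbook DTW over vector-valued frames; packing one range-derived value and one Doppler value into each frame is an illustrative assumption, not the paper's exact feature model):

    ```python
    import math

    def dtw_distance(seq_a, seq_b):
        """DTW over sequences of equal-length feature vectors (Euclidean frame cost)."""
        n, m = len(seq_a), len(seq_b)
        INF = float("inf")
        D = [[INF] * (m + 1) for _ in range(n + 1)]
        D[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = math.dist(seq_a[i - 1], seq_b[j - 1])
                D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
        return D[n][m]

    # Toy multi-modal frames: (range value, Doppler value) pairs per time step.
    wave = [(0.0, 0.1), (0.5, 0.4), (1.0, 0.9), (0.5, 0.4), (0.0, 0.1)]
    wave_slow = [(0.0, 0.1), (0.0, 0.1), (0.5, 0.4), (1.0, 0.9), (0.5, 0.4), (0.0, 0.1)]
    push = [(0.0, 0.9), (1.0, 0.0), (0.0, 0.9), (1.0, 0.0), (0.0, 0.9)]

    d_same = dtw_distance(wave, wave_slow)   # same gesture at a different speed
    d_diff = dtw_distance(wave, push)        # a different gesture
    ```

    Because DTW warps the time axis, a gesture performed more slowly still matches its template closely, while a different gesture accumulates a larger alignment cost; nearest-template assignment then yields the class label.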

  14. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)

  15. Conspray dynamic sleeve piston coal feeder. Phase II. Verification tests. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-26

    This report details the performance of Phase II: Verification Tests of the Conspray dynamic sleeve piston coal feeder. The machine performed for 200 hours at 700 psi backpressure, utilizing a 70% to 200 mesh Utah bituminous coal as feedstock. All test work was satisfactorily completed. A post-test inspection was performed. A report of component wear and failures incurred in testing is included as well as suggestions for machine upgrades. The overall conclusion is that the dynamic sleeve piston feeder has proven its ability to operate safely and reliably. When problems have occurred, the machine has demonstrated inherent safety by shutting down without endangering process or personnel. With the recommended improvements incorporated into the feeder, the unit will be ready for installation on a pilot scale coal gasifier. 9 figures, 11 tables.

  16. Independent tube verification and dynamic tracking in et inspection of nuclear steam generator

    International Nuclear Information System (INIS)

    Xiongzi, Li; Zhongxue, Gan; Lance, Fitzgibbons

    2001-01-01

    The full text follows. In the examination of pressure boundary tubes in the steam generators of commercial pressurized water nuclear power plants (PWRs), it is critical to know exactly which particular tube is being accessed. There are no definitive landmarks or markings on the individual tubes. Today this is done manually; it is tedious, interrupts the normal inspection work, and is difficult due to the presence of water on the tube surface, plug ends instead of tube openings in the field of view, and varying lighting quality. In order to eliminate human error and increase the efficiency of operation, there is a need to identify tube position during the inspection process, independent of robot encoder position and motion. A process based on a Cognex MVS-8200 system and its application function package has been developed to independently identify tube locations. ABB Combustion Engineering Nuclear Power's Outage Services group, USPPL, in collaboration with ABB Power Plant Laboratories' Advanced Computers and Controls department, has developed a new vision-based Independent Tube Verification system (GENESIS-ITVS-TM). The system employs a model-based tube-shape detection algorithm and dynamic tracking methodology to detect the true tool position and its offsets from the identified tube location. GENESIS-ITVS-TM is an automatic Independent Tube Verification System (ITVS). Independent tube verification is a tube validation technique using computer vision, not using any robot position parameters. This process independently counts the tubes along the horizontal and vertical axes of the plane of the steam generator tube sheet as the work tool is moved; thus it knows the true position in the steam generator, given a known starting point. This is analogous to the operator's method of counting tubes for verification, but automated. GENESIS-ITVS-TM works independent of the robot position, velocity, or acceleration. The tube position information is solely obtained from

  17. Approaches to determining the reliability of a multimodal three-dimensional dynamic signature

    Directory of Open Access Journals (Sweden)

    Yury E. Kozlov

    2018-03-01

    Full Text Available The market of modern mobile applications imposes increasingly strict requirements on the reliability of authentication systems. This article examines an authentication method using a multimodal three-dimensional dynamic signature (MTDS), which can be used both as a primary and as a supplementary method of user authentication in mobile applications. It uses as an identifier a gesture performed in the air and captured by two independent mobile devices. The MTDS method has certain advantages over currently used biometric methods, including fingerprint authentication, face recognition and voice recognition: a multimodal three-dimensional dynamic signature allows the authentication gesture to be changed quickly, and the authentication procedure can be concealed by using gestures that do not attract attention. Despite these advantages, the MTDS method has certain limitations, the main one being the functionally dynamic complex (FDC) of skills required to repeat an authentication gesture accurately. Creating a correct MTDS therefore requires a system for assessing the reliability of gestures. This article groups approaches to this task by implementation method: two of the approaches can be implemented only with a server acting as a centralized MTDS processing center, while one can be implemented using the smartphone's own computing resources. The final part of the article presents data from testing one of these methods on a template performing MTDS authentication.
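
    The abstract does not specify how two gesture recordings are compared, but dynamic time warping (DTW), the matcher used in the online-signature record above, is a natural fit for aligning accelerometer traces recorded at unequal speeds. A minimal sketch, assuming each gesture is a sequence of 3-D acceleration samples; all data and names here are hypothetical, not the authors' system:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance between
    two sequences of 3-D acceleration samples."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local Euclidean cost
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Hypothetical 3-D gesture traces (time x [ax, ay, az])
rng = np.random.default_rng(0)
reference = rng.normal(size=(50, 3))                        # enrolled gesture
attempt = reference + rng.normal(scale=0.05, size=(50, 3))  # close repeat
imposter = rng.normal(size=(50, 3))                         # unrelated gesture
```

    A genuine repeat yields a much smaller DTW distance to the enrolled trace than an unrelated gesture, so a simple threshold on the distance already gives an accept/reject rule.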

  18. Dosimetric verification of the dynamic intensity modulated radiotherapy (IMR) of 21 patients

    International Nuclear Information System (INIS)

    Tsai, J.-S.; Engler, Mark J.; Ling, Marilyn N.; Wu, Julian; Kramer, Bradley; Fagundes, Marcio; Dipetrillo, Thomas; Wazer, David E.

    1996-01-01

    Purpose: To verify the accuracy of conformal isodose distributions and absolute doses delivered with a dynamic IMR system. Methods and materials: 21 patients with advanced or recurrent disease were treated with a dynamic IMR system; 13 were immobilized with head screws and 8 with non-invasive plastic masks. The system included immobilization techniques, computerized tomography (CT), a dynamic pencil-beam multileaf collimator (MLC), a collimator controller computer, collimator safety interlocks, a simulated annealing optimization implemented on a dedicated quad-processing computer system, phantoms embedded with dosemeters, patient setup and dose delivery techniques, in vivo dose verification, and a comprehensive quality assurance program. The collimator consisted of a 2 x 20 array of tungsten leaves, each programmable to be either fully open or shut, thus offering 2^40 beam patterns with cross-sectional areas of up to 4 x 20 cm at the linear accelerator (linac) gantry rotational axis. Any of these patterns was dynamically changeable per degree of gantry rotation. An anthropomorphic phantom composed of transverse anatomic slabs helped simulate patient geometry relative to immobilization devices, fiducial markers, CT and treatment room lasers, and the linac rotational axis. Before each treatment regimen, the compliance of measured to planned doses was tested in phantom irradiations using each patient's fiducial markers, immobilization system, anatomic positioning, and collimator sequencing. Films and thermoluminescent dosemeters (TLD) were embedded in the phantom to measure absolute doses and dose distributions. Because the planner did not account for variable electron density distributions in head and neck targets, the air cavities of the anthropomorphic phantom were filled with tissue-equivalent bolus. Optical density distributions of films exposed to the dynamic IMR of each patient were obtained with a Hurter-Driffield calibration curve based on films

  19. Single Molecule Cluster Analysis Identifies Signature Dynamic Conformations along the Splicing Pathway

    Science.gov (United States)

    Blanco, Mario R.; Martin, Joshua S.; Kahlscheuer, Matthew L.; Krishnan, Ramya; Abelson, John; Laederach, Alain; Walter, Nils G.

    2016-01-01

    The spliceosome is the dynamic RNA-protein machine responsible for faithfully splicing introns from precursor messenger RNAs (pre-mRNAs). Many of the dynamic processes required for the proper assembly, catalytic activation, and disassembly of the spliceosome as it acts on its pre-mRNA substrate remain poorly understood, a challenge that persists for many biomolecular machines. Here, we developed a fluorescence-based Single Molecule Cluster Analysis (SiMCAn) tool to dissect the manifold conformational dynamics of a pre-mRNA through the splicing cycle. By clustering common dynamic behaviors derived from selectively blocked splicing reactions, SiMCAn was able to identify signature conformations and dynamic behaviors of multiple ATP-dependent intermediates. In addition, it identified a conformation adopted late in splicing by a 3′ splice site mutant, invoking a mechanism for substrate proofreading. SiMCAn presents a novel framework for interpreting complex single molecule behaviors that should prove widely useful for the comprehensive analysis of a plethora of dynamic cellular machines. PMID:26414013
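
    SiMCAn itself clusters hidden-Markov-idealized smFRET trajectories; purely as an illustration, the general idea of grouping single-molecule traces by common dynamic behavior can be sketched with a minimal k-means on per-trace summary features. The data below are synthetic and the feature choice is an assumption, not the authors' pipeline:

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Minimal Lloyd's k-means with farthest-point initialization.
    Sketch only: assumes well-separated clusters (no empty-cluster guard)."""
    centers = [X[0]]
    for _ in range(k - 1):  # pick each new center far from the existing ones
        d = ((X[:, None, :] - np.array(centers)[None]) ** 2).sum(-1).min(1)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels

# Synthetic 'traces': static-low, static-high, and dynamic (switching) signals
rng = np.random.default_rng(1)
low = rng.normal(0.2, 0.05, (20, 100))
high = rng.normal(0.8, 0.05, (20, 100))
dyn = np.where(rng.random((20, 100)) < 0.5, 0.2, 0.8) + rng.normal(0, 0.05, (20, 100))
traces = np.vstack([low, high, dyn])

# Describe each trace by two summary features: mean level and variability
features = np.column_stack([traces.mean(1), traces.std(1)])
labels = kmeans(features, 3)
```

    The three behavioral classes separate cleanly in this feature space; the published method replaces these hand-picked features with full hidden-Markov state sequences.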

  20. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    Science.gov (United States)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. At the core of DRIFT is a three-layered design process model of action, model operation and argumentation, which integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations; the model operation level captures the transition of design states, recording a design snapshot across design tools; and the argumentation level captures the process of setting problems and alternatives. Linking the three levels makes it possible to capture and manage iterative hypothesis-and-verification processes automatically and efficiently through design operations across design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives, and a TMS (Truth Maintenance System) mechanism is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem, and concludes with a discussion of future issues.

  1. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix, resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data, employing newly introduced graphical and coherence metrics.
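
    For readers unfamiliar with the reduction methods discussed, Classical Guyan Reduction condenses the full stiffness and mass matrices onto a set of retained "master" DOFs through the static transformation T = [I; -Kss^{-1} Ksm]. A minimal sketch on a hypothetical 4-DOF spring-mass chain (not an SLS model); the modified and harmonic variants in the paper refine this basic step:

```python
import numpy as np

def guyan_reduce(K, M, masters):
    """Classical Guyan (static) reduction: condense stiffness K and mass M
    onto the retained 'master' DOFs, assuming slaves follow statically."""
    n = K.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    Kss = K[np.ix_(slaves, slaves)]
    Ksm = K[np.ix_(slaves, masters)]
    # Transformation x = T q: masters stay, slaves = -Kss^{-1} Ksm @ masters
    T = np.vstack([np.eye(len(masters)), -np.linalg.solve(Kss, Ksm)])
    order = np.ix_(masters + slaves, masters + slaves)  # reorder to [m; s]
    Kr = T.T @ K[order] @ T
    Mr = T.T @ M[order] @ T
    return Kr, Mr

# Hypothetical 4-DOF spring-mass chain; most mass is lumped at the end DOFs,
# which are retained as masters (Guyan is accurate when slave mass is small)
k = 1000.0
K = k * np.array([[ 2.0, -1.0,  0.0,  0.0],
                  [-1.0,  2.0, -1.0,  0.0],
                  [ 0.0, -1.0,  2.0, -1.0],
                  [ 0.0,  0.0, -1.0,  2.0]])
M = np.diag([1.0, 0.01, 0.01, 1.0])
Kr, Mr = guyan_reduce(K, M, [0, 3])
```

    The reduced 2x2 model reproduces the lowest "body dominant" frequency of the full chain almost exactly here; when slave inertia is significant, Guyan reduction overestimates frequencies, which is the difficulty MGR and HR address.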

  2. The ICDAR 2009 Signature Verification Competition

    NARCIS (Netherlands)

    Blankers, V.L.; Heuvel, C.E. van den; Franke, K.Y.; Vuurpijl, L.G.

    2009-01-01

    Recent results of forgery detection obtained by applying biometric signature verification methods are promising. At present, forensic signature verification in daily casework is performed through visual examination by trained forensic handwriting experts, without reliance on computer-assisted methods.

  3. Characterizing the anthropogenic signature in the LCLU dynamics in the Central Asia region

    Science.gov (United States)

    Tatarskii, V.; Sokolik, I. N.; de Beurs, K.; Shiklomanov, A. I.

    2017-12-01

    Humans have been changing LCLU dynamics over time throughout the world. In the Central Asia region, these changes have been especially pronounced due to the political and economic transformation. We present a detailed analysis focusing on identifying and quantifying the anthropogenic signature in water and land use across the region. We have characterized anthropogenic dust emission by combining modeling and observations. The model is a fully coupled model called WRF-Chem-DuMo that explicitly takes vegetation treatment into account in modeling dust emission. We have reconstructed the anthropogenic dust sources in the region, such as the retreat of the Aral Sea and changes in agricultural fields. In addition, we characterize anthropogenic water use dynamics, including changes in water use for agricultural production. Furthermore, we perform an analysis to identify the anthropogenic signature in the NDVI pattern. The NDVI data were analyzed in conjunction with meteorological fields simulated at high spatial resolution using the WRF model. Meteorological fields of precipitation and temperature were used in a correlation analysis to separate natural from anthropogenic changes. In this manner, we were able to identify the regions that have been affected by human activities. We will present a quantitative assessment of the anthropogenic changes and address the diverse consequences for the economy of the region as well as for the environment.
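
    The correlation analysis described above, separating climate-driven from human-driven vegetation change, can be sketched as a simple regression screen: pixels whose NDVI time series is poorly explained by precipitation and temperature are flagged as candidates for an anthropogenic signature. All data below are synthetic illustrations, not the authors' datasets:

```python
import numpy as np

rng = np.random.default_rng(5)
n_years = 30
precip = rng.gamma(4.0, 50.0, n_years)  # hypothetical annual precipitation (mm)
temp = rng.normal(12.0, 1.5, n_years)   # hypothetical mean annual temperature (C)

# Pixel A: NDVI tracks climate; pixel B: monotonic human-driven decline
ndvi_climate = 0.002 * precip - 0.01 * temp + rng.normal(0, 0.01, n_years)
ndvi_human = 0.5 - 0.01 * np.arange(n_years) + rng.normal(0, 0.01, n_years)

def climate_r2(ndvi, precip, temp):
    """Fraction of NDVI variance explained by a linear climate model."""
    X = np.column_stack([np.ones(len(ndvi)), precip, temp])
    beta, *_ = np.linalg.lstsq(X, ndvi, rcond=None)
    resid = ndvi - X @ beta
    return 1.0 - resid.var() / ndvi.var()

r2_climate = climate_r2(ndvi_climate, precip, temp)  # high: natural variability
r2_human = climate_r2(ndvi_human, precip, temp)      # low: anthropogenic flag
```

    Thresholding the explained variance per pixel yields a map of regions where vegetation change cannot be attributed to meteorology alone.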

  4. The Temporal Signature of Memories: Identification of a General Mechanism for Dynamic Memory Replay in Humans

    Science.gov (United States)

    Michelmann, Sebastian; Bowman, Howard; Hanslmayr, Simon

    2016-01-01

    Reinstatement of dynamic memories requires the replay of neural patterns that unfold over time in a similar manner as during perception. However, little is known about the mechanisms that guide such a temporally structured replay in humans, because previous studies used either unsuitable methods or paradigms to address this question. Here, we overcome these limitations by developing a new analysis method to detect the replay of temporal patterns in a paradigm that requires participants to mentally replay short sound or video clips. We show that memory reinstatement is accompanied by a decrease of low-frequency (8 Hz) power, which carries a temporal phase signature of the replayed stimulus. These replay effects were evident in the visual as well as in the auditory domain and were localized to sensory-specific regions. These results suggest low-frequency phase to be a domain-general mechanism that orchestrates dynamic memory replay in humans. PMID:27494601

  5. The Temporal Signature of Memories: Identification of a General Mechanism for Dynamic Memory Replay in Humans.

    Directory of Open Access Journals (Sweden)

    Sebastian Michelmann

    2016-08-01

    Full Text Available Reinstatement of dynamic memories requires the replay of neural patterns that unfold over time in a similar manner as during perception. However, little is known about the mechanisms that guide such a temporally structured replay in humans, because previous studies used either unsuitable methods or paradigms to address this question. Here, we overcome these limitations by developing a new analysis method to detect the replay of temporal patterns in a paradigm that requires participants to mentally replay short sound or video clips. We show that memory reinstatement is accompanied by a decrease of low-frequency (8 Hz) power, which carries a temporal phase signature of the replayed stimulus. These replay effects were evident in the visual as well as in the auditory domain and were localized to sensory-specific regions. These results suggest low-frequency phase to be a domain-general mechanism that orchestrates dynamic memory replay in humans.

  6. Extended verification of the model of dynamic near-surface layer of the atmosphere

    Science.gov (United States)

    Polnikov, V. G.

    2013-07-01

    This paper formulates the most general principles for verifying models of the dynamic near-water layer of the atmosphere (DNWLA) and performs an advanced verification of the model proposed by the author earlier [6]. Based on empirical wave spectra from the studies by Donelan [15], Elfouhaily [14], and Kudryavtsev [13] and well-known empirical laws describing the wave-age dependence of the friction coefficient, we adjusted the original version of the model. It was shown that the improvement of model reliability is most dependent on the adequacy of the parameterization of the tangential portion of the total momentum flux to the wavy surface. Then the new version of the model was verified on the basis of field data from two different groups of authors. It was found that the new version of the model is consistent with empirical data with an error not exceeding the measurement error of near-water layer parameters.

  7. Dynamic transcriptional signatures and network responses for clinical symptoms in influenza-infected human subjects using systems biology approaches.

    Science.gov (United States)

    Linel, Patrice; Wu, Shuang; Deng, Nan; Wu, Hulin

    2014-10-01

    Recent studies demonstrate that human blood transcriptional signatures may be used to support diagnosis and clinical decisions for acute respiratory viral infections such as influenza. In this article, we propose to use a newly developed systems biology approach for time-course gene expression data to identify significantly dynamic response genes and dynamic gene network responses to viral infection. We illustrate the methodological pipeline by reanalyzing time-course gene expression data from a study in which healthy human subjects were challenged with live influenza virus. We observed clear differences in the number of significant dynamic response genes (DRGs) between the symptomatic and asymptomatic subjects and also identified DRG signatures for symptomatic subjects with influenza infection. The 505 common DRGs shared by the symptomatic subjects are highly consistent with the signature genes for predicting viral infection identified in previous work. The temporal response patterns and network response features were carefully analyzed and investigated.
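
    The paper's pipeline is a functional-data approach; as a rough illustration of the screening step only, genes can be ranked by how much of their temporal variation is captured by a smooth trend in time. The data, the polynomial trend model, and the score are hypothetical stand-ins for the authors' method:

```python
import numpy as np

rng = np.random.default_rng(2)
n_genes, n_times = 200, 10
t = np.linspace(0.0, 1.0, n_times)

# Hypothetical expression matrix: mostly noise, first 20 genes truly dynamic
expr = rng.normal(0.0, 1.0, (n_genes, n_times))
expr[:20] += 3.0 * np.sin(2 * np.pi * t)  # injected smooth temporal response

def dynamic_score(y, t, degree=4):
    """R^2 of a smooth polynomial trend in time: high for dynamic genes."""
    fit = np.polyval(np.polyfit(t, y, degree), t)
    return 1.0 - np.var(y - fit) / np.var(y)

scores = np.array([dynamic_score(g, t) for g in expr])
candidate_drgs = np.argsort(scores)[::-1]  # genes ranked most-dynamic first
```

    Genes with an injected temporal response score markedly higher than flat noisy genes; a real analysis would add replicate information and formal significance testing.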

  8. Experimental verification of dynamic radioecological models established after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Voigt, G.; Mueller, H.; Proehl, G.; Stocke, H.; Paretzke, H.G.

    1991-01-01

    The experiments reported were carried out to verify existing dynamic radioecological models, especially the ECOSYS model. The database used for the verification covers the radioactivity concentrations of Cs-134, Cs-137 and I-131 measured in foodstuffs and environmental samples after the Chernobyl reactor accident, and the results of field experiments on radionuclide translocation after foliar uptake or root absorption by edible plants. The measured data were compared with the model predictions for the radionuclides under review. The Cs-134 and Cs-137 translocation factors, which describe the redistribution of these radionuclides in the plant after foliar uptake, were determined experimentally by a single sprinkling with Chernobyl rainwater and, as a function of sprinkling time, were measured to be: winter wheat, 0.002-0.13; spring wheat, 0.003-0.09; winter rye, 0.002-0.27; barley, 0.002-0.04; potatoes, 0.05-0.35; carrots, 0.02-0.07; bush beans, 0.04-0.3; cabbage, 0.1-0.5. The weathering half-life of the radionuclides in lettuce was determined to be ten days. Transfer factors for root absorption of Cs-137 averaged 0.002 for grains, 0.002 for potatoes, 0.004 for white cabbage, 0.003 for bush beans and carrots, and 0.007 for lettuce. The ECOSYS model predictions agreed with the measured radioactivity concentrations of the corresponding radionuclides. (orig./HP) [de]
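
    Dynamic models such as ECOSYS describe the post-deposition decline of surface activity with first-order processes. Using the ten-day weathering half-life reported above for lettuce, a minimal sketch (deliberately ignoring radioactive decay and growth dilution, which a full model also includes):

```python
import math

def surface_activity(c0, t_days, half_life_days=10.0):
    """First-order weathering loss of foliar-deposited activity.
    The 10-day half-life is the value reported for lettuce; a full
    model like ECOSYS also includes radioactive decay and growth
    dilution, omitted in this sketch."""
    lam = math.log(2.0) / half_life_days  # first-order rate constant
    return c0 * math.exp(-lam * t_days)

# After one weathering half-life the deposited concentration halves
c10 = surface_activity(100.0, 10.0)
```

    The same exponential building block, chained with decay and transfer-factor terms, is how compartment models propagate deposition through the food chain.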

  9. Assessment of signature handwriting evidence via score-based likelihood ratio based on comparative measurement of relevant dynamic features.

    Science.gov (United States)

    Chen, Xiao-Hong; Champod, Christophe; Yang, Xu; Shi, Shao-Pei; Luo, Yi-Wen; Wang, Nan; Wang, Ya-Chen; Lu, Qi-Meng

    2018-01-01

    This paper extends previous research on the extraction and statistical analysis of relevant dynamic features (width, grayscale and radian, combined with writing-sequence information) in forensic handwriting examinations. In this paper, a larger signature database was gathered, including genuine signatures, freehand imitation signatures, random forgeries and tracing imitation signatures, all of which are often encountered in casework. After applying Principal Component Analysis (PCA) to the variables describing the proximity between specimens, a two-dimensional kernel density estimation was used to describe the variability of within-genuine comparisons and genuine-forgery comparisons. We show that the overlap between the within-genuine comparisons and the genuine-forgery comparisons depends on both the imitated writer and the forger. Then, to simulate casework conditions, cases were simulated by random sampling from the collected signature dataset. Three-dimensional normal density estimation was used to estimate the numerator and denominator probability distributions used to compute a likelihood ratio (LR). Comparisons between the performance of the systems in SigComp2011 (based on static features) and the method presented in this paper (based on relevant dynamic features) showed that relevant dynamic features outperform static features in terms of accuracy, false acceptance rate, false rejection rate and calibration of likelihood ratios. Copyright © 2017 Elsevier B.V. All rights reserved.
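
    The score-based likelihood-ratio idea can be sketched in one dimension: estimate the density of comparison scores for within-genuine and genuine-forgery pairs, then evaluate their ratio at the score of a questioned signature. The score distributions below are synthetic, and the paper itself works with two- and three-dimensional density estimates:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
# Hypothetical comparison scores (e.g. feature distances): within-genuine
# comparisons cluster at low values, genuine-forgery comparisons at high values
within_genuine = rng.normal(1.0, 0.3, 500)
genuine_forgery = rng.normal(3.0, 0.6, 500)

f_same = gaussian_kde(within_genuine)   # density under 'same writer'
f_diff = gaussian_kde(genuine_forgery)  # density under 'forgery'

def likelihood_ratio(score):
    """LR > 1 supports the same-writer proposition, LR < 1 the forgery one."""
    return f_same(score)[0] / f_diff(score)[0]
```

    A score typical of genuine repeats yields LR well above 1, while a score typical of forgeries yields LR well below 1; calibration of these LR values is what the paper evaluates against SigComp2011 systems.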

  10. Static-Analysis Assisted Dynamic Verification of MPI Waitany Programs (Poster Abstract)

    Science.gov (United States)

    Vakkalanka, Sarvani; Szubzda, Grzegorz; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    It is well known that the number of schedules (interleavings) of a concurrent program grows exponentially with the number of processes. Our previous work has demonstrated the advantages of an MPI-specific dynamic partial order reduction (DPOR, [5]) algorithm called POE in a tool called ISP [1,2,3] in dramatically reducing the number of interleavings during formal dynamic verification. Higher degrees of interleaving reduction were achieved when the programs were deterministic. In this work, we consider the problem of verifying MPI programs that use MPI_Waitany (and the related Waitsome/Waitall and Testany/Testsome/Testall operations). Such programs potentially have a higher degree of non-determinism, and for them POE can become ineffective, as shown momentarily. To solve this problem, we employ static analysis (supported by ROSE [4]) in a role supporting POE, determining the extent to which the out-parameters of MPI_Waitany can affect subsequent control-flow statements. This informs ISP's scheduler to exert even more intelligent backtrack/replay control.

  11. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

    Full Text Available A new analytical model for the prediction of concrete response under uniaxial compression and its experimental verification is presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines rheological-dynamical analogy and damage mechanics. Within the framework of this approach the key continuum parameters such as the creep coefficient, Poisson’s ratio and damage variable are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of uniformly accelerated motion of load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Projekat Ministarstva nauke Republike Srbije, br. ON 174027: Computational Mechanics in Structural Engineering i br. TR 36017: Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: Investigation and environmental assessment of possible applications

  12. Impact of seaweed beachings on dynamics of δ15N isotopic signatures in marine macroalgae

    International Nuclear Information System (INIS)

    Lemesle, Stéphanie; Mussio, Isabelle; Rusig, Anne-Marie; Menet-Nédélec, Florence; Claquin, Pascal

    2015-01-01

    Highlights: two coastal sites (COU, GM) in the Bay of Seine affected by summer seaweed beachings; the same temporal dynamics of the algal δ15N at the two sites; N and P concentrations in the seawater of the two sites dominated by riverine sources; a coupling between seaweed beachings and N sources of intertidal macroalgae. Abstract: A fine-scale survey of δ15N, δ13C and tissue-N in seaweeds was conducted using samples from 17 sampling points at two sites (Grandcamp-Maisy (GM), Courseulles/Mer (COU)) along the French coast of the English Channel in 2012 and 2013. Partial triadic analysis was performed on the parameter data sets and revealed the functioning of three areas: one estuary (EstA) and two rocky areas (GM*, COU*). In contrast to oceanic and anthropogenic reference points, similar temporal dynamics characterized the δ15N signatures and N contents at GM* and COU*. Nutrient dynamics were also similar: the N concentrations in seawater originated from the River Seine and local coastal rivers, while P concentrations came mainly from these local rivers. δ15N at GM* was linked to turbidity, suggesting inputs of autochthonous organic matter from large-scale summer seaweed beachings made up of a mixture of Rhodophyta, Phaeophyta and Chlorophyta species. This study highlights the coupling between seaweed beachings and the nitrogen sources of intertidal macroalgae.

  13. Biological signatures of dynamic river networks from a coupled landscape evolution and neutral community model

    Science.gov (United States)

    Stokes, M.; Perron, J. T.

    2017-12-01

    Freshwater systems host exceptionally species-rich communities whose spatial structure is dictated by the topology of the river networks they inhabit. Over geologic time, river networks are dynamic; drainage basins shrink and grow, and river capture establishes new connections between previously separated regions. It has been hypothesized that these changes in river network structure influence the evolution of life by exchanging and isolating species, perhaps boosting biodiversity in the process. However, no general model exists to predict the evolutionary consequences of landscape change. We couple a neutral community model of freshwater organisms to a landscape evolution model in which the river network undergoes drainage divide migration and repeated river capture. Neutral community models are macro-ecological models that include stochastic speciation and dispersal to produce realistic patterns of biodiversity. We explore the consequences of three modes of speciation - point mutation, time-protracted, and vicariant (geographic) speciation - by tracking patterns of diversity in time and comparing the final result to an equilibrium solution of the neutral model on the final landscape. Under point mutation, a simple model of stochastic and instantaneous speciation, the results are identical to the equilibrium solution and indicate the dominance of the species-area relationship in forming patterns of diversity. The number of species in a basin is proportional to its area, and regional species richness reaches its maximum when drainage area is evenly distributed among sub-basins. Time-protracted speciation is also modeled as a stochastic process, but in order to produce more realistic rates of diversification, speciation is not assumed to be instantaneous. Rather, each new species must persist for a certain amount of time before it is considered to be established. When vicariance (geographic speciation) is included, there is a transient signature of increased
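
    Under point-mutation speciation the model reduces to Hubbell-style zero-sum neutral dynamics, whose species-area behavior can be sketched directly. Parameters, basin sizes and the speciation rate below are hypothetical:

```python
import numpy as np

def neutral_richness(J, nu, steps=20000, seed=0):
    """Zero-sum neutral community with point-mutation speciation: at each
    step a random individual dies and is replaced either by a brand-new
    species (probability nu) or by the offspring of a random survivor."""
    rng = np.random.default_rng(seed)
    community = np.zeros(J, dtype=int)  # start as a monoculture
    next_id = 1
    for _ in range(steps):
        dead = rng.integers(J)
        if rng.random() < nu:
            community[dead] = next_id  # speciation event
            next_id += 1
        else:
            community[dead] = community[rng.integers(J)]
    return len(np.unique(community))

# Species richness grows with basin size (the species-area relationship)
small_basin = neutral_richness(J=100, nu=0.01)
large_basin = neutral_richness(J=1000, nu=0.01)
```

    A larger basin supports more species at the same speciation rate, which is the species-area dominance the abstract reports for the point-mutation mode; coupling the basin sizes to an evolving drainage network is the paper's extension.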

  14. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  15. A method for geometrical verification of dynamic intensity modulated radiotherapy using a scanning electronic portal imaging device

    International Nuclear Information System (INIS)

    Ploeger, Lennert S.; Smitsmans, Monique H.P.; Gilhuijs, Kenneth G.A.; Herk, Marcel van

    2002-01-01

    In order to guarantee the safe delivery of dynamic intensity modulated radiotherapy (IMRT), verification of the leaf trajectories during the treatment is necessary. Our aim in this study is to develop a method for on-line verification of leaf trajectories using an electronic portal imaging device with scanning read-out, independent of the multileaf collimator. Examples of such scanning imagers are electronic portal imaging devices (EPIDs) based on liquid-filled ionization chambers and those based on amorphous silicon. Portal images were acquired continuously with a liquid-filled ionization chamber EPID during the delivery, together with the signal of treatment progress that is generated by the accelerator. For each portal image, the prescribed leaf and diaphragm positions were computed from the dynamic prescription and the progress information. Motion distortion effects of the leaves are corrected based on the treatment progress that is recorded for each image row. The aperture formed by the prescribed leaves and diaphragms is used as the reference field edge, while the actual field edge is found using a maximum-gradient edge detector. The errors in leaf and diaphragm position are found from the deviations between the reference field edge and the detected field edge. Earlier measurements of the dynamic EPID response show that the accuracy of the detected field edge is better than 1 mm. To ensure that the verification is independent of inaccuracies in the acquired progress signal, the signal was checked with diode measurements beforehand. The method was tested on three different dynamic prescriptions. Using the described method, we correctly reproduced the distorted field edges. Verifying a single portal image took 0.1 s on an 866 MHz personal computer. Two flaws in the control system of our experimental dynamic multileaf collimator were correctly revealed with our method. First, the errors in leaf position increase with leaf speed, indicating a delay of
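
    The maximum-gradient edge detector mentioned above can be sketched for a single image row: the field edges are taken at the extremes of the intensity gradient. The pixel size and profile values below are hypothetical, not the EPID's actual geometry:

```python
import numpy as np

def detect_field_edges(profile, mm_per_pixel=0.5):
    """Maximum-gradient edge detection along one EPID image row:
    the field edges sit at the steepest rise and fall of intensity."""
    g = np.gradient(profile.astype(float))
    left = int(np.argmax(g))   # steepest rise: beam enters the open field
    right = int(np.argmin(g))  # steepest fall: beam leaves the open field
    return left * mm_per_pixel, right * mm_per_pixel

# Hypothetical row: background signal 10, open field 100 over pixels 40-119
row = np.full(200, 10.0)
row[40:120] = 100.0
left_mm, right_mm = detect_field_edges(row)
```

    Comparing the detected edge positions against the prescribed leaf positions for the same treatment-progress value gives the leaf-position error the method reports.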

  16. Dynamic oscillatory signatures of central neuropathic pain in spinal cord injury.

    Science.gov (United States)

    Vuckovic, Aleksandra; Hasan, Muhammad A; Fraser, Matthew; Conway, Bernard A; Nasseroleslami, Bahman; Allan, David B

    2014-06-01

    Central neuropathic pain (CNP) is believed to be accompanied by increased activation of the sensorimotor cortex. Our knowledge of this interaction is based mainly on functional magnetic resonance imaging studies, but there is little direct evidence of how these changes manifest in terms of dynamic neuronal activity. This study reports on the presence of transient electroencephalography (EEG)-based measures of brain activity during motor imagery in spinal cord-injured patients with CNP. We analyzed dynamic EEG responses during imaginary movements of arms and legs in 3 groups of 10 volunteers each, comprising able-bodied people, paraplegic patients with CNP (lower abdomen and legs), and paraplegic patients without CNP. Paraplegic patients with CNP had increased event-related desynchronization in the theta, alpha, and beta (16-24 Hz) bands during imagination of movement of both nonpainful (arms) and painful (legs) limbs. Compared to patients with CNP, paraplegics with no pain showed much reduced power in the relaxed state and reduced event-related desynchronization during imagination of movement. Understanding these complex dynamic, frequency-specific activations in CNP in the absence of nociceptive stimuli could inform the design of interventional therapies for patients with CNP and possibly further understanding of the mechanisms involved. This study compares the EEG activity of spinal cord-injured patients with CNP to that of spinal cord-injured patients with no pain and also to that of able-bodied people. The study shows that the presence of CNP itself leads to frequency-specific EEG signatures that could be used to monitor CNP and inform neuromodulatory treatments of this type of pain. Copyright © 2014 American Pain Society. Published by Elsevier Inc. All rights reserved.
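
    Event-related desynchronization is conventionally quantified as the relative band-power drop from a reference interval, ERD% = 100 * (P_ref - P_event) / P_ref. A minimal sketch for the 16-24 Hz beta band; the signals and sampling rate are synthetic, not the study's recordings:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Mean FFT-periodogram power of x within the [f_lo, f_hi] band."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].mean()

def erd_percent(reference, event, fs, f_lo=16.0, f_hi=24.0):
    """ERD% = 100 * (P_ref - P_event) / P_ref; positive = desynchronization."""
    p_ref = band_power(reference, fs, f_lo, f_hi)
    return 100.0 * (p_ref - band_power(event, fs, f_lo, f_hi)) / p_ref

# Synthetic example: a 20 Hz beta rhythm attenuated during motor imagery
fs = 250
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(4)
reference = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.normal(size=t.size)
event = 0.3 * np.sin(2 * np.pi * 20 * t) + 0.1 * rng.normal(size=t.size)
erd = erd_percent(reference, event, fs)
```

    A strong attenuation of the rhythm during imagery yields a large positive ERD%, the quantity the study compares across the three groups.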

  17. Comparison of rapid assessment techniques for signatures of dynamic equilibrium in a disturbed stream

    Science.gov (United States)

    Blersch, S. S.; Habberfield, M.

    2009-12-01

Dynamic equilibrium is a concept that ecologists and fluvial geomorphologists readily use to describe natural systems in their respective fields. The existence of numerous rapid assessment techniques to quantify the dynamic nature of streams, however, shows that consensus has not yet been reached on what a guiding image might be. A brief review of these techniques shows a disparity of perspective between the various academic disciplines that focus on streams; indeed, traditional biology has its index of biotic integrity, geomorphology has its channel stability schemes, and hydrology has its discharge relationship curves. Reviewing these assessment techniques in the context of general systems theory provides a means to select techniques that will inform the dynamic image of the river in its ecologically restored state. In terms of assessment for restoration planning, on which elements of the signature should one focus—the drivers (e.g. the physical properties of the stream) or the responses (e.g. the biological community)? What is the appropriate scale at which one should be working within a stream to determine its state of dynamic equilibrium? To answer these questions, visual-based assessment techniques were compared and contrasted at a potential stream restoration site at an active gravel mining area in Elton Creek in Western New York. Two techniques focused on channel stability only, while the third included channel stability and biological indicators. A comparison was made to determine if different rankings would result using each method, and whether one or the other was more advantageous for establishing restoration design criteria. The two channel stability methods ranked all four reaches tested in the same order, while the biological assessment ranked two of the four differently. The two reaches that did not have the same ranking were fairly close in their overall scores, and were classified in the same category in each method. Those methods with a larger

  18. The Gravitational Wave Signature of Stellar Collapse and Dynamics of Compact Stars

    Science.gov (United States)

    Abdikamalov, E. B.

    2009-10-01

This thesis is devoted to the study of the gravitational wave (GW) signature of stellar collapse and the dynamical behavior of compact stars. The thesis consists of two parts. In the first one, we study the dynamics of the phase-transition-induced collapse of neutron stars (NSs) and the accretion-induced collapse of white dwarfs (WDs) as well as the associated GW emission. The second part is concerned with the study of the effects of general relativity on the magnetosphere of oscillating NSs. An increase in the central density of a NS may trigger a phase transition from hadronic matter to deconfined quark matter in the core, causing it to collapse to a more compact hybrid-star configuration. We present a study of this, using general relativistic hydrodynamics simulations with a simplified equation of state and considering the case of supersonic phase transition. We confirm that the emitted GW spectrum is dominated by the fundamental quasi-radial and quadrupolar pulsation modes. We observe a nonlinear mode resonance which substantially enhances the emission in some cases. We explain the damping mechanisms operating and estimate the detectability of the GWs. In massive accreting oxygen-neon WDs, their core material may in several circumstances experience rapid electron captures leading to collapse of the WD to a protoneutron star and collapse-driven supernova (SN) explosion. This process is called accretion-induced collapse (AIC) and represents a path alternative to thermonuclear disruption of accreting WDs in Type Ia SNe. An AIC-driven SN explosion is expected to be weak and of short duration, making it hard to detect by electromagnetic means alone. Neutrino and GW observations may provide crucial information necessary to reveal a potential AIC event. 
Motivated by the need for systematic predictions of the GW signature of AIC, we present results from an extensive set of general-relativistic simulations of AIC using a microphysical finite-temperature equation of state

  19. Analysis of signature wrapping attacks and countermeasures

    DEFF Research Database (Denmark)

    Gajek, Sebastian; Jensen, Meiko; Liao, Lijun

    2009-01-01

In recent research it turned out that Boolean verification of digital signatures in the context of WS-Security is likely to fail: if parts of a SOAP message are signed and the signature verification applied to the whole document returns true, then nevertheless the document may have been...

  20. Dynamic responses to silicon in Thalassiosira pseudonana - Identification, characterisation and classification of signature genes and their corresponding protein motifs

    OpenAIRE

    Brembu, Tore; Chauton, Matilde Skogen; Winge, Per; Bones, Atle M.; Vadstein, Olav

    2017-01-01

    The diatom cell wall, or frustule, is a highly complex, three-dimensional structure consisting of nanopatterned silica as well as proteins and other organic components. While some key components have been identified, knowledge on frustule biosynthesis is still fragmented. The model diatom Thalassiosira pseudonana was subjected to silicon (Si) shift-up and shift-down situations. Cellular and molecular signatures, dynamic changes and co-regulated clusters representing the hallmarks of cellular ...

  1. Secure Hashing of Dynamic Hand Signatures Using Wavelet-Fourier Compression with BioPhasor Mixing and Discretization

    Directory of Open Access Journals (Sweden)

    Wai Kuan Yip

    2007-01-01

Full Text Available We introduce a novel method for secure computation of biometric hash on dynamic hand signatures using BioPhasor mixing and discretization. The use of BioPhasor as the mixing process provides a one-way transformation that precludes exact recovery of the biometric vector from compromised hashes and stolen tokens. In addition, our user-specific discretization acts both as an error correction step as well as a real-to-binary space converter. We also propose a new method of extracting compressed representation of dynamic hand signatures using discrete wavelet transform (DWT) and discrete Fourier transform (DFT). Without the conventional use of dynamic time warping, the proposed method avoids storage of user's hand signature template. This is an important consideration for protecting the privacy of the biometric owner. Our results show that the proposed method could produce stable and distinguishable bit strings with equal error rates (EERs) of and for random and skilled forgeries for stolen token (worst case scenario), and for both forgeries in the genuine token (optimal scenario).
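The paper's exact DWT-DFT pipeline is not reproduced in this abstract; the sketch below only illustrates the general wavelet-then-Fourier compression idea (a hand-rolled one-level Haar step repeated twice, followed by keeping a few DFT magnitudes). All sizes and names are assumptions:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar DWT: pairwise scaled sums (approximation)
    and differences (detail)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return a, d

def compress_signature(samples, levels=2, n_freq=8):
    """Illustrative compression: repeated Haar approximation, then the first
    n_freq DFT magnitudes of the result (a compact, phase-insensitive summary)."""
    a = np.asarray(samples, dtype=float)
    for _ in range(levels):
        a, _ = haar_step(a)
    spectrum = np.abs(np.fft.rfft(a))
    return spectrum[:n_freq]

# A hypothetical 64-sample pen-position trace compresses to 8 coefficients.
trace = np.sin(np.linspace(0, 4 * np.pi, 64))
print(compress_signature(trace).shape)  # (8,)
```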

  2. A signature of attractor dynamics in the CA3 region of the hippocampus.

    Directory of Open Access Journals (Sweden)

    César Rennó-Costa

    2014-05-01

    Full Text Available The notion of attractor networks is the leading hypothesis for how associative memories are stored and recalled. A defining anatomical feature of such networks is excitatory recurrent connections. These "attract" the firing pattern of the network to a stored pattern, even when the external input is incomplete (pattern completion. The CA3 region of the hippocampus has been postulated to be such an attractor network; however, the experimental evidence has been ambiguous, leading to the suggestion that CA3 is not an attractor network. In order to resolve this controversy and to better understand how CA3 functions, we simulated CA3 and its input structures. In our simulation, we could reproduce critical experimental results and establish the criteria for identifying attractor properties. Notably, under conditions in which there is continuous input, the output should be "attracted" to a stored pattern. However, contrary to previous expectations, as a pattern is gradually "morphed" from one stored pattern to another, a sharp transition between output patterns is not expected. The observed firing patterns of CA3 meet these criteria and can be quantitatively accounted for by our model. Notably, as morphing proceeds, the activity pattern in the dentate gyrus changes; in contrast, the activity pattern in the downstream CA3 network is attracted to a stored pattern and thus undergoes little change. We furthermore show that other aspects of the observed firing patterns can be explained by learning that occurs during behavioral testing. The CA3 thus displays both the learning and recall signatures of an attractor network. These observations, taken together with existing anatomical and behavioral evidence, make the strong case that CA3 constructs associative memories based on attractor dynamics.
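Pattern completion by excitatory recurrent connections can be illustrated with a minimal Hopfield-style network, a classical stand-in for (not a model of) the CA3 circuit simulated in the paper: a degraded cue is "attracted" back to the stored pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hopfield(patterns):
    """Hebbian weights from +/-1 patterns; recurrent excitation stores attractors."""
    n = patterns.shape[1]
    w = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w / n

def recall(w, cue, steps=10):
    """Synchronous updates pull the cue toward the nearest stored pattern."""
    s = cue.copy()
    for _ in range(steps):
        s = np.where(w @ s >= 0, 1, -1)
    return s

n = 100
stored = rng.choice([-1, 1], size=(2, n))
w = train_hopfield(stored)

# Degrade pattern 0: flip 20 of its 100 units, then let the network complete it.
cue = stored[0].copy()
flip = rng.choice(n, size=20, replace=False)
cue[flip] *= -1
print(np.array_equal(recall(w, cue), stored[0]))  # True
```

With only two stored patterns in 100 units, the network is far below capacity, so a 20% degraded cue is completed exactly.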

  3. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

Full Text Available This work is devoted to an investigation of threshold signature schemes. The systematization of the threshold signature schemes was done; cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were examined. Different methods of generation and verification of threshold signatures were explored, and the availability of practical usage of threshold schemes in mobile agents, Internet banking and e-currency was shown. Topics for further investigation were given that could reduce the level of counterfeit electronic documents signed by a group of users.
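The Lagrange-interpolation construction mentioned above underlies Shamir-style (t, n)-threshold schemes: the signing secret is the constant term of a random degree-(t-1) polynomial, any t shares reconstruct it, and fewer reveal nothing. A minimal sketch over a prime field (the prime and parameters are illustrative):

```python
import random

P = 2**61 - 1  # a Mersenne prime; the shared secret lives in GF(P)

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it (Shamir)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return secret

shares = make_shares(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789)   # True: any 3 of the 5 shares suffice
print(reconstruct(shares[1:4]) == 123456789)  # True
```

Real threshold *signature* schemes interpolate "in the exponent" (on elliptic-curve points) so the secret is never reassembled; the polynomial arithmetic is the same.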

  4. Reuse and Abstraction in Verification: Agents Acting in a Dynamic Environment

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.; de Vries, W.; Ciancarini, P.; Wooldridge, M.J.

    2001-01-01

    To make verification a manageable part of the system development process, comprehensibility and reusability of properties and proofs is essential. The work reported in this paper contributes formally founded methods that support proof structuring and reuse. Often occurring patterns in agent

  5. Six years of experience in the planning and verification of the IMRT dynamics with portal dosimetry

    International Nuclear Information System (INIS)

    Molina Lopez, M. Y.; Pardo Perez, E.; Ruiz Maqueda, S.; Castro Novais, J.; Diaz Gavela, A. A.

    2013-01-01

The objective of this study is to review the IMRT verification method used over the 6 years of operation of the radiophysics and radiation protection service, analyzing the evaluation parameters of each field for the 718 IMRT treatments verified during this period. (Author)

  6. Analysis of the effects and relationship of perceived handwritten signature's size, graphical complexity, and legibility with dynamic parameters for forged and genuine samples.

    Science.gov (United States)

    Ahmad, Sharifah Mumtazah Syed; Ling, Loo Yim; Anwar, Rina Md; Faudzi, Masyura Ahmad; Shakil, Asma

    2013-05-01

This article presents an analysis of handwritten signature dynamics belonging to two authentication groups, namely genuine and forged signature samples. Genuine signatures are initially classified based on their relative size, graphical complexity, and legibility as perceived by human examiners. A pool of dynamic features is then extracted for each signature sample in the two groups. A two-way analysis of variance (ANOVA) is carried out to investigate the effects and the relationship between the perceived classifications and the authentication groups. Homogeneity of variance was ensured through Bartlett's test prior to ANOVA testing. The results demonstrated that among all the investigated dynamic features, pen pressure is the most distinctive, differing significantly between the two authentication groups as well as across the different perceived classifications. In addition, all the relationships investigated, namely authenticity group versus size, graphical complexity, and legibility, were found to be positive for pen pressure. © 2013 American Academy of Forensic Sciences.

  7. Analysis of deviations found in the verification of dynamic IMRT treatments with XIO planner

    International Nuclear Information System (INIS)

    Martinez Ortega, J.; Castro Tejero, P.; Quintana Paz, A.

    2011-01-01

At present, intensity-modulated radiotherapy (IMRT) has become a standard technique, providing excellent results in several diseases. The complexity of this treatment technique requires experimental verification of each treatment, to confirm that the difference between the dose calculated by the planning system and the dose delivered by the accelerator is within tolerable limits. In this paper, drawing on the experience gained from the implementation of the technique, we analyze the differences obtained between the calculated dose and the dose measured in quality control.

  8. Terahertz signatures of the exciton formation dynamics in non-resonantly excited semiconductors

    Science.gov (United States)

    Kira, M.; Hoyer, W.; Koch, S. W.

    2004-03-01

    A microscopic theory for the induced terahertz (THz) absorption of semiconductors is applied to study the time-dependent system response after non-resonant optical excitation. The formation of excitonic populations from an interacting electron-hole plasma is analyzed and the characteristic THz signatures are computed. Good qualitative agreement with recent experiments is obtained.

  9. Dosimetric parameters of enhanced dynamic wedge for treatment planning and verification

    International Nuclear Information System (INIS)

    Leavitt, Dennis D.; Lee, Wing Lok; Gaffney, David K.

    1996-01-01

    Purpose/Objective: Enhanced Dynamic Wedge (EDW) is an intensity-modulated radiotherapy technique in which one collimating jaw sweeps across the field to define a desired wedge dose distribution while dose rate is modified according to jaw position. This tool enables discrete or continuous wedge angles from zero to sixty degrees for field widths from three cm to 30 cm in the direction of the wedge, and up to 40 cm perpendicular to the wedge direction. Additionally, asymmetric wedge fields not centered on the line through isocenter can be created for applications such as tangential breast irradiation. The unique range of field shapes and wedge angles introduce a new set of dosimetric challenges to be resolved before routine clinical use of EDW, and especially require that a simple set of independent dose calculation and verification techniques be developed to check computerized treatment planning results. Using terminology in common use in treatment planning, this work defines the effective wedge factor vs. field width and wedge angle, evaluates the depth dose vs. open field values, defines primary intensity functions from which specific dynamic wedges can be calculated in treatment planning systems, and describes the technique for independent calculation of Monitor Units for EDW fields. Materials and Methods: Using 6- and 18-MV beams from a CI2100C, EDW beam profiles were measured in water phantom for depths from near-surface to 30 cm for the full range of field widths and wedge angles using a linear detector array of 25 energy-compensated diodes. Asymmetric wedge field profiles were likewise measured. Depth doses were measured in water phantom using an ionization chamber sequentially positioned to depths of 30 cm. Effective wedge factors for the full range of field widths and wedge angles were measured using an ionization chamber in water-equivalent plastic at a depth of 10 cm on central axis. 
Dose profiles were calculated by computer as the summation of a series
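An independent monitor-unit check of the kind described above typically divides the prescribed dose by the product of the calibration output and the measured factors, including the effective wedge factor. The formula shape and every number below are generic textbook assumptions, not the authors' data:

```python
def monitor_units(dose_cgy, output_cgy_per_mu, output_factor, wedge_factor, pdd_percent):
    """Independent MU estimate: prescribed dose over the product of the
    calibration output (cGy/MU), field-size output factor, effective
    wedge factor, and fractional depth dose."""
    return dose_cgy / (output_cgy_per_mu * output_factor * wedge_factor
                       * pdd_percent / 100.0)

# Assumed example: 200 cGy prescribed at depth, 1.0 cGy/MU calibration,
# output factor 1.02, effective wedge factor 0.72, PDD 67% at 10 cm depth.
mu = monitor_units(200.0, 1.0, 1.02, 0.72, 67.0)
print(round(mu))  # 406
```

A hand calculation of this form serves as the sanity check against the treatment planning system's MU for each EDW field.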

  10. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  11. WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.

    Science.gov (United States)

    Poels, K; Depuydt, T; Verellen, D; De Ridder, M

    2012-06-01

To use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed the detailed logging of all applied gimbals rotations during tracking. The integration of an EPID on the Vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) by subtracting the target center (fiducial marker detection) and the field centroid. Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application we calculate gimbals log file tracking error for dynamic hidden target tests to investigate gravity effects and decoupled gimbals rotation from gantry rotation. Finally, calculating complementary cine EPID and log file tracking errors evaluated the clinical accuracy of dynamic tracking. A strong correlation was found between log file and cine EPID tracking error distribution during concurrent measurements (R=0.98). We found sensitivity in the gimbals log files to detect a systematic tracking error up to 0.5 mm. Dynamic hidden target tests showed no gravity influence on tracking performance and high degree of decoupled gimbals and gantry rotation during dynamic arc dynamic tracking. A submillimetric agreement between clinical complementary tracking error measurements was found. Redundancy of the internal gimbals log file with x-ray verification images with complementary independent cine EPID images was implemented to monitor the accuracy of
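The reported R=0.98 between log-file and cine-EPID error traces is a Pearson correlation over concurrent samples. A minimal sketch with made-up error values (the numbers below are not from the study):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length error traces."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical concurrent tracking errors (mm) from the log file and cine EPID.
e_log = [0.1, 0.3, 0.2, 0.5, 0.4]
e_epid = [0.12, 0.28, 0.22, 0.47, 0.41]
print(round(pearson_r(e_log, e_epid), 2))  # 0.99
```

A coefficient near 1 indicates the two independent measurement chains see the same error dynamics, which is the redundancy argument the abstract makes.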

  12. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for the compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving the compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. The multi-fluid formulation is still being developed; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility in the underlying MOOSE framework, BIGHORN is quite extensible, and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent for this suite of problems is to provide baseline comparison data that demonstrates the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods, and suggest best practices when using BIGHORN.

  13. Nitrate denitrification with nitrite or nitrous oxide as intermediate products: Stoichiometry, kinetics and dynamics of stable isotope signatures.

    Science.gov (United States)

    Vavilin, V A; Rytov, S V

    2015-09-01

A kinetic analysis of nitrate denitrification by a single or two species of denitrifying bacteria with glucose or ethanol as a carbon source and nitrite or nitrous oxide as intermediate products was performed using experimental data published earlier (Menyailo and Hungate, 2006; Vidal-Gavilan et al., 2013). Modified Monod kinetics was used in the dynamic biological model. Special equations were added to the common dynamic biological model to describe how isotopic fractionation between N species changes. In contrast to the generally assumed first-order kinetics, in this paper, the traditional Rayleigh equation describing stable nitrogen and oxygen isotope fractionation in nitrate was derived from the dynamic isotopic equations for any type of kinetics. In accordance with the model, in Vidal-Gavilan's experiments, the maximum specific rate of nitrate reduction proved to be lower for ethanol than for glucose. Conversely, the maximum specific rate of nitrite reduction proved to be much lower for glucose than for ethanol. Thus, the intermediate nitrite concentration was negligible for the ethanol experiment, while it was significant for the glucose experiment. In Menyailo's and Hungate's experiments, the low maximum specific rate of nitrous oxide reduction gives a high intermediate nitrous oxide concentration. The model showed that the dynamics of nitrogen and oxygen isotope signatures respond to the biological dynamics. Two microbial species, rather than a single denitrifying bacterium, proved more adequate to describe the total process of nitrate denitrification to dinitrogen. Copyright © 2015 Elsevier Ltd. All rights reserved.
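The Rayleigh equation referred to above relates the isotope signature of the residual nitrate pool to the unreacted fraction f; in its standard approximate delta-notation form it reads δ = δ₀ + ε·ln f. A numeric sketch with illustrative values (not the paper's fitted parameters):

```python
import math

def rayleigh_delta(delta0, eps, f):
    """Approximate Rayleigh model: delta-15N (permil) of the residual nitrate
    pool as the unreacted fraction f shrinks; eps is the enrichment factor."""
    return delta0 + eps * math.log(f)

# Assumed illustrative values: initial d15N = 5 permil, eps = -25 permil.
for f in (1.0, 0.5, 0.1):
    print(f, round(rayleigh_delta(5.0, -25.0, f), 1))
```

Because ε is negative, the residual nitrate grows isotopically heavier as denitrification consumes the pool, which is the signature the model tracks.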

  14. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    OpenAIRE

Shrirang Ambaji KULKARNI; Raghavendra G. RAO

    2017-01-01

Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and is subject to variable link costs due to congestion and bandwidth. Existing shortest path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply a model-based Q-Routing technique ...
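The Q-Routing approach truncated above can be sketched by its core update rule (Boyan-Littman style): a node revises its estimated delivery time to a destination via a neighbor, using the neighbor's own best estimate plus the observed queueing and transmission delays. The topology and delay values below are hypothetical:

```python
def q_routing_update(Q, node, dest, neighbor, q_delay, s_delay, alpha=0.5):
    """One Q-routing update: node revises its delivery-time estimate to dest
    via neighbor, from the neighbor's best estimate plus queue and link delays."""
    best_from_neighbor = min(Q[neighbor][dest].values())
    old = Q[node][dest][neighbor]
    target = q_delay + s_delay + best_from_neighbor
    Q[node][dest][neighbor] = old + alpha * (target - old)
    return Q[node][dest][neighbor]

# Toy 3-node line A - B - C; Q[x][dest][y] = x's estimated time to dest via y.
Q = {
    "A": {"C": {"B": 10.0}},
    "B": {"C": {"C": 2.0}},
    "C": {"C": {"C": 0.0}},
}
# A sent a packet for C via B: 1.0 time unit in A's queue, 1.0 on the link.
print(q_routing_update(Q, "A", "C", "B", q_delay=1.0, s_delay=1.0))  # 7.0
```

The estimate moves from 10.0 toward the observed target of 4.0, which is how the scheme adapts to congestion without a global view of the topology.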

  15. An efficient modified Elliptic Curve Digital Signature Algorithm | Kiros ...

    African Journals Online (AJOL)

    Many digital signatures which are based on Elliptic Curves Cryptography (ECC) have been proposed. Among these digital signatures, the Elliptic Curve Digital Signature Algorithm (ECDSA) is the widely standardized one. However, the verification process of ECDSA is slower than the signature generation process. Hence ...
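The asymmetry noted above comes from the arithmetic: signing needs one scalar multiplication (k·G) while verification needs two (u1·G + u2·Q). A toy sketch over a small textbook curve (y² = x³ + 2x + 2 over F₁₇, base point of order 19) makes the flow concrete; this is illustrative only, never production cryptography:

```python
# Toy ECDSA over E: y^2 = x^3 + 2x + 2 (mod 17), base point G = (5, 1) of order 19.
p, a, n = 17, 2, 19
G = (5, 1)

def inv(x, m):
    return pow(x, m - 2, m)  # Fermat inverse; both p and n are prime

def add(P, Q):
    """Elliptic-curve point addition; None is the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0: return None
    if P == Q:
        m = (3 * P[0] * P[0] + a) * inv(2 * P[1], p) % p
    else:
        m = (Q[1] - P[1]) * inv(Q[0] - P[0], p) % p
    x = (m * m - P[0] - Q[0]) % p
    return (x, (m * (P[0] - x) - P[1]) % p)

def mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1: R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

def sign(z, d, k):
    r = mul(k, G)[0] % n                 # one scalar multiplication
    s = inv(k, n) * (z + r * d) % n
    return r, s

def verify(z, sig, Qpub):
    r, s = sig
    w = inv(s, n)
    X = add(mul(z * w % n, G), mul(r * w % n, Qpub))  # two scalar multiplications
    return X is not None and X[0] % n == r

d = 7                        # private key
Qpub = mul(d, G)             # public key
sig = sign(z=5, d=d, k=10)   # z stands in for the message hash
print(verify(5, sig, Qpub), verify(6, sig, Qpub))  # True False (tampered hash fails)
```

Modified ECDSA variants such as the one proposed speed up the verification side precisely because the u1·G + u2·Q computation dominates its cost.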

  16. Observational Signatures of Transverse Magnetohydrodynamic Waves and Associated Dynamic Instabilities in Coronal Flux Tubes

    Energy Technology Data Exchange (ETDEWEB)

    Antolin, P.; Moortel, I. De [School of Mathematics and Statistics, University of St. Andrews, St. Andrews, Fife KY16 9SS (United Kingdom); Doorsselaere, T. Van [Centre for mathematical Plasma Astrophysics, Mathematics Department, KU Leuven, Celestijnenlaan 200B bus 2400, B-3001 Leuven (Belgium); Yokoyama, T., E-mail: patrick.antolin@st-andrews.ac.uk [Department of Earth and Planetary Science, The University of Tokyo, Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)

    2017-02-20

Magnetohydrodynamic (MHD) waves permeate the solar atmosphere and constitute potential coronal heating agents. Yet, the waves detected so far may be but a small subset of the true existing wave power. Detection is limited by instrumental constraints but also by wave processes that localize the wave power in undetectable spatial scales. In this study, we conduct 3D MHD simulations and forward modeling of standing transverse MHD waves in coronal loops with uniform and non-uniform temperature variation in the perpendicular cross-section. The observed signatures are largely dominated by the combination of the Kelvin–Helmholtz instability (KHI), resonant absorption, and phase mixing. In the presence of a cross-loop temperature gradient, we find that emission lines sensitive to the loop core catch different signatures compared to those that are more sensitive to the loop boundary and the surrounding corona, leading to an out-of-phase intensity and Doppler velocity modulation produced by KHI mixing. In all of the considered models, common signatures include an intensity and loop width modulation at half the kink period, a fine strand-like structure, a characteristic arrow-shaped structure in the Doppler maps, and overall line broadening in time but particularly at the loop edges. For our model, most of these features can be captured with a spatial resolution of 0.″33 and a spectral resolution of 25 km s⁻¹, although we do obtain severe over-estimation of the line width. Resonant absorption leads to a significant decrease of the observed kinetic energy from Doppler motions over time, which is not recovered by a corresponding increase in the line width from phase mixing and KHI motions. We estimate this hidden wave energy to be a factor of 5–10 of the observed value.

  17. Signatures of quantum phase transitions in the dynamic response of fluxonium qubit chains

    Science.gov (United States)

    Meier, Hendrik; Brierley, R. T.; Kou, Angela; Girvin, S. M.; Glazman, Leonid I.

    2015-08-01

    We evaluate the microwave admittance of a one-dimensional chain of fluxonium qubits coupled by shared inductors. Despite its simplicity, this system exhibits a rich phase diagram. A critical applied magnetic flux separates a homogeneous ground state from a phase with a ground state exhibiting inhomogeneous persistent currents. Depending on the parameters of the array, the phase transition may be a conventional continuous one, or of a commensurate-incommensurate nature. Furthermore, quantum fluctuations affect the transition and possibly lead to the presence of gapless "floating phases." The signatures of the soft modes accompanying the transitions appear as a characteristic frequency dependence of the dissipative part of admittance.

  18. Computer-aided classification of lesions by means of their kinetic signatures in dynamic contrast-enhanced MR images

    Science.gov (United States)

    Twellmann, Thorsten; ter Haar Romeny, Bart

    2008-03-01

The kinetic characteristics of tissue in dynamic contrast-enhanced magnetic resonance imaging data are an important source of information for the differentiation of benign and malignant lesions. Kinetic curves measured for each lesion voxel allow to infer information about the state of the local tissue. As a whole, they reflect the heterogeneity of the vascular structure within a lesion, an important criterion for the preoperative classification of lesions. Current clinical practice in analysis of tissue kinetics however is mainly based on the evaluation of the "most-suspect curve", which is only related to a small, manually or semi-automatically selected region-of-interest within a lesion and does not reflect any information about tissue heterogeneity. We propose a new method which exploits the full range of kinetic information for the automatic classification of lesions. Instead of breaking down the large amount of kinetic information to a single curve, each lesion is considered as a probability distribution in a space of kinetic features, efficiently represented by its kinetic signature obtained by adaptive vector quantization of the corresponding kinetic curves. Dissimilarity of two signatures can be objectively measured using the Mallows distance, which is a metric defined on probability distributions. The embedding of this metric in a suitable kernel function enables us to employ modern kernel-based machine learning techniques for the classification of signatures. In a study considering 81 breast lesions, the proposed method yielded an Az value of 0.89 ± 0.01 for the discrimination of benign and malignant lesions in a nested leave-one-lesion-out evaluation setting.
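For equal-size empirical distributions with uniform weights, the Mallows (Wasserstein-2) distance reduces to the root-mean-square difference of the sorted samples, because optimal transport pairs them in order. A 1D sketch (the paper works in a multidimensional kinetic-feature space, and the sample values below are invented):

```python
import math

def mallows_1d(xs, ys):
    """Mallows/Wasserstein-2 distance between two equal-size 1D empirical
    distributions: optimal transport matches the sorted samples."""
    assert len(xs) == len(ys)
    return math.sqrt(sum((x - y) ** 2
                         for x, y in zip(sorted(xs), sorted(ys))) / len(xs))

# Hypothetical washout-slope samples drawn from two lesions' kinetic signatures.
lesion_a = [0.10, 0.12, 0.11, 0.13]
lesion_b = [0.30, 0.35, 0.28, 0.33]
print(round(mallows_1d(lesion_a, lesion_b), 3))  # 0.201
print(mallows_1d(lesion_a, lesion_a))            # 0.0 for identical signatures
```

Plugging such a distance into a kernel, e.g. exp(-d²/2σ²), yields the kernel-based classifier setup the abstract describes.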

  19. Dynamic response signatures of a scaled model platform for floating wind turbines in an ocean wave basin

    Science.gov (United States)

    Jaksic, V.; O'Shea, R.; Cahill, P.; Murphy, J.; Mandic, D. P.; Pakrashi, V.

    2015-01-01

    Understanding of dynamic behaviour of offshore wind floating substructures is extremely important in relation to design, operation, maintenance and management of floating wind farms. This paper presents assessment of nonlinear signatures of dynamic responses of a scaled tension-leg platform (TLP) in a wave tank exposed to different regular wave conditions and sea states characterized by the Bretschneider, the Pierson–Moskowitz and the JONSWAP spectra. Dynamic responses of the TLP were monitored at different locations using load cells, a camera-based motion recognition system and a laser Doppler vibrometer. The analysis of variability of the TLP responses and statistical quantification of their linearity or nonlinearity, as non-destructive means of structural monitoring from the output-only condition, remains a challenging problem. In this study, the delay vector variance (DVV) method is used to statistically study the degree of nonlinearity of measured response signals from a TLP. DVV is observed to create a marker estimating the degree to which a change in signal nonlinearity reflects real-time behaviour of the structure and also to establish the sensitivity of the instruments employed to these changes. The findings can be helpful in establishing monitoring strategies and control strategies for undesirable levels or types of dynamic response and can help to better estimate changes in system characteristics over the life cycle of the structure. PMID:25583866
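The delay vector variance (DVV) method begins with time-delay embedding of the measured response signal; the sketch below shows only that first step (building the delay-vector matrix), with illustrative embedding parameters:

```python
import numpy as np

def delay_vectors(x, m=3, tau=1):
    """Time-delay embedding: each row collects m samples of x spaced tau apart.
    DVV then compares the variance of targets within neighbourhoods of these
    vectors against the signal variance to assess nonlinearity."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    return np.stack([x[i * tau: i * tau + n] for i in range(m)], axis=1)

signal = np.sin(np.linspace(0, 6 * np.pi, 100))
dv = delay_vectors(signal, m=3, tau=2)
print(dv.shape)  # (96, 3): 96 three-dimensional delay vectors
```

The full DVV statistic, and the surrogate-data comparison used to grade nonlinearity, build on this matrix but are beyond this sketch.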

  20. Dynamic response signatures of a scaled model platform for floating wind turbines in an ocean wave basin.

    Science.gov (United States)

    Jaksic, V; O'Shea, R; Cahill, P; Murphy, J; Mandic, D P; Pakrashi, V

    2015-02-28

    Understanding of dynamic behaviour of offshore wind floating substructures is extremely important in relation to design, operation, maintenance and management of floating wind farms. This paper presents assessment of nonlinear signatures of dynamic responses of a scaled tension-leg platform (TLP) in a wave tank exposed to different regular wave conditions and sea states characterized by the Bretschneider, the Pierson-Moskowitz and the JONSWAP spectra. Dynamic responses of the TLP were monitored at different locations using load cells, a camera-based motion recognition system and a laser Doppler vibrometer. The analysis of variability of the TLP responses and statistical quantification of their linearity or nonlinearity, as non-destructive means of structural monitoring from the output-only condition, remains a challenging problem. In this study, the delay vector variance (DVV) method is used to statistically study the degree of nonlinearity of measured response signals from a TLP. DVV is observed to create a marker estimating the degree to which a change in signal nonlinearity reflects real-time behaviour of the structure and also to establish the sensitivity of the instruments employed to these changes. The findings can be helpful in establishing monitoring strategies and control strategies for undesirable levels or types of dynamic response and can help to better estimate changes in system characteristics over the life cycle of the structure. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  1. Verification and Validation of the New Dynamic Mooring Modules Available in FAST v8

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Andersen, Morten Thøtt; Robertson, Amy N.

    2016-01-01

    The open-source, aero-hydro-servo-elastic wind turbine simulation software FAST v8 (created by the National Renewable Energy Laboratory) was recently coupled to two newly developed mooring dynamics modules: MoorDyn and FEAMooring. MoorDyn is a lumped-mass-based mooring dynamics module developed b...

  2. Geometric Verification of Dynamic Wave Arc Delivery With the Vero System Using Orthogonal X-ray Fluoroscopic Imaging

    International Nuclear Information System (INIS)

    Burghelea, Manuela; Verellen, Dirk; Poels, Kenneth; Gevaert, Thierry; Depuydt, Tom; Tournel, Koen; Hung, Cecilia; Simon, Viorica; Hiraoka, Masahiro; Ridder, Mark de

    2015-01-01

    Purpose: The purpose of this study was to define an independent verification method based on on-board orthogonal fluoroscopy to determine the geometric accuracy of synchronized gantry–ring (G/R) rotations during dynamic wave arc (DWA) delivery available on the Vero system. Methods and Materials: A verification method for DWA was developed to calculate O-ring-gantry (G/R) positional information from ball-bearing positions retrieved from fluoroscopic images of a cubic phantom acquired during DWA delivery. Different noncoplanar trajectories were generated in order to investigate the influence of path complexity on delivery accuracy. The G/R positions detected from the fluoroscopy images (DetPositions) were benchmarked against the G/R angulations retrieved from the control points (CP) of the DWA RT plan and the DWA log files recorded by the treatment console during DWA delivery (LogActed). The G/R rotational accuracy was quantified as the mean absolute deviation ± standard deviation. The maximum G/R absolute deviation was calculated as the maximum 3-dimensional distance between the CP and the closest DetPositions. Results: In the CP versus DetPositions comparison, an overall mean G/R deviation of 0.13°/0.16° ± 0.16°/0.16° was obtained, with a maximum G/R deviation of 0.6°/0.2°. For the LogActed versus DetPositions evaluation, the overall mean deviation was 0.08°/0.15° ± 0.10°/0.10° with a maximum G/R of 0.3°/0.4°. The largest decoupled deviations registered for gantry and ring were 0.6° and 0.4° respectively. No directional dependence was observed between clockwise and counterclockwise rotations. Doubling the dose resulted in a double number of detected points around each CP, and an angular deviation reduction in all cases. Conclusions: An independent geometric quality assurance approach was developed for DWA delivery verification and was successfully applied on diverse trajectories. Results showed that the Vero system is capable of following complex
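    The accuracy metrics reported above (per-axis mean absolute deviation ± SD, and the maximum deviation to the closest detected position) can be sketched as follows; this is an illustrative reduction to the two rotation axes with hypothetical angle pairs, not the authors' analysis code.

```python
import math
import statistics

def rotation_accuracy(planned, detected):
    """Per-axis mean absolute deviation +/- SD between planned control points
    and the closest detected (gantry, ring) positions, plus the maximum
    combined angular deviation (simplified to the two rotation axes)."""
    devs_g, devs_r, max_dev = [], [], 0.0
    for pg, pr in planned:
        # nearest detected position in the combined gantry/ring angle space
        dg, dr = min(((abs(pg - g), abs(pr - r)) for g, r in detected),
                     key=lambda d: math.hypot(*d))
        devs_g.append(dg)
        devs_r.append(dr)
        max_dev = max(max_dev, math.hypot(dg, dr))
    return ((statistics.mean(devs_g), statistics.pstdev(devs_g)),
            (statistics.mean(devs_r), statistics.pstdev(devs_r)),
            max_dev)
```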

  3. Potential energy landscape signatures of slow dynamics in glass forming liquids

    DEFF Research Database (Denmark)

    Sastry, S.; Debenedetti, P. G.; Stillinger, F. H.

    1999-01-01

    We study the properties of local potential energy minima (‘inherent structures’) sampled by liquids at low temperatures as an approach to elucidating the mechanisms of the dynamical slowing down observed as the glass transition temperature is approached. This onset of slow dynamics...... inherent structure basins from that due to inter-basin transitions becomes valid at temperatures T...


  4. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use

  5. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    Science.gov (United States)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model for the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed loop systems. For these systems the dynamic stability and function over long duration are essential. However, the only evaluation and rating methods for ECLSS up to now are either expensive trial and error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand water has a pivotal role in the human organism. Water accounts for about 60% of the total body mass and is educt and product of numerous metabolic reactions. It is a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a system engineering approach the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those
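    The compartment bookkeeping described above can be illustrated with a toy three-compartment exchange model; all volumes and rate constants below are assumed round numbers for illustration, not V-HAB parameters.

```python
def simulate_water_balance(plasma0=3.5, hours=24.0, dt=0.01):
    """Toy three-compartment body-water model (litres): plasma, interstitial
    and intracellular fluid exchange water down gradients of volume relative
    to nominal compartment size, with balanced intake and loss acting on
    plasma. All rate constants are illustrative, not physiological fits."""
    plasma, interstitial, intracellular = plasma0, 11.0, 28.0
    vp, vi, vc = 3.0, 11.0, 28.0      # nominal compartment sizes (rough adult values)
    k_pi, k_ic = 0.5, 0.2             # exchange rate constants (1/h), assumed
    intake = loss = 0.104             # L/h in and out via plasma (balanced)
    for _ in range(int(hours / dt)):
        f_pi = k_pi * (plasma / vp - interstitial / vi)         # plasma -> interstitial
        f_ic = k_ic * (interstitial / vi - intracellular / vc)  # interstitial -> intracellular
        plasma += (intake - loss - f_pi) * dt
        interstitial += (f_pi - f_ic) * dt
        intracellular += f_ic * dt
    return plasma, interstitial, intracellular
```

    Starting with 0.5 L of excess plasma volume, the excess redistributes toward the other compartments while total body water is conserved.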

  6. STUDIES OF ACOUSTIC EMISSION SIGNATURES FOR QUALITY ASSURANCE OF SS 316L WELDED SAMPLES UNDER DYNAMIC LOAD CONDITIONS

    Directory of Open Access Journals (Sweden)

    S. V. RANGANAYAKULU

    2016-10-01

    Full Text Available Acoustic Emission (AE) signatures of various weld defects in stainless steel 316L nuclear-grade weld material are investigated. The samples, fabricated by the Tungsten Inert Gas (TIG) welding method, have final dimensions of 140 mm x 15 mm x 10 mm. AE signals from weld defects such as pinhole, porosity, lack of penetration, lack of side fusion and slag are recorded under dynamic load conditions using a specially designed mechanical jig. AE features of the weld defects were obtained using the Linear Location Technique (LLT). The study concludes that stress release and structural deformation between the sections in the welding area under load conditions account for the major part of Acoustic Emission activity during loading.

  7. Verification and Validation of the New Dynamic Mooring Modules Available in FAST v8: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian; Robertson, Amy; Jonkman, Jason; Andersen, Morten T.

    2016-08-01

    The open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, was recently coupled to two newly developed mooring dynamics modules: MoorDyn and FEAMooring. MoorDyn is a lumped-mass-based mooring dynamics module developed by the University of Maine, and FEAMooring is a finite-element-based mooring dynamics module developed by Texas A&M University. This paper summarizes the work performed to verify and validate these modules against other mooring models and measured test data to assess their reliability and accuracy. The quality of the fairlead load predictions by the open-source mooring modules MoorDyn and FEAMooring appear to be largely equivalent to what is predicted by the commercial tool OrcaFlex. Both mooring dynamic model predictions agree well with the experimental data, considering the given limitations in the accuracy of the platform hydrodynamic load calculation and the quality of the measurement data.

  8. DISCRETE DYNAMIC MODEL OF BEVEL GEAR – VERIFICATION THE PROGRAM SOURCE CODE FOR NUMERICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    Krzysztof TWARDOCH

    2014-06-01

    Full Text Available The article presents a new physical and mathematical model of a bevel gear for studying the influence of design parameters and operating factors on the dynamic state of the gear transmission. It discusses the process of verifying proper operation of the authors' own calculation program used to obtain solutions of the dynamic model of the bevel gear, and presents the block diagram of the computing algorithm used to create the program for numerical simulation. The program source code is written in MATLAB, an interactive environment for scientific and engineering calculations.

  9. Signatures of chaos and non-integrability in two-dimensional gravity with dynamical boundary

    Directory of Open Access Journals (Sweden)

    Fitkevich Maxim

    2016-01-01

    Full Text Available We propose a model of two-dimensional dilaton gravity with a boundary. In the bulk our model coincides with the classically integrable CGHS model; the dynamical boundary cuts off the CGHS strong-coupling region. As a result, classical dynamics in our model resembles that in spherically symmetric gravity: wave packets of matter fields either reflect from the boundary or form black holes. We find a large integrable sector of multisoliton solutions in this model. At the same time, we argue that the model is globally non-integrable because solutions at the verge of black hole formation display chaotic properties.

  10. Experimental Verification of Dynamic Operation of Continuous and Multivessel Batch Distillation

    Energy Technology Data Exchange (ETDEWEB)

    Wittgens, Bernd

    1999-07-01

    This thesis presents a rigorous model, based on first principles, for dynamic simulation of the composition dynamics of staged high-purity continuous distillation columns, together with experiments performed to verify it. The thesis also demonstrates the importance of tray hydraulics for obtaining good agreement between simulation and experiment, and derives analytic expressions for dynamic time constants for use in simplified models of liquid and vapour dynamics. A newly developed multivessel batch distillation column consisting of a reboiler, intermediate vessels and a condenser vessel provides a generalization of previously proposed batch distillation schemes. The total reflux operation of this column was presented previously, and the present thesis proposes a simple feedback control strategy for its operation based on temperature measurements. The feasibility of this strategy is demonstrated by simulations and verified by laboratory experiments. It is concluded that the multivessel column can easily be operated with simple temperature controllers, where the holdups are only controlled indirectly. For a given set of temperature setpoints, the final product compositions are independent of the initial feed composition. When the multivessel batch distillation column is compared to a conventional batch column, both operated under feedback control, it is found that the energy required to separate a multicomponent mixture into highly pure products is much less for the multivessel system. This system is also the simplest one to operate.
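    The proposed temperature-feedback idea (each vessel's holdup controlled only indirectly, via a tray temperature) can be caricatured with a proportional controller closed around a toy first-order tray response; the gains and the plant model below are assumptions for illustration, not the thesis's column model.

```python
def control_step(T_meas, T_set, L_nominal, Kc=0.5, L_min=0.0, L_max=2.0):
    """P-controller for one vessel: a tray temperature above its setpoint
    (too much heavy component rising) increases the liquid flow drawn from
    the vessel above, clipped to physical limits."""
    L = L_nominal + Kc * (T_meas - T_set)
    return min(max(L, L_min), L_max)

def run_loop(T0=80.0, T_set=78.0, steps=200, dt=0.1):
    """Close the loop around an assumed first-order tray-temperature response:
    extra reflux of light liquid cools the tray, with mild self-regulation."""
    T, L = T0, 1.0
    for _ in range(steps):
        L = control_step(T, T_set, 1.0)
        T += (-2.0 * (L - 1.0) - 0.1 * (T - T_set)) * dt
    return T, L
```

    The loop drives the tray temperature to its setpoint and the flow back to its nominal value, which is the qualitative behaviour the thesis's indirect-holdup control relies on.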

  11. Host-pathogen evolutionary signatures reveal dynamics and future invasions of vampire bat rabies

    Czech Academy of Sciences Publication Activity Database

    Streicker, D. G.; Winternitz, Jamie Caroline; Satterfield, D. A.; Condori-Condori, R. E.; Broos, A.; Tello, C.; Recuenco, S.; Velasco-Villa, A.; Altizer, S.; Valderrama, W.

    2016-01-01

    Roč. 113, č. 39 (2016), s. 10926-10931 ISSN 0027-8424 Institutional support: RVO:68081766 Keywords : Desmodus * zoonotic disease * forecasting * sex bias * spatial dynamics Subject RIV: GJ - Animal Vermins ; Diseases, Veterinary Medicine Impact factor: 9.661, year: 2016

  12. Spectral signatures of the tropical Pacific dynamics from model and altimetry

    Science.gov (United States)

    Lionel, Tchilibou Michel; Gourdeau, Lionel; Morrow, Rosemary; Djath, Bugshin; Jouanno, Julien; Marin, Frederic

    2017-04-01

    The tropics are distinguishable from mid latitudes by their small Coriolis parameter vanishing at the equator, large Rossby radius, and strong anisotropic circulation. These peculiarities are at the origin of dynamics that strongly respond to the wind forcing through zonally propagating tropical waves, and of a large range of wavenumbers covering meso and submesoscale interactions. The main tropical meso and submesoscales features are associated with Tropical Instability Waves (Marchesiello et al., 2011), but coherent vorticity structures span the tropical band as described by Ubelmann and Fu (2011). This study aims to infer the dynamics of the tropical Pacific through spectral EKE and SSH analyses by looking at their latitudinal dependence. Also, a question of interest is the observability of such dynamics using along track altimetric wavenumber spectra since the tracks are mainly oriented meridionally in the tropics. This study is based on the 1.12° resolution DRAKKAR global model. Frequency-zonal wavenumber EKE spectra, and their corresponding 1D frequency and zonal wavenumber are analyzed in different latitudinal bands in the tropics illustrating the contrast between the dynamics in the equatorial belt and in the off -equatorial belt. Zonal and meridional wavenumber EKE spectra, and 2D (horizontal wavenumber) spectra of zonal and meridional velocities are used to illustrate the degree of anisotropy in the tropics depending on latitude. These EKE spectra and the relationship between EKE and SSH spectra helps us to discuss the validity of QG turbulence theories in the tropics. These model results combined with those from a 1/36° resolution regional model with explicit tides point out the actual limitation of along track altimetric SSH to infer small scale dynamics in the tropics due the high energy level of high frequency ageostrophic motions.
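    The along-track wavenumber spectra discussed here reduce, in the simplest case, to a power spectrum of a demeaned SSH segment sampled at fixed along-track spacing. A naive DFT sketch (pure Python, illustrative only, not the study's spectral pipeline):

```python
import cmath
import math

def wavenumber_spectrum(ssh, dx_km):
    """Naive DFT power spectrum of a demeaned along-track SSH segment.
    Returns (wavenumber in cycles/km, power) for positive wavenumbers."""
    n = len(ssh)
    mean = sum(ssh) / n
    x = [v - mean for v in ssh]
    spec = []
    for k in range(1, n // 2):
        c = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        spec.append((k / (n * dx_km), abs(c) ** 2 / n))
    return spec
```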

  13. Impact of Leaf Traits on Temporal Dynamics of Transpired Oxygen Isotope Signatures and Its Impact on Atmospheric Vapor

    Science.gov (United States)

    Dubbert, Maren; Kübert, Angelika; Werner, Christiane

    2017-01-01

    Oxygen isotope signatures of transpiration (δE) are powerful tracers of water movement from plant to global scale. However, a mechanistic understanding of how leaf morphological/physiological traits affect δE is missing. A laser spectrometer was coupled to a leaf-level gas-exchange system to measure fluxes and isotopic signatures of plant transpiration under controlled conditions in seven distinct species (Fagus sylvatica, Pinus sylvestris, Acacia longifolia, Quercus suber, Coffea arabica, Plantago lanceolata, Oxalis triangularis). We analyzed the role of stomatal conductance (gs) and leaf water content (W) on the temporal dynamics of δE following changes in relative humidity (rH). Changes in rH were applied from 60 to 30% and from 30 to 60%, which probably more than covers the maximum step changes occurring under natural conditions. Further, the impact of gs and W on isotopic non-steady state isofluxes was analyzed. Following changes in rH, the temporal development of δE was well described by a one-pool modeling approach for most species. Isofluxes of δE were dominantly driven by stomatal control on E, particularly for the initial period of 30 min following a step change. Hence, the deviation of isofluxes from isotopic steady state can be large, even though plants transpire near to isotopic steady state. Notably, not only transpiration rate and stomatal conductance, but also the leaf traits stomatal density (as a measure of gmax) and leaf water content are significantly related to the time constant (τ) and non-steady-state isofluxes. This might provide an easily accessible basis for a priori assumptions about the impact of isotopic non-steady-state transpiration in various ecosystems. We discuss the implications of our results from leaf to ecosystem scale. PMID:28149303
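    The one-pool model invoked above treats δE as relaxing exponentially toward the new steady state after a step in rH, with a time constant set by leaf water content and stomatal conductance. The sketch below is a generic form of such a model; the exact time-constant expression is an assumed simplification for illustration, not the parameterisation fitted in the study.

```python
import math

def delta_e(t, delta_new, delta_old, tau):
    """One-pool model: the transpired isotope signature relaxes exponentially
    from the old steady state to the new one after a step change in rH."""
    return delta_new + (delta_old - delta_new) * math.exp(-t / tau)

def time_constant(W, gs, wi):
    """Assumed one-pool time constant: leaf water content W (mol m-2) divided
    by stomatal conductance gs (mol m-2 s-1) times the internal vapour mole
    fraction wi (mol/mol); higher gs or lower W means faster adjustment."""
    return W / (gs * wi)
```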

  14. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered as the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, for stamps little work has been done to automatically verify their authenticity. In this paper, an approach for verification of color stamps ...

  15. A Rational Threshold Signature Model and Protocol Based on Different Permissions

    Directory of Open Access Journals (Sweden)

    Bojun Wang

    2014-01-01

    Full Text Available This paper develops a novel model and protocol for specific scenarios in which participants from multiple groups with different permissions must produce a signature together. We apply a secret sharing scheme based on difference equations to the private key distribution phase and the secret reconstruction phase of our threshold signature scheme. Our scheme achieves successful signing through the punishment strategy of repeated rational secret sharing. In addition, the bit commitment and verification method used to detect players’ cheating behavior helps prevent internal fraud. Using bit commitments, verifiable parameters, and time sequences, this paper constructs a dynamic game model with the features of threshold signature management with different permissions, cheat proofing, and forward security.

  16. RVB signatures in the spin dynamics of the square-lattice Heisenberg antiferromagnet

    Science.gov (United States)

    Ghioldi, E. A.; Gonzalez, M. G.; Manuel, L. O.; Trumper, A. E.

    2016-03-01

    We investigate the spin dynamics of the square-lattice spin-1/2 Heisenberg antiferromagnet by means of an improved mean-field Schwinger boson calculation. By identifying both the long-range Néel and the RVB-like components of the ground state, we propose an educated guess for the mean-field magnetic excitation, consisting of a linear combination of local and bond spin flips, to compute the dynamical structure factor. Our main result is that when this magnetic excitation is optimized in such a way that the corresponding sum rule is fulfilled, we recover the low- and high-energy spectral weight features of the experimental spectrum. In particular, the anomalous spectral weight depletion at (π,0) found in recent inelastic neutron scattering experiments can be attributed to the interference of the triplet bond excitations of the RVB component of the ground state. We conclude that the Schwinger boson theory seems to be a good candidate to adequately interpret the dynamic properties of the square-lattice Heisenberg antiferromagnet.

  17. Phytoestrogens and Mycoestrogens Induce Signature Structure Dynamics Changes on Estrogen Receptor α

    Directory of Open Access Journals (Sweden)

    Xueyan Chen

    2016-08-01

    Full Text Available Endocrine disrupters include a broad spectrum of chemicals such as industrial chemicals, natural estrogens and androgens, and synthetic estrogens and androgens. Phytoestrogens are widely present in diet and food supplements; mycoestrogens are frequently found in grains. As human beings and animals are commonly exposed to phytoestrogens and mycoestrogens through diet and the environment, it is important to understand the potential beneficial or hazardous effects of estrogenic compounds. Many bioassays have been established to study the binding of estrogenic compounds with the estrogen receptor (ER) and have provided rich data in the literature. However, few assays can offer structural information about the ligand/ER complex. Our current study surveys the global structure dynamics changes of the ERα ligand binding domain (LBD) when phytoestrogens and mycoestrogens bind. The assay is based on structure dynamics information probed by hydrogen deuterium exchange mass spectrometry and offers a unique viewpoint for elucidating how phytoestrogens and mycoestrogens interact with the estrogen receptor. Cluster analysis based on the hydrogen deuterium exchange (HDX) assay data reveals a unique pattern when phytoestrogens and mycoestrogens bind with the ERα LBD compared to that of estradiol and synthetic estrogen modulators. Our study highlights that structure dynamics could play an important role in the structure-function relationship when endocrine disrupters interact with estrogen receptors.

  18. Thermal dynamic behavior during selective laser melting of K418 superalloy: numerical simulation and experimental verification

    Science.gov (United States)

    Chen, Zhen; Xiang, Yu; Wei, Zhengying; Wei, Pei; Lu, Bingheng; Zhang, Lijuan; Du, Jun

    2018-04-01

    During selective laser melting (SLM) of K418 powder, the influence of the process parameters, such as laser power P and scanning speed v, on the dynamic thermal behavior and morphology of the melted tracks was investigated numerically. A 3D finite difference method was established to predict the dynamic thermal behavior and flow mechanism of K418 powder irradiated by a Gaussian laser beam. A three-dimensional randomly packed powder bed composed of spherical particles was established by discrete element method. The powder particle information including particle size distribution and packing density were taken into account. The volume shrinkage and temperature-dependent thermophysical parameters such as thermal conductivity, specific heat, and other physical properties were also considered. The volume of fluid method was applied to reconstruct the free surface of the molten pool during SLM. The geometrical features, continuity boundaries, and irregularities of the molten pool were proved to be largely determined by the laser energy density. The numerical results are in good agreement with the experiments, which prove to be reasonable and effective. The results provide us some in-depth insight into the complex physical behavior during SLM and guide the optimization of process parameters.
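    The thermal core of such a simulation is a finite-difference update of the heat equation under a laser-imposed surface flux. The 1-D explicit sketch below is illustrative only: it omits the melting, free-surface and fluid-flow physics of the paper's 3-D model, and all material numbers are assumed, not K418 properties.

```python
def laser_heat_1d(n=60, steps=2000, dx=1e-5, dt=1e-6, alpha=5e-6,
                  T0=300.0, flux=2e8, k_cond=20.0):
    """Explicit finite-difference sketch of 1-D conduction into a substrate
    under a constant surface heat flux standing in for the laser beam."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability limit violated"
    T = [T0] * n
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):                  # interior nodes
            Tn[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
        Tn[0] = Tn[1] + flux * dx / k_cond         # imposed surface flux
        Tn[-1] = T0                                # far boundary held at ambient
        T = Tn
    return T
```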

  19. On the dynamics of a plasma vortex street and its topological signatures

    International Nuclear Information System (INIS)

    Siregar, E.; Stribling, W.T.; Goldstein, M.L.

    1994-01-01

    A plasma vortex street configuration can evolve when two velocity and one magnetic shear layer interact strongly. A study of the interaction between two- and three-dimensional plasma modes and a mean sheared magnetic field is undertaken using a three-dimensional magnetohydrodynamic spectral Galerkin computation. The initial state is a simple magnetic shear in a plane perpendicular to the plasma velocity shear plane. In a very weak magnetic field, secondary instabilities (three-dimensional modes), expressed by the kinking of vortex tubes, lead to plasma flow along and around the axes of the vortex cores, creating characteristic patterns of kinetic helicity and linkages between vortex filaments. Three-dimensionality leads to the vortex breakdown process. A strong sheared magnetic field inhibits the kinking of vortex tubes, maintaining two-dimensionality. This inhibits vortex breakdown over long dynamical times. There is an anticorrelation in time between linkage indices of the vortex filaments (related to kinetic helicity), suggesting that the ellipticity axes of the vortex cores along the street undergo a global in-phase evolution. This anticorrelation has a dynamical interpretation. It extends to a relaxing plasma the notion, familiar from Navier-Stokes flow, that helical regions of opposite helicities interact and screen each other off so that the global helicity remains bounded.

  20. Understanding Biases in Ribosome Profiling Experiments Reveals Signatures of Translation Dynamics in Yeast.

    Directory of Open Access Journals (Sweden)

    Jeffrey A Hussmann

    2015-12-01

    Full Text Available Ribosome profiling produces snapshots of the locations of actively translating ribosomes on messenger RNAs. These snapshots can be used to make inferences about translation dynamics. Recent ribosome profiling studies in yeast, however, have reached contradictory conclusions regarding the average translation rate of each codon. Some experiments have used cycloheximide (CHX to stabilize ribosomes before measuring their positions, and these studies all counterintuitively report a weak negative correlation between the translation rate of a codon and the abundance of its cognate tRNA. In contrast, some experiments performed without CHX report strong positive correlations. To explain this contradiction, we identify unexpected patterns in ribosome density downstream of each type of codon in experiments that use CHX. These patterns are evidence that elongation continues to occur in the presence of CHX but with dramatically altered codon-specific elongation rates. The measured positions of ribosomes in these experiments therefore do not reflect the amounts of time ribosomes spend at each position in vivo. These results suggest that conclusions from experiments in yeast using CHX may need reexamination. In particular, we show that in all such experiments, codons decoded by less abundant tRNAs were in fact being translated more slowly before the addition of CHX disrupted these dynamics.
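    The codon-level correlations at issue here (translation rate of a codon vs. abundance of its cognate tRNA) are rank correlations. A self-contained Spearman sketch, generic rather than the authors' pipeline:

```python
def spearman(xs, ys):
    """Spearman rank correlation (no tie handling; fine for distinct values)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    mean = (len(xs) - 1) / 2.0
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)
    return cov / var
```

    Applied to per-codon dwell times and tRNA abundances, a negative value corresponds to the counterintuitive CHX-experiment result the abstract describes, a positive value to the CHX-free one.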

  1. Signatures of dynamics in charge transport through organic molecules; Dynamisches Verhalten beim Ladungstransport durch organische Moleküle

    Energy Technology Data Exchange (ETDEWEB)

    Secker, Daniel

    2008-06-03

    The aim of the thesis at hand was to investigate dynamical behaviour in charge transport through organic molecules experimentally with the help of the mechanically controlled break junction (MCBJ) technique. The thesis concentrates on the complex interaction between the molecular contact configuration and the electronic structure. It is shown that by variation of the electrode distance, and thus by manipulation of the molecule and contact configuration, the electronic structure as well as the coupling between the molecule and the electrodes is affected. The latter statement is a further indication of how closely I-V characteristics depend on the molecular contact configuration. Depending on the applied voltage, and thus the electric field, there are two different configurations preferred by the molecular contact. A potential barrier between these two states is the origin of the hysteresis. A central part of the thesis deals with measurements of the current noise. Finally, it can be concluded that the detailed discussion reveals the strong effect of dynamical interactions between the atomic configuration of the molecular contact and the electronic structure on charge transport in single-molecule junctions. (orig.)

  2. Dynamical signatures of collective quality grading in a social activity: attendance to motion pictures.

    Science.gov (United States)

    Escobar, Juan V; Sornette, Didier

    2015-01-01

    We investigate the laws governing people's decisions and interactions by studying the collective dynamics of a well-documented social activity for which there exist ample records of the perceived quality: the attendance to movie theaters in the US. We picture the flows of attendance as impulses or "shocks" driven by external factors that in turn can create new cascades of attendances through direct recommendations whose effectiveness depends on the perceived quality of the movies. This corresponds to an epidemic branching model comprised of a decaying exponential function determining the time between cause and action, and a cascade of actions triggered by previous ones. We find that the vast majority of the ~3,500 movies studied fit our model remarkably well. From our results, we are able to translate a subjective concept such as movie quality into a probability of the deriving individual activity, and from it we build concrete quantitative predictions. Our analysis opens up the possibility of understanding other collective dynamics for which the perceived quality or appeal of an action is also known.
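    The epidemic branching picture described here (an exogenous shock followed by recommendation cascades, with an exponential waiting time between cause and action) can be simulated directly; all parameter values below are illustrative assumptions, with the branching ratio playing the role of perceived quality.

```python
import random

def simulate_attendance(n0=1000, quality=0.6, tau=7.0, days=60, seed=1):
    """Toy epidemic branching process: an exogenous shock of n0 attendances on
    day 0; each attendance triggers on average `quality` direct recommendations,
    each delayed by an exponential waiting time with mean tau days.
    quality < 1 keeps the cascade subcritical, so it dies out on its own."""
    rng = random.Random(seed)
    daily = [0] * days
    queue = [(0.0, n0)]                 # (time in days, batch size)
    while queue:
        t, n = queue.pop()
        if t >= days:
            continue                    # prune events beyond the horizon
        daily[int(t)] += n
        offspring = sum(1 for _ in range(n) if rng.random() < quality)
        for _ in range(offspring):
            queue.append((t + rng.expovariate(1.0 / tau), 1))
    return daily
```

    A subcritical cascade with branching ratio q amplifies the initial shock by roughly 1/(1 - q) total attendances, which is how a subjective quality maps onto a measurable activity level.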

  3. Dynamical Signatures of Collective Quality Grading in a Social Activity: Attendance to Motion Pictures

    Science.gov (United States)

    Escobar, Juan V.; Sornette, Didier

    2015-01-01

    We investigate the laws governing people’s decisions and interactions by studying the collective dynamics of a well-documented social activity for which there exist ample records of the perceived quality: the attendance to movie theaters in the US. We picture the flows of attendance as impulses or “shocks” driven by external factors that in turn can create new cascades of attendances through direct recommendations whose effectiveness depends on the perceived quality of the movies. This corresponds to an epidemic branching model comprised of a decaying exponential function determining the time between cause and action, and a cascade of actions triggered by previous ones. We find that the vast majority of the ~3,500 movies studied fit our model remarkably well. From our results, we are able to translate a subjective concept such as movie quality into a probability of the deriving individual activity, and from it we build concrete quantitative predictions. Our analysis opens up the possibility of understanding other collective dynamics for which the perceived quality or appeal of an action is also known. PMID:25612292

  4. Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach

    Directory of Open Access Journals (Sweden)

    Xu Zhi

    2018-01-01

    Full Text Available Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks have been limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed that is based on the symbolic model checking tool NuSMV. A dual PID system is used as an example system, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.

  5. Vibrational signatures of cation-anion hydrogen bonding in ionic liquids: a periodic density functional theory and molecular dynamics study.

    Science.gov (United States)

    Mondal, Anirban; Balasubramanian, Sundaram

    2015-02-05

    Hydrogen bonding in alkylammonium based protic ionic liquids was studied using density functional theory (DFT) and ab initio molecular dynamics (AIMD) simulations. Normal-mode analysis within the harmonic approximation and power spectra of velocity autocorrelation functions were used as tools to obtain the vibrational spectra in both the gas phase and the crystalline phases of these protic ionic liquids. The hydrogen bond vibrational modes were identified in the 150-240 cm⁻¹ region of the far-infrared (far-IR) spectra. A blue shift in the far-IR mode was observed with an increasing number of hydrogen-bonding sites on the cation; the exact peak position is modulated by the cation-anion hydrogen bond strength. Sub-100 cm⁻¹ bands in the far-IR spectrum are assigned to the rattling motion of the anions. Calculated NMR chemical shifts of the acidic protons in the crystalline phase of these salts also exhibit the signature of cation-anion hydrogen bonding.

  6. Dynamic responses to silicon in Thalasiossira pseudonana - Identification, characterisation and classification of signature genes and their corresponding protein motifs.

    Science.gov (United States)

    Brembu, Tore; Chauton, Matilde Skogen; Winge, Per; Bones, Atle M; Vadstein, Olav

    2017-07-07

    The diatom cell wall, or frustule, is a highly complex, three-dimensional structure consisting of nanopatterned silica as well as proteins and other organic components. While some key components have been identified, knowledge on frustule biosynthesis is still fragmented. The model diatom Thalassiosira pseudonana was subjected to silicon (Si) shift-up and shift-down situations. Cellular and molecular signatures, dynamic changes and co-regulated clusters representing the hallmarks of cellular and molecular responses to changing Si availabilities were characterised. Ten new proteins with silaffin-like motifs, two kinases and a novel family of putatively frustule-associated transmembrane proteins induced by Si shift-up with a possible role in frustule biosynthesis were identified. A separate cluster analysis performed on all significantly regulated silaffin-like proteins (SFLPs), as well as silaffin-like motifs, resulted in the classification of silaffins, cingulins and SFLPs into distinct clusters. A majority of the genes in the Si-responsive clusters are highly divergent, but positive selection does not seem to be the driver behind this variability. This study provides a high-resolution map over transcriptional responses to changes in Si availability in T. pseudonana. Hallmark Si-responsive genes are identified, characteristic motifs and domains are classified, and taxonomic and evolutionary implications outlined and discussed.

  7. Interplay of community dynamics, temperature, and productivity on the hydrogen isotope signatures of lipid biomarkers

    Directory of Open Access Journals (Sweden)

    S. N. Ladd

    2017-09-01

    Full Text Available The hydrogen isotopic composition (δ2H of lipid biomarkers has diverse applications in the fields of paleoclimatology, biogeochemistry, and microbial community dynamics. Large changes in hydrogen isotope fractionation have been observed among microbes with differing core metabolisms, while environmental factors including temperature and nutrient availability can affect isotope fractionation by photoautotrophs. Much effort has gone into studying these effects under laboratory conditions with single species cultures. Moving beyond controlled environments and quantifying the natural extent of these changes in freshwater lacustrine settings and identifying their causes is essential for robust application of δ2H values of common short-chain fatty acids as a proxy of net community metabolism and of phytoplankton-specific biomarkers as a paleohydrologic proxy. This work targets the effect of community dynamics, temperature, and productivity on 2H∕1H fractionation in lipid biomarkers through a comparative time series in two central Swiss lakes: eutrophic Lake Greifen and oligotrophic Lake Lucerne. Particulate organic matter was collected from surface waters at six time points throughout the spring and summer of 2015, and δ2H values of short-chain fatty acids, as well as chlorophyll-derived phytol and the diatom biomarker brassicasterol, were measured. We paired these measurements with in situ incubations conducted with NaH13CO3, which were used to calculate the production rates of individual lipids in lake surface water. As algal productivity increased from April to June, net discrimination against 2H in Lake Greifen increased by as much as 148 ‰ for individual fatty acids. During the same time period in Lake Lucerne, net discrimination against 2H increased by as much as 58 ‰ for individual fatty acids. A large portion of this signal is likely due to a greater proportion of heterotrophically derived fatty acids in the winter and early

  8. Interplay of community dynamics, temperature, and productivity on the hydrogen isotope signatures of lipid biomarkers

    Science.gov (United States)

    Nemiah Ladd, S.; Dubois, Nathalie; Schubert, Carsten J.

    2017-09-01

    The hydrogen isotopic composition (δ2H) of lipid biomarkers has diverse applications in the fields of paleoclimatology, biogeochemistry, and microbial community dynamics. Large changes in hydrogen isotope fractionation have been observed among microbes with differing core metabolisms, while environmental factors including temperature and nutrient availability can affect isotope fractionation by photoautotrophs. Much effort has gone into studying these effects under laboratory conditions with single species cultures. Moving beyond controlled environments and quantifying the natural extent of these changes in freshwater lacustrine settings and identifying their causes is essential for robust application of δ2H values of common short-chain fatty acids as a proxy of net community metabolism and of phytoplankton-specific biomarkers as a paleohydrologic proxy. This work targets the effect of community dynamics, temperature, and productivity on 2H/1H fractionation in lipid biomarkers through a comparative time series in two central Swiss lakes: eutrophic Lake Greifen and oligotrophic Lake Lucerne. Particulate organic matter was collected from surface waters at six time points throughout the spring and summer of 2015, and δ2H values of short-chain fatty acids, as well as chlorophyll-derived phytol and the diatom biomarker brassicasterol, were measured. We paired these measurements with in situ incubations conducted with NaH13CO3, which were used to calculate the production rates of individual lipids in lake surface water. As algal productivity increased from April to June, net discrimination against 2H in Lake Greifen increased by as much as 148 ‰ for individual fatty acids. During the same time period in Lake Lucerne, net discrimination against 2H increased by as much as 58 ‰ for individual fatty acids. A large portion of this signal is likely due to a greater proportion of heterotrophically derived fatty acids in the winter and early spring, which are displaced by

  9. Dynamic CT myocardial perfusion imaging: detection of ischemia in a porcine model with FFR verification

    Science.gov (United States)

    Fahmi, Rachid; Eck, Brendan L.; Vembar, Mani; Bezerra, Hiram G.; Wilson, David L.

    2014-03-01

    Dynamic cardiac CT perfusion (CTP) is a high resolution, non-invasive technique for assessing myocardial blood flow (MBF), which in concert with coronary CT angiography enables CT to provide a unique, comprehensive, fast analysis of both coronary anatomy and functional flow. We assessed perfusion in a porcine model with and without coronary occlusion. To induce occlusion, each animal underwent left anterior descending (LAD) stent implantation and angioplasty balloon insertion. The normal flow condition was obtained with the balloon completely deflated. Partial occlusion was induced by balloon inflation against the stent, with FFR used to assess the extent of occlusion. Prospective ECG-triggered partial scan images were acquired at end systole (45% R-R) using a multi-detector CT (MDCT) scanner. Images were reconstructed using FBP and a hybrid iterative reconstruction (iDose4, Philips Healthcare). Processing included: beam hardening (BH) correction, registration of image volumes using 3D cubic B-spline normalized mutual information, and spatio-temporal bilateral filtering to reduce partial scan artifacts and noise variation. Absolute blood flow was calculated with a deconvolution-based approach using singular value decomposition (SVD). The arterial input function was estimated from the left ventricle (LV) cavity. Regions of interest (ROIs) were identified in healthy and ischemic myocardium and compared in normal and occluded conditions. Under-perfusion was detected in the correct LAD territory and flow reduction agreed well with FFR measurements. Flow was reduced, on average, in LAD territories by 54%.
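The deconvolution step described here — recovering the residue function, and hence flow, from the tissue and arterial input curves via truncated SVD — can be sketched on synthetic curves. The exponential AIF and residue model below are illustrative stand-ins, not the study's clinical data:

```python
import numpy as np

def conv_matrix(aif, dt):
    """Lower-triangular convolution matrix built from the arterial input."""
    n = len(aif)
    return dt * np.tril([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                         for i in range(n)])

def svd_deconvolve(aif, tissue, dt, rel_thresh=1e-3):
    """Recover k(t) = F * R(t) from tissue = conv(aif, F*R) by truncated
    SVD; MBF is the peak of the recovered residue curve. Noisy clinical
    data would need a much larger truncation threshold (e.g. 0.1)."""
    U, s, Vt = np.linalg.svd(conv_matrix(aif, dt))
    s_inv = np.where(s > rel_thresh * s[0], 1.0 / s, 0.0)  # regularize
    k = Vt.T @ (s_inv * (U.T @ tissue))
    return k.max()

dt, t = 1.0, np.arange(40.0)
aif = np.exp(-t / 4.0)                   # synthetic arterial input function
true_F, residue = 0.5, np.exp(-t / 6.0)  # exponential residue model
tissue = conv_matrix(aif, dt) @ (true_F * residue)
print(round(svd_deconvolve(aif, tissue, dt), 3))  # → 0.5
```

On noiseless synthetic data the peak of the recovered curve returns the true flow; the truncation threshold is what trades noise amplification against bias on real acquisitions.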

  10. Restricted Three-Body Dynamics and Morphologies of Early Novae Shells and their Spectral Signatures

    Science.gov (United States)

    Lynch, D. K.; Mazuk, S.; Campbell, E.; Venturini, C. C.

    2003-08-01

    The goal of this work is to calculate emission line profiles of classical novae systems for comparison to line profiles we observe in an attempt to deduce geometrical and dynamical properties of the system from the spectra. The material ejected by the thermonuclear runaway on the surface of the white dwarf (WD) is modeled as a large number of massless particles that are launched instantaneously and move ballistically thereafter. Each particle's position is propagated independently in three-dimensional space with a particle's track terminating if it impacts the WD or the secondary. Predicted line profiles, assuming an optically thin shell, are generated by computing a histogram of the number of particles in radial velocity space for a given observing projection. At high ejection velocities, a nearly spherical shell is produced. At ejection speeds near the WD's escape velocity, very complicated and ever changing geometries result and the material remains close to the system's barycenter. We present animations of computer simulations of novae shell development and the associated line profiles. This work supported by The Aerospace Corporation's Independent Research and Development program and by the US Air Force Space and Missile Systems Center through the Mission Oriented Investigation and Experimentation program, under contract F4701-00-C-0009 with the US Air Force.
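The line-profile construction described here — histogramming ballistic particles in radial-velocity space for a given viewing projection — reduces, in the high-ejection-velocity (nearly spherical shell) limit, to the classic flat-topped profile of an optically thin expanding shell. A minimal sketch, with all parameters invented:

```python
import random

def line_profile(n=200_000, v_ej=1000.0, bins=20, seed=2):
    """Optically thin spherical shell: histogram of line-of-sight velocity
    components of isotropically launched, single-speed ballistic particles."""
    rng = random.Random(seed)
    counts = [0] * bins
    for _ in range(n):
        # isotropic directions: cos(theta) is uniform on [-1, 1), so the
        # line-of-sight velocity is uniform on [-v_ej, v_ej)
        v_los = v_ej * (2.0 * rng.random() - 1.0)
        b = min(int((v_los + v_ej) / (2.0 * v_ej) * bins), bins - 1)
        counts[b] += 1
    return counts

profile = line_profile()
# near-equal bin counts: the flat-topped profile of an optically thin shell
print(max(profile) - min(profile))
```

Lower ejection speeds, gravity, and the binary companion break this symmetry, which is what produces the complicated, time-varying line profiles the abstract describes.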

  11. Disparity changes in 370 Ma Devonian fossils: the signature of ecological dynamics?

    Science.gov (United States)

    Girard, Catherine; Renaud, Sabrina

    2012-01-01

    Early periods in Earth's history have seen a progressive increase in complexity of the ecosystems, but also dramatic crises decimating the biosphere. Such patterns are usually considered as large-scale changes among supra-specific groups, including morphological novelties, radiation, and extinctions. Nevertheless, at the same time, each species evolved by way of micro-evolutionary processes, extended over millions of years into the evolution of lineages. How these two evolutionary scales interacted is a challenging issue because this requires bridging a gap between scales of observation and processes. The present study aims at transferring a typical macro-evolutionary approach, namely disparity analysis, to the study of fine-scale evolutionary variations in order to decipher what processes actually drove the dynamics of diversity at a micro-evolutionary level. The Late Frasnian to Late Famennian period was selected because it is punctuated by two major macro-evolutionary crises, as well as a progressive diversification of the marine ecosystem. Disparity was estimated through this period on conodonts, tooth-like fossil remains of small eel-like predators that were part of the nektonic fauna. The study was focused on the emblematic genus of the period, Palmatolepis. Strikingly, both crises affected an already impoverished Palmatolepis disparity, increasing risks of random extinction. The major disparity signal rather emerged as a cycle of increase and decrease in disparity during the inter-crises period. The diversification shortly followed the first crisis and might correspond to an opportunistic occupation of an empty ecological niche. The subsequent oriented shrinking in the morphospace occupation suggests that the ecological space available to Palmatolepis decreased through time, due to a combination of factors: deteriorating climate, expansion of competitors and predators. Disparity changes of Palmatolepis thus reflect changes in the structure of the ecological

  12. Disparity changes in 370 Ma Devonian fossils: the signature of ecological dynamics?

    Directory of Open Access Journals (Sweden)

    Catherine Girard

    Full Text Available Early periods in Earth's history have seen a progressive increase in complexity of the ecosystems, but also dramatic crises decimating the biosphere. Such patterns are usually considered as large-scale changes among supra-specific groups, including morphological novelties, radiation, and extinctions. Nevertheless, at the same time, each species evolved by way of micro-evolutionary processes, extended over millions of years into the evolution of lineages. How these two evolutionary scales interacted is a challenging issue because this requires bridging a gap between scales of observation and processes. The present study aims at transferring a typical macro-evolutionary approach, namely disparity analysis, to the study of fine-scale evolutionary variations in order to decipher what processes actually drove the dynamics of diversity at a micro-evolutionary level. The Late Frasnian to Late Famennian period was selected because it is punctuated by two major macro-evolutionary crises, as well as a progressive diversification of the marine ecosystem. Disparity was estimated through this period on conodonts, tooth-like fossil remains of small eel-like predators that were part of the nektonic fauna. The study was focused on the emblematic genus of the period, Palmatolepis. Strikingly, both crises affected an already impoverished Palmatolepis disparity, increasing risks of random extinction. The major disparity signal rather emerged as a cycle of increase and decrease in disparity during the inter-crises period. The diversification shortly followed the first crisis and might correspond to an opportunistic occupation of an empty ecological niche. The subsequent oriented shrinking in the morphospace occupation suggests that the ecological space available to Palmatolepis decreased through time, due to a combination of factors: deteriorating climate, expansion of competitors and predators. Disparity changes of Palmatolepis thus reflect changes in the structure

  13. Verification of Compressible and Incompressible Computational Fluid Dynamics Codes and Residual-based Mesh Adaptation

    Science.gov (United States)

    Choudhary, Aniruddha

    Code verification is the process of ensuring, to the degree possible, that there are no algorithm deficiencies and coding mistakes (bugs) in a scientific computing simulation. In this work, techniques are presented for performing code verification of boundary conditions commonly used in compressible and incompressible Computational Fluid Dynamics (CFD) codes. Using a compressible CFD code, this study assesses the subsonic inflow (isentropic and fixed-mass), subsonic outflow, supersonic outflow, no-slip wall (adiabatic and isothermal), and inviscid slip-wall boundary conditions. The use of simplified curved surfaces is proposed for easier generation of manufactured solutions during the verification of certain boundary conditions involving many constraints. To perform rigorous code verification, general grids with mixed cell types at the verified boundary are used. A novel approach is introduced to determine manufactured solutions for boundary condition verification when the velocity field is constrained to be divergence-free during the simulation in an incompressible CFD code. Order of accuracy testing using the Method of Manufactured Solutions (MMS) is employed here for code verification of the major components of an open-source, multiphase flow code, MFIX. The presence of two-phase governing equations and a modified SIMPLE-based algorithm requiring divergence-free flows makes the selection of manufactured solutions more involved than for single-phase, compressible flows. Code verification is performed here on 2D and 3D, uniform and stretched meshes for incompressible, steady and unsteady, single-phase and two-phase flows using the two-fluid model of MFIX. In a CFD simulation, truncation error (TE) is the difference between the continuous governing equation and its discrete approximation. Since TE can be shown to be the local source term for the discretization error, TE is proposed as the criterion for determining which regions of the computational mesh should be refined/coarsened. For mesh
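The order-of-accuracy test at the heart of MMS-based code verification compares discretization errors on systematically refined grids: with errors e_coarse and e_fine and refinement ratio r, the observed order is p = ln(e_coarse/e_fine)/ln r, which should approach the scheme's formal order if the code is bug-free. A minimal self-contained illustration on a second-order central difference (not MFIX itself — here the exact solution sin(x) plays the role of the manufactured solution):

```python
import math

def discrete_error(n):
    """RMS error of the 2nd-order central difference approximation to
    d/dx sin(x) on a periodic grid over [0, 2*pi] with n points."""
    h = 2.0 * math.pi / n
    err2 = 0.0
    for i in range(n):
        x = i * h
        approx = (math.sin(x + h) - math.sin(x - h)) / (2.0 * h)
        err2 += (approx - math.cos(x)) ** 2
    return math.sqrt(err2 / n)

# observed order of accuracy from two systematically refined grids (r = 2)
e_coarse, e_fine = discrete_error(32), discrete_error(64)
p_obs = math.log(e_coarse / e_fine) / math.log(2.0)
print(round(p_obs, 2))  # → 2.0 for a bug-free 2nd-order scheme
```

A coding mistake in the stencil or boundary treatment typically shows up as an observed order below the formal order, which is exactly what the dissertation's boundary-condition tests are designed to detect.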

  14. Parton Theory of Magnetic Polarons: Mesonic Resonances and Signatures in Dynamics

    Science.gov (United States)

    Grusdt, F.; Kánasz-Nagy, M.; Bohrdt, A.; Chiu, C. S.; Ji, G.; Greiner, M.; Greif, D.; Demler, E.

    2018-01-01

    When a mobile hole is moving in an antiferromagnet it distorts the surrounding Néel order and forms a magnetic polaron. Such interplay between hole motion and antiferromagnetism is believed to be at the heart of high-temperature superconductivity in cuprates. In this article, we study a single hole described by the t -Jz model with Ising interactions between the spins in two dimensions. This situation can be experimentally realized in quantum gas microscopes with Mott insulators of Rydberg-dressed bosons or fermions, or using polar molecules. We work at strong couplings, where hole hopping is much larger than couplings between the spins. In this regime we find strong theoretical evidence that magnetic polarons can be understood as bound states of two partons, a spinon and a holon carrying spin and charge quantum numbers, respectively. Starting from first principles, we introduce a microscopic parton description which is benchmarked by comparison with results from advanced numerical simulations. Using this parton theory, we predict a series of excited states that are invisible in the spectral function and correspond to rotational excitations of the spinon-holon pair. This is reminiscent of mesonic resonances observed in high-energy physics, which can be understood as rotating quark-antiquark pairs carrying orbital angular momentum. Moreover, we apply the strong-coupling parton theory to study far-from-equilibrium dynamics of magnetic polarons observable in current experiments with ultracold atoms. Our work supports earlier ideas that partons in a confining phase of matter represent a useful paradigm in condensed-matter physics and in the context of high-temperature superconductivity in particular. While direct observations of spinons and holons in real space are impossible in traditional solid-state experiments, quantum gas microscopes provide a new experimental toolbox. We show that, using this platform, direct observations of partons in and out of equilibrium are

  15. TU-CD-304-03: Dosimetric Verification and Preliminary Comparison of Dynamic Wave Arc for SBRT Treatments

    International Nuclear Information System (INIS)

    Burghelea, M; Poels, K; Gevaert, T; Tournel, K; Dhont, J; De Ridder, M; Verellen, D; Hung, C; Eriksson, K; Simon, V

    2015-01-01

    Purpose: To evaluate the potential dosimetric benefits and verify the delivery accuracy of Dynamic Wave Arc, a novel treatment delivery approach for the Vero SBRT system. Methods: Dynamic Wave Arc (DWA) combines simultaneous gantry/ring movement with inverse planning optimization, resulting in an uninterrupted non-coplanar arc delivery technique. Thirteen complex SBRT cases previously treated with 8–10 conformal static beams (CRT) were evaluated in this study. Eight primary centrally-located NSCLC (prescription dose 4×12Gy or 8×7.5Gy) and five oligometastatic cases (2×2 lesions, 10×5Gy) were selected. DWA and coplanar VMAT plans, partially with dual arcs, were generated for each patient using identical objective functions for target volumes and OARs on the same TPS (RayStation, RaySearch Laboratories). Dosimetric differences and delivery times among these three planning schemes were evaluated. The DWA delivery accuracy was assessed using the Delta4 diode array phantom (ScandiDos AB). The gamma analysis was performed with the 3%/3mm dose and distance-to-agreement criteria. Results: The target conformity for CRT, VMAT and DWA was 0.95±0.07, 0.96±0.04 and 0.97±0.04, while the low dose spillage gradients were 5.52±1.36, 5.44±1.11, and 5.09±0.98 respectively. Overall, the bronchus, esophagus and spinal cord maximum doses were similar between VMAT and DWA, but greatly reduced compared with CRT. For the lung cases, the mean dose and V20Gy were lower for the arc techniques compared with CRT, while for the liver cases, the mean dose and the V30Gy presented slightly higher values. The average delivery times of VMAT and DWA were 2.46±1.10 min and 4.25±1.67 min, with VMAT presenting the shorter treatment time in all cases. The DWA dosimetric verification presented an average gamma index passing rate of 95.73±1.54% (range 94.2%–99.8%). Conclusion: Our preliminary data indicated that DWA is deliverable with clinically acceptable accuracy and has the potential to
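The 3%/3 mm gamma analysis used for the Delta4 verification combines a dose-difference criterion with a distance-to-agreement criterion: a point passes if any nearby evaluated dose lies within the combined ellipsoid (gamma ≤ 1). A 1-D sketch of the computation on a synthetic Gaussian dose profile (not the study's measurements):

```python
import math

def gamma_index(ref, ev, dx=1.0, dta=3.0, dd=0.03):
    """1-D gamma analysis. dd: dose-difference criterion as a fraction of
    the maximum reference dose (global normalization); dta: distance-to-
    agreement in mm; dx: grid spacing in mm."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(ev):
            dist = (i - j) * dx                      # spatial offset (mm)
            diff = (de - dr) / (dd * d_max)          # normalized dose diff
            best = min(best, math.sqrt((dist / dta) ** 2 + diff ** 2))
        gammas.append(best)
    return gammas

ref = [math.exp(-((i - 25) / 8.0) ** 2) for i in range(50)]
ev = [1.01 * v for v in ref]            # evaluated dose: 1% global offset
passing = [g <= 1.0 for g in gamma_index(ref, ev)]
print(round(100.0 * sum(passing) / len(passing), 1))  # → 100.0
```

A 1% global dose offset sits well inside the 3% criterion, so every point passes; clinical passing rates like the 95.73% reported above come from localized dose and positional deviations exceeding the combined criterion.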

  16. Revocable identity-based proxy re-signature against signing key exposure

    Science.gov (United States)

    Ma, Tingchun; Wang, Jinli; Wang, Caifen

    2018-01-01

    Identity-based proxy re-signature (IDPRS) is a novel cryptographic primitive that allows a semi-trusted proxy to convert a signature under one identity into another signature under another identity on the same message by using a re-signature key. Due to this transformation function, IDPRS is very useful in constructing privacy-preserving schemes for various information systems. Key revocation functionality is important in practical IDPRS for managing users dynamically; however, the existing IDPRS schemes do not provide revocation mechanisms that allow the removal of misbehaving or compromised users from the system. In this paper, we first introduce a notion called revocable identity-based proxy re-signature (RIDPRS) to achieve the revocation functionality. We provide a formal definition of RIDPRS as well as its security model. Then, we present a concrete RIDPRS scheme that can resist signing key exposure and prove that the proposed scheme is existentially unforgeable against adaptive chosen identity and message attacks in the standard model. To further improve the performance of signature verification in RIDPRS, we introduce a notion called server-aided revocable identity-based proxy re-signature (SA-RIDPRS). Moreover, we extend the proposed RIDPRS scheme to the SA-RIDPRS scheme and prove that this extended scheme is secure against adaptive chosen message and collusion attacks. The analysis results show that our two schemes remain efficient in terms of computational complexity when implementing user revocation procedures. In particular, in the SA-RIDPRS scheme, the verifier needs to perform only a bilinear pairing and four exponentiation operations to verify the validity of the signature. Compared with other IDPRS schemes in the standard model, our SA-RIDPRS scheme greatly reduces the computation overhead of verification. PMID:29579125

  17. Extending the similarity-based XML multicast approach with digital signatures

    DEFF Research Database (Denmark)

    Azzini, Antonia; Marrara, Stefania; Jensen, Meiko

    2009-01-01

    This paper investigates the interplay between similarity-based SOAP message aggregation and digital signature application. An overview on the approaches resulting from the different orders for the tasks of signature application, verification, similarity aggregation and splitting is provided. Depe...

  18. Underground verification of the large deflection performance of fibre reinforced shotcrete subjected to high stresses and convergence and to dynamic loading.

    CSIR Research Space (South Africa)

    Joughin, WC

    2002-04-01

    Full Text Available Committee Final Project Report: Underground verification of the large deflection performance of fibre reinforced shotcrete subjected to high stresses and convergence and to dynamic loading. W.C. Joughin, J.L. Human and P.J. Terbrugge. [The remainder of the extracted record is fragmentary, listing the Drycrete shotcrete mixes tested: slimes tailings; no fibre, 40 mm Dramix steel fibre, or 40 mm polypropylene fibre at 0.5 to 2% by mass; 60 to 70 mm thick layers.]

  19. Nonequilibrium Mixed Quantum-Classical simulations of Hydrogen-bond Structure and Dynamics in Methanol-d Carbon tetrachloride liquid mixtures and its spectroscopic signature

    Science.gov (United States)

    Kwac, Kijeong; Geva, Eitan

    2011-03-01

    Liquid mixtures of methanol-d and carbon tetrachloride provide attractive model systems for investigating hydrogen-bond structure and dynamics. The hydrogen-bonded methanol oligomers in these mixtures give rise to a very broad hydroxyl stretch IR band (~ 150 cm-1). We have employed mixed quantum-classical molecular dynamics simulations to study the nature of hydrogen- bond structure and dynamics in this system and its spectroscopic signature. In our simulations, the hydroxyl stretch mode is treated quantum mechanically. We have found that the absorption spectrum is highly sensitive to the type of force fields used. Obtaining absorption spectra consistent with experiment required the use of corrected polarizabile force fields and a dipole damping scheme. We have established mapping relationships between the electric field along the hydroxyl bond and the hydrogen-stretch frequency and bond length thereby reducing the computational cost dramatically to simulate the complex nonequilibrium dynamics underlying pump-probe spectra.

  20. Signature-based store checking buffer

    Science.gov (United States)

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify that the content is error-free.
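The idea behind this record — hash each instance's stream of stores into a compact signature and compare only the signatures at a barrier — can be sketched as follows. SHA-256 stands in for whatever hardware fingerprint function the actual buffer uses:

```python
import hashlib

class StoreFingerprintBuffer:
    """Accumulate a running signature over (address, value) store pairs;
    at a barrier, two redundant instances are compared in O(1) by
    comparing signatures instead of full store streams."""

    def __init__(self):
        self._h = hashlib.sha256()

    def record_store(self, addr, value):
        # fold address and value into the running fingerprint
        self._h.update(addr.to_bytes(8, "little"))
        self._h.update(value.to_bytes(8, "little"))

    def signature(self):
        return self._h.hexdigest()

a, b = StoreFingerprintBuffer(), StoreFingerprintBuffer()
for buf in (a, b):
    buf.record_store(0x1000, 42)
    buf.record_store(0x1008, 7)
print(a.signature() == b.signature())  # → True: outputs agree at the barrier

# a fault in one instance diverges the store streams and the signatures
a.record_store(0x1010, 9)
b.record_store(0x1010, 8)
print(a.signature() == b.signature())  # → False
```

The compression is what makes the scheme cheap: only one digest per instance crosses the comparison point, at the cost that a mismatch tells you *that* the outputs diverged, not *where*.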

  1. The impact of gyre dynamics on the mid-depth salinity signature of the eastern North Atlantic

    Science.gov (United States)

    Burkholder, K. C.; Lozier, M. S.

    2009-04-01

    The Mediterranean Overflow Water (MOW) is widely recognized for its role in establishing the mid-depth salinity signature of the subtropical North Atlantic. However, recent work has revealed an intermittent impact of MOW on the salinity signature of the eastern subpolar basin. This impact results from a temporally variable penetration of the northward flowing branch of the MOW past Porcupine Bank into the eastern subpolar basin. It has been shown that the salinity signature of the eastern subpolar basin, in particular the Rockall Trough, varies with the state of the North Atlantic Oscillation (NAO): during persistent periods of strong winds (high NAO index), when the subpolar front moves eastward, waters in the subpolar gyre block the northward flowing MOW, preventing its entry into the subpolar gyre. Conversely, during persistent periods of weak winds (low NAO index), the front moves westward, allowing MOW to penetrate north of Porcupine Bank and into the subpolar gyre. Here, we investigate the manner in which the spatial and temporal variability in the northward penetration of the MOW and the position of the eastern limb of the subpolar front affect the mid-depth property fields not only in the subpolar gyre, but in the subtropical gyre as well. Using approximately 55 years of historical hydrographic data and output from the 1/12° FLAME model, we analyze the temporal variability of salinity along the eastern boundary and compare this variability to the position of the subpolar front in both the observational record and the FLAME model. We conclude that when the zonal position of the subpolar front moves relatively far offshore and the MOW is able to penetrate to the north, high salinity anomalies are observed at high latitudes and low salinity anomalies are observed at low latitudes. Conversely, when the frontal position shifts to the east, the MOW (and thus, the high salinity signature) is blocked, resulting in a drop in salinity anomalies at high latitudes

  2. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  3. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability and compatibility. These methods differ both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification, with particular attention to symbolic execution. The review of static analysis discusses the deductive method and model-checking methods, and the pros and cons of each method are emphasized. The article also considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies that can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the numbers of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis.
Finally, a conclusion is drawn describing the most relevant problems of these analysis techniques, methods for their solution and
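
The static/dynamic distinction surveyed above can be illustrated with a toy harness (all names hypothetical, not from the article): a real symbolic-execution engine such as KLEE would collect path constraints and hand them to an SMT solver, whereas the sketch below simply enumerates concrete inputs, dynamic-analysis style, and records a witness input for each covered path.

```python
# Toy path-exploration harness (hypothetical example, not from the article).
# A real symbolic-execution engine would collect path constraints and hand
# them to an SMT solver; this sketch instead enumerates concrete inputs,
# dynamic-analysis style, and records one witness input per covered path.

def program_under_test(x: int) -> str:
    """A small program with three paths, one of which is a bug."""
    if x > 100:
        if x % 2 == 0:
            return "bug"          # the path verification should expose
        return "odd-large"
    return "small"

def explore(inputs):
    """Run the program on many concrete inputs, collecting covered paths."""
    covered = {}
    for x in inputs:
        covered.setdefault(program_under_test(x), x)  # one witness per path
    return covered

paths = explore(range(-50, 200))
print(sorted(paths))      # ['bug', 'odd-large', 'small']
print(paths["bug"])       # 102: a concrete input reaching the buggy path
```

Brute-force enumeration scales poorly, which is exactly why the symbolic and abstract-interpretation methods discussed in the article matter.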

  4. Radiation signatures

    International Nuclear Information System (INIS)

    McGlynn, S.P.; Varma, M.N.

    1992-01-01

    A new concept for modelling radiation risk is proposed. This concept is based on the proposal that the spectrum of molecular lesions, which we dub "the radiation signature", can be used to identify the quality of the causal radiation. If the proposal concerning radiation signatures can be established then, in principle, both prospective and retrospective risk determination can be assessed on an individual basis. A major goal of biophysical modelling is to relate physical events such as ionization, excitation, etc. to the production of radiation carcinogenesis. A description of the physical events is provided by track structure. The track structure is determined by radiation quality, and it can be considered to be the "physical signature" of the radiation. Unfortunately, the uniqueness characteristics of this signature are dissipated in biological systems in ∼10^-9 s. Nonetheless, it is our contention that this physical disturbance of the biological system eventuates later, at ∼10^0 s, in molecular lesion spectra which also characterize the causal radiation. (author)

  5. Redactable signatures for signed CDA Documents.

    Science.gov (United States)

    Wu, Zhen-Yu; Hsueh, Chih-Wen; Tsai, Cheng-Yu; Lai, Feipei; Lee, Hung-Chang; Chung, Yufang

    2012-06-01

    The Clinical Document Architecture, introduced by Health Level Seven, is an XML-based standard that specifies the encoding, structure, and semantics of clinical documents for exchange. Since the clinical document is in XML form, its authenticity and integrity can be guaranteed by use of the XML signature published by the W3C. When a clinical document must conceal some personal or private information, however, the document needs to be redacted, and redaction invalidates the signature computed over the original document. Redactable signatures have therefore been proposed to enable verification of redacted documents. Only a little research has implemented redactable signatures, and no scheme yet exists that is appropriate for clinical documents. This paper investigates existing web technologies and finds a compact and applicable model with which to implement a suitable redactable signature for the clinical document viewer.
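
For intuition, one common hash-based construction of a redactable signature can be sketched as follows (an illustrative sketch, not the scheme developed in the paper; HMAC stands in for the W3C XML signature, and all names are hypothetical): the signer commits to each field with a salted hash and signs the ordered commitment list, so a field can later be removed while verification still succeeds.

```python
# Sketch of a hash-based redactable signature (illustrative, not the paper's
# scheme).  Each document field is committed to with a salted SHA-256 hash;
# the signer signs the ordered list of commitments.  Redacting a field means
# releasing only its commitment, so the signature still verifies.
import hashlib, hmac, os

SIGNING_KEY = b"demo-key"   # HMAC stands in for a real XML/public-key signature

def commit(value: bytes, salt: bytes) -> bytes:
    return hashlib.sha256(salt + value).digest()

def sign(fields):
    salts = [os.urandom(16) for _ in fields]
    commitments = [commit(v, s) for v, s in zip(fields, salts)]
    sig = hmac.new(SIGNING_KEY, b"".join(commitments), hashlib.sha256).digest()
    return salts, sig

def redact(fields, salts, index):
    """Replace one field by its bare commitment; drop its value and salt."""
    disclosed = [(v, s) for k, (v, s) in enumerate(zip(fields, salts)) if k != index]
    hidden = commit(fields[index], salts[index])
    return disclosed, hidden, index

def verify(disclosed, hidden, index, sig):
    commitments = [commit(v, s) for v, s in disclosed]
    commitments.insert(index, hidden)   # slot the commitment back in place
    expected = hmac.new(SIGNING_KEY, b"".join(commitments), hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

fields = [b"name: Alice", b"diagnosis: flu", b"date: 2012-06-01"]
salts, sig = sign(fields)
disclosed, hidden, i = redact(fields, salts, 1)   # hide the diagnosis
print(verify(disclosed, hidden, i, sig))          # True: redacted doc verifies
```

The salt prevents a verifier from confirming guesses about the redacted value by brute force.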

  6. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in a mobile multicloud computing environment, a mobile multicloud data integrity verification (MMCDIV) scheme is proposed. First, computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced by using homomorphic verifiable responses (HVR) with random masking and a sequence-enforced Merkle hash tree (sMHT) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified with lightweight computing and low data transmission. The scheme compensates for the limited communication and computing power of mobile devices, supports dynamic data operations in a mobile multicloud environment, and verifies data integrity without using the source file blocks directly. Experimental results also demonstrate that the scheme achieves low computing and communication costs.
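
The sMHT component builds on the standard Merkle-hash-tree idea. As a hedged illustration (a generic Merkle tree, not the paper's sequence-enforced variant or its BLS machinery), a verifier holding only the root hash can check a single data block using a logarithmic-size authentication path:

```python
# Generic Merkle-hash-tree sketch (not the paper's sMHT/BLS construction):
# the verifier holds only the root hash and checks one data block against
# a logarithmic-size authentication path of sibling hashes.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(blocks):
    """Return the list of tree levels, leaves first, root last."""
    level = [h(b) for b in blocks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level = level + [level[-1]]
        level = [h(level[k] + level[k + 1]) for k in range(0, len(level), 2)]
        levels.append(level)
    return levels

def auth_path(levels, index):
    """Sibling hashes needed to recompute the root from leaf `index`."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        path.append((level[index ^ 1], index % 2))  # (sibling, am-I-right-child)
        index //= 2
    return path

def verify_block(block, path, root):
    node = h(block)
    for sibling, is_right in path:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

blocks = [b"block-%d" % k for k in range(5)]
levels = build_tree(blocks)
root = levels[-1][0]
proof = auth_path(levels, 3)
print(verify_block(blocks[3], proof, root))     # True
print(verify_block(b"tampered", proof, root))   # False
```

This is why the scheme can avoid transmitting source file blocks directly: a short proof per challenged block suffices.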

  7. Establishment and verification of three-dimensional dynamic model for heavy-haul train-track coupled system

    Science.gov (United States)

    Liu, Pengfei; Zhai, Wanming; Wang, Kaiyun

    2016-11-01

    For a long heavy-haul train, the basic principles of inter-vehicle interaction and train-track dynamic interaction are analysed first. Based on the theories of train longitudinal dynamics and vehicle-track coupled dynamics, a three-dimensional (3-D) dynamic model of the heavy-haul train-track coupled system is established through a modularised method. Specifically, this model includes subsystems such as the train control, the vehicle, the wheel-rail relation and the line geometries. For the calculation of the wheel-rail interaction force under driving or braking conditions, the large-creep phenomenon that may occur within the wheel-rail contact patch is considered. For the coupler and draft gear system, the coupler forces in three directions and the coupler lateral tilt angles in curves are calculated. Then, according to the characteristics of the long heavy-haul train, an efficient solving method is developed to improve the computational efficiency for such a large system, and some basic principles that should be followed to meet the required calculation accuracy are determined. Finally, the 3-D train-track coupled model is verified by comparing calculated results with running test results. The comparison indicates that the proposed dynamic model simulates the dynamic performance of the heavy-haul train well.

  8. Spatiotemporal dynamics of the brain at rest--exploring EEG microstates as electrophysiological signatures of BOLD resting state networks.

    Science.gov (United States)

    Yuan, Han; Zotev, Vadim; Phillips, Raquel; Drevets, Wayne C; Bodurka, Jerzy

    2012-05-01

    Neuroimaging research suggests that the resting cerebral physiology is characterized by complex patterns of neuronal activity in widely distributed functional networks. As studied using functional magnetic resonance imaging (fMRI) of the blood-oxygenation-level dependent (BOLD) signal, the resting brain activity is associated with slowly fluctuating hemodynamic signals (~10 s). More recently, multimodal functional imaging studies involving simultaneous acquisition of BOLD-fMRI and electroencephalography (EEG) data have suggested that the relatively slow hemodynamic fluctuations of some resting state networks (RSNs) evinced in the BOLD data are related to much faster (~100 ms) transient brain states reflected in EEG signals, referred to as "microstates". To further elucidate the relationship between microstates and RSNs, we developed a fully data-driven approach that combines information from simultaneously recorded, high-density EEG and BOLD-fMRI data. Using independent component analysis (ICA) of the combined EEG and fMRI data, we identified thirteen microstates and ten RSNs that are organized independently in their temporal and spatial characteristics, respectively. We hypothesized that the intrinsic brain networks that are active at rest would be reflected in both the EEG data and the fMRI data. To test this hypothesis, the rapid fluctuations associated with each microstate were correlated with the BOLD-fMRI signal associated with each RSN. We found that each RSN was further characterized by a specific electrophysiological signature involving one or a combination of several microstates. Moreover, by comparing the time course of EEG microstates to that of the whole-brain BOLD signal, on a multi-subject group level, we unraveled for the first time a set of microstate-associated networks that correspond to a range of previously described RSNs, including visual, sensorimotor, auditory, attention, frontal, visceromotor and default mode networks. These

  9. Verification of new model for calculation of critical strain for the initialization of dynamic recrystallization using laboratory rolling

    Directory of Open Access Journals (Sweden)

    R. Fabík

    2009-10-01

    Full Text Available This paper presents a new model for calculating the critical strain for the initiation of dynamic recrystallization. The new model reflects the forming history in the deformation zone during rolling. In this region of restricted deformation, the strain-rate curve for the surface of the strip exhibits two peaks; these are the two reasons why the onset of dynamic recrystallization (DRX) near the surface of the rolled part occurs later than theory predicts for strip rolling. The model has been used in a program for simulating forming processes with the aid of FEM, and a comparison between the physical experiment and the mathematical model is drawn.

  10. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type, and such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the workings of a model checker. Given that it has become easier than ever to gain access to large numbers of computers, even for routine tasks, it is increasingly attractive to find alternative ways to use these resources to speed up model checking. This paper describes one such method, called swarm verification.
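
The idea can be sketched with a toy example (illustrative only; the actual swarm tool diversifies full Spin model-checker runs, not the made-up state graph below): many small, independently randomized depth-first searches, each with its own seed and a tight step budget, collectively cover far more of a huge state space than any single bounded search.

```python
# Toy sketch of swarm verification (illustrative; the real swarm tool runs
# many diversified Spin model-checker jobs).  Each swarm member performs a
# randomized depth-first search with its own seed and a small step budget;
# the swarm succeeds if any member reaches a target state.
import random

def successors(state):
    """Toy transition relation: from n, move to 2n or 2n+1 (a binary tree)."""
    return [2 * state, 2 * state + 1]

def random_dfs(seed, start, is_target, max_steps=2000, max_depth=40):
    rng = random.Random(seed)
    stack = [(start, 0)]
    for _ in range(max_steps):
        if not stack:
            return None
        state, depth = stack.pop()
        if is_target(state):
            return state
        if depth < max_depth:
            succ = successors(state)
            rng.shuffle(succ)          # the only diversification: visit order
            stack.extend((s, depth + 1) for s in succ)
    return None

def is_target(s):
    """Error states, scattered thinly through the space."""
    return s > 1 and s % 1009 == 0

hits = [random_dfs(seed, 1, is_target) for seed in range(16)]
found = [state for state in hits if state is not None]
print(len(found), "of 16 members reached a target state")
```

No member explores more than 2000 states, yet with 16 differently seeded members the swarm almost surely stumbles onto a target; that union-of-cheap-searches effect is the point of the technique.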

  11. Quantum signature scheme for known quantum messages

    International Nuclear Information System (INIS)

    Kim, Taewan; Lee, Hyang-Sook

    2015-01-01

    When we want to sign a quantum message that we create, we can use arbitrated quantum signature schemes, which can sign not only known quantum messages but also unknown ones. However, since arbitrated quantum signature schemes need the help of a trusted arbitrator for each verification of a signature, they are known to be inconvenient in practical use. If we consider only known quantum messages, as in the situation above, a quantum signature scheme with a more efficient structure can exist. In this paper, we present a new quantum signature scheme for known quantum messages that works without the help of an arbitrator. Differing from arbitrated quantum signature schemes based on the quantum one-time pad with a symmetric key, our scheme is based on quantum public-key cryptosystems, so the validity of a signature can be verified by the receiver without the help of an arbitrator. Moreover, we show that our scheme provides quantum message integrity, user authentication and non-repudiation of origin, as digital signature schemes do. (paper)

  12. Molecular signatures for the dynamic process of establishing intestinal host-microbial homeostasis: potential for disease diagnostics?

    NARCIS (Netherlands)

    Aidy, El S.F.; Kleerebezem, M.

    2013-01-01

    Purpose of review: The dynamic interplay of the intestinal microbiota and host has been the focus of many studies because of its impact on the health status in human life. Recent reports on the time-resolved immune and metabolic interactions between the host and microbiota, as well as the molecular

  13. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    The purpose of this thesis is to develop a method for verifying timed temporal properties of continuous dynamical systems, and to develop a method for verifying the safety of an interconnection of continuous systems. The methods must be scalable in the number of continuous variables...... to the high complexity of both the dynamical system and the specification. Therefore, there is a need for methods capable of verifying complex specifications of complex systems. The verification of high dimensional continuous dynamical systems is the key to verifying general systems. In this thesis......, an abstraction approach is taken to the verification problem. A method is developed for abstracting continuous dynamical systems by timed automata. This method is based on subdividing the state space into cells by use of subdivisioning functions that are decreasing along the vector field. To allow...

  14. Verification of a 2 kWe Closed-Brayton-Cycle Power Conversion System Mechanical Dynamics Model

    Science.gov (United States)

    Ludwiczak, Damian R.; Le, Dzu K.; McNelis, Anne M.; Yu, Albert C.; Samorezov, Sergey; Hervol, Dave S.

    2005-01-01

    Vibration test data from an operating 2 kWe closed-Brayton-cycle (CBC) power conversion system (PCS) located at the NASA Glenn Research Center was used for a comparison with a dynamic disturbance model of the same unit. This effort was performed to show that a dynamic disturbance model of a CBC PCS can be developed to accurately predict the torque and vibration disturbance fields of this class of rotating machinery. The ability to accurately predict these disturbance fields is required before such hardware can be confidently integrated into a spacecraft mission. Accurate predictions of CBC disturbance fields will be used for spacecraft control/structure interaction analyses and for understanding the vibration disturbances affecting the scientific instrumentation onboard. This paper discusses how test cell data measurements for the 2 kWe CBC PCS were obtained, the development of a dynamic disturbance model used to predict the transient torque and steady-state vibration fields of the same unit, and a comparison of the two sets of data.

  15. Co-evolving Hydroclimatic Signatures and Diarrheal Disease Dynamics in Bangladesh: Implications for Water Management and Public Health

    Science.gov (United States)

    Akanda, A. S.; Hasan, M. A.; Jutla, A.; Islam, A. K. M. S.; Huq, A.; Colwell, R. R.

    2014-12-01

    The Bengal Delta region in South Asia is well known for its endemic diarrheal diseases and for high population vulnerability to natural calamities and to diarrheal and other water-related disease epidemics. Diarrheal disease outbreaks in the coastal and inland floodplains, such as cholera, rotavirus, and dysentery, show distinct seasonal peaks and spatial signatures in their origin and progression. The last three decades of surveillance data also show a drastic increase in diarrheal incidence in both urban and peri-urban areas, even after correcting for population trends. Recent research has shown increased roles of hydroclimatic events such as droughts and floods in the seasonal to interannual characteristics, as well as in the coastal and inland progression patterns, of disease outbreaks. However, the mechanisms behind these phenomena, especially how changes in regional climatic and hydrologic processes have contributed to the spatio-temporal trends of disease outbreaks, are not fully understood. Here, we analyze the last 30 years of diarrheal incidence in Dhaka and regional surveillance centers against changes in climatic and anthropogenic forcings: regional hydrology, flooding, water usage, population growth and density in urban settlements, as well as shifting climate patterns and the frequency of natural disasters. We use a set of CMIP5 (Coupled Model Intercomparison Project Phase 5) model projections of regional precipitation and temperature patterns in the Bengal Delta to develop scenarios of diarrheal disease projections with spatial (coastal and inland) and temporal (dry vs. wet) comparisons. Our preliminary results show that growing water scarcity in the dry season, increasing salinity in coastal areas, and the lack of sustainable water and sanitation infrastructure for urban settlements have increased the endemicity of cholera outbreaks in spring, while record flood events, limited stormwater drainage and sanitation, and a more intensive monsoon have contributed to

  16. Cross-code gyrokinetic verification and benchmark on the linear collisionless dynamics of the geodesic acoustic mode

    Science.gov (United States)

    Biancalani, A.; Bottino, A.; Ehrlacher, C.; Grandgirard, V.; Merlo, G.; Novikau, I.; Qiu, Z.; Sonnendrücker, E.; Garbet, X.; Görler, T.; Leerink, S.; Palermo, F.; Zarzoso, D.

    2017-06-01

    The linear properties of the geodesic acoustic modes (GAMs) in tokamaks are investigated by comparing analytical theory and gyrokinetic numerical simulations. The dependence on the value of the safety factor, the finite orbit width of the ions in relation to the radial mode width, magnetic-flux-surface shaping, and the electron/ion mass ratio are considered. Nonuniformities in the plasma profiles (such as density, temperature, and safety factor), electromagnetic effects, collisions, and the presence of minority species are neglected. Also, only linear simulations are considered, focusing on the local dynamics. We use three different gyrokinetic codes: the Lagrangian (particle-in-cell) code ORB5, the Eulerian code GENE, and the semi-Lagrangian code GYSELA. One of the main aims of this paper is to provide a detailed comparison of the numerical results and analytical theory in the regimes where this is possible. This helps in better understanding the behavior of the linear GAM dynamics in these different regimes, the behavior of the codes, which is crucial in view of future work where more physics is present, and the regimes of validity of each specific analytical dispersion relation.

  17. Mechanism for verification of mismatched and homoduplex DNAs by nucleotides-bound MutS analyzed by molecular dynamics simulations.

    Science.gov (United States)

    Ishida, Hisashi; Matsumoto, Atsushi

    2016-09-01

    In order to understand how MutS recognizes mismatched DNA and induces the reaction of DNA repair using ATP, the dynamics of the complexes of MutS (bound to the ADP and ATP nucleotides, or not) and DNA (with mismatched and matched base-pairs) were investigated using molecular dynamics simulations. As for DNA, the structure of the base-pairs of the homoduplex DNA which interacted with the DNA recognition site of MutS was intermittently disturbed, indicating that the homoduplex DNA was unstable. As for MutS, the disordered loops in the ATPase domains, which are considered to be necessary for the induction of DNA repair, were close to (away from) the nucleotide-binding sites in the ATPase domains when the nucleotides were (not) bound to MutS. This indicates that the ATPase domains changed their structural stability upon ATP binding using the disordered loop. Conformational analysis by principal component analysis showed that the nucleotide binding changed modes which have structurally solid ATPase domains and the large bending motion of the DNA from higher to lower frequencies. In the MutS-mismatched DNA complex bound to two nucleotides, the bending motion of the DNA at low frequency modes may play a role in triggering the formation of the sliding clamp for the following DNA-repair reaction step. Moreover, MM-PBSA/GBSA showed that the MutS-homoduplex DNA complex bound to two nucleotides was unstable because of the unfavorable interactions between MutS and DNA. This would trigger the ATP hydrolysis or separation of MutS and DNA to continue searching for mismatch base-pairs. Proteins 2016; 84:1287-1303. © 2016 Wiley Periodicals, Inc.

  18. Future directions of nuclear verification

    International Nuclear Information System (INIS)

    Blix, H.

    1997-01-01

    Future directions of nuclear verification are discussed including the following topics: verification of non-proliferation commitments; practicalities of strengthening safeguards; new tasks in nuclear verification

  19. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    Science.gov (United States)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  20. The Lipopolysaccharide-Induced Metabolome Signature in Arabidopsis thaliana Reveals Dynamic Reprogramming of Phytoalexin and Phytoanticipin Pathways.

    Science.gov (United States)

    Finnegan, Tarryn; Steenkamp, Paul A; Piater, Lizelle A; Dubery, Ian A

    Lipopolysaccharides (LPSs), as MAMP molecules, trigger the activation of signal transduction pathways involved in defence. Currently, plant metabolomics is providing new dimensions into understanding the intracellular adaptive responses to external stimuli. The effect of LPS on the metabolomes of Arabidopsis thaliana cells and leaf tissue was investigated over a 24 h period. Cellular metabolites and those secreted into the medium were extracted with methanol, and liquid chromatography coupled to mass spectrometry was used for quantitative and qualitative analyses. Multivariate statistical data analyses were used to extract interpretable information from the generated multidimensional LC-MS data. The results show that LPS perception triggered differential changes in the metabolomes of cells and leaves, leading to variation in the biosynthesis of specialised secondary metabolites. Time-dependent changes in metabolite profiles were observed and biomarkers associated with the LPS-induced response were tentatively identified. These include the phytohormones salicylic acid and jasmonic acid, and also the associated methyl esters and sugar conjugates. The induced defensive state resulted in increases in indole- and other glucosinolates, indole derivatives, camalexin as well as cinnamic acid derivatives and other phenylpropanoids. These annotated metabolites indicate dynamic reprogramming of metabolic pathways that are functionally related towards creating an enhanced defensive capacity. The results reveal new insights into the mode of action of LPS as an activator of plant innate immunity, broaden knowledge about the defence metabolite pathways involved in Arabidopsis responses to LPS, and identify specialised metabolites of functional importance that can be employed to enhance immunity against pathogen infection.

  1. Un système de vérification de signature manuscrite en ligne basé ...

    African Journals Online (AJOL)

    Administrateur

    online handwritten signature verification system. We model the handwritten signature by an analytical approach based on the Empirical Mode Decomposition (EMD). The organized system is provided with a training module and a base of signatures. The implemented evaluation protocol points out the interest of the adopted ...

  2. The Lipopolysaccharide-Induced Metabolome Signature in Arabidopsis thaliana Reveals Dynamic Reprogramming of Phytoalexin and Phytoanticipin Pathways.

    Directory of Open Access Journals (Sweden)

    Tarryn Finnegan

    Full Text Available Lipopolysaccharides (LPSs), as MAMP molecules, trigger the activation of signal transduction pathways involved in defence. Currently, plant metabolomics is providing new dimensions into understanding the intracellular adaptive responses to external stimuli. The effect of LPS on the metabolomes of Arabidopsis thaliana cells and leaf tissue was investigated over a 24 h period. Cellular metabolites and those secreted into the medium were extracted with methanol, and liquid chromatography coupled to mass spectrometry was used for quantitative and qualitative analyses. Multivariate statistical data analyses were used to extract interpretable information from the generated multidimensional LC-MS data. The results show that LPS perception triggered differential changes in the metabolomes of cells and leaves, leading to variation in the biosynthesis of specialised secondary metabolites. Time-dependent changes in metabolite profiles were observed and biomarkers associated with the LPS-induced response were tentatively identified. These include the phytohormones salicylic acid and jasmonic acid, and also the associated methyl esters and sugar conjugates. The induced defensive state resulted in increases in indole- and other glucosinolates, indole derivatives, camalexin as well as cinnamic acid derivatives and other phenylpropanoids. These annotated metabolites indicate dynamic reprogramming of metabolic pathways that are functionally related towards creating an enhanced defensive capacity. The results reveal new insights into the mode of action of LPS as an activator of plant innate immunity, broaden knowledge about the defence metabolite pathways involved in Arabidopsis responses to LPS, and identify specialised metabolites of functional importance that can be employed to enhance immunity against pathogen infection.

  3. Quantum blind dual-signature scheme without arbitrator

    Science.gov (United States)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-03-01

    Motivated by the elegant features of a blind signature, we propose a quantum blind dual-signature scheme with three phases: an initial phase, a signing phase and a verification phase. Unlike conventional schemes, legal messages are signed in the signing phase not only by the blind signatory but also by the sender. The scheme does not rely heavily on an arbitrator in the verification phase, as previous quantum signature schemes usually do. Security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. The scheme offers a potential application for e-commerce or e-payment systems with current technology.

  4. Quantum blind dual-signature scheme without arbitrator

    International Nuclear Information System (INIS)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-01-01

    Motivated by the elegant features of a blind signature, we propose a quantum blind dual-signature scheme with three phases: an initial phase, a signing phase and a verification phase. Unlike conventional schemes, legal messages are signed in the signing phase not only by the blind signatory but also by the sender. The scheme does not rely heavily on an arbitrator in the verification phase, as previous quantum signature schemes usually do. Security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. The scheme offers a potential application for e-commerce or e-payment systems with current technology. (paper)

  5. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products is necessary throughout the software life cycle is considered first. Concepts of verification, software verification planning, and verification methodologies for products generated throughout the software life cycle are then discussed.

  6. Verification of ceramic structures

    NARCIS (Netherlands)

    Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.

    2012-01-01

    In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and

  7. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  8. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  9. ΔI = 1 Signature Splitting in Signature Partners of Odd Mass Superdeformed Nuclei

    Directory of Open Access Journals (Sweden)

    Khalaf A. M.,

    2013-07-01

    Full Text Available The spins, transition energies, rotational frequencies, and kinematic and dynamic moments of inertia of rotational bands of signature-partner pairs of odd-A superdeformed bands in the A∼190 region were calculated with a simple model based on the collective rotational model. A simulated search program was written to determine the model parameters. The calculated results agree with experimental data for fourteen signature-partner pairs in Hg/Tl/Pb/Bi nuclei. We investigated the ΔI=1 signature splitting by extracting the difference between the average of the I+2 → I and I → I-2 transition energies in one band and the I+1 → I-1 transition energies in its signature partner. Most of the signature partners in this region show large-amplitude staggering. The signature splitting has the effect of increasing the dynamical moment of inertia J^(2) for the favored band and decreasing J^(2) for the unfavored band.

  10. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
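
The bootstrap construction of pseudo-predator signatures can be sketched generically (this is not Bromaghin's sample-size algorithm, and the prey data and diet proportions below are made up for illustration): prey signatures are resampled with replacement, averaged per species, and mixed according to an assumed "true" diet.

```python
# Generic sketch of building a pseudo-predator fatty acid signature by
# bootstrap sampling prey signatures under a known diet (illustrative only;
# not the calibrated bootstrap-sample-size algorithm from the paper).
import random

def normalize(sig):
    total = sum(sig)
    return [x / total for x in sig]

def pseudo_predator(prey_db, diet, n_boot, rng):
    """Mean of bootstrap-resampled prey signatures, mixed by diet proportions."""
    n_fa = len(next(iter(prey_db.values()))[0])   # number of fatty acids
    mixed = [0.0] * n_fa
    for species, proportion in diet.items():
        sigs = prey_db[species]
        boot = [rng.choice(sigs) for _ in range(n_boot)]      # resample
        mean = [sum(col) / n_boot for col in zip(*boot)]      # bootstrap mean
        mixed = [m + proportion * v for m, v in zip(mixed, mean)]
    return normalize(mixed)                       # signatures are proportions

rng = random.Random(0)
# Hypothetical data: three fatty acids, two prey species.
prey_db = {
    "seal":   [[0.50, 0.30, 0.20], [0.55, 0.25, 0.20]],
    "walrus": [[0.20, 0.20, 0.60], [0.25, 0.15, 0.60]],
}
diet = {"seal": 0.7, "walrus": 0.3}
sig = pseudo_predator(prey_db, diet, n_boot=200, rng=rng)
print(round(sum(sig), 6))    # 1.0: the pseudo-signature is a proportion vector
```

The choice of `n_boot` is exactly the arbitrary knob the paper replaces with an objective algorithm: too small and the pseudo-predator signatures are noisier than real predators, too large and they are unrealistically smooth.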

  11. Review and Analysis of Cryptographic Schemes Implementing Threshold Signature

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-03-01

    Full Text Available This work is devoted to the study of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were investigated. Different methods of generation and verification of threshold signatures were explored, e.g. those used in mobile agents, Internet banking and e-currency. The significance of the work is determined by the reduction in the level of counterfeit electronic documents signed by a certain group of users.
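The Lagrange-interpolation construction mentioned above is the arithmetic core of Shamir-style threshold schemes: any t shares of a degree-(t−1) polynomial over a prime field recover the secret signing value. A minimal sketch of that interpolation step (illustrative only; practical threshold signature schemes interpolate "in the exponent" so the secret is never reconstructed in one place):

```python
def lagrange_interpolate_at_zero(shares, p):
    """Reconstruct the secret f(0) from t shares (x_i, y_i) of a
    degree-(t-1) polynomial over GF(p) -- the interpolation step
    underlying Shamir-style threshold signature schemes.

    Uses the three-argument pow for modular inverse (Python 3.8+).
    """
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        # Lagrange basis polynomial evaluated at x = 0.
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * (-xj)) % p
                den = (den * (xi - xj)) % p
        secret = (secret + yi * num * pow(den, -1, p)) % p
    return secret
```

For example, with f(x) = 42 + 5x over GF(97), the two shares (1, 47) and (2, 52) reconstruct the secret 42.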

  12. Low Signature Tent Structures

    National Research Council Canada - National Science Library

    Cox, Randy

    1998-01-01

    .... Modeling shows that visual, near infrared, thermal, and radar signatures should be reduced when compared to other current tent designs. A brief treatise on the role of tent signatures and their results is included.

  13. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  14. Software verification and testing

    Science.gov (United States)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  15. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  16. Continuous-variable quantum homomorphic signature

    Science.gov (United States)

    Li, Ke; Shang, Tao; Liu, Jian-wei

    2017-10-01

    Quantum cryptography is believed to be unconditionally secure because its security is ensured by physical laws rather than computational complexity. According to spectrum characteristic, quantum information can be classified into two categories, namely discrete variables and continuous variables. Continuous-variable quantum protocols have gained much attention for their ability to transmit more information with lower cost. To verify the identities of different data sources in a quantum network, we propose a continuous-variable quantum homomorphic signature scheme. It is based on continuous-variable entanglement swapping and provides additive and subtractive homomorphism. Security analysis shows the proposed scheme is secure against replay, forgery and repudiation. Even under nonideal conditions, it supports effective verification within a certain verification threshold.

  17. Managing the Verification Trajectory

    NARCIS (Netherlands)

    Ruys, T.C.; Brinksma, Hendrik

    In this paper we take a closer look at the automated analysis of designs, in particular of verification by model checking. Model checking tools are increasingly being used for the verification of real-life systems in an industrial context. In addition to ongoing research aimed at curbing the

  18. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  19. Breaking of the z-signature symmetry in the Hartree-Fock-Bogoliubov formalism. Applications to the dynamics of deformed nuclei

    International Nuclear Information System (INIS)

    Dancer, H.

    2000-01-01

    This work concerns the extension of the application domain of microscopic calculations in nuclear structure to phenomena breaking the z-signature symmetry. The approach followed consists in solving the many-body problem by means of the mean-field approximation using the Hartree-Fock-Bogoliubov method. By employing the Gogny nucleon-nucleon interaction, mean-field effects as well as pairing correlations are calculated in a self-consistent way. In order to simplify the iterative resolution of the associated system of non-linear equations, the remaining symmetries of the system are explicitly taken into account. In all studies made up to now, the z-signature symmetry was imposed: the solutions of the HFB problem were eigenstates of the z-signature, a symmetry related to a rotation of π around the z axis. However, many physical phenomena, such as rotational bands based on individual excitations and magnetic dipole collective excitations, break this symmetry. In this work the formalism needed to take this symmetry breaking into account is developed, and results are given for the two phenomena above. Theoretical rotational bands are in good agreement with experimental data. Band-head excitation energies as well as moments of inertia are well reproduced. Concerning magnetic dipole excitations, we show that the low-lying 1+ states experimentally observed are not collective scissors excitations, the latter being found at high excitation energy. An interpretation in terms of rather non-collective hexadecapole excitations coupled to individual excitations is proposed. (author)

  20. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  1. Unconditionally Secure Quantum Signatures

    Directory of Open Access Journals (Sweden)

    Ryan Amiri

    2015-08-01

    Full Text Available Signature schemes, proposed in 1976 by Diffie and Hellman, have become ubiquitous across modern communications. They allow for the exchange of messages from one sender to multiple recipients, with the guarantees that messages cannot be forged or tampered with and that messages can also be forwarded from one recipient to another without compromising their validity. Signatures are different from, but no less important than, encryption, which ensures the privacy of a message. Commonly used signature protocols—signatures based on the Rivest–Shamir–Adleman (RSA) algorithm, the digital signature algorithm (DSA), and the elliptic curve digital signature algorithm (ECDSA)—are only computationally secure, similar to public key encryption methods. In fact, since these rely on the difficulty of finding discrete logarithms or factoring large integers, it is known that they will become completely insecure with the emergence of quantum computers. We may therefore see a shift towards signature protocols that will remain secure even in a post-quantum world. Ideally, such schemes would provide unconditional or information-theoretic security. In this paper, we aim to provide an accessible and comprehensive review of existing unconditionally secure signature schemes for signing classical messages, with a focus on unconditionally secure quantum signature schemes.

  2. Radar Signature Calculation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: The calculation, analysis, and visualization of the spatially extended radar signatures of complex objects such as ships in a sea multipath environment and...

  3. Video-based fingerprint verification.

    Science.gov (United States)

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
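The fusion of "inside similarity" (dynamic information) and "outside similarity" (static information) into one match score can be illustrated with a simple weighted combination. The paper's exact combination rule is not given in this abstract, so the weight and decision threshold below are placeholders, not the authors' values:

```python
def fuse_match_score(inside_sim, outside_sim, w=0.5):
    """Combine the two similarity scores (each assumed normalized to
    [0, 1]) into one match score via a weighted sum -- an illustrative
    fusion rule, not the paper's formula."""
    return w * inside_sim + (1.0 - w) * outside_sim

def verify(inside_sim, outside_sim, threshold=0.7, w=0.5):
    # Accept the claimed identity when the fused score clears the
    # (placeholder) decision threshold.
    return fuse_match_score(inside_sim, outside_sim, w) >= threshold
```

Sweeping the threshold trades off the false acceptance rate against the false rejection rate, which is how EER figures like those quoted above are obtained.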

  4. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  5. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  6. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  7. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  8. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  9. Dynamic Behavior of a SCARA Robot by using N-E Method for a Straight Line and Simulation of Motion by using Solidworks and Verification by Matlab/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2014-05-01

    Full Text Available The SCARA (Selective Compliant Assembly Robot Arm) robot, a serial-architecture robot, is widely used in assembly and "pick-and-place" operations; it has been shown that the use of robots improves the accuracy of assembly and saves assembly time and cost as well. The most important condition for the choice of this kind of robot is its dynamic behavior for a given path, yet no closed-form solution for the dynamics of this important robot has been reported. This paper presents a study of the kinematics (forward and inverse) of the SCARA robot using the D-H notation, and of its dynamics using the N-E method. A computer code was developed for trajectory generation using inverse kinematics, and for calculating the variations of the link torques along a straight-line, rest-to-rest path between two positions of a pick-and-place operation. The SCARA robot was modeled in SolidWorks to achieve the pick-and-place operation, with verification in Matlab/Simulink. The simulation results are discussed, and good agreement between the two software packages is obtained.
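The kinematics step described above can be sketched for the planar part of a SCARA arm: two revolute joints set the (x, y) position, a prismatic joint sets z. The link lengths below are illustrative values, not taken from the paper, and the N-E dynamics are omitted:

```python
import math

def scara_forward_kinematics(theta1, theta2, d3, a1=0.425, a2=0.375):
    """End-effector position from the two revolute joint angles (rad)
    and the prismatic extension d3 (m). Link lengths a1, a2 are
    placeholder values."""
    x = a1 * math.cos(theta1) + a2 * math.cos(theta1 + theta2)
    y = a1 * math.sin(theta1) + a2 * math.sin(theta1 + theta2)
    z = -d3  # prismatic joint moves the tool down along z
    return x, y, z

def scara_inverse_kinematics(x, y, a1=0.425, a2=0.375, elbow=+1):
    """Closed-form inverse kinematics for the planar two-link part;
    `elbow` (+1 or -1) selects the elbow-up or elbow-down solution."""
    c2 = (x * x + y * y - a1 * a1 - a2 * a2) / (2 * a1 * a2)
    s2 = elbow * math.sqrt(max(0.0, 1.0 - c2 * c2))
    theta2 = math.atan2(s2, c2)
    theta1 = math.atan2(y, x) - math.atan2(a2 * s2, a1 + a2 * c2)
    return theta1, theta2
```

A quick sanity check is the round trip: feed joint angles through the forward model, invert the resulting position, and recover the original angles.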

  10. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is generally painstakingly slow, but from time to time 'windows of opportunity' - moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  11. Digital Signature Management.

    Science.gov (United States)

    Hassler, Vesna; Biely, Helmut

    1999-01-01

    Describes the Digital Signature Project that was developed in Austria to establish an infrastructure for applying smart card-based digital signatures in banking and electronic-commerce applications. Discusses the need to conform to international standards, an international certification infrastructure, and security features for a public directory…

  12. Electronic health records: what does your signature signify?

    Directory of Open Access Journals (Sweden)

    Victoroff MD Michael S

    2012-08-01

    Full Text Available Abstract Electronic health records serve multiple purposes, including clinical communication, legal documentation, financial transaction capture, research and analytics. Electronic signatures attached to entries in EHRs have different logical and legal meanings for different users. Some of these are vestiges from historic paper formats that require reconsideration. Traditionally accepted functions of signatures, such as identity verification, attestation, consent, authorization and non-repudiation can become ambiguous in the context of computer-assisted workflow processes that incorporate functions like logins, auto-fill and audit trails. This article exposes the incompatibility of expectations among typical users of electronically signed information.

  13. Hybrid Enrichment Verification Array: Module Characterization Studies

    Energy Technology Data Exchange (ETDEWEB)

    Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mace, Emily K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-03-01

    The work presented in this report is focused on the characterization and refinement of the Hybrid Enrichment Verification Array (HEVA) approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL, and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. The core of the HEVA methodology, the high-energy prompt gamma-ray signature, serves as an indirect method for the measurement of total neutron emission from the cylinder. A method for measuring the intrinsic efficiency of this “non-traditional” neutron signature and the results from a benchmark experiment are presented. Also discussed are potential perturbing effects on the non-traditional signature, including short-lived activation of materials in the HEVA module. Modeling and empirical results are presented to demonstrate that such effects are expected to be negligible for the envisioned implementation scenario. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  14. UV Signature Mutations †

    Science.gov (United States)

    2014-01-01

    Sequencing complete tumor genomes and exomes has sparked the cancer field's interest in mutation signatures for identifying the tumor's carcinogen. This review and meta-analysis discusses signatures and their proper use. We first distinguish between a mutagen's canonical mutations – deviations from a random distribution of base changes to create a pattern typical of that mutagen – and the subset of signature mutations, which are unique to that mutagen and permit inference backward from mutations to mutagen. To verify UV signature mutations, we assembled literature datasets on cells exposed to UVC, UVB, UVA, or solar simulator light (SSL) and tested canonical UV mutation features as criteria for clustering datasets. A confirmed UV signature was: ≥60% of mutations are C→T at a dipyrimidine site, with ≥5% CC→TT. Other canonical features such as a bias for mutations on the non-transcribed strand or at the 3' pyrimidine had limited application. The most robust classifier combined these features with criteria for the rarity of non-UV canonical mutations. In addition, several signatures proposed for specific UV wavelengths were limited to specific genes or species; non-signature mutations induced by UV may cause melanoma BRAF mutations; and the mutagen for sunlight-related skin neoplasms may vary between continents. PMID:25354245
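The confirmed UV signature criterion quoted above (≥60% C→T at dipyrimidine sites, with ≥5% CC→TT) lends itself to a small classifier. The mutation record format below is an assumption for illustration, and testing the two fractions separately is one reading of the criterion:

```python
def is_uv_signature(mutations):
    """Apply the review's confirmed UV signature criterion to a list of
    mutation records. Each record is a dict with keys 'change' (e.g.
    'C>T' or 'CC>TT') and 'dipyrimidine' (bool) -- an assumed format."""
    n = len(mutations)
    if n == 0:
        return False
    c_to_t = sum(1 for m in mutations
                 if m['change'] == 'C>T' and m['dipyrimidine'])
    cc_to_tt = sum(1 for m in mutations if m['change'] == 'CC>TT')
    # Criterion: >=60% C->T at dipyrimidine sites and >=5% CC->TT.
    return c_to_t / n >= 0.60 and cc_to_tt / n >= 0.05
```

As the review notes, a robust classifier would combine this with criteria for the rarity of non-UV canonical mutations; that refinement is not modeled here.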

  15. Machine Fault Signature Analysis

    Directory of Open Access Journals (Sweden)

    Pratesh Jayaswal

    2008-01-01

    Full Text Available The objective of this paper is to present recent developments in the field of machine fault signature analysis with particular regard to vibration analysis. The different types of faults that can be identified from the vibration signature analysis are, for example, gear fault, rolling contact bearing fault, journal bearing fault, flexible coupling faults, and electrical machine fault. It is not the intention of the authors to attempt to provide a detailed coverage of all the faults while detailed consideration is given to the subject of the rolling element bearing fault signature analysis.

  16. Nuclear disarmament verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  17. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  18. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  19. Advanced Missile Signature Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Advanced Missile Signature Center (AMSC) is a national facility supporting the Missile Defense Agency (MDA) and other DoD programs and customers with analysis,...

  20. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  1. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  2. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novice

  3. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  4. Software verification and validation plan for the GWSCREEN code

    International Nuclear Information System (INIS)

    Rood, A.S.

    1993-05-01

The purpose of this Software Verification and Validation Plan (SVVP) is to prescribe the steps necessary to verify and validate the GWSCREEN code, version 2.0, to Quality Level B standards. GWSCREEN output is to be verified and validated by comparison with hand calculations and with output from other Quality Level B computer codes. Verification and validation will also entail performing static and dynamic tests on the code using several analysis tools. This approach is consistent with guidance in ANSI/ANS-10.4-1987, "Guidelines for Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry."

  5. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g., boolean and numeric constraints, dynamically allocated structures, and arrays) and allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and was used for the verification of several non-trivial Java programs.
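
The invariant-discovery idea above can be illustrated with a toy fixed-point iteration over an interval abstract domain. This is only a sketch of the general technique, not the paper's actual algorithm, which operates on much richer constraints and quantified formulas.

```python
# Illustrative sketch: iteratively weakening a candidate interval invariant
# for the loop "i = 0; while i < n: i += 1" until it is inductive.

def join(a, b):
    """Least upper bound of two intervals (lo, hi)."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def transfer(interval, n):
    """Abstract effect of one loop iteration: if i < n, i becomes i + 1."""
    lo, hi = interval
    lo_in, hi_in = lo, min(hi, n - 1)   # only states with i < n enter the body
    if lo_in > hi_in:
        return None                     # body unreachable from this interval
    return (lo_in + 1, hi_in + 1)

def infer_invariant(n, init=(0, 0)):
    """Fixed-point iteration: grow the invariant until it absorbs the body's effect."""
    inv = init
    while True:
        post = transfer(inv, n)
        if post is None or (post[0] >= inv[0] and post[1] <= inv[1]):
            return inv                  # inductive: body output already contained
        inv = join(inv, post)

print(infer_invariant(10))  # interval covering all reachable values of i
```

The returned interval is an invariant that holds at the loop head on every iteration and also on exit.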

  6. Network-based Arbitrated Quantum Signature Scheme with Graph State

    Science.gov (United States)

    Ma, Hongling; Li, Fei; Mao, Ningyi; Wang, Yijun; Guo, Ying

    2017-08-01

Implementing an arbitrated quantum signature (QAS) through complex networks is an interesting cryptographic technology in the literature. In this paper, we propose an arbitrated quantum signature for multi-user networks whose topological structures are established by the encoded graph state. The deterministic transmission of the shared keys is enabled by appropriate stabilizers performed on the graph state. The implementation of this scheme depends on the deterministic distribution of the multi-user shared graph state, on which the encoded message can be processed in the signing and verifying phases. There are four parties involved: the signatory Alice, the verifier Bob, the arbitrator Trent, and a dealer who assists the legal participants in signature generation and verification. The security is guaranteed by the entanglement of the encoded graph state, which is cooperatively prepared by legal participants in complex quantum networks.

  7. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  8. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
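
The data-flow style of bytecode verification can be sketched with a toy stack machine that tracks the abstract type of every stack slot. The instruction set below is invented for illustration and is far simpler than real Java bytecode.

```python
# Toy abstract interpretation of a stack machine (assumed instruction set,
# not the JVM's): each opcode's type requirements are checked symbolically.

def verify(code):
    """Return the final abstract stack, or raise TypeError on a type error."""
    stack = []
    for op, *arg in code:
        if op == "push_int":
            stack.append("int")
        elif op == "push_ref":
            stack.append("ref")
        elif op == "iadd":                     # requires two ints on the stack
            if stack[-2:] != ["int", "int"]:
                raise TypeError(f"iadd needs two ints, found {stack[-2:]}")
            stack[-2:] = ["int"]
        elif op == "getfield":                 # requires a reference on top
            if not stack or stack[-1] != "ref":
                raise TypeError("getfield needs a reference on top of stack")
            stack[-1] = "int"                  # assume an int-typed field
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack

ok = [("push_int",), ("push_int",), ("iadd",)]
bad = [("push_int",), ("push_ref",), ("iadd",)]
print(verify(ok))        # ['int']
```

Running `verify(bad)` raises a TypeError, which is the verifier rejecting an ill-typed program before execution.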

  9. MARATHON Verification (MARV)

    Science.gov (United States)

    2017-08-01

the verification and "lessons learned" from the semantic and technical issues we discovered as we implemented the approach. Subject terms: any programming language in use at CAA for modeling or other data analysis applications, including R, Python, Scheme, Common Lisp, Julia, Mathematica

  10. Signature of biased range in the non-dynamical Chern-Simons modified gravity and its measurements with satellite-satellite tracking missions: theoretical studies

    Energy Technology Data Exchange (ETDEWEB)

    Qiang, Li-E [Chang' an University, Department of Geophysics, College of Geology Engineering and Geomatics, Xi' an (China); Xu, Peng [Chinese Academy of Sciences, Academy of Mathematics and Systems Science, Beijing (China)

    2015-08-15

Having great accuracy in range and range-rate measurements, the GRACE mission and the planned GRACE follow-on mission can in principle be employed to place strong constraints on certain relativistic gravitational theories. In this paper, we work out the range observable of the non-dynamical Chern-Simons modified gravity for satellite-to-satellite tracking (SST) measurements. We find that a characteristic time-accumulating range signal appears in non-dynamical Chern-Simons gravity, which has no analogue in the standard parity-preserving metric theories of gravity. The magnitude of this Chern-Simons range signal will reach a few times χ cm for each free flight of these SST missions, where χ is the dimensionless post-Newtonian parameter of the non-dynamical Chern-Simons theory. Therefore, with the 12 years of data from the GRACE mission, one expects that the mass scale M{sub CS} = (4ℎc)/(χa) of the non-dynamical Chern-Simons gravity could be constrained to be larger than 1.9 x 10{sup -9} eV. For the GRACE FO mission, scheduled to be launched in 2017, the much stronger bound M{sub CS} ≥ 5 x 10{sup -7} eV is expected. (orig.)
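
As a rough numerical illustration, the quoted relation M CS = (4ℎc)/(χa) converts a bound on the dimensionless parameter χ into a mass-energy scale. Everything below is an assumption for illustration: ℎ is read as the reduced Planck constant, and the length scale `a` is taken as a GRACE-like orbital radius rather than a value from the paper.

```python
# Back-of-the-envelope sketch (assumed reading of the formula, not the
# paper's computation): mass scale M_CS = 4*hbar*c / (chi * a).

HBAR_C_EV_M = 1.97327e-7   # hbar*c in eV*m

def mass_scale_ev(chi, a_m):
    """M_CS in eV for a given chi and length scale a (metres)."""
    return 4.0 * HBAR_C_EV_M / (chi * a_m)

a = 6.87e6                  # ~GRACE orbital radius in metres (assumed)
for chi in (1e-4, 1e-5, 1e-6):
    print(f"chi = {chi:.0e}  ->  M_CS ~ {mass_scale_ev(chi, a):.2e} eV")
```

A tighter bound on χ (smaller value) translates into a larger excluded mass scale, which is the sense in which the GRACE FO bound is "much stronger."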

  11. Signature of biased range in the non-dynamical Chern-Simons modified gravity and its measurements with satellite-satellite tracking missions: theoretical studies

    International Nuclear Information System (INIS)

    Qiang, Li-E; Xu, Peng

    2015-01-01

Having great accuracy in range and range-rate measurements, the GRACE mission and the planned GRACE follow-on mission can in principle be employed to place strong constraints on certain relativistic gravitational theories. In this paper, we work out the range observable of the non-dynamical Chern-Simons modified gravity for satellite-to-satellite tracking (SST) measurements. We find that a characteristic time-accumulating range signal appears in non-dynamical Chern-Simons gravity, which has no analogue in the standard parity-preserving metric theories of gravity. The magnitude of this Chern-Simons range signal will reach a few times χ cm for each free flight of these SST missions, where χ is the dimensionless post-Newtonian parameter of the non-dynamical Chern-Simons theory. Therefore, with the 12 years of data from the GRACE mission, one expects that the mass scale M CS = (4ℎc)/(χa) of the non-dynamical Chern-Simons gravity could be constrained to be larger than 1.9 x 10^-9 eV. For the GRACE FO mission, scheduled to be launched in 2017, the much stronger bound M CS ≥ 5 x 10^-7 eV is expected. (orig.)

  12. Signature of biased range in the non-dynamical Chern-Simons modified gravity and its measurements with satellite-satellite tracking missions: theoretical studies

    Science.gov (United States)

    Qiang, Li-E.; Xu, Peng

    2015-08-01

Having great accuracy in range and range-rate measurements, the GRACE mission and the planned GRACE follow-on mission can in principle be employed to place strong constraints on certain relativistic gravitational theories. In this paper, we work out the range observable of the non-dynamical Chern-Simons modified gravity for satellite-to-satellite tracking (SST) measurements. We find that a characteristic time-accumulating range signal appears in non-dynamical Chern-Simons gravity, which has no analogue in the standard parity-preserving metric theories of gravity. The magnitude of this Chern-Simons range signal will reach a few times χ cm for each free flight of these SST missions, where χ is the dimensionless post-Newtonian parameter of the non-dynamical Chern-Simons theory. Therefore, with the 12 years of data from the GRACE mission, one expects that the mass scale M CS of the non-dynamical Chern-Simons gravity could be constrained to be larger than 1.9 x 10^-9 eV. For the GRACE FO mission, scheduled to be launched in 2017, the much stronger bound M CS ≥ 5 x 10^-7 eV is expected.

  13. Sediment transport dynamics in the Central Himalaya: assessing during monsoon the erosion processes signature in the daily suspended load of the Narayani river

    Science.gov (United States)

    Morin, Guillaume; Lavé, Jérôme; Lanord, Christian France; Prassad Gajurel, Ananta

    2017-04-01

The evolution of mountainous landscapes is the result of competition between tectonic and erosional processes. In response to the creation of topography by tectonics, fluvial, glacial, and hillslope denudation processes erode topography, leading to rock exhumation and sediment redistribution. When trying to better document the links between climatic, tectonic, or lithologic controls in mountain range evolution, a detailed understanding of the influence of each erosion process in a given environment is fundamental. At the scale of a whole mountain range, a systematic survey and monitoring of all the geomorphologic processes at work can rapidly become difficult. An alternative approach is to study the characteristics and temporal evolution of the sediments exported out of the range. In the central Himalaya, the Narayani watershed presents contrasted lithologic, geochemical, and isotopic signatures of the outcropping rocks as well as of the erosional processes: this particular setting makes such an approach possible by partly untangling the blurring caused by spatial integration at the watershed scale. The acquisition and analysis of a new dataset on the daily suspended load concentration and geochemical characteristics at the mountain outlet of one of the largest Himalayan rivers (drainage area = 30,000 km2) bring several important results on Himalayan erosion and on climatic and process controls. 1. Based on discrete depth sampling and on daily surface sampling of suspended load, associated with flow characterization through ADCP measurements, we were first able to integrate sediment flux across a river cross-section and over time. For the year 2010 we estimate an equivalent erosion rate of 1.8 +0.35/-0.2 mm/yr, and over the last 15 years, using past sediment load records from the DHM of Nepal, an equivalent erosion rate of 1.6 +0.3/-0.2 mm/yr. These rates are in close agreement with the longer-term (~500 yrs) denudation rates of 1.7 mm
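
The conversion from an annual sediment mass flux to a basin-averaged erosion rate can be sketched as below. The flux value and rock density here are illustrative assumptions, chosen so that the result lands near the reported ~1.8 mm/yr; they are not the paper's data.

```python
# Illustrative conversion (assumed values, not the study's dataset) from an
# annual suspended sediment flux to a basin-averaged erosion rate.

def erosion_rate_mm_per_yr(mass_flux_t_per_yr, area_km2, rock_density=2700.0):
    """Mass flux (t/yr) over a drainage area (km^2) -> erosion rate (mm/yr)."""
    mass_kg = mass_flux_t_per_yr * 1000.0
    volume_m3 = mass_kg / rock_density          # rock-equivalent volume eroded
    area_m2 = area_km2 * 1e6
    return volume_m3 / area_m2 * 1000.0         # m/yr -> mm/yr

# An assumed flux of ~146 Mt/yr over the 30,000 km^2 basin gives ~1.8 mm/yr.
print(erosion_rate_mm_per_yr(146e6, 30000))
```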

  14. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  15. Practical quantum digital signature

    Science.gov (United States)

    Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing

    2016-03-01

Guaranteeing nonrepudiation, unforgeability, and transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signature (QDS) aims to provide information-theoretic security for this cryptographic task. However, to date, the previously proposed QDS protocols have been impractical due to various challenging problems and, most importantly, the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Moreover, our QDS protocol can be practically implemented over more than 100 km with the mature technology currently used in quantum key distribution.

  16. Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation

    Science.gov (United States)

    Shi, Ronghua; Ding, Wanting; Shi, Jinjing

    2018-03-01

A novel arbitrated quantum signature (AQS) scheme is proposed, motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithms are designed based on HA, which enables the scheme to rely less on computational complexity. It is unnecessary to recover the original messages when verifying signatures, since blind quantum computation is applied, which improves the simplicity and operability of our scheme. It is proved that the scheme can be deployed securely, and the extended AQS has extensive applications in e-payment systems, e-government, e-business, etc.

  17. Accuracy of 4D Flow Measurement of Cerebrospinal Fluid Dynamics in the Cervical Spine: An In Vitro Verification Against Numerical Simulation.

    Science.gov (United States)

    Heidari Pahlavian, Soroush; Bunck, Alexander C; Thyagaraj, Suraj; Giese, Daniel; Loth, Francis; Hedderich, Dennis M; Kröger, Jan Robert; Martin, Bryn A

    2016-11-01

Abnormal alterations in cerebrospinal fluid (CSF) flow are thought to play an important role in the pathophysiology of various craniospinal disorders such as hydrocephalus and Chiari malformation. Three-directional phase-contrast MRI (4D Flow) has been proposed as one method for quantification of CSF dynamics in healthy and disease states, but prior to further implementation of this technique, its accuracy in measuring CSF velocity magnitude and distribution must be evaluated. In this study, an MR-compatible experimental platform was developed based on an anatomically detailed 3D-printed model of the cervical subarachnoid space and subject-specific flow boundary conditions. The accuracy of 4D Flow measurements was assessed by comparing CSF velocities obtained within the in vitro model with the numerically predicted velocities calculated from a spatially averaged computational fluid dynamics (CFD) model based on the same geometry and flow boundary conditions. Good agreement was observed between CFD and 4D Flow in terms of spatial distribution and peak magnitude of through-plane velocities, with average differences of 7.5% and 10.6% for peak systolic and diastolic velocities, respectively. Regression analysis showed lower accuracy of the 4D Flow measurement at timeframes corresponding to low CSF flow rates, and poor correlation between CFD and 4D Flow in-plane velocities.

  18. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and a cost-sensitive classifier was found to produce better results. The system has been evaluated on a fingerprint database, and the experimental results show that it produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.
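
A verification rate like the one reported above is derived from matcher scores and a decision threshold. A minimal sketch with made-up scores (not data from the cited system):

```python
# Minimal sketch: false non-match rate (FNMR), false match rate (FMR), and
# overall accuracy from genuine/impostor matcher scores at a threshold.
# All scores below are invented for illustration.

def verification_stats(genuine, impostor, threshold):
    fnm = sum(s < threshold for s in genuine)    # genuine pairs rejected
    fm = sum(s >= threshold for s in impostor)   # impostor pairs accepted
    fnmr = fnm / len(genuine)
    fmr = fm / len(impostor)
    accuracy = 1.0 - (fnm + fm) / (len(genuine) + len(impostor))
    return {"FNMR": fnmr, "FMR": fmr, "accuracy": accuracy}

genuine = [0.91, 0.85, 0.78, 0.55, 0.93]
impostor = [0.30, 0.42, 0.61, 0.25, 0.18]
print(verification_stats(genuine, impostor, threshold=0.6))
```

Sweeping the threshold trades FNMR against FMR, which is how an operating point such as a 96% verification rate is chosen.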

  19. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent-fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio-frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  20. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
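
The verification style described here, comparing code output against an independent hand or spreadsheet calculation, can be illustrated with a much simpler model. The decay law below merely stands in for the far more involved RESRAD-BUILD pathway models; the numbers are illustrative.

```python
# Minimal example of code-vs-hand-calculation verification: a code-style
# computation of radioactive decay checked against an independently derived
# "hand" value (after one half-life, activity should be exactly half).

import math

def activity(a0, half_life, t):
    """Radioactive decay: A(t) = A0 * exp(-ln(2) * t / T_half)."""
    return a0 * math.exp(-math.log(2.0) * t / half_life)

code_value = activity(100.0, 12.32, 12.32)   # tritium T_half ~ 12.32 yr
hand_value = 50.0                             # spreadsheet-style hand result
assert abs(code_value - hand_value) / hand_value < 1e-9
print("verification passed:", code_value)
```

Real verification of a code like RESRAD-BUILD repeats this pattern for every pathway, radionuclide, and parameter combination in the test matrix.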

  1. Statistical clumped isotope signatures

    NARCIS (Netherlands)

    Röckmann, T.; Popa, M.E.; Krol, M.C.; Hofmann, M.E.G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of

  2. Signatures of the Invisible

    CERN Multimedia

    Strom, D

    2003-01-01

    On the Net it is possible to take a look at art from afar via Virtual Museums. One such exhibition was recently in the New York Museum of Modern Art's branch, PS1. Entitled 'Signatures of the Invisible' it was a collaborative effort between artists and physicists (1/2 page).

  3. Massively parallel signature sequencing.

    Science.gov (United States)

    Zhou, Daixing; Rao, Mahendra S; Walker, Roger; Khrebtukova, Irina; Haudenschild, Christian D; Miura, Takumi; Decola, Shannon; Vermaas, Eric; Moon, Keith; Vasicek, Thomas J

    2006-01-01

    Massively parallel signature sequencing is an ultra-high throughput sequencing technology. It can simultaneously sequence millions of sequence tags, and, therefore, is ideal for whole genome analysis. When applied to expression profiling, it reveals almost every transcript in the sample and provides its accurate expression level. This chapter describes the technology and its application in establishing stem cell transcriptome databases.
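
At its core, expression profiling from signature tags amounts to counting tag occurrences and normalizing. A toy sketch with invented tags (real MPSS data involves millions of reads):

```python
# Toy sketch of tag-based expression profiling: count each signature tag
# and normalize to tags per million. Tags below are made up.

from collections import Counter

def expression_profile(tags):
    counts = Counter(tags)
    total = sum(counts.values())
    return {tag: n * 1_000_000 / total for tag, n in counts.items()}

tags = ["GATC"] * 6 + ["TTAG"] * 3 + ["CCGA"] * 1
profile = expression_profile(tags)
print(profile["GATC"])   # 600000.0 tags per million
```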

  4. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation

  5. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, yet none of the earlier quantum money constructions is known to possess it

  6. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.
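
Objective verification of categorical forecasts typically starts from a contingency table of hits, misses, and false alarms. The scores below follow standard textbook definitions; the event data is made up for illustration.

```python
# Standard categorical verification scores from paired forecast/observation
# event indicators (1 = event forecast/observed). Data below is invented.

def contingency_scores(forecast, observed):
    hits = sum(f and o for f, o in zip(forecast, observed))
    misses = sum((not f) and o for f, o in zip(forecast, observed))
    false_alarms = sum(f and (not o) for f, o in zip(forecast, observed))
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    csi = hits / (hits + misses + false_alarms)   # critical success index
    return pod, far, csi

forecast = [1, 1, 0, 1, 0, 0, 1, 0]
observed = [1, 0, 0, 1, 1, 0, 1, 0]
pod, far, csi = contingency_scores(forecast, observed)
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")
```

Forecasters tend to watch POD and FAR separately, while a single summary score such as CSI is convenient for model intercomparison.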

  7. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues

  8. Identification of host response signatures of infection.

    Energy Technology Data Exchange (ETDEWEB)

    Branda, Steven S.; Sinha, Anupama; Bent, Zachary

    2013-02-01

    Biological weapons of mass destruction and emerging infectious diseases represent a serious and growing threat to our national security. Effective response to a bioattack or disease outbreak critically depends upon efficient and reliable distinguishing between infected vs healthy individuals, to enable rational use of scarce, invasive, and/or costly countermeasures (diagnostics, therapies, quarantine). Screening based on direct detection of the causative pathogen can be problematic, because culture- and probe-based assays are confounded by unanticipated pathogens (e.g., deeply diverged, engineered), and readily-accessible specimens (e.g., blood) often contain little or no pathogen, particularly at pre-symptomatic stages of disease. Thus, in addition to the pathogen itself, one would like to detect infection-specific host response signatures in the specimen, preferably ones comprised of nucleic acids (NA), which can be recovered and amplified from tiny specimens (e.g., fingerstick draws). Proof-of-concept studies have not been definitive, however, largely due to use of sub-optimal sample preparation and detection technologies. For purposes of pathogen detection, Sandia has developed novel molecular biology methods that enable selective isolation of NA unique to, or shared between, complex samples, followed by identification and quantitation via Second Generation Sequencing (SGS). The central hypothesis of the current study is that variations on this approach will support efficient identification and verification of NA-based host response signatures of infectious disease. To test this hypothesis, we re-engineered Sandia's sophisticated sample preparation pipelines, and developed new SGS data analysis tools and strategies, in order to pioneer use of SGS for identification of host NA correlating with infection. Proof-of-concept studies were carried out using specimens drawn from pathogen-infected non-human primates (NHP). This work provides a strong foundation for

  9. Starting a simple IMRT verification system with Elekta Monaco and Elekta IVIEWGT Planner

    International Nuclear Information System (INIS)

    Ayala Lazaro, R.; Garcia Hernandez, M. J.; Gomez Cores, S.; Jimenez Rojas, R.; Sendon del Rio, J. R.; Polo Cezon, R.; Gomez Calvar, R.

    2013-01-01

The use of electronic portal imaging devices (EPIDs) is considered a fast and effective way, at no added cost, to verify static or dynamic IMRT treatments. Their implementation as a verification tool, however, can be quite complicated. An easy way of setting up such a system is presented, using the method of Lee et al. and the Elekta MONACO planner. (Author)

  10. Method and computer product to increase accuracy of time-based software verification for sensor networks

    Science.gov (United States)

    Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA

    2011-01-25

A recursive verification protocol that reduces the time variance due to network delays by putting the subject node at most one hop from the verifier node provides an efficient way to test wireless sensor nodes. Since the software signatures are time based, recursive testing gives a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, testing a node twice, or not at all, can be avoided.
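
The chained, one-hop verification described in this abstract can be sketched as a graph walk in which each newly verified node becomes a verifier for its neighbors, and a failed node is routed around. The topology and the check function below are illustrative placeholders, not the patented implementation.

```python
# Sketch of chained one-hop verification: each verified node checks its
# unvisited neighbors; failed nodes are skipped and reached via other paths.
# No node is checked twice, mirroring the protocol's stated property.

def verify_network(nodes, neighbors, check):
    """Walk the network one hop at a time; return (verified, failed) sets."""
    verified, failed = set(), set()
    frontier = [nodes[0]]                 # the main verifier starts the chain
    verified.add(nodes[0])
    while frontier:
        verifier = frontier.pop()
        for nb in neighbors[verifier]:
            if nb in verified or nb in failed:
                continue                  # never test a node twice
            if check(nb):                 # e.g. a timed software signature
                verified.add(nb)          # nb becomes the next verifier
                frontier.append(nb)
            else:
                failed.add(nb)            # halt downstream; reroute around it
    return verified, failed

neighbors = {"A": ["B"], "B": ["C", "D"], "C": ["E"], "D": ["E"], "E": []}
bad_nodes = {"C"}
verified, failed = verify_network(list(neighbors), neighbors,
                                  check=lambda n: n not in bad_nodes)
print(sorted(verified), sorted(failed))
```

Here node E is still verified despite C failing, because the walk finds the alternative path through D.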

  11. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  12. Parking Space Verification

    DEFF Research Database (Denmark)

    Høg Peter Jensen, Troels; Thomsen Schmidt, Helge; Dyremose Bodin, Niels

    2018-01-01

    With the number of privately owned cars increasing, the issue of locating an available parking space becomes apparent. This paper deals with the verification of vacant parking spaces, by using a vision based system looking over parking areas. In particular the paper proposes a binary classifier...... system, based on a Convolutional Neural Network, that is capable of determining if a parking space is occupied or not. A benchmark database consisting of images captured from different parking areas, under different weather and illumination conditions, has been used to train and test the system...
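    As a sketch of the binary occupied/vacant decision, a single-feature logistic regression on synthetic data can stand in for the paper's Convolutional Neural Network; the feature (mean pixel intensity of the space) and the data are invented for illustration:

    ```python
    import math, random

    def train_logistic(samples, epochs=500, lr=0.5):
        """Minimal stand-in for a learned binary classifier: logistic
        regression via per-sample gradient ascent on one feature."""
        w, b = 0.0, 0.0
        for _ in range(epochs):
            for x, y in samples:
                p = 1.0 / (1.0 + math.exp(-(w * x + b)))
                w += lr * (y - p) * x
                b += lr * (y - p)
        return w, b

    # synthetic, hypothetical data: occupied spaces (label 1) look darker
    random.seed(0)
    data = [(random.uniform(0.0, 0.4), 1) for _ in range(50)] + \
           [(random.uniform(0.6, 1.0), 0) for _ in range(50)]
    w, b = train_logistic(data)
    occupied = lambda x: 1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5
    print(occupied(0.2), occupied(0.9))  # → True False
    ```

    A CNN replaces the hand-picked feature with learned convolutional features, but the final occupied-or-not decision is the same kind of thresholded score.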

  13. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  14. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  15. Runtime Verification of C Memory Safety

    Science.gov (United States)

    Roşu, Grigore; Schulte, Wolfram; Şerbănuţă, Traian Florin

    C is the most widely used imperative systems implementation language. While C provides types and high-level abstractions, its design goal has been to deliver the highest performance, which often requires low-level access to memory. As a consequence C supports arbitrary pointer arithmetic, casting, and explicit allocation and deallocation. These operations are difficult to use correctly, resulting in programs that often have software bugs like buffer overflows and dangling pointers that cause security vulnerabilities. We say a C program is memory safe if at runtime it never goes wrong with such a memory access error. Based on standards for writing “good” C code, this paper proposes strong memory safety as the least restrictive formal definition of memory safety amenable to runtime verification. We show that although verification of memory safety is in general undecidable, even when restricted to closed, terminating programs, runtime verification of strong memory safety is a decision procedure for this class of programs. We verify strong memory safety of a program by executing the program using a symbolic, deterministic definition of the dynamic semantics. A prototype implementation of these ideas shows the feasibility of this approach.
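    A toy version of such a checking dynamic semantics, assuming a fat-pointer representation invented here purely for illustration, might track allocations like this:

    ```python
    class MemoryViolation(Exception):
        """Raised when an access would 'go wrong' in the C sense."""

    class Heap:
        """Minimal dynamic-semantics sketch: every pointer carries its block
        id, and each access is checked against live allocations and bounds."""
        def __init__(self):
            self.blocks = {}   # block id -> size
            self.next_id = 0
        def malloc(self, size):
            self.next_id += 1
            self.blocks[self.next_id] = size
            return (self.next_id, 0)            # pointer = (block, offset)
        def free(self, ptr):
            block, _ = ptr
            if block not in self.blocks:
                raise MemoryViolation("double free or invalid free")
            del self.blocks[block]
        def load(self, ptr):
            block, off = ptr
            if block not in self.blocks:
                raise MemoryViolation("dangling pointer dereference")
            if not 0 <= off < self.blocks[block]:
                raise MemoryViolation("out-of-bounds access")

    heap = Heap()
    p = heap.malloc(4)
    heap.load((p[0], 3))        # in bounds: ok
    heap.free(p)
    try:
        heap.load(p)            # use after free is detected at runtime
    except MemoryViolation as e:
        print(e)                # → dangling pointer dereference
    ```

    The paper's approach is richer (symbolic execution of the full C semantics), but the core idea is the same: the instrumented semantics makes every memory error observable at the moment it happens.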

  16. Quantum blind signature based on Two-State Vector Formalism

    Science.gov (United States)

    Qi, Su; Zheng, Huang; Qiaoyan, Wen; Wenmin, Li

    2010-11-01

    Two-State Vector Formalism (TSVF), which includes pre- and postselected states, is a complete description of a system between two measurements. Consequently, TSVF gives a perfect solution to the Mean King problem. In this paper, exploiting the dramatic correlation in the verification, we propose a quantum blind signature scheme based on TSVF. Compared with Wen's scheme, our scheme has 100% efficiency and guarantees unconditional security. Moreover, the proposed scheme is easy to implement and can be applied to E-payment systems.

  17. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or a Quality Assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use.
The functional validation can

  18. Signatures of topological superconductivity

    Energy Technology Data Exchange (ETDEWEB)

    Peng, Yang

    2017-07-19

    The prediction and experimental discovery of topological insulators brought the importance of topology in condensed matter physics into the limelight. Topology hence acts as a new dimension along which more and more new states of matter start to emerge. One of these topological states of matter, namely topological superconductors, comes into focus because of their gapless excitations. These gapless excitations, especially in one-dimensional topological superconductors, are Majorana zero modes localized at the ends of the superconductor and exhibit exotic non-abelian statistics, which can potentially be applied to fault-tolerant quantum computation. Given their highly interesting physical properties and potential applications to quantum computation, both theorists and experimentalists have spent great effort to realize topological superconductors and to detect Majoranas. In two projects within this thesis, we investigate the properties of Majorana zero modes in realistic materials which are absent in simple theoretical models. We find that the superconducting proximity effect, an essential ingredient in all existing platforms for topological superconductors, plays a significant role in determining the localization property of the Majoranas. Strong proximity coupling between the normal system and the superconducting substrate can lead to strongly localized Majoranas, which can explain the observation in a recent experiment. Motivated by experiments in Molenkamp's group, we also look at realistic quantum spin Hall Josephson junctions, in which charge puddles acting as magnetic impurities are coupled to the helical edge states. We find that with this setup, the junction generically realizes an exotic 8π-periodic Josephson effect, which is absent in a pristine Josephson junction. In another two projects, we propose more pronounced signatures of Majoranas that are accessible with current experimental techniques. The first one is a transport measurement, which uses

  19. Signature CERN-URSS

    CERN Document Server

    Jentschke,W

    1975-01-01

    Director-General W. Jentschke welcomes the assembly and the guests for the signing of the protocol between CERN and the USSR, which is an important event. It was in 1955 that 55 Soviet visitors came to CERN for the first time. The first CERN Director-General, F. Bloch, and Mr. Amaldi are also present. While W. Jentschke's speech in English is translated into Russian, Mr. Morozov's speech in Russian is translated into English.

  20. Modeling ground vehicle acoustic signatures for analysis and synthesis

    International Nuclear Information System (INIS)

    Haschke, G.; Stanfield, R.

    1995-01-01

    Security and weapon systems use acoustic sensor signals to classify and identify moving ground vehicles. Developing robust signal processing algorithms for this is expensive, particularly in the presence of acoustic clutter or countermeasures. This paper proposes a parametric ground vehicle acoustic signature model to aid the system designer in understanding which signature features are important, developing corresponding feature extraction algorithms, and generating low-cost, high-fidelity synthetic signatures for testing. The authors have proposed computer-generated acoustic signatures of armored, tracked ground vehicles to deceive acoustic-sensored smart munitions. They have developed quantitative measures of how accurately a synthetic acoustic signature matches those produced by actual vehicles. This paper describes parameters of the model used to generate these synthetic signatures and suggests methods for extracting these parameters from signatures of valid vehicle encounters. The model incorporates wide-bandwidth and narrow-bandwidth components that are modulated in a pseudo-random fashion to mimic the time dynamics of valid vehicle signatures. Narrow-bandwidth feature extraction techniques estimate frequency, amplitude and phase information contained in a single set of narrow frequency-band harmonics. Wide-bandwidth feature extraction techniques estimate parameters of a correlated-noise-floor model. Finally, the authors propose a method of modeling the time dynamics of the harmonic amplitudes as a means of adding necessary time-varying features to the narrow-bandwidth signal components. The authors present results of applying this modeling technique to acoustic signatures recorded during encounters with one armored, tracked vehicle. Similar modeling techniques can be applied to security systems
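    A minimal sketch of the harmonic-plus-noise structure the model describes: a decaying harmonic series of an engine/track fundamental with slow pseudo-random amplitude modulation, on top of a noise floor. All parameter values are illustrative, not taken from the paper (and the noise here is uncorrelated, unlike the paper's correlated-noise-floor model):

    ```python
    import math, random

    def synth_signature(duration=1.0, rate=1000, f0=60.0, harmonics=5, seed=1):
        """Generate a synthetic vehicle-like acoustic signature:
        harmonics of f0 with pseudo-randomly modulated amplitudes
        plus a small broadband noise floor."""
        rng = random.Random(seed)
        amps = [1.0 / (k + 1) for k in range(harmonics)]  # decaying series
        signal = []
        for n in range(int(duration * rate)):
            t = n / rate
            if n % 100 == 0:  # slow pseudo-random amplitude modulation
                mods = [1.0 + 0.3 * (rng.random() - 0.5) for _ in amps]
            s = sum(a * m * math.sin(2 * math.pi * f0 * (k + 1) * t)
                    for k, (a, m) in enumerate(zip(amps, mods)))
            s += 0.05 * (rng.random() - 0.5)              # noise floor
            signal.append(s)
        return signal

    sig = synth_signature()
    print(len(sig))  # → 1000 samples of synthetic signature
    ```

    Feature extraction would run the inverse direction: estimate the harmonic frequencies, amplitudes, and modulation statistics from recorded encounters.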

  1. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.

  2. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  3. Signature Evaluation Tool (SET: a Java-based tool to evaluate and visualize the sample discrimination abilities of gene expression signatures

    Directory of Open Access Journals (Sweden)

    Lin Chi-Hung

    2008-01-01

    Full Text Available Abstract Background The identification of specific gene expression signatures for distinguishing sample groups is a dominant field in cancer research. Although a number of tools have been developed to identify optimal gene expression signatures, the number of signature genes obtained is often too large to be applied clinically. Furthermore, experimental verification is sometimes limited by the availability of wet-lab materials such as antibodies and reagents. A tool to evaluate the discrimination power of candidate genes is therefore in high demand by clinical researchers. Results Signature Evaluation Tool (SET) is a Java-based tool adopting Golub's weighted voting algorithm and incorporating a visual presentation of the prediction strength for each array sample. SET provides a flexible and easy-to-follow platform to evaluate the discrimination power of a gene signature. Here, we demonstrate the application of SET for several purposes: (1) for signatures consisting of a large number of genes, SET offers the ability to rapidly narrow down the number of genes; (2) for a given signature (from third-party analyses or user-defined), SET can re-evaluate and re-adjust its discrimination power by selecting/de-selecting genes repeatedly; (3) for multiple microarray datasets, SET can evaluate the classification capability of a signature among datasets; and (4) by providing a module to visualize the prediction strength for each sample, SET allows users to re-evaluate the discrimination power on mis-grouped or less-certain samples. Information obtained from the above applications could be useful in prognostic analyses or clinical management decisions. Conclusion Here we present SET to evaluate and visualize the sample-discrimination ability of a given gene expression signature. This tool provides a filtration function for signature identification and lies between clinical analyses and class prediction (or feature selection) tools. The simplicity
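    The Golub et al.-style weighted voting rule that SET adopts can be sketched as follows: each gene votes with a signal-to-noise weight, and the sign of the summed votes gives the predicted class. The expression values below are made up for illustration:

    ```python
    import statistics

    def golub_weighted_vote(train_a, train_b, sample):
        """Weighted voting: each gene g votes with weight
        w = (mu_a - mu_b) / (sd_a + sd_b) and vote
        v = w * (x_g - (mu_a + mu_b) / 2). Positive total -> class A."""
        total = 0.0
        for g in range(len(sample)):
            a = [s[g] for s in train_a]
            b = [s[g] for s in train_b]
            mu_a, mu_b = statistics.mean(a), statistics.mean(b)
            sd_a, sd_b = statistics.stdev(a), statistics.stdev(b)
            w = (mu_a - mu_b) / (sd_a + sd_b)
            total += w * (sample[g] - (mu_a + mu_b) / 2)
        return "A" if total > 0 else "B"

    # two genes, three training arrays per class (values are invented)
    class_a = [[5.0, 1.0], [5.5, 1.2], [4.8, 0.9]]
    class_b = [[1.0, 4.0], [1.2, 4.4], [0.8, 3.9]]
    print(golub_weighted_vote(class_a, class_b, [5.1, 1.0]))  # → A
    print(golub_weighted_vote(class_a, class_b, [1.1, 4.2]))  # → B
    ```

    SET's "prediction strength" visualization is essentially a per-sample view of how decisively this vote total separates the two classes.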

  4. Identity-based key-insulated aggregate signature scheme

    Directory of Open Access Journals (Sweden)

    P. Vasudeva Reddy

    2017-07-01

    Full Text Available Private key exposure can be the most devastating attack on cryptographic schemes, since such exposure leads to the breakage of the security of the scheme as a whole. In real-world scenarios, this problem is perhaps the biggest threat to cryptography. The threat is increasing as users operate low-computational-power devices (e.g. mobile devices) which hold the corresponding private keys for generating signatures. To reduce the damage caused by the key exposure problem in aggregate signatures and preserve the benefits of identity-based (ID-based) cryptography, we hereby propose the first key-insulated aggregate signature scheme in the ID-based setting. In this scheme the leakage of temporary private keys does not compromise the security of the remaining time periods. The security of our scheme is proven in the random oracle paradigm under the assumption that the Computational Diffie–Hellman (CDH) problem is intractable. The proposed scheme allows efficient verification with a constant signature size, independent of the number of signers.

  5. Search for signatures in miRNAs associated with cancer

    Science.gov (United States)

    Kothandan, Ram; Biswas, Sumit

    2013-01-01

    Since the first discovery in the early 1990s, the predicted and validated population of microRNAs (miRNAs or miRs) has grown significantly. These small (~22 nucleotides long) regulators of gene expression have been implicated in and associated with several genes in the cancer pathway as well. Globally, the identification and verification of microRNAs as biomarkers for cancer cell types has been the main thrust for most miRNA biologists. However, there has been a noticeable vacuum when it comes to identifying a common signature or trademark that could be used to demarcate a miR as associated with the development or suppression of cancer. To answer these queries, we report an in silico study involving the identification of global signatures in experimentally validated microRNAs which have been associated with cancer. This study has thrown light on the presence of significant common signatures, viz. sequential and hybridization signatures, which may distinguish a miR to be associated with cancer. Based on our analysis, we suggest the utility of such signatures in the design and development of algorithms for prediction of miRs involved in the cancer pathway. PMID:23861569
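    A hypothetical sketch of what a sequence-signature scan can look like: count shared k-mers in the canonical seed region (positions 2-7) across a set of miRNA sequences and report the most common as candidate signatures. The first sequence is the validated onco-miR miR-21; the second is a variant invented here to share its seed; this is not the paper's actual method:

    ```python
    from collections import Counter

    def seed_signature(mirnas, k=6):
        """Count k-mer seed motifs (positions 2..k+1, i.e. the canonical
        miRNA seed region) across a set of sequences and rank them."""
        counts = Counter(seq[1:1 + k] for seq in mirnas if len(seq) >= 1 + k)
        return counts.most_common()

    mirs = ["UAGCUUAUCAGACUGAUGUUGA",   # hsa-miR-21-5p (validated onco-miR)
            "UAGCUUAUCAGGCUGAUGUUGA",   # hypothetical variant, same seed
            "UAAGGCACGCGGUGAAUGCC"]     # hsa-miR-124-3p, different seed
    print(seed_signature(mirs)[0])      # → ('AGCUUA', 2)
    ```

    A real analysis would combine such sequential motifs with hybridization (secondary-structure) features, as the abstract indicates.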

  6. Phenotypic signatures arising from unbalanced bacterial growth.

    Science.gov (United States)

    Tan, Cheemeng; Smith, Robert Phillip; Tsai, Ming-Chi; Schwartz, Russell; You, Lingchong

    2014-08-01

    Fluctuations in the growth rate of a bacterial culture during unbalanced growth are generally considered undesirable in quantitative studies of bacterial physiology. Under well-controlled experimental conditions, however, these fluctuations are not random but instead reflect the interplay between intra-cellular networks underlying bacterial growth and the growth environment. Therefore, these fluctuations could be considered quantitative phenotypes of the bacteria under a specific growth condition. Here, we present a method to identify "phenotypic signatures" by time-frequency analysis of unbalanced growth curves measured with high temporal resolution. The signatures are then applied to differentiate amongst different bacterial strains or the same strain under different growth conditions, and to identify the essential architecture of the gene network underlying the observed growth dynamics. Our method has implications for both basic understanding of bacterial physiology and for the classification of bacterial strains.
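    A minimal stand-in for the time-frequency step, assuming a plain DFT of a detrended growth-rate series rather than the paper's richer analysis: the dominant fluctuation frequency is the kind of feature a "phenotypic signature" could be built from.

    ```python
    import cmath, math

    def dominant_frequency(samples, dt):
        """Return the frequency (Hz) of the largest non-DC DFT bin of a
        mean-removed time series sampled every dt seconds."""
        n = len(samples)
        mean = sum(samples) / n
        x = [s - mean for s in samples]             # detrend (remove DC)
        spectrum = [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                            for t in range(n))) for k in range(n // 2)]
        k_peak = max(range(1, n // 2), key=spectrum.__getitem__)
        return k_peak / (n * dt)

    # synthetic growth-rate fluctuation at 0.05 Hz sampled every 2 s
    series = [0.8 + 0.1 * math.sin(2 * math.pi * 0.05 * t * 2.0)
              for t in range(100)]
    print(dominant_frequency(series, dt=2.0))  # → 0.05
    ```

    Comparing such spectral features between strains or growth conditions is one simple way to turn growth-curve fluctuations into a quantitative phenotype.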

  7. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  8. Electronic Signature (eSig)

    Data.gov (United States)

    Department of Veterans Affairs — Beginning with the Government Paperwork Elimination Act of 1998 (GPEA), the Federal government has encouraged the use of electronic / digital signatures to enable...

  9. Expressiveness considerations of XML signatures

    DEFF Research Database (Denmark)

    Jensen, Meiko; Meyer, Christopher

    2011-01-01

    XML Signatures are used to protect XML-based Web Service communication against a broad range of attacks related to man-in-the-middle scenarios. However, due to the complexity of the Web Services specification landscape, the task of applying XML Signatures in a robust and reliable manner becomes...... more and more challenging. In this paper, we investigate this issue, describing how an attacker can still interfere with Web Services communication even in the presence of XML Signatures. Additionally, we discuss the interrelation of XML Signatures and XML Encryption, focussing on their security...

  10. Electronic Warfare Signature Measurement Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Electronic Warfare Signature Measurement Facility contains specialized mobile spectral, radiometric, and imaging measurement systems to characterize ultraviolet,...

  11. Infra-sound Signature of Lightning

    Science.gov (United States)

    Arechiga, R. O.; Badillo, E.; Johnson, J.; Edens, H. E.; Rison, W.; Thomas, R. J.

    2012-12-01

    We have analyzed thunder from over 200 lightning flashes to determine which part of thunder comes from the gas dynamic expansion of portions of the rapidly heated lightning channel and which from electrostatic field changes. Thunder signals were recorded by a ~1500 m network of 3 to 4 four-element microphone arrays deployed in the Magdalena mountains of New Mexico in the summers of 2011 and 2012. The higher-frequency infra-sound and audio-range portion of thunder is thought to come from the gas dynamic expansion, while the electrostatic mechanism gives rise to a signature infra-sound pulse peaked at a few Hz. More than 50 signature infra-sound pulses were observed in different portions of the thunder signal, with no preference towards the beginning or the end of the signal. Detection of the signature pulse occurs sometimes only for one array and sometimes for several arrays, which agrees with the theory that the pulse is highly directional (i.e., the recordings have to be in a specific position with respect to the cloud generating the pulse to be able to detect it). The detection of these pulses under quiet wind conditions by different acoustic arrays corroborates the electrostatic mechanism originally proposed by Wilson [1920], further studied by Dessler [1973] and Few [1985], observed by Bohannon [1983] and Balachandran [1979, 1983], and recently analyzed by Pasko [2009]. Pasko employed a model to explain the electrostatic-to-acoustic energy conversion and the initial compression waves in observed infrasonic pulses, which agrees with the observations we have made. We present thunder samples that exhibit signature infra-sound pulses at different times, and acoustic source reconstruction to demonstrate the beaming effect.
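    One way to isolate a few-Hz signature pulse from an audio-band thunder record is to low-pass the signal and threshold the residual; the detector below is a sketch with invented parameters, not the study's actual processing:

    ```python
    import math

    def moving_average(x, w):
        """Crude low-pass filter: w-sample moving average."""
        return [sum(x[i:i + w]) / w for i in range(len(x) - w + 1)]

    def detect_infrasound_pulse(signal, rate, window_s=0.25, k=2.0):
        """Flag low-passed samples exceeding k times the low-passed RMS;
        the averaging window suppresses the audio-band thunder and keeps
        the few-Hz electrostatic pulse."""
        w = max(1, int(window_s * rate))
        low = moving_average(signal, w)
        rms = math.sqrt(sum(v * v for v in low) / len(low))
        return [i for i, v in enumerate(low) if abs(v) > k * rms]

    # synthetic record: 40 Hz "thunder" plus one slow infrasonic pulse
    rate = 200
    sig = [0.2 * math.sin(2 * math.pi * 40 * t / rate) for t in range(4 * rate)]
    for t in range(300, 500):  # 1 s half-cycle pulse around sample 400
        sig[t] += 1.0 * math.sin(math.pi * (t - 300) / 200)
    hits = detect_infrasound_pulse(sig, rate)
    print(len(hits) > 0 and 250 < hits[0] < 500)  # → True
    ```

    In practice a proper band-pass filter and wind-noise rejection would replace the moving average, and array timing would give the pulse direction.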

  12. Six years of experience in the planning and verification of the IMRT dynamics with portal dosimetry; Seis anos de expereincia en la planificacion y verificacion de la IMRT dinamica con portal dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Molina Lopez, M. Y.; Pardo Perez, E.; Ruiz Maqueda, S.; Castro Novais, J.; Diaz Gavela, A. A.

    2013-07-01

    The objective of this study is to review the method of verification of IMRT over the 6 years of operation of the radiophysics and radiation protection service, analyzing the evaluation parameters of each field for the 718 IMRT treatments verified during this period. (Author)

  13. Signature change events: a challenge for quantum gravity?

    International Nuclear Information System (INIS)

    White, Angela; Weinfurtner, Silke; Visser, Matt

    2010-01-01

    Within the framework of either Euclidean (functional integral) quantum gravity or canonical general relativity the signature of the manifold is a priori unconstrained. Furthermore, recent developments in the emergent spacetime programme have led to a physically feasible implementation of (analogue) signature change events. This suggests that it is time to revisit the sometimes controversial topic of signature change in general relativity. Specifically, we shall focus on the behaviour of a quantum field defined on a manifold containing regions of different signature. We emphasize that regardless of the underlying classical theory, there are severe problems associated with any quantum field theory residing on a signature-changing background. (Such as the production of what is naively an infinite number of particles, with an infinite energy density.) We show how the problem of quantum fields exposed to finite regions of Euclidean-signature (Riemannian) geometry has similarities with the quantum barrier penetration problem. Finally we raise the question as to whether signature change transitions could be fully understood and dynamically generated within (modified) classical general relativity, or whether they require the knowledge of a theory of quantum gravity.

  14. Electromagnetic Signature Technique as a Promising Tool to Verify Nuclear Weapons Storage and Dismantlement under a Nuclear Arms Control Regime

    Energy Technology Data Exchange (ETDEWEB)

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.; Ramuhalli, Pradeep

    2012-08-01

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
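    The "yes/no" attribute check could be sketched as normalized correlation of a measured low-frequency electromagnetic response against a stored reference template, with only the boolean result leaving the information barrier. The signature vectors and threshold below are illustrative only:

    ```python
    import math

    def correlate(a, b):
        """Normalized (Pearson) cross-correlation of two equal-length
        signature vectors; 1.0 means identical shape."""
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                        sum((y - mb) ** 2 for y in b))
        return num / den

    def present(measured, template, threshold=0.95):
        """Yes/no decision: does the measured container response match the
        reference template? Only this boolean is reported."""
        return correlate(measured, template) >= threshold

    template = [1.0, 0.8, 0.6, 0.9, 0.4, 0.2]        # reference response
    match    = [1.02, 0.79, 0.61, 0.88, 0.41, 0.19]  # same item, small noise
    other    = [0.2, 0.9, 0.1, 0.7, 0.9, 0.3]        # different contents
    print(present(match, template), present(other, template))  # → True False
    ```

    Reporting only the thresholded result, rather than the raw signature, is what lets such a technique avoid revealing sensitive design attributes.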

  15. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  16. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
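    A standard quantitative ingredient of such verification analysis is the observed order of convergence from grid refinement (Richardson-style); a common acceptance criterion is that it match the scheme's formal order. Here is a minimal worked example with a manufactured second-order error:

    ```python
    import math

    def observed_order(f_coarse, f_medium, f_fine, r):
        """Observed order of accuracy p from three solutions on grids
        refined by a constant ratio r: p = ln(e_cm / e_mf) / ln(r)."""
        return (math.log((f_coarse - f_medium) / (f_medium - f_fine))
                / math.log(r))

    # manufactured example: f(h) = f_exact + C*h^2 on grids h, h/2, h/4
    f = lambda h: 1.0 + 0.3 * h ** 2
    p = observed_order(f(0.4), f(0.2), f(0.1), r=2.0)
    print(round(p, 6))  # → 2.0, matching the manufactured h^2 error
    ```

    In a real study f(h) would be a computed functional (e.g. a norm of the error against the benchmark solution) at three grid resolutions.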

  17. Multisensors signature prediction workbench

    Science.gov (United States)

    Latger, Jean; Cathala, Thierry

    2015-10-01

    Guidance of weapon systems relies on sensors to analyze target signatures. Defense weapon systems also need to detect and then identify threats, again using sensors. Sensor performance is very dependent on conditions, e.g. time of day, atmospheric propagation, background ... Visible cameras are very efficient in diurnal fine-weather conditions, long-wave infrared sensors for night vision, and radar systems for seeing through the atmosphere and/or foliage ... Besides, multi-sensor systems, combining several collocated sensors with associated fusion algorithms, provide better efficiency (typically for Enhanced Vision Systems). But these sophisticated systems are all the more difficult to conceive, assess and qualify. In that frame, multi-sensor simulation is highly required. This paper focuses on multi-sensor simulation tools. A first part makes a state of the art of such simulation workbenches with a special focus on SE-Workbench. SE-Workbench is described with regard to infrared/EO sensors, millimeter-wave sensors, active EO sensors and GNSS sensors. Then a general overview of the objectives of simulating target and background signatures is presented, depending on the type of simulation required (parametric studies, open-loop simulation, closed-loop simulation, hybridization of SW simulation and HW ...). After the objective review, the paper presents some basic requirements for simulation implementation, such as the deterministic behavior of simulation, mandatory for repeating it many times in parametric studies... Several technical topics are then discussed, such as the rendering technique (ray tracing vs. rasterization), the implementation (CPU vs. GPGPU) and the tradeoff between physical accuracy and computational performance. Examples of results using SE-Workbench are shown and commented on.

  18. Independent Verification Survey Report For Zone 1 Of The East Tennessee Technology Park In Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    King, David A.

    2012-01-01

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. IV activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).

  19. Signatures de l'invisible

    CERN Multimedia

    CERN Press Office. Geneva

    2000-01-01

    "Signatures of the Invisible" is a unique collaboration between contemporary artists and contemporary physicists which has the potential to help redefine the relationship between science and art. "Signatures of the Invisible" is jointly organised by the London Institute - the world's largest college of art and design - and CERN*, the world's leading particle physics laboratory. 12 leading visual artists:

  20. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. We wish, on the one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, we wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at Idaho National Engineering Laboratory, this system helps safeguard personnel. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur
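The abstract describes camera-driven change detection only in general terms. As a hedged illustration (not the EIVSystem's actual algorithm), frame differencing against a reference image is a minimal way to flag changed vault regions; the function name, toy frames, and threshold below are all assumptions:

```python
# Hypothetical sketch of frame-differencing change detection, the kind of
# image-processing step the EIVSystem abstract alludes to. Names and the
# threshold are illustrative, not taken from the actual system.

def changed_regions(reference, current, threshold=30):
    """Return (row, col) pixels whose intensity shifted by more than threshold."""
    changed = []
    for r, (ref_row, cur_row) in enumerate(zip(reference, current)):
        for c, (ref_px, cur_px) in enumerate(zip(ref_row, cur_row)):
            if abs(cur_px - ref_px) > threshold:
                changed.append((r, c))
    return changed

# A vault snapshot before and after an object is moved (toy 3x3 grayscale frames).
before = [[10, 10, 10],
          [10, 200, 10],
          [10, 10, 10]]
after  = [[10, 10, 10],
          [10, 10, 10],
          [10, 10, 200]]

moves = changed_regions(before, after)
print(moves)  # [(1, 1), (2, 2)]: the object left (1, 1) and appeared at (2, 2)
```

A production system would add image registration, lighting compensation, and region aggregation before declaring an inventory change.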

  1. Verification of classified fissile material using unclassified attributes

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-01-01

    This paper reports on the most recent efforts of US technical experts to explore verification by IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. They believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated

  2. Statistical clumped isotope signatures

    Science.gov (United States)

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168
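The referencing convention described above can be made concrete with a toy calculation, assuming a symmetric molecule with two indistinguishable atoms whose partners are drawn by random pairing from two isotopically distinct pools (the function name and the numbers are illustrative, not taken from the paper):

```python
# Toy illustration of a statistical clumped-isotope anomaly for a symmetric
# molecule X2 whose two indistinguishable atoms come from pools with
# different heavy-isotope fractions. All values are invented for illustration.

def statistical_anomaly(f1, f2):
    """Apparent clumping anomaly relative to the stochastic reference.

    f1, f2: heavy-isotope fractions of the two atom pools. The stochastic
    reference is built from the bulk (average) composition, so combining
    isotopically unequal pools yields an apparent negative anomaly.
    """
    actual_clumped = f1 * f2        # doubly substituted fraction, random pairing
    bulk = 0.5 * (f1 + f2)          # bulk composition of the molecule
    stochastic = bulk ** 2          # conventional stochastic reference
    return actual_clumped / stochastic - 1.0

print(statistical_anomaly(0.010, 0.010))  # 0.0: identical pools, no anomaly
print(statistical_anomaly(0.008, 0.012))  # ~ -0.04: anti-clumping from mixing
```

Since the product f1*f2 never exceeds the squared average ((f1+f2)/2)^2, the anomaly is zero only for identical pools and negative otherwise, which is exactly the anti-clumping effect the abstract describes; its size grows with the difference between the pool ratios.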

  3. Quantum Digital Signatures for Unconditional Safe Authenticity Protection of Medical Documentation

    Directory of Open Access Journals (Sweden)

    Arkadiusz Liber

    2015-12-01

    Full Text Available Modern medical documentation appears most often in an online form, which requires digital methods to ensure its confidentiality, integrity and authenticity. Document authenticity may be secured with the use of a signature. A classical handwritten signature is directly related to its owner by his/her psychomotor character traits. Such a signature is also connected with the material it is written on, and with a writing tool. Because of these properties, a handwritten signature reflects certain close material bonds between the owner and the document. In the case of modern digital signatures, document authentication has a mathematical nature: verification of authenticity becomes verification of a key instead of a human. Since 1994 it has been known that classical digital signature algorithms may not be safe because of Shor's factorization algorithm. To implement modern authenticity protection of medical data, new types of algorithms should be used. One group of such algorithms is based on quantum computations. In this paper, the current state of knowledge of Quantum Digital Signature (QDS) protocols is analyzed, with their basic principles, phases and common elements such as transmission, comparison and encryption. Some of the most promising protocols for signing digital medical documentation that fulfill the requirements for QDS are also briefly described. We show that a QDS protocol with QKD components requires, for its implementation, equipment similar to that used for QKD, which is already commercially available. If properly implemented, it provides the shortest lifetime of qubits in comparison to other protocols. It can be used not only to sign classical messages but could likely also be adapted to implement unconditionally safe protection of medical documentation in the near future.

  4. Identifying, Visualizing, and Fusing Social Media Data to Support Nonproliferation and Arms Control Treaty Verification: Preliminary Results

    Energy Technology Data Exchange (ETDEWEB)

    Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.; Kreyling, Sean J.; Henry, Michael J.; Corley, Courtney D.; Whattam, Kevin M.

    2013-07-11

    While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.

  5. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  6. Password-based digital signatures

    OpenAIRE

    Sivagnanasuntharam, Sangeepan

    2013-01-01

    This thesis is about implementing a digital signature scheme proposed by associate professor Kristian Gjøsteen and Oystein Thuen.The thesis explains the implementation, the challenges met and a security assessment of the implementation.

  7. Initial Semantics for Strengthened Signatures

    Directory of Open Access Journals (Sweden)

    André Hirschowitz

    2012-02-01

    Full Text Available We give a new general definition of arity, yielding the companion notions of signature and associated syntax. This setting is modular in the sense requested by Ghani and Uustalu: merging two extensions of syntax corresponds to building an amalgamated sum. These signatures are too general in the sense that we are not able to prove the existence of an associated syntax in this general context. So we have to select arities and signatures for which there exists the desired initial monad. For this, we follow a track opened by Matthes and Uustalu: we introduce a notion of strengthened arity and prove that the corresponding signatures have initial semantics (i.e. associated syntax). Our strengthened arities admit colimits, which allows the treatment of the λ-calculus with explicit substitution.

  8. Authoring and verification of clinical guidelines: a model driven approach.

    Science.gov (United States)

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked and verified against these specifications, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, in particular, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process.
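The statechart-to-model-checker pipeline described above can be sketched in miniature. The toy explicit-state check below stands in for the Spin-style model checker; the guideline statechart, its states, and the invariant are invented for illustration:

```python
# Toy explicit-state "model checking" of an invented guideline statechart:
# enumerate reachable states and require a property to hold in all of them.
from collections import deque

# Invented guideline statechart: state -> set of successor states.
TRANSITIONS = {
    "admitted":   {"assessed"},
    "assessed":   {"treated", "discharged"},
    "treated":    {"discharged", "treated"},
    "discharged": set(),
    "untriaged":  {"discharged"},   # modeling error: this state is never reached
}

def reachable(initial, transitions):
    """Breadth-first exploration of all states reachable from the initial state."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        for nxt in transitions[frontier.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

def check_invariant(initial, transitions, holds):
    """Model checking in miniature: the property must hold in every reachable state."""
    return all(holds(state) for state in reachable(initial, transitions))

# Invented safety property: no patient may ever be in the "untriaged" state.
print(check_invariant("admitted", TRANSITIONS, lambda s: s != "untriaged"))  # True
```

Real model checkers such as Spin also verify temporal-logic properties over execution paths and return counterexample traces, not just a boolean.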

  9. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
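The approach above (rules as Horn clauses, facts accumulating as the trace expands, a single-state monitor with a fixed rule set) can be roughly sketched as follows; the event names and rules are invented stand-ins, not the paper's FLTL encoding:

```python
# Minimal forward-chaining runtime monitor in the spirit of the rule-based
# approach described above. Rules are Horn clauses (premises -> conclusion);
# facts accumulate as the trace expands. All rule/event names are invented.

RULES = [
    ({"request"}, "pending"),
    ({"pending", "grant"}, "served"),
    ({"pending", "timeout"}, "violation"),   # a pending request must not time out
]

def chain(facts, rules):
    """Apply forward chaining until no new fact can be derived (a fixpoint)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def monitor(trace, rules=RULES):
    """Feed events one at a time; report at the first derived violation."""
    facts = set()
    for i, event in enumerate(trace):
        facts.add(event)
        facts = chain(facts, rules)
        if "violation" in facts:
            return f"violation at event {i}"
    return "ok"

print(monitor(["request", "grant"]))    # ok
print(monitor(["request", "timeout"]))  # violation at event 1
```

Because the monitor only ever grows a flat fact set under a fixed rule set, it avoids the branching structure of tableau-based monitors, which is the scalability point the abstract makes.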

  10. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, the deployment of the IMRT planning system CORVUS 6.0 and the MIMIC device (multilamellar intensity modulated collimator), and the overall process of verifying the plan created. The aim of verification is, in particular, good control of the MIMIC functions and evaluation of the overall reliability of IMRT planning. (author)

  11. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; and how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavioral characteristics

  12. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  13. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    OpenAIRE

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of Spin tool and includes automatic Promela model construction, conversion of LTL-formula to Spin format and counterexamples in terms of automata. Interactive verification gives the possibility to decrease verification time and increase the maxi...

  14. Nonlinear region of attraction analysis for hypersonic flight vehicles’ flight control verification

    OpenAIRE

    Jie Chen; Cun Bao Ma; Dong Song

    2017-01-01

    The stability analysis method based on region of attraction is proposed for the hypersonic flight vehicles’ flight control verification in this article. Current practice for hypersonic flight vehicles’ flight control verification is largely dependent on linear theoretical analysis and nonlinear simulation research. This problem can be improved by the nonlinear stability analysis of flight control system. Firstly, the hypersonic flight vehicles’ flight dynamic model is simplified and fitted by...

  15. Implementation of QR Code and Digital Signature to Determine the Validity of KRS and KHS Documents

    Directory of Open Access Journals (Sweden)

    Fatich Fazlur Rochman

    2017-05-01

    Full Text Available Universitas Airlangga students often find it difficult to verify the marks that appear in the Kartu Hasil Studi (KHS, Study Result Card) or the courses taken in the Kartu Rencana Studi (KRS, Study Plan Card) if there are changes to the data in the system used by Universitas Airlangga. This complicated KRS and KHS verification process arises because the KRS and KHS documents owned by students are easier to counterfeit than the data in the system. Implementing digital signature and QR Code technology is a solution that can prove the validity of a KRS or KHS. The KRS and KHS validation system was developed with a Digital Signature and a QR Code. A QR Code is a type of matrix code developed to allow its contents to be decoded at high speed, while a Digital Signature functions as a marker on the data to ensure that the data is the original data. The verification process is divided into two types: reading the Digital Signature, and scanning the printed document, which works by reading the data from the QR Code. Applying the system involved adding the QR Code to the KRS and KHS, and required a readiness of human resources.
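The sign-then-embed flow described above can be sketched roughly as follows. A real deployment would use an asymmetric digital signature (e.g. RSA or ECDSA) and render the payload as an actual QR Code image; here an HMAC stands in for the signing primitive and the QR step is represented by the encoded payload string, purely to keep the sketch dependency-free. All names, the record fields, and the key are illustrative:

```python
# Hedged sketch of "sign a record, embed the signed payload, verify on scan".
# HMAC stands in for a true digital signature; the base64 payload stands in
# for the QR Code contents. Everything here is an illustrative assumption.
import base64
import hashlib
import hmac
import json

SECRET_KEY = b"registrar-signing-key"   # assumption: held by the issuing office

def issue_document(record: dict) -> str:
    """Sign a KRS/KHS record and return the payload a QR Code would carry."""
    body = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, body, hashlib.sha256).digest()
    return base64.b64encode(body + b"." + base64.b64encode(tag)).decode()

def verify_document(payload: str) -> bool:
    """Decode a scanned payload and check the signature against the record."""
    raw = base64.b64decode(payload)
    body, _, tag_b64 = raw.rpartition(b".")
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).digest()
    return hmac.compare_digest(base64.b64decode(tag_b64), expected)

payload = issue_document({"student": "X123", "course": "CS101", "grade": "A"})
print(verify_document(payload))   # True: the scanned document is authentic

# Tampering with the signed record (changing the grade) breaks verification.
raw = base64.b64decode(payload)
forged = base64.b64encode(raw.replace(b'"A"', b'"E"')).decode()
print(verify_document(forged))    # False
```

The design point the abstract relies on is the same either way: the printed document carries a payload that only the issuer could have produced, so a scan can be checked without consulting the university's live database.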

  16. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    Today formal verification is finding increasing acceptance in some areas, especially model abstraction and functional verification. Other major challenges, like timing verification, remain before this technology can be posed as a complete alternative to simulation. This special issue is devoted to presenting some of the ...

  17. Verification of Gyrokinetic codes: Theoretical background and applications

    Science.gov (United States)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  18. SIGNATURE: A workbench for gene expression signature analysis

    Directory of Open Access Journals (Sweden)

    Chang Jeffrey T

    2011-11-01

    Full Text Available Abstract Background The biological phenotype of a cell, such as a characteristic visual image or behavior, reflects activities derived from the expression of collections of genes. As such, an ability to measure the expression of these genes provides an opportunity to develop more precise and varied sets of phenotypes. However, to use this approach requires computational methods that are difficult to implement and apply, and thus there is a critical need for intelligent software tools that can reduce the technical burden of the analysis. Tools for gene expression analyses are unusually difficult to implement in a user-friendly way because their application requires a combination of biological data curation, statistical computational methods, and database expertise. Results We have developed SIGNATURE, a web-based resource that simplifies gene expression signature analysis by providing software, data, and protocols to perform the analysis successfully. This resource uses Bayesian methods for processing gene expression data coupled with a curated database of gene expression signatures, all carried out within a GenePattern web interface for easy use and access. Conclusions SIGNATURE is available for public use at http://genepattern.genome.duke.edu/signature/.

  19. Reading the Signatures of Extrasolar Planets in Debris Disks

    Science.gov (United States)

    Kuchner, Marc J.

    2009-01-01

    An extrasolar planet sculpts the famous debris disk around Fomalhaut; probably many other debris disks contain planets that we could locate if only we could better recognize their signatures in the dust that surrounds them. But the interaction between planets and debris disks involves both orbital resonances and collisions among grains and rocks in the disks --- difficult processes to model simultaneously. I will describe new 3-D models of debris disk dynamics that incorporate both collisions and resonant trapping of dust for the first time, allowing us to decode debris disk images and read the signatures of the planets they contain.

  20. Public-key quantum digital signature scheme with one-time pad private-key

    Science.gov (United States)

    Chen, Feng-Lin; Liu, Wan-Fang; Chen, Su-Gen; Wang, Zhi-Hua

    2018-01-01

    A quantum digital signature scheme is proposed, for the first time, based on a public-key quantum cryptosystem. In the scheme, the verification public-key is derived from the signer's identity information (such as e-mail) on the foundation of identity-based encryption, and the signature private-key is generated by a one-time pad (OTP) protocol. The public-key and private-key pair consists of classical bits, but the signature cipher consists of quantum qubits. After the signer announces the public-key and generates the final quantum signature, each verifier can publicly verify whether the signature is valid using the public-key and a quantum digital digest. Analysis results show that the proposed scheme satisfies non-repudiation and unforgeability. Information-theoretic security of the scheme is ensured by quantum indistinguishability and the OTP protocol. Based on the public-key cryptosystem, the proposed scheme is easier to realize than other quantum signature schemes under current technical conditions.
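The OTP component of the private-key generation above can be sketched classically; the quantum parts (qubit digests, public verification) are not modeled here, and the names and message are illustrative:

```python
# Classical sketch of the one-time pad primitive the scheme's private-key
# generation relies on: a uniformly random key as long as the message, used
# once, XORed in and out. Names and the sample digest are illustrative.
import secrets

def otp_key(length: int) -> bytes:
    """Fresh, uniformly random key; information-theoretic security requires
    that it be as long as the message and never reused."""
    return secrets.token_bytes(length)

def otp_apply(key: bytes, data: bytes) -> bytes:
    """XOR is its own inverse, so one function both encrypts and decrypts."""
    if len(key) != len(data):
        raise ValueError("OTP key must match the message length")
    return bytes(k ^ d for k, d in zip(key, data))

digest = b"quantum digital digest"
key = otp_key(len(digest))           # the signature private-key material
cipher = otp_apply(key, digest)      # stand-in for the 'signing' step
print(otp_apply(key, cipher) == digest)  # True: the pad round-trips exactly
```

The one-time pad is what makes the classical half of the key information-theoretically secure; the quantum digest then prevents a verifier from forging signatures even with the public-key.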

  1. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors and, through them, the safeguards verification system gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  2. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulation requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  3. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  4. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  5. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  6. Runtime Verification in Distributed Computing

    NARCIS (Netherlands)

    Malakuti Khah Olun Abadi, Somayeh; Park, Jong Hyuk; Obaidat, Mohammad; Aksit, Mehmet; Bockisch, Christoph

    2011-01-01

    Runtime verification aims to check whether an application executes its behaviour as specified. Thereby the active execution trace of an application is checked in terms of the actual execution context; diagnosis and, possibly, recovery actions are taken when the specification is violated. In today’s

  7. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  8. Verification and validation for CIPRNet

    NARCIS (Netherlands)

    Voogd, J.

    2016-01-01

    In this chapter it is shown that if an appreciable risk is present in the use of Modelling and Simulation (M&S), Verification and Validation (V&V) should be employed to manage and mitigate that risk. The use of M&S in the domain of critical infrastructure (CI) will always be accompanied by such a

  9. Signature molecular descriptor : advanced applications.

    Energy Technology Data Exchange (ETDEWEB)

    Visco, Donald Patrick, Jr. (Tennessee Technological University, Cookeville, TN)

    2010-04-01

    In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in high-throughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area, with serendipity as the mechanism of discovery. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge in a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the designed molecule might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest in a given problem, as well as how to go from an output of the algorithm to an actual chemical structure. This report

  10. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the software verification and validation (SVV) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), in the development of the Korea Nuclear Instrumentation and Control System (KNICS). The SVV criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028 and BTP-14, and they form the acceptance framework provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), formal verification and theorem proving, and automated testing, are applied to the safety software, and automated tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; the New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and the Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA) and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase

  11. STAR-CCM+ Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-30

    The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general-purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light water reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.

  12. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...... as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practical experiments with the robot, the UKF is demonstrated to improve the reliability of the sensor signals...

  13. Persistence of social signatures in human communication.

    Science.gov (United States)

    Saramäki, Jari; Leicht, E A; López, Eduardo; Roberts, Sam G B; Reed-Tsochas, Felix; Dunbar, Robin I M

    2014-01-21

    The social network maintained by a focal individual, or ego, is intrinsically dynamic and typically exhibits some turnover in membership over time as personal circumstances change. However, the consequences of such changes on the distribution of an ego's network ties are not well understood. Here we use a unique 18-mo dataset that combines mobile phone calls and survey data to track changes in the ego networks and communication patterns of students making the transition from school to university or work. Our analysis reveals that individuals display a distinctive and robust social signature, captured by how interactions are distributed across different alters. Notably, for a given ego, these social signatures tend to persist over time, despite considerable turnover in the identity of alters in the ego network. Thus, as new network members are added, some old network members either are replaced or receive fewer calls, preserving the overall distribution of calls across network members. This is likely to reflect the consequences of finite resources such as the time available for communication, the cognitive and emotional effort required to sustain close relationships, and the ability to make emotional investments.
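    The "social signature" described above is, at its core, the rank-ordered fraction of communication directed to each alter. A minimal stdlib sketch with hypothetical call logs (names and counts invented for illustration) shows how one ego at two times can retain the same signature shape despite turnover in alters:

```python
from collections import Counter

def social_signature(calls):
    """Rank-ordered fraction of calls per alter (most-contacted first)."""
    counts = Counter(calls)
    total = sum(counts.values())
    return [n / total for _, n in counts.most_common()]

# Hypothetical call logs: each entry names the alter the ego contacted.
calls_t1 = ["anna"] * 50 + ["ben"] * 30 + ["cara"] * 20
calls_t2 = ["ben"] * 50 + ["dave"] * 30 + ["anna"] * 20  # alters changed

sig1 = social_signature(calls_t1)
sig2 = social_signature(calls_t2)
print(sig1)  # [0.5, 0.3, 0.2]
print(sig1 == sig2)  # True: same signature shape, different alters
```

Here the persistence is exact by construction; the study compares empirical ranked-fraction distributions across observation periods.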

  14. Techni-dilaton signatures at LHC

    International Nuclear Information System (INIS)

    Matsuzaki, Shinya; Yamawaki, Koichi

    2012-01-01

    We explore discovery signatures of the techni-dilaton (TD) at the LHC. The TD was predicted long ago as a composite pseudo Nambu-Goldstone boson (pNGB) associated with the spontaneous breaking of the approximate scale symmetry in walking technicolor (WTC) (initially dubbed 'scale-invariant technicolor'). Being a pNGB, whose mass arises from the explicit scale-symmetry breaking due to the spontaneous breaking itself (dynamical mass generation), the TD as a composite scalar should have a mass M_TD lighter than other techni-hadrons, say M_TD ≅ 600 GeV for the typical WTC model, which is well within the discovery range of the ongoing LHC experiment. We develop a spurion method of nonlinear realization to calculate the TD couplings to the standard model (SM) particles and explicitly evaluate the TD LHC production cross sections at √s = 7 TeV times the branching ratios, in terms of M_TD as an input parameter for the region 200 GeV < M_TD < 1000 GeV, in the near future. We further find a characteristic signature coming from the γγ mode in the 1FM. In sharp contrast to the SM Higgs case, it provides a highly enhanced cross section of ∼0.10-1.0 fb at around the TD mass ≅ 600 GeV, which is large enough to be discovered during the first few years' run at LHC. (author)

  15. Five Guidelines for Selecting Hydrological Signatures

    Science.gov (United States)

    McMillan, H. K.; Westerberg, I.; Branger, F.

    2017-12-01

    Hydrological signatures are index values derived from observed or modeled series of hydrological data such as rainfall, flow or soil moisture. They are designed to extract relevant information about hydrological behavior, such as identifying dominant processes and determining the strength, speed and spatiotemporal variability of the rainfall-runoff response. Hydrological signatures play an important role in model evaluation: they allow us to test whether particular model structures or parameter sets accurately reproduce the runoff generation processes within the watershed of interest. Most modeling studies use a selection of different signatures to capture different aspects of the catchment response, for example evaluating the overall flow distribution as well as high- and low-flow extremes and flow timing. Such studies often choose their own set of signatures, or borrow subsets of signatures used in other works. The link between signature values and hydrological processes is not always straightforward, leading to uncertainty and variability in hydrologists' signature choices. In this presentation, we aim to encourage a more rigorous approach to hydrological signature selection, one that considers the ability of signatures to represent hydrological behavior and underlying processes for the catchment and application in question. To this end, we propose a set of guidelines for selecting hydrological signatures. We describe five criteria that any hydrological signature should conform to: Identifiability, Robustness, Consistency, Representativeness, and Discriminatory Power. We describe an example of the design process for a signature, assessing possible signature designs against the guidelines above. Due to its ubiquity, we chose a signature related to the Flow Duration Curve, selecting the FDC mid-section slope as a proposed signature to quantify overall catchment behavior and flashiness.
We demonstrate how assessment against each guideline could be used to
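    One common definition of the FDC mid-section slope divides the difference of log-flows at the 33% and 66% exceedance probabilities by the width of that interval. The exact percentiles and interpolation the authors adopt may differ, so treat this stdlib sketch (with synthetic flow series) as illustrative only:

```python
import math

def fdc_slope(flows, p_low=0.33, p_high=0.66):
    """Mid-section slope of the flow duration curve:
    (ln Q_low - ln Q_high) / (p_high - p_low),
    where Q_p is the flow exceeded with probability p (nearest rank)."""
    q = sorted(flows, reverse=True)  # descending: rank 1 = highest flow
    n = len(q)
    def quantile(p):
        idx = min(n - 1, max(0, round(p * (n - 1))))
        return q[idx]
    return (math.log(quantile(p_low)) - math.log(quantile(p_high))) / (p_high - p_low)

# Synthetic daily series: a flashy catchment spreads its mid-range flows
# over orders of magnitude; a damped one varies little.
flashy = [50 * math.exp(-0.01 * t) for t in range(365)]
damped = [30 - 0.01 * t for t in range(365)]
print(fdc_slope(flashy) > fdc_slope(damped))  # True: flashier = steeper slope
```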

  16. MUSE WFM AO Science Verification

    Science.gov (United States)

    Leibundgut, B.; Bacon, R.; Jaffé, Y. L.; Johnston, E.; Kuntschner, H.; Selman, F.; Valenti, E.; Vernet, J.; Vogt, F.

    2017-12-01

    The goal of Science Verification (SV) as part of the transition into operations is to carry out scientific observations to test the end-to-end operations of a new instrument or new instrument modes. The Multi Unit Spectroscopic Explorer (MUSE; Bacon et al., 2010) at the Very Large Telescope (VLT) can be operated in several modes. The wide-field mode has been offered since Period 94 (October 2014) for natural-seeing observations. With the commissioning of the Adaptive Optics Facility (AOF; Arsenault et al., 2017), the wide-field mode can be supported by ground-layer adaptive optics through four artificial laser guide stars and the adaptive optics module, Ground Atmospheric Layer Adaptive OptiCs for Spectroscopic Imaging (GALACSI). The MUSE wide-field mode adaptive optics Science Verification (hereafter referred to as MUSE WFM AO SV) was scheduled for 12–14 August 2017. Out of 41 submitted proposals, 19 observing programmes were scheduled, covering a wide range of science topics and amounting to an allocation of 42 hours. This included sufficient oversubscription to cover all expected observing conditions. Due to inclement weather during the original SV nights, two more nights were allocated on 16 and 17 September 2017 to observe more programmes. In total, seven programmes were completed, six programmes received partial data, and the remaining six projects could not be started. We summarise here the planning, execution and first results from the Science Verification.

  17. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  18. Motif signatures of transcribed enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios

    2017-09-14

    In mammalian cells, transcribed enhancers (TrEn) play important roles in the initiation of gene expression and the maintenance of gene expression levels in a spatiotemporal manner. One of the most challenging questions in biology today is how the genomic characteristics of enhancers relate to enhancer activities. This is particularly critical, as several recent studies have linked enhancer sequence motifs to specific functional roles. To date, only a limited number of enhancer sequence characteristics have been investigated, leaving space for exploring the enhancers' genomic code in a more systematic way. To address this problem, we developed a novel computational method, TELS, aimed at identifying predictive cell type/tissue specific motif signatures. We used TELS to compile a comprehensive catalog of motif signatures for all known TrEn identified by the FANTOM5 consortium across 112 human primary cells and tissues. Our results confirm that distinct cell type/tissue specific motif signatures characterize TrEn. These signatures successfully discriminate a) TrEn from random controls, a proxy for non-enhancer activity, and b) cell type/tissue specific TrEn from enhancers expressed and transcribed in different cell types/tissues. TELS code and datasets are publicly available at http://www.cbrc.kaust.edu.sa/TELS.
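    TELS itself is not reproduced here, but the underlying idea, that motif (k-mer) frequency vectors can separate sequence classes, can be sketched with a toy example (the sequences and the enrichment of "TAT" are invented for illustration):

```python
from collections import Counter
from itertools import product

def motif_signature(seqs, k=3):
    """Normalized k-mer frequency vector over a set of sequences."""
    counts = Counter()
    for s in seqs:
        for i in range(len(s) - k + 1):
            counts[s[i:i + k]] += 1
    total = sum(counts.values())
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    return [counts[m] / total for m in kmers]

enhancers = ["TATAATGGC", "CCTATAATG"]  # hypothetical TATA-rich class
controls  = ["GCGCGGCGC", "CGGCGCGCG"]  # hypothetical GC-rich class

sig_e = motif_signature(enhancers)
sig_c = motif_signature(controls)
idx = ["".join(p) for p in product("ACGT", repeat=3)].index("TAT")
print(sig_e[idx] > sig_c[idx])  # True: "TAT" enriched in the enhancer class
```

A real pipeline would feed such vectors into a classifier and assess discrimination on held-out enhancers.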

  19. Signature simulation of mixed materials

    Science.gov (United States)

    Carson, Tyler D.; Salvaggio, Carl

    2015-05-01

    Soil target signatures vary due to geometry, chemical composition, and scene radiometry. Although radiative transfer models and function-fit physical models may describe certain targets in limited depth, the ability to incorporate all three signature variables is difficult. This work describes a method to simulate the transient signatures of soil by first considering scene geometry synthetically created using 3D physics engines. Through the assignment of spectral data from the Nonconventional Exploitation Factors Data System (NEFDS), the synthetic scene is represented as a physical mixture of particles. Finally, first-principles radiometry is modeled using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. With DIRSIG, radiometric and sensing conditions were systematically manipulated to produce and record goniometric signatures. The implementation of this virtual goniometer allows users to examine how a target bidirectional reflectance distribution function (BRDF) will change with geometry, composition, and illumination direction. By using 3D computer graphics models, this process does not require geometric assumptions that are native to many radiative transfer models. It delivers a discrete method to circumvent the significant cost of time and treasure associated with hardware-based goniometric data collections.

  20. Epigenetic Signatures of Cigarette Smoking

    NARCIS (Netherlands)

    R. Joehanes (Roby); Just, A.C. (Allan C.); R.E. Marioni (Riccardo); L.C. Pilling (Luke); L.M. Reynolds (Lindsay); Mandaviya, P.R. (Pooja R.); W. Guan (Weihua); Xu, T. (Tao); C.E. Elks (Cathy); Aslibekyan, S. (Stella); H. Moreno-Macías (Hortensia); J.A. Smith (Jennifer A); J. Brody (Jennifer); Dhingra, R. (Radhika); P. Yousefi (Paul); J.S. Pankow (James); Kunze, S. (Sonja); Shah, S.H. (Sonia H.); A.F. McRae (Allan F.); K. Lohman (Kurt); Sha, J. (Jin); D. Absher (Devin); L. Ferrucci (Luigi); Zhao, W. (Wei); E.W. Demerath (Ellen); J. Bressler (Jan); M.L. Grove (Megan); T. Huan (Tianxiao); C. Liu (Chunyu); Mendelson, M.M. (Michael M.); C. Yao (Chen); D.P. Kiel (Douglas P.); A. Peters (Annette); R. Wang-Sattler (Rui); P.M. Visscher (Peter); N.R. Wray (Naomi); J.M. Starr (John); Ding, J. (Jingzhong); Rodriguez, C.J. (Carlos J.); N.J. Wareham (Nick); Irvin, M.R. (Marguerite R.); Zhi, D. (Degui); M. Barrdahl (Myrto); P. Vineis (Paolo); Ambatipudi, S. (Srikant); A.G. Uitterlinden (André); A. Hofman (Albert); Schwartz, J. (Joel); Colicino, E. (Elena); Hou, L. (Lifang); Vokonas, P.S. (Pantel S.); D.G. Hernandez (Dena); A. Singleton (Andrew); S. Bandinelli (Stefania); S.T. Turner (Stephen); E.B. Ware (Erin B.); Smith, A.K. (Alicia K.); T. Klengel (Torsten); E.B. Binder (Elisabeth B.); B.M. Psaty (Bruce); K.D. Taylor (Kent); S.A. Gharib (Sina); Swenson, B.R. (Brenton R.); Liang, L. (Liming); D.L. Demeo (Dawn L.); G.T. O'Connor (George); Z. Herceg (Zdenko); Ressler, K.J. (Kerry J.); K.N. Conneely (Karen N.); N. Sotoodehnia (Nona); Kardia, S.L.R. (Sharon L. R.); D. Melzer (David); A.A. Baccarelli (Andrea A.); J.B.J. van Meurs (Joyce); I. Romieu (Isabelle); D.K. Arnett (Donna); Ong, K.K. (Ken K.); Y. Liu (YongMei); M. Waldenberger (Melanie); I.J. Deary (Ian J.); M. Fornage (Myriam); D. Levy (Daniel); S.J. London (Stephanie J.)

    2016-01-01

    Background: DNA methylation leaves a long-term signature of smoking exposure and is one potential mechanism by which tobacco exposure predisposes to adverse health outcomes, such as cancers, osteoporosis, lung, and cardiovascular disorders. Methods and Results: To comprehensively determine

  1. Signature Pedagogy in Theatre Arts

    Science.gov (United States)

    Kornetsky, Lisa

    2017-01-01

    Critique in undergraduate theatre programs is at the heart of training actors at all levels. It is accepted as the signature pedagogy and is practiced in multiple ways. This essay defines critique and presents the case for why it is used as the single most important way that performers come to understand the language, values, and discourse of the…

  2. Galaxy interactions : The HI signature

    NARCIS (Netherlands)

    Sancisi, R; Barnes, JE; Sanders, DB

    1999-01-01

    HI observations are an excellent tool for investigating tidal interactions. Ongoing major and minor interactions which can lead to traumatic mergers or to accretion and the triggering of star formation, show distinct HI signatures. Interactions and mergers in the recent past can also be recognized

  3. Dynamics

    CERN Document Server

    Goodman, Lawrence E

    2001-01-01

    Beginning text presents complete theoretical treatment of mechanical model systems and deals with technological applications. Topics include introduction to calculus of vectors, particle motion, dynamics of particle systems and plane rigid bodies, technical applications in plane motions, theory of mechanical vibrations, and more. Exercises and answers appear in each chapter.

  4. Design and Implementation of a Mobile Voting System Using a Novel Oblivious and Proxy Signature

    Directory of Open Access Journals (Sweden)

    Shin-Yan Chiou

    2017-01-01

    Full Text Available Electronic voting systems can make the voting process much more convenient. However, in such systems, if a server signs blank votes before users vote, it may cause undue multivoting. Furthermore, if users vote before the signing of the server, voting information will be leaked to the server and may be compromised. Blind signatures could be used to prevent leaking voting information from the server; however, malicious users could produce noncandidate signatures for illegal usage at that time or in the future. To overcome these problems, this paper proposes a novel oblivious signature scheme with a proxy signature function to satisfy security requirements such as information protection, personal privacy, and message verification and to ensure that no one can cheat other users (including the server. We propose an electronic voting system based on the proposed oblivious and proxy signature scheme and implement this scheme in a smartphone application to allow users to vote securely and conveniently. Security analyses and performance comparisons are provided to show the capability and efficiency of the proposed scheme.
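    The paper's combined oblivious-and-proxy construction is not reproduced here; the classic RSA blind signature (Chaum) that motivates the "server signs without seeing the vote" property can be sketched with toy parameters (tiny primes, insecure, illustration only):

```python
import math
import random

# Toy RSA keypair (tiny primes -- illustration only, not secure).
p, q = 1009, 1013
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)  # private exponent (Python 3.8+ modular inverse)

def blind_sign_demo(m):
    """Chaum's RSA blind signature: the signer never sees m."""
    while True:  # blinding factor must be invertible mod n
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    blinded = (m * pow(r, e, n)) % n   # voter blinds the message
    s_blind = pow(blinded, d, n)       # server signs the blinded value
    return (s_blind * pow(r, -1, n)) % n  # voter unblinds: m^d mod n

m = 4242  # hypothetical encoded ballot
s = blind_sign_demo(m)
print(pow(s, e, n) == m)  # True: signature verifies on the real message
```

The unblinding works because (m·r^e)^d = m^d·r (mod n), so multiplying by r⁻¹ leaves a valid signature m^d on the never-revealed message.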

  5. Optical identity authentication scheme based on elliptic curve digital signature algorithm and phase retrieval algorithm.

    Science.gov (United States)

    Fan, Desheng; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi

    2013-08-10

    An optical identity authentication scheme based on the elliptic curve digital signature algorithm (ECDSA) and a phase retrieval algorithm (PRA) is proposed. In this scheme, a user's certification image and the quick response code of the user identity's keyed-hash message authentication code (HMAC) with added noise, serving as the amplitude and phase restriction, respectively, are digitally encoded into two phase keys using a PRA in the Fresnel domain. During the authentication process, when the two phase keys are presented to the system and illuminated by a plane wave of the correct wavelength, an output image is generated in the output plane. By identifying whether there is a match between the amplitude of the output image and the certification images pre-stored in the database, the system accomplishes a first-level verification. After the confirmation of first-level verification, the ECDSA signature is decoded from the phase part of the output image and verified to determine whether or not the user's identity is legitimate. Moreover, the introduction of the HMAC makes it almost impossible to forge the signature, and hence the phase keys, thanks to the HMAC's irreversible property. Theoretical analysis and numerical simulations both validate the feasibility of our proposed scheme.
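    The optical encoding via the phase retrieval algorithm is beyond a short sketch, but the keyed-hash (HMAC) step that makes forgery infeasible maps directly onto Python's standard library (the function names and key below are hypothetical):

```python
import hashlib
import hmac

def make_auth_tag(identity: str, key: bytes) -> str:
    """Keyed-hash MAC of the user identity; infeasible to forge without the key."""
    return hmac.new(key, identity.encode(), hashlib.sha256).hexdigest()

def verify_auth_tag(identity: str, key: bytes, tag: str) -> bool:
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(make_auth_tag(identity, key), tag)

key = b"server-secret-key"  # hypothetical shared secret
tag = make_auth_tag("alice", key)
print(verify_auth_tag("alice", key, tag))    # True
print(verify_auth_tag("mallory", key, tag))  # False: forged identity rejected
```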

  6. The SeaHorn Verification Framework

    Science.gov (United States)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
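    SeaHorn encodes verification conditions as constrained Horn clauses over SMT theories; a toy propositional forward-chaining solver (no constraints, invented atoms) conveys the flavor of asking whether an "error" head is derivable:

```python
def solve_horn(facts, rules):
    """Least-model computation for propositional Horn clauses.
    rules: list of (body_set, head) pairs; returns all derivable atoms."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and body <= derived:
                derived.add(head)
                changed = True
    return derived

# Hypothetical VC skeleton: init establishes the invariant, which is
# preserved by a step; "error" has no derivation, so the program is safe.
rules = [({"init"}, "inv"), ({"inv", "step"}, "inv2")]
model = solve_horn({"init", "step"}, rules)
print("error" in model)  # False: the error head is never derivable
```

Real Horn-clause engines (e.g. Spacer inside Z3) solve the same question over arithmetic and arrays rather than bare propositions.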

  7. Hybrids as a signature of quark-gluon plasma

    Energy Technology Data Exchange (ETDEWEB)

    Abbas, Afsar; Paria, Lina [Institute of Physics, Sachivalaya Marg, Bhubaneswar-751005 (India)

    1997-07-01

    We show that the dynamics of the quark-gluon plasma is such that during hadronization the creation of hybrids will predominate over the creation of mesons, giving a novel signature of the existence of QGP. At T=0 the qq̄g hybrids are known to decay strongly into a pair of mesons. We find that at temperatures relevant to the QGP, this channel is forbidden. This would lead to significant modifications of the photonic signals of the QGP. (author)

  8. Military Tactical Aircraft Engine Noise Matching to Infrared Signatures

    Science.gov (United States)

    2016-12-16

    ...thereby rendering insignificant the absorptive effects of atmosphere. Because the plume is the primary source of acoustic emissions in situations that... This report builds on theoretical analysis of jet engine infrared signatures and their potential relationships to jet engine acoustic emissions

  9. Nonlinear control of magnetic signatures

    Science.gov (United States)

    Niemoczynski, Bogdan

    Magnetic properties of ferrite structures are known to cause fluctuations in Earth's magnetic field around the object. These fluctuations are known as the object's magnetic signature and are unique to the object's geometry and material. It is common practice to neutralize magnetic signatures periodically after certain time intervals; however, there is growing interest in developing real-time degaussing systems for various applications. Development of a real-time degaussing system is a challenging problem because of magnetic hysteresis and difficulties in the measurement or estimation of near-field flux data. The goal of this research is to develop a real-time feedback control system that can be used to minimize magnetic signatures for ferrite structures. Experimental work on controlling the magnetic signature of a cylindrical steel shell structure with a magnetic disturbance provided evidence that the control process substantially increased the interior magnetic flux. This means near-field estimation using interior sensor data is likely to be inaccurate. Follow-up numerical work for rectangular and cylindrical cross sections investigated variations in shell wall flux density under a variety of ambient excitations and applied disturbances. Results showed magnetic disturbances could corrupt interior sensor data, and magnetic shielding due to the shell walls makes the interior very sensitive to noise. The magnetic flux inside the shell wall showed little variation due to inner disturbances, and its high base value makes it less susceptible to noise. This research proceeds to describe a nonlinear controller that uses the shell wall data as an input. A nonlinear plant model of magnetics is developed using a constant tau to represent domain rotation lag and a gain function k to describe the magnetic hysteresis curve for the shell wall. 
The model is justified by producing hysteresis curves for multiple materials, matching experimental data using a particle swarm algorithm, and

  10. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples......: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive...

  11. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies how it differs from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors that influence verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  12. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    Science.gov (United States)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is used for node calculation in the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be applied to the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users’ data privacy during the public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, supports dynamic operations, and verifies the integrity of multiple copies of data. It has great potential to be implemented in cloud storage services.
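
The binary-tree integrity check that schemes like this build on can be sketched with a conventional hash tree (Merkle tree). Note the assumption: the paper computes tree nodes with a spatiotemporal chaos map, for which SHA-256 stands in below purely for illustration.

```python
import hashlib

def _node(left: bytes, right: bytes) -> bytes:
    # The paper derives internal nodes with a spatiotemporal chaos map;
    # SHA-256 stands in here for that node function.
    return hashlib.sha256(left + right).digest()

def build_tree(blocks):
    """Return the tree as a list of levels: leaves first, root level last."""
    level = [hashlib.sha256(b).digest() for b in blocks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                       # duplicate last node if odd
            level = level + [level[-1]]
        level = [_node(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def verify(blocks, expected_root):
    """Recompute the root from the stored copies and compare."""
    return build_tree(blocks)[-1][0] == expected_root

data = [b"copy-1", b"copy-2", b"copy-3", b"copy-4"]
root = build_tree(data)[-1][0]
assert verify(data, root)                                          # intact copies pass
assert not verify([b"copy-1", b"tampered", b"copy-3", b"copy-4"], root)
```

Any modification to a single data copy changes its leaf hash and therefore the recomputed root, so the verifier detects it without inspecting the copies individually.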

  13. Spectral signature selection for mapping unvegetated soils

    Science.gov (United States)

    May, G. A.; Petersen, G. W.

    1975-01-01

    Airborne multispectral scanner data covering the wavelength interval from 0.40 to 2.60 microns were collected at an altitude of 1000 m above the terrain in southeastern Pennsylvania. Uniform training areas were selected within three sites from this flightline. Soil samples were collected from each site, and a procedure was developed to assign a scan line and element number from the multispectral scanner data to each sampling location. These soil samples were analyzed on a spectrophotometer and laboratory spectral signatures were derived. After correcting for solar radiation and atmospheric attenuation, the laboratory signatures were compared to the spectral signatures derived from the same soils using the multispectral scanner data. Both sets of signatures were used in supervised and unsupervised classification routines. Computer-generated maps based on the laboratory-derived and scanner-derived signatures were similar to maps produced by field surveys, and approximately 90% agreement was obtained between the classification maps produced with the two kinds of signatures.
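
The supervised classification step used with such spectral signatures can be sketched as a minimum-distance (nearest-centroid) classifier over per-band reflectances. The band values and class names below are hypothetical toy data, not the Pennsylvania measurements.

```python
def centroid(vectors):
    """Per-band mean of the training signatures for one class."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(pixel, centroids):
    """Assign the pixel to the class whose centroid is nearest (Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda cls: dist2(pixel, centroids[cls]))

# Hypothetical 4-band training signatures (reflectance per band).
training = {
    "bare_soil":  [[0.21, 0.30, 0.38, 0.45], [0.19, 0.28, 0.36, 0.44]],
    "vegetation": [[0.05, 0.08, 0.06, 0.48], [0.04, 0.09, 0.07, 0.52]],
}
centroids = {cls: centroid(sigs) for cls, sigs in training.items()}

assert classify([0.20, 0.29, 0.37, 0.45], centroids) == "bare_soil"
assert classify([0.05, 0.08, 0.07, 0.50], centroids) == "vegetation"
```

Either laboratory-derived or scanner-derived signatures can supply the training vectors; the ~90% map agreement reported above is a statement about how similar the resulting centroids (and hence decisions) are.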

  14. A New Adaptive Structural Signature for Symbol Recognition by Using a Galois Lattice as a Classifier.

    Science.gov (United States)

    Coustaty, M; Bertet, K; Visani, M; Ogier, J

    2011-08-01

    In this paper, we propose a new approach for symbol recognition using structural signatures and a Galois lattice as a classifier. The structural signatures are based on topological graphs computed from segments extracted from the symbol images using an adapted Hough transform. These structural signatures, which can be seen as dynamic paths carrying high-level information, are robust to various transformations. They are classified using a Galois lattice as a classifier. The performance of the proposed approach is evaluated on the GREC'03 symbol database, and the experimental results we obtain are encouraging.

  15. New possibilities of digital luminescence radiography (DLR) and digital image processing for verification and portal imaging

    International Nuclear Information System (INIS)

    Zimmermann, J.S.; Blume, J.; Wendhausen, H.; Hebbinghaus, D.; Kovacs, G.; Eilf, K.; Schultze, J.; Kimmig, B.N.

    1995-01-01

    We developed a method, using digital luminescence radiography (DLR), not only for portal imaging of photon beams in excellent quality, but also for verification of electron beams. Furthermore, DLR was used as the basic instrument for image fusion of portal and verification films with simulation films, and for image processing in ''beam's-eye-view'' verification (BEVV) of rotating beams or conformation therapy. Digital radiographs of excellent quality are obtained for verification of photon and electron beams. In photon beams, the quality improvement over conventional portal imaging may be dramatic, even more so for high-energy beams (e.g., 15-MV photon beams) than for Co-60. In electron beams, excellent results are easily obtained. Digital image fusion of one or more verification films with the simulation film or MRI planning film permits more precise judgement of even small differences between simulation and verification films. Using BEVV, it is possible to compare computer-aided simulation of rotating beams or conformation therapy with the treatment actually applied. The basic principle of BEVV is also suitable for dynamic multileaf collimation. (orig.) [de

  16. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence leads to questionable decisions to deploy; availability leads to an inability to conceive critical tests; representativeness leads to overinterpretation of results; positive test strategies lead to confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  17. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were similarly compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  18. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were similarly compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models

  19. Runtime Verification with State Estimation

    Science.gov (United States)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
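
The classic forward algorithm that the authors extend computes the probability of an observation sequence under an HMM by summing over all hidden-state paths. A minimal sketch; the two-state model below is a toy example, not the rover model from the case study.

```python
def forward(obs, pi, A, B):
    """Classic HMM forward algorithm: P(observation sequence | model).

    pi[i]   initial probability of hidden state i
    A[i][j] transition probability from state i to state j
    B[i][o] probability of emitting observable symbol o in state i
    """
    # Initialize with the first observation.
    alpha = [pi[i] * B[i][obs[0]] for i in range(len(pi))]
    # Propagate: sum over predecessor states, then emit the next symbol.
    for o in obs[1:]:
        alpha = [
            sum(alpha[i] * A[i][j] for i in range(len(pi))) * B[j][o]
            for j in range(len(pi))
        ]
    return sum(alpha)

# Toy two-state model with two observable event types.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]

p = forward([0, 1, 0], pi, A, B)
assert 0.0 < p < 1.0
```

The paper's extension augments this recursion so that sampling-induced gaps in the observation sequence are marginalized out while simultaneously tracking the temporal property's satisfaction.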

  20. Cosmological transitions with changes in the signature of the metric

    International Nuclear Information System (INIS)

    Sakharov, A.D.

    1984-01-01

    It is conjectured that there exist states of the physical continuum which include regions with different signatures of the metric and that the observed Universe and an infinite number of other Universes arose as a result of quantum transitions with a change in the signature of the metric. The Lagrangian in such a theory must satisfy conditions of non-negativity in the regions with even signature. Signature here means the number of time coordinates. The induced gravitational Lagrangian in a conformally invariant theory of Kaluza-Klein type evidently satisfies this requirement and leads to effective equations of the gravitational theory of macroscopic space identical to the equations of the general theory of relativity. It is suggested that in our Universe there exist in addition to the observable (macroscopic) time dimension two or some other even number of compactified time dimensions. It is suggested that the formation of a Euclidean region in the center of a black hole or in the cosmological contraction of the Universe (if it is predetermined by the dynamics) is a possible outcome of gravitational collapse

  1. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  2. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan; (b...

  3. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  4. 18 CFR 158.5 - Verification.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having knowledge...

  5. A correlation-based fingerprint verification system

    NARCIS (Netherlands)

    Bazen, A.M.; Gerez, Sabih H.; Veelenturf, L.P.J.; van der Zwaag, B.J.; Verwaaijen, G.T.B.

    2000-01-01

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates
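
A correlation-based comparison of gray-scale regions, as opposed to minutiae matching, can be sketched with zero-mean normalized cross-correlation. The 2x2 "patches" below are toy values; a real system correlates selected template regions against a search window of the test fingerprint.

```python
import math

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size gray patches."""
    flat_a = [p for row in a for p in row]
    flat_b = [p for row in b for p in row]
    ma = sum(flat_a) / len(flat_a)
    mb = sum(flat_b) / len(flat_b)
    num = sum((x - ma) * (y - mb) for x, y in zip(flat_a, flat_b))
    den = math.sqrt(sum((x - ma) ** 2 for x in flat_a) *
                    sum((y - mb) ** 2 for y in flat_b))
    return num / den

template = [[10, 50], [200, 120]]
same     = [[12, 52], [202, 122]]   # same pattern with a brightness offset
other    = [[200, 120], [10, 50]]   # rows swapped: a different pattern

assert ncc(template, same) > 0.99   # genuine match scores near 1
assert ncc(template, other) < 0     # mismatched structure scores low
```

Subtracting the means makes the score invariant to uniform brightness changes, one reason correlation on raw gray values can work across capture conditions.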

  6. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  7. 21 CFR 123.8 - Verification.

    Science.gov (United States)

    2010-04-01

    ..., including signing and dating, by an individual who has been trained in accordance with § 123.10, of the... FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION FISH AND FISHERY PRODUCTS General Provisions § 123.8 Verification. (a) Overall verification. Every...

  8. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study was conducted involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model derivation group and a model validation group. The predictive model was derived using multiple logistic regression, and validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
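
The decision rule reported above (one major criterion, two minor criteria) can be written out directly. The risk labels below are our reading of the abstract, not the paper's exact scoring scheme.

```python
def verification_risk(dystrophy_pct, long_horizontal, long_vertical):
    """Risk of fingerprint verification failure per the reported criteria.

    Major criterion: dystrophy area >= 25%  -> almost always fails.
    Both minor criteria -> high risk; one minor criterion -> low risk;
    no criteria met -> almost always passes.
    """
    if dystrophy_pct >= 25:
        return "almost always fails"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:
        return "high risk"
    if minors == 1:
        return "low risk"
    return "almost always passes"

assert verification_risk(30, False, False) == "almost always fails"
assert verification_risk(10, True, True) == "high risk"
assert verification_risk(10, True, False) == "low risk"
assert verification_risk(5, False, False) == "almost always passes"
```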

  9. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  10. A correlation-based fingerprint verification system

    NARCIS (Netherlands)

    Bazen, A.M.; Gerez, Sabih H.; Veelenturf, L.P.J.; van der Zwaag, B.J.; Verwaaijen, G.T.B.

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates

  11. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Science.gov (United States)

    2010-07-22

    ...) (authorizing use of ``reasonable data compression or formatting technologies''). Several commenters requested... relationship between the National Government and the States, or on the distribution of power and...

  12. On-line signature verification using Gaussian Mixture Models and small-sample learning strategies

    Directory of Open Access Journals (Sweden)

    Gabriel Jaime Zapata-Zapata

    2016-01-01

    Full Text Available This article addresses the problem of training online signature verification systems when the number of training samples is low, since in most real situations the number of signatures available per user is very limited. The article evaluates nine different classification strategies based on Gaussian Mixture Models (GMM) and on the Universal Background Model (UBM) strategy, which is designed to work under conditions of a reduced number of samples. The GMM learning strategies include the conventional Expectation-Maximization algorithm and a Bayesian approach based on variational learning. Signatures are characterized mainly in terms of the velocities and accelerations of the users' handwriting patterns. The results show that, when the system is evaluated in a genuine vs. impostor configuration, the GMM-UBM method is able to keep its accuracy above 93% even when only 20% of the available samples (equivalent to 5 signatures) are used for training, while the combination of a Bayesian UBM with a Support Vector Machine (SVM), a model known as GMM-Supervector, achieves 99% accuracy when the training samples exceed 20. On the other hand, when a real environment is simulated in which impostor samples are not available and
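
In a GMM-UBM verifier like the one evaluated above, the genuine-vs-impostor decision reduces to a log-likelihood ratio between the claimed user's model and the universal background model. A minimal sketch with hand-set one-dimensional mixtures; real systems fit multivariate GMMs to velocity/acceleration frames, so every parameter value below is an illustrative assumption.

```python
import math

def gmm_logpdf(x, weights, means, stds):
    """Log density of a 1-D Gaussian mixture at point x."""
    total = sum(
        w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
        for w, m, s in zip(weights, means, stds)
    )
    return math.log(total)

def llr_score(frames, user_gmm, ubm):
    """Average log-likelihood ratio: user model vs. universal background model."""
    return sum(gmm_logpdf(x, *user_gmm) - gmm_logpdf(x, *ubm)
               for x in frames) / len(frames)

# Hypothetical 1-D pen-speed features; (weights, means, stds) per model.
user_gmm = ([0.5, 0.5], [1.0, 3.0], [0.5, 0.5])
ubm      = ([0.5, 0.5], [0.0, 6.0], [2.0, 2.0])

genuine  = [1.1, 2.9, 1.0, 3.2]
impostor = [-2.0, 6.5, 5.8, -1.5]

assert llr_score(genuine, user_gmm, ubm) > 0    # matches the claimed writer
assert llr_score(impostor, user_gmm, ubm) < 0   # better explained by the UBM
```

The UBM makes this workable with few enrollment signatures: the user model is typically adapted from the UBM rather than trained from scratch, so a handful of samples suffices.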

  13. Verification and validation methodology of training simulators

    International Nuclear Information System (INIS)

    Hassan, M.W.; Khan, N.M.; Ali, S.; Jafri, M.N.

    1997-01-01

    A full-scope training simulator comprising 109 plant systems of a 300 MWe PWR plant, contracted by the Pakistan Atomic Energy Commission (PAEC) from China, is near completion. The simulator is distinctive in that it will be ready prior to fuel loading. The models for the full-scope training simulator have been developed in the APROS (Advanced PROcess Simulator) environment developed by the Technical Research Centre (VTT) and Imatran Voima (IVO) of Finland. The replicated control room of the plant is contracted from the Shanghai Nuclear Engineering Research and Design Institute (SNERDI), China. The development of simulation models representing all systems of the target plant that contribute to plant dynamics and are essential for operator training has been carried out indigenously at PAEC. This multifunctional simulator is at present under extensive testing and will be interfaced with the control room panels in March 1998 to realize a full-scope training simulator. The validation of the simulator is a joint venture between PAEC and SNERDI. For individual components and individual plant systems, results have been compared against design data and PSAR results to confirm the faithfulness of the simulator to the physical plant systems. The reactor physics parameters have been validated against experimental results and benchmarks generated using design codes. Verification and validation in the integrated state have been performed against benchmark transients conducted using RELAP5/MOD2 for the complete spectrum of anticipated transients, covering the well-known five categories. (author)

  14. Automated radiotherapy treatment plan integrity verification

    International Nuclear Information System (INIS)

    Yang Deshan; Moore, Kevin L.

    2012-01-01

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
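
The core of such a QC tool is a table of named checks run against run-time plan data, each compared to standard values or logical rules and summarized in a report. A minimal sketch; the plan fields and limits below are hypothetical, not PINNACLE's actual data model.

```python
def check_plan(plan, rules):
    """Run each named rule against the plan data; collect pass/fail lines."""
    report = []
    for name, rule in rules.items():
        ok = rule(plan)
        report.append(f"{'PASS' if ok else 'FAIL'}: {name}")
    return report

# Hypothetical plan fields; a real tool pulls these from TPS scripts.
plan = {"dose_per_fraction_gy": 2.0, "fractions": 35,
        "couch_angle": 0.0, "grid_mm": 4.0}

rules = {
    "prescription dose in range": lambda p: 1.0 <= p["dose_per_fraction_gy"] <= 3.0,
    "dose grid fine enough":      lambda p: p["grid_mm"] <= 3.0,
    "couch at zero":              lambda p: p["couch_angle"] == 0.0,
}

report = check_plan(plan, rules)
assert report == ["PASS: prescription dose in range",
                  "FAIL: dose grid fine enough",
                  "PASS: couch at zero"]
```

Keeping the rules as data (a dict of named predicates) is what makes the approach "dynamic scripting": new checks can be added without touching the verification loop, and the report can be rendered as HTML for the physicist.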

  15. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by various semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is to model legislation by capturing domain knowledge of legislation and specifying it in a generic way using the commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation enable better understanding of the system, support the detection of anomalies, and help to improve the quality of legislation through validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework covers some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics; thus, pragmatic features and attributes can be determined that are relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a given feature for a specific model.

  16. Abstraction of Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Wisniewski, Rafael; Sloth, Christoffer

    2011-01-01

    To enable formal verification of a dynamical system, given by a set of differential equations, it is abstracted by a finite state model. This allows for application of methods for model checking. Consequently, it opens the possibility of carrying out the verification of reachability and timing re...

  17. The verification basis of the ESPROSE.m code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Freeman, K.; Chen, X. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    An overall verification approach for the ESPROSE.m code is presented and implemented. The approach consists of a stepwise testing procedure from wave dynamics aspects to explosion coupling at the local level, and culminates with the consideration of propagating explosive events. Each step in turn consists of an array of analytical and experimental tests. The results indicate that, given the premixture composition, the prediction of energetics of large scale explosions in multidimensional geometries is within reach. The main need identified is for constitutive laws for microinteractions with reactor materials; however, reasonably conservative assessments are presently possible. (author)

  18. Keystroke dynamics in the pre-touchscreen era

    Science.gov (United States)

    Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.

    2013-01-01

    Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprint, signature, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals’ typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568
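
The simplest keystroke-dynamics pipeline described above (inter-key latencies, a statistical template, a distance threshold) can be sketched as follows. All timestamps and the threshold are hypothetical values for illustration.

```python
def latencies(press_times):
    """Inter-key latencies (ms) from successive key-press timestamps."""
    return [b - a for a, b in zip(press_times, press_times[1:])]

def enroll(samples):
    """Per-position mean latency over several typings of the same phrase."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def verify(sample, template, threshold):
    """Accept when the mean absolute deviation from the template is small."""
    mad = sum(abs(x - t) for x, t in zip(sample, template)) / len(template)
    return mad <= threshold

# Hypothetical timestamps (ms) for three enrollment typings of one phrase.
enrolled = enroll([latencies(t) for t in (
    [0, 120, 260, 390], [0, 130, 270, 400], [0, 125, 255, 385])])

genuine  = latencies([0, 122, 262, 392])
impostor = latencies([0, 80, 310, 330])

assert verify(genuine, enrolled, threshold=15)
assert not verify(impostor, enrolled, threshold=15)
```

Statistical-distance rules like this are the baseline; the neural-network and fuzzy-logic methods surveyed above replace the fixed threshold with a learned decision boundary over the same latency (or pressure) features.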

  19. Keystroke Dynamics in the pre-Touchscreen Era

    Directory of Open Access Journals (Sweden)

    Nasir eAhmad

    2013-12-01

    Full Text Available Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realised via analyses of fingerprint, signature, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable, and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilise multiple variables, and a range of mathematical techniques, in order to extract individuals' typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view towards indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts.

  20. Infrared signatures for remote sensing

    Energy Technology Data Exchange (ETDEWEB)

    McDowell, R.S.; Sharpe, S.W.; Kelly, J.F.

    1994-04-01

    PNL's capabilities for infrared and near-infrared spectroscopy include tunable-diode-laser (TDL) systems covering 300-3,000 cm⁻¹ at <10-MHz bandwidth; a Bruker Fourier-transform infrared (FTIR) spectrometer for the near- to far-infrared at 50-MHz resolution; and a stable line-tunable, 12-W cw CO₂ laser. PNL also has a beam expansion source with a 12-cm slit, which provides a 3-m effective path for gases at ~10 K, giving a Doppler width of typically 10 MHz; and long-path static gas cells (to 100 m). In applying this equipment to signatures work, the authors emphasize the importance of high spectral resolution for detecting and identifying atmospheric interferences; for identifying the optimum analytical frequencies; for deriving, by spectroscopic analysis, the molecular parameters needed for modeling; and for obtaining data on species and/or bands that are not in existing databases. As an example of such spectroscopy, the authors have assigned and analyzed the C-Cl stretching region of CCl₄ at 770-800 cm⁻¹. This is an important potential signature species whose IR absorption has remained puzzling because of the natural isotopic mix, extensive hot-band structure, and a Fermi resonance involving a nearby combination band. Instrument development projects include the IR sniffer, a small high-sensitivity, high-discrimination (Doppler-limited) device for fence-line or downwind monitoring that is effective even in regions of atmospheric absorption; preliminary work has achieved sensitivities at the low-ppb level. Other work covers trace species detection with TDLs, and FM-modulated CO₂ laser LIDAR. The authors are planning a field experiment to interrogate the Hanford tank farm for signature species from Rattlesnake Mountain, a standoff of ca. 15 km, to be accompanied by simultaneous ground-truthing at the tanks.

  1. Metabolic signatures of bacterial vaginosis.

    Science.gov (United States)

    Srinivasan, Sujatha; Morgan, Martin T; Fiedler, Tina L; Djukovic, Danijel; Hoffman, Noah G; Raftery, Daniel; Marrazzo, Jeanne M; Fredricks, David N

    2015-04-14

    Bacterial vaginosis (BV) is characterized by shifts in the vaginal microbiota from Lactobacillus dominant to a microbiota with diverse anaerobic bacteria. Few studies have linked specific metabolites with bacteria found in the human vagina. Here, we report dramatic differences in metabolite compositions and concentrations associated with BV using a global metabolomics approach. We further validated important metabolites using samples from a second cohort of women and a different platform to measure metabolites. In the primary study, we compared metabolite profiles in cervicovaginal lavage fluid from 40 women with BV and 20 women without BV. Vaginal bacterial representation was determined using broad-range PCR with pyrosequencing and concentrations of bacteria by quantitative PCR. We detected 279 named biochemicals; levels of 62% of metabolites were significantly different in women with BV. Unsupervised clustering of metabolites separated women with and without BV. Women with BV have metabolite profiles marked by lower concentrations of amino acids and dipeptides, concomitant with higher levels of amino acid catabolites and polyamines. Higher levels of the signaling eicosanoid 12-hydroxyeicosatetraenoic acid (12-HETE), a biomarker for inflammation, were noted in BV. Lactobacillus crispatus and Lactobacillus jensenii exhibited similar metabolite correlation patterns, which were distinct from correlation patterns exhibited by BV-associated bacteria. Several metabolites were significantly associated with clinical signs and symptoms (Amsel criteria) used to diagnose BV, and no metabolite was associated with all four clinical criteria. BV has strong metabolic signatures across multiple metabolic pathways, and these signatures are associated with the presence and concentrations of particular bacteria. Bacterial vaginosis (BV) is a common but highly enigmatic condition that is associated with adverse outcomes for women and their neonates. 
Small molecule metabolites in the

  2. Magnetotail processes and their ionospheric signatures

    Science.gov (United States)

    Ferdousi, B.; Raeder, J.; Zesta, E.; Murphy, K. R.; Cramer, W. D.

    2017-12-01

    In-situ observations in the magnetotail are sparse and limited to single point measurements. In the ionosphere, on the other hand, there is a broad range of observations, including magnetometers, auroral imagers, and various radars. Since the ionosphere is to some extent a mirror of plasmasheet processes, it can be used as a monitor of magnetotail dynamics. Thus, it is of great importance to understand the coupling between the ionosphere and the magnetosphere in order to properly interpret the ionosphere and ground observations in terms of magnetotail dynamics. For this purpose, the global magnetohydrodynamic model OpenGGCM is used to investigate magnetosphere-ionosphere coupling. One of the key processes in magnetotail dynamics is bursty bulk flows (BBFs), which are the major means by which momentum and energy get transferred through the magnetotail and down to the ionosphere. BBFs are often manifested in the ionosphere as auroral streamers. This study focuses on mapping such flow bursts from the magnetotail to the ionosphere along the magnetic field lines for three states of the magnetotail: pre-substorm onset through substorm expansion and during steady magnetospheric convection (SMC) following the substorm. We find that the orientation of streamers in the ionosphere differs for different local times, and that, for both tail and ionospheric signatures, activity increases during the SMC configuration compared to the pre-onset and quiet times. We also find that the background convection in the tail impacts the direction and deflection of the BBFs and the subsequent orientation of the auroral streamers in the ionosphere.

  3. Physical description of nuclear materials identification system (NMIS) signatures

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; Mullens, J.A.; Mattingly, J.K.; Valentine, T.E.

    2000-01-01

    This paper describes all time and frequency analysis parameters measured with a new correlation processor (capability up to 1 GHz sampling rates and up to five input data channels) for three input channels: (1) the 252 Cf source ionization chamber; (2) a detection channel; and (3) a second detection channel. An intuitive and physical description of the various measured quantities is given, as well as a brief mathematical description and a brief description of how the data are acquired. If the full five-channel capability is used, the measured quantities increase in number but not in type. The parameters provided by this new processor can be divided into two general classes: time analysis signatures and their related frequency analysis signatures. The time analysis signatures include the number of times m pulses occur in a time interval that is triggered randomly, upon a detection event, or upon a source fission event. From the number of pulses in a time interval, the moments, factorial moments, and Feynman variance can be obtained. Recent implementations of third- and fourth-order time and frequency analysis signatures in this processor are also briefly described. Thus, this processor used with a timed source of input neutrons contains all of the information from a pulsed neutron measurement, one- and two-detector Rossi-α measurements, multiplicity measurements, and third- and fourth-order correlation functions. This processor, although originally designed for active measurements with a 252 Cf interrogating source, has been successfully used passively (without a 252 Cf source) for systems with inherent neutron sources such as fissile systems of plutonium. Data from active measurements with an 18.75 kg highly enriched uranium (93.2 wt%, 235 U) metal casting for storage are presented to illustrate some of the various time and frequency analysis parameters. This processor, which is a five-channel time correlation analyzer with time channel widths
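The count-distribution statistics named in the abstract (moments, factorial moments, and the Feynman variance) can be sketched numerically. The Poisson gate counts below are an illustrative assumption, not NMIS data; for a purely Poisson source the Feynman excess variance is zero, while correlated fission chains would drive it positive.

```python
import numpy as np

# Illustrative only: Poisson counts stand in for randomly triggered gate
# counts; real NMIS data would come from the correlation processor.
rng = np.random.default_rng(0)
counts = rng.poisson(lam=4.0, size=100_000)   # counts per counting gate

def factorial_moment(n, r):
    """r-th factorial moment E[n(n-1)...(n-r+1)] of the count distribution."""
    n = np.asarray(n, dtype=float)
    prod = np.ones_like(n)
    for k in range(r):
        prod *= n - k
    return prod.mean()

def feynman_y(n):
    """Feynman excess variance Y = var(n)/mean(n) - 1 (zero for a Poisson
    source, positive in the presence of correlated fission chains)."""
    n = np.asarray(n, dtype=float)
    return n.var() / n.mean() - 1.0

m1 = factorial_moment(counts, 1)   # mean; ~4 here
m2 = factorial_moment(counts, 2)   # E[n(n-1)]; ~16 for Poisson with mean 4
y = feynman_y(counts)              # ~0 for Poisson
print(m1, m2, y)
```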

  4. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids, housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system
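The periodic health checks described above amount to comparing measured sensor values against installation baselines. A minimal sketch of such a check; all sensor names, resistance values, and the 5% tolerance are invented for illustration and are not MFTF records:

```python
# Hypothetical baseline-comparison check: flag any sensor whose measured
# resistance drifts more than a relative tolerance from its install baseline.
baseline = {"TC-01": 100.0, "SG-02": 350.0, "LH-03": 47.0}   # ohms at install
measured = {"TC-01": 100.4, "SG-02": 391.0, "LH-03": 46.8}   # monthly check

TOLERANCE = 0.05   # flag sensors drifting more than 5% from baseline

unhealthy = [
    name for name, r0 in baseline.items()
    if abs(measured[name] - r0) / r0 > TOLERANCE
]
print(unhealthy)   # -> ['SG-02']
```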

  5. Seismic verification of underground explosions

    International Nuclear Information System (INIS)

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 150 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions, and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper

  6. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
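One simple verification invariant of the kind the abstract calls for is elemental mass balance: whatever species mole numbers an equilibrium solver returns, the stoichiometry matrix applied to them must reproduce the input elemental abundances. A minimal sketch with an invented H/O system and a hypothetical solver output (not the actual tool's check):

```python
import numpy as np

# Stoichiometry matrix A (elements x species) for an invented H2/O2/H2O system.
A = np.array([
    [2.0, 0.0, 2.0],   # H atoms in H2, O2, H2O
    [0.0, 2.0, 1.0],   # O atoms in H2, O2, H2O
])
b = np.array([2.0, 1.4])         # elemental abundances fed to the solver
n = np.array([0.2, 0.3, 0.8])    # hypothetical solver output: moles of species

# Verification check: the computed equilibrium must conserve every element.
residual = A @ n - b
ok = bool(np.all(np.abs(residual) < 1e-9))
print(residual, ok)
```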

  7. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
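Byzantine fault-tolerant clock synchronization algorithms of the family discussed above typically bound the influence of faulty clocks by discarding extreme readings. The sketch below shows generic fault-tolerant averaging (trim the f lowest and f highest readings, requiring n >= 3f + 1 clocks); it is an illustration of the idea, not the specific algorithm verified with EHDM:

```python
def fault_tolerant_average(readings, f):
    """Fault-tolerant averaging: discard the f lowest and f highest clock
    readings and average the rest. With n >= 3f + 1 clocks, up to f
    arbitrarily faulty (Byzantine) readings cannot drag the average outside
    the range of the correct clocks."""
    if len(readings) < 3 * f + 1:
        raise ValueError("need at least 3f + 1 clock readings")
    trimmed = sorted(readings)[f:len(readings) - f]
    return sum(trimmed) / len(trimmed)

# Four clocks, one of which reports a wildly faulty value; the faulty
# reading is trimmed away and the average stays near the honest clocks.
print(fault_tolerant_average([10.0, 10.2, 9.9, 1e6], f=1))
```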

  8. Design And Implementation of Low Area/Power Elliptic Curve Digital Signature Hardware Core

    Directory of Open Access Journals (Sweden)

    Anissa Sghaier

    2017-06-01

    Full Text Available The Elliptic Curve Digital Signature Algorithm (ECDSA) is the analog to the Digital Signature Algorithm (DSA). Based on elliptic curves, which use small keys compared to other public-key algorithms, ECDSA is the most suitable scheme for environments where processor power and storage are limited. This paper focuses on the hardware implementation of the ECDSA over elliptic curves with the 163-bit key length recommended by the NIST (National Institute of Standards and Technology). It offers two services: signature generation and signature verification. The proposed processor integrates an ECC IP, a Secure Hash Standard 2 IP (SHA-2 IP) and a Random Number Generator IP (RNG IP). Thus, all IPs will be optimized, and different types of RNG will be implemented in order to choose the most appropriate one. A co-simulation was done to verify the ECDSA processor using MATLAB software. All modules were implemented on a Xilinx Virtex 5 ML 50 FPGA platform; they require respectively 9670 slices, 2530 slices and 18,504 slices. FPGA implementations generally represent the first step toward obtaining faster ASIC implementations. Further, the proposed design was also implemented on an ASIC CMOS 45-nm technology; it requires a 0.257 mm2 area cell achieving a maximum frequency of 532 MHz and consumes 63.444 mW. Furthermore, in this paper, we analyze the security of our proposed ECDSA processor against the no-correctness-check-for-input-points and restart attacks.
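The two services the processor offers, signature generation and verification, follow the standard ECDSA equations: on signing, s = k^-1 (z + r d) mod n with r the x-coordinate of kG; on verification, a check that the x-coordinate of (z s^-1)G + (r s^-1)Q reproduces r. A self-contained software sketch over secp256k1, chosen only because its parameters are widely published; the paper's hardware targets a NIST 163-bit curve instead:

```python
import hashlib
import secrets

# secp256k1 domain parameters (illustrative curve choice, not the paper's).
P = 2**256 - 2**32 - 977
A = 0
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def add(p1, p2):
    """Elliptic-curve point addition; None is the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, pt):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

def hash_to_int(msg):
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

def sign(d, msg):
    """ECDSA signing: r = (kG).x mod n, s = k^-1 (z + r d) mod n."""
    z = hash_to_int(msg)
    while True:
        k = secrets.randbelow(N - 1) + 1
        r = mul(k, G)[0] % N
        if r == 0:
            continue
        s = pow(k, -1, N) * (z + r * d) % N
        if s != 0:
            return (r, s)

def verify(Q, msg, sig):
    """ECDSA verification: does the x-coordinate of u1*G + u2*Q reproduce r?"""
    r, s = sig
    if not (0 < r < N and 0 < s < N):
        return False
    z, w = hash_to_int(msg), pow(s, -1, N)
    pt = add(mul(z * w % N, G), mul(r * w % N, Q))
    return pt is not None and pt[0] % N == r

d = secrets.randbelow(N - 1) + 1    # private key
Q = mul(d, G)                       # public key
sig = sign(d, b"message to protect")
print(verify(Q, b"message to protect", sig), verify(Q, b"tampered", sig))
```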

  9. Theoretical Characterization of Visual Signatures

    Science.gov (United States)

    Kashinski, D. O.; Chase, G. M.; di Nallo, O. E.; Scales, A. N.; Vanderley, D. L.; Byrd, E. F. C.

    2015-05-01

    We are investigating the accuracy of theoretical models used to predict the visible, ultraviolet, and infrared spectra, as well as other properties, of product materials ejected from the muzzle of currently fielded systems. Recent advances in solid propellants have made the management of muzzle signature (flash) a principal issue in weapons development across the calibers. A priori prediction of the electromagnetic spectra of formulations will allow researchers to tailor blends that yield desired signatures and determine spectrographic detection ranges. Quantum chemistry methods at various levels of sophistication have been employed to optimize molecular geometries, compute unscaled vibrational frequencies, and determine the optical spectra of specific gas-phase species. Electronic excitations are being computed using Time Dependent Density Functional Theory (TD-DFT). A full statistical analysis and reliability assessment of computational results is currently underway. A comparison of theoretical results to experimental values found in the literature is used to assess any effects of functional choice and basis set on calculation accuracy. The status of this work will be presented at the conference. Work supported by the ARL, DoD HPCMP, and USMA.

  10. Intrusion signature creation via clustering anomalies

    Science.gov (United States)

    Hendry, Gilbert R.; Yang, Shanchieh J.

    2008-03-01

    Current practices for combating cyber attacks typically use Intrusion Detection Systems (IDSs) to detect and block multistage attacks. Because of the speed and impacts of new types of cyber attacks, current IDSs are limited in providing accurate detection while reliably adapting to new attacks. In signature-based IDS systems, this limitation is made apparent by the latency from day zero of an attack to the creation of an appropriate signature. This work hypothesizes that this latency can be shortened by creating signatures via anomaly-based algorithms. A hybrid supervised and unsupervised clustering algorithm is proposed for new signature creation. These new signatures created in real-time would take effect immediately, ideally detecting new attacks. This work first investigates a modified density-based clustering algorithm as an IDS, with its strengths and weaknesses identified. A signature creation algorithm leveraging the summarizing abilities of clustering is investigated. Lessons learned from the supervised signature creation are then leveraged for the development of unsupervised real-time signature classification. Automating signature creation and classification via clustering is demonstrated as satisfactory but with limitations.
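The approach above rests on a density-based clustering pass over anomalous records, with each cluster then summarized into a signature. A minimal sketch under assumed details: a small DBSCAN-style implementation and a per-feature min/max "box" as the cluster summary. The actual modified algorithm and signature format in the paper differ; the feature values here are invented.

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Minimal density-based clustering: one label per point (-1 = noise)."""
    n = len(X)
    labels = np.full(n, -1)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i] or len(neighbors[i]) < min_pts:
            continue            # not an unvisited core point
        stack = [i]             # grow a new cluster from core point i
        while stack:
            j = stack.pop()
            if visited[j]:
                continue
            visited[j] = True
            labels[j] = cluster
            if len(neighbors[j]) >= min_pts:    # expand only through cores
                stack.extend(neighbors[j])
        cluster += 1
    return labels

def box_signature(X, labels, c):
    """Summarize cluster c as per-feature [min, max] ranges: a 'box'
    signature that could be matched against future traffic."""
    pts = X[labels == c]
    return np.stack([pts.min(axis=0), pts.max(axis=0)])

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 0.3, size=(40, 2))    # background traffic features
attack = rng.normal(5.0, 0.3, size=(15, 2))    # hypothetical anomaly cluster
X = np.vstack([normal, attack])

labels = dbscan(X, eps=1.0, min_pts=4)
sig = box_signature(X, labels, labels[-1])     # signature of the attack cluster
print(labels[-1], sig)
```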

  11. Institute of Geophysics, Planetary Physics, and Signatures

    Data.gov (United States)

    Federal Laboratory Consortium — The Institute of Geophysics, Planetary Physics, and Signatures at Los Alamos National Laboratory is committed to promoting and supporting high quality, cutting-edge...

  12. On reliable discovery of molecular signatures

    Directory of Open Access Journals (Sweden)

    Björkegren Johan

    2009-01-01

    Full Text Available Abstract Background Molecular signatures are sets of genes, proteins, genetic variants or other variables that can be used as markers for a particular phenotype. Reliable signature discovery methods could yield valuable insight into cell biology and mechanisms of human disease. However, it is currently not clear how to control error rates such as the false discovery rate (FDR in signature discovery. Moreover, signatures for cancer gene expression have been shown to be unstable, that is, difficult to replicate in independent studies, casting doubts on their reliability. Results We demonstrate that with modern prediction methods, signatures that yield accurate predictions may still have a high FDR. Further, we show that even signatures with low FDR may fail to replicate in independent studies due to limited statistical power. Thus, neither stability nor predictive accuracy are relevant when FDR control is the primary goal. We therefore develop a general statistical hypothesis testing framework that for the first time provides FDR control for signature discovery. Our method is demonstrated to be correct in simulation studies. When applied to five cancer data sets, the method was able to discover molecular signatures with 5% FDR in three cases, while two data sets yielded no significant findings. Conclusion Our approach enables reliable discovery of molecular signatures from genome-wide data with current sample sizes. The statistical framework developed herein is potentially applicable to a wide range of prediction problems in bioinformatics.
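FDR control of the kind the framework provides is classically achieved with the Benjamini-Hochberg step-up procedure: sort the p-values, find the largest k with p_(k) <= alpha*k/m, and reject the k smallest. A sketch of that criterion alone, with invented p-values; the paper's testing framework is more elaborate:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Step-up procedure: reject the k smallest p-values, where k is the
    largest index with p_(k) <= alpha * k / m. Controls the FDR at level
    alpha for independent tests."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    passed = p[order] <= alpha * np.arange(1, m + 1) / m
    k = np.flatnonzero(passed).max() + 1 if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

# Hypothetical per-gene p-values from a signature-discovery screen.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.9]
mask = benjamini_hochberg(pvals, alpha=0.05)
print(mask)   # only the two smallest p-values survive FDR control here
```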

  13. An Arbitrated Quantum Signature Scheme without Entanglement*

    International Nuclear Information System (INIS)

    Li Hui-Ran; Luo Ming-Xing; Peng Dai-Yuan; Wang Xiao-Jun

    2017-01-01

    Several quantum signature schemes have recently been proposed to realize secure signatures of quantum or classical messages. Arbitrated quantum signature, as one nontrivial scheme, has attracted great interest because of its usefulness and efficiency. Unfortunately, previous schemes cannot resist Trojan horse and DoS attacks, and lack unforgeability and non-repudiation. In this paper, we propose an improved arbitrated quantum signature to address these security issues with an honest arbitrator. Our scheme uses qubit states rather than entangled states. More importantly, the qubit scheme can achieve unforgeability and non-repudiation. Our scheme is also secure against other known quantum attacks. (paper)

  14. An Arbitrated Quantum Signature Scheme without Entanglement*

    Science.gov (United States)

    Li, Hui-Ran; Luo, Ming-Xing; Peng, Dai-Yuan; Wang, Xiao-Jun

    2017-09-01

    Several quantum signature schemes have recently been proposed to realize secure signatures of quantum or classical messages. Arbitrated quantum signature, as one nontrivial scheme, has attracted great interest because of its usefulness and efficiency. Unfortunately, previous schemes cannot resist Trojan horse and DoS attacks, and lack unforgeability and non-repudiation. In this paper, we propose an improved arbitrated quantum signature to address these security issues with an honest arbitrator. Our scheme uses qubit states rather than entangled states. More importantly, the qubit scheme can achieve unforgeability and non-repudiation. Our scheme is also secure against other known quantum attacks.

  15. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...... the boundaries of automatic verification and the connections between TAPN and the model of timed automata. Finally, we will mention the tool TAPAAL that supports modelling, simulation and verification of TAPN and discuss a small case study of alternating bit protocol....

  16. Privacy Preserving Iris Based Biometric Identity Verification

    Directory of Open Access Journals (Sweden)

    Przemyslaw Strzelczyk

    2011-08-01

    Full Text Available Iris biometrics is considered one of the most accurate and robust methods of identity verification. Individually unique iris features can be presented in a compact binary form easily compared with a reference template to confirm identity. However, when templates or features are disclosed, iris biometrics is no longer suitable for verification. Therefore, there is a need to perform iris feature matching without revealing the features themselves or the reference template. The paper proposes an extension of the standard iris-based verification protocol that introduces a feature and template locking mechanism, which guarantees that no sensitive information is exposed. Article in English
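One way to realize a template locking mechanism of the kind described, though not necessarily the paper's construction, is a fuzzy commitment: a random secret, encoded with an error-correcting code, masks the binary iris code, and only a hash of the secret plus the masked template are stored. A sketch with a simple repetition code; all parameters are invented for illustration:

```python
import hashlib
import secrets
import numpy as np

# Invented parameters: a 16-bit secret bound to an 80-bit binary iris code
# via a rate-1/5 repetition code.
R = 5            # repetition factor: corrects up to 2 bit errors per block
K = 16           # secret length in bits; template length is K * R

def encode(key_bits):
    return np.repeat(key_bits, R)

def decode(code_bits):
    return (code_bits.reshape(-1, R).sum(axis=1) > R // 2).astype(np.uint8)

def lock(template):
    """Bind a fresh random secret to the template. Only the hash of the
    secret and the masked template ('helper data') are stored; neither
    reveals the iris code on its own."""
    key = np.frombuffer(secrets.token_bytes(K), dtype=np.uint8) & 1
    helper = template ^ encode(key)
    return hashlib.sha256(key.tobytes()).digest(), helper

def unlock(stored, probe):
    """A probe close enough to the enrolled template recovers the secret."""
    digest, helper = stored
    key = decode(helper ^ probe)
    return hashlib.sha256(key.tobytes()).digest() == digest

rng = np.random.default_rng(2)
template = rng.integers(0, 2, size=K * R, dtype=np.uint8)
stored = lock(template)

noisy = template.copy()
noisy[[0, 7, 40]] ^= 1              # genuine probe with a few bit errors
imposter = rng.integers(0, 2, size=K * R, dtype=np.uint8)
print(unlock(stored, noisy), unlock(stored, imposter))
```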

  17. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare...... these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey...

  18. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on verification of agents programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL has been originally designed to connect agent programming to agent theory and we present additional results here that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  19. Constraint specialisation in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    We present a method for specialising the constraints in constrained Horn clauses with respect to a goal. We use abstract interpretation to compute a model of a query–answer transformed version of a given set of clauses and a goal. The constraints from the model are then used to compute...... underlying the clauses. Experimental results on verification problems show that this is an effective transformation, both in our own verification tools (based on a convex polyhedra analyser) and as a pre-processor to other Horn clause verification tools....

  20. Detection of Damage in Operating Wind Turbines by Signature Distances

    Directory of Open Access Journals (Sweden)

    James F. Manwell

    2013-01-01

    Full Text Available Wind turbines operate in the atmospheric boundary layer and are subject to complex random loading. This precludes using a deterministic response of healthy turbines as the baseline for identifying the effect of damage on the measured response of operating turbines. In the absence of such a deterministic response, the stochastic dynamic response of the tower to a shutdown maneuver is found to be distinctively affected by damage, in contrast to wind. Such a dynamic response, however, cannot be established for the blades. As an alternative, the estimate of blade damage is sought through its effect on the third or fourth modal frequency, each found to be mostly unaffected by wind. To discern the effect of damage from the wind effect on these responses, a unified method of damage detection is introduced that accommodates different responses. In this method, the dynamic responses are transformed to surfaces via continuous wavelet transforms to accentuate the effect of wind or damage on the dynamic response. Regions of significant deviations between these surfaces are then isolated in their corresponding planes to capture the change signatures. The image distances between these change signatures are shown to produce consistent estimates of damage for both the tower and the blades in the presence of varying wind field profiles.
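The pipeline described, responses transformed to time-scale surfaces, significant deviations isolated, and a distance taken between the resulting change signatures, can be sketched with a simple Ricker-wavelet transform. The signals, widths, and threshold below are illustrative stand-ins for the turbine responses and the paper's actual transform and distance measure:

```python
import numpy as np

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet of width a, sampled at `points` points."""
    t = np.arange(points) - (points - 1) / 2.0
    x = t / a
    return (1 - x**2) * np.exp(-(x**2) / 2)

def cwt(signal, widths):
    """Time-scale surface: one row of wavelet coefficients per width."""
    return np.array([
        np.convolve(signal, ricker(min(10 * w, len(signal)), w), mode="same")
        for w in widths
    ])

def signature_distance(resp_a, resp_b, widths, thresh=0.5):
    """Isolate regions of significant deviation between the two |CWT|
    surfaces (the 'change signature') and return their mean distance."""
    sa, sb = np.abs(cwt(resp_a, widths)), np.abs(cwt(resp_b, widths))
    diff = np.abs(sa - sb)
    mask = diff > thresh * diff.max()
    return float(diff[mask].mean()) if mask.any() else 0.0

t = np.linspace(0.0, 1.0, 400)
healthy = np.sin(2 * np.pi * 12.0 * t)   # stand-in healthy response
damaged = np.sin(2 * np.pi * 10.8 * t)   # stand-in response, frequency shifted
widths = np.arange(1, 16)

print(signature_distance(healthy, healthy, widths),
      signature_distance(healthy, damaged, widths))
```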

  1. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    Energy Technology Data Exchange (ETDEWEB)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, to prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. 
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation

  2. DNA methylation signatures of educational attainment

    Science.gov (United States)

    van Dongen, Jenny; Bonder, Marc Jan; Dekkers, Koen F.; Nivard, Michel G.; van Iterson, Maarten; Willemsen, Gonneke; Beekman, Marian; van der Spek, Ashley; van Meurs, Joyce B. J.; Franke, Lude; Heijmans, Bastiaan T.; van Duijn, Cornelia M.; Slagboom, P. Eline; Boomsma, Dorret I.; BIOS consortium

    2018-03-01

    Educational attainment is a key behavioural measure in studies of cognitive and physical health, and socioeconomic status. We measured DNA methylation at 410,746 CpGs (N = 4152) and identified 58 CpGs associated with educational attainment at loci characterized by pleiotropic functions shared with neuronal, immune and developmental processes. Associations overlapped with those for smoking behaviour, but remained after accounting for smoking at many CpGs: Effect sizes were on average 28% smaller and genome-wide significant at 11 CpGs after adjusting for smoking and were 62% smaller in never smokers. We examined sources and biological implications of education-related methylation differences, demonstrating correlations with maternal prenatal folate, smoking and air pollution signatures, and associations with gene expression in cis, dynamic methylation in foetal brain, and correlations between blood and brain. Our findings show that the methylome of lower-educated people resembles that of smokers beyond effects of their own smoking behaviour and shows traces of various other exposures.

  3. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  5. The PASCAL-HDM Verification System

    Science.gov (United States)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments have empty parentheses.

  6. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.
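Generating verification conditions in the Floyd-Hoare style, as both HDM/Pascal tools above do, reduces for straight-line code to computing weakest preconditions by backward substitution. A toy sketch over assignment sequences written as strings; real Pascal and SPECIAL handling is far richer, and the statement syntax here is invented:

```python
import re

def substitute(pred, var, expr):
    """Q[var := expr]: textual substitution with word-boundary matching."""
    return re.sub(rf"\b{var}\b", f"({expr})", pred)

def wp(stmts, post):
    """Weakest precondition of a list of assignments 'x := e' w.r.t. post,
    computed backwards: wp(x := e, Q) = Q[x := e]."""
    pred = post
    for stmt in reversed(stmts):
        var, expr = (part.strip() for part in stmt.split(":="))
        pred = substitute(pred, var, expr)
    return pred

# Verification condition for {?} x := x + 1; y := x * 2 {y > 4}:
vc = wp(["x := x + 1", "y := x * 2"], "y > 4")
print(vc)   # -> ((x + 1) * 2) > 4
```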

  7. Procedure Verification and Validation Toolset Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research is aimed at investigating a procedure verification and validation toolset, which will allow the engineers who are responsible for developing...

  8. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  9. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed, and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  10. FLEXible Damage Detection and Verification System

    Data.gov (United States)

    National Aeronautics and Space Administration — This project expands on the previously demonstrated Flat Surface Damage Detection System (FSDDS) capabilities.  The Flexible Damage Detection and Verification System...

  11. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    based model checking style of verification. The next paper by D'Souza & Thiagarajan presents an automata-theoretic approach to analysing timing properties of systems. The last paper by Mohalik and Ramanujam presents the assumption.

  12. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  13. Signatures of non-adiabatic dynamics in the fine-structure state distributions of the OH(X~/A~) products in the B-band photodissociation of H2O

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Linsen [Key Laboratory of Mesoscopic Chemistry, School of Chemistry and Chemical Engineering, Institute of Theoretical and Computational Chemistry, Nanjing University, Nanjing 210093 (China); Xie, Daiqian, E-mail: dqxie@nju.edu.cn, E-mail: hguo@unm.edu [Key Laboratory of Mesoscopic Chemistry, School of Chemistry and Chemical Engineering, Institute of Theoretical and Computational Chemistry, Nanjing University, Nanjing 210093 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Guo, Hua, E-mail: dqxie@nju.edu.cn, E-mail: hguo@unm.edu [Department of Chemistry and Chemical Biology, University of New Mexico, Albuquerque, New Mexico 87131 (United States)

    2015-03-28

    A detailed quantum mechanical characterization of the photodissociation dynamics of H2O at 121.6 nm is presented. The calculations were performed using a full-dimensional wave packet method on coupled potential energy surfaces of all relevant electronic states. Our state-to-state model permits a detailed analysis of the OH(X~/A~) product fine-structure populations as a probe of the non-adiabatic dissociation dynamics. The calculated rotational state distributions of the two Λ-doublet levels of OH(X~, v = 0) exhibit very different characteristics. The A′ states, produced mostly via the B~→X~ conical intersection pathway, have significantly higher populations than the A″ counterparts, which are primarily from the B~→A~ Renner-Teller pathway. The former features a highly inverted and oscillatory rotational state distribution, while the latter has a smooth distribution with much less rotational excitation. In good agreement with experiment, the calculated total OH(X~) rotational state distribution and anisotropy parameters show clear even-odd oscillations, which can be attributed to a quantum mechanical interference between waves emanating from the HOH and HHO conical intersections in the B~→X~ non-adiabatic pathway. On the other hand, the experiment-theory agreement for the OH(A~) fragment is also satisfactory, although some small quantitative differences suggest remaining imperfections of the ab initio based potential energy surfaces.

  14. Library stock verification: a ritual and an occupational hazard

    OpenAIRE

    Sridhar, M. S.

    1991-01-01

    Explains the sensitive, controversial exercise of stock verification as an occupational hazard and a postmortem; emphasises the need for clarity of objectives and procedures regarding stock verification and responsibility for losses; points out that the cost of stock verification often far exceeds the benefits; highlights norms and procedures of stock verification for Government of India institutions; discusses advantages and various methods and procedures of physical verification; puts forth pre...

  15. Spectral signature of interstellar grains.

    Science.gov (United States)

    Léger, A.

    Our knowledge of the nature of interstellar grains has rested on a very small number of spectral signatures in the extinction curve of the interstellar medium. Considerable information is contained in the 40 diffuse interstellar bands in the visible, but it remains unexploited. The recent interpretation of the five IR emission bands in terms of polycyclic aromatic hydrocarbon molecules is developed here. It makes available spectroscopic information comparable, on its own, to everything on which our knowledge of condensed interstellar matter had previously been based. Various implications of this finding are proposed.

  16. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  17. Optimal Information-Theoretic Wireless Location Verification

    OpenAIRE

    Yan, Shihao; Malaney, Robert; Nevat, Ido; Peters, Gareth W.

    2012-01-01

    We develop a new Location Verification System (LVS) focussed on network-based Intelligent Transport Systems and vehicular ad hoc networks. The algorithm we develop is based on an information-theoretic framework which uses the received signal strength (RSS) from a network of base-stations and the claimed position. Based on this information we derive the optimal decision regarding the verification of the user's location. Our algorithm is optimal in the sense of maximizing the mutual information...
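The RSS-based decision described above can be illustrated with a simple residual test under a log-distance path-loss model with Gaussian shadowing. This is a hedged sketch, not the authors' information-theoretic optimum; all parameter values below (transmit power, path-loss exponent, shadowing deviation, threshold, base-station layout) are invented for illustration.

```python
import numpy as np

def expected_rss(tx_power_dbm, d_m, path_loss_exp=3.0, d0_m=1.0):
    """Mean RSS at distance d under a log-distance path-loss model."""
    return tx_power_dbm - 10.0 * path_loss_exp * np.log10(d_m / d0_m)

def verify_location(claimed_pos, bs_positions, observed_rss,
                    tx_power_dbm=20.0, sigma_db=6.0, threshold=1.0):
    """Accept the claimed position if the normalized RSS residual is small."""
    d = np.linalg.norm(bs_positions - claimed_pos, axis=1)
    mu = expected_rss(tx_power_dbm, d)
    # Average squared residual, normalized by the shadowing std. dev.
    stat = np.mean(((observed_rss - mu) / sigma_db) ** 2)
    return bool(stat < threshold)

bs = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])    # base stations
true_pos = np.array([30.0, 40.0])
rng = np.random.default_rng(0)
obs = expected_rss(20.0, np.linalg.norm(bs - true_pos, axis=1)) \
      + rng.normal(0.0, 2.0, size=3)                       # measured RSS
print(verify_location(true_pos, bs, obs))                  # True: honest claim
print(verify_location(np.array([150.0, 150.0]), bs, obs))  # False: spoofed claim
```

A genuinely spoofed position predicts RSS values that disagree with the measurements at several base stations simultaneously, which is what drives the residual above the threshold.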

  18. Location Verification Systems Under Spatially Correlated Shadowing

    OpenAIRE

    Yan, Shihao; Nevat, Ido; Peters, Gareth W.; Malaney, Robert

    2014-01-01

    The verification of the location information utilized in wireless communication networks is a subject of growing importance. In this work we formally analyze, for the first time, the performance of a wireless Location Verification System (LVS) under the realistic setting of spatially correlated shadowing. Our analysis illustrates that anticipated levels of correlated shadowing can lead to a dramatic performance improvement of a Received Signal Strength (RSS)-based LVS. We also analyze the per...

  19. Scalable hardware verification with symbolic simulation

    CERN Document Server

    Bertacco, Valeria

    2006-01-01

    An innovative presentation of the theory of disjoint support decomposition, presenting novel results and algorithms, plus original and up-to-date techniques in formal verificationProvides an overview of current verification techniques, and unveils the inner workings of symbolic simulationFocuses on new techniques that narrow the performance gap between the complexity of digital systems and the limited ability to verify themAddresses key topics in need of future research.

  20. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  1. Approaches to Formal Verification of Security Protocols

    OpenAIRE

    Lal, Suvansh; Jain, Mohit; Chaplot, Vikrant

    2011-01-01

    In recent times, many protocols have been proposed to provide security for various information and communication systems. Such protocols must be tested for their functional correctness before they are used in practice. Application of formal methods for verification of security protocols would enhance their reliability thereby, increasing the usability of systems that employ them. Thus, formal verification of security protocols has become a key issue in computer and communications security. In...

  2. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    International audience; Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  3. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal models. A survey of current practices and techniques was undertaken and evaluated using these criteria, with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made.

  4. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Science.gov (United States)

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  5. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)]

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between experimental and theoretical work made meaningful verifications impossible in some cases. Therefore verification calculations were focused on the catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. Verification calculations concentrated mainly on physical phenomena like vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  6. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
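The combination of statistical sampling with NDA measurement rests on a standard attribute-sampling calculation: the probability that a random sample of n items from an inventory of N catches at least one of d diverted items is hypergeometric. A small sketch (the inventory sizes below are illustrative, not taken from the report):

```python
from math import comb

def detection_probability(N, d, n):
    """P(a random sample of n items contains at least one of the
    d diverted items), drawing without replacement from N items."""
    if d == 0:
        return 0.0
    if n > N - d:        # the sample cannot consist solely of clean items
        return 1.0
    return 1.0 - comb(N - d, n) / comb(N, n)

def required_sample_size(N, d, target=0.95):
    """Smallest sample size reaching the target detection probability."""
    for n in range(1, N + 1):
        if detection_probability(N, d, n) >= target:
            return n
    return N

# Sampling 200 of 500 items detects 5 diverted items with ~92% probability
print(round(detection_probability(500, 5, 200), 3))   # 0.923
print(required_sample_size(500, 5))                   # sample size for 95%
```

This is why sampling alone needs large inspection effort for small diversions, and why the paper pairs it with integral reactivity measurements sensitive to the total fissile inventory.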

  7. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
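Ruud's method evaluates marginal verification risks to decide where the next examination buys the most. The actual risk function depends on the system topology and requirements; as a loose, hypothetical illustration only (not the paper's formulation), a greedy allocator in which each examination is assumed to halve a subsystem's residual risk:

```python
def allocate_examinations(subsystem_risks, effort_budget, reduction=0.5):
    """Greedy sketch: repeatedly examine the subsystem with the largest
    residual verification risk; each examination multiplies that risk
    by `reduction`. Ties resolve in dict insertion order."""
    risks = dict(subsystem_risks)
    plan = []
    for _ in range(effort_budget):
        target = max(risks, key=risks.get)
        plan.append(target)
        risks[target] *= reduction
    return plan, sum(risks.values())

plan, residual = allocate_examinations(
    {"propulsion": 0.4, "steering": 0.2, "power": 0.1}, effort_budget=4)
print(plan)                  # the riskiest subsystem is examined most often
print(round(residual, 3))    # 0.25
```

The stop criterion in the paper plays the role of the fixed `effort_budget` here: examinations continue until the residual risk (or a related statistic) falls below an agreed level.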

  8. Prediction of soil effects on GPR signatures

    NARCIS (Netherlands)

    Rhebergen, J.B.; Lensen, H.A.; Wijk, C.V. van; Hendrickx, J.M.H.; Dam, R. van; Borchers, B.

    2004-01-01

    In previous work we have shown that GPR signatures are affected by soil texture and soil water content. In this contribution we will use a three dimensional electromagnetic model and a hydrological soil model to explore in more detail the relationships between GPR signatures, soil physical

  9. Measuring ship acoustic signatures against mine threat

    NARCIS (Netherlands)

    Jong, C.A.F. de; Quesson, B.A.J.; Ainslie, M.A.; Vermeulen, R.C.N.

    2012-01-01

    The NATO standard ‘AMP-15’ [1] provides procedures for the measurement and reporting of the acoustic signature of ships and for the establishment of acoustic signature goals to counter the naval mine threat. Measurements are carried out at dedicated shallow water acoustic ranges. Measurements

  10. The Pedagogic Signature of the Teaching Profession

    Science.gov (United States)

    Kiel, Ewald; Lerche, Thomas; Kollmannsberger, Markus; Oubaid, Viktor; Weiss, Sabine

    2016-01-01

    Lee S. Shulman deplores that the field of education as a profession does not have a pedagogic signature, which he characterizes as a synthesis of cognitive, practical and moral apprenticeship. In this context, the following study has three goals: 1) In the first theoretical part, the basic problems of constructing a pedagogic signature are…

  11. Infrared ship signature analysis and optimisation

    NARCIS (Netherlands)

    Neele, F.P.

    2005-01-01

    The last decade has seen an increase in the awareness of the infrared signature of naval ships. New ship designs show that infrared signature reduction measures are being incorporated, such as exhaust gas cooling systems, relocation of the exhausts and surface cooling systems. Hull and

  12. Real time gamma-ray signature identifier

    Science.gov (United States)

    Rowland, Mark [Alamo, CA; Gosnell, Tom B [Moraga, CA; Ham, Cheryl [Livermore, CA; Perkins, Dwight [Livermore, CA; Wong, James [Dublin, CA

    2012-05-15

    A real-time gamma-ray signature/source identification method and system that uses principal components analysis (PCA) to transform and substantially reduce one or more comprehensive spectral libraries of nuclear material types and configurations into concise representations/signatures in principal component (PC) space, representing and indexing each individual predetermined spectrum. An unknown gamma-ray signature may then be compared against the representative signatures to find a match, or at least be characterized against all entries in the library, with a single regression or simple projection into the PC space, substantially reducing processing time and computing resources and enabling real-time characterization and/or identification.
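The core idea (index a library in PC space, then identify an unknown with a single projection) can be sketched with NumPy. The four random "template spectra" and isotope labels below are stand-ins, not real library data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy library: 4 template spectra of 64 channels each (random stand-ins)
library = rng.random((4, 64))
labels = ["U-235", "Pu-239", "Cs-137", "Co-60"]

# Build the PC space from the library (mean-centering + SVD)
mean = library.mean(axis=0)
U, S, Vt = np.linalg.svd(library - mean, full_matrices=False)
components = Vt[:3]                            # keep 3 principal components
signatures = (library - mean) @ components.T   # concise per-spectrum signatures

def identify(spectrum):
    """Project an unknown spectrum into PC space once and return
    the nearest library entry."""
    z = (spectrum - mean) @ components.T
    return labels[int(np.argmin(np.linalg.norm(signatures - z, axis=1)))]

# A noisy measurement of the Cs-137 template is still matched correctly
noisy = library[2] + rng.normal(0.0, 0.05, size=64)
print(identify(noisy))   # Cs-137
```

The library is projected once, offline; each unknown spectrum then costs only one matrix-vector product plus a nearest-neighbor search in the low-dimensional PC space, which is what makes real-time operation plausible.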

  13. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    Science.gov (United States)

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy

  14. Verification of excess defense material

    International Nuclear Information System (INIS)

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-01-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials

  15. Verification of excess defense material

    International Nuclear Information System (INIS)

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1999-01-01

    In the post-Cold War period, interest in having the International Atomic Energy Agency (IAEA) use its expertise in support of the arms control and disarmament process has grown. Various pledges by the U.S. and Russian presidents to place former defense materials under some type of international monitoring raises the prospect of using IAEA safeguards concepts and approaches for monitoring these materials, which may include both classified and unclassified materials. Clearly, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA monitoring is the conflict between traditional IAEA materials accounting procedures and various national and international constraints, including U.S. classification laws, and the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Possible 'verification' approaches to classified former defense materials could be based on item accountancy, 'attributes measurements', and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical feasibility or for their possible acceptability in an inspection regime. Substantial work remains in these areas. This paper examines some of the challenges presented by international inspections of classified materials. (author)

  16. Tomotherapy: IMRT and tomographic verification

    International Nuclear Information System (INIS)

    Mackie, T.R.

    2000-01-01

include MLCs, and many clinics use them to replace 90% or more of the field-shaping requirements of conventional radiotherapy. Now, several academic centers are treating patients with IMRT using conventional MLCs to modulate the field. IMRT using conventional MLCs has the advantage that the patient is stationary during the treatment and the MLCs can be used in conventional practice. Nevertheless, tomotherapy using the Peacock system delivers the most conformal dose distributions of any commercial system to date. The biggest limitation of both the NOMOS Peacock tomotherapy system and conventional MLCs for IMRT delivery is the lack of treatment verification. In conventional few-field radiotherapy one relied on portal images to determine whether the patient was set up correctly and the beams were correctly positioned. With IMRT, the image contrast is superimposed on the beam intensity variation. Conventional practice allowed for monitor unit calculation checks and point dosimeters placed on the patient's surface to verify that the treatment was properly delivered. With IMRT it is impossible to perform hand calculations of monitor units, and dosimeters placed on the patient's surface are prone to error due to high gradients in the beam intensity. NOMOS has developed a verification phantom that allows multiple sheets of film to be placed in a light-tight box that is irradiated with the same beam pattern that is used to treat the patient. The optical densities of the films are adjusted, normalized, and calibrated and then quantitatively compared with the dose calculated for the phantom delivery. However, this process is too laborious to be used for patient-specific QA. If IMRT becomes ubiquitous and it can be shown that IMRT is useful for most treatment sites, then there is a need to design treatment units dedicated to IMRT delivery and verification. Helical tomotherapy is such a redesign.
Helical tomotherapy is the delivery of a rotational fan beam while the patient is

  17. Dosimetric verification of IMRT plans

    International Nuclear Information System (INIS)

    Bulski, W.; Cheimicski, K.; Rostkowska, J.

    2012-01-01

    Intensity modulated radiotherapy (IMRT) is a complex procedure requiring proper dosimetric verification. IMRT dose distributions are characterized by steep dose gradients which make it possible to spare organs at risk and allow for an escalation of the dose to the tumor. They require a large number of radiation beams (sometimes over 10). Fluence measurements for individual beams are not sufficient to evaluate the total dose distribution and to assure patient safety. The methods used at the Centre of Oncology in Warsaw are presented. In order to measure dose distributions in various cross-sections, film dosimeters were used (radiographic Kodak EDR2 films and radiochromic Gafchromic EBT films). The film characteristics were carefully examined. Several types of tissue-equivalent phantoms were developed. A methodology for comparing measured dose distributions against the distributions calculated by treatment planning systems (TPS) was developed and tested. The tolerance level for this comparison was set at 3% difference in dose and 3 mm in distance to agreement, using the so-called gamma formalism. The results of these comparisons for a group of over 600 patients are presented. Agreement was found in 87% of cases. This film dosimetry methodology was used as a benchmark to test and validate the performance of commercially available 2D and 3D matrices of detectors (ionization chambers or diodes). The results of these validations are also presented. (authors)
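The 3%/3 mm gamma comparison can be sketched in one dimension: each reference point passes if some evaluated point lies within a combined dose-difference/distance-to-agreement ellipse. This is a simplified, global-normalization gamma evaluation with a brute-force search, not the Centre's clinical software, and the profile values are invented:

```python
import numpy as np

def gamma_index_1d(dose_ref, dose_eval, spacing_mm, dta_mm=3.0, dd_pct=3.0):
    """Simplified 1-D gamma evaluation with global dose normalization."""
    x = np.arange(len(dose_ref)) * spacing_mm
    dd = dd_pct / 100.0 * dose_ref.max()      # 3% of the global maximum
    gamma = np.empty(len(dose_ref))
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        # Search all evaluated points; keep the smallest combined distance
        g2 = ((x - xi) / dta_mm) ** 2 + ((dose_eval - di) / dd) ** 2
        gamma[i] = np.sqrt(g2.min())
    return gamma

ref = np.array([10, 50, 90, 100, 90, 50, 10], float)
ev  = ref + np.array([0, 2, -2, 1, -1, 2, 0], float)   # small dose errors
g = gamma_index_1d(ref, ev, spacing_mm=2.0)
print((g <= 1.0).mean())   # pass rate: 1.0, every point passes 3%/3 mm
```

A point passes when gamma ≤ 1; the fraction of passing points is the pass rate that underlies statements like "agreement was found in 87% of cases."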

  18. Implementing Signature Neural Networks with Spiking Neurons.

    Science.gov (United States)

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to developing realistic Artificial Neural Networks (ANNs). Unlike traditional firing-rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have discovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings providing novel elements of inspiration? This is an intriguing question for the research community, and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work we adapt the core concepts of the recently proposed Signature Neural Network paradigm (i.e., neural signatures to identify each unit in the network, local information contextualization during processing, and multicoding strategies for information propagation regarding the origin and the content of the data) to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms have been used yet in the context of ANNs of spiking neurons. This paper provides a proof-of-concept for their applicability in such networks. Computer simulations show that a simple network model like the one discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces.
As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the absence
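The record does not spell out the authors' network equations, but the precise spike timing that spiking models code with can be illustrated by a minimal leaky integrate-and-fire neuron (a textbook sketch, unrelated to the Signature Neural Network implementation itself):

```python
def lif_spike_times(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Minimal leaky integrate-and-fire neuron (Euler steps of dt ms);
    returns the precise spike times in ms."""
    v, spikes = 0.0, []
    for step, i_in in enumerate(input_current):
        v += dt / tau * (-v + i_in)   # leaky integration toward i_in
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset               # fire and reset
    return spikes

# A constant suprathreshold drive yields a perfectly regular spike train:
spikes = lif_spike_times([1.5] * 200)
print(spikes[:3])   # [21.0, 43.0, 65.0]
```

In a spiking ANN, information is carried by exactly these spike times rather than by an average firing rate, which is what makes signature-like temporal patterns possible.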

  19. A complementary dual-modality verification for tumor tracking on a gimbaled linac system.

    Science.gov (United States)

    Poels, Kenneth; Depuydt, Tom; Verellen, Dirk; Engels, Benedikt; Collen, Christine; Heinrich, Steffen; Duchateau, Michael; Reynders, Truus; Leysen, Katrien; Boussaer, Marlies; Steenbeke, Femke; Tournel, Koen; Gevaert, Thierry; Storme, Guy; De Ridder, Mark

    2013-12-01

    For dynamic tracking of moving tumors, robust intra-fraction verification was required to assure that tumor motion was properly managed during the course of radiotherapy. A dual-modality verification system, consisting of an on-board orthogonal kV and planar MV imaging device, was validated and applied retrospectively to patient data. Real-time tumor tracking (RTTT) was managed by applying PAN and TILT angular corrections to the therapeutic beam using a gimbaled linac. In this study, orthogonal X-ray imaging and MV EPID fluoroscopy were acquired simultaneously. The tracking beam position was derived from real-time gimbals log files and from the detected field outline on the EPID, respectively. For both imaging modalities, the moving target was localized by detection of an implanted fiducial. The dual-modality tracking verification was validated against a high-precision optical camera in phantom experiments and applied to clinical tracking data from a liver and two lung cancer patients. Both verification modalities showed high accuracy. Patient tracking showed a 90th percentile error (E90) of 3.45 mm (liver), 2.44 mm (lung A) and 3.40 mm (lung B) based on EPID fluoroscopy, in good agreement with the X-ray log-file data (E90 of 3.13, 1.92 and 3.33 mm, respectively) during beam-on. Dual-modality verification was successfully implemented, offering the possibility of detailed reporting on RTTT performance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
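The E90 metric reported above is simply the 90th percentile of the per-frame Euclidean distance between the target and the tracking beam. A sketch (the sinusoidal breathing trace, one-frame latency, and noise level below are invented, not patient data):

```python
import numpy as np

def e90(target_positions, beam_positions):
    """90th percentile of the per-frame Euclidean tracking error (mm)."""
    err = np.linalg.norm(np.asarray(target_positions)
                         - np.asarray(beam_positions), axis=1)
    return float(np.percentile(err, 90))

t = np.arange(0.0, 30.0, 0.1)                       # 30 s sampled at 10 Hz
target = np.column_stack([8.0 * np.sin(2.0 * np.pi * t / 4.0),
                          np.zeros_like(t)])        # 8 mm, 4 s breathing cycle
rng = np.random.default_rng(3)
# Beam lags the target by one frame and carries localization noise
beam = np.vstack([target[:1], target[:-1]]) + rng.normal(0.0, 0.3, target.shape)
print(round(e90(target, beam), 2))                  # dominated by the lag
```

Percentile-based reporting is robust to a few outlier frames, which is why E90 (rather than a maximum error) is the headline number in the abstract.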

  20. Status on the Verification of Combustion Stability for the J-2X Engine Thrust Chamber Assembly

    Science.gov (United States)

    Casiano, Matthew; Hinerman, Tim; Kenny, R. Jeremy; Hulka, Jim; Barnett, Greg; Dodd, Fred; Martin, Tom

    2013-01-01

    Development is underway of the J-2X engine, a liquid oxygen/liquid hydrogen rocket engine for use on the Space Launch System. Engine E10001 began hot fire testing in June 2011 and testing will continue with subsequent engines. The J-2X engine main combustion chamber contains both acoustic cavities and baffles. These stability aids are intended to dampen the acoustics in the main combustion chamber. Verification of the engine thrust chamber stability is determined primarily by examining experimental data using a dynamic stability rating technique; however, additional requirements were included to guard against any spontaneous instability or rough combustion. Startup and shutdown chug oscillations are also characterized for this engine. This paper details the stability requirements and verification including low- and high-frequency dynamics, a discussion on sensor selection and sensor port dynamics, and the process developed to assess combustion stability. A status on the stability results is also provided and discussed.

  1. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to secure the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
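
The integrity-plus-timestamp idea can be sketched in a few lines. This is a simplified stand-in, not the paper's protocol: PKIDEV uses public-key signatures and a trusted time-stamping authority, whereas the sketch below binds an evidence hash and a timestamp with a symmetric HMAC key (the key and evidence bytes are hypothetical):

```python
# Sketch: hash the evidence, then bind the hash to a timestamp with a keyed
# tag. Real deployments would use PKI signatures; HMAC is a stand-in here.
import hashlib, hmac, json, time

def seal_evidence(data: bytes, key: bytes) -> dict:
    record = {
        "sha256": hashlib.sha256(data).hexdigest(),
        "timestamp": int(time.time()),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["tag"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return record

def verify_evidence(data: bytes, key: bytes, record: dict) -> bool:
    claimed = {k: record[k] for k in ("sha256", "timestamp")}
    payload = json.dumps(claimed, sort_keys=True).encode()
    ok_tag = hmac.compare_digest(
        record["tag"], hmac.new(key, payload, hashlib.sha256).hexdigest())
    return ok_tag and claimed["sha256"] == hashlib.sha256(data).hexdigest()

key = b"escrowed-verification-key"  # hypothetical
evidence = b"disk image bytes ..."  # hypothetical
rec = seal_evidence(evidence, key)
print(verify_evidence(evidence, key, rec))         # True
print(verify_evidence(evidence + b"x", key, rec))  # False: tampering detected
```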

  2. Analysis of Radar Doppler Signature from Human Data

    Directory of Open Access Journals (Sweden)

    M. ANDRIĆ

    2014-04-01

    Full Text Available This paper presents the results of time (autocorrelation) and time-frequency (spectrogram) analyses of radar signals returned from moving human targets. When a radar signal falls on a human target moving toward or away from the radar, the signals reflected from different parts of the body produce a Doppler shift proportional to the velocity of those parts. The moving parts of the body cause a characteristic Doppler signature. The main contribution comes from the torso, which causes the central Doppler frequency of the target. The motion of arms and legs induces modulation on the returned radar signal and generates sidebands around the central Doppler frequency, referred to as micro-Doppler signatures. Analyses of experimental data demonstrated that human motion signatures are extracted more reliably using the spectrogram. While the central Doppler frequency can be determined using either the autocorrelation or the spectrogram, extraction of the fundamental cadence frequency using the autocorrelation is unreliable when the target is in the presence of clutter. It was shown that the fundamental cadence frequency increases with increasingly dynamic movement of people, and that the possibility of its extraction is proportional to the degree of synchronization of the movements of persons in the group.
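
The extraction of a central Doppler frequency can be illustrated with a toy discrete Fourier transform: locate the strongest spectral peak of the return signal. This is an illustrative stand-in for the paper's spectrogram analysis, with hypothetical signal parameters (a 120 Hz "torso" line plus a weaker 200 Hz "limb" component):

```python
# Sketch: finding the dominant (torso) Doppler line in a simulated radar
# return via a brute-force DFT. All signal parameters are hypothetical.
import cmath, math

fs = 1000.0      # sampling rate, Hz
n = 512
f_torso = 120.0  # dominant torso Doppler shift, Hz
f_limb = 200.0   # weaker micro-Doppler component, Hz
sig = [math.cos(2 * math.pi * f_torso * t / fs)
       + 0.3 * math.cos(2 * math.pi * f_limb * t / fs) for t in range(n)]

def dft_peak_hz(x, fs):
    """Return the frequency of the strongest DFT bin below Nyquist."""
    half = len(x) // 2
    mags = [abs(sum(v * cmath.exp(-2j * math.pi * k * i / len(x))
                    for i, v in enumerate(x))) for k in range(half)]
    return mags.index(max(mags)) * fs / len(x)

print(dft_peak_hz(sig, fs))  # close to 120 Hz (limited by bin spacing)
```

The spectrogram in the paper is essentially this DFT applied to short sliding windows, which adds the time dimension needed to see the limb cadence.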

  3. Maximizing biomarker discovery by minimizing gene signatures

    Directory of Open Access Journals (Sweden)

    Chang Chang

    2011-12-01

    Full Text Available Abstract Background The use of gene signatures can potentially be of considerable value in the field of clinical diagnosis. However, gene signatures defined with different methods can vary considerably even when applied to the same disease and the same endpoint. Previous studies have shown that the correct selection of subsets of genes from microarray data is key for the accurate classification of disease phenotypes, and a number of methods have been proposed for the purpose. However, these methods refine the subsets by considering only each single feature, and they do not confirm the association between the genes identified in each gene signature and the phenotype of the disease. We proposed an innovative new method termed Minimize Feature's Size (MFS), based on multiple-level similarity analyses and on the association between the genes and the disease, for breast cancer endpoints, comparing classifier models generated in the second phase of the MicroArray Quality Control project (MAQC-II) and trying to develop effective meta-analysis strategies to transform the MAQC-II signatures into a robust and reliable set of biomarkers for clinical applications. Results We analyzed the similarity of the multiple gene signatures within an endpoint and between the two endpoints of breast cancer at the probe and gene levels; the results indicate that disease-related genes can be preferentially selected as the components of a gene signature, and that the gene signatures for the two endpoints could be interchangeable. The minimized signatures were built at the probe level by using MFS for each endpoint. By applying the approach, we generated a much smaller gene signature with predictive power similar to that of the gene signatures from MAQC-II. Conclusions Our results indicate that gene signatures of both large and small sizes could perform equally well in clinical applications.
Besides, consistency and biological significances can be detected among different gene signatures, reflecting the
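
A signature-similarity comparison of the kind described above can be sketched as a simple set overlap between two gene lists (the gene names below are hypothetical examples, not the MAQC-II signatures):

```python
# Sketch: Jaccard similarity between two gene signatures, a common way to
# quantify overlap between signatures derived by different methods.
def jaccard(sig_a, sig_b):
    a, b = set(sig_a), set(sig_b)
    return len(a & b) / len(a | b)

sig_1 = ["ESR1", "ERBB2", "MKI67", "AURKA", "BIRC5"]  # hypothetical
sig_2 = ["ESR1", "ERBB2", "MKI67", "PGR", "CCNB1"]    # hypothetical
print(round(jaccard(sig_1, sig_2), 2))  # 0.43 (3 shared genes of 7 total)
```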

  4. Monitoring and verification R&D

    Energy Technology Data Exchange (ETDEWEB)

    Pilat, Joseph F [Los Alamos National Laboratory; Budlong - Sylvester, Kory W [Los Alamos National Laboratory; Fearey, Bryan L [Los Alamos National Laboratory

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  5. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  6. Three plasma metabolite signatures for diagnosing high altitude pulmonary edema

    Science.gov (United States)

    Guo, Li; Tan, Guangguo; Liu, Ping; Li, Huijie; Tang, Lulu; Huang, Lan; Ren, Qian

    2015-10-01

    High-altitude pulmonary edema (HAPE) is a potentially fatal condition, occurring at altitudes greater than 3,000 m and affecting rapidly ascending, non-acclimatized healthy individuals. However, the lack of biomarkers for this disease still constitutes a bottleneck in clinical diagnosis. Here, ultra-high performance liquid chromatography coupled with Q-TOF mass spectrometry was applied to study plasma metabolite profiles from 57 HAPE and 57 control subjects. Fourteen differential plasma metabolites responsible for the discrimination between the two groups were identified in the discovery set (35 HAPE subjects and 35 healthy controls). Furthermore, 3 of the 14 metabolites (C8-ceramide, sphingosine and glutamine) were selected as candidate diagnostic biomarkers for HAPE using metabolic pathway impact analysis. The feasibility of using the combination of these three biomarkers for HAPE diagnosis was evaluated; the area under the receiver operating characteristic curve (AUC) was 0.981 in the discovery set and 0.942 in the validation set (22 HAPE subjects and 22 healthy controls). Taken together, these results suggest that this composite plasma metabolite signature may be used in HAPE diagnosis, especially after further investigation and verification with larger samples.
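
The AUC used above to evaluate the composite biomarker has a simple rank interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch with hypothetical composite scores (not the study's data):

```python
# Sketch: AUC via the Mann-Whitney formulation -- the fraction of
# (positive, negative) pairs where the positive outranks the negative.
def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for q in neg_scores:
            wins += 1.0 if p > q else (0.5 if p == q else 0.0)
    return wins / (len(pos_scores) * len(neg_scores))

hape = [0.92, 0.85, 0.78, 0.88, 0.60]      # hypothetical composite scores
controls = [0.30, 0.45, 0.62, 0.20, 0.55]  # hypothetical composite scores
print(auc(hape, controls))  # 0.96
```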

  7. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.
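
The data-reduction idea above, reducing an image to a single non-sensitive yes/no answer so that no image data need be stored, can be shown with a toy example. The attribute, thresholds, and image here are hypothetical and far simpler than the paper's algorithms:

```python
# Sketch: verify a non-sensitive attribute ("total attenuation lies in an
# agreed band") from an image, returning only a boolean -- never pixels,
# perimeter, area, or any other reconstructable geometry.
def verify_attribute(image, lo, hi):
    """image: 2D list of pixel intensities, generated behind the barrier."""
    total = sum(sum(row) for row in image)
    return lo <= total <= hi

radiograph = [[0, 3, 1],
              [2, 9, 4],
              [1, 2, 0]]  # hypothetical image
print(verify_attribute(radiograph, lo=15, hi=40))  # True
```

Note that even a scalar like this must be vetted for sensitivity; the paper's point is that the reduction, not the raw image, crosses the information barrier.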

  8. Dynamic and Energetic Signatures of Adenine Tracts in a rA-dT RNA-DNA Hybrid and in Homologous RNA-DNA, RNA-RNA, and DNA-DNA Double Helices.

    Science.gov (United States)

    Huang, Yuegao; Russu, Irina M

    2017-05-16

    Nuclear magnetic resonance spectroscopy and proton exchange are being used to characterize the opening reactions of individual base pairs in the RNA-DNA hybrid 5'-rGCGAUAAAAAGGCC-3'/5'-dGGCCTTTTTATCGC-3'. The hybrid contains a central tract of five rA-dT base pairs. The rates and the equilibrium constant of the opening reaction for each base pair are determined from the dependence of the exchange rates of imino protons on ammonia concentration, at 10 °C. The results are compared to those previously obtained by our laboratory for three homologous duplexes of the same base sequence (except for the appropriate T/U substitution), containing tracts of dA-rU, rA-rU, or dA-dT base pairs. The rA-dT tract is distinguished by an enhanced propensity of the base pairs to exist in the extrahelical state. The opening rates of rA-dT base pairs also exhibit a strong dependence on the location of the base pair in the structure; namely, as one advances into the tract, the opening rates of rA-dT base pairs gradually decrease. The local stability of each rA-dT base pair within the tract is the same as that of the corresponding rA-rU base pair in the homologous RNA-only duplex but differs from the stabilities of dA-dT and dA-rU base pairs in the other two duplexes (namely, dA-dT > rA-dT > dA-rU). These results demonstrate that, in nucleic acid double helices with the same base sequence, the opening dynamics and the energetics of individual base pairs are strongly influenced by the nature of the strand and by the structural context of the base pair.

  9. On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature

    Science.gov (United States)

    Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar

    Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. Recent developments in the biometrics area have led to smaller, faster and cheaper systems, which in turn has increased the number of possible application areas for biometric identity verification. Biometric data, being derived from human bodies (and especially when used to identify or verify those bodies), are considered personally identifiable information (PII). The collection, use and disclosure of biometric data, whether image or template, invokes rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance and profiling. Biometric data transmitted across networks and stored in various databases can also be stolen, copied, or otherwise misused in ways that materially affect the individual involved. Because biometric systems are vulnerable to replay, database and brute-force attacks, such potential attacks must be analysed before the systems are massively deployed in security applications. Along with security, user privacy is an important factor: the line structures in palmprints contain personal characteristics, a person can be recognised from face images, and fake signatures can be practised by carefully studying the signature images available in the database. We propose a cryptographic approach that encrypts the images of palmprints, faces, and signatures with an advanced Hill cipher technique to hide the information in the images. It also protects these images from the attacks mentioned above. So, during the feature extraction, the
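
A basic Hill cipher over pixel bytes illustrates the kind of image encryption the abstract proposes (the paper's "advanced Hill cipher" differs in detail). The 2x2 key matrix below is a hypothetical example; the only requirement mod 256 is an odd determinant, so that the matrix is invertible:

```python
# Sketch: 2x2 Hill cipher on consecutive pixel pairs mod 256.
KEY = [[3, 3], [2, 5]]            # det = 9 (odd), so invertible mod 256
KEY_INV = [[29, 85], [142, 171]]  # satisfies KEY_INV * KEY == I (mod 256)

def hill(pixels, m):
    """Apply matrix m to consecutive pixel pairs mod 256.
    The pixel list length must be even (pad if necessary)."""
    out = []
    for i in range(0, len(pixels), 2):
        x, y = pixels[i], pixels[i + 1]
        out += [(m[0][0] * x + m[0][1] * y) % 256,
                (m[1][0] * x + m[1][1] * y) % 256]
    return out

pixels = [12, 250, 7, 99]               # hypothetical palmprint bytes
cipher = hill(pixels, KEY)
assert hill(cipher, KEY_INV) == pixels  # decryption round-trip
print(cipher)  # [18, 250, 62, 253]
```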

  10. Genomic Signatures of Sexual Conflict.

    Science.gov (United States)

    Kasimatis, Katja R; Nelson, Thomas C; Phillips, Patrick C

    2017-10-30

    Sexual conflict is a specific class of intergenomic conflict that describes the reciprocal sex-specific fitness costs generated by antagonistic reproductive interactions. The potential for sexual conflict is an inherent property of having a shared genome between the sexes and, therefore, is an extreme form of an environment-dependent fitness effect. In this way, many of the predictions from environment-dependent selection can be used to formulate expected patterns of genome evolution under sexual conflict. However, the pleiotropic and transmission constraints inherent to having alleles move across sex-specific backgrounds from generation to generation further modulate the anticipated signatures of selection. We outline methods for detecting candidate sexual conflict loci both across and within populations. Additionally, we consider the ability of genome scans to identify sexually antagonistic loci by modeling allele frequency changes within males and females due to a single generation of selection. In particular, we highlight the need to integrate genotype, phenotype, and functional information to truly distinguish sexual conflict from other forms of sexual differentiation. © The American Genetic Association 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
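
The single-generation allele-frequency modeling mentioned above can be illustrated with a minimal deterministic sketch. The one-locus haploid form and the fitness values are hypothetical simplifications, not the paper's model:

```python
# Sketch: allele frequency change after one generation of sex-specific
# selection. Under sexual antagonism the same allele is favored in one sex
# and disfavored in the other, so the sexes' frequencies diverge.
def next_freq(p, w_allele, w_other):
    """Deterministic update: p' = p*w_A / (p*w_A + (1-p)*w_a)."""
    w_bar = p * w_allele + (1 - p) * w_other
    return p * w_allele / w_bar

p = 0.5
p_males = next_freq(p, w_allele=1.10, w_other=1.00)    # favored in males
p_females = next_freq(p, w_allele=0.95, w_other=1.00)  # disfavored in females
print(round(p_males, 4), round(p_females, 4))  # 0.5238 0.4872
```

A genome scan for sexual conflict looks for exactly this signature: within-generation allele-frequency divergence between males and females at a locus.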

  11. Signature geometry and quantum engineering

    Science.gov (United States)

    Samociuk, Stefan

    2013-09-01

    As the operating frequency of electromagnetic based devices increases, physical design geometry is playing an ever more important role. Evidence is considered in support of a relationship between the dimensionality of primitive geometric forms, such as transistors, and corresponding electromagnetic coupling efficiency. The industry of electronics is defined as the construction of devices by the patterning of primitive forms in physical materials. Examples are given to show that the evolution of these primitives, down to nano scales, requires exacting geometry and three-dimensional content. Consideration of microwave monolithic integrated circuits (MMIC), photonics and metamaterials (MM) supports this trend and also adds new requirements of strict geometric periodicity and multiplicity. Signature geometries (SG) are characterized by distinctive attributes and examples are given. The transcendent form transcode algorithm (TTA) is introduced as a multidimensional SG and its use in designing photonic integrated circuits and metamaterials is discussed. A creative commons licensed research database, TRANSFORM, containing TTA geometries in OASIS file formats is described. An experimental methodology for using the database is given. Multidimensional SG and the extraction of three-dimensional cross sections as primitive forms are discussed as a foundation for quantum engineering and the exploitation of phenomena other than the electromagnetic.

  12. DIGITAL SIGNATURE IN THE WAY OF LAW

    OpenAIRE

    Ruya Samlı

    2013-01-01

    A signature can be defined as a person’s name or the special signs a person writes to indicate that he or she wrote, or confirms, a piece of writing. A person signs many times in his or her life. A signature, used thousands of times on everything from formal documents to exams, is important to its owner. Signing in legal operations, in particular, can have important consequences. If a person’s signature is imitated by another person, he/she can be...

  13. Arbitrated quantum signature scheme using Bell states

    International Nuclear Information System (INIS)

    Li Qin; Chan, W. H.; Long Dongyang

    2009-01-01

    In an arbitrated quantum signature scheme, the signatory signs the message and the receiver verifies the signature's validity with the assistance of the arbitrator. We present an arbitrated quantum signature scheme using two-particle entangled Bell states similar to the previous scheme using three-particle entangled Greenberger-Horne-Zeilinger states [G. H. Zeng and C. H. Keitel, Phys. Rev. A 65, 042312 (2002)]. The proposed scheme can preserve the merits in the original scheme while providing a higher efficiency in transmission and reducing the complexity of implementation.
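
For reference, the four two-particle entangled Bell states on which such a scheme is built are, in standard notation:

```latex
\begin{aligned}
|\Phi^{\pm}\rangle &= \tfrac{1}{\sqrt{2}}\left(|00\rangle \pm |11\rangle\right),\\
|\Psi^{\pm}\rangle &= \tfrac{1}{\sqrt{2}}\left(|01\rangle \pm |10\rangle\right).
\end{aligned}
\]
Using two-particle Bell states rather than three-particle GHZ states is what yields the scheme's gain in transmission efficiency and implementation simplicity.
```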

  14. Reduction of a Ship's Magnetic Field Signatures

    CERN Document Server

    Holmes, John

    2008-01-01

    Decreasing the magnetic field signature of a naval vessel will reduce its susceptibility to detonating naval influence mines and the probability of a submarine being detected by underwater barriers and maritime patrol aircraft. Both passive and active techniques for reducing the magnetic signatures produced by a vessel's ferromagnetism, roll-induced eddy currents, corrosion-related sources, and stray fields are presented. Mathematical models of simple hull shapes are used to predict the levels of signature reduction that might be achieved through the use of alternate construction materials. Al

  15. Molecular signatures of thyroid follicular neoplasia

    DEFF Research Database (Denmark)

    Borup, R.; Rossing, M.; Henao, Ricardo

    2010-01-01

    The molecular pathways leading to thyroid follicular neoplasia are incompletely understood, and the diagnosis of follicular tumors is a clinical challenge. To provide leads to the pathogenesis and diagnosis of the tumors, we examined the global transcriptome signatures of follicular thyroid...... a mechanism for cancer progression, which is why we exploited the results in order to generate a molecular classifier that could identify 95% of all carcinomas. Validation employing public domain and cross-platform data demonstrated that the signature was robust and could diagnose follicular nodules...... and robust genetic signature for the diagnosis of FA and FC. Endocrine-Related Cancer (2010) 17 691-708...

  16. Criticality meets learning: Criticality signatures in a self-organizing recurrent neural network.

    Science.gov (United States)

    Del Papa, Bruno; Priesemann, Viola; Triesch, Jochen

    2017-01-01

    Many experiments have suggested that the brain operates close to a critical state, based on signatures of criticality such as power-law distributed neuronal avalanches. In neural network models, criticality is a dynamical state that maximizes information processing capacities, e.g. sensitivity to input, dynamical range and storage capacity, which makes it a favorable candidate state for brain function. Although models that self-organize towards a critical state have been proposed, the relation between criticality signatures and learning is still unclear. Here, we investigate signatures of criticality in a self-organizing recurrent neural network (SORN). Investigating criticality in the SORN is of particular interest because it has not been developed to show criticality. Instead, the SORN has been shown to exhibit spatio-temporal pattern learning through a combination of neural plasticity mechanisms and it reproduces a number of biological findings on neural variability and the statistics and fluctuations of synaptic efficacies. We show that, after a transient, the SORN spontaneously self-organizes into a dynamical state that shows criticality signatures comparable to those found in experiments. The plasticity mechanisms are necessary to attain that dynamical state, but not to maintain it. Furthermore, onset of external input transiently changes the slope of the avalanche distributions - matching recent experimental findings. Interestingly, the membrane noise level necessary for the occurrence of the criticality signatures reduces the model's performance in simple learning tasks. Overall, our work shows that the biologically inspired plasticity and homeostasis mechanisms responsible for the SORN's spatio-temporal learning abilities can give rise to criticality signatures in its activity when driven by random input, but these break down under the structured input of short repeating sequences.
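
Power-law distributed avalanche sizes, the criticality signature discussed above, are typically checked by estimating the power-law exponent. A minimal sketch using the continuous maximum-likelihood approximation for discrete data; the avalanche sizes are hypothetical, and this is not the SORN analysis itself:

```python
# Sketch: MLE estimate of a power-law exponent for avalanche sizes,
# alpha ~= 1 + n / sum(ln(s_i / (s_min - 0.5))), the standard
# Clauset-style approximation for discrete data.
import math

def powerlaw_alpha(sizes, s_min=1):
    tail = [s for s in sizes if s >= s_min]
    return 1 + len(tail) / sum(math.log(s / (s_min - 0.5)) for s in tail)

sizes = [1, 1, 1, 2, 1, 3, 1, 2, 5, 1, 2, 9, 1, 1, 4]  # hypothetical
print(round(powerlaw_alpha(sizes), 2))
```

In avalanche analyses the estimated exponent is then compared against the critical values reported experimentally (roughly -3/2 for sizes), together with goodness-of-fit tests.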

  17. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    Directory of Open Access Journals (Sweden)

    M. A. Lukin

    2014-01-01

    Full Text Available The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulae to the Spin format, and counterexamples expressed in terms of automata. Interactive verification makes it possible to decrease verification time and increase the maximum size of verifiable programs. The considered method supports verification of parallel systems of hierarchical automata that interact with each other through messages and shared variables. A feature of the automaton model is that each state machine is considered a new data type and can have an arbitrary bounded number of instances. Each state machine in the system can run a different state machine in a new thread or have a nested state machine. The method was implemented in the developed Stater tool. Stater shows correct operation for all test cases.
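
The state-space exploration underlying Spin-style verification can be sketched, in highly simplified form, as a breadth-first search over the combined states of interacting machines, checking a safety property in every reachable state. The two machines and the mutual-exclusion property below are hypothetical, not the paper's models:

```python
# Sketch: explicit-state safety checking over two state machines that
# interact through a shared lock variable. State = (machine_a, machine_b, lock).
from collections import deque

def successors(state):
    a, b, lock = state
    nxt = []
    for who, st in (("a", a), ("b", b)):
        if st == "idle" and not lock:        # acquire the lock
            nxt.append(("busy" if who == "a" else a,
                        "busy" if who == "b" else b, True))
        elif st == "busy":                   # release the lock
            nxt.append(("idle" if who == "a" else a,
                        "idle" if who == "b" else b, False))
    return nxt

def check_safety(initial, safe):
    """BFS over reachable states; return (ok, counterexample_state)."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if not safe(s):
            return False, s
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None

# Mutual exclusion: the two machines are never simultaneously busy.
ok, cex = check_safety(("idle", "idle", False),
                       lambda s: not (s[0] == "busy" and s[1] == "busy"))
print(ok)  # True
```

Spin adds to this core the Promela front-end, partial-order reductions, and LTL (not just safety) checking via never-claim automata.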

  18. Optical signatures of non-Markovian behavior in open quantum systems

    DEFF Research Database (Denmark)

    McCutcheon, Dara

    2016-01-01

    of the dynamics has observable signatures in the form of phonon sidebands in the resonance fluorescence emission spectrum. Furthermore, we use recently developed non-Markovianity measures to demonstrate an associated flow of information from the phonon bath back into the quantum dot exciton system....

  19. The topographic signature of anthropogenic geomorphic processes

    Science.gov (United States)

    Tarolli, P.; Sofia, G.

    2014-12-01

    Within an abiotic-dominated context, geomorphologic patterns and dynamics are expressions of trade-offs between physical resistance forces and the mechanical and chemical forces related to climate and erosion. Recently, however, it has become essential for the geomorphological community to take biota into account as well, as a fundamental geomorphologic agent acting from local to regional scales. While there is a recent flourishing literature on the impacts of vegetation on geomorphic processes, the study of anthropogenic pressure on geomorphology is still in its early stages. Humans are indeed among the most prominent geomorphic agents, redistributing the land surface and causing drastic changes to the geomorphic organization of the landscape (e.g. intensive agriculture, urbanization), with direct consequences for land degradation and watershed response. The reconstruction or identification of artificial or anthropogenic topographies therefore provides a mechanism for quantifying anthropogenic changes to landscape systems in the context of the Anthropocene epoch. High-resolution topographic data derived from recent remote sensing technologies (e.g. lidar, SAR, SfM) now offer new opportunities to better recognize and understand geomorphic processes from their topographic signatures, especially in engineered landscapes where direct anthropic alteration of processes is significant. Indeed, it becomes possible to better recognize human-induced geomorphic and anthropogenic features (e.g. road networks, agricultural terraces) and the connected erosion. The study presented here may allow improved understanding and targeted mitigation of the processes driving geomorphic changes during urban development and help guide future research directions for development-based watershed studies. Human society is deeply affecting the environment, with consequences on the landscape. It is therefore fundamental to establish greater management control over the Earth

  20. Simulation and Experimental Validation of Electromagnetic Signatures for Monitoring of Nuclear Material Storage Containers

    Energy Technology Data Exchange (ETDEWEB)

    Aker, Pamela M.; Bunch, Kyle J.; Jones, Anthony M.

    2013-01-01

    Previous research at the Pacific Northwest National Laboratory (PNNL) has demonstrated that the low frequency electromagnetic (EM) response of a sealed metallic container interrogated with an encircling coil is a strong function of its contents and can be used to form a distinct signature which can confirm the presence of specific components without revealing hidden geometry or classified design information. Finite element simulations have recently been performed to further investigate this response for a variety of configurations composed of an encircling coil and a typical nuclear material storage container. Excellent agreement was obtained between simulated and measured impedance signatures of electrically conducting spheres placed inside an AT-400R nuclear container. Simulations were used to determine the effects of excitation frequency and the geometry of the encircling coil, nuclear container, and internal contents. The results show that it is possible to use electromagnetic models to evaluate the application of the EM signature technique to proposed versions of nuclear weapons containers which can accommodate restrictions imposed by international arms control and treaty verification legislation.

  1. Quantum Fully Homomorphic Encryption with Verification

    DEFF Research Database (Denmark)

    Alagic, Gorjan; Dulek, Yfke; Schaffner, Christian

    2017-01-01

    Fully-homomorphic encryption (FHE) enables computation on encrypted data while maintaining secrecy. Recent research has shown that such schemes exist even for quantum computation. Given the numerous applications of classical FHE (zero-knowledge proofs, secure two-party computation, obfuscation, etc.) it is reasonable to hope that quantum FHE (or QFHE) will lead to many new results in the quantum setting. However, a crucial ingredient in almost all applications of FHE is circuit verification. Classically, verification is performed by checking a transcript of the homomorphic computation. Quantumly, this strategy is impossible due to no-cloning. This leads to an important open question: can quantum computations be delegated and verified in a non-interactive manner? In this work, we answer this question in the affirmative, by constructing a scheme for QFHE with verification (vQFHE). Our scheme provides authenticated...

  2. Constraint Specialisation in Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    We present a method for specialising the constraints in constrained Horn clauses with respect to a goal. We use abstract interpretation to compute a model of a query-answer transformation of a given set of clauses and a goal. The effect is to propagate the constraints from the goal top-down and propagate answer constraints bottom-up. Our approach does not unfold the clauses at all; we use the constraints from the model to compute a specialised version of each clause in the program. The approach is independent of the abstract domain and the constraints theory underlying the clauses. Experimental results on verification problems show that this is an effective transformation, both in our own verification tools (convex polyhedra analyser) and as a pre-processor to other Horn clause verification tools.

  3. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  4. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  5. Blind Quantum Signature with Blind Quantum Computation

    Science.gov (United States)

    Li, Wei; Shi, Ronghua; Guo, Ying

    2017-04-01

    Blind quantum computation allows a client without quantum abilities to interact with a quantum server to perform an unconditionally secure computing protocol, while protecting the client's privacy. Motivated by the confidentiality of blind quantum computation, a blind quantum signature scheme is designed with a laconic structure. Different from traditional signature schemes, the signing and verifying operations are performed through measurement-based quantum computation. Inputs of blind quantum computation are securely controlled with multi-qubit entangled states. The unique signature of the transmitted message is generated by the signer without leaking information in imperfect channels. The receiver can verify the validity of the signature using the quantum matching algorithm. The security is guaranteed by the entanglement of the quantum system for blind quantum computation. It provides a potential practical application for e-commerce in cloud computing and first-generation quantum computation.

  6. Breakthrough Listen: Searching for Signatures of Technology

    Science.gov (United States)

    Isaacson, H. T.; Siemion, A. P. V.

    2017-11-01

    Breakthrough Listen is searching for signals of extra-terrestrial technologies using radio and optical telescopes. Very nearby stars of all types, stars across the HR diagram, and galaxies are all of interest in the search for techno-signatures.

  7. Magnetic signature surveillance of nuclear fuel

    International Nuclear Information System (INIS)

    Bernatowicz, H.; Schoenig, F.C.

    1981-01-01

    Typical nuclear fuel material contains tramp ferromagnetic particles of random size and distribution. Also, selected amounts of paramagnetic or ferromagnetic material can be added at random or at known positions in the fuel material. The fuel material in its non-magnetic container is scanned along its length by magnetic susceptibility detecting apparatus, whereby susceptibility changes along its length are obtained and provide a unique signal waveform of the container of fuel material as a signature thereof. The output signature is stored. At subsequent times in its life the container is again scanned and the respective signatures obtained are compared with the initially obtained signature, any differences indicating alteration of or tampering with the fuel material. If the fuel material includes a paramagnetic additive, its effects can be cancelled out by taking two measurements along the container. (author)
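    The comparison step described in this record (a re-scan checked against the stored signature) can be sketched as a simple waveform correlation. This is an illustrative reconstruction, not the patented apparatus's actual algorithm; the function names and the 0.98 acceptance threshold are assumptions:

    ```python
    import math

    def normalized(signal):
        """Mean-center a susceptibility waveform and scale it to unit energy."""
        mean = sum(signal) / len(signal)
        centered = [s - mean for s in signal]
        norm = math.sqrt(sum(c * c for c in centered)) or 1.0  # guard flat signals
        return [c / norm for c in centered]

    def signature_match(reference, rescan, threshold=0.98):
        """Correlate the stored susceptibility signature with a later re-scan.

        Returns (correlation, accepted); a low correlation flags possible
        alteration of or tampering with the fuel material.
        """
        r = normalized(reference)
        s = normalized(rescan)
        corr = sum(a * b for a, b in zip(r, s))
        return corr, corr >= threshold
    ```

    In the paramagnetic-additive case described at the end of the record, the two measurements taken along the container would be differenced first to cancel the additive's contribution before correlating.
    
    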

  8. Magnetic Signature of Brushless Electric Motors

    National Research Council Canada - National Science Library

    Clarke, David

    2006-01-01

    Brushless electric motors are used in a number of underwater vehicles. When these underwater vehicles are used for mine clearance operations the magnetic signature of the brushless motors is important...

  9. Isotopic signatures by bulk analyses

    International Nuclear Information System (INIS)

    Efurd, D.W.; Rokop, D.J.

    1997-01-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications of isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analysis of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that, for the areas sampled, the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Plant (RFP). The largest source of anthropogenic radioactivity presently affecting surface waters at RFP is the sediments currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP, or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally

  10. Signature for the shape of the universe

    International Nuclear Information System (INIS)

    Gomero, G.I.; Reboucas, M.J.; Teixeira, A.F.F.

    2001-03-01

    If the universe has a nontrivial shape (topology) the sky may show multiple correlated images of cosmic objects. These correlations can be couched in terms of distance correlations. We propose a statistical quantity which can be used to reveal the topological signature of any Robertson-Walker (RW) spacetime with nontrivial topology. We also show through computer-aided simulations how one can extract the topological signatures of flat, elliptic and hyperbolic RW universes with nontrivial topology. (author)

  11. Tightly Secure Signatures From Lossy Identification Schemes

    OpenAIRE

    Abdalla , Michel; Fouque , Pierre-Alain; Lyubashevsky , Vadim; Tibouchi , Mehdi

    2015-01-01

    International audience; In this paper, we present three digital signature schemes with tight security reductions in the random oracle model. Our first signature scheme is a particularly efficient version of the short exponent discrete log-based scheme of Girault et al. (J Cryptol 19(4):463–487, 2006). Our scheme has a tight reduction to the decisional short discrete logarithm problem, while still maintaining the non-tight reduction to the computational version of the problem upon which the or...

  12. Testing the local spacetime dynamics by heliospheric radiocommunication methods

    Directory of Open Access Journals (Sweden)

    H.-J. Fahr

    2008-05-01

    Full Text Available According to general relativistic theories, cosmological spacetime is dynamic. This prediction is in excellent agreement with the huge majority of astronomical observations on large cosmic scales, especially observations of the cosmological redshifts of distant galaxies. However, on scales of heliospheric distances, verifications of general relativistic effects are based on Schwarzschild metric tests or kinetic corrections, such as the perihelion motion of Mercury, photon deflection at the Sun, and gravitational photon redshifts in central gravity fields. As we will show in this paper, there is, however, a chance to detect new cosmologically relevant features on heliospheric scales by careful study of photon propagation in the local spacetime metric, based on red- or blueshifts as a clear, but up to now overlooked, signature of the local spacetime dynamics. Thus, we propose the challenging possibility of carrying out experiments of cosmological relevance by simply using high-precision radio tracking of heliospheric space probes, as already practised in cases like Pioneer-10/11, Galileo and Ulysses.

  13. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  14. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  15. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  16. Challenges in High-Assurance Runtime Verification

    Science.gov (United States)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.

  17. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability, and time- and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...

  18. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10{sup 6} states in one verification round, limiting the forging probability to 10{sup -7} based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  20. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  1. Effect of verification cores on tip capacity of drilled shafts.

    Science.gov (United States)

    2009-02-01

    This research addressed two key issues: 1) Will verification core holes fill during concrete backfilling? If so, what are the mechanical properties of the filling material? In dry conditions, verification core holes always completely fill with c...

  2. Compositional Verification of Multi-Station Interlocking Systems

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Fantechi, Alessandro; Haxthausen, Anne Elisabeth

    2016-01-01

    Because interlocking systems are highly safety-critical complex systems, their automated safety verification is an active research topic investigated by several groups, employing verification techniques to produce important cost and time savings in their certification. However, such systems also...

  3. Electronic Signatures: They're Legal, Now What?

    Science.gov (United States)

    Broderick, Martha A.; Gibson, Virginia R.; Tarasewich, Peter

    2001-01-01

    In the United States, electronic signatures recently became as legally binding as printed signatures. Reviews the status of electronic signatures in the United States, and compares it to work done by the United Nations. Summarizes the technology that can be used to implement electronic signatures. Discusses problems and open issues surrounding the…

  4. A Black Hole Spectral Signature

    Science.gov (United States)

    Titarchuk, Lev; Laurent, Philippe

    2000-03-01

    An accreting black hole is, by definition, characterized by the drain. Namely, the matter falls into a black hole much the same way as water disappears down a drain: matter goes in and nothing comes out. As this can only happen in a black hole, it provides a way to see ``a black hole'', a unique observational signature. The accretion proceeds almost in a free-fall manner close to the black hole horizon, where the strong gravitational field dominates the pressure forces. In this paper we present analytical calculations and Monte-Carlo simulations of the specific features of X-ray spectra formed as a result of upscattering of the soft (disk) photons in the converging inflow (CI) into the black hole. The full relativistic treatment has been implemented to reproduce these spectra. We show that spectra in the soft state of black hole systems (BHS) can be described as the sum of a thermal (disk) component and the convolution of some fraction of this component with the CI upscattering spread (Green's) function. The latter boosted photon component is seen as an extended power law at energies much higher than the characteristic energy of the soft photons. We demonstrate the stability of the power spectral index over a wide range of plasma temperatures 0-10 keV and mass accretion rates (higher than 2 in Eddington units). We also demonstrate that the sharp high-energy cutoff occurs at energies of 200-400 keV, which are related to the average energy of electrons, m{sub e}c{sup 2}, impinging upon the event horizon. The spectrum is practically identical to the standard thermal Comptonization spectrum when the CI plasma temperature is of the order of 50 keV (typical of the hard state of BHS). In this case one can see the effect of the bulk motion only at high energies, where there is an excess in the CI spectrum with respect to the pure thermal one. 
Furthermore we demonstrate that the change of spectral shapes from the soft X-ray state to the hard X-ray state is clearly to be

  5. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  6. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism...

  7. UTEX modeling of xenon signature sensitivity to geology and explosion cavity characteristics following an underground nuclear explosion

    Science.gov (United States)

    Lowrey, J. D.; Haas, D.

    2013-12-01

    Underground nuclear explosions (UNEs) produce anthropogenic isotopes that can potentially be used in the verification component of the Comprehensive Nuclear-Test-Ban Treaty. Several isotopes of radioactive xenon gas have been identified as radionuclides of interest within the International Monitoring System (IMS) and in an On-Site Inspection (OSI). Substantial research has previously been undertaken to characterize the geologic and atmospheric mechanisms that can drive the movement of radionuclide gas from a well-contained UNE, considering both sensitivities of gas arrival time and signature variability of xenon due to the nature of subsurface transport. This work further considers the sensitivity of radioxenon gas arrival times and signatures to large variability in geologic stratification and generalized explosion cavity characteristics, and compares this influence to variability in the shallow surface.

  8. H–J–B Equations of Optimal Consumption-Investment and Verification Theorems

    International Nuclear Information System (INIS)

    Nagai, Hideo

    2015-01-01

    We consider a consumption-investment problem on an infinite time horizon maximizing discounted expected HARA utility for a general incomplete market model. Based on a dynamic programming approach we derive the relevant H–J–B equation and study the existence and uniqueness of the solution to the nonlinear partial differential equation. By using the smooth solution we construct the optimal consumption rate and portfolio strategy and then prove the verification theorems under certain general settings
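    The record does not reproduce the equation itself. For the classical Merton-type consumption-investment setting (discount rate β, riskless rate r, risky-asset drift μ and volatility σ, CRRA member of the HARA family as utility), the H–J–B equation takes the textbook form below; this is a sketch of the standard complete-market case, not necessarily the authors' general incomplete-market model:

    ```latex
    % V(x): value function of wealth x; controls: consumption rate c >= 0,
    % fraction \pi of wealth in the risky asset.
    \beta V(x) = \sup_{c \ge 0,\ \pi \in \mathbb{R}}
      \Big\{ U(c)
           + \big[\big(r + \pi(\mu - r)\big)x - c\big]\,V'(x)
           + \tfrac{1}{2}\,\sigma^{2}\pi^{2}x^{2}\,V''(x) \Big\},
    \qquad U(c) = \frac{c^{1-\gamma}}{1-\gamma},\ \ \gamma > 0,\ \gamma \ne 1.
    ```

    A verification theorem in this setting shows that a sufficiently smooth solution V, together with the maximizing pair (c*, π*) from the supremum, is indeed the value function and optimal strategy of the original stochastic control problem.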

  9. Mapping {sup 15}O Production Rate for Proton Therapy Verification

    Energy Technology Data Exchange (ETDEWEB)

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States); Min, Chul Hee [Department of Radiological Science, College of Health Science, Yonsei University, Wonju, Kangwon (Korea, Republic of); Testa, Mauro; Winey, Brian [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts (United States); Normandin, Marc D. [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States); Shih, Helen A.; Paganetti, Harald; Bortfeld, Thomas [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts (United States); El Fakhri, Georges, E-mail: elfakhri@pet.mgh.harvard.edu [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States)

    2015-06-01

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 ({sup 15}O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of {sup 15}O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of {sup 15}O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using {sup 15}O decay constant, whereas the live thigh activity decayed faster. Most importantly, the {sup 15}O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of {sup 15}O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of {sup 15}O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, {sup 15}O clearance rates may be useful in monitoring permeability changes due to therapy.

  10. Direct modeling parameter signature analysis and failure mode prediction of physical systems using hybrid computer optimization

    Science.gov (United States)

    Drake, R. L.; Duvoisin, P. F.; Asthana, A.; Mather, T. W.

    1971-01-01

    High speed automated identification and design of dynamic systems, both linear and nonlinear, are discussed. Special emphasis is placed on developing hardware and techniques which are applicable to practical problems. The basic modeling experiment and new results are described. Using the improvements developed, successful identification of several systems, including a physical example as well as simulated systems, was obtained. The advantages of parameter signature analysis over signal signature analysis in go/no-go testing of operational systems were demonstrated. The feasibility of using these ideas for failure mode prediction in operating systems was also investigated. An improved digitally controlled nonlinear function generator was developed, debugged, and completely documented.

  11. Supervised Multi-Authority Scheme with Blind Signature for IoT with Attribute Based Encryption

    Science.gov (United States)

    Nissenbaum, O. V.; Ponomarov, K. Y.; Zaharov, A. A.

    2018-04-01

    This article proposes a three-party cryptographic scheme for verifying device attributes with a Supervisor and a Certification Authority (CA) for attribute-based encryption. Two options are suggested: using a message authentication code and using a digital signature. The first version is suitable for networks with one CA, and the second for networks with several CAs, including dynamic systems. In addition, extending this scheme with a blind signature is proposed to preserve the confidentiality of the device attributes from the CA. The introduction gives a definition and a brief historical overview of attribute-based encryption (ABE) and addresses the use of ABE in the Internet of Things.

  12. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Science.gov (United States)

    Javidi, Bahram

    1997-01-01

    Document fraud including unauthorized duplication of identification cards and credit cards is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially-available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card, photographed or captured by a CCD camera, and a new hologram synthesized using commercially-available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. 
The proposed optical processing device is designed to identify both the random phase mask and the

  13. Does Twitter trigger bursts in signature collections?

    Science.gov (United States)

    Yamaguchi, Rui; Imoto, Seiya; Kami, Masahiro; Watanabe, Kenji; Miyano, Satoru; Yuji, Koichiro

    2013-01-01

    The quantification of social media impacts on societal and political events is a difficult undertaking. The Japanese Society of Oriental Medicine started a signature-collecting campaign to oppose a medical policy of the Government Revitalization Unit to exclude a traditional Japanese medicine, "Kampo," from the public insurance system. The signature count showed a series of aberrant bursts from November 26 to 29, 2009. In the same interval, the number of messages on Twitter including the keywords "Signature" and "Kampo" increased abruptly. Moreover, the number of messages on an Internet forum that discussed the policy and called for signatures showed a train of spikes. In order to estimate the contributions of social media, we developed a statistical model within a state-space modeling framework that distinguishes the contributions of multiple social media in time series of collected public opinions. We applied the model to the time series of signature counts of the campaign and quantified the contributions of two social media, i.e., Twitter and an Internet forum. We found that a considerable portion (78%) of the signatures was affected by one of the social media throughout the campaign and that the Twitter effect (26%) was smaller than the Forum effect (52%) in total, although Twitter probably triggered the initial two bursts of signatures. Comparisons of the estimated profiles of the two effects suggested distinctions between the social media in terms of the sustainable impact of messages or tweets. Twitter shows messages on various topics on a timeline; newer messages push out older ones. Twitter may thus diminish the impact of messages that are tweeted intermittently. The quantification of social media impacts is beneficial for better understanding people's tendencies and may promote the development of strategies to engage public opinion effectively. Our proposed method is a promising tool to explore information hidden in social phenomena.
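
    As a much-simplified stand-in for the authors' state-space model, the sketch below decomposes a signature-count series into contributions from two media by ordinary least squares; the data and coefficients are synthetic, not from the study.

```python
# Toy decomposition: signatures ≈ a*twitter + b*forum (no intercept),
# solved via the 2x2 normal equations. The real model is a state-space
# model estimated over time; this only illustrates attributing a count
# series to two observed media activities.

def fit_two_media(y, x1, x2):
    """Ordinary least squares for y ≈ a*x1 + b*x2."""
    s11 = sum(v * v for v in x1)
    s22 = sum(v * v for v in x2)
    s12 = sum(u * v for u, v in zip(x1, x2))
    sy1 = sum(u * v for u, v in zip(x1, y))
    sy2 = sum(u * v for u, v in zip(x2, y))
    det = s11 * s22 - s12 * s12            # nonzero unless regressors collinear
    a = (sy1 * s22 - sy2 * s12) / det
    b = (sy2 * s11 - sy1 * s12) / det
    return a, b

# Synthetic series where the forum effect is twice the Twitter effect,
# echoing the 26% vs 52% split reported above.
twitter = [5, 20, 3, 1, 0, 8]
forum = [2, 4, 30, 25, 6, 1]
signatures = [1 * t + 2 * f for t, f in zip(twitter, forum)]
a, b = fit_two_media(signatures, twitter, forum)
```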

  14. Does Twitter trigger bursts in signature collections?

    Directory of Open Access Journals (Sweden)

    Rui Yamaguchi

    Full Text Available INTRODUCTION: The quantification of social media impacts on societal and political events is a difficult undertaking. The Japanese Society of Oriental Medicine started a signature-collecting campaign to oppose a medical policy of the Government Revitalization Unit to exclude a traditional Japanese medicine, "Kampo," from the public insurance system. The signature count showed a series of aberrant bursts from November 26 to 29, 2009. In the same interval, the number of messages on Twitter including the keywords "Signature" and "Kampo" increased abruptly. Moreover, the number of messages on an Internet forum that discussed the policy and called for signatures showed a train of spikes. METHODS AND FINDINGS: In order to estimate the contributions of social media, we developed a statistical model within a state-space modeling framework that distinguishes the contributions of multiple social media in time series of collected public opinions. We applied the model to the time series of signature counts of the campaign and quantified the contributions of two social media, i.e., Twitter and an Internet forum. We found that a considerable portion (78%) of the signatures was affected by one of the social media throughout the campaign and that the Twitter effect (26%) was smaller than the Forum effect (52%) in total, although Twitter probably triggered the initial two bursts of signatures. Comparisons of the estimated profiles of the two effects suggested distinctions between the social media in terms of the sustainable impact of messages or tweets. Twitter shows messages on various topics on a timeline; newer messages push out older ones. Twitter may thus diminish the impact of messages that are tweeted intermittently. CONCLUSIONS: The quantification of social media impacts is beneficial for better understanding people's tendencies and may promote the development of strategies to engage public opinion effectively. Our proposed method is a promising tool to explore

  15. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.
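
    The role such definitions play can be illustrated with a toy verification exercise: run a solver at two resolutions, compare against the exact solution (which ExactPack would supply for the standardized problems), and check the observed order of convergence. The ODE and solver below are illustrative stand-ins, not from the document.

```python
import math

# Toy code-verification run: forward Euler on y' = -y, y(0) = 1,
# whose exact solution is y(t) = exp(-t). Halving the step size should
# roughly halve the error for this first-order method.

def exact(t):
    return math.exp(-t)

def forward_euler(n, t_end=1.0):
    h, y = t_end / n, 1.0
    for _ in range(n):
        y += h * (-y)
    return y

errors = {n: abs(forward_euler(n) - exact(1.0)) for n in (100, 200)}
order = math.log2(errors[100] / errors[200])   # observed convergence order, ~1
```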

  16. 14 CFR 460.17 - Verification program.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... program. An operator must successfully verify the integrated performance of a vehicle's hardware and any...

  17. Verification Techniques for Graph Rewriting (Tutorial)

    NARCIS (Netherlands)

    Rensink, Arend; Abdulla, Parosh Aziz; Gadducci, Fabio; König, Barbara; Vafeiadis, Viktor

    This tutorial paints a high-level picture of the concepts involved in verification of graph transformation systems. We distinguish three fundamentally different application scenarios for graph rewriting: (1) as grammars (in which case we are interested in the language, or set, of terminal graphs for

  18. Summary 2: Graph Grammar Verification through Abstraction

    NARCIS (Netherlands)

    Baldan, P.; Koenig, B.; Rensink, A.; Rensink, Arend; König, B.; Montanari, U.; Gardner, P.

    2005-01-01

    Until now there have been few contributions concerning the verification of graph grammars, specifically of infinite-state graph grammars. This paper compares two existing approaches, based on abstractions of graph transformation systems. While in the unfolding approach graph grammars are

  19. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    There have been very few mesoscale modelling studies of the Indian monsoon, with focus on the verification and intercomparison of the operational real time forecasts. With the exception of Das et al (2008), most of the studies in the literature are either the case studies of tropical cyclones and thunderstorms or the sensitivity ...

  20. Learner Verification: A Publisher's Case Study.

    Science.gov (United States)

    Wilson, George

    Learner verification, a process by which publishers monitor the effectiveness of their products and strive to improve their services to schools, is a practice that most companies take seriously. The quality of educational materials may be ensured in many ways: by analysis of sales, through firsthand investigation, and by employing a system of…

  1. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 14; Issue 7. Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article Volume 14 Issue 7 July 2009 pp 667-681. Fulltext. Click here to view fulltext PDF. Permanent link:

  2. 18 CFR 286.107 - Verification.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... must be sworn to by persons having knowledge thereof, which latter fact must affirmatively appear in...

  3. 18 CFR 349.5 - Verification.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... having knowledge thereof, which latter fact must affirmatively appear in the affidavit. Except under...

  4. Mechanical verification of Lamport's Bakery algorithm

    NARCIS (Netherlands)

    Hesselink, Wim H.

    2013-01-01

    Proof assistants like PVS can be used fruitfully for the design and verification of concurrent algorithms. The technique is presented here by applying it to Lamport's Bakery algorithm. The proofs for safety properties such as mutual exclusion, first-come first-served, and absence of deadlock are

  5. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ..., such as the California Climate Action Registry Certification Protocol, the Climate Leaders Inventory... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... other members of the verification team are accredited by one or more independent and nationally...

  6. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and carried out verification tests of a wet blasting technology for decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a result of the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking blocks, and dense-graded asphalt pavement when applied to road decontamination. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off from the decontamination object, and the sludge could be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  7. Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium

    Science.gov (United States)

    Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.

    2017-01-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on the sheet of paper, which encloses the cylinder completely or partially filled with water from below, where the hydrostatic pressure of the water column acts against the atmospheric pressure. First of all this paper solves a theoretical…

  8. Runtime Verification for Decentralised and Distributed Systems

    NARCIS (Netherlands)

    Francalanza, Adrian; Pérez, Jorge A.; Sánchez, César; Bartocci, Ezio; Falcone, Yliès

    This chapter surveys runtime verification research related to distributed systems. We report solutions that study how to monitor systems with some distributed characteristic, solutions that use a distributed platform for performing a monitoring task, and foundational works that present semantics for

  9. A Comparison of Modular Verification Techniques

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Staunstrup, Jørgen; Maretti, Niels

    1997-01-01

    This paper presents and compares three techniques for mechanized verification of state-oriented design descriptions. One is a traditional forward generation of a fixed point characterizing the reachable states. The two others can utilize a modular structure provided by the designer. One requires a ...

  10. A Typical Verification Challenge for the GRID

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Bal, H. E.; Brim, L.; Leucker, M.

    2008-01-01

    A typical verification challenge for the GRID community is presented. The concrete challenge is to implement a simple recursive algorithm for finding the strongly connected components in a graph. The graph is typically stored in the collective memory of a number of computers, so a distributed
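
    The sequential baseline of the stated challenge can be sketched directly; a minimal Tarjan's algorithm for strongly connected components is shown below. The distributed, GRID-hosted version operating on a graph spread across many machines is the actual challenge and is not attempted here.

```python
# Tarjan's algorithm: one DFS pass, linear in nodes plus edges.
# graph: dict mapping node -> list of successor nodes.

def tarjan_scc(graph):
    index, low, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:            # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

g = {1: [2], 2: [3], 3: [1], 4: [3, 5], 5: [4], 6: [5]}
comps = tarjan_scc(g)   # components {1,2,3}, {4,5}, {6}
```

    The difficulty in the GRID setting is precisely that this depth-first recursion assumes cheap random access to the whole graph, which distributed storage does not provide.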

  11. Behaviour Protocols Verification: Fighting State Explosion

    Czech Academy of Sciences Publication Activity Database

    Mach, M.; Plášil, František; Kofroň, Jan

    2005-01-01

    Roč. 6, č. 2 (2005), s. 22-30 ISSN 1525-9293 R&D Projects: GA ČR(CZ) GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : formal verification * software components * state explosion * behavior protocols * parse trees Subject RIV: JC - Computer Hardware ; Software

  12. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Roč. 18, č. 6 (2012), s. 572-587 ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords : model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  13. Specification, Verification and Optimisation of Business Processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas

    Model and Notation (BPMN). The automated analysis of business processes is done by means of quantitative probabilistic model checking which allows verification of validation and performance properties through use of an algorithm for the translation of business process models into a format amenable...

  14. 78 FR 58492 - Generator Verification Reliability Standards

    Science.gov (United States)

    2013-09-24

    ... model verifications needed to support reliability and enhance the coordination of generator protection... equipment needed to support Bulk-Power System reliability and enhance coordination of important protection... Generating Unit or Plant Capabilities, Voltage Regulating Controls, and Protection) Develop coordination and...

  15. Experimental signature for statistical multifragmentation

    International Nuclear Information System (INIS)

    Moretto, L.G.; Delis, D.N.; Wozniak, G.J.

    1993-01-01

    Multifragment production was measured for the 60 MeV/nucleon 197Au+27Al, 51V, and natCu reactions. The branching ratios for binary, ternary, quaternary, and quinary decays were determined as a function of the excitation energy E and are independent of the target. The logarithms of these branching ratios when plotted vs E-1/2 show a linear dependence that strongly suggests a statistical competition between the various multifragmentation channels. This behavior seems to relegate the role of dynamics to the formation of the sources, which then proceed to decay in an apparently statistical manner

  16. Experimental signature for statistical multifragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Moretto, L.G.; Delis, D.N.; Wozniak, G.J. (Nuclear Science Division, Lawrence Berkeley Laboratory, 1 Cyclotron Road, Berkeley, California 94720 (United States))

    1993-12-13

    Multifragment production was measured for the 60 MeV/nucleon 197Au+27Al, 51V, and natCu reactions. The branching ratios for binary, ternary, quaternary, and quinary decays were determined as a function of the excitation energy E and are independent of the target. The logarithms of these branching ratios when plotted vs E-1/2 show a linear dependence that strongly suggests a statistical competition between the various multifragmentation channels. This behavior seems to relegate the role of dynamics to the formation of the sources, which then proceed to decay in an apparently statistical manner.

  17. Experimental signature for statistical multifragmentation

    Science.gov (United States)

    Moretto, L. G.; Delis, D. N.; Wozniak, G. J.

    1993-12-01

    Multifragment production was measured for the 60 MeV/nucleon 197Au+27Al, 51V, and natCu reactions. The branching ratios for binary, ternary, quaternary, and quinary decays were determined as a function of the excitation energy E and are independent of the target. The logarithms of these branching ratios when plotted vs E-1/2 show a linear dependence that strongly suggests a statistical competition between the various multifragmentation channels. This behavior seems to relegate the role of dynamics to the formation of the sources, which then proceed to decay in an apparently statistical manner.
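
    The statistical signature described in the three records above can be illustrated numerically: if an n-fragment branching ratio follows R ∝ exp(-B/T) with a Fermi-gas temperature T ∝ √E, then ln R is linear in E^(-1/2). The barrier value and level-density parameter below are made up purely for illustration.

```python
import math

# Statistical-decay toy model: R(E) = exp(-B/T) with E = a*T**2,
# so ln R = -B*sqrt(a) * E**(-1/2), i.e. linear in E**(-1/2).

def branching_ratio(E, barrier, a=0.1):
    T = math.sqrt(E / a)                   # Fermi-gas temperature
    return math.exp(-barrier / T)

def slope_of_log_vs_invsqrtE(energies, barrier):
    """Least-squares slope of ln R against E**(-1/2)."""
    xs = [E ** -0.5 for E in energies]
    ys = [math.log(branching_ratio(E, barrier)) for E in energies]
    xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))

# The fitted slope should equal -B*sqrt(a) exactly for this model.
slope = slope_of_log_vs_invsqrtE([50, 100, 200, 400], barrier=20.0)
```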

  18. Efficient Dynamic Integrity Verification for Big Data Supporting Users Revocability

    Directory of Open Access Journals (Sweden)

    Xinpeng Zhang

    2016-05-01

    Full Text Available With the advent of the big data era, cloud data storage and retrieval have become popular for efficient data management in large companies and organizations, which can thus enjoy on-demand, high-quality cloud storage services. Meanwhile, for security reasons, those companies and organizations would like to verify the integrity of their data once it is stored in the cloud. To address this issue, they need a proper cloud storage auditing scheme that matches their actual demands. Current research often focuses on the situation where the data manager owns the data; in reality, however, the data belongs to the company rather than to the data manager, a situation that has been overlooked. For example, the current data manager may no longer be suitable to manage the data stored in the cloud after a period and will be replaced by another one. The successor needs to verify the integrity of the formerly managed data; this problem is obviously inevitable in reality. In this paper, we fill this gap by giving a practical, efficient, revocable privacy-preserving public auditing scheme for cloud storage that meets the auditing requirements of large companies' and organizations' data transfer. The scheme is conceptually simple and is proven to be secure even when the cloud service provider conspires with revoked users.
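
    The paper's scheme relies on revocable, privacy-preserving authenticators; as a much simpler hedged stand-in, the sketch below shows the core integrity-verification idea with a Merkle hash tree: the verifier keeps only the committed root, and any tampered block changes it.

```python
import hashlib

# Merkle-tree commitment over data blocks. This is NOT the paper's
# construction, only the basic outsourced-integrity idea it builds on.

def merkle_root(blocks):
    level = [hashlib.sha256(b).digest() for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node if the level is odd
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

data = [b"block-0", b"block-1", b"block-2", b"block-3"]
committed = merkle_root(data)              # small digest kept by the verifier

intact = merkle_root(data) == committed            # True: data unchanged
data[2] = b"tampered"
still_intact = merkle_root(data) == committed      # False: tampering detected
```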

  19. Using harmonical analysis for experimental verification of reactor dynamics

    International Nuclear Information System (INIS)

    Hrstka, V.

    1974-01-01

    The accuracy of the method of static programming as applied to digital harmonic analysis is discussed, with regard to variation of the mean value of the analyzed signals and to the use of symmetrical trapezoidal periodic signals. The suitability of the above-mentioned method for determining the frequency characteristic of the SR-OA reactor is evaluated. The results obtained were applied to planning the start-up experiments of the KS-150 reactor at the A-1 nuclear power station. (author)
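
    The measurement idea behind such harmonic analysis can be sketched as follows: excite the system periodically and estimate the frequency response at the excitation frequency by correlating input and output with a complex exponential (a single-bin DFT). A first-order lag stands in for the reactor dynamics; all names and parameters are illustrative, not from the paper.

```python
import cmath
import math

def single_bin_dft(samples, k):
    """DFT coefficient at bin k: correlation with exp(-2*pi*i*k*t/n)."""
    n = len(samples)
    return sum(s * cmath.exp(-2j * math.pi * k * i / n)
               for i, s in enumerate(samples))

# First-order lag G(jw) = 1/(1 + j*w*tau) standing in for the plant.
tau = 0.5
n, k, dt = 1024, 8, 0.01
w = 2 * math.pi * k / (n * dt)             # excitation frequency (rad/s)
G = 1 / (1 + 1j * w * tau)

u = [math.sin(w * i * dt) for i in range(n)]                    # input
y = [(G * cmath.exp(1j * w * i * dt)).imag for i in range(n)]   # steady-state output
G_hat = single_bin_dft(y, k) / single_bin_dft(u, k)             # estimated response
```

    Because the excitation spans an integer number of periods, the single-bin ratio recovers G(jw) exactly here; in practice, averaging over many periods suppresses noise and drift of the mean value.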

  20. A DYNAMICAL SIGNATURE OF MULTIPLE STELLAR POPULATIONS IN 47 TUCANAE

    International Nuclear Information System (INIS)

    Richer, Harvey B.; Heyl, Jeremy; Anderson, Jay; Kalirai, Jason S.; Shara, Michael M.; Dotter, Aaron; Fahlman, Gregory G.; Rich, R. Michael

    2013-01-01

    Based on the width of its main sequence, and an actual observed split when viewed through particular filters, it is widely accepted that 47 Tucanae contains multiple stellar populations. In this contribution, we divide the main sequence of 47 Tuc into four color groups, which presumably represent stars of various chemical compositions. The kinematic properties of each of these groups are explored via proper motions, and a strong signal emerges of differing proper-motion anisotropies with differing main-sequence color; the bluest main-sequence stars exhibit the largest proper-motion anisotropy which becomes undetectable for the reddest stars. In addition, the bluest stars are also the most centrally concentrated. A similar analysis for Small Magellanic Cloud stars, which are located in the background of 47 Tuc on our frames, yields none of the anisotropy exhibited by the 47 Tuc stars. We discuss implications of these results for possible formation scenarios of the various populations.
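
    The anisotropy signal discussed above can be illustrated with a toy calculation: resolve each star's proper motion into radial and tangential components about the cluster centre and compare the dispersions; a ratio differing from 1 signals anisotropy. The star sample below is synthetic.

```python
import math

def anisotropy_ratio(positions, proper_motions):
    """Ratio of radial to tangential proper-motion dispersion about the origin."""
    rad, tan = [], []
    for (x, y), (mx, my) in zip(positions, proper_motions):
        r = math.hypot(x, y)
        ux, uy = x / r, y / r                  # unit radial vector
        rad.append(mx * ux + my * uy)          # radial component
        tan.append(-mx * uy + my * ux)         # tangential component
    def rms(v):
        return math.sqrt(sum(c * c for c in v) / len(v))
    return rms(rad) / rms(tan)

# Synthetic radially-biased sample on the unit circle: each star moves
# with radial amplitude 2.0 and tangential amplitude 0.5.
pos = [(math.cos(a), math.sin(a)) for a in (0.1, 0.9, 2.0, 3.1, 4.2, 5.5)]
pms = [(2.0 * x - 0.5 * y, 2.0 * y + 0.5 * x) for x, y in pos]
ratio = anisotropy_ratio(pos, pms)             # 2.0 / 0.5 = 4 for this sample
```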