WorldWideScience

Sample records for dynamic signature verification

  1. FIR signature verification system characterizing dynamics of handwriting features

    Science.gov (United States)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on a finite impulse response (FIR) system characterizing the time-frequency characteristics of dynamic handwriting features. First, a barycenter determined from the center point of the signature and two adjacent pen-point positions, instead of a single pen-point position, is used to reduce the fluctuation of handwriting motion. Among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. The stable dynamic handwriting features can thus be described by the relation between the time-frequency characteristics of the dynamic handwriting features. In this study, that relation is represented by an FIR system whose input and output are the wavelet coefficients of the dynamic handwriting features. The impulse response of the FIR system serves as the individual feature for a particular signature. In short, a signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted on the SUBCORPUS MCYT-100 signature database, consisting of 5,000 signatures from 100 signers. The proposed method yielded an equal error rate (EER) of 3.21% on skilled forgeries.
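
The core comparison step described above can be sketched numerically: estimate an FIR impulse response by least squares from input/output sequences, then compare impulse responses by their distance. This is a minimal illustration, not the authors' implementation; the tap count, the synthetic data, and the Euclidean distance metric are assumptions.

```python
import numpy as np

def estimate_fir(x, y, taps=8):
    """Estimate an FIR impulse response h such that y ~ conv(x, h),
    via least squares on a convolution (Toeplitz-style) matrix."""
    n = len(x)
    X = np.zeros((n, taps))
    for k in range(taps):
        X[k:, k] = x[:n - k]          # column k is x delayed by k samples
    h, *_ = np.linalg.lstsq(X, y, rcond=None)
    return h

def fir_distance(h_ref, h_test):
    """Distance between two impulse responses; small for matching signatures."""
    return float(np.linalg.norm(h_ref - h_test))

# Toy check: recover a known impulse response from synthetic data.
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
h_true = np.array([1.0, 0.5, -0.25, 0.1])
y = np.convolve(x, h_true)[:256]
h_est = estimate_fir(x, y, taps=4)
```

In a verification setting, `fir_distance` between the reference and questioned impulse responses would be thresholded to accept or reject the signature.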

  2. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  3. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping (DTW) as the matching method. The system was tested on 900 signatures belonging to 50 participants: 3 signatures per participant for reference, and 5 test signatures each from the original user, simple impostors, and trained impostors. The final system was tested with 50 participants using 3 references. Without impostors, system accuracy is 90.45% at threshold 44, with a rejection error rate (FNMR) of 5.2% and an acceptance error rate (FMR) of 4.35%. With impostors, system accuracy is 80.14% at threshold 27, with an FNMR of 15.6% and an average FMR of 4.26%, with the following breakdown: acceptance errors of 0.39%, 3.2% for simple impostors, and 9.2% for trained impostors.
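
The dynamic time warping matcher mentioned above can be sketched as follows: a minimal O(nm) dynamic-programming implementation for 1-D feature sequences. The paper's actual feature extraction and thresholds are not reproduced here.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences.
    D[i, j] holds the minimal accumulated cost of aligning a[:i] with b[:j]."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed alignment moves.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

A tested signature would be accepted when its DTW distance to the reference falls below a chosen threshold.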

  4. Online Signature Verification on MOBISIG Finger-Drawn Signature Corpus

    Directory of Open Access Journals (Sweden)

    Margit Antal

    2018-01-01

    We present MOBISIG, a pseudo-signature dataset containing finger-drawn signatures from 83 users, captured with a capacitive-touchscreen mobile device. The database was collected in three sessions, resulting in 45 genuine signatures and 20 skilled forgeries for each user. It was evaluated with two state-of-the-art methods: a function-based system using local features and a feature-based system using global features. Two types of equal error rate computation are performed: one using a global threshold and the other using user-specific thresholds. The lowest equal error rates were 0.01% against random forgeries and 5.81% against skilled forgeries, using user-specific thresholds computed a posteriori. With global thresholds, these equal error rates rose significantly, to 1.68% (random forgeries) and 14.31% (skilled forgeries). The same evaluation protocol was applied to the publicly available DooDB dataset. Besides the verification performance evaluations conducted on the two finger-drawn datasets, we evaluated the quality of the samples and users of both datasets using basic quality measures. The results show that finger-drawn signatures can be used by biometric systems with reasonable accuracy.
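
The two equal-error-rate computations can be illustrated with a small sketch: a global-threshold EER pools all users' scores, while user-specific EERs are computed per user. The helper below sweeps thresholds over distance-like scores; it is an illustrative assumption, not the evaluation code used for MOBISIG.

```python
import numpy as np

def eer(genuine_scores, impostor_scores):
    """Equal error rate for distance-like scores (lower = more genuine).
    Sweeps candidate thresholds and returns the operating point where
    the false rejection rate and false acceptance rate are closest."""
    genuine = np.asarray(genuine_scores, float)
    impostor = np.asarray(impostor_scores, float)
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, best_eer = np.inf, 1.0
    for t in thresholds:
        frr = np.mean(genuine > t)      # genuine samples rejected
        far = np.mean(impostor <= t)    # impostor samples accepted
        if abs(frr - far) < best_gap:
            best_gap, best_eer = abs(frr - far), (frr + far) / 2
    return best_eer
```

For user-specific thresholds, `eer` would be called separately on each user's genuine and impostor scores and the results averaged; for a global threshold, all scores are pooled first.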

  5. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    A person's signature is one of the most popular and legally accepted behavioral biometrics, providing a secure means of verification and personal identification in many applications, such as financial, commercial, and legal transactions. The objective of a signature verification system is to discriminate between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features: it contains diacritics, ligatures, and overlapping. Because no dynamic information from the Arabic signature's writing process is available offline, achieving high verification accuracy is more difficult. This paper addresses this difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is the use of multiple-feature fusion with fuzzy modeling to capture different aspects of a signature individually, in order to improve verification accuracy. State-of-the-art techniques adopt fuzzy sets to describe the properties of the extracted features and thereby handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features, to deal with the ambiguity of a questioned-document examiner's judgment of signature similarity. The experimental results indicate that the verification system performs well and is able to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).
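
One common way to express a fuzzy degree of similarity, as used in fuzzy-modeling approaches like the one described, is a triangular membership function. The function and its parameters below are generic illustrations, not the paper's actual model.

```python
def triangular_membership(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, 1 at the peak b.
    Could express, e.g., the degree of similarity between a questioned
    signature feature and its reference value (names are illustrative)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)   # rising edge
    return (c - x) / (c - b)       # falling edge
```

Several such membership degrees, one per extracted feature, would then be combined by fuzzy rules into an overall genuine/forged decision.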

  6. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    With the development of wireless technology, much data communication and processing is conducted on mobile devices with wireless connections. Although mobile devices will keep improving in absolute ability, they will always be resource-poor relative to static ones, and so cannot perform some expensive computational tasks within their constrained computational resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices outsource expensive computations to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair being verified, which is undesirable, especially when the message includes secret information. In this paper, we study server-aided verification signatures with privacy, in which the message-signature pair to be verified is protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed, both of which are proved secure.

  7. Nonlinear analysis of dynamic signature

    Science.gov (United States)

    Rashidi, S.; Fallah, A.; Towhidkhah, F.

    2013-12-01

    A signature is a long-trained motor skill resulting in a well-practiced combination of segments such as strokes and loops. It is a physical manifestation of complex motor processes. The problem, generally stated, is how relative simplicity in behavior emerges from the considerable complexity of the perception-action system that produces behavior within an infinitely variable biomechanical and environmental context. To address this problem, we present evidence indicating that the motor-control dynamics of the signing process are chaotic. These chaotic dynamics may explain a richer array of time-series behavior in the motor skill of signing. Nonlinear analysis is a powerful approach and a suitable tool for characterizing dynamical systems through concepts such as fractal dimension and Lyapunov exponents. Accordingly, the time series of position and velocity can be analyzed in both the horizontal and vertical directions. We observed that non-integer values of the correlation dimension indicate low-dimensional deterministic dynamics, a result confirmed by surrogate-data tests. We also used the time series to calculate the largest Lyapunov exponent and obtained a positive value. These results constitute significant evidence that signature data are the outcome of chaos in a nonlinear dynamical system of motor control.
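
The correlation dimension mentioned above is typically estimated from the Grassberger-Procaccia correlation sum of a delay-embedded time series; a minimal sketch, in which the embedding dimension, lag, and radius are illustrative assumptions:

```python
import numpy as np

def correlation_sum(series, r, dim=3, lag=1):
    """Correlation sum C(r) for a delay-embedded time series: the fraction
    of embedded point pairs closer than r. The slope of log C(r) versus
    log r over a scaling region estimates the correlation dimension."""
    x = np.asarray(series, float)
    n = len(x) - (dim - 1) * lag
    # Delay embedding: each row is (x[t], x[t+lag], ..., x[t+(dim-1)*lag]).
    emb = np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)          # each pair counted once
    return float(np.mean(dists[iu] < r))
```

In practice C(r) is evaluated for a range of radii and the dimension is read off as the slope of log C(r) against log r.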

  8. Online Signature Verification using Recurrent Neural Network and Length-normalized Path Signature

    OpenAIRE

    Lai, Songxuan; Jin, Lianwen; Yang, Weixin

    2017-01-01

    Inspired by the great success of recurrent neural networks (RNNs) in sequential modeling, we introduce a novel RNN system to improve the performance of online signature verification. The training objective is to directly minimize intra-class variations and to push the distances between skilled forgeries and genuine samples above a given threshold. By back-propagating the training signals, our RNN network produced discriminative features with desired metrics. Additionally, we propose a novel d...

  9. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Handwritten signatures are broadly used for personal verification in financial institutions, which creates the need for a robust automatic signature verification tool aimed at reducing fraud in all related financial-transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a customer's handwritten signature is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which a signature verification tool is judged in banking and other financial institutions.
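
Comparing SURF/SIFT descriptors between a captured signature and a stored template is commonly done by nearest-neighbour matching with a ratio test; a library-free sketch over plain NumPy arrays. The paper does not specify its matching rule, so the ratio threshold here is an assumption.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Nearest-neighbour descriptor matching with Lowe's ratio test.
    desc_a, desc_b: (n, d) arrays of local feature descriptors
    (e.g. SIFT/SURF vectors). Returns index pairs of accepted matches."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Accept only if the best match is clearly better than the runner-up.
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

The number (or fraction) of accepted matches between the query signature and the stored template can then be thresholded for the accept/reject decision.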

  10. Cubic Bezier Curve Approach for Automated Offline Signature Verification with Intrusion Identification

    Directory of Open Access Journals (Sweden)

    Arun Vijayaragavan

    2014-01-01

    Authentication is the process of identifying a person's rights over a system. Many authentication types are used in various systems, among which biometric authentication systems are of special concern. Signature verification is a basic and widely used biometric authentication technique. Signature matching algorithms that use image correlation and graph matching can produce false rejections or acceptances. We propose a model that compares knowledge extracted from the signature. Intrusion into the signature repository system yields a copy of the signature, which leads to false acceptance. Our approach uses a Bezier curve algorithm to identify the curve points and uses the behaviors of the signature for verification. An analyzing mobile agent identifies the input signature parameters and compares them with the reference signature repository; it detects signatures duplicated through intrusion and rejects them. Experiments were conducted on a database of thousands of signature images from various sources, and the results are favorable.
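
A cubic Bezier curve, the building block of the approach above, is evaluated from its four control points via the Bernstein form; a minimal sketch with illustrative control points:

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1] using the
    Bernstein polynomials: B(t) = (1-t)^3 p0 + 3(1-t)^2 t p1
    + 3(1-t) t^2 p2 + t^3 p3."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)
```

Fitting such segments to a signature's stroke points yields control-point parameters that can be compared against a reference instead of raw pixels.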

  11. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Science.gov (United States)

    2010-07-22

    ... Electronic Signature and Storage of Form I-9, Employment Eligibility Verification AGENCY: U.S. Immigration... published an interim final rule to permit electronic signature and storage of the Form I-9. 71 FR 34510... because electronic signature and storage technologies are optional, DHS expects that small entities will...

  12. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" implies matching a user's fingerprint against the single fingerprint associated with the identity the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological.

  13. Streaming-based verification of XML signatures in SOAP messages

    DEFF Research Database (Denmark)

    Somorovsky, Juraj; Jensen, Meiko; Schwenk, Jörg

    2010-01-01

    approach for XML processing, the Web Services servers easily become a target of Denial-of-Service attacks. We present a solution for these problems: an external streaming-based WS-Security Gateway. Our implementation is capable of processing XML Signatures in SOAP messages using a streaming-based approach...

  14. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures; this has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature, the textual meta-data and the numerical spectral data, to arrive at a final qualitative assessment. Results for the spectral data stored in the Signature Database (SigDB) [1] are presented. The numerical data comprising a sample material's spectrum are validated against statistical properties derived from an ideal population set; the quality of the test spectrum is ranked by a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining, which analyzes the syntax of the meta-data to find local learning patterns and trends within the spectral data that are indicative of the test spectrum's quality. Text mining applications have been implemented successfully for security [2] (text encryption/decryption), biomedical [3], and marketing [4] applications. The text-mining lexical-analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model applicable to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum in a database without the need for an expert user. This method has been compared with other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
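
The spectral angle mapper (SAM) comparison used to rank a test spectrum against the population mean reduces to the angle between two spectra treated as vectors; a minimal sketch:

```python
import numpy as np

def spectral_angle(s1, s2):
    """Spectral angle mapper (SAM): the angle in radians between two
    spectra treated as vectors. Zero means identical spectral shape,
    regardless of overall intensity scaling."""
    s1, s2 = np.asarray(s1, float), np.asarray(s2, float)
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    # Clip to guard against floating-point values slightly outside [-1, 1].
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```

Because SAM depends only on spectral shape, it is insensitive to illumination or gain differences between instruments, which is why it suits cross-database quality ranking.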

  15. Dynamical Signatures of Living Systems

    Science.gov (United States)

    Zak, M.

    1999-01-01

    One of the main challenges in modeling living systems is to distinguish random walks of physical origin (for instance, Brownian motions) from those of biological origin; that distinction constitutes the starting point of the proposed approach. As conjectured, the biological random walk must be nonlinear. Indeed, any stochastic Markov process can be described by a linear Fokker-Planck equation (or its discretized version), and only that type of process has been observed in the inanimate world. However, all such processes always converge to a stable (ergodic or periodic) state, i.e., to states of lower complexity and higher entropy. At the same time, the evolution of living systems is directed toward a higher level of complexity, if complexity is associated with the number of structural variations. The simplest way to mimic such a tendency is to incorporate a nonlinearity into the random walk; the probability evolution then attains the features of a nonlinear diffusion equation: the formation and dissipation of shock waves initiated by small shallow wave disturbances. As a result, the evolution never "dies": it produces new, different configurations accompanied by an increase or decrease of entropy (the decrease takes place during the formation of shock waves, the increase during their dissipation). In other words, the evolution can be directed "against the second law of thermodynamics" by forming patterns outside of equilibrium in the probability space. Because of that, a species is not locked into a certain pattern of behavior: it can still perform a variety of motions, and only the statistics of these motions is constrained by the pattern. It should be emphasized that such a "twist" is based upon the concept of reflection, i.e., the existence of a self-image (adopted from psychology). The model consists of a generator of stochastic processes, which represents the motor dynamics in the form of nonlinear random walks, and a simulator of the nonlinear version of the diffusion

  16. A New Approach for High Pressure Pixel Polar Distribution on Off-line Signature Verification

    Directory of Open Access Journals (Sweden)

    Jesús F. Vargas

    2010-06-01

    Features representing information about high-pressure points from a static image of a handwritten signature are analyzed for an offline verification system. A new approach for estimating the high-pressure threshold from grayscale images is proposed. Two images, one containing the extracted high-pressure points and the other a binary version of the original signature, are transformed to polar coordinates, where a pixel-density ratio between them is calculated. The polar space is divided into angular and radial segments, permitting a local analysis of the high-pressure distribution. Finally, two vectors containing the density-distribution ratio are calculated for the points nearest to and farthest from the geometric center of the original signature image. Experiments were carried out using a database containing signatures from 160 individuals. The robustness of the analyzed system against simple forgeries is tested with Support Vector Machine models. For the sake of completeness, a comparison of the results obtained by the proposed approach with similar published works is presented.
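
The angular/radial partition of the polar space and the per-segment density ratio can be sketched as below; the mask names, bin counts, and binning rule are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np

def polar_density_ratio(hpp, binary, n_ang=12, n_rad=4):
    """Per-segment ratio of high-pressure pixels to signature pixels.
    hpp, binary: 2-D boolean masks (high-pressure points, binarized
    signature), partitioned around the geometric centre into
    n_ang angular x n_rad radial segments."""
    ys, xs = np.nonzero(binary)
    cy, cx = ys.mean(), xs.mean()                    # geometric centre
    ang = np.arctan2(ys - cy, xs - cx)               # angle in [-pi, pi]
    rad = np.hypot(ys - cy, xs - cx)
    a_bin = np.minimum(((ang + np.pi) / (2 * np.pi) * n_ang).astype(int),
                       n_ang - 1)
    r_bin = np.minimum((rad / (rad.max() + 1e-9) * n_rad).astype(int),
                       n_rad - 1)
    is_hpp = hpp[ys, xs]
    ratio = np.zeros((n_ang, n_rad))
    for a in range(n_ang):
        for r in range(n_rad):
            seg = (a_bin == a) & (r_bin == r)
            if seg.any():
                ratio[a, r] = is_hpp[seg].mean()
    return ratio.ravel()            # feature vector, e.g. for an SVM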

  17. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. 
A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different

  18. On the pinned field image binarization for signature generation in image ownership verification method

    Directory of Open Access Journals (Sweden)

    Chang Hsuan

    2011-01-01

    The issue of pinned-field image binarization for signature generation in the ownership verification of a protected image is investigated. The pinned field captures the texture information of the protected image and can be employed to enhance watermark robustness. In the proposed method, four optimization schemes are utilized to determine the threshold values for transforming the pinned field into a binary feature image, which is then used to generate an effective signature image. Experimental results show that the optimization schemes significantly improve signature robustness over the previous method (Lee and Chang, Opt. Eng. 49(9), 097005, 2010). Considering both the watermark retrieval rate and computation speed, the genetic algorithm is strongly recommended. In addition, compared with Chang and Lin's scheme (J. Syst. Softw. 81(7), 1118-1129, 2008), the proposed scheme has better performance.

  19. Analysis of an indirect neutron signature for enhanced UF₆ cylinder verification

    International Nuclear Information System (INIS)

    Kulisek, J.A.; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-01-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF₆) cylinders. The current method provides relatively low accuracy for the assay of ²³⁵U enrichment, especially for natural and depleted UF₆. Furthermore, the current method provides no capability to assay the absolute mass of ²³⁵U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from ²³⁵U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA-NT). HEVA-NT enables full-volume assay of UF₆ cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF₆. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA-NT in terms of the individual contributions to HEVA-NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA-NT signature to manipulation by the nearby placement of neutron-conversion materials.

  20. Analysis of an indirect neutron signature for enhanced UF₆ cylinder verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, J.A., E-mail: Jonathan.Kulisek@pnnl.gov; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF₆) cylinders. The current method provides relatively low accuracy for the assay of ²³⁵U enrichment, especially for natural and depleted UF₆. Furthermore, the current method provides no capability to assay the absolute mass of ²³⁵U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from ²³⁵U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA-NT). HEVA-NT enables full-volume assay of UF₆ cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF₆. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA-NT in terms of the individual contributions to HEVA-NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA-NT signature to manipulation by the nearby placement of neutron-conversion materials.

  1. Online Signature Verification: To What Extent Should a Classifier be Trusted in?

    Directory of Open Access Journals (Sweden)

    Marianela Parodi

    2017-08-01

    Selecting the best features to model signatures is one of the major challenges in online signature verification, and combining different feature sets selected by different criteria is a useful technique for addressing it. Along this line, research has mainly focused on the analysis of different features and their discriminative power, paying less attention to how the different kinds of features are combined. Moreover, the conflicting results that may appear when several classifiers are used have rarely been taken into account. In this paper, a score-level fusion scheme is proposed to combine three different and meaningful feature sets: an automatically selected feature set, a feature set relevant to Forensic Handwriting Experts (FHEs), and a global feature set. The score-level fusion is performed within the framework of Belief Function Theory (BFT), in order to address the problem of conflicting results from multiple classifiers. Two models, the Denoeux and the Appriou models, are used to embed the problem within this framework, where the fusion is performed using two well-known combination rules: the Dempster-Shafer (DS) rule and the Proportional Conflict Redistribution rule (PCR5). To analyze the robustness of the proposed score-level fusion approach, the combination is performed for the same verification system using two different classification techniques, Random Forests (RF) and Support Vector Machines (SVM). Experimental results on a publicly available database show that the proposed score-level fusion approach gives the system a very good trade-off between verification results and reliability.
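
Dempster's rule of combination, one of the two combination rules named above, merges two mass functions and renormalizes away the conflicting mass; a minimal sketch over a two-hypothesis frame {genuine, forgery}, with 'theta' standing for total ignorance (the frame and mass values are illustrative, not the paper's models):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions over the frame
    {'genuine', 'forgery'} plus 'theta' (the whole frame, i.e. ignorance).
    Masses are dicts summing to 1; conflicting mass is removed by
    normalisation."""
    frame = {'genuine', 'forgery', 'theta'}
    def meet(a, b):
        # Intersection of focal elements; None stands for the empty set.
        if a == 'theta':
            return b
        if b == 'theta':
            return a
        return a if a == b else None
    combined = {h: 0.0 for h in frame}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            h = meet(a, b)
            if h is None:
                conflict += pa * pb
            else:
                combined[h] += pa * pb
    k = 1.0 - conflict
    return {h: v / k for h, v in combined.items()}
```

PCR5 differs in redistributing the conflicting mass proportionally to the masses that generated it, rather than discarding it through normalization.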

  2. Multimodal three-dimensional dynamic signature

    Directory of Open Access Journals (Sweden)

    Yury E. Kozlov

    2017-11-01

    Reliable authentication in mobile applications is among the most important information-security challenges. Today, we can hardly imagine a person who does not own a mobile device that connects to the Internet. Mobile devices are used to store large amounts of confidential information, ranging from personal photos to electronic banking tools. In 2009, colleagues from Rice University, together with their collaborators from Motorola, proposed authentication through in-air gestures. This and subsequent work contributing to the development of the method are reviewed in our introduction. At the moment, a version of gesture-based authentication software is available for Android mobile devices, but it has not become widespread. One likely reason is the insufficient reliability of the method, which, like its earlier analogs, involves the use of only one device. Here we discuss authentication based on a multimodal three-dimensional dynamic signature (MTDS) performed with two independent mobile devices; the MTDS-based authentication technique is an advanced version of in-air gesture authentication. We describe the operation of a prototype of MTDS-based authentication, including the main implemented algorithms, as well as some preliminary results of testing the software. We expect that our method can be used in any mobile application, provided a number of additional improvements discussed in the conclusion are made.

  3. On-power verification of the dynamic response of self-powered in-core detectors

    International Nuclear Information System (INIS)

    Serdula, K.; Beaudet, M.

    1996-01-01

    Self-powered in-core detectors are used for online safety and regulation purposes in CANDU reactors. Such applications require detectors whose response to flux changes is primarily prompt. In-service verification of detector response is required to ensure that significant degradation in performance has not occurred during long-term operation, since detector characteristics change due to nuclear interactions and failures. Present verification requires significant station resources and disrupts power production. The 'noise' in the detector signal is being investigated as an alternative means of assessing the dynamic response of the detectors during long-term operation. Reference 'signatures' were measured on replacement shutdown-system detectors. Results show that 'noise' measurements are a promising alternative to the current verification method. Identifying changes in the detector response function assists in the accurate diagnosis and prognosis of changes in detector signals due to process changes. (author)
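
Noise analysis of detector signals typically starts from a power spectral density estimate, against which a reference 'signature' can be compared; a simple FFT-based sketch, where the windowing and scaling choices are illustrative, not the station's actual method:

```python
import numpy as np

def noise_psd(signal, fs):
    """One-sided power spectral density of a detector signal via the FFT
    with a Hann window; a simple stand-in for comparing noise signatures."""
    x = np.asarray(signal, float)
    x = x - x.mean()                       # remove the DC component
    w = np.hanning(len(x))
    spec = np.fft.rfft(x * w)
    psd = np.abs(spec) ** 2 / (fs * np.sum(w ** 2))
    psd[1:-1] *= 2.0                       # fold the negative frequencies
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, psd
```

A degrading detector would show a changed roll-off or resonance structure in its PSD relative to the reference signature.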

  4. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  5. Dynamic signature of molecular association in methanol

    International Nuclear Information System (INIS)

    Bertrand, C. E.; Copley, J. R. D.; Faraone, A.; Self, J. L.

    2016-01-01

    Quasielastic neutron scattering measurements and molecular dynamics simulations were combined to investigate the collective dynamics of deuterated methanol, CD₃OD. In the experimentally determined dynamic structure factor, a slow, non-Fickian mode was observed in addition to the standard density-fluctuation heat mode. The simulation results indicate that the slow dynamical process originates from the hydrogen bonding of methanol molecules. The qualitative behavior of this mode is similar to the previously observed α-relaxation in supercooled water [M. C. Bellissent-Funel et al., Phys. Rev. Lett. 85, 3644 (2000)] which also originates from the formation and dissolution of hydrogen-bonded associates (supramolecular clusters). In methanol, however, this mode is distinguishable well above the freezing transition. This finding indicates that an emergent slow mode is not unique to supercooled water, but may instead be a general feature of hydrogen-bonding liquids and associating molecular liquids.

  6. Research on Signature Verification Method Based on Discrete Fréchet Distance

    Science.gov (United States)

    Fang, J. L.; Wu, W.

    2018-05-01

    This paper proposes a multi-feature signature template based on the discrete Fréchet distance, which breaks through the limitation of traditional signature authentication that relies on a single signature feature. It addresses the heavy computational workload of extracting global feature templates in online handwritten signature authentication, as well as the problem of unreasonable signature feature selection. In the experiments, the false acceptance rate (FAR) and false rejection rate (FRR) of the signatures are computed, and the average equal error rate (AEER) is derived. The feasibility of the combined-template scheme is verified by comparing the average equal error rate of the combined template with that of the original template.
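    The abstract does not reproduce the algorithm itself; as background, the discrete Fréchet distance between two sampled pen trajectories is typically computed with the Eiter–Mannila dynamic program. A minimal sketch under that assumption (function name and point format are illustrative, not from the paper):

```python
from math import dist

def discrete_frechet(p, q):
    """Discrete Fréchet distance between two polylines p and q
    (lists of (x, y) points), via the Eiter-Mannila dynamic program."""
    n, m = len(p), len(q)
    ca = [[-1.0] * m for _ in range(n)]  # memo table of partial distances

    def c(i, j):
        if ca[i][j] >= 0:
            return ca[i][j]
        d = dist(p[i], q[j])  # Euclidean distance of the current pair
        if i == 0 and j == 0:
            ca[i][j] = d
        elif i == 0:
            ca[i][j] = max(c(0, j - 1), d)
        elif j == 0:
            ca[i][j] = max(c(i - 1, 0), d)
        else:
            # advance along p, along q, or along both; keep the best
            ca[i][j] = max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
        return ca[i][j]

    return c(n - 1, m - 1)
```

    A verification scheme would compare this distance, per feature channel, against a threshold learned from enrollment signatures.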

  7. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    Full Text Available The article presents the results of verification of a dynamic model of a drive system with gears. Tests were carried out on the real object under different operating conditions. Simulation studies were also carried out for the same assumed conditions. Comparison of the results obtained from those two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research with simulations using the dynamic model.

  8. Parallel verification of dynamic systems with rich configurations

    OpenAIRE

    Pessoa, Eduardo José Dias

    2016-01-01

    Master's dissertation in Informatics Engineering (specialization in Informatics). Model checking is a technique used to automatically verify a model which represents the specification of some system. To ensure the correctness of the system the verification of both static and dynamic properties is often needed. The specification of a system is made through modeling languages, while the respective verification is made by its model-checker. Most modeling frameworks are not...

  9. Early signatures of regime shifts in gene expression dynamics

    Science.gov (United States)

    Pal, Mainak; Pal, Amit Kumar; Ghosh, Sayantari; Bose, Indrani

    2013-06-01

    Recently, a large number of studies have been carried out on the early signatures of sudden regime shifts in systems as diverse as ecosystems, financial markets, population biology and complex diseases. The signatures of regime shifts in gene expression dynamics are less systematically investigated. In this paper, we consider sudden regime shifts in the gene expression dynamics described by a fold-bifurcation model involving bistability and hysteresis. We consider two alternative models, models 1 and 2, of competence development in the bacterial population B. subtilis and determine some early signatures of the regime shifts between competence and noncompetence. We use both deterministic and stochastic formalisms for the purpose of our study. The early signatures studied include the critical slowing down as a transition point is approached, rising variance and the lag-1 autocorrelation function, skewness and a ratio of two mean first passage times. Some of the signatures could provide the experimental basis for distinguishing between bistability and excitability as the correct mechanism for the development of competence.
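    The early-warning indicators named above (rising variance and lag-1 autocorrelation under critical slowing down) are conventionally computed over a sliding window of the observed time series. A generic sketch, not taken from the paper; the window size and indicator definitions are standard choices:

```python
import statistics

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a sequence."""
    mean = statistics.fmean(x)
    num = sum((a - mean) * (b - mean) for a, b in zip(x, x[1:]))
    den = sum((a - mean) ** 2 for a in x)
    return num / den

def early_warning_signals(series, window):
    """Sliding-window variance and lag-1 autocorrelation.
    Both rising together as a bifurcation point is approached is the
    classic 'critical slowing down' signature."""
    out = []
    for i in range(len(series) - window + 1):
        w = series[i:i + window]
        out.append((statistics.pvariance(w), lag1_autocorr(w)))
    return out
```

    In practice the series is detrended before windowing so that slow drifts are not mistaken for rising autocorrelation.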

  10. Early signatures of regime shifts in gene expression dynamics

    International Nuclear Information System (INIS)

    Pal, Mainak; Pal, Amit Kumar; Ghosh, Sayantari; Bose, Indrani

    2013-01-01

    Recently, a large number of studies have been carried out on the early signatures of sudden regime shifts in systems as diverse as ecosystems, financial markets, population biology and complex diseases. The signatures of regime shifts in gene expression dynamics are less systematically investigated. In this paper, we consider sudden regime shifts in the gene expression dynamics described by a fold-bifurcation model involving bistability and hysteresis. We consider two alternative models, models 1 and 2, of competence development in the bacterial population B. subtilis and determine some early signatures of the regime shifts between competence and noncompetence. We use both deterministic and stochastic formalisms for the purpose of our study. The early signatures studied include the critical slowing down as a transition point is approached, rising variance and the lag-1 autocorrelation function, skewness and a ratio of two mean first passage times. Some of the signatures could provide the experimental basis for distinguishing between bistability and excitability as the correct mechanism for the development of competence. (paper)

  11. Dynamic signatures of driven vortex motion.

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, G. W.; Kwok, W. K.; Lopez, D.; Olsson, R. J.; Paulius, L. M.; Petrean, A. M.; Safar, H.

    1999-09-16

    We probe the dynamic nature of driven vortex motion in superconductors with a new type of transport experiment. An inhomogeneous Lorentz driving force is applied to the sample, inducing vortex velocity gradients that distinguish the hydrodynamic motion of the vortex liquid from the elastic and plastic motion of the vortex solid. We observe elastic depinning of the vortex lattice at the critical current, and shear induced plastic slip of the lattice at high Lorentz force gradients.

  12. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    Science.gov (United States)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas which are currently supported by a portable Java based GUI, a Microsoft Visual Studio GUI, and an Eclipse based GUI whose development is in progress.

  13. A Prototype of Mathematical Treatment of Pen Pressure Data for Signature Verification.

    Science.gov (United States)

    Li, Chi-Keung; Wong, Siu-Kay; Chim, Lai-Chu Joyce

    2018-01-01

    A prototype using simple mathematical treatment of the pen pressure data recorded by a digital pen movement recording device was derived. In this study, a total of 48 sets of signature and initial specimens were collected. Pearson's correlation coefficient was used to compare the data of the pen pressure patterns. From the 820 pair comparisons of the 48 sets of genuine signatures, a high degree of matching was found, in which 95.4% (782 pairs) and 80% (656 pairs) had rPA > 0.7 and rPA > 0.8, respectively. In the comparison of the 23 forged signatures with their corresponding control signatures, 20 of them (89.2% of pairs) had markedly lower rPA values. The prototype could be used as a complementary technique to improve the objectivity of signature examination and also has a good potential to be developed as a tool for automated signature identification. © 2017 American Academy of Forensic Sciences.
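    Pearson's correlation of two pen-pressure profiles, with a thresholded decision in the spirit of the rPA > 0.7 figure quoted above, can be sketched as follows. The `same_writer` helper and its default threshold are illustrative assumptions, not the authors' procedure, and the two profiles are assumed to be resampled to equal length:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length
    pen-pressure profiles sampled at the same rate."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def same_writer(r_pa, threshold=0.7):
    """Hypothetical decision rule: genuine pairs in the study tended
    to show rPA above roughly 0.7."""
    return r_pa > threshold
```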

  14. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. Readers are enabled with this simulation-only layer to maintain verification productivity by abstracting away the physical details of the FPGA fabric.  Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  15. MERGER SIGNATURES IN THE DYNAMICS OF STAR-FORMING GAS

    International Nuclear Information System (INIS)

    Hung, Chao-Ling; Sanders, D. B.; Hayward, Christopher C.; Smith, Howard A.; Ashby, Matthew L. N.; Martínez-Galarza, Juan R.; Zezas, Andreas; Lanz, Lauranne

    2016-01-01

    The recent advent of integral field spectrographs and millimeter interferometers has revealed the internal dynamics of many hundreds of star-forming galaxies. Spatially resolved kinematics have been used to determine the dynamical status of star-forming galaxies with ambiguous morphologies, and constrain the importance of galaxy interactions during the assembly of galaxies. However, measuring the importance of interactions or galaxy merger rates requires knowledge of the systematics in kinematic diagnostics and the visible time with merger indicators. We analyze the dynamics of star-forming gas in a set of binary merger hydrodynamic simulations with stellar mass ratios of 1:1 and 1:4. We find that the evolution of kinematic asymmetries traced by star-forming gas mirrors morphological asymmetries derived from mock optical images, in which both merger indicators show the largest deviation from isolated disks during strong interaction phases. Based on a series of simulations with various initial disk orientations, orbital parameters, gas fractions, and mass ratios, we find that the merger signatures are visible for ∼0.2–0.4 Gyr with kinematic merger indicators but can be approximately twice as long for equal-mass mergers of massive gas-rich disk galaxies designed to be analogs of z ∼ 2–3 submillimeter galaxies. Merger signatures are most apparent after the second passage and before black hole coalescence, but in some cases they persist up to several hundred Myr after coalescence. About 20%–60% of the simulated galaxies are not identified as mergers during the strong interaction phase, implying that galaxies undergoing a violent merging process do not necessarily exhibit highly asymmetric kinematics in their star-forming gas. The lack of identifiable merger signatures in this population can lead to an underestimation of merger abundances in star-forming galaxies, and including them in samples of star-forming disks may bias the measurements of disk

  16. Solar wind dynamic pressure variations and transient magnetospheric signatures

    International Nuclear Information System (INIS)

    Sibeck, D.G.; Baumjohann, W.

    1989-01-01

    Contrary to the prevailing popular view, we find that some transient ground events with bipolar north-south signatures are related to variations in solar wind dynamic pressure and not necessarily to magnetic merging. We present simultaneous solar wind plasma observations for two previously reported transient ground events observed at dayside auroral latitudes. During the first event, originally reported by Lanzerotti et al. [1987], conjugate ground magnetometers recorded north-south magnetic field deflections in the east-west and vertical directions. During the second event, reported by Todd et al. [1986], ground radar observations indicated strong northward then southward ionospheric flows. The events were associated with the postulated signatures of patchy, sporadic merging of magnetosheath and magnetospheric magnetic field lines at the dayside magnetopause, known as flux transfer events. Conversely, we demonstrate that the event reported by Lanzerotti et al. was accompanied by a sharp increase in solar wind dynamic pressure, a magnetospheric compression, and a consequent ringing of the magnetospheric magnetic field. The event reported by Todd et al. was associated with a brief but sharp increase in the solar wind dynamic pressure. copyright American Geophysical Union 1989

  17. Massive Black Hole Binaries: Dynamical Evolution and Observational Signatures

    Directory of Open Access Journals (Sweden)

    M. Dotti

    2012-01-01

    Full Text Available The study of the dynamical evolution of massive black hole pairs in mergers is crucial in the context of a hierarchical galaxy formation scenario. The timescales for the formation and the coalescence of black hole binaries are still poorly constrained, resulting in large uncertainties in the expected rate of massive black hole binaries detectable in the electromagnetic and gravitational wave spectra. Here, we review the current theoretical understanding of black hole pairing in galaxy mergers, with particular attention to recent developments and open issues. We conclude with a review of the expected observational signatures of massive binaries and of the candidates discussed in the literature to date.

  18. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long lifetime of operation, and being extremely difficult to maintain and upgrade, at least when comparing with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field with respect to its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: The SoftCare tool for the SFMEA and SFTA

  19. Signatures of discrete breathers in coherent state quantum dynamics

    International Nuclear Information System (INIS)

    Igumenshchev, Kirill; Ovchinnikov, Misha; Prezhdo, Oleg; Maniadis, Panagiotis

    2013-01-01

    In classical mechanics, discrete breathers (DBs) – a spatial time-periodic localization of energy – are predicted in a large variety of nonlinear systems. Motivated by a conceptual bridging of the DB phenomena in classical and quantum mechanical representations, we study their signatures in the dynamics of a quantum equivalent of a classical mechanical point in phase space – a coherent state. In contrast to the classical point that exhibits either delocalized or localized motion, the coherent state shows signatures of both localized and delocalized behavior. The transition from normal to local modes has different characteristics in quantum and classical perspectives. Here, we gain insight into the connection between classical and quantum perspectives by analyzing the decomposition of the coherent state into the system's eigenstates, and analyzing the spatial distribution of the wave-function density within these eigenstates. We find that the delocalized and localized eigenvalue components of the coherent state are separated by a mixed region, where both kinds of behavior can be observed. Further analysis leads to the following observations. Considered as a function of coupling, energy eigenstates go through avoided crossings between tunneling and non-tunneling modes. The dominance of tunneling modes in the high nonlinearity region is compromised by the appearance of new types of modes – high order tunneling modes – that are similar to the tunneling modes but have attributes of non-tunneling modes. Certain types of excitations preferentially excite higher order tunneling modes, allowing one to study their properties. Since auto-correlation functions decrease quickly in highly nonlinear systems, short-time dynamics are sufficient for modeling quantum DBs. This work provides a foundation for implementing modern semi-classical methods to model quantum DBs, bridging classical and quantum mechanical signatures of DBs, and understanding spectroscopic experiments.

  20. The Importance of Hydrological Signature and Its Recurring Dynamics

    Science.gov (United States)

    Wendi, D.; Marwan, N.; Merz, B.

    2017-12-01

    Temporal changes in hydrology are known to be challenging to detect and attribute due to multiple drivers that include complex processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defense, river training, and land use change, could impact variably across space-time scales and influence or mask each other. Besides, data depicting these drivers are often not available. One conventional approach to analyzing change is based on discrete points of magnitude (e.g. the frequency of a recurring extreme discharge); it is often linearly quantified and hence does not reveal potential change in the hydrological process. Moreover, discharge series are often subject to measurement errors, such as rating curve error, especially in the case of flood peaks where observations are derived through extrapolation. In this study, the system dynamics inferred from the hydrological signature (i.e. the shape of the hydrograph) is emphasized. One example is to see if certain flood dynamics (instead of flood peaks) in recent years had also occurred in the past (or are rather extraordinary), and if so, what its recurrence rate is and whether there has been a shift in its occurrence in time or seasonality (e.g. an earlier snowmelt-dominant flood). The utilization of the hydrological signature here is extended beyond those of classical hydrology such as base flow index, recession and rising limb slope, and time to peak. It is in fact all these characteristics combined, i.e. from the start until the end of the hydrograph. A recurrence plot is used as a method to quantify and visualize the recurring hydrological signature through its phase space trajectories, usually with embedding dimension above 2. Such phase space trajectories are constructed by embedding the time series into a series of variables (i.e. the number of dimensions) corresponding to the time delay. Since the method is rather novel in
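    A recurrence plot over phase-space trajectories reconstructed by time-delay embedding, as described above, can be sketched as follows. The embedding dimension, delay, and distance threshold `eps` are illustrative choices, not values from the study:

```python
from math import dist

def embed(series, dim, delay):
    """Time-delay embedding: each phase-space point collects `dim`
    samples spaced `delay` steps apart."""
    n = len(series) - (dim - 1) * delay
    return [tuple(series[i + k * delay] for k in range(dim)) for i in range(n)]

def recurrence_matrix(series, dim=3, delay=2, eps=0.5):
    """Binary recurrence matrix: R[i][j] = 1 where the embedded states
    at times i and j are closer than eps."""
    pts = embed(series, dim, delay)
    return [[1 if dist(a, b) <= eps else 0 for b in pts] for a in pts]
```

    Recurring hydrograph shapes then show up as diagonal line structures in the matrix, whose lengths and spacing can be quantified.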

  1. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, SJ.

    2005-01-01

    Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. This paper describes the development of an independent MU program, concentrating on the implementation of the Enhanced Dynamic Wedge (EDW) component. The difficult case of non centre of field (COF) calculation points under the EDW was studied in some detail. Results of a survey of Australasian centres regarding the use of independent MU check systems are also presented. The system was developed with reference to MU calculations made by the Pinnacle 3D Radiotherapy Treatment Planning (RTP) system (ADAC/Philips) for 4MV, 6MV and 18MV X-ray beams used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. A small systematic error was detected in the equation used for the EDW calculations. Results indicate that COF equations may be used in the non COF situation with similar accuracy to that achieved with profile corrected methods. Further collaborative work with other centres is planned to extend these findings

  2. The SCEC/USGS dynamic earthquake rupture code verification exercise

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed—a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished.Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but hereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches.The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events.To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous

  3. Pen and platen, piezo-electric (Engineering Materials). [Signature verification for access to restricted areas

    Energy Technology Data Exchange (ETDEWEB)

    The set of five drawings defines a writing instrument system that will reliably verify signatures, thus providing a method useful in screening persons seeking entrance to restricted areas or access to computer programs. Using a conventional ballpoint pen refill, the instrument's input derives from signals generated in its writing tip and from pressure exerted by a person writing his name or a code word on the platen (tablet). The basic principle is that accelerations of the writing tip and pressures exerted by the person writing are recorded in three axes. This combination of signals can be processed by a computer and compared with a record in the computer's memory, or a graphic transcription may be compared visually with an earlier record.

  4. Pen and platen, piezo-electric (21 Aug 1978) (Engineering Materials). [Signature verification

    Energy Technology Data Exchange (ETDEWEB)

    The set of five drawings defines a writing instrument system that will reliably verify signatures, thus providing a method useful in screening persons seeking entrance to restricted areas or access to computer programs. Using a conventional ballpoint pen refill, the instrument's input derives from signals generated in its writing tip and from pressure exerted by a person writing his name or a code word on the platen (tablet). The basic principle is that accelerations of the writing tip and pressures exerted by the person writing are recorded in three axes. This combination of signals can be processed by a computer and compared with a record in the computer's memory, or a graphic transcription may be compared visually with an earlier record.

  5. Identification of uranium signatures in swipe samples on verification of nuclear activities for nuclear safeguards purposes

    International Nuclear Information System (INIS)

    Pestana, Rafael Cardoso Baptistini

    2013-01-01

    Environmental sampling for safeguards purposes has been applied by the International Atomic Energy Agency (IAEA) since 1996 and is routinely used as a complementary measure to strengthen traditional nuclear safeguards procedures. The aim is to verify that states signatory to the safeguards agreements are not diverting their peaceful nuclear activities to undeclared nuclear activities. This work describes a new protocol for the collection and analysis of swipe samples for the identification of nuclear signatures that may be related to the nuclear activities developed in the inspected facility. A real uranium conversion plant of the nuclear fuel cycle at IPEN was used as a case study. The proposed strategy uses different analytical techniques, such as alpha radiation measurement, SEM-EDX and ICP-MS, to identify signatures of uranium adhered to the swipe samples. In the swipe sample analysis, it was possible to identify particles of UO2F2 and UF4 through morphological comparison and semi-quantitative analyses performed by the SEM-EDX technique. In this work, methods were used that yield the average isotopic composition of the sample, in which the enrichment ranged from 1.453 ± 0.023 % to 18.24 ± 0.15 % in the 235U isotope. Through these external collections, a non-intrusive sampling, it was possible to identify enriched material handling activities with enrichment of 1.453 ± 0.023 % to 6.331 ± 0.055 % in the 235U isotope, as well as the use of reprocessed material, through the identification of the 236U isotope. The uncertainties obtained for the n(235U)/n(238U) ratio varied from 0.40 % to 0.86 % for the internal swipe samples. (author)

  6. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    Science.gov (United States)

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristics coefficients is difficult to guarantee because of the errors of the measurement system itself. A novel dynamic calibration method of the measurement system for dynamic characteristics coefficients is proposed in this paper to eliminate the errors of the measurement system itself. This method differs from the suspended-mass calibration method in that the verification device is a spring-mass system, which can simulate the dynamic characteristics of a sliding bearing. The verification device was built, and the calibration experiment was implemented over a wide frequency range, in which the bearing stiffness was simulated by disc springs. The experimental results show that the amplitude errors of this measurement system are small in the frequency range of 10 Hz–100 Hz, and the phase errors increase with increasing frequency. It is preliminarily verified, by a simulated experiment of dynamic characteristics coefficients identification in the frequency range of 10 Hz–30 Hz, that the calibration data in this frequency range can support the dynamic characteristics test of sliding bearings in this frequency range well. Bearing experiments over greater frequency ranges need higher manufacturing and installation precision of the calibration device. Besides, the processes of the calibration experiments should be improved. PMID:27483283

  7. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, S.J.; University of Newcastle, NSW

    2004-01-01

    Full text: Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in USA but not yet in Australia. The enhanced dynamic wedge factor (EDWF) presents some significant problems in accurate MU calculation, particularly in the case of non centre of field position (COF). This paper describes development of an independent MU program, concentrating on the implementation of the EDW component. The difficult case of non COF points under the EDW was studied in detail. A survey of Australasian centres regarding the use of independent MU check systems was conducted. The MUCalculator was developed with reference to MU calculations made by the Pinnacle 3D RTP system (Philips) for 4MV, 6MV and 18MV X-ray beams from Varian machines used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. Ionisation chamber measurements in Solid Water™ and liquid water were performed based on a published test data set. Published algorithms combined with a depth dependent profile correction were applied in an attempt to match measured data with maximum accuracy. The survey results are presented. Substantial data is presented in tabular form with extensive comparison to published data. Several different methods for calculating EDWF are examined. A small systematic error was detected in the Gibbon equation used for the EDW calculations. Generally, calculations were within +2% of measured values, although some setups exceeded this variation. Results indicate that COF

  8. Possibilities of dynamic biometrics for authentication and the circumstances for using dynamic biometric signature

    Directory of Open Access Journals (Sweden)

    Frantisek Hortai

    2018-01-01

    Full Text Available New information technologies bring new dangers alongside their benefits. It is difficult to decide which authentication tool to use and implement in information systems and electronic documents. The final decision has to compromise among several conflicting requirements: a highly secure tool that is also user-friendly and simple to use, protection against user errors and failures, fast authentication, and a reasonable price. Even when a compromise solution is found, it has to fulfil the given technology standards. For these reasons the paper argues for one of the most natural biometric authentication methods, the dynamic biometric signature, and lists its related standards. The paper also includes a measurement evaluation that examines the independence between a person's signature and the device on which it was created.

  9. Modeling the dynamics of internal flooding - verification analysis

    International Nuclear Information System (INIS)

    Filipov, K.

    2011-01-01

    The results of a verification analysis of the WATERFOW software, developed for reactor building internal flooding analysis, are presented. The integrated code MELCOR was selected for benchmarking. Considering the complex structure of the reactor building, sample tests were used to cover the characteristic points of the internal flooding analysis. The inapplicability of MELCOR to the internal flooding study was demonstrated.

  10. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  11. Dynamic thermal signature prediction for real-time scene generation

    Science.gov (United States)

    Christie, Chad L.; Gouthas, Efthimios (Themie); Williams, Owen M.; Swierkowski, Leszek

    2013-05-01

    At DSTO, a real-time scene generation framework, VIRSuite, has been developed in recent years, within which trials data are predominantly used for modelling the radiometric properties of the simulated objects. Since in many cases the data are insufficient, a physics-based simulator capable of predicting the infrared signatures of objects and their backgrounds has been developed as a new VIRSuite module. It includes transient heat conduction within the materials, and boundary conditions that take into account the heat fluxes due to solar radiation, wind convection and radiative transfer. In this paper, an overview is presented, covering both the steady-state and transient performance.
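The physics described above, transient conduction plus solar, convective and radiative boundary fluxes, can be sketched with a minimal 1-D explicit finite-difference update. This is a loose illustration of the kind of calculation such a simulator performs; the material properties, fluxes and grid below are made up and are not VIRSuite parameters.

```python
# Minimal 1-D explicit transient-conduction sketch; surface node 0 receives
# solar + convective + radiative fluxes, the deepest node is held fixed.
# All material values and fluxes are illustrative.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def step(T, dx, dt, k, rho_c, q_solar, h, T_air, eps):
    """Advance the temperature profile T (K) by one explicit time step."""
    alpha = k / rho_c
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit stability limit violated"
    Tn = T[:]
    for i in range(1, len(T) - 1):          # interior conduction
        Tn[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
    # surface energy balance: absorbed solar + convection - thermal emission
    q = q_solar + h * (T_air - T[0]) - eps * SIGMA * T[0] ** 4
    Tn[0] = T[0] + dt * (q + k * (T[1] - T[0]) / dx) / (rho_c * dx)
    return Tn

# a steel-like plate, 4 cm deep, warming in the sun for one minute
T = [290.0] * 21
for _ in range(600):                         # 600 steps x 0.1 s
    T = step(T, dx=0.002, dt=0.1, k=50.0, rho_c=3.9e6,
             q_solar=600.0, h=10.0, T_air=293.0, eps=0.9)
print(round(T[0] - 290.0, 3), "K surface rise after 60 s")
```

A real signature predictor would run such an update per facet of the object model and convert the resulting surface temperatures to in-band radiance.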

  12. Leaf trajectory verification during dynamic intensity modulated radiotherapy using an amorphous silicon flat panel imager

    International Nuclear Information System (INIS)

    Sonke, Jan-Jakob; Ploeger, Lennert S.; Brand, Bob; Smitsmans, Monique H.P.; Herk, Marcel van

    2004-01-01

    An independent verification of the leaf trajectories during each treatment fraction improves the safety of IMRT delivery. In order to verify dynamic IMRT with an electronic portal imaging device (EPID), the EPID response should be accurate and fast, such that the effect of motion blurring on the detected moving field edge position is limited. In the past, it was shown that the errors in the detected position of a moving field edge determined by a scanning liquid-filled ionization chamber (SLIC) EPID are negligible in clinical practice. Furthermore, a method for leaf trajectory verification during dynamic IMRT was successfully applied using such an EPID. EPIDs based on amorphous silicon (a-Si) arrays are now widely available. Such a-Si flat panel imagers (FPIs) produce portal images with superior image quality compared to other portal imaging systems, but they have not yet been used for leaf trajectory verification during dynamic IMRT. The aim of this study is to quantify the effect of motion distortion and motion blurring on the detection accuracy of a moving field edge for an Elekta iViewGT a-Si FPI and to investigate its applicability to leaf trajectory verification during dynamic IMRT. We found the detection error for a moving field edge to be smaller than 0.025 cm at a speed of 0.8 cm/s. Hence, the effect of motion blurring on the detection accuracy of a moving field edge is negligible in clinical practice. Furthermore, the a-Si FPI was successfully applied to the verification of dynamic IMRT. The verification method revealed a delay in the control system of the experimental DMLC that was also found using a SLIC EPID, resulting in leaf positional errors of 0.7 cm at a leaf speed of 0.8 cm/s.
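Detecting a field edge in an EPID frame reduces to locating the 50%-of-maximum crossing in an intensity profile, with interpolation between pixels to reach the sub-millimetre accuracy quoted above. The sketch below is a simplified, hypothetical illustration (a 1-D profile, linear interpolation, a made-up pixel pitch), not the authors' method.

```python
# Locate a field edge as the 50%-of-maximum crossing of an intensity profile,
# with linear sub-pixel interpolation. Profile values and pitch are made up.

def edge_position(profile, pixel_pitch_cm=0.025):
    """Return the position (cm) where the profile first crosses 50% of max."""
    half = max(profile) / 2.0
    for i in range(1, len(profile)):
        lo, hi = profile[i - 1], profile[i]
        if lo < half <= hi:
            frac = (half - lo) / (hi - lo)   # linear interpolation
            return (i - 1 + frac) * pixel_pitch_cm
    raise ValueError("no 50% crossing found")

# synthetic penumbra: shielded region, ramp, open field
profile = [10, 12, 11, 40, 80, 120, 150, 152, 151]
print(edge_position(profile))
```

Comparing the detected edge position per frame against the planned leaf trajectory at the same time stamp is then the essence of the verification.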

  13. A Directed Signature Scheme and its Applications

    OpenAIRE

    Lal, Sunder; Kumar, Manoj

    2004-01-01

    This paper presents a directed signature scheme with the property that the signature can be verified only with the help of signer or signature receiver. We also propose its applications to share verification of signatures and to threshold cryptosystems.

  14. The research for the design verification of nuclear power plant based on VR dynamic plant

    International Nuclear Information System (INIS)

    Wang Yong; Yu Xiao

    2015-01-01

    This paper studies a new method of design verification based on a VR plant, in order to verify and validate that the plant design conforms to the requirements of accident emergency response. The VR dynamic plant is built from the 3D design model and digital maps composed of a GIS system and indoor maps, and is driven by the analysis data of a design analyzer. The VR plant can present both the operating conditions and the accident conditions of the power plant. Based on the VR dynamic plant, this paper simulates the execution of accident procedures, the development of accidents, the evacuation planning of people and so on, to ensure that the plant design causes no adverse effects. Besides design verification, the simulated results can also be used for optimization of the accident emergency plan, training on the accident plan, and emergency accident treatment. (author)

  15. Universal Nonequilibrium Signatures of Majorana Zero Modes in Quench Dynamics

    Directory of Open Access Journals (Sweden)

    R. Vasseur

    2014-10-01

    Full Text Available The quantum evolution that occurs after a metallic lead is suddenly connected to an electron system contains information about the excitation spectrum of the combined system. We exploit this type of “quantum quench” to probe the presence of Majorana fermions at the ends of a topological superconducting wire. We obtain an algebraically decaying overlap (Loschmidt echo) L(t) = |⟨ψ(0)|ψ(t)⟩|^{2} ∼ t^{-α} for large times after the quench, with a universal critical exponent α = 1/4 that is found to be remarkably robust against details of the setup, such as interactions in the normal lead, the existence of additional lead channels, or the presence of bound levels between the lead and the superconductor. As in recent quantum-dot experiments, this exponent could be measured by optical absorption, offering a new signature of Majorana zero modes that is distinct from interferometry and tunneling spectroscopy.
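An algebraic decay such as L(t) ∼ t^(-α) can be extracted from data by a least-squares fit in log-log coordinates. A minimal sketch with synthetic echo data (the t^(-1/4) law plugged in by hand, so the fit recovers α = 1/4 exactly):

```python
import math

# Recover the power-law exponent alpha from L(t) ~ t^(-alpha) by a
# least-squares line fit in log-log coordinates; data here are synthetic.

def fit_power_law(ts, Ls):
    """Fit log L = c - alpha * log t and return alpha."""
    xs = [math.log(t) for t in ts]
    ys = [math.log(L) for L in Ls]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope

ts = [10.0 * 1.5 ** k for k in range(12)]   # logarithmically spaced times
Ls = [t ** -0.25 for t in ts]               # ideal alpha = 1/4 decay
print(round(fit_power_law(ts, Ls), 3))
```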

  16. The dynamic flowgraph methodology as a safety analysis tool : programmable electronic system design and verification

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2002-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM, and

  17. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power system with an organic working fluid. The purposes of the technology verification phase are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported.

  18. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  19. A Signature Comparing Android Mobile Application Utilizing Feature Extracting Algorithms

    Directory of Open Access Journals (Sweden)

    Paul Grafilon

    2017-08-01

    Full Text Available The paper presents one of the applications that can be built around a smartphone camera. Nowadays forgery is one of the most undetected crimes. Even with the forensic technology used today, it is still difficult for authorities to compare signatures and decide which is genuine and which is forged. A signature is a legal representation of a person, and transactions are based on it. Forgers may use a signature to sign illegal contracts and withdraw from bank accounts undetected; a signature can also be forged during election periods for repeated voting. Given these issues, a signature should always be secure. Signature verification is a reduced problem that still poses a real challenge for researchers. The literature on signature verification is quite extensive and shows two main areas of research: off-line and on-line systems. Off-line systems deal with a static image of the signature, i.e. the result of the action of signing, while on-line systems work on the dynamic process of generating the signature, i.e. the action of signing itself. The researchers have found a way to address these concerns: a mobile application that uses the camera to take a picture of a signature, analyzes it, and compares it to other signatures for verification. It exists to help citizens be more cautious and aware of issues regarding signatures. It may also help organizations and institutions such as banks and insurance companies verify signatures, avoiding unwanted transactions and identity theft, and assist the authorities in the never-ending battle against crime, especially against forgers and thieves. The project aimed to design and develop a mobile application that integrates the smartphone camera for verifying and comparing signatures for security, using the best algorithm possible. As a result of the development, the smartphone camera application is functional and reliable.

  20. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    Halsey, D.G.

    1982-01-01

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance

  2. A Signature of Inflation from Dynamical Supersymmetry Breaking

    CERN Document Server

    Kinney, W H; Kinney, William H.; Riotto, Antonio

    1998-01-01

    In models of cosmological inflation motivated by dynamical supersymmetry breaking, the potential driving inflation may be characterized by inverse powers of a scalar field. These models produce observables similar to those typical of the hybrid inflation scenario: negligible production of tensor (gravitational wave) modes, and a blue scalar spectral index. In this short note, we show that, unlike standard hybrid inflation models, dynamical supersymmetric inflation (DSI) predicts a measurable deviation from a power-law spectrum of fluctuations, with a variation in the scalar spectral index $|dn / d(\ln k)|$ that may be as large as 0.05. DSI can be observationally distinguished from other hybrid models with cosmic microwave background measurements at the planned sensitivity of ESA's Planck Surveyor.

  3. Designing dynamically "signature business model" that support durable competitive advantage

    OpenAIRE

    Čirjevskis, Andrejs

    2016-01-01

    Purpose/Research question: The paper provides an empirical study of the Samsung case. In particular, we study the case by adopting three frameworks: dynamic capabilities (DC, examined using the sensing/seizing/transforming approach), business model (BM, examined using the BM canvas), and customer value proposition (CVP, examined using the PERFA framework: Performance, Ease of use, Reliability, Flexibility, and Affectivity). The aim is to demonstrate that three frameworks success...

  4. Forged Signature Distinction Using Convolutional Neural Network for Feature Extraction

    Directory of Open Access Journals (Sweden)

    Seungsoo Nam

    2018-01-01

    Full Text Available This paper proposes a dynamic verification scheme for finger-drawn signatures in smartphones. As a dynamic feature, the movement of the smartphone is recorded with its accelerometer sensors, in addition to the moving coordinates of the signature. To extract high-level longitudinal and topological features, the proposed scheme uses a convolutional neural network (CNN) for feature extraction, not as a conventional classifier. We assume that a CNN trained with forged signatures can extract effective features (called the S-vector), which are common in forging activities such as hesitation and delay before drawing the complicated part. The proposed scheme also exploits an autoencoder (AE) as a classifier, and the S-vector is used as the input vector to the AE. An AE has high accuracy for one-class distinction problems such as signature verification, and is also greatly dependent on the accuracy of its input data. The S-vector is valuable as the input of the AE and, consequently, leads to improved verification accuracy, especially for distinguishing forged signatures. Compared to the previous work, i.e., the MLP-based finger-drawn signature verification scheme, the proposed scheme decreases the equal error rate by 13.7 percentage points, from 18.1% to 4.4%, for discriminating forged signatures.
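The autoencoder-based decision rule amounts to thresholding a reconstruction error: inputs resembling the training class are reconstructed well, outliers are not. As a loose illustration of that rule only (not the paper's network), the sketch below substitutes a trivial "reconstruct with the training mean" model for the autoencoder; the feature vectors and the 3-sigma threshold are made up.

```python
import math

# One-class verification by reconstruction-error thresholding. A trivial
# "reconstruct with the training mean" model stands in for an autoencoder;
# all vectors and the 3-sigma threshold are illustrative.

def recon_error(v, mean):
    """Euclidean distance between a vector and its 'reconstruction'."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, mean)))

def train(genuine):
    """Return (mean vector, acceptance threshold) from genuine samples."""
    dim = len(genuine[0])
    mean = [sum(v[i] for v in genuine) / len(genuine) for i in range(dim)]
    errs = [recon_error(v, mean) for v in genuine]
    mu = sum(errs) / len(errs)
    sd = math.sqrt(sum((e - mu) ** 2 for e in errs) / len(errs))
    return mean, mu + 3 * sd                 # 3-sigma acceptance threshold

def is_genuine(v, model):
    mean, threshold = model
    return recon_error(v, mean) <= threshold

genuine = [[1.0, 2.0], [1.1, 1.9], [0.9, 2.1], [1.0, 2.05]]
model = train(genuine)
print(is_genuine([1.05, 2.0], model), is_genuine([4.0, -1.0], model))
```

In the paper the input to this stage would be the CNN-extracted S-vector rather than raw coordinates.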

  5. The electronic identification, signature and security of information systems

    Directory of Open Access Journals (Sweden)

    Horovčák Pavel

    2002-12-01

    Full Text Available The contribution deals with current methods and technologies for the security of information and communication systems. It introduces an overview of electronic identification elements such as the static password, the dynamic password and single sign-on; biometric and dynamic characteristics of the verified person also belong in this category. Authentication based on the ownership of identification elements, such as various cards and authentication calculators, is also widespread. The next part gives a definition and characterization of the electronic signature, its basic functions and certificate categories. Practical utilization of the electronic signature consists of acquiring an electronic signature, signing an outgoing email message, receiving an electronically signed message and verifying the electronic signature. The use of the electronic signature is continuously growing and, in connection with the development of legislation, it is being adopted across all sectors.
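The hash-then-sign flow described above (sign an outgoing message with a private key, verify on receipt with the public key) can be illustrated with a toy RSA key pair. The 12-bit modulus below is for demonstration only and is cryptographically worthless; real electronic signatures use standardized schemes and certificate-backed keys.

```python
import hashlib

# Toy RSA-style sign/verify illustrating hash-then-sign. The tiny key
# (n = 61 * 53) is for demonstration only and offers no security.

N, E, D = 3233, 17, 2753        # public (N, E), private exponent D; E*D = 1 mod phi(N)

def digest(message: bytes) -> int:
    """Hash the message and reduce into the signing domain."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % N

def sign(message: bytes) -> int:
    """'Encrypt' the hash with the private exponent."""
    return pow(digest(message), D, N)

def verify(message: bytes, signature: int) -> bool:
    """Recover the hash with the public exponent and compare."""
    return pow(signature, E, N) == digest(message)

msg = b"outgoing email message"
sig = sign(msg)
print(verify(msg, sig))
```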

  6. Verification of Continuous Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Wisniewski, Rafael

    2011-01-01

    This paper presents a method for abstracting continuous dynamical systems by timed automata. The abstraction is based on partitioning the state space of a dynamical system using positive invariant sets, which form cells that represent locations of a timed automaton. The partition is generated utilizing sub-level sets of Lyapunov functions, as they are positive invariant sets. It is shown that this partition generates sound and complete abstractions. Furthermore, the complete abstractions can be composed of multiple timed automata, allowing parallelization.
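The cell construction can be illustrated on the scalar system dx/dt = -x with Lyapunov function V(x) = x²: each sub-level set {V ≤ c} is positive invariant, so the annular cells between successive levels can only be left inward, which is exactly the one-way location transitions a timed-automaton abstraction encodes. The levels and step size in this sketch are illustrative.

```python
# Sub-level sets of V(x) = x^2 as invariant cells for dx/dt = -x: a simulated
# trajectory may only move to inner cells. Levels and dt are illustrative.

levels = [0.25, 1.0, 4.0, 16.0]            # V-thresholds defining the cells

def cell(x):
    """Index of the innermost sub-level set containing x (smaller = inner)."""
    v = x * x
    for i, c in enumerate(levels):
        if v <= c:
            return i
    return len(levels)

x, dt = 3.5, 0.01
visited = [cell(x)]
for _ in range(2000):
    x += dt * (-x)                         # Euler step of dx/dt = -x
    if cell(x) != visited[-1]:
        visited.append(cell(x))
print(visited)
```

The sequence of visited cells is strictly decreasing, mirroring the soundness property of the abstraction.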

  7. Dynamics of railway bridges, analysis and verification by field tests

    Directory of Open Access Journals (Sweden)

    Andersson Andreas

    2015-01-01

    Full Text Available The following paper discusses different aspects of railway bridge dynamics, comprising analysis, modelling procedures and experimental testing. The importance of realistic models is discussed, especially regarding boundary conditions, load distribution and soil-structure interaction. Two theoretical case studies are presented, involving both deterministic and probabilistic assessment of a large number of railway bridges using simplified and computationally efficient models. A total of four experimental case studies are also introduced, illustrating different aspects and phenomena in bridge dynamics. The excitation sources comprise ambient vibrations, train-induced vibrations, free vibrations after train passages, and controlled forced excitation.

  8. Structure-dynamic model verification calculation of PWR 5 tests

    International Nuclear Information System (INIS)

    Engel, R.

    1980-02-01

    Within reactor safety research project RS 16 B of the German Federal Ministry of Research and Technology (BMFT), blowdown experiments are conducted at Battelle Institut e.V. Frankfurt/Main using a model reactor pressure vessel with a height of 11.2 m and internals corresponding to those in a PWR. In the present report the dynamic loading on the pressure vessel internals (upper perforated plate and barrel suspension) during the DWR 5 experiment is calculated by means of a vertical and a horizontal dynamic model using the CESHOCK code. The equations of motion are solved by direct integration. (orig./RW) [de]

  9. Signatures of Indistinguishability in Bosonic Many-Body Dynamics

    Science.gov (United States)

    Brünner, Tobias; Dufour, Gabriel; Rodríguez, Alberto; Buchleitner, Andreas

    2018-05-01

    The dynamics of bosons in generic multimode systems, such as Bose-Hubbard models, are not only determined by interactions among the particles, but also by their mutual indistinguishability manifested in many-particle interference. We introduce a measure of indistinguishability for Fock states of bosons whose mutual distinguishability is controlled by an internal degree of freedom. We demonstrate how this measure emerges both in the noninteracting and interacting evolution of observables. In particular, we find an unambiguous relationship between our measure and the variance of single-particle observables in the noninteracting limit. A nonvanishing interaction leads to a hierarchy of interaction-induced interference processes, such that even the expectation value of single-particle observables is influenced by the degree of indistinguishability.

  10. Issues in computational fluid dynamics code verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.; Blottner, F.G.

    1997-09-01

    A broad range of mathematical modeling errors of fluid flow physics and numerical approximation errors are addressed in computational fluid dynamics (CFD). It is strongly believed that if CFD is to have a major impact on the design of engineering hardware and flight systems, the level of confidence in complex simulations must substantially improve. To better understand the present limitations of CFD simulations, a wide variety of physical modeling, discretization, and solution errors are identified and discussed. Here, discretization and solution errors refer to all errors caused by conversion of the original partial differential, or integral, conservation equations representing the physical process, to algebraic equations and their solution on a computer. The impact of boundary conditions on the solution of the partial differential equations and their discrete representation will also be discussed. Throughout the article, clear distinctions are made between the analytical mathematical models of fluid dynamics and the numerical models. Lax's Equivalence Theorem and its frailties in practical CFD solutions are pointed out. Distinctions are also made between the existence and uniqueness of solutions to the partial differential equations as opposed to the discrete equations. Two techniques are briefly discussed for the detection and quantification of certain types of discretization and grid resolution errors.
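One standard quantification of the discretization and grid-resolution errors discussed here is the observed order of accuracy combined with Richardson extrapolation across three systematically refined grids. A minimal sketch with synthetic grid solutions (chosen to behave like a second-order scheme, so the machinery recovers p = 2):

```python
import math

# Observed order of accuracy and Richardson extrapolation from three
# systematically refined grids; the grid solutions here are synthetic.

def observed_order(f_coarse, f_medium, f_fine, r):
    """p = ln((f1 - f2) / (f2 - f3)) / ln(r) for refinement ratio r."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson(f_medium, f_fine, r, p):
    """Extrapolated estimate of the grid-converged value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# solutions behaving like f(h) = 1 + 0.5 * h^2 on h = 0.4, 0.2, 0.1
f1, f2, f3 = 1.08, 1.02, 1.005
p = observed_order(f1, f2, f3, r=2.0)
print(round(p, 2), round(richardson(f2, f3, 2.0, p), 4))
```

The gap between the finest-grid solution and the extrapolated value then serves as an estimate of the remaining discretization error.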

  11. Dynamic simulator for nuclear power plants (DSNP): development, verification, and expansion of modules

    International Nuclear Information System (INIS)

    Larson, H.A.; Dean, E.M.; Koenig, J.F.; Gale, J.G.; Lehto, W.K.

    1984-01-01

    The DSNP Simulation Language facilitates whole reactor plant simulation and design. Verification includes DSNP dynamic modeling of Experimental Breeder Reactor No. 2 (EBR-II) plant experiments as well as comparisons with verified simulation programs. Great flexibility is allowed in expanding the DSNP language and in accommodating other computer languages. The component modules of DSNP, contained in libraries, are continually updated with new, improved, and verified modules. The modules are used to simulate the dynamic response of LMFBR reactor systems to upset and transient conditions, with special emphasis on investigations of inherent shutdown mechanisms.

  12. Far-IR transparency and dynamic infrared signature control with novel conducting polymer systems

    Science.gov (United States)

    Chandrasekhar, Prasanna; Dooley, T. J.

    1995-09-01

    Materials which possess transparency, coupled with active controllability of this transparency in the infrared (IR), are today an increasingly important requirement for varied applications. These applications include windows for IR sensors, IR-region flat panel displays used in camouflage as well as in communication and sight through night-vision goggles, coatings with dynamically controllable IR emissivity, and thermal conservation coatings. Among the stringent requirements for these applications are large dynamic ranges (color contrast), 'multi-color' or broad-band characteristics, extended cyclability, long memory retention, matrix addressability, small-area fabricability, low power consumption, and environmental stability. Among materials possessing the requirements for variation of IR signature, conducting polymers (CPs) appear to be the only materials with a dynamic, actively controllable signature and acceptable dynamic range. Conventional CPs such as poly(alkyl thiophene), poly(pyrrole) or poly(aniline) show very limited dynamic range, especially in the far-IR, while also showing poor transparency. We have developed a number of novel CP systems ('system' implying the CP, the selected dopant, the synthesis method, and the electrolyte) with very wide dynamic range (up to 90% in both important IR regions, 3–5 μm and 8–12 μm), high cyclability (to 10⁵ cycles with less than 10% optical degradation), nearly indefinite optical memory retention, matrix addressability of multi-pixel displays, very wide operating temperature and excellent environmental stability, low charge capacity, and processability into areas from less than 1 mm² to more than 100 cm². The criteria used to design and arrive at these CP systems, together with representative IR signature data, are presented in this paper.

  13. Research on verification and validation strategy of detonation fluid dynamics code of LAD2D

    Science.gov (United States)

    Wang, R. L.; Liang, X.; Liu, X. Z.

    2017-07-01

    Verification and validation (V&V) is an important approach to software quality assurance for codes in complex engineering applications, and a reasonable and efficient V&V strategy can achieve twice the result with half the effort. This article introduces the Lagrangian adaptive hydrodynamics code in 2D space (LAD2D), self-developed software for detonation CFD with elastic-plastic structure response. The V&V strategy of this detonation CFD code is presented on the foundation of V&V methodology for scientific software. The basic framework of module verification and function validation is proposed, composing the V&V strategy for the detonation fluid dynamics model of LAD2D.

  14. Dynamic Gesture Recognition with a Terahertz Radar Based on Range Profile Sequences and Doppler Signatures.

    Science.gov (United States)

    Zhou, Zhi; Cao, Zongjie; Pi, Yiming

    2017-12-21

    The frequency of terahertz radar ranges from 0.1 THz to 10 THz, which is higher than that of microwaves. Multi-modal signals, including high-resolution range profile (HRRP) and Doppler signatures, can be acquired by the terahertz radar system. These two kinds of information are commonly used in automatic target recognition; however, dynamic gesture recognition is rarely discussed in the terahertz regime. In this paper, a dynamic gesture recognition system using a terahertz radar is proposed, based on multi-modal signals. The HRRP sequences and Doppler signatures were first achieved from the radar echoes. Considering the electromagnetic scattering characteristics, a feature extraction model is designed using location parameter estimation of scattering centers. Dynamic Time Warping (DTW) extended to multi-modal signals is used to accomplish the classifications. Ten types of gesture signals, collected from a terahertz radar, are applied to validate the analysis and the recognition system. The results of the experiment indicate that the recognition rate reaches more than 91%. This research verifies the potential applications of dynamic gesture recognition using a terahertz radar.
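The classifier at the heart of this system is dynamic time warping. The paper extends DTW to multi-modal signals (HRRP sequences plus Doppler signatures), but the underlying single-channel recurrence is sketched below with toy 1-D "gesture" sequences; the labels and templates are made up.

```python
# Minimal dynamic time warping (DTW) with nearest-template classification;
# sequences and labels are toy 1-D stand-ins for radar feature sequences.

def dtw(a, b):
    """Classic O(len(a) * len(b)) DTW distance with absolute-difference cost."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def classify(sample, templates):
    """Label of the nearest template under DTW distance."""
    return min(templates, key=lambda label: dtw(sample, templates[label]))

templates = {"wave": [0, 1, 2, 1, 0], "push": [0, 2, 2, 2, 0]}
print(classify([0, 0, 1, 2, 1, 0, 0], templates))
```

For multi-modal data the per-step cost would combine distances over both the range-profile and Doppler channels instead of a scalar difference.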

  15. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    Science.gov (United States)

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
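The quoted pass rates come from gamma analysis, which combines a dose-difference criterion (as a percentage of a normalization dose) with a distance-to-agreement criterion. A minimal 1-D global-gamma sketch with synthetic profiles follows; it uses an exhaustive search over comparison points, not the optimized real-time implementation of the paper.

```python
import math

# 1-D global gamma comparison: dose difference as % of the maximum predicted
# dose, distance-to-agreement in mm. Profiles are synthetic.

def gamma_pass_rate(measured, predicted, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    """Fraction of measured points with gamma <= 1 against the predicted profile."""
    norm = max(predicted)
    passed = 0
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dp in enumerate(predicted):
            dose_term = ((dm - dp) / (norm * dd_pct / 100.0)) ** 2
            dist_term = (((i - j) * spacing_mm) / dta_mm) ** 2
            best = min(best, math.sqrt(dose_term + dist_term))
        passed += best <= 1.0
    return passed / len(measured)

predicted = [0, 10, 50, 100, 100, 50, 10, 0]
measured  = [0, 11, 52, 101,  99, 49, 10, 0]
print(gamma_pass_rate(measured, predicted, spacing_mm=1.0))
```

In the EPID system this comparison runs per frame (and cumulatively) against the predicted transit images as they are acquired.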

  16. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Energy Technology Data Exchange (ETDEWEB)

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  17. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    International Nuclear Information System (INIS)

    Fuangrod, Todsaporn; Woodruff, Henry C.; O’Connor, Daryl J.; Uytven, Eric van; McCurdy, Boyd M. C.; Kuncic, Zdenka; Greer, Peter B.

    2013-01-01

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  18. Towards the Verification of Safety-critical Autonomous Systems in Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Adina Aniculaesei

    2016-12-01

    Full Text Available There is an increasing necessity to deploy autonomous systems in highly heterogeneous, dynamic environments, e.g. service robots in hospitals or autonomous cars on highways. Due to the uncertainty in these environments, the verification results obtained with respect to the system and environment models at design time might not be transferable to the system behavior at run time. For autonomous systems operating in dynamic environments, safety of motion and collision avoidance are critical requirements. With regard to these requirements, Macek et al. [6] define the passive safety property, which requires that no collision can occur while the autonomous system is moving. To verify this property, we adopt a two-phase process which combines static verification methods, used at design time, with dynamic ones, used at run time. In the design phase, we exploit UPPAAL to formalize the autonomous system and its environment as timed automata, to formalize the safety property as a TCTL formula, and to verify the correctness of these models with respect to this property. For the runtime phase, we build a monitor to check whether the assumptions made at design time also hold at run time. If the current system observations of the environment do not correspond to the initial system assumptions, the monitor sends feedback to the system and the system enters a passive safe state.
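The runtime phase described above can be pictured as a small monitor that latches into a passive safe state once an observation violates a design-time assumption. A minimal sketch follows; the obstacle-speed assumption and class names are invented for illustration (the actual models in the paper are UPPAAL timed automata):

```python
class AssumptionMonitor:
    """Minimal runtime monitor: checks that environment observations stay
    within a design-time assumption and, on violation, signals the system
    to enter (and remain in) a passive safe state."""

    def __init__(self, max_obstacle_speed):
        # Design-time assumption: obstacles never move faster than this.
        self.max_obstacle_speed = max_obstacle_speed
        self.safe_mode = False

    def observe(self, obstacle_speed):
        """Feed one observation; returns True once the system must be in
        the passive safe state (the state is latched, not reset)."""
        if obstacle_speed > self.max_obstacle_speed:
            self.safe_mode = True  # assumption violated -> passive safe state
        return self.safe_mode
```

Latching matters: once the design-time verification results no longer apply, later benign observations cannot restore trust in them without re-verification.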

  19. Approaches to determining the reliability of a multimodal three-dimensional dynamic signature

    Directory of Open Access Journals (Sweden)

    Yury E. Kozlov

    2018-03-01

    Full Text Available The market of modern mobile applications imposes increasingly strict requirements on the reliability of authentication systems. This article examines an authentication method using a multimodal three-dimensional dynamic signature (MTDS), which can be used both as a primary and as a supplementary method of user authentication in mobile applications. It is based on using, as an identifier, a gesture performed in the air and captured by two independent mobile devices. The MTDS method has certain advantages over currently used biometric methods, including fingerprint authentication, face recognition and voice recognition. A multimodal three-dimensional dynamic signature allows quickly changing the authentication gesture, as well as concealing the authentication procedure by using gestures that do not attract attention. Despite all its advantages, the MTDS method has certain limitations, the main one being the need to build the functionally dynamic complex (FDC) of skills required to repeat an authentication gesture accurately. To correctly create an MTDS, a system for assessing the reliability of gestures is needed. This article groups approaches to this task according to their implementation methods. Two of the approaches can be implemented only with a server as a centralized MTDS processing center, while one approach can be implemented using the smartphone's own computing resources. The final part of the article provides data from testing one of these methods on a template performing MTDS authentication.

  20. Single Molecule Cluster Analysis Identifies Signature Dynamic Conformations along the Splicing Pathway

    Science.gov (United States)

    Blanco, Mario R.; Martin, Joshua S.; Kahlscheuer, Matthew L.; Krishnan, Ramya; Abelson, John; Laederach, Alain; Walter, Nils G.

    2016-01-01

    The spliceosome is the dynamic RNA-protein machine responsible for faithfully splicing introns from precursor messenger RNAs (pre-mRNAs). Many of the dynamic processes required for the proper assembly, catalytic activation, and disassembly of the spliceosome as it acts on its pre-mRNA substrate remain poorly understood, a challenge that persists for many biomolecular machines. Here, we developed a fluorescence-based Single Molecule Cluster Analysis (SiMCAn) tool to dissect the manifold conformational dynamics of a pre-mRNA through the splicing cycle. By clustering common dynamic behaviors derived from selectively blocked splicing reactions, SiMCAn was able to identify signature conformations and dynamic behaviors of multiple ATP-dependent intermediates. In addition, it identified a conformation adopted late in splicing by a 3′ splice site mutant, invoking a mechanism for substrate proofreading. SiMCAn presents a novel framework for interpreting complex single molecule behaviors that should prove widely useful for the comprehensive analysis of a plethora of dynamic cellular machines. PMID:26414013

  1. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)

  2. Characterizing the anthropogenic signature in the LCLU dynamics in the Central Asia region

    Science.gov (United States)

    Tatarskii, V.; Sokolik, I. N.; de Beurs, K.; Shiklomanov, A. I.

    2017-12-01

    Humans have been changing LCLU dynamics over time throughout the world. In the Central Asia region, these changes have been especially pronounced due to the political and economic transformation. We present a detailed analysis focusing on identifying and quantifying the anthropogenic signature in the water and land use across the region. We have characterized the anthropogenic dust emission by combining modeling and observations. The model is a fully coupled model called WRF-Chem-DuMo that explicitly takes into account the vegetation treatment in modeling the dust emission. We have reconstructed the anthropogenic dust sources in the region, such as the retreat of the Aral Sea, changes in agricultural fields, etc. In addition, we characterize the anthropogenic water use dynamics, including the changes in water use for agricultural production. Furthermore, we perform an analysis to identify the anthropogenic signature in the NDVI pattern. The NDVI data were analyzed in conjunction with meteorological fields simulated at high spatial resolution using the WRF model. Meteorological fields of precipitation and temperature were used in a correlation analysis to separate natural from anthropogenic changes. In this manner, we were able to identify the regions that have been affected by human activities. We will present a quantitative assessment of the anthropogenic changes. The diverse consequences for the economy of the region, as well as for the environment, will be addressed.
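The correlation analysis described above, separating climate-driven from human-driven NDVI change, can be sketched as follows. The per-pixel layout and the correlation threshold are assumptions for illustration, not the authors' actual procedure:

```python
import numpy as np

def flag_anthropogenic(ndvi, precip, temp, r_thresh=0.5):
    """Flag pixels whose NDVI time series is weakly explained by climate.

    ndvi, precip, temp : arrays of shape (T, P) -- T time steps, P pixels.
    A pixel is flagged as likely human-affected when its NDVI correlates
    with neither precipitation nor temperature above r_thresh (an
    illustrative threshold; a real study would test significance).
    """
    flags = []
    for p in range(ndvi.shape[1]):
        r_p = np.corrcoef(ndvi[:, p], precip[:, p])[0, 1]
        r_t = np.corrcoef(ndvi[:, p], temp[:, p])[0, 1]
        flags.append(max(abs(r_p), abs(r_t)) < r_thresh)
    return np.array(flags)
```

Pixels whose vegetation signal tracks neither rainfall nor temperature are candidates for anthropogenic drivers such as irrigation changes or land abandonment.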

  3. The Temporal Signature of Memories: Identification of a General Mechanism for Dynamic Memory Replay in Humans

    Science.gov (United States)

    Michelmann, Sebastian; Bowman, Howard; Hanslmayr, Simon

    2016-01-01

    Reinstatement of dynamic memories requires the replay of neural patterns that unfold over time in a similar manner as during perception. However, little is known about the mechanisms that guide such a temporally structured replay in humans, because previous studies used either unsuitable methods or paradigms to address this question. Here, we overcome these limitations by developing a new analysis method to detect the replay of temporal patterns in a paradigm that requires participants to mentally replay short sound or video clips. We show that memory reinstatement is accompanied by a decrease of low-frequency (8 Hz) power, which carries a temporal phase signature of the replayed stimulus. These replay effects were evident in the visual as well as in the auditory domain and were localized to sensory-specific regions. These results suggest low-frequency phase to be a domain-general mechanism that orchestrates dynamic memory replay in humans. PMID:27494601

  4. The Temporal Signature of Memories: Identification of a General Mechanism for Dynamic Memory Replay in Humans.

    Directory of Open Access Journals (Sweden)

    Sebastian Michelmann

    2016-08-01

    Full Text Available Reinstatement of dynamic memories requires the replay of neural patterns that unfold over time in a similar manner as during perception. However, little is known about the mechanisms that guide such a temporally structured replay in humans, because previous studies used either unsuitable methods or paradigms to address this question. Here, we overcome these limitations by developing a new analysis method to detect the replay of temporal patterns in a paradigm that requires participants to mentally replay short sound or video clips. We show that memory reinstatement is accompanied by a decrease of low-frequency (8 Hz) power, which carries a temporal phase signature of the replayed stimulus. These replay effects were evident in the visual as well as in the auditory domain and were localized to sensory-specific regions. These results suggest low-frequency phase to be a domain-general mechanism that orchestrates dynamic memory replay in humans.

  5. Independent tube verification and dynamic tracking in et inspection of nuclear steam generator

    International Nuclear Information System (INIS)

    Xiongzi, Li; Zhongxue, Gan; Lance, Fitzgibbons

    2001-01-01

    The full text follows. In the examination of pressure boundary tubes in steam generators of commercial pressurized water nuclear power plants (PWRs), it is critical to know exactly which particular tube is being accessed. There are no definitive landmarks or markings on the individual tubes. Today this is done manually; it is tedious, interrupts the normal inspection work, and is difficult due to the presence of water on the tube surface, plug ends instead of tube openings in the field of view, and varying lighting quality. In order to eliminate human error and increase the efficiency of operation, there is a need to identify tube position during the inspection process, independent of robot encoder position and motion. A process based on a Cognex MVS-8200 system and its application function package has been developed to independently identify tube locations. ABB Combustion Engineering Nuclear Power's Outage Services group, USPPL, in collaboration with ABB Power Plant Laboratories' Advanced Computers and Controls department, has developed a new vision-based Independent Tube Verification system (GENESIS-ITVS™). The system employs a model-based tube-shape detection algorithm and dynamic tracking methodology to detect the true tool position and its offsets from the identified tube location. GENESIS-ITVS™ is an automatic Independent Tube Verification System (ITVS). Independent tube verification is a tube validation technique using computer vision, without using any robot position parameters. This process independently counts the tubes along the horizontal and vertical axes of the plane of the steam generator tube sheet as the work tool is moved. Thus it knows the true position in the steam generator, given a known starting point. This is analogous to the operator's method of counting tubes for verification, but automated. GENESIS-ITVS™ works independent of the robot position, velocity, or acceleration. The tube position information is solely obtained from

  6. Dosimetric verification of the dynamic intensity modulated radiotherapy (IMR) of 21 patients

    International Nuclear Information System (INIS)

    Tsai, J.-S.; Engler, Mark J.; Ling, Marilyn N.; Wu, Julian; Kramer, Bradley; Fagundes, Marcio; Dipetrillo, Thomas; Wazer, David E.

    1996-01-01

    Purpose: To verify the accuracy of conformal isodose distributions and absolute doses delivered with a dynamic IMR system. Methods and materials: 21 patients with advanced or recurrent disease were treated with a dynamic IMR system; 13 were immobilized with head screws and 8 with non-invasive plastic masks. The system included immobilization techniques, computerized tomography (CT), a dynamic pencil beam multileaf collimator (MLC), a collimator controller computer, collimator safety interlocks, a simulated annealing optimization implemented on a dedicated quad processing computer system, phantoms embedded with dosemeters, patient setup and dose delivery techniques, in vivo dose verification, and a comprehensive quality assurance program. The collimator consisted of a 2 x 20 array of tungsten leaves, each programmable to be either fully open or shut, thus offering 2^40 beam patterns with cross sectional areas of up to 4 x 20 cm at the linear accelerator (linac) gantry rotational axis. Any of these patterns were dynamically changeable per degree of gantry rotation. An anthropomorphic phantom composed of transverse anatomic slabs helped simulate patient geometry relative to immobilization devices, fiducial markers, CT and treatment room lasers, and the linac rotational axis. Before each treatment regimen, the compliance of measured to planned doses was tested in phantom irradiations using each patient's fiducial markers, immobilization system, anatomic positioning, and collimator sequencing. Films and thermoluminescent dosemeters (TLD) were embedded in the phantom to measure absolute doses and dose distributions. Because the planner did not account for variable electron density distributions in head and neck targets, the air cavities of the anthropomorphic phantom were filled with tissue equivalent bolus. Optical density distributions of films exposed to the dynamic IMR of each patient were obtained with a Hurter-Driffield calibration curve based on films

  7. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.

  8. Constrained structural dynamic model verification using free vehicle suspension testing methods

    Science.gov (United States)

    Blair, Mark A.; Vadlamudi, Nagarjuna

    1988-01-01

    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or, in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads such as the 12-ton Hubble Space Telescope. The development of equations in the paper shows that, by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced whose minima are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.
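The key observation above, that drive-point transfer-function minima of the free-free payload coincide with natural frequencies of the constrained payload, can be checked on a two-mass toy model (all numbers are illustrative, not from the paper):

```python
import numpy as np

# Interface mass m1 and payload mass m2 joined by a spring k.
# The drive-point receptance of the FREE system, measured at the interface,
# has a minimum (antiresonance) exactly at the natural frequency of the
# system CONSTRAINED at the interface, sqrt(k/m2).
m1, m2, k = 2.0, 1.0, 400.0

w = np.linspace(1.0, 40.0, 4000)            # frequency grid, rad/s
Z11 = k - m1 * w**2                          # dynamic stiffness entries
Z22 = k - m2 * w**2
Z12 = -k
H11 = Z22 / (Z11 * Z22 - Z12**2)             # drive-point receptance at m1

w_antires = w[np.argmin(np.abs(H11))]        # antiresonance of free system
w_constrained = np.sqrt(k / m2)              # constrained natural frequency
```

Here `w_antires` lands on `w_constrained` (20 rad/s), which is exactly the suspension-test idea: measure free, infer constrained.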

  9. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    Science.gov (United States)

    Nomaguchi, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. Linking the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis-and-verification processes through design operations over design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. It concludes with a discussion of future issues.

  10. Atomic-scale structural signature of dynamic heterogeneities in metallic liquids

    Science.gov (United States)

    Pasturel, Alain; Jakse, Noel

    2017-08-01

    With sufficiently high cooling rates, liquids will cross their equilibrium melting temperatures and can be maintained in a metastable undercooled state before solidifying. Studies of undercooled liquids reveal several intriguing dynamic phenomena, and because explicit connections between liquid structure and liquid dynamics are difficult to identify, it remains a major challenge to capture the underlying structural link to these phenomena. Ab initio molecular dynamics (AIMD) simulations are especially powerful in providing atomic-scale details otherwise not accessible in experiments. Through an AIMD-based study of Cr additions in Al-based liquids, we demonstrate for the first time a close relationship between the decoupling of component diffusion and the emergence of dynamic heterogeneities in the undercooling regime. In addition, we show that the origin of both phenomena is related to a structural heterogeneity caused by a strong interplay between chemical short-range order (CSRO) and local fivefold topology (ISRO) at the short-range scale in the liquid phase, which develops into an icosahedral-based medium-range order (IMRO) upon undercooling. Finally, our findings reveal that this structural signature is also captured in the temperature dependence of the partial pair-distribution functions, which opens up the route to more elaborate experimental studies.

  11. Verification of experimental modal modeling using HDR (Heissdampfreaktor) dynamic test data

    International Nuclear Information System (INIS)

    Srinivasan, M.G.; Kot, C.A.; Hsieh, B.J.

    1983-01-01

    Experimental modal modeling involves determining the modal parameters of the model of a structure from recorded input-output data from dynamic tests. Though commercial modal analysis algorithms are widely used in many industries, their ability to identify a set of reliable modal parameters of an as-built nuclear power plant structure has not been systematically verified. This paper describes the effort to verify MODAL-PLUS, a widely used modal analysis code, using recorded data from the dynamic tests performed on the reactor building of the Heissdampfreaktor (HDR), situated near Frankfurt, Federal Republic of Germany. In the series of dynamic tests on the HDR in 1979, the reactor building was subjected to forced vibrations from different types and levels of dynamic excitation. Two sets of HDR containment building input-output data were chosen for MODAL-PLUS analyses. To reduce the influence of nonlinear behavior on the results, these sets were chosen so that the levels of excitation were relatively low and about the same in the two sets. The attempted verification was only partially successful: only one modal model, with a limited range of validity, could be synthesized, and the goodness of fit could be verified only within this limited range

  12. Studies on plant dynamics of sodium-cooled fast breeder reactors - verification of a plant model

    International Nuclear Information System (INIS)

    Schubert, B.

    1988-01-01

    For the analysis of the safety and dynamics of sodium-cooled fast breeder reactors, theoretical models are used which have to be verified. In this report, the plant model SSC-L is verified by comparing calculated data with measurements from the experimental reactors KNK II and RAPSODIE. For this purpose the plant model is extended and adapted. In general, only small differences between calculated and measured data are observed. The results are used to improve and complete the plant model. The extended applicability of the plant model is used to calculate a loss-of-heat-sink transient with reactor scram, considering pipes as passive heat sinks. (orig./HP) With 69 figs., 10 tabs [de

  13. Dynamical signatures of isometric force control as a function of age, expertise, and task constraints.

    Science.gov (United States)

    Vieluf, Solveig; Sleimen-Malkoun, Rita; Voelcker-Rehage, Claudia; Jirsa, Viktor; Reuter, Eva-Maria; Godde, Ben; Temprado, Jean-Jacques; Huys, Raoul

    2017-07-01

    From the conceptual and methodological framework of the dynamical systems approach, force control results from complex interactions of various subsystems yielding observable behavioral fluctuations, which comprise both deterministic (predictable) and stochastic (noise-like) dynamical components. Here, we investigated these components' contributions to the observed variability in force control in groups of participants differing in age and expertise level. To this aim, young (18-25 yr) as well as late middle-aged (55-65 yr) novices and experts (precision mechanics) performed a force maintenance and a force modulation task. Results showed that whereas the amplitude of force variability did not differ across groups in the maintenance task, in the modulation task it was higher for late middle-aged novices than for experts and higher for both these groups than for young participants. Within both tasks and for all groups, stochastic fluctuations were lowest where the deterministic influence was smallest. However, although all groups showed similar dynamics underlying force control in the maintenance task, a group effect was found for deterministic and stochastic fluctuations in the modulation task. The latter findings imply that both components were involved in the observed group differences in the variability of force fluctuations in the modulation task. These findings suggest that the general characteristics of the dynamics do not differ between groups in either task and that force control is more affected by age than by expertise. However, expertise seems to counteract some of the age effects. NEW & NOTEWORTHY Stochastic and deterministic dynamical components contribute to force production. Dynamical signatures differ between force maintenance and cyclic force modulation tasks but hardly between age and expertise groups. Differences in both stochastic and deterministic components are associated with group differences in behavioral variability, and observed behavioral
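One standard way to separate deterministic (drift) and stochastic (diffusion) components of such fluctuations is to estimate the first two Kramers-Moyal coefficients from the time series. The sketch below illustrates this generic technique; it is not necessarily the authors' exact method:

```python
import numpy as np

def kramers_moyal(x, dt, bins=10):
    """Estimate drift D1(x) and diffusion D2(x) from a 1D time series.

    D1 ~ <dx | x> / dt captures the deterministic component;
    D2 ~ <dx^2 | x> / (2*dt) captures the stochastic component,
    both conditioned on the current state x via binning.
    """
    dx = np.diff(x)
    xc = x[:-1]
    edges = np.linspace(xc.min(), xc.max(), bins + 1)
    idx = np.clip(np.digitize(xc, edges) - 1, 0, bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    d1 = np.full(bins, np.nan)
    d2 = np.full(bins, np.nan)
    for b in range(bins):
        m = idx == b
        if m.any():
            d1[b] = dx[m].mean() / dt
            d2[b] = (dx[m] ** 2).mean() / (2.0 * dt)
    return centers, d1, d2
```

Applied to a force trace, a drift pulling toward the target level with roughly state-independent diffusion would indicate simple corrective control; group differences would then show up in the shapes and magnitudes of D1 and D2.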

  14. Impact of seaweed beachings on dynamics of δ15N isotopic signatures in marine macroalgae

    International Nuclear Information System (INIS)

    Lemesle, Stéphanie; Mussio, Isabelle; Rusig, Anne-Marie; Menet-Nédélec, Florence; Claquin, Pascal

    2015-01-01

    Highlights: • Two coastal sites (COU, GM) in the Bay of Seine affected by summer seaweed beachings. • The same temporal dynamics of the algal δ15N at the two sites. • N and P concentrations in seawater of the two sites dominated by riverine sources. • A coupling between seaweed beachings and N sources of intertidal macroalgae. - Abstract: A fine-scale survey of δ15N, δ13C, and tissue-N in seaweeds was conducted using samples from 17 sampling points at two sites (Grandcamp-Maisy (GM), Courseulles/Mer (COU)) along the French coast of the English Channel in 2012 and 2013. Partial triadic analysis was performed on the parameter data sets and revealed the functioning of three areas: one estuary (EstA) and two rocky areas (GM∗, COU∗). In contrast to oceanic and anthropogenic reference points, similar temporal dynamics characterized δ15N signatures and N contents at GM∗ and COU∗. Nutrient dynamics were similar: the N-concentrations in seawater originated from the River Seine and local coastal rivers, while P-concentrations came mainly from these local rivers. δ15N at GM∗ was linked to turbidity, suggesting inputs of autochthonous organic matter from large-scale summer seaweed beachings made up of a mixture of Rhodophyta, Phaeophyta and Chlorophyta species. This study highlights the coupling between seaweed beachings and nitrogen sources of intertidal macroalgae

  15. Thin accretion disk signatures in dynamical Chern-Simons-modified gravity

    International Nuclear Information System (INIS)

    Harko, Tiberiu; Kovacs, Zoltan; Lobo, Francisco S N

    2010-01-01

A promising extension of general relativity is Chern-Simons (CS)-modified gravity, in which the Einstein-Hilbert action is modified by adding a parity-violating CS term, which couples to gravity via a scalar field. In this work, we consider the interesting, yet relatively unexplored, dynamical formulation of CS-modified gravity, where the CS coupling field is treated as a dynamical field, endowed with its own stress-energy tensor and evolution equation. We consider the possibility of observationally testing dynamical CS-modified gravity by using the accretion disk properties around slowly rotating black holes. The energy flux, temperature distribution, the emission spectrum as well as the energy conversion efficiency are obtained, and compared to the standard general relativistic Kerr solution. It is shown that the Kerr black hole provides a more efficient engine for the transformation of the energy of the accreting mass into radiation than its slowly rotating counterparts in CS-modified gravity. Specific signatures appear in the electromagnetic spectrum, thus leading to the possibility of directly testing CS-modified gravity by using astrophysical observations of the emission spectra from accretion disks.

  16. Optimal sensitometric curves of Kodak EDR2 film for dynamic intensity modulated radiation therapy verification.

    Science.gov (United States)

    Suriyapee, S; Pitaxtarnin, N; Oonsiri, S; Jumpangern, C; Israngkul Na Ayuthaya, I

    2008-01-01

To investigate the optimal sensitometric curves of extended dose range (EDR2) radiographic film in terms of depth, field size, dose range and processing conditions for dynamic intensity modulated radiation therapy (IMRT) dosimetry verification with 6 MV X-ray beams. A Varian Clinac 23 EX linear accelerator with 6 MV X-ray beam was used to study the response of Kodak EDR2 film. Measurements were performed at depths of 5, 10 and 15 cm in MedTec virtual water phantom and with field sizes of 2x2, 3x3, 10x10 and 15x15 cm². Doses ranging from 20 to 450 cGy were used. The film was developed with the Kodak RP X-OMAT Model M6B automatic film processor. Film response was measured with the Vidar model VXR-16 scanner. Sensitometric curves were applied to the dose profiles measured with film at 5 cm in the virtual water phantom with field sizes of 2x2 and 10x10 cm² and compared with ion chamber data. Scanditronix/Wellhofer OmniPro™ IMRT software was used for the evaluation of the IMRT plan calculated by Eclipse treatment planning. Investigation of the reproducibility and accuracy of the film responses, which depend mainly on the film processor, was carried out by irradiating one film nine times with doses of 20 to 450 cGy. A maximum standard deviation of 4.9% was found, which decreased to 1.9% for doses between 20 and 200 cGy. The sensitometric curves for various field sizes at fixed depth showed a maximum difference of 4.2% between 2x2 and 15x15 cm² at 5 cm depth with a dose of 450 cGy. The shallow depth tended to show a greater effect of field size on response than the deeper depths. The sensitometric curves for various depths at fixed field size showed slightly different film responses; the difference due to depth was within 1.8% for all field sizes studied. Both field size and depth effects were reduced when the doses were lower than 450 cGy. The difference was within 2.5% in the dose range from 20 to 300 cGy for all field sizes and depths studied.
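In practice, a sensitometric curve like those characterized above maps the scanner reading (net optical density) back to dose. A minimal sketch of applying such a curve by interpolation, using hypothetical calibration values rather than the paper's measured data:

```python
def dose_from_reading(reading, cal_doses, cal_readings):
    """Convert a film scanner reading to dose (cGy) by linear interpolation
    on a measured sensitometric curve. Readings must increase monotonically
    with dose (true of net optical density for EDR2 film in its useful range)."""
    pairs = sorted(zip(cal_readings, cal_doses))
    rs = [r for r, _ in pairs]
    ds = [d for _, d in pairs]
    if reading <= rs[0]:
        return ds[0]
    if reading >= rs[-1]:
        return ds[-1]
    for (r0, d0), (r1, d1) in zip(pairs[:-1], pairs[1:]):
        if r0 <= reading <= r1:
            t = (reading - r0) / (r1 - r0)
            return d0 + t * (d1 - d0)

# Hypothetical calibration points: doses 20-450 cGy vs net optical density
cal_doses = [20, 50, 100, 200, 300, 450]
cal_od = [0.10, 0.24, 0.45, 0.82, 1.10, 1.38]
```

Clinical film dosimetry systems typically fit a smooth (e.g. polynomial) curve rather than interpolating linearly; the interpolation here just keeps the example short.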

  17. Dosimetric verification for primary focal hypermetabolism of nasopharyngeal carcinoma patients treated with dynamic intensity-modulated radiation therapy.

    Science.gov (United States)

    Xin, Yong; Wang, Jia-Yang; Li, Liang; Tang, Tian-You; Liu, Gui-Hong; Wang, Jian-She; Xu, Yu-Mei; Chen, Yong; Zhang, Long-Zhen

    2012-01-01

To assess the feasibility of (18F)FDG PET/CT-guided dynamic intensity-modulated radiation therapy (IMRT) for nasopharyngeal carcinoma patients by dosimetric verification before treatment. Eleven patients with stage III~IVA nasopharyngeal carcinoma were treated with functional image-guided IMRT; absolute and relative dosimetric verification was performed with a Varian 23EX linear accelerator, an ionization chamber, the 2D ion chamber array (2DICA) of the I'mRT Matrixx, and an IBA detachable phantom. Contours were drawn and treatment plans made using different imaging techniques (CT and (18F)FDG PET/CT). The dose distributions of the various regions were realized by SMART. The absolute mean error in the region of interest was 2.39%±0.66, measured with a 0.6 cc ionization chamber. Using the DTA method, the average relative dose agreement within our protocol (3%, 3 mm) was 87.64% at 300 MU/min over all fields. Dosimetric verification before IMRT is obligatory and necessary. The ionization chamber and the 2DICA of the I'mRT Matrixx were effective dosimetric verification tools for primary focal hypermetabolism in functional image-guided dynamic IMRT for nasopharyngeal carcinoma. Our preliminary evidence indicates that functional image-guided dynamic IMRT is feasible.
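The (3%, 3 mm) criterion quoted above is normally evaluated with a gamma/DTA analysis: each measured point passes if some nearby reference point agrees within the combined dose-difference and distance-to-agreement tolerance. A simplified 1-D global gamma sketch (clinical tools interpolate the reference grid and work in 2-D/3-D; data here are invented):

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """1-D global gamma analysis: for each measured point, search all
    reference points for the minimum combined dose-difference /
    distance-to-agreement metric; gamma <= 1 counts as a pass."""
    d_max = max(ref)                      # global normalization dose
    passed = 0
    for i, dm in enumerate(meas):
        best = float("inf")
        for j, dr in enumerate(ref):
            dd = (dm - dr) / (dose_tol * d_max)   # dose axis, in tolerances
            dx = (i - j) * spacing_mm / dta_mm    # space axis, in tolerances
            best = min(best, math.hypot(dd, dx))
        passed += best <= 1.0
    return 100.0 * passed / len(meas)

ref = [10, 50, 100, 100, 100, 50, 10]     # reference profile (cGy)
meas = [10, 52, 99, 101, 100, 49, 10]     # measured profile, small deviations
rate = gamma_pass_rate(ref, meas, spacing_mm=1.0)
```

A 10-cGy error in the flat region of `meas` would push that point's gamma above 1 and lower the pass rate accordingly.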

  18. Experimental verification of dynamic radioecological models established after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Voigt, G.; Mueller, H.; Proehl, G.; Stocke, H.; Paretzke, H.G.

    1991-01-01

The experiments reported were carried out to verify existing dynamic radioecological models, especially the ECOSYS model. The database used for the verification covers the radioactivity concentrations of Cs-134, Cs-137 and I-131 measured in foodstuffs and environmental samples after the Chernobyl reactor accident, and the results of field experiments on radionuclide translocation after foliar uptake or absorption by the roots of edible plants. The measured data were compared with the model predictions for the radionuclides under review. The Cs-134 and Cs-137 translocation factors, which describe the redistribution of these radionuclides in the plant after foliar uptake, were experimentally determined by a single sprinkling with Chernobyl rainwater, and were measured to be the following as a function of sprinkling time: winter wheat, 0.002-0.13; spring wheat, 0.003-0.09; winter rye, 0.002-0.27; barley, 0.002-0.04; potatoes, 0.05-0.35; carrots, 0.02-0.07; bush beans, 0.04-0.3; cabbage, 0.1-0.5. The weathering half-life of the radionuclides in lettuce was determined to be ten days. Transfer factors for root absorption of Cs-137 were measured to be an average of 0.002 for grains, 0.002 for potatoes, 0.004 for white cabbage, 0.003 for bush beans and carrots, and 0.007 for lettuce. The ECOSYS model predictions agreed with the measured radioactivity concentrations of the corresponding radionuclides. (orig./HP)
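The ten-day weathering half-life measured for lettuce corresponds to a simple first-order loss term of the kind used in dynamic models such as ECOSYS. A minimal sketch (radioactive decay is neglected, which is a fair approximation for Cs-137 over a few weeks):

```python
import math

def surface_activity(a0, days, weathering_half_life_days=10.0):
    """Radionuclide activity remaining on plant surfaces after foliar
    deposition, assuming first-order loss with the weathering half-life
    reported for lettuce (10 days). a0 is the initial deposited activity."""
    lam = math.log(2) / weathering_half_life_days   # loss rate constant
    return a0 * math.exp(-lam * days)
```

So a deposit of 100 (in any activity unit) falls to about 50 after 10 days and about 25 after 20 days of weathering.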

  19. Biological signatures of dynamic river networks from a coupled landscape evolution and neutral community model

    Science.gov (United States)

    Stokes, M.; Perron, J. T.

    2017-12-01

Freshwater systems host exceptionally species-rich communities whose spatial structure is dictated by the topology of the river networks they inhabit. Over geologic time, river networks are dynamic; drainage basins shrink and grow, and river capture establishes new connections between previously separated regions. It has been hypothesized that these changes in river network structure influence the evolution of life by exchanging and isolating species, perhaps boosting biodiversity in the process. However, no general model exists to predict the evolutionary consequences of landscape change. We couple a neutral community model of freshwater organisms to a landscape evolution model in which the river network undergoes drainage divide migration and repeated river capture. Neutral community models are macro-ecological models that include stochastic speciation and dispersal to produce realistic patterns of biodiversity. We explore the consequences of three modes of speciation - point mutation, time-protracted, and vicariant (geographic) speciation - by tracking patterns of diversity in time and comparing the final result to an equilibrium solution of the neutral model on the final landscape. Under point mutation, a simple model of stochastic and instantaneous speciation, the results are identical to the equilibrium solution and indicate the dominance of the species-area relationship in forming patterns of diversity. The number of species in a basin is proportional to its area, and regional species richness reaches its maximum when drainage area is evenly distributed among sub-basins. Time-protracted speciation is also modeled as a stochastic process, but in order to produce more realistic rates of diversification, speciation is not assumed to be instantaneous. Rather, each new species must persist for a certain amount of time before it is considered to be established. When vicariance (geographic speciation) is included, there is a transient signature of increased diversity.
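The point-mutation mode of speciation mentioned above has a very compact formulation in Hubbell-style neutral theory: each death is replaced either by a brand-new species (with small probability ν) or by the offspring of a random community member. A stripped-down, single-community sketch, without the river-network coupling of the abstract:

```python
import random

def neutral_dynamics(n_individuals, nu, steps, rng):
    """Point-mutation neutral community model: at each step a random
    individual dies and is replaced either by a new species (probability
    nu, 'speciation by point mutation') or by a copy of a randomly
    chosen individual (a simplification that may pick the dead one)."""
    community = [0] * n_individuals      # start as a single species
    next_species = 1
    for _ in range(steps):
        i = rng.randrange(n_individuals)
        if rng.random() < nu:
            community[i] = next_species  # new species appears
            next_species += 1
        else:
            community[i] = community[rng.randrange(n_individuals)]
    return community

rng = random.Random(3)
community = neutral_dynamics(200, nu=0.05, steps=20000, rng=rng)
richness = len(set(community))           # species richness at the end
```

Richness settles at a speciation-drift balance set by ν and community size, which is what makes the species-area behavior described above emerge.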

  20. Impact of seaweed beachings on dynamics of δ(15)N isotopic signatures in marine macroalgae.

    Science.gov (United States)

    Lemesle, Stéphanie; Mussio, Isabelle; Rusig, Anne-Marie; Menet-Nédélec, Florence; Claquin, Pascal

    2015-08-15

A fine-scale survey of δ(15)N, δ(13)C and tissue-N in seaweeds was conducted using samples from 17 sampling points at two sites (Grandcamp-Maisy (GM), Courseulles/Mer (COU)) along the French coast of the English Channel in 2012 and 2013. Partial triadic analysis was performed on the parameter data sets and revealed the functioning of three areas: one estuary (EstA) and two rocky areas (GM*, COU*). In contrast to oceanic and anthropogenic reference points, similar temporal dynamics characterized δ(15)N signatures and N contents at GM* and COU*. Nutrient dynamics were similar: the N concentrations in seawater originated from the River Seine and local coastal rivers, while P concentrations came mainly from these local rivers. δ(15)N values at GM* were linked to turbidity, suggesting inputs of autochthonous organic matter from large-scale summer seaweed beachings made up of a mixture of Rhodophyta, Phaeophyta and Chlorophyta species. This study highlights the coupling between seaweed beachings and nitrogen sources of intertidal macroalgae. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

    Full Text Available A new analytical model for the prediction of concrete response under uniaxial compression and its experimental verification is presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines rheological-dynamical analogy and damage mechanics. Within the framework of this approach the key continuum parameters such as the creep coefficient, Poisson’s ratio and damage variable are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of uniformly accelerated motion of load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Projekat Ministarstva nauke Republike Srbije, br. ON 174027: Computational Mechanics in Structural Engineering i br. TR 36017: Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: Investigation and environmental assessment of possible applications
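The secant stress-strain relation referred to above ties stress to strain through a damage variable D, as σ = (1 − D)·E·ε. The sketch below uses a generic exponential (Mazars-type) damage evolution law chosen purely for illustration; it is not the paper's rheological-dynamical model, and all material parameters are invented.

```python
import math

def damage(eps, eps0=0.001, A=0.8, B=500.0):
    """Scalar damage variable D(eps): zero below the threshold strain eps0,
    growing toward 1 as strain increases (generic exponential form)."""
    if eps <= eps0:
        return 0.0
    return 1.0 - eps0 * (1.0 - A) / eps - A * math.exp(-B * (eps - eps0))

def stress(eps, E=30e3):
    """Secant stress-strain relation sigma = (1 - D) * E * eps (E in MPa)."""
    return (1.0 - damage(eps)) * E * eps
```

With these (hypothetical) parameters the curve is linear-elastic up to eps0, hardens briefly, then softens in the post-peak branch as D approaches 1, which is the qualitative behavior the damage-mechanics formulation is meant to capture.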

  2. Major urinary protein (MUP) profiles show dynamic changes rather than individual ‘barcode’ signatures

    Science.gov (United States)

    Thoß, M.; Luzynski, K.C.; Ante, M.; Miller, I.; Penn, D.J.

    2016-01-01

House mice (Mus musculus) produce a variable number of major urinary proteins (MUPs), and studies suggest that each individual produces a unique MUP profile that provides a distinctive odor signature controlling individual and kin recognition. This ‘barcode hypothesis’ requires that MUP urinary profiles show high individual variability within populations and also high individual consistency over time, but tests of these assumptions are lacking. We analyzed urinary MUP profiles of 66 wild-caught house mice from eight populations using isoelectric focusing. We found that MUP profiles of wild male house mice are not individually unique, and though they were highly variable, closer inspection revealed that the variation strongly depended on MUP band type. The prominent (‘major’) bands were surprisingly homogeneous (and hence most MUPs are not polymorphic), but we also found inconspicuous (‘minor’) bands that were highly variable and therefore potential candidates for individual fingerprints. We also examined changes in urinary MUP profiles of 58 males over time (from 6 to 24 weeks of age), and found that individual MUP profiles and MUP concentration were surprisingly dynamic and showed significant changes after puberty and during adulthood. Contrary to what we expected, however, the minor bands were the most variable over time, and thus not good candidates for individual fingerprints. Although MUP profiles do not provide individual fingerprints, we found that MUP profiles were more similar among siblings than non-kin despite considerable fluctuation. Our findings show that MUP profiles are not highly stable over time and do not show strong individual clustering, and thus challenge the barcode hypothesis. Within-individual dynamics of MUP profiles indicate a different function of MUPs in individual recognition than previously assumed and advocate an alternative hypothesis (‘dynamic changes’ hypothesis). PMID:26973837

  3. Major urinary protein (MUP) profiles show dynamic changes rather than individual 'barcode' signatures.

    Science.gov (United States)

    Thoß, M; Luzynski, K C; Ante, M; Miller, I; Penn, D J

    2015-06-30

House mice (Mus musculus) produce a variable number of major urinary proteins (MUPs), and studies suggest that each individual produces a unique MUP profile that provides a distinctive odor signature controlling individual and kin recognition. This 'barcode hypothesis' requires that MUP urinary profiles show high individual variability within populations and also high individual consistency over time, but tests of these assumptions are lacking. We analyzed urinary MUP profiles of 66 wild-caught house mice from eight populations using isoelectric focusing. We found that MUP profiles of wild male house mice are not individually unique, and though they were highly variable, closer inspection revealed that the variation strongly depended on MUP band type. The prominent ('major') bands were surprisingly homogeneous (and hence most MUPs are not polymorphic), but we also found inconspicuous ('minor') bands that were highly variable and therefore potential candidates for individual fingerprints. We also examined changes in urinary MUP profiles of 58 males over time (from 6 to 24 weeks of age), and found that individual MUP profiles and MUP concentration were surprisingly dynamic and showed significant changes after puberty and during adulthood. Contrary to what we expected, however, the minor bands were the most variable over time, and thus not good candidates for individual fingerprints. Although MUP profiles do not provide individual fingerprints, we found that MUP profiles were more similar among siblings than non-kin despite considerable fluctuation. Our findings show that MUP profiles are not highly stable over time and do not show strong individual clustering, and thus challenge the barcode hypothesis. Within-individual dynamics of MUP profiles indicate a different function of MUPs in individual recognition than previously assumed and advocate an alternative hypothesis ('dynamic changes' hypothesis).

  4. Dynamic oscillatory signatures of central neuropathic pain in spinal cord injury.

    Science.gov (United States)

    Vuckovic, Aleksandra; Hasan, Muhammad A; Fraser, Matthew; Conway, Bernard A; Nasseroleslami, Bahman; Allan, David B

    2014-06-01

    Central neuropathic pain (CNP) is believed to be accompanied by increased activation of the sensorimotor cortex. Our knowledge of this interaction is based mainly on functional magnetic resonance imaging studies, but there is little direct evidence on how these changes manifest in terms of dynamic neuronal activity. This study reports on the presence of transient electroencephalography (EEG)-based measures of brain activity during motor imagery in spinal cord-injured patients with CNP. We analyzed dynamic EEG responses during imaginary movements of arms and legs in 3 groups of 10 volunteers each, comprising able-bodied people, paraplegic patients with CNP (lower abdomen and legs), and paraplegic patients without CNP. Paraplegic patients with CNP had increased event-related desynchronization in the theta, alpha, and beta bands (16-24 Hz) during imagination of movement of both nonpainful (arms) and painful limbs (legs). Compared to patients with CNP, paraplegics with no pain showed a much reduced power in relaxed state and reduced event-related desynchronization during imagination of movement. Understanding these complex dynamic, frequency-specific activations in CNP in the absence of nociceptive stimuli could inform the design of interventional therapies for patients with CNP and possibly further understanding of the mechanisms involved. This study compares the EEG activity of spinal cord-injured patients with CNP to that of spinal cord-injured patients with no pain and also to that of able-bodied people. The study shows that the presence of CNP itself leads to frequency-specific EEG signatures that could be used to monitor CNP and inform neuromodulatory treatments of this type of pain. Copyright © 2014 American Pain Society. Published by Elsevier Inc. All rights reserved.
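Event-related desynchronization (ERD), the quantity behind the band-specific findings above, is the percent drop of band power during a task (here, motor imagery) relative to a rest reference; negative values indicate desynchronization. A self-contained sketch with a plain DFT band-power estimate on synthetic alpha-band data (sampling rate and amplitudes are illustrative):

```python
import math

def band_power(samples, fs, f_lo, f_hi):
    """Power summed over DFT components with f_lo <= f < f_hi (Hz),
    computed with a plain DFT (fine for short illustrative signals)."""
    n = len(samples)
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            total += 2.0 * (re * re + im * im) / (n * n)
    return total

def erd_percent(power_task, power_rest):
    """ERD/ERS as percent band-power change relative to rest:
    negative = desynchronization, positive = synchronization."""
    return 100.0 * (power_task - power_rest) / power_rest

fs, n = 128, 128
rest = [2.0 * math.sin(2 * math.pi * 10 * i / fs) for i in range(n)]  # strong 10 Hz rhythm at rest
task = [1.0 * math.sin(2 * math.pi * 10 * i / fs) for i in range(n)]  # attenuated during imagery
erd = erd_percent(band_power(task, fs, 8, 13), band_power(rest, fs, 8, 13))
```

Halving the alpha-rhythm amplitude quarters its power, so this example yields an ERD of about −75%.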

  5. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  6. Dynamic knowledge representation using agent-based modeling: ontology instantiation and verification of conceptual models.

    Science.gov (United States)

    An, Gary

    2009-01-01

    The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally qualitatively expressed using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused at determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.
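As a minimal illustration of the idea, one qualitative rule ("a signal activates cells, and active cells secrete more signal") can be instantiated as a toy agent-based model whose dynamic behavior can then be checked for plausibility. The classes and parameters below are invented for the example, not drawn from any specific biomedical ontology:

```python
import random

class Cell:
    """Toy agent: a cell that can be activated by an extracellular signal
    and, once active, secretes signal itself -- a dynamic instantiation of
    a qualitative rule such as 'mediator X activates cell type Y'."""
    def __init__(self):
        self.active = False

    def step(self, signal, rng):
        # The qualitative rule rendered stochastic: more local signal,
        # higher chance of activation. Activation is irreversible here.
        if not self.active and rng.random() < min(1.0, signal):
            self.active = True
        return 0.05 if self.active else 0.0   # signal secreted this step

rng = random.Random(7)
cells = [Cell() for _ in range(100)]
signal = 0.1                                  # initial external stimulus
history = []
for t in range(50):
    signal = sum(c.step(signal, rng) for c in cells)
    history.append(sum(c.active for c in cells))
```

Even this trivial model exhibits the kind of nonintuitive aggregate behavior (a positive-feedback activation cascade) that, as the abstract notes, can "break" a conceptual model when it is made dynamic.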

  7. Secure Hashing of Dynamic Hand Signatures Using Wavelet-Fourier Compression with BioPhasor Mixing and Discretization

    Directory of Open Access Journals (Sweden)

    Wai Kuan Yip

    2007-01-01

Full Text Available We introduce a novel method for secure computation of a biometric hash on dynamic hand signatures using BioPhasor mixing and discretization. The use of BioPhasor as the mixing process provides a one-way transformation that precludes exact recovery of the biometric vector from compromised hashes and stolen tokens. In addition, our user-specific discretization acts both as an error correction step and as a real-to-binary space converter. We also propose a new method of extracting a compressed representation of dynamic hand signatures using the discrete wavelet transform (DWT) and discrete Fourier transform (DFT). Without the conventional use of dynamic time warping, the proposed method avoids storage of the user's hand signature template. This is an important consideration for protecting the privacy of the biometric owner. Our results show that the proposed method could produce stable and distinguishable bit strings, with equal error rates (EERs) reported for random and skilled forgeries under the stolen-token (worst-case) scenario and for both forgeries under the genuine-token (optimal) scenario.
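The compression stage described (a DWT followed by a DFT) can be sketched compactly. The BioPhasor mixing and the paper's user-specific discretization are more involved; below only the wavelet-Fourier compression and a simple threshold discretization are illustrated, with made-up data and helper names:

```python
import cmath
import math

def haar_approx(x):
    """One level of the Haar DWT: keep only the low-pass (approximation)
    coefficients, halving the signal length."""
    if len(x) % 2:
        x = x + [x[-1]]                    # pad odd-length input
    return [(a + b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]

def dft_magnitudes(x, n_keep):
    """Magnitudes of the first n_keep DFT coefficients of x."""
    n = len(x)
    return [abs(sum(v * cmath.exp(-2j * math.pi * k * i / n)
                    for i, v in enumerate(x))) for k in range(n_keep)]

def to_bits(features, thresholds):
    """Simple discretization: one bit per feature, compared against a
    per-user threshold (e.g. the enrollment mean of that feature)."""
    return [1 if f > t else 0 for f, t in zip(features, thresholds)]

samples = [0.1, 0.4, 0.8, 1.0, 0.9, 0.6, 0.3, 0.1]   # e.g. pen pressure over time
features = dft_magnitudes(haar_approx(samples), 3)    # compressed representation
```

Because only low-order transform coefficients are kept, no raw signature template needs to be stored, which is the privacy property the abstract emphasizes.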

  8. Analysis of signature wrapping attacks and countermeasures

    DEFF Research Database (Denmark)

    Gajek, Sebastian; Jensen, Meiko; Liao, Lijun

    2009-01-01

In recent research it turned out that Boolean verification of digital signatures in the context of WS-Security is likely to fail: if parts of a SOAP message are signed and the signature verification applied to the whole document returns true, then nevertheless the document may have been...

  9. A signature of attractor dynamics in the CA3 region of the hippocampus.

    Directory of Open Access Journals (Sweden)

    César Rennó-Costa

    2014-05-01

    Full Text Available The notion of attractor networks is the leading hypothesis for how associative memories are stored and recalled. A defining anatomical feature of such networks is excitatory recurrent connections. These "attract" the firing pattern of the network to a stored pattern, even when the external input is incomplete (pattern completion. The CA3 region of the hippocampus has been postulated to be such an attractor network; however, the experimental evidence has been ambiguous, leading to the suggestion that CA3 is not an attractor network. In order to resolve this controversy and to better understand how CA3 functions, we simulated CA3 and its input structures. In our simulation, we could reproduce critical experimental results and establish the criteria for identifying attractor properties. Notably, under conditions in which there is continuous input, the output should be "attracted" to a stored pattern. However, contrary to previous expectations, as a pattern is gradually "morphed" from one stored pattern to another, a sharp transition between output patterns is not expected. The observed firing patterns of CA3 meet these criteria and can be quantitatively accounted for by our model. Notably, as morphing proceeds, the activity pattern in the dentate gyrus changes; in contrast, the activity pattern in the downstream CA3 network is attracted to a stored pattern and thus undergoes little change. We furthermore show that other aspects of the observed firing patterns can be explained by learning that occurs during behavioral testing. The CA3 thus displays both the learning and recall signatures of an attractor network. These observations, taken together with existing anatomical and behavioral evidence, make the strong case that CA3 constructs associative memories based on attractor dynamics.
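The attractor property described above, recurrent excitatory connections pulling a partial cue back to a stored pattern, is captured in its simplest form by the classic Hopfield network. A tiny pattern-completion sketch (a generic attractor network, not a model of CA3 itself; patterns are chosen orthogonal so the behavior is deterministic):

```python
def train(patterns):
    """Hebbian recurrent weights storing +/-1 patterns (no self-connections)."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, cue, sweeps=5):
    """Asynchronous threshold updates: the state is 'attracted' to the
    stored pattern nearest the cue (pattern completion)."""
    s = list(cue)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(w[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

p1 = [1] * 16 + [-1] * 16             # two orthogonal stored patterns
p2 = [1, -1] * 16
w = train([p1, p2])
cue = [-v for v in p1[:6]] + p1[6:]   # p1 with its first 6 bits corrupted
completed = recall(w, cue)
```

Despite 6 of 32 corrupted bits, the recurrent dynamics restore the stored pattern exactly, which is the "little change under input morphing" signature the abstract attributes to CA3.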

  10. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

Full Text Available This work is devoted to an investigation of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were examined. Different methods of generation and verification of threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency was shown. Topics for further investigation are given, which could reduce the level of counterfeit electronic documents signed by a group of users.
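The Lagrange-interpolation construction underlying many threshold schemes is easiest to see in plain Shamir secret sharing, where any t of n shares reconstruct a secret (in real threshold signatures the key is never reconstructed; participants instead combine partial signatures, but the interpolation algebra is the same). A sketch over a prime field:

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is mod P

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it:
    shares are points (x, f(x)) on a random degree t-1 polynomial
    with f(0) = secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):        # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers f(0) = secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

random.seed(42)
shares = make_shares(123456789, t=3, n=5)
```

Any three of the five shares suffice, and fewer than three reveal nothing about the secret (the modular inverse via three-argument `pow` requires Python 3.8+).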

  11. Studying the potential of point detectors in time-resolved dose verification of dynamic radiotherapy

    International Nuclear Information System (INIS)

    Beierholm, A.R.; Behrens, C.F.; Andersen, C.E.

    2015-01-01

    Modern megavoltage x-ray radiotherapy with high spatial and temporal dose gradients puts high demands on the entire delivery system, including not just the linear accelerator and the multi-leaf collimator, but also algorithms used for optimization and dose calculations, and detectors used for quality assurance and dose verification. In this context, traceable in-phantom dosimetry using a well-characterized point detector is often an important supplement to 2D-based quality assurance methods based on radiochromic film or detector arrays. In this study, an in-house developed dosimetry system based on fiber-coupled plastic scintillator detectors was evaluated and compared with a Farmer-type ionization chamber and a small-volume ionization chamber. An important feature of scintillator detectors is that the sensitive volume of the detector can easily be scaled, and five scintillator detectors of different scintillator length were thus employed to quantify volume averaging effects by direct measurement. The dosimetric evaluation comprised several complex-shape static fields as well as simplified dynamic deliveries using RapidArc, a volumetric-modulated arc therapy modality often used at the participating clinic. The static field experiments showed that the smallest scintillator detectors were in the best agreement with dose calculations, while needing the smallest volume averaging corrections. Concerning total dose measured during RapidArc, all detectors agreed with dose calculations within 1.1 ± 0.7% when positioned in regions of high homogenous dose. Larger differences were observed for high dose gradient and organ at risk locations, were differences between measured and calculated dose were as large as 8.0 ± 5.5%. The smallest differences were generally seen for the small-volume ionization chamber and the smallest scintillators. The time-resolved RapidArc dose profiles revealed volume-dependent discrepancies between scintillator and ionization chamber response
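The volume-averaging effect that grows with scintillator length can be illustrated by numerically averaging a dose profile over the detector's sensitive length. The profile below is a synthetic narrow Gaussian peak standing in for a small-field dose distribution; all numbers are illustrative:

```python
import math

def volume_average(dose_at, centre_mm, length_mm, n=201):
    """Mean dose over a detector of finite sensitive length centred at
    centre_mm, approximated by sampling the profile at n points."""
    step = length_mm / (n - 1)
    start = centre_mm - length_mm / 2.0
    return sum(dose_at(start + k * step) for k in range(n)) / n

# Synthetic narrow dose peak (e.g. a small-field profile), sigma = 3 mm
dose_at = lambda x: math.exp(-x * x / (2.0 * 3.0 ** 2))

point = dose_at(0.0)                              # true dose at the peak
short_det = volume_average(dose_at, 0.0, 1.0)     # 1 mm scintillator
long_det = volume_average(dose_at, 0.0, 10.0)     # 10 mm scintillator
```

The longer detector under-reads the peak substantially more than the short one, which is why the smallest scintillators in the study needed the smallest volume-averaging corrections.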

  12. Additional signature of the dynamical Casimir effect in a superconducting circuit

    International Nuclear Information System (INIS)

    Rego, Andreson L.C.; Farina, C.; Silva, Hector O.; Alves, Danilo T.

    2013-01-01

Full text: The dynamical Casimir effect (DCE) is one of the most fascinating quantum vacuum effects; it consists, essentially, in particle creation as a result of the interaction between a quantized field and a moving mirror. In this sense, particle creation due to external time-dependent potentials or backgrounds, or even time-dependent electromagnetic properties of a material medium, can also be included in a general definition of the DCE. For simplicity, this interaction is in general simulated by means of idealized boundary conditions (BC). As a consequence of the particle creation, the moving mirror experiences a dissipative radiation reaction force acting on it. In order to generate an appreciable number of photons to be observed, the DCE was investigated in other contexts, for example in circuit quantum electrodynamics. This theory predicted a high photon creation rate from modulation of the length of an open transmission line coupled to a superconducting quantum interference device (SQUID), an extremely sensitive magnetometer (J.R. Johansson et al, 2009/2010). A time-dependent magnetic flux can be applied to the SQUID, changing its inductance and leading to a time-dependent BC which simulates a moving boundary. It was in this scenario that the first observation of the DCE was announced by Wilson and collaborators (Wilson et al, 2011). Taking as motivation the experiment that observed the DCE, we investigate the influence of a generalized time-dependent Robin BC, which contains an extra term involving the second-order time derivative of the field, on particle creation via the DCE. This kind of BC may appear quite naturally in the context of circuit quantum electrodynamics, and the extra term was neglected in the theoretical analysis of the first observation of the DCE. Appropriate adjustments of this new parameter can not only enhance the total number of created particles but also give rise to a non-parabolic shape of the particle creation spectrum.
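Schematically, a generalized Robin condition of the type described, with the extra second-order time-derivative term, can be written as (the notation here is chosen for illustration, not taken from the paper):

```latex
\phi(t,0) \;=\; \gamma(t)\,\partial_x \phi(t,0) \;+\; \beta\,\partial_t^{2} \phi(t,0),
```

where $\gamma(t)$ is the usual time-dependent Robin parameter (modulated, in the SQUID setup, by the applied magnetic flux) and $\beta$ controls the additional second-order time-derivative term; setting $\beta = 0$ recovers the standard time-dependent Robin boundary condition.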

  13. EPID-based verification of the MLC performance for dynamic IMRT and VMAT

    International Nuclear Information System (INIS)

    Rowshanfarzad, Pejman; Sabet, Mahsheed; Barnes, Michael P.; O’Connor, Daryl J.; Greer, Peter B.

    2012-01-01

    Purpose: In advanced radiotherapy treatments such as intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), verification of the performance of the multileaf collimator (MLC) is an essential part of the linac QA program. The purpose of this study is to use the existing measurement methods for geometric QA of the MLCs and extend them to more comprehensive evaluation techniques, and to develop dedicated robust algorithms to quantitatively investigate the MLC performance in a fast, accurate, and efficient manner. Methods: The behavior of leaves was investigated in the step-and-shoot mode by the analysis of integrated electronic portal imaging device (EPID) images acquired during picket fence tests at fixed gantry angles and arc delivery. The MLC was also studied in dynamic mode by the analysis of cine EPID images of a sliding gap pattern delivered in a variety of conditions including different leaf speeds, deliveries at fixed gantry angles or in arc mode, and changing the direction of leaf motion. The accuracy of the method was tested by detection of the intentionally inserted errors in the delivery patterns. Results: The algorithm developed for the picket fence analysis was able to find each individual leaf position, gap width, and leaf bank skewness in addition to the deviations from expected leaf positions with respect to the beam central axis with sub-pixel accuracy. For the three tested linacs over a period of 5 months, the maximum change in the gap width was 0.5 mm, the maximum deviation from the expected leaf positions was 0.1 mm and the MLC skewness was up to 0.2°. The algorithm developed for the sliding gap analysis could determine the velocity and acceleration/deceleration of each individual leaf as well as the gap width. There was a slight decrease in the accuracy of leaf performance with increasing leaf speeds. The analysis results were presented through several graphs. The accuracy of the method was assessed as 0.01 mm
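Per image row, the sub-pixel leaf-position measurement in a picket-fence analysis reduces to locating profile maxima and refining each by a local fit. A minimal sketch using three-point parabolic interpolation (the synthetic profile and threshold are illustrative, not EPID data):

```python
def peak_positions(profile, threshold):
    """Locate picket (gap) centres in a 1-D EPID profile with sub-pixel
    accuracy: find local maxima above a threshold, then refine each by
    fitting a parabola through the peak and its two neighbours."""
    peaks = []
    for i in range(1, len(profile) - 1):
        y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
        if y1 >= threshold and y1 >= y0 and y1 > y2:
            denom = y0 - 2 * y1 + y2
            # Vertex of the parabola through the three points
            offset = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
            peaks.append(i + offset)
    return peaks

# Synthetic profile: two pickets centred at pixels 5 and 14
profile = [0, 0, 0, 1, 4, 6, 4, 1, 0, 0, 0, 0, 1, 4, 6, 4, 1, 0, 0, 0]
centres = peak_positions(profile, threshold=3)
```

Differences between successive centres give the picket spacing, and systematic per-leaf offsets from the expected positions flag MLC calibration errors, which is the quantity the algorithms above extract with sub-pixel accuracy.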

  14. Six years of experience in planning and verification of dynamic IMRT with portal dosimetry

    International Nuclear Information System (INIS)

    Molina Lopez, M. Y.; Pardo Perez, E.; Ruiz Maqueda, S.; Castro Novais, J.; Diaz Gavela, A. A.

    2013-01-01

    The objective of this study is to review the IMRT verification method used over the six years of operation of the Radiophysics and Radiation Protection Service, analyzing the per-field evaluation parameters for the 718 IMRT treatments performed during this period. (Author)

  15. Dosimetric parameters of enhanced dynamic wedge for treatment planning and verification

    International Nuclear Information System (INIS)

    Leavitt, Dennis D.; Lee, Wing Lok; Gaffney, David K.

    1996-01-01

    Purpose/Objective: Enhanced Dynamic Wedge (EDW) is an intensity-modulated radiotherapy technique in which one collimating jaw sweeps across the field to define a desired wedge dose distribution while the dose rate is modified according to jaw position. This tool enables discrete or continuous wedge angles from zero to sixty degrees for field widths from 3 cm to 30 cm in the direction of the wedge, and up to 40 cm perpendicular to the wedge direction. Additionally, asymmetric wedge fields not centered on the line through isocenter can be created for applications such as tangential breast irradiation. The unique range of field shapes and wedge angles introduces a new set of dosimetric challenges to be resolved before routine clinical use of EDW, and especially requires that a simple set of independent dose calculation and verification techniques be developed to check computerized treatment planning results. Using terminology in common use in treatment planning, this work defines the effective wedge factor vs. field width and wedge angle, evaluates the depth dose vs. open field values, defines primary intensity functions from which specific dynamic wedges can be calculated in treatment planning systems, and describes the technique for independent calculation of Monitor Units for EDW fields. Materials and Methods: Using 6- and 18-MV beams from a Clinac 2100C, EDW beam profiles were measured in a water phantom for depths from near-surface to 30 cm for the full range of field widths and wedge angles using a linear detector array of 25 energy-compensated diodes. Asymmetric wedge field profiles were likewise measured. Depth doses were measured in a water phantom using an ionization chamber sequentially positioned to depths of 30 cm. Effective wedge factors for the full range of field widths and wedge angles were measured using an ionization chamber in water-equivalent plastic at a depth of 10 cm on the central axis.
Dose profiles were calculated by computer as the summation of a series
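An independent MU check of the kind described above is usually a simple product-of-factors hand calculation. A minimal sketch follows; the factor names mirror common treatment-planning usage, and all numeric values are placeholders, not commissioning data from the paper:

```python
def edw_monitor_units(prescribed_dose_cgy, ref_output_cgy_per_mu,
                      output_factor, effective_wedge_factor, pdd_fraction):
    """MU = dose / (reference output (cGy/MU) x OF x EWF x PDD).

    effective_wedge_factor is the field-width- and wedge-angle-dependent
    factor the paper tabulates for EDW; pdd_fraction is the depth dose
    relative to d_max (e.g. 0.67 for 67%).
    """
    return prescribed_dose_cgy / (ref_output_cgy_per_mu * output_factor
                                  * effective_wedge_factor * pdd_fraction)

# Placeholder example: 200 cGy prescription, 1 cGy/MU reference output,
# output factor 1.02, effective wedge factor 0.75, PDD 67% at depth.
mu = edw_monitor_units(200.0, 1.0, 1.02, 0.75, 0.67)
```

Because the effective wedge factor sits in the denominator, a smaller EWF (more heavily wedged field) directly increases the required MU, which is why an independent tabulation of EWF vs. field width and wedge angle is central to the verification scheme described.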

  16. Signature detection and matching for document image retrieval.

    Science.gov (United States)

    Zhu, Guangyu; Zheng, Yefeng; Doermann, David; Jaeger, Stefan

    2009-11-01

    As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from cluttered backgrounds remains an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as queries in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches.
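The LDA step above combines complementary dissimilarity measures into a single matching score. A minimal two-feature Fisher-LDA sketch in pure Python (the dissimilarity pairs in the test are made up; the paper's actual features are its anisotropic-scaling and registration-residual measures):

```python
def fisher_lda_2d(class0, class1):
    """Fisher discriminant direction w for two sets of 2D points.

    w = Sw^{-1} (mu1 - mu0), with Sw the summed within-class scatter.
    Each class is a list of (d1, d2) dissimilarity pairs; projecting a
    pair onto w yields a single combined score.
    """
    def mean(pts):
        n = float(len(pts))
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    def scatter(pts, mu):
        sxx = sum((p[0] - mu[0]) ** 2 for p in pts)
        syy = sum((p[1] - mu[1]) ** 2 for p in pts)
        sxy = sum((p[0] - mu[0]) * (p[1] - mu[1]) for p in pts)
        return sxx, sxy, syy

    m0, m1 = mean(class0), mean(class1)
    a0, a1 = scatter(class0, m0), scatter(class1, m1)
    sxx, sxy, syy = a0[0] + a1[0], a0[1] + a1[1], a0[2] + a1[2]
    det = sxx * syy - sxy * sxy
    dx, dy = m1[0] - m0[0], m1[1] - m0[1]
    # Closed-form 2x2 inverse applied to the mean difference.
    return ((syy * dx - sxy * dy) / det, (sxx * dy - sxy * dx) / det)
```

A score `w[0]*d1 + w[1]*d2` can then be thresholded to decide a match, which is the essence of combining metrics via LDA.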

  17. Nitrate denitrification with nitrite or nitrous oxide as intermediate products: Stoichiometry, kinetics and dynamics of stable isotope signatures.

    Science.gov (United States)

    Vavilin, V A; Rytov, S V

    2015-09-01

    A kinetic analysis of nitrate denitrification by one or two species of denitrifying bacteria, with glucose or ethanol as the carbon source and nitrite or nitrous oxide as intermediate products, was performed using experimental data published earlier (Menyailo and Hungate, 2006; Vidal-Gavilan et al., 2013). Modified Monod kinetics was used in the dynamic biological model, and special equations were added to the common dynamic biological model to describe how isotopic fractionation between N species evolves. In contrast to the generally assumed first-order kinetics, in this paper the traditional Rayleigh equation describing stable nitrogen and oxygen isotope fractionation in nitrate was derived from the dynamic isotopic equations for any type of kinetics. In accordance with the model, in Vidal-Gavilan's experiments the maximum specific rate of nitrate reduction proved to be lower for ethanol than for glucose. Conversely, the maximum specific rate of nitrite reduction proved to be much lower for glucose than for ethanol. Thus, the intermediate nitrite concentration was negligible in the ethanol experiment, while it was significant in the glucose experiment. In Menyailo and Hungate's experiments, the low maximum specific rate of nitrous oxide reduction gives a high intermediate nitrous oxide concentration. The model showed that the dynamics of nitrogen and oxygen isotope signatures respond to the biological dynamics. Two microbial species, rather than a single denitrifying population, proved more adequate for describing the total process of nitrate denitrification to dinitrogen. Copyright © 2015 Elsevier Ltd. All rights reserved.
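For reference, the Rayleigh relation recovered in the paper takes its standard closed form (this is the textbook statement; the paper's contribution is deriving it from the dynamic isotopic equations for arbitrary kinetics rather than assuming first-order decay):

```latex
\delta^{15}\mathrm{N} \;=\; \delta^{15}\mathrm{N}_0 \;+\; \varepsilon \,\ln\!\frac{C}{C_0}
```

where \(C/C_0\) is the fraction of nitrate remaining and \(\varepsilon\) is the isotopic enrichment factor; an analogous expression holds for \(\delta^{18}\mathrm{O}\).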

  18. Observational Signatures of Transverse Magnetohydrodynamic Waves and Associated Dynamic Instabilities in Coronal Flux Tubes

    Energy Technology Data Exchange (ETDEWEB)

    Antolin, P.; Moortel, I. De [School of Mathematics and Statistics, University of St. Andrews, St. Andrews, Fife KY16 9SS (United Kingdom); Doorsselaere, T. Van [Centre for mathematical Plasma Astrophysics, Mathematics Department, KU Leuven, Celestijnenlaan 200B bus 2400, B-3001 Leuven (Belgium); Yokoyama, T., E-mail: patrick.antolin@st-andrews.ac.uk [Department of Earth and Planetary Science, The University of Tokyo, Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)

    2017-02-20

    Magnetohydrodynamic (MHD) waves permeate the solar atmosphere and constitute potential coronal heating agents. Yet, the waves detected so far may be but a small subset of the true existing wave power. Detection is limited by instrumental constraints but also by wave processes that localize the wave power in undetectable spatial scales. In this study, we conduct 3D MHD simulations and forward modeling of standing transverse MHD waves in coronal loops with uniform and non-uniform temperature variation in the perpendicular cross-section. The observed signatures are largely dominated by the combination of the Kelvin–Helmholtz instability (KHI), resonant absorption, and phase mixing. In the presence of a cross-loop temperature gradient, we find that emission lines sensitive to the loop core catch different signatures compared to those that are more sensitive to the loop boundary and the surrounding corona, leading to an out-of-phase intensity and Doppler velocity modulation produced by KHI mixing. In all of the considered models, common signatures include an intensity and loop width modulation at half the kink period, a fine strand-like structure, a characteristic arrow-shaped structure in the Doppler maps, and overall line broadening in time, particularly at the loop edges. For our model, most of these features can be captured with a spatial resolution of 0.″33 and a spectral resolution of 25 km s⁻¹, although we do obtain a severe overestimation of the line width. Resonant absorption leads to a significant decrease of the observed kinetic energy from Doppler motions over time, which is not recovered by a corresponding increase in the line width from phase mixing and KHI motions. We estimate this hidden wave energy to be a factor of 5–10 times the observed value.

  19. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  20. Dosimetric properties of an amorphous silicon electronic portal imaging device for verification of dynamic intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Greer, Peter B.; Popescu, Carmen C.

    2003-01-01

    Dosimetric properties of an amorphous silicon electronic portal imaging device (EPID) for verification of dynamic intensity modulated radiation therapy (IMRT) delivery were investigated. The EPID was utilized with continuous frame-averaging during the beam delivery. Properties studied included effect of buildup, dose linearity, field size response, sampling of rapid multileaf collimator (MLC) leaf speeds, response to dose-rate fluctuations, memory effect, and reproducibility. The dependence of response on EPID calibration and a dead time in image frame acquisition occurring every 64 frames were measured. EPID measurements were also compared to ion chamber and film for open and wedged static fields and IMRT fields. The EPID was linear with dose and dose rate, and response to MLC leaf speeds up to 2.5 cm s⁻¹ was found to be linear. A field size dependent response of up to 5% relative to d_max ion-chamber measurement was found. Reproducibility was within 0.8% (1 standard deviation) for an IMRT delivery recorded at intervals over a period of one month. The dead time in frame acquisition resulted in errors in the EPID that increased with leaf speed and were over 20% for a 1 cm leaf gap moving at 1.0 cm s⁻¹. The EPID measurements were also found to depend on the input beam profile utilized for EPID flood-field calibration. The EPID shows promise as a device for verification of IMRT, the major limitation currently being the dead time in frame acquisition

  1. Computer-aided classification of lesions by means of their kinetic signatures in dynamic contrast-enhanced MR images

    Science.gov (United States)

    Twellmann, Thorsten; ter Haar Romeny, Bart

    2008-03-01

    The kinetic characteristics of tissue in dynamic contrast-enhanced magnetic resonance imaging data are an important source of information for the differentiation of benign and malignant lesions. Kinetic curves measured for each lesion voxel allow one to infer information about the state of the local tissue. As a whole, they reflect the heterogeneity of the vascular structure within a lesion, an important criterion for the preoperative classification of lesions. Current clinical practice in the analysis of tissue kinetics, however, is mainly based on the evaluation of the "most-suspect curve", which is only related to a small, manually or semi-automatically selected region-of-interest within a lesion and does not reflect any information about tissue heterogeneity. We propose a new method which exploits the full range of kinetic information for the automatic classification of lesions. Instead of breaking down the large amount of kinetic information to a single curve, each lesion is considered as a probability distribution in a space of kinetic features, efficiently represented by its kinetic signature obtained by adaptive vector quantization of the corresponding kinetic curves. Dissimilarity of two signatures can be objectively measured using the Mallows distance, which is a metric defined on probability distributions. The embedding of this metric in a suitable kernel function enables us to employ modern kernel-based machine learning techniques for the classification of signatures. In a study considering 81 breast lesions, the proposed method yielded an A_z value of 0.89 ± 0.01 for the discrimination of benign and malignant lesions in a nested leave-one-lesion-out evaluation setting.
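For one-dimensional features, the Mallows distance used above coincides with the Wasserstein-1 distance between empirical distributions, which for equally sized samples reduces to an average over sorted values. A pure-Python sketch of the metric and its kernel embedding (the paper applies this to vector-quantized multi-dimensional signatures; this 1D form is illustrative only):

```python
import math

def mallows_1d(xs, ys):
    """Mallows (Wasserstein-1) distance between two equal-size 1D samples:
    sort both samples and average the absolute differences of paired values."""
    assert len(xs) == len(ys), "equal sample sizes assumed in this sketch"
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

def mallows_kernel(xs, ys, sigma=1.0):
    """Embed the metric in an exponential kernel, usable with kernel-based
    classifiers such as an SVM (the kernel form here is an assumption)."""
    return math.exp(-mallows_1d(xs, ys) / sigma)
```

Identical signatures give distance 0 and kernel value 1; increasingly dissimilar kinetic distributions decay the kernel toward 0, which is the property a kernel classifier exploits.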

  2. Dynamic response signatures of a scaled model platform for floating wind turbines in an ocean wave basin.

    Science.gov (United States)

    Jaksic, V; O'Shea, R; Cahill, P; Murphy, J; Mandic, D P; Pakrashi, V

    2015-02-28

    Understanding of dynamic behaviour of offshore wind floating substructures is extremely important in relation to design, operation, maintenance and management of floating wind farms. This paper presents assessment of nonlinear signatures of dynamic responses of a scaled tension-leg platform (TLP) in a wave tank exposed to different regular wave conditions and sea states characterized by the Bretschneider, the Pierson-Moskowitz and the JONSWAP spectra. Dynamic responses of the TLP were monitored at different locations using load cells, a camera-based motion recognition system and a laser Doppler vibrometer. The analysis of variability of the TLP responses and statistical quantification of their linearity or nonlinearity, as non-destructive means of structural monitoring from the output-only condition, remains a challenging problem. In this study, the delay vector variance (DVV) method is used to statistically study the degree of nonlinearity of measured response signals from a TLP. DVV is observed to create a marker estimating the degree to which a change in signal nonlinearity reflects real-time behaviour of the structure and also to establish the sensitivity of the instruments employed to these changes. The findings can be helpful in establishing monitoring strategies and control strategies for undesirable levels or types of dynamic response and can help to better estimate changes in system characteristics over the life cycle of the structure. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  3. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. The multi-fluid formulation is still under development, but BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility in the underlying MOOSE framework, BIGHORN is quite extensible, and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent for this suite of problems is to provide baseline comparison data that demonstrates the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods, and suggest best practices when using BIGHORN.

  4. EG-07CELL CYCLE SIGNATURE AND TUMOR PHYLOGENY ARE ENCODED IN THE EVOLUTIONARY DYNAMICS OF DNA METHYLATION IN GLIOMA

    Science.gov (United States)

    Mazor, Tali; Pankov, Aleksandr; Johnson, Brett E.; Hong, Chibo; Bell, Robert J.A.; Smirnov, Ivan V.; Reis, Gerald F.; Phillips, Joanna J.; Barnes, Michael; Bollen, Andrew W.; Taylor, Barry S.; Molinaro, Annette M.; Olshen, Adam B.; Song, Jun S.; Berger, Mitchel S.; Chang, Susan M.; Costello, Joseph F.

    2014-01-01

    The clonal evolution of tumor cell populations can be reconstructed from patterns of genetic alterations. In contrast, tumor epigenetic states, including DNA methylation, are reversible and sensitive to the tumor microenvironment, presumably precluding the use of epigenetics to discover tumor phylogeny. Here we examined the spatial and temporal dynamics of DNA methylation in a clinically and genetically characterized cohort of IDH1-mutant low-grade gliomas and their patient-matched recurrences. WHO grade II gliomas are diffuse, infiltrative tumors that frequently recur and may undergo malignant progression to a higher grade with a worse prognosis. The extent to which epigenetic alterations contribute to the evolution of low-grade gliomas, including malignant progression, is unknown. While all gliomas in the cohort exhibited the hypermethylation signature associated with IDH1 mutation, low-grade gliomas that underwent malignant progression to high-grade glioblastoma (GBM) had a unique signature of DNA hypomethylation enriched for active enhancers, as well as sites of age-related hypermethylation in the brain. Genes with promoter hypomethylation and concordant transcriptional upregulation during evolution to GBM were enriched in cell cycle function, evolving in concert with genetic alterations that deregulate the G1/S cell cycle checkpoint. Despite the plasticity of tumor epigenetic states, phyloepigenetic trees robustly recapitulated phylogenetic trees derived from somatic mutations in the same patients. These findings highlight widespread co-dependency of genetic and epigenetic events throughout the clonal evolution of initial and recurrent glioma.

  5. Potential energy landscape signatures of slow dynamics in glass forming liquids

    DEFF Research Database (Denmark)

    Sastry, S.; Debenedetti, P. G.; Stillinger, F. H.

    1999-01-01

    We study the properties of local potential energy minima (‘inherent structures’) sampled by liquids at low temperatures as an approach to elucidating the mechanisms of the dynamical slowing down observed as the glass transition temperature is approached. This onset of slow dynamics...

  6. STUDIES OF ACOUSTIC EMISSION SIGNATURES FOR QUALITY ASSURANCE OF SS 316L WELDED SAMPLES UNDER DYNAMIC LOAD CONDITIONS

    Directory of Open Access Journals (Sweden)

    S. V. RANGANAYAKULU

    2016-10-01

    Full Text Available Acoustic emission (AE) signatures of various weld defects in stainless steel 316L nuclear-grade weld material are investigated. The samples, fabricated by the tungsten inert gas (TIG) welding method, have final dimensions of 140 mm x 15 mm x 10 mm. AE signals from weld defects such as pinhole, porosity, lack of penetration, lack of side fusion and slag are recorded under dynamic load conditions using a specially designed mechanical jig. AE features of the weld defects were obtained using the linear location technique (LLT). The results of this study conclude that stress release and structural deformation between sections in the weld area account for the major part of the acoustic emission activity during loading.

  7. Signatures of correlated excitonic dynamics in two-dimensional spectroscopy of the Fenna-Matthews-Olson photosynthetic complex

    International Nuclear Information System (INIS)

    Caram, Justin R.; Lewis, Nicholas H. C.; Fidler, Andrew F.; Engel, Gregory S.

    2012-01-01

    Long-lived excitonic coherence in photosynthetic proteins has become an exciting area of research because it may provide design principles for enhancing the efficiency of energy transfer in a broad range of materials. In this publication, we provide new evidence that long-lived excitonic coherence in the Fenna-Matthews-Olson pigment-protein (FMO) complex is consistent with the assumption of cross correlation in the site basis, indicating that each site shares bath fluctuations. We analyze the structure and character of the beating crosspeak between the two lowest energy excitons in two-dimensional (2D) electronic spectra of the FMO complex. To isolate this dynamic signature, we use the two-dimensional linear prediction Z-transform as a platform for filtering coherent beating signatures within 2D spectra. By separating signals into components in frequency and decay rate representations, we are able to improve resolution and isolate specific coherences. This strategy permits analysis of the shape, position, character, and phase of these features. Simulations of the crosspeak between excitons 1 and 2 in FMO under different regimes of cross correlation verify that statistically independent site fluctuations do not account for the elongation and persistence of the dynamic crosspeak. To reproduce the experimental results, we invoke near complete correlation in the fluctuations experienced by the sites associated with excitons 1 and 2. This model contradicts ab initio quantum mechanics/molecular mechanics simulations that observe no correlation between the energies of individual sites. This contradiction suggests that a new physical model for long-lived coherence may be necessary. The data presented here details experimental results that must be reproduced for a physical model of quantum coherence in photosynthetic energy transfer.

  8. Verification and Validation of the New Dynamic Mooring Modules Available in FAST v8

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Andersen, Morten Thøtt; Robertson, Amy N.

    2016-01-01

    The open-source, aero-hydro-servo-elastic wind turbine simulation software FAST v8 (created by the National Renewable Energy Laboratory) was recently coupled to two newly developed mooring dynamics modules: MoorDyn and FEAMooring. MoorDyn is a lumped-mass-based mooring dynamics module developed b...

  9. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    Science.gov (United States)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model for the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed-loop systems. For these systems the dynamic stability and function over long duration are essential. However, the only evaluation and rating methods for ECLSS up to now are either expensive trial and error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand water has a pivotal role in the human organism. Water accounts for about 60% of the total body mass and is a reactant and product of numerous metabolic reactions. It is a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a system engineering approach the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those

  10. Signatures of chaos and non-integrability in two-dimensional gravity with dynamical boundary

    Directory of Open Access Journals (Sweden)

    Fitkevich Maxim

    2016-01-01

    Full Text Available We propose a model of two-dimensional dilaton gravity with a boundary. In the bulk our model coincides with the classically integrable CGHS model; the dynamical boundary cuts off the CGHS strong-coupling region. As a result, classical dynamics in our model resembles that of spherically symmetric gravity: wave packets of matter fields either reflect from the boundary or form black holes. We find a large integrable sector of multisoliton solutions in this model. At the same time, we argue that the model is globally non-integrable because solutions at the verge of black hole formation display chaotic properties.

  11. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. 
This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use

  12. Host-pathogen evolutionary signatures reveal dynamics and future invasions of vampire bat rabies

    Czech Academy of Sciences Publication Activity Database

    Streicker, D. G.; Winternitz, Jamie Caroline; Satterfield, D. A.; Condori-Condori, R. E.; Broos, A.; Tello, C.; Recuenco, S.; Velasco-Villa, A.; Altizer, S.; Valderrama, W.

    2016-01-01

    Vol. 113, No. 39 (2016), pp. 10926-10931. ISSN 0027-8424 Institutional support: RVO:68081766 Keywords: Desmodus * zoonotic disease * forecasting * sex bias * spatial dynamics Subject RIV: GJ - Animal Vermins; Diseases, Veterinary Medicine Impact factor: 9.661, year: 2016

  13. Inverse dynamics of underactuated mechanical systems: A simple case study and experimental verification

    Science.gov (United States)

    Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.

    2011-05-01

    Underactuated systems are characterized by fewer control inputs than degrees of freedom, m < n. Determining a control strategy that forces such a system to complete a set of m specified motion tasks is challenging, and the existence of an explicit solution is conditioned on differential flatness of the problem. A flatness-based solution means that all 2n states and m control inputs can be algebraically expressed in terms of the m specified outputs and their time derivatives up to a certain order, which is in practice attainable only for simple systems. In this contribution the problem is posed in a more practical way as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc (n = 2, m = 1). Experimental verification of the inverse simulation control methodology is reported.
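For the two-disc benchmark described above the flat (algebraic) inverse-dynamics solution can be written down directly. A sketch with illustrative parameter values (J1, J2, K are placeholders, not the paper's experimental values) and the prescribed output theta2(t) = sin t:

```python
import math

# Illustrative parameters: disc inertias and torsional spring stiffness.
J1, J2, K = 1.0, 0.5, 2.0

# Prescribed motion of the unactuated disc and the derivatives we need.
theta2 = math.sin
d2_theta2 = lambda t: -math.sin(t)     # theta2''
d4_theta2 = math.sin                   # theta2''''

def inverse_dynamics(t):
    """Flatness-based inverse dynamics for the two-disc system.

    Disc 2 (unactuated):  J2*theta2'' = K*(theta1 - theta2)
        =>  theta1 = theta2 + (J2/K)*theta2''
    Disc 1 (actuated):    J1*theta1'' = u - K*(theta1 - theta2)
        =>  u = J1*theta1'' + K*(theta1 - theta2)
    """
    th1 = theta2(t) + (J2 / K) * d2_theta2(t)
    d2_th1 = d2_theta2(t) + (J2 / K) * d4_theta2(t)
    u = J1 * d2_th1 + K * (th1 - theta2(t))
    return th1, u
```

Note that the actuating torque depends on the fourth derivative of the specified output; for outputs that are not smooth enough, this is exactly where the algebraic route fails and the index-three DAE formulation of the paper becomes the practical alternative.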

  14. Verification and Validation of the New Dynamic Mooring Modules Available in FAST v8: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian; Robertson, Amy; Jonkman, Jason; Andersen, Morten T.

    2016-08-01

    The open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, was recently coupled to two newly developed mooring dynamics modules: MoorDyn and FEAMooring. MoorDyn is a lumped-mass-based mooring dynamics module developed by the University of Maine, and FEAMooring is a finite-element-based mooring dynamics module developed by Texas A&M University. This paper summarizes the work performed to verify and validate these modules against other mooring models and measured test data to assess their reliability and accuracy. The quality of the fairlead load predictions by the open-source mooring modules MoorDyn and FEAMooring appears to be largely equivalent to that of the commercial tool OrcaFlex. Both mooring dynamics models' predictions agree well with the experimental data, considering the given limitations in the accuracy of the platform hydrodynamic load calculation and the quality of the measurement data.

  15. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    for states that have traditionally had 'less transparency' in their military sectors. As case studies, we first investigate how verification measures including remote sensing, off-site environmental sampling, and on-site inspections could be applied to monitor the shutdown status of plutonium production facilities, and what measures could be taken to prevent the disclosure of sensitive information at the site. We find that the most effective verification measure for monitoring the status of a reprocessing plant would be on-site environmental sampling. Some countries may worry that sample analysis could disclose sensitive information about their past plutonium production activities. However, we find that sample analysis at the reprocessing site need not reveal such information, as long as inspectors are not able to measure the total quantities of Cs-137 and Sr-90 from HLW produced at former military plutonium production facilities. Secondly, we consider verification measures for shut-down gaseous diffusion uranium-enrichment plants (GDPs). The GDPs could be monitored effectively by satellite imagery: one telltale operational signature of a GDP would be the water-vapor plume from the cooling tower, which should be easy to detect in satellite images, and the hot roof of the enrichment building could be detectable in satellite thermal-infrared images. In addition, some on-site verification measures should be allowed, such as visual observation, surveillance, and tamper-indicating seals. Finally, an FMCT verification regime would have to be designed to detect undeclared fissile material production activities and facilities. These verification measures could include special or challenge inspections or complementary access. There would need to be provisions to prevent the abuse of such inspections, especially at sensitive and non-proscribed military and nuclear activities. In particular, to protect sensitive

  16. Increasing the Robustness of Biometric Templates for Dynamic Signature Biometric Systems

    OpenAIRE

    Tolosana Moranchel, Rubén; Vera-Rodríguez, Rubén; Ortega-García, Javier; Fiérrez, Julián

    2015-01-01

    Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. R. Tolosana, R. Vera-Rodriguez, J. Ortega-Garcia and J. Fierrez, "Increasing the robustness of biometric templates for dynamic...

  17. RVB signatures in the spin dynamics of the square-lattice Heisenberg antiferromagnet

    Science.gov (United States)

    Ghioldi, E. A.; Gonzalez, M. G.; Manuel, L. O.; Trumper, A. E.

    2016-03-01

    We investigate the spin dynamics of the square-lattice spin-1/2 Heisenberg antiferromagnet by means of an improved mean-field Schwinger boson calculation. By identifying both the long-range Néel and the RVB-like components of the ground state, we propose an educated guess for the mean-field magnetic excitation, consisting of a linear combination of local and bond spin flips, to compute the dynamical structure factor. Our main result is that when this magnetic excitation is optimized so that the corresponding sum rule is fulfilled, we recover the low- and high-energy spectral weight features of the experimental spectrum. In particular, the anomalous spectral weight depletion at (π,0) found in recent inelastic neutron scattering experiments can be attributed to the interference of the triplet bond excitations of the RVB component of the ground state. We conclude that the Schwinger boson theory is a good candidate for adequately interpreting the dynamic properties of the square-lattice Heisenberg antiferromagnet.

  18. Phytoestrogens and Mycoestrogens Induce Signature Structure Dynamics Changes on Estrogen Receptor α

    Directory of Open Access Journals (Sweden)

    Xueyan Chen

    2016-08-01

    Endocrine disrupters include a broad spectrum of chemicals such as industrial chemicals, natural estrogens and androgens, and synthetic estrogens and androgens. Phytoestrogens are widely present in diet and food supplements; mycoestrogens are frequently found in grains. As humans and animals are commonly exposed to phytoestrogens and mycoestrogens in diet and the environment, it is important to understand the potential beneficial or hazardous effects of such estrogenic compounds. Many bioassays have been established to study the binding of estrogenic compounds with the estrogen receptor (ER) and have provided rich data in the literature; however, few assays can offer structural information about the ligand/ER complex. Our current study surveys the global structure dynamics changes of the ERα ligand binding domain (LBD) when phytoestrogens and mycoestrogens bind. The assay is based on structure dynamics information probed by hydrogen deuterium exchange mass spectrometry and offers a unique viewpoint for elucidating the mechanism by which phytoestrogens and mycoestrogens interact with the estrogen receptor. Cluster analysis of the hydrogen deuterium exchange (HDX) data reveals a unique pattern when phytoestrogens and mycoestrogens bind the ERα LBD, compared to that of estradiol and synthetic estrogen modulators. Our study highlights that structure dynamics could play an important role in the structure-function relationship when endocrine disrupters interact with estrogen receptors.

  19. Verification of experimental dynamic strength methods with atomistic ramp-release simulations

    Science.gov (United States)

    Moore, Alexander P.; Brown, Justin L.; Lim, Hojun; Lane, J. Matthew D.

    2018-05-01

    Material strength and moduli can be determined from dynamic high-pressure ramp-release experiments using an indirect method of Lagrangian wave profile analysis of surface velocities. This method, termed self-consistent Lagrangian analysis (SCLA), has been difficult to calibrate and corroborate with other experimental methods. Using nonequilibrium molecular dynamics, we validate the SCLA technique by demonstrating that it accurately predicts the same bulk modulus, shear modulus, and strength as those calculated from the full stress tensor data, especially where strain rate induced relaxation effects and wave attenuation are small. We show here that introducing a hold in the loading profile at peak pressure gives improved accuracy in the shear moduli and relaxation-adjusted strength by reducing the effect of wave attenuation. When rate-dependent effects coupled with wave attenuation are large, we find that Lagrangian analysis overpredicts the maximum unload wavespeed, leading to increased error in the measured dynamic shear modulus. These simulations provide insight into the definition of dynamic strength, as well as a plausible explanation for experimental disagreement in reported dynamic strength values.

  20. A Rational Threshold Signature Model and Protocol Based on Different Permissions

    Directory of Open Access Journals (Sweden)

    Bojun Wang

    2014-01-01

    This paper develops a novel model and protocol for scenarios in which the participants of multiple groups with different permissions must complete a signature together. We apply a secret sharing scheme based on a difference equation to the private key distribution phase and the secret reconstruction phase of our threshold signature scheme. In addition, our scheme ensures a successful signature through the punishment strategy of repeated rational secret sharing. The bit commitment and verification method used to detect players' cheating behavior also helps to prevent internal fraud. Using bit commitments, verifiable parameters, and time sequences, this paper constructs a dynamic game model, which has the features of threshold signature management with different permissions, cheat proofing, and forward security.

  1. Study on dynamic rod worth measurement method and its test verification

    International Nuclear Information System (INIS)

    Wu Lei; Liu Tongxian; Zhao Wenbo; Li Songling; Yu Yingrui

    2015-01-01

    An advanced rod worth measurement technique, the dynamic rod worth measurement method (DRWM), has been developed. Static spatial factors (SSF) and a dynamic spatial factor (DSF) were introduced to improve the inverse kinetics method. Three-dimensional steady-state and transient simulations of the measurement process were carried out to calculate the modification factors. A rod worth measurement test was performed on a research reactor to verify DRWM. The results showed that the DRWM method provides improved accuracy and could replace the traditional methods. (authors)
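    The core of any such technique is the inverse point-kinetics relation ρ(t) = β + Λ(dn/dt − Σᵢ λᵢCᵢ)/n. The Python sketch below uses a single delayed-neutron group with illustrative kinetics parameters, and omits the SSF/DSF spatial corrections that distinguish DRWM; it simply shows the inversion recovering a step reactivity from a simulated flux trace:

```python
# Inverse point kinetics with one delayed-neutron group.
# Parameter values are illustrative; DRWM additionally applies the
# SSF/DSF spatial correction factors, omitted here.
beta, lam, Lam = 0.0065, 0.08, 2.0e-5   # delayed fraction, decay const (1/s), generation time (s)
dt, nsteps = 1.0e-4, 10000
rho_true = -0.001                        # step reactivity inserted at t = 0

# Forward simulation (explicit Euler) to create a "measured" flux trace n(t):
#   dn/dt = ((rho - beta)/Lam) n + lam C,   dC/dt = (beta/Lam) n - lam C
n = 1.0
C = beta / (Lam * lam)                   # equilibrium precursor concentration
trace = [n]
for _ in range(nsteps):
    dn = ((rho_true - beta) / Lam) * n + lam * C
    dC = (beta / Lam) * n - lam * C
    n, C = n + dt * dn, C + dt * dC
    trace.append(n)

# Inversion: rebuild the precursor balance from n(t) alone, then
#   rho = beta + Lam * (dn/dt - lam*C) / n.
C = beta / (Lam * lam)
rho_rec = []
for k in range(1, nsteps):
    C += dt * ((beta / Lam) * trace[k - 1] - lam * C)
    dndt = (trace[k + 1] - trace[k - 1]) / (2 * dt)
    rho_rec.append(beta + Lam * (dndt - lam * C) / trace[k])

rho_est = rho_rec[-1]                    # settles near rho_true after the prompt jump
```

    In a real measurement the flux comes from detector signals rather than a simulation, and the recovered reactivity is then multiplied by the spatial correction factors.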

  2. Experimental Verification of Dynamic Operation of Continuous and Multivessel Batch Distillation

    Energy Technology Data Exchange (ETDEWEB)

    Wittgens, Bernd

    1999-07-01

    This thesis presents a rigorous model, based on first principles, for dynamic simulation of the composition dynamics of staged high-purity continuous distillation columns, together with experiments performed to verify it. The thesis also demonstrates the importance of tray hydraulics for obtaining good agreement between simulation and experiment, and derives analytic expressions for dynamic time constants for use in simplified models of liquid and vapour dynamics. A newly developed multivessel batch distillation column, consisting of a reboiler, intermediate vessels, and a condenser vessel, provides a generalization of previously proposed batch distillation schemes. The total-reflux operation of this column was presented previously; the present thesis proposes a simple feedback control strategy for its operation based on temperature measurements. The feasibility of this strategy is demonstrated by simulations and verified by laboratory experiments. It is concluded that the multivessel column can easily be operated with simple temperature controllers, where the holdups are only controlled indirectly. For a given set of temperature setpoints, the final product compositions are independent of the initial feed composition. When the multivessel batch distillation column is compared to a conventional batch column, both operated under feedback control, the energy required to separate a multicomponent mixture into highly pure products is found to be much less for the multivessel system. This system is also the simplest one to operate.

  3. Understanding Biases in Ribosome Profiling Experiments Reveals Signatures of Translation Dynamics in Yeast.

    Directory of Open Access Journals (Sweden)

    Jeffrey A Hussmann

    2015-12-01

    Ribosome profiling produces snapshots of the locations of actively translating ribosomes on messenger RNAs. These snapshots can be used to make inferences about translation dynamics. Recent ribosome profiling studies in yeast, however, have reached contradictory conclusions regarding the average translation rate of each codon. Some experiments have used cycloheximide (CHX) to stabilize ribosomes before measuring their positions, and these studies all counterintuitively report a weak negative correlation between the translation rate of a codon and the abundance of its cognate tRNA. In contrast, some experiments performed without CHX report strong positive correlations. To explain this contradiction, we identify unexpected patterns in ribosome density downstream of each type of codon in experiments that use CHX. These patterns are evidence that elongation continues to occur in the presence of CHX but with dramatically altered codon-specific elongation rates. The measured positions of ribosomes in these experiments therefore do not reflect the amounts of time ribosomes spend at each position in vivo. These results suggest that conclusions from experiments in yeast using CHX may need reexamination. In particular, we show that in all such experiments, codons decoded by less abundant tRNAs were in fact being translated more slowly before the addition of CHX disrupted these dynamics.

  4. Signatures of dynamics in charge transport through organic molecules; Dynamisches Verhalten beim Ladungstransport durch organische Molekuele

    Energy Technology Data Exchange (ETDEWEB)

    Secker, Daniel

    2008-06-03

    The aim of the thesis at hand was to investigate dynamical behaviour in charge transport through organic molecules experimentally with the help of the mechanically controlled break junction (MCBJ) technique. The thesis concentrates on the complex interaction between the molecular contact configuration and the electronic structure. It is shown that varying the electrode distance, and thereby manipulating the molecule and contact configuration, affects the electronic structure as well as the coupling between the molecule and the electrodes. The latter observation is an additional hint at how closely I-V characteristics depend on the molecular contact configuration. Depending on the applied voltage, and thus the electric field, the molecular contact prefers one of two different configurations; a potential barrier between these two states is the origin of the observed hysteresis. A central part of the thesis deals with measurements of the current noise. Finally, the detailed discussion reveals the strong effect of dynamical interactions between the atomic configuration of the molecular contact and the electronic structure on charge transport in single-molecule junctions. (orig.)

  5. On the dynamics of a plasma vortex street and its topological signatures

    International Nuclear Information System (INIS)

    Siregar, E.; Stribling, W.T.; Goldstein, M.L.

    1994-01-01

    A plasma vortex street configuration can evolve when two velocity shear layers and one magnetic shear layer interact strongly. A study of the interaction between two- and three-dimensional plasma modes and a mean sheared magnetic field is undertaken using a three-dimensional magnetohydrodynamic spectral Galerkin computation. The initial state is a simple magnetic shear in a plane perpendicular to the plasma velocity shear plane. In a very weak magnetic field, secondary instabilities (three-dimensional modes), expressed by the kinking of vortex tubes, lead to plasma flow along and around the axes of the vortex cores, creating characteristic patterns of kinetic helicity and linkages between vortex filaments. Three-dimensionality leads to the vortex breakdown process. A strong sheared magnetic field inhibits the kinking of vortex tubes, maintaining two-dimensionality and thereby inhibiting vortex breakdown over long dynamical times. There is an anticorrelation in time between the linkage indices of the vortex filaments (related to kinetic helicity), suggesting that the ellipticity axes of the vortex cores along the street undergo a global in-phase evolution. This anticorrelation has a dynamical interpretation: it extends to a relaxing plasma the notion, from Navier-Stokes flow, that helical regions of opposite helicity interact and screen each other so that the global helicity remains bounded.

  6. Signature Balancing

    NARCIS (Netherlands)

    Noordkamp, H.W.; Brink, M. van den

    2006-01-01

    Signatures are an important part of the design of a ship. In an ideal situation, signatures must be as low as possible. However, due to budget constraints it is most unlikely to reach this ideal situation. The arising question is which levels of signatures are optimal given the different scenarios

  7. Observation of Spectral Signatures of 1/f Dynamics in Avalanches on Granular Piles

    Science.gov (United States)

    Kim, Yong W.; Nishino, Thomas K.

    1997-03-01

    Granular piles of monodisperse glass spheres, 0.46±0.03 mm in diameter, have been studied. The base diameter of the pile has been varied from 3/8" to 2" in 1/8" increments. A single-grain dispenser, consisting of a stepping-motor-actuated reciprocating arm with a single-grain scoop, delivers grains with greater than 95% reliability. Each grain is dropped on the apex of the pile with the lowest possible landing velocity, at intervals at least 30% longer than the duration of the largest avalanches for the given pile. Each grain added to the pile, and each grain lost from it in avalanches, is optically detected and recorded. The power spectrum of the net addition of grains to the pile as a function of time is found to be robustly 1/f for all base sizes. A wide variety of dynamical properties of 1/f systems, as obtained from the high-precision data, will be presented.

  8. Chemo-dynamical signatures in simulated Milky Way-like galaxies

    Science.gov (United States)

    Spagna, Alessandro; Curir, Anna; Giammaria, Marco; Lattanzi, Mario G.; Murante, Giuseppe; Re Fiorentin, Paola

    2018-04-01

    We have investigated the chemo-dynamical evolution of a Milky Way-like disk galaxy, AqC4, produced by a cosmological simulation integrating a sub-resolution ISM model. We find evidence of a global inside-out and upside-down disk evolution, consistent with a scenario in which the "thin disk" stars are formed from the accreted gas close to the galactic plane, while the older "thick disk" stars originated in situ at greater heights. Also, the bar appears to be the most effective heating mechanism in the inner disk. Finally, no significant metallicity-rotation correlation has been observed, in spite of the presence of a negative [Fe/H] radial gradient.

  9. Vibrational signatures of cation-anion hydrogen bonding in ionic liquids: a periodic density functional theory and molecular dynamics study.

    Science.gov (United States)

    Mondal, Anirban; Balasubramanian, Sundaram

    2015-02-05

    Hydrogen bonding in alkylammonium based protic ionic liquids was studied using density functional theory (DFT) and ab initio molecular dynamics (AIMD) simulations. Normal-mode analysis within the harmonic approximation and power spectra of velocity autocorrelation functions were used as tools to obtain the vibrational spectra in both the gas phase and the crystalline phases of these protic ionic liquids. The hydrogen bond vibrational modes were identified in the 150-240 cm⁻¹ region of the far-infrared (far-IR) spectra. A blue shift in the far-IR mode was observed with an increasing number of hydrogen-bonding sites on the cation; the exact peak position is modulated by the cation-anion hydrogen bond strength. Sub-100 cm⁻¹ bands in the far-IR spectrum are assigned to the rattling motion of the anions. Calculated NMR chemical shifts of the acidic protons in the crystalline phase of these salts also exhibit the signature of cation-anion hydrogen bonding.
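    The power-spectrum route used here is generic: the Fourier transform of the velocity autocorrelation function gives the vibrational density of states. A toy plain-Python sketch (synthetic single-mode trajectory standing in for AIMD velocity data, not anything from the paper) showing that a mode at frequency f0 appears as a spectral peak:

```python
import math

# Toy velocity trajectory with a single vibrational mode at frequency f0
# (in units of the sampling rate); stands in for AIMD velocity data.
N, f0 = 256, 0.125
v = [math.cos(2 * math.pi * f0 * i) for i in range(N)]

# Velocity autocorrelation function C(t) = <v(0) v(t)>, averaged over time origins.
nlag = N // 2
vacf = [sum(v[i] * v[i + t] for i in range(N - t)) / (N - t) for t in range(nlag)]

# Power spectrum = cosine transform of the VACF; a mode shows up as a peak.
spectrum = [
    sum(vacf[t] * math.cos(2 * math.pi * kk * t / nlag) for t in range(nlag))
    for kk in range(nlag // 2)
]
peak_bin = max(range(len(spectrum)), key=spectrum.__getitem__)
peak_freq = peak_bin / nlag           # recovers f0
```

    With real MD velocities one would mass-weight and sum the VACFs over atoms, and apply a window before transforming; the principle is unchanged.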

  10. Interplay of community dynamics, temperature, and productivity on the hydrogen isotope signatures of lipid biomarkers

    Directory of Open Access Journals (Sweden)

    S. N. Ladd

    2017-09-01

    The hydrogen isotopic composition (δ²H) of lipid biomarkers has diverse applications in the fields of paleoclimatology, biogeochemistry, and microbial community dynamics. Large changes in hydrogen isotope fractionation have been observed among microbes with differing core metabolisms, while environmental factors including temperature and nutrient availability can affect isotope fractionation by photoautotrophs. Much effort has gone into studying these effects under laboratory conditions with single-species cultures. Moving beyond controlled environments, quantifying the natural extent of these changes in freshwater lacustrine settings, and identifying their causes is essential for robust application of δ²H values of common short-chain fatty acids as a proxy of net community metabolism and of phytoplankton-specific biomarkers as a paleohydrologic proxy. This work targets the effect of community dynamics, temperature, and productivity on ²H/¹H fractionation in lipid biomarkers through a comparative time series in two central Swiss lakes: eutrophic Lake Greifen and oligotrophic Lake Lucerne. Particulate organic matter was collected from surface waters at six time points throughout the spring and summer of 2015, and δ²H values of short-chain fatty acids, as well as chlorophyll-derived phytol and the diatom biomarker brassicasterol, were measured. We paired these measurements with in situ incubations conducted with NaH¹³CO₃, which were used to calculate the production rates of individual lipids in lake surface water. As algal productivity increased from April to June, net discrimination against ²H in Lake Greifen increased by as much as 148 ‰ for individual fatty acids. During the same period in Lake Lucerne, net discrimination against ²H increased by as much as 58 ‰ for individual fatty acids. A large portion of this signal is likely due to a greater proportion of heterotrophically derived fatty acids in the winter and early

  11. Diversity of sharp-wave–ripple LFP signatures reveals differentiated brain-wide dynamical events

    Science.gov (United States)

    Ramirez-Villegas, Juan F.; Logothetis, Nikos K.; Besserve, Michel

    2015-01-01

    Sharp-wave–ripple (SPW-R) complexes are believed to mediate memory reactivation, transfer, and consolidation. However, their underlying neuronal dynamics at multiple scales remains poorly understood. Using concurrent hippocampal local field potential (LFP) recordings and functional MRI (fMRI), we study local changes in neuronal activity during SPW-R episodes and their brain-wide correlates. Analysis of the temporal alignment between SPW and ripple components reveals well-differentiated SPW-R subtypes in the CA1 LFP. SPW-R–triggered fMRI maps show that ripples aligned to the positive peak of their SPWs have enhanced neocortical metabolic up-regulation. In contrast, ripples occurring at the trough of their SPWs relate to weaker neocortical up-regulation and absent subcortical down-regulation, indicating differentiated involvement of neuromodulatory pathways in the ripple phenomenon mediated by long-range interactions. To our knowledge, this study provides the first evidence for the existence of SPW-R subtypes with differentiated CA1 activity and metabolic correlates in related brain areas, possibly serving different memory functions. PMID:26540729

  13. Optimal placement of excitations and sensors for verification of large dynamical systems

    Science.gov (United States)

    Salama, M.; Rose, T.; Garba, J.

    1987-01-01

    The computationally difficult problem of optimally placing excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic. Results of numerical experiments, including a square plate and a 960-degree-of-freedom Control of Flexible Structure (COFS) truss, are presented. Though the algorithm produces suboptimal solutions, its generality and simplicity allow the treatment of complex dynamical systems that would otherwise be difficult to handle.
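    The combinatorial search can be sketched with a toy stand-in. The Python below anneals a sensor subset with swap moves and geometric cooling; the sine mode shapes and the captured-modal-energy objective are illustrative assumptions, not the paper's measurement functional:

```python
import math
import random

random.seed(0)

# Toy stand-in for the placement problem: choose n_sens of n_dof candidate
# locations to maximize captured modal energy (sum of squared mode-shape
# entries at the sensors). Objective and mode shapes are illustrative only.
n_dof, n_modes, n_sens = 40, 5, 6
modes = [[math.sin(math.pi * (m + 1) * (i + 1) / (n_dof + 1))
          for m in range(n_modes)] for i in range(n_dof)]

def objective(sel):
    return sum(modes[i][m] ** 2 for i in sel for m in range(n_modes))

current = set(random.sample(range(n_dof), n_sens))
cur_val = objective(current)
best, best_val = set(current), cur_val
T = 1.0
for _ in range(2000):
    # Swap move: exchange one chosen location for an unchosen one.
    out = random.choice(sorted(current))
    inn = random.choice([i for i in range(n_dof) if i not in current])
    cand = (current - {out}) | {inn}
    cand_val = objective(cand)
    # Metropolis acceptance with geometric cooling of the temperature T.
    if cand_val > cur_val or random.random() < math.exp((cand_val - cur_val) / T):
        current, cur_val = cand, cand_val
        if cur_val > best_val:
            best, best_val = set(current), cur_val
    T *= 0.997
```

    Accepting occasional downhill swaps while the temperature is high is what lets the search escape local optima; the best-so-far subset is kept separately, so the result never degrades below the starting configuration.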

  14. Disparity changes in 370 Ma Devonian fossils: the signature of ecological dynamics?

    Science.gov (United States)

    Girard, Catherine; Renaud, Sabrina

    2012-01-01

    Early periods in Earth's history saw a progressive increase in the complexity of ecosystems, but also dramatic crises decimating the biosphere. Such patterns are usually considered as large-scale changes among supra-specific groups, including morphological novelties, radiations, and extinctions. Nevertheless, at the same time, each species evolved by way of micro-evolutionary processes, extended over millions of years into the evolution of lineages. How these two evolutionary scales interacted is a challenging issue because it requires bridging a gap between scales of observation and processes. The present study transfers a typical macro-evolutionary approach, namely disparity analysis, to the study of fine-scale evolutionary variations in order to decipher what processes actually drove the dynamics of diversity at a micro-evolutionary level. The Late Frasnian to Late Famennian period was selected because it is punctuated by two major macro-evolutionary crises, as well as a progressive diversification of the marine ecosystem. Disparity through this period was estimated on conodonts, tooth-like fossil remains of small eel-like predators that were part of the nektonic fauna. The study focused on the emblematic genus of the period, Palmatolepis. Strikingly, both crises affected an already impoverished Palmatolepis disparity, increasing the risk of random extinction. The major disparity signal instead emerged as a cycle of increase and decrease in disparity during the inter-crisis period. The diversification shortly followed the first crisis and might correspond to an opportunistic occupation of an empty ecological niche. The subsequent directional shrinking of morphospace occupation suggests that the ecological space available to Palmatolepis decreased through time, due to a combination of factors: deteriorating climate, expansion of competitors and predators. Disparity changes of Palmatolepis thus reflect changes in the structure of the ecological

  16. Dynamic modeling and verification of an energy-efficient greenhouse with an aquaponic system using TRNSYS

    Science.gov (United States)

    Amin, Majdi Talal

    Currently, there is no integrated dynamic simulation program for an energy-efficient greenhouse coupled with an aquaponic system. This research is intended to promote the thermal management of greenhouses in order to provide sustainable food production with the lowest possible energy use and material waste. A brief introduction to greenhouses, passive houses, energy efficiency, renewable energy systems, and their applications is included for ready reference. An experimental, working, scaled-down energy-efficient greenhouse was built to verify and calibrate the results of a dynamic simulation model made using TRNSYS software; TRNSYS requires the aid of Google SketchUp to develop the 3D building geometry. The simulation model was built following the passive house standard as closely as possible, and was then utilized to design an actual greenhouse with aquaponics. It was demonstrated that the passive house standard can be applied to improve upon conventional greenhouse performance, and that it is adaptable to different climates. The energy-efficient greenhouse provides the required thermal environment for fish and plant growth, while eliminating the need for conventional cooling and heating systems.

  17. Thermal dynamic behavior during selective laser melting of K418 superalloy: numerical simulation and experimental verification

    Science.gov (United States)

    Chen, Zhen; Xiang, Yu; Wei, Zhengying; Wei, Pei; Lu, Bingheng; Zhang, Lijuan; Du, Jun

    2018-04-01

    During selective laser melting (SLM) of K418 powder, the influence of the process parameters, such as laser power P and scanning speed v, on the dynamic thermal behavior and morphology of the melted tracks was investigated numerically. A 3D finite difference method was established to predict the dynamic thermal behavior and flow mechanism of K418 powder irradiated by a Gaussian laser beam. A three-dimensional, randomly packed powder bed composed of spherical particles was generated by the discrete element method, taking powder particle information such as particle size distribution and packing density into account. Volume shrinkage and temperature-dependent thermophysical parameters such as thermal conductivity, specific heat, and other physical properties were also considered. The volume of fluid method was applied to reconstruct the free surface of the molten pool during SLM. The geometrical features, continuity boundaries, and irregularities of the molten pool were shown to be largely determined by the laser energy density. The numerical results are in good agreement with the experiments, showing the model to be reasonable and effective. The results provide in-depth insight into the complex physical behavior during SLM and guide the optimization of process parameters.
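    The heart of such a model is an explicit finite-difference update of the heat equation with a moving Gaussian laser source. A 1-D Python sketch with illustrative material and process constants (all values assumed, not the paper's; the paper's model is 3-D, with a DEM powder bed, temperature-dependent properties, and volume-of-fluid free-surface tracking):

```python
import math

# 1-D explicit finite-difference sketch of laser heating with a moving
# Gaussian source. All constants are illustrative assumptions.
k, rho, cp = 10.0, 8000.0, 500.0      # conductivity, density, specific heat
alpha = k / (rho * cp)                # thermal diffusivity (m^2/s)
dx, nx = 5e-6, 200                    # 5 um grid, 1 mm track length
dt = 0.4 * dx * dx / (2 * alpha)      # explicit stability: dt <= dx^2 / (2 alpha)
P, r0, v = 200.0, 50e-6, 1.0          # laser power (W), spot radius (m), scan speed (m/s)
depth = 30e-6                         # assumed absorption depth (m)

T = [300.0] * nx                      # ambient initial temperature (K)
t = 0.0
for _ in range(400):
    x0 = v * t                        # current beam-centre position
    Tn = T[:]
    for i in range(1, nx - 1):
        x = i * dx
        q = (2 * P / (math.pi * r0 ** 2)) * math.exp(-2 * ((x - x0) / r0) ** 2)
        Tn[i] = T[i] + dt * (alpha * (T[i + 1] - 2 * T[i] + T[i - 1]) / dx ** 2
                             + q / (rho * cp * depth))
    T, t = Tn, t + dt
peak_T = max(T)                       # a melt track forms where T exceeds liquidus
```

    The time step is deliberately set below the explicit stability limit; in 1-D the lack of lateral conduction makes the peak temperature unrealistically high, which is one reason the full model must be 3-D.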

  18. Organic matter dynamics and stable isotope signature as tracers of the sources of suspended sediment

    Directory of Open Access Journals (Sweden)

    Y. Schindler Wildhaber

    2012-06-01

    Full Text Available Suspended sediment (SS) and organic matter in rivers can harm brown trout Salmo trutta by affecting the health and fitness of free-swimming fish and by causing siltation of the riverbed. The temporal and spatial dynamics of sediment, carbon (C), and nitrogen (N) during the brown trout spawning season in a small river of the Swiss Plateau were assessed, and C isotopes as well as the C/N atomic ratio were used to distinguish autochthonous and allochthonous sources of organic matter in SS loads. The Visual Basic program IsoSource, with δ13Ctot and δ15N as input isotopes, was used to quantify the temporal and spatial sources of SS. Organic matter concentrations in the infiltrated and suspended sediment were highest during low-flow periods with small sediment loads and lowest during high-flow periods with high sediment loads. Peak values in nitrate and dissolved organic C were measured during high flow and high rainfall, probably due to leaching from pasture and arable land. The organic matter was from allochthonous sources, as indicated by the C/N atomic ratio and δ13Corg. Organic matter in SS increased from up- to downstream due to an increase of pasture and arable land downstream along the river. The mean fraction of SS originating from upper-watershed riverbed sediment decreased from up- to downstream and increased during high flow at all measuring sites along the course of the river. During base flow conditions, the major sources of SS are pasture, forest and arable land. The latter increased during rainy and warmer winter periods, most likely because both triggered snow melt and thus erosion. The measured increase in DOC and nitrate concentrations during high flow supports these modeling results. Enhanced soil erosion processes on pasture and arable land are expected with increasing heavy rain events and less snow during winter seasons due to climate change. Consequently, SS and organic
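    The source-partitioning idea behind IsoSource reduces, in the two-source case, to a linear end-member mixing equation. A minimal sketch, with hypothetical δ13C values chosen only for illustration:

```python
def mixing_fraction(delta_mix, delta_a, delta_b):
    """Fraction of source A in a two-source mixture from one isotope ratio:
    delta_mix = f * delta_a + (1 - f) * delta_b, solved for f."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# Hypothetical delta-13C values (per mil): riverbed sediment (A) vs. pasture soil (B)
f_riverbed = mixing_fraction(delta_mix=-27.0, delta_a=-26.0, delta_b=-28.0)
print(f_riverbed)  # 0.5: half of the suspended sediment attributed to source A
```

    IsoSource itself enumerates feasible source combinations when there are more sources than isotope systems; this closed form covers only the fully determined two-source case.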

  19. Parton Theory of Magnetic Polarons: Mesonic Resonances and Signatures in Dynamics

    Science.gov (United States)

    Grusdt, F.; Kánasz-Nagy, M.; Bohrdt, A.; Chiu, C. S.; Ji, G.; Greiner, M.; Greif, D.; Demler, E.

    2018-01-01

    When a mobile hole is moving in an antiferromagnet it distorts the surrounding Néel order and forms a magnetic polaron. Such interplay between hole motion and antiferromagnetism is believed to be at the heart of high-temperature superconductivity in cuprates. In this article, we study a single hole described by the t-Jz model with Ising interactions between the spins in two dimensions. This situation can be experimentally realized in quantum gas microscopes with Mott insulators of Rydberg-dressed bosons or fermions, or using polar molecules. We work at strong couplings, where hole hopping is much larger than couplings between the spins. In this regime we find strong theoretical evidence that magnetic polarons can be understood as bound states of two partons, a spinon and a holon carrying spin and charge quantum numbers, respectively. Starting from first principles, we introduce a microscopic parton description which is benchmarked by comparison with results from advanced numerical simulations. Using this parton theory, we predict a series of excited states that are invisible in the spectral function and correspond to rotational excitations of the spinon-holon pair. This is reminiscent of mesonic resonances observed in high-energy physics, which can be understood as rotating quark-antiquark pairs carrying orbital angular momentum. Moreover, we apply the strong-coupling parton theory to study far-from-equilibrium dynamics of magnetic polarons observable in current experiments with ultracold atoms. Our work supports earlier ideas that partons in a confining phase of matter represent a useful paradigm in condensed-matter physics and in the context of high-temperature superconductivity in particular. While direct observations of spinons and holons in real space are impossible in traditional solid-state experiments, quantum gas microscopes provide a new experimental toolbox.
We show that, using this platform, direct observations of partons in and out of equilibrium are

  20. Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach

    Directory of Open Access Journals (Sweden)

    Xu Zhi

    2018-01-01

    Full Text Available Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks have been limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed that is based on the symbolic model checking tool NuSMV. A dual PID system is used as an example system, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.

  1. Dynamic CT myocardial perfusion imaging: detection of ischemia in a porcine model with FFR verification

    Science.gov (United States)

    Fahmi, Rachid; Eck, Brendan L.; Vembar, Mani; Bezerra, Hiram G.; Wilson, David L.

    2014-03-01

    Dynamic cardiac CT perfusion (CTP) is a high-resolution, non-invasive technique for assessing myocardial blood flow (MBF), which in concert with coronary CT angiography enables CT to provide a unique, comprehensive, fast analysis of both coronary anatomy and functional flow. We assessed perfusion in a porcine model with and without coronary occlusion. To induce occlusion, each animal underwent left anterior descending (LAD) stent implantation and angioplasty balloon insertion. The normal flow condition was obtained with the balloon completely deflated. Partial occlusion was induced by balloon inflation against the stent, with FFR used to assess the extent of occlusion. Prospective ECG-triggered partial scan images were acquired at end systole (45% R-R) using a multi-detector CT (MDCT) scanner. Images were reconstructed using FBP and a hybrid iterative reconstruction (iDose4, Philips Healthcare). Processing included: beam hardening (BH) correction, registration of image volumes using 3D cubic B-spline normalized mutual information, and spatio-temporal bilateral filtering to reduce partial scan artifacts and noise variation. Absolute blood flow was calculated with a deconvolution-based approach using singular value decomposition (SVD). The arterial input function was estimated from the left ventricle (LV) cavity. Regions of interest (ROIs) were identified in healthy and ischemic myocardium and compared in normal and occluded conditions. Under-perfusion was detected in the correct LAD territory and flow reduction agreed well with FFR measurements. Flow was reduced, on average, in LAD territories by 54%.
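    The SVD deconvolution step can be sketched as a truncated-SVD solve of the discrete convolution relation tissue = A·r, where A is a lower-triangular Toeplitz matrix built from the arterial input function. The threshold and names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def mbf_svd(aif, tissue, dt, sv_thresh=0.2):
    """Truncated-SVD deconvolution: tissue(t) = dt * (A @ r), with A a
    lower-triangular Toeplitz matrix built from the arterial input
    function (AIF). The flow estimate is the maximum of the recovered
    flow-scaled residue function r."""
    n = len(aif)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, :i + 1] = aif[i::-1]       # A[i, j] = aif[i - j]
    A *= dt
    U, s, Vt = np.linalg.svd(A)
    # Regularize by zeroing singular values below a fraction of the largest
    s_inv = np.where(s > sv_thresh * s.max(), 1.0 / s, 0.0)
    r = Vt.T @ (s_inv * (U.T @ tissue))  # flow-scaled residue function
    return r.max()                       # MBF estimate
```

    In practice the threshold trades noise suppression against underestimation of flow, which is why clinical implementations tune it carefully.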

  2. Revocable identity-based proxy re-signature against signing key exposure.

    Science.gov (United States)

    Yang, Xiaodong; Chen, Chunlin; Ma, Tingchun; Wang, Jinli; Wang, Caifen

    2018-01-01

    Identity-based proxy re-signature (IDPRS) is a novel cryptographic primitive that allows a semi-trusted proxy to convert a signature under one identity into another signature under another identity on the same message by using a re-signature key. Due to this transformation function, IDPRS is very useful in constructing privacy-preserving schemes for various information systems. Key revocation functionality is important in practical IDPRS for managing users dynamically; however, the existing IDPRS schemes do not provide revocation mechanisms that allow the removal of misbehaving or compromised users from the system. In this paper, we first introduce a notion called revocable identity-based proxy re-signature (RIDPRS) to achieve the revocation functionality. We provide a formal definition of RIDPRS as well as its security model. Then, we present a concrete RIDPRS scheme that can resist signing key exposure and prove that the proposed scheme is existentially unforgeable against adaptive chosen identity and message attacks in the standard model. To further improve the performance of signature verification in RIDPRS, we introduce a notion called server-aided revocable identity-based proxy re-signature (SA-RIDPRS). Moreover, we extend the proposed RIDPRS scheme to the SA-RIDPRS scheme and prove that this extended scheme is secure against adaptive chosen message and collusion attacks. The analysis results show that our two schemes remain efficient in terms of computational complexity when implementing user revocation procedures. In particular, in the SA-RIDPRS scheme, the verifier needs to perform only a bilinear pairing and four exponentiation operations to verify the validity of the signature. Compared with other IDPRS schemes in the standard model, our SA-RIDPRS scheme greatly reduces the computation overhead of verification.

  3. TU-CD-304-03: Dosimetric Verification and Preliminary Comparison of Dynamic Wave Arc for SBRT Treatments

    Energy Technology Data Exchange (ETDEWEB)

    Burghelea, M [UZ BRUSSEL, Brussels (Belgium); BRAINLAB AG, Munich (Germany); Babes Bolyai University, Cluj-Napoca (Romania); Poels, K; Gevaert, T; Tournel, K; Dhont, J; De Ridder, M; Verellen, D [UZ BRUSSEL, Brussels (Belgium); Hung, C [BRAINLAB AG, Munich (Germany); Eriksson, K [RAYSEARCH LABORATORIES AB, Stockholm (Sweden); Simon, V [Babes Bolyai University, Cluj-Napoca (Romania)

    2015-06-15

    Purpose: To evaluate the potential dosimetric benefits and verify the delivery accuracy of Dynamic Wave Arc, a novel treatment delivery approach for the Vero SBRT system. Methods: Dynamic Wave Arc (DWA) combines simultaneous movement of gantry/ring with inverse planning optimization, resulting in an uninterrupted non-coplanar arc delivery technique. Thirteen complex SBRT cases previously treated with 8–10 conformal static beams (CRT) were evaluated in this study. Eight primary centrally-located NSCLC (prescription dose 4×12Gy or 8×7.5Gy) and five oligometastatic cases (2×2 lesions, 10×5Gy) were selected. DWA and coplanar VMAT plans, partially with dual arcs, were generated for each patient using identical objective functions for target volumes and OARs on the same TPS (RayStation, RaySearch Laboratories). Dosimetric differences and delivery time among these three planning schemes were evaluated. The DWA delivery accuracy was assessed using the Delta4 diode array phantom (ScandiDos AB). The gamma analysis was performed with the 3%/3mm dose and distance-to-agreement criteria. Results: The target conformity for CRT, VMAT and DWA was 0.95±0.07, 0.96±0.04 and 0.97±0.04, while the low-dose spillage gradient was 5.52±1.36, 5.44±1.11, and 5.09±0.98, respectively. Overall, the bronchus, esophagus and spinal cord maximum doses were similar between VMAT and DWA, but greatly reduced compared with CRT. For the lung cases, the mean dose and V20Gy were lower for the arc techniques compared with CRT, while for the liver cases, the mean dose and the V30Gy presented slightly higher values. The average delivery times of VMAT and DWA were 2.46±1.10 min and 4.25±1.67 min, with VMAT presenting the shorter treatment time in all cases. The DWA dosimetric verification presented an average gamma index passing rate of 95.73±1.54% (range 94.2%–99.8%). Conclusion: Our preliminary data indicated that the DWA is deliverable with clinically acceptable accuracy and has the potential to
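    A toy one-dimensional version of the 3%/3mm gamma analysis used for the Delta4 verification can be sketched as follows; clinical tools search in 3D with sub-voxel interpolation and local/global normalization options, so this is only the core metric, with global normalization assumed:

```python
import numpy as np

def gamma_pass_rate(ref, evald, dx, dose_tol=0.03, dist_tol=3.0):
    """1D global gamma analysis: for each reference point, minimize the
    combined dose-difference / distance-to-agreement metric over the
    evaluated profile. A point passes when gamma <= 1.
    ref, evald: dose profiles on the same grid; dx: spacing in mm."""
    positions = np.arange(len(ref)) * dx
    norm = dose_tol * ref.max()          # global dose normalization
    gammas = []
    for i, d_ref in enumerate(ref):
        dd = (evald - d_ref) / norm                  # dose term
        dr = (positions - positions[i]) / dist_tol   # distance term
        gammas.append(np.sqrt(dd**2 + dr**2).min())
    return 100.0 * np.mean(np.array(gammas) <= 1.0)
```

    A passing rate near 100%, as reported above, means nearly every measured point finds an agreeing dose within the combined 3%/3mm tolerance ellipse.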

  4. TU-CD-304-03: Dosimetric Verification and Preliminary Comparison of Dynamic Wave Arc for SBRT Treatments

    International Nuclear Information System (INIS)

    Burghelea, M; Poels, K; Gevaert, T; Tournel, K; Dhont, J; De Ridder, M; Verellen, D; Hung, C; Eriksson, K; Simon, V

    2015-01-01

    Purpose: To evaluate the potential dosimetric benefits and verify the delivery accuracy of Dynamic Wave Arc, a novel treatment delivery approach for the Vero SBRT system. Methods: Dynamic Wave Arc (DWA) combines simultaneous movement of gantry/ring with inverse planning optimization, resulting in an uninterrupted non-coplanar arc delivery technique. Thirteen complex SBRT cases previously treated with 8–10 conformal static beams (CRT) were evaluated in this study. Eight primary centrally-located NSCLC (prescription dose 4×12Gy or 8×7.5Gy) and five oligometastatic cases (2×2 lesions, 10×5Gy) were selected. DWA and coplanar VMAT plans, partially with dual arcs, were generated for each patient using identical objective functions for target volumes and OARs on the same TPS (RayStation, RaySearch Laboratories). Dosimetric differences and delivery time among these three planning schemes were evaluated. The DWA delivery accuracy was assessed using the Delta4 diode array phantom (ScandiDos AB). The gamma analysis was performed with the 3%/3mm dose and distance-to-agreement criteria. Results: The target conformity for CRT, VMAT and DWA was 0.95±0.07, 0.96±0.04 and 0.97±0.04, while the low-dose spillage gradient was 5.52±1.36, 5.44±1.11, and 5.09±0.98, respectively. Overall, the bronchus, esophagus and spinal cord maximum doses were similar between VMAT and DWA, but greatly reduced compared with CRT. For the lung cases, the mean dose and V20Gy were lower for the arc techniques compared with CRT, while for the liver cases, the mean dose and the V30Gy presented slightly higher values. The average delivery times of VMAT and DWA were 2.46±1.10 min and 4.25±1.67 min, with VMAT presenting the shorter treatment time in all cases. The DWA dosimetric verification presented an average gamma index passing rate of 95.73±1.54% (range 94.2%–99.8%). Conclusion: Our preliminary data indicated that the DWA is deliverable with clinically acceptable accuracy and has the potential to

  5. Verification of the Solar Dynamics Observatory High Gain Antenna Pointing Algorithm Using Flight Data

    Science.gov (United States)

    Bourkland, Kristin L.; Liu, Kuo-Chia

    2011-01-01

    The Solar Dynamics Observatory (SDO) is a NASA spacecraft designed to study the Sun. It was launched on February 11, 2010 into a geosynchronous orbit, and uses a suite of attitude sensors and actuators to finely point the spacecraft at the Sun. SDO has three science instruments: the Atmospheric Imaging Assembly (AIA), the Helioseismic and Magnetic Imager (HMI), and the Extreme Ultraviolet Variability Experiment (EVE). SDO uses two High Gain Antennas (HGAs) to send science data to a dedicated ground station in White Sands, New Mexico. In order to meet the science data capture budget, the HGAs must be able to transmit data to the ground for a very large percentage of the time. Each HGA is a dual-axis antenna driven by stepper motors. Both antennas transmit data at all times, but only a single antenna is required in order to meet the transmission rate requirement. For portions of the year, one antenna or the other has an unobstructed view of the White Sands ground station. During other periods, however, the view from both antennas to the Earth is blocked for different portions of the day. During these times of blockage, the two HGAs take turns pointing to White Sands, with the other antenna pointing out to space. The HGAs handover White Sands transmission responsibilities to the unblocked antenna. There are two handover seasons per year, each lasting about 72 days, where the antennas hand off control every twelve hours. The non-tracking antenna slews back to the ground station by following a ground commanded trajectory and arrives approximately 5 minutes before the formerly tracking antenna slews away to point out into space. The SDO Attitude Control System (ACS) runs at 5 Hz, and the HGA Gimbal Control Electronics (GCE) run at 200 Hz. There are 40 opportunities for the gimbals to step each ACS cycle, with a hardware limitation of no more than one step every three GCE cycles. 
The ACS calculates the desired gimbal motion for tracking the ground station or for slewing
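    The stepping arithmetic in this record (a 200 Hz GCE against a 5 Hz ACS gives 40 step opportunities per ACS cycle, with a hardware limit of one step every three GCE cycles) can be illustrated with a toy greedy rate limiter. The function and names are hypothetical, not flight software:

```python
GCE_HZ, ACS_HZ = 200, 5
OPPORTUNITIES = GCE_HZ // ACS_HZ  # 40 gimbal step opportunities per ACS cycle

def schedule_steps(requested, min_gap=3, slots=OPPORTUNITIES):
    """Greedy rate limiter: fire at most one step every `min_gap` GCE
    cycles, within one ACS cycle of `slots` opportunities. Returns how
    many of the requested steps could actually be commanded."""
    fired, last = 0, -min_gap
    for slot in range(slots):
        if fired < requested and slot - last >= min_gap:
            fired += 1
            last = slot
    return fired
```

    With these assumed numbers, at most 14 steps fit in a single ACS cycle (slots 0, 3, 6, ..., 39), which is the kind of actuator constraint the pointing algorithm must respect when tracking or slewing.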

  6. An online handwritten signature verification system based ...

    African Journals Online (AJOL)

    Administrateur

    systems. The problem of cursive handwritten signature verification can be approached in two main ways, one probabilistic (analytical) and the other structural. So, two methodologies ... do the classification is presented in [11]. The online signature ..... Automatic on-line signature verification based on multiple models ...

  7. Quantum multi-signature protocol based on teleportation

    International Nuclear Information System (INIS)

    Wen Xiao-jun; Liu Yun; Sun Yu

    2007-01-01

    In this paper, a protocol that can be used for multi-user quantum signatures is proposed. The scheme of signature and verification is based on the correlation of Greenberger-Horne-Zeilinger (GHZ) states and controlled quantum teleportation. Unlike digital signatures, which are based on computational complexity, the proposed protocol has perfect security in noiseless quantum channels. Compared to previous quantum signature schemes, this protocol can verify the signature independently of an arbitrator as well as realize multi-user signing together. (orig.)

  8. Signature-based store checking buffer

    Science.gov (United States)

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification, are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify the content is error-free.
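    The mechanism can be approximated in software: each redundant computation hashes its stream of stores into a compact signature, and at the barrier only the signatures are compared rather than every stored value. A sketch with assumed names, not the patented hardware design:

```python
import hashlib

class StoreFingerprintBuffer:
    """Sketch of signature-based output checking: redundant computation
    instances stream their stores into running hashes; at a barrier the
    compact signatures are compared instead of the full store streams."""

    def __init__(self):
        self._hashes = {}

    def record(self, instance_id, address, value):
        """Fold one store (address, value) into the instance's signature."""
        h = self._hashes.setdefault(instance_id, hashlib.sha256())
        h.update(f"{address}:{value};".encode())

    def verify_at_barrier(self):
        """Content is considered error-free iff all signatures agree."""
        sigs = {h.hexdigest() for h in self._hashes.values()}
        return len(sigs) == 1
```

    Comparing fixed-size digests instead of full output streams is what makes the verification cheap; the trade-off is that a collision could mask an error, which is why a strong hash is used.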

  9. The impact of gyre dynamics on the mid-depth salinity signature of the eastern North Atlantic

    Science.gov (United States)

    Burkholder, K. C.; Lozier, M. S.

    2009-04-01

    The Mediterranean Overflow Water (MOW) is widely recognized for its role in establishing the mid-depth salinity signature of the subtropical North Atlantic. However, recent work has revealed an intermittent impact of MOW on the salinity signature of the eastern subpolar basin. This impact results from a temporally variable penetration of the northward flowing branch of the MOW past Porcupine Bank into the eastern subpolar basin. It has been shown that the salinity signature of the eastern subpolar basin, in particular the Rockall Trough, varies with the state of the North Atlantic Oscillation (NAO): during persistent periods of strong winds (high NAO index), when the subpolar front moves eastward, waters in the subpolar gyre block the northward flowing MOW, preventing its entry into the subpolar gyre. Conversely, during persistent periods of weak winds (low NAO index), the front moves westward, allowing MOW to penetrate north of Porcupine Bank and into the subpolar gyre. Here, we investigate the manner in which the spatial and temporal variability in the northward penetration of the MOW and the position of the eastern limb of the subpolar front affect the mid-depth property fields not only in the subpolar gyre, but in the subtropical gyre as well. Using approximately 55 years of historical hydrographic data and output from the 1/12° FLAME model, we analyze the temporal variability of salinity along the eastern boundary and compare this variability to the position of the subpolar front in both the observational record and the FLAME model. We conclude that when the zonal position of the subpolar front moves relatively far offshore and the MOW is able to penetrate to the north, high salinity anomalies are observed at high latitudes and low salinity anomalies are observed at low latitudes. Conversely, when the frontal position shifts to the east, the MOW (and thus, the high salinity signature) is blocked, resulting in a drop in salinity anomalies at high latitudes

  10. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered as the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, for stamps little work has been done to automatically verify their authenticity. In this paper, an approach for verification of color stamps ...... and copied stamps. Sensitivity and specificity of up to 95% could be obtained on a data set that is publicly available....

  11. Radiation signatures

    International Nuclear Information System (INIS)

    McGlynn, S.P.; Varma, M.N.

    1992-01-01

    A new concept for modelling radiation risk is proposed. This concept is based on the proposal that the spectrum of molecular lesions, which we dub "the radiation signature", can be used to identify the quality of the causal radiation. If the proposal concerning radiation signatures can be established then, in principle, both prospective and retrospective risk determination can be assessed on an individual basis. A major goal of biophysical modelling is to relate physical events such as ionization, excitation, etc. to the production of radiation carcinogenesis. A description of the physical events is provided by track structure. The track structure is determined by radiation quality, and it can be considered to be the "physical signature" of the radiation. Unfortunately, the uniqueness characteristics of this signature are dissipated in biological systems in ∼10⁻⁹ s. Nonetheless, it is our contention that this physical disturbance of the biological system eventuates later, at ∼10⁰ s, in molecular lesion spectra which also characterize the causal radiation. (author)

  12. Off-fault plasticity in three-dimensional dynamic rupture simulations using a modal Discontinuous Galerkin method on unstructured meshes: Implementation, verification, and application

    Science.gov (United States)

    Wollherr, Stephanie; Gabriel, Alice-Agnes; Uphoff, Carsten

    2018-05-01

    The dynamics and potential size of earthquakes depend crucially on rupture transfers between adjacent fault segments. To accurately describe earthquake source dynamics, numerical models can account for realistic fault geometries and rheologies such as nonlinear inelastic processes off the slip interface. We present implementation, verification, and application of off-fault Drucker-Prager plasticity in the open source software SeisSol (www.seissol.org). SeisSol is based on an arbitrary high-order derivative modal Discontinuous Galerkin (ADER-DG) method using unstructured, tetrahedral meshes specifically suited for complex geometries. Two implementation approaches are detailed, modelling plastic failure either employing sub-elemental quadrature points or switching to nodal basis coefficients. At fine fault discretizations the nodal basis approach is up to 6 times more efficient in terms of computational costs while yielding comparable accuracy. Both methods are verified in community benchmark problems and by three-dimensional numerical h- and p-refinement studies with heterogeneous initial stresses. We observe no spectral convergence for on-fault quantities with respect to a given reference solution, but rather discuss a limitation to low-order convergence for heterogeneous 3D dynamic rupture problems. For simulations including plasticity, a high fault resolution may be less crucial than commonly assumed, due to the regularization of peak slip rate and an increase of the minimum cohesive zone width. In large-scale dynamic rupture simulations based on the 1992 Landers earthquake, we observe high rupture complexity including reverse slip, direct branching, and dynamic triggering. The spatio-temporal distribution of rupture transfers is altered distinctively by plastic energy absorption, correlated with locations of geometrical fault complexity. Computational cost increases by 7% when accounting for off-fault plasticity in the demonstrated application. Our results

  13. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  14. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  15. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods differ both in their operating process and in the way they achieve their result. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, it discusses and describes the deductive method and model-checking methods. The pros and cons of each particular method are emphasized, and a classification of testing techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and also gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, and some kinds of tools are considered which can be applied to the software when using the methods of dynamic analysis. Based on this work a conclusion is drawn, which describes the most relevant problems of the analysis techniques, methods of their solution and
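    Of the dynamic methods listed (testing, monitoring, profiling), a minimal monitoring example records which lines of a function actually execute, the mechanism underlying coverage and profiling tools. A sketch using CPython's sys.settrace hook; the helper names are illustrative:

```python
import sys

def trace_lines(func, *args):
    """Minimal dynamic-analysis monitor: run `func` and record which of
    its source lines execute (as offsets from the `def` line)."""
    executed = set()
    code = func.__code__

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            executed.add(frame.f_lineno - code.co_firstlineno)
        return tracer  # keep tracing inside this frame

    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)
    return result, executed

def absval(x):           # offset 0 (def line)
    if x < 0:            # offset 1
        return -x        # offset 2
    return x             # offset 3
```

    Running trace_lines(absval, 5) reveals that the negative branch never executed, exactly the kind of observation that drives test-adequacy criteria in dynamic verification.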

  16. Signatures of collective electron dynamics in the angular distributions of electrons ejected during ultrashort laser pulse interactions with C+

    International Nuclear Information System (INIS)

    Lysaght, M A; Hutchinson, S; Van der Hart, H W

    2009-01-01

    We use the time-dependent R-matrix approach to investigate an ultrashort pump-probe scheme to observe collective electron dynamics in C+ driven by the repulsion of two equivalent p electrons. By studying the two-dimensional momentum distributions of the ejected electron as a function of the time delay between an ultrashort pump pulse and an ionizing ultrashort probe pulse, it is possible to track the collective dynamics inside the C+ ion in the time domain.

  17. Spatiotemporal dynamics of the brain at rest--exploring EEG microstates as electrophysiological signatures of BOLD resting state networks.

    Science.gov (United States)

    Yuan, Han; Zotev, Vadim; Phillips, Raquel; Drevets, Wayne C; Bodurka, Jerzy

    2012-05-01

    Neuroimaging research suggests that the resting cerebral physiology is characterized by complex patterns of neuronal activity in widely distributed functional networks. As studied using functional magnetic resonance imaging (fMRI) of the blood-oxygenation-level dependent (BOLD) signal, the resting brain activity is associated with slowly fluctuating hemodynamic signals (~10s). More recently, multimodal functional imaging studies involving simultaneous acquisition of BOLD-fMRI and electroencephalography (EEG) data have suggested that the relatively slow hemodynamic fluctuations of some resting state networks (RSNs) evinced in the BOLD data are related to much faster (~100 ms) transient brain states reflected in EEG signals, that are referred to as "microstates". To further elucidate the relationship between microstates and RSNs, we developed a fully data-driven approach that combines information from simultaneously recorded, high-density EEG and BOLD-fMRI data. Using independent component analysis (ICA) of the combined EEG and fMRI data, we identified thirteen microstates and ten RSNs that are organized independently in their temporal and spatial characteristics, respectively. We hypothesized that the intrinsic brain networks that are active at rest would be reflected in both the EEG data and the fMRI data. To test this hypothesis, the rapid fluctuations associated with each microstate were correlated with the BOLD-fMRI signal associated with each RSN. We found that each RSN was characterized further by a specific electrophysiological signature involving from one to a combination of several microstates. Moreover, by comparing the time course of EEG microstates to that of the whole-brain BOLD signal, on a multi-subject group level, we unraveled for the first time a set of microstate-associated networks that correspond to a range of previously described RSNs, including visual, sensorimotor, auditory, attention, frontal, visceromotor and default mode networks. These
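    The microstate-to-RSN comparison rests on convolving a fast EEG-derived time course with a hemodynamic response function (HRF) and correlating the result with the slow BOLD signal of a network. A sketch with an assumed double-gamma HRF shape, not the authors' ICA pipeline:

```python
import numpy as np
from math import gamma

def hrf(t, p=6.0, q=16.0):
    """Rough canonical double-gamma HRF (assumed shape, unnormalized):
    an early positive gamma lobe minus a scaled late undershoot."""
    pos = t**(p - 1) * np.exp(-t) / gamma(p)
    neg = t**(q - 1) * np.exp(-t) / gamma(q) / 6.0
    return pos - neg

def microstate_rsn_correlation(microstate_ts, bold_ts, dt):
    """Convolve a fast microstate occurrence time course with the HRF and
    correlate it with an RSN's BOLD time course (same sampling grid)."""
    t = np.arange(0, 30, dt)  # 30 s HRF support
    regressor = np.convolve(microstate_ts, hrf(t))[: len(microstate_ts)]
    return np.corrcoef(regressor, bold_ts)[0, 1]
```

    A high correlation for a given (microstate, RSN) pair is what identifies the microstate as part of that network's electrophysiological signature.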

  18. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

Full Text Available In order to verify data integrity in a mobile multicloud computing environment, a mobile multicloud data integrity verification (MMCDIV) scheme is proposed. First, the computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on homomorphic verifiable response (HVR) with random masking and a sequence-enforced Merkle hash tree (sMHT) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified with lightweight computation and low data transmission. The scheme compensates for the limited communication and computing power of mobile devices, supports dynamic data operations in a mobile multicloud environment, and verifies data integrity without directly using the source file blocks. Experimental results also demonstrate that the scheme achieves low computing and communication costs.
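The sMHT construction above builds on the standard Merkle hash tree, which lets a verifier check a single data block against a compact root hash without fetching the whole file. The following is only a minimal sketch of that underlying idea (plain SHA-256 and an ordinary, not sequence-enforced, tree; no BLS signatures or random masking), with all names chosen for illustration:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """Root of a Merkle hash tree over a list of data blocks."""
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(blocks, index):
    """Sibling hashes needed to verify block `index` against the root."""
    level = [h(b) for b in blocks]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], sib < index))   # (hash, sibling-is-left?)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify_block(block, proof, root):
    """Recompute the root from one block plus its proof path."""
    node = h(block)
    for sib_hash, sib_is_left in proof:
        node = h(sib_hash + node) if sib_is_left else h(node + sib_hash)
    return node == root

blocks = [b"block-%d" % i for i in range(5)]
root = merkle_root(blocks)
assert verify_block(blocks[2], merkle_proof(blocks, 2), root)
assert not verify_block(b"tampered", merkle_proof(blocks, 2), root)
```

The proof is O(log n) hashes per block, which is why Merkle-tree variants are popular for integrity checks on bandwidth-constrained mobile devices.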

  19. Design and verification of a simple 3D dynamic model of speed skating which mimics observed forces and motions.

    Science.gov (United States)

    van der Kruk, E; Veeger, H E J; van der Helm, F C T; Schwab, A L

    2017-11-07

Advice about the optimal coordination pattern for an individual speed skater could be derived by simulation and optimization of a biomechanical speed skating model. But before turning to this optimization approach, one needs a model that can reasonably match observed behaviour. The objective of this study is therefore to present a verified three-dimensional inverse skater model of minimal complexity, which models the speed skating motion on the straights. The model simulates the transverse translation of the skater's upper body together with the forces exerted by the skates on the ice. The input of the model is the changing distance between the upper body and the skate, referred to as the leg extension (Euclidean distance in 3D space). Verification shows that the model mimics the observed forces and motions well. The model is most accurate for the position and velocity estimation (maximum residuals of 1.2% and 2.9%, respectively) and least accurate for the force estimation (underestimation of 4.5-10%). The model can be used to further investigate variables in the skating motion. For this, the input of the model, the leg extension, can be optimized to obtain a maximal forward velocity of the upper body. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
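The model input described above, the leg extension, is simply the 3D Euclidean distance between the upper body and the skate. A brief sketch with synthetic, assumed trajectories (the actual study uses measured kinematics; all numbers here are illustrative) shows how that input time series could be computed from sampled positions:

```python
import numpy as np

# Hypothetical sampled positions (m) over one 1 s stroke at an assumed 100 Hz.
t = np.linspace(0.0, 1.0, 101)
upper_body = np.column_stack([
    2.0 * t,                          # forward progress
    0.05 * np.sin(2 * np.pi * t),     # small lateral sway
    1.1 + 0.1 * np.cos(2 * np.pi * t) # hip height variation
])
skate = np.column_stack([
    2.0 * t + 0.3 * np.sin(np.pi * t),  # skate ahead/behind the body
    0.2 * np.sin(2 * np.pi * t),        # sideways push-off
    np.zeros_like(t)                    # skate stays on the ice plane
])

# The model input u(t): Euclidean distance body-to-skate, and its rate.
leg_extension = np.linalg.norm(upper_body - skate, axis=1)
leg_extension_rate = np.gradient(leg_extension, t)

print(leg_extension.min(), leg_extension.max())
```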

  20. Quantum signature scheme for known quantum messages

    International Nuclear Information System (INIS)

    Kim, Taewan; Lee, Hyang-Sook

    2015-01-01

    When we want to sign a quantum message that we create, we can use arbitrated quantum signature schemes which are possible to sign for not only known quantum messages but also unknown quantum messages. However, since the arbitrated quantum signature schemes need the help of a trusted arbitrator in each verification of the signature, it is known that the schemes are not convenient in practical use. If we consider only known quantum messages such as the above situation, there can exist a quantum signature scheme with more efficient structure. In this paper, we present a new quantum signature scheme for known quantum messages without the help of an arbitrator. Differing from arbitrated quantum signature schemes based on the quantum one-time pad with the symmetric key, since our scheme is based on quantum public-key cryptosystems, the validity of the signature can be verified by a receiver without the help of an arbitrator. Moreover, we show that our scheme provides the functions of quantum message integrity, user authentication and non-repudiation of the origin as in digital signature schemes. (paper)

  1. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  2. Signatures of a quantum dynamical phase transition in a three-spin system in presence of a spin environment

    International Nuclear Information System (INIS)

    Alvarez, Gonzalo A.; Levstein, Patricia R.; Pastawski, Horacio M.

    2007-01-01

We have observed an environmentally induced quantum dynamical phase transition in the dynamics of a two-spin experimental swapping gate [G.A. Alvarez, E.P. Danieli, P.R. Levstein, H.M. Pastawski, J. Chem. Phys. 124 (2006) 194507]. There, the exchange of the coupled states |↑,↓⟩ and |↓,↑⟩ gives an oscillation with a Rabi frequency b/ℏ, where b is the spin-spin coupling. The interaction ℏ/τ_SE with a spin-bath degrades the oscillation with a characteristic decoherence time. We showed that the swapping regime is restricted to bτ_SE ≳ ℏ. However, beyond a critical interaction with the environment the swapping freezes and the system enters a quantum Zeno dynamical phase where relaxation decreases as the coupling with the environment increases. Here, we solve the quantum dynamics of a two-spin system coupled to a spin-bath within a Liouville-von Neumann quantum master equation and compare the results with our previous work within the Keldysh formalism. Then, we extend the model to a three-interacting-spin system in which only one spin is coupled to the environment. Beyond a critical interaction the two spins not coupled to the environment oscillate with the bare Rabi frequency and relax more slowly. This effect is more pronounced when the anisotropy of the system-environment (SE) interaction goes from a purely XY to an Ising interaction form.
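The degraded swapping oscillation described above can be caricatured numerically as a damped Rabi oscillation. This is only an illustrative toy (ℏ = 1, an assumed simple exponential envelope; it is not the Liouville-von Neumann or Keldysh solution of the paper, and it does not capture the Zeno freezing beyond the critical coupling):

```python
import numpy as np

b = 1.0                                # spin-spin coupling, sets Rabi frequency (hbar = 1)
t = np.linspace(0.0, 50.0, 2001)

for tau_SE in (20.0, 2.0):             # weak vs. strong system-environment coupling
    # Probability that |up,down> has swapped to |down,up>, with an assumed
    # exponential decoherence envelope exp(-t / tau_SE).
    p_swap = 0.5 * (1.0 - np.cos(b * t) * np.exp(-t / tau_SE))
    # The oscillation is visible while b * tau_SE is well above hbar.
    print(f"tau_SE={tau_SE}: oscillation visibility at t=10 is {np.exp(-10 / tau_SE):.3f}")
```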

  3. Establishment and verification of three-dimensional dynamic model for heavy-haul train-track coupled system

    Science.gov (United States)

    Liu, Pengfei; Zhai, Wanming; Wang, Kaiyun

    2016-11-01

For a long heavy-haul train, the basic principles of inter-vehicle interaction and train-track dynamic interaction are analysed first. Based on the theories of train longitudinal dynamics and vehicle-track coupled dynamics, a three-dimensional (3-D) dynamic model of the heavy-haul train-track coupled system is established through a modularised method. Specifically, this model includes subsystems such as the train control, the vehicles, the wheel-rail relation and the line geometries. For the calculation of the wheel-rail interaction force under driving or braking conditions, the large-creep phenomenon that may occur within the wheel-rail contact patch is considered. For the coupler and draft gear system, the coupler forces in three directions and the coupler lateral tilt angles in curves are calculated. Then, according to the characteristics of the long heavy-haul train, an efficient solving method is developed to improve the computational efficiency for such a large system, and the basic principles that must be followed to meet the required calculation accuracy are determined. Finally, the 3-D train-track coupled model is verified by comparing the calculated results with running test results. The comparison indicates that the proposed dynamic model simulates the dynamic performance of the heavy-haul train well.

  4. Theoretical calculations and experimental verification for the pumping effect caused by the dynamic micro-tapered angle

    Science.gov (United States)

    Cai, Yufei; Zhang, Jianhui; Zhu, Chunling; Huang, Jun; Jiang, Feng

    2016-05-01

The atomizer with micro cone apertures has the advantages of ultra-fine atomized droplets, low power consumption and low temperature rise. Current research on this kind of atomizer mainly focuses on its performance and applications, with less attention to the principle of the atomization. From an analysis of the dispenser and the deformation of its micro-tapered apertures, the volume changes during the deformation and vibration of a micro-tapered aperture on the dispenser are calculated by coordinate transformation. Based on the characteristics of the flow resistance in a cone aperture, it is found that the dynamic cone angle results from periodic changes of the volume of the micro-tapered aperture of the atomizer, and that this change drives one-way flows. In addition, an experimental atomization platform is established to measure the atomization rates at different resonance frequencies of the cone aperture atomizer. The atomization performances of cone-aperture and straight-aperture atomizers are also measured. The experimental results confirm the existence of the pumping effect of the dynamic tapered angle. This effect is relevant in industries that require low dispersion and micro- and nanoscale grain sizes, such as the production of high-pressure nozzles and inhalation therapy; strategies to minimize the pumping effect of the dynamic cone angle, or to exploit it in future designs, are therefore important concerns. This research proposes that the dynamic micro-tapered angle is an important cause of atomization in the atomizer with micro cone apertures.

  5. Signatures of fission dynamics in highly excited nuclei produced in 197AU(800 A MeV) on proton collisions

    International Nuclear Information System (INIS)

    Benlliure, J.; Armbruster, P.; Bernas, M.

    2001-09-01

197Au(800 A MeV)-on-proton collisions are used to investigate fission dynamics at high excitation energy. The kinematic properties, together with the isotopic identification of the fission fragments, allow the mass, charge and excitation energy of the fissioning nucleus at saddle to be determined. The comparison of these observables and the measured total fission cross section with model calculations evidences a clear hindrance of fission at high excitation energy that can be explained in terms of nuclear dissipation. Assuming statistical evaporation for the de-excitation channels other than fission, an estimated fission transient time of (3 ± 1) × 10⁻²¹ s is obtained. (orig.)

  6. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the workings of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  7. Identification of uranium signatures in swipe samples on verification of nuclear activities for nuclear safeguards purposes; Identificacao de assinaturas de uranio em amostras de esfregacos (swipe samples) para verificacao de atividades nucleares para fins de salvaguardas nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Pestana, Rafael Cardoso Baptistini

    2013-07-01

The use of environmental sampling for safeguards purposes has been applied by the International Atomic Energy Agency (IAEA) since 1996 and is routinely used as a complementary measure to strengthen traditional nuclear safeguards procedures. The aim is to verify that states signatory to safeguards agreements are not diverting their peaceful nuclear activities to undeclared nuclear activities. This work describes a new protocol for the collection and analysis of swipe samples for identification of nuclear signatures that may be related to the nuclear activities developed in the inspected facility. As a case study, this work used a real uranium conversion plant of the nuclear fuel cycle at IPEN. The proposed strategy uses different analytical techniques, such as alpha radiation measurement, SEM-EDX and ICP-MS, to identify signatures of uranium adhering to the swipe samples. In the swipe sample analysis, it was possible to identify particles of UO{sub 2}F{sub 2} and UF{sub 4} through morphological comparison and the semi-quantitative analyses performed by the SEM-EDX technique. Methods were used that yield the average isotopic composition of the sample, in which the enrichment ranged from 1.453 ± 0.023 % to 18.24 ± 0.15 % in the {sup 235}U isotope. Through these external collections, a non-intrusive sampling, it was possible to identify the handling of enriched material with enrichment from 1.453 ± 0.023 % to 6.331 ± 0.055 % in the {sup 235}U isotope, as well as the use of reprocessed material, through identification of the {sup 236}U isotope. The uncertainties obtained for the n({sup 235}U)/n({sup 238}U) ratio varied from 0.40 % to 0.86 % for the internal swipe samples. (author)
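As a brief aside on the reported n({sup 235}U)/n({sup 238}U) ratios: under a simplifying two-isotope approximation (neglecting {sup 234}U and {sup 236}U, which the study explicitly cannot do for reprocessed material), the atom ratio follows directly from the atom-percent enrichment. This sketch is an illustration of that arithmetic only, not the ICP-MS procedure of the work:

```python
def ratio_235_238(enrichment_percent: float) -> float:
    """Atom ratio n(235U)/n(238U) from atom-% enrichment, two-isotope approximation."""
    f235 = enrichment_percent / 100.0
    return f235 / (1.0 - f235)

# Natural uranium and the enrichment values reported in the study.
for e in (0.7204, 1.453, 6.331, 18.24):
    print(f"{e:7.4f} %  ->  n(235)/n(238) = {ratio_235_238(e):.5f}")
```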

  8. Development and verification of coupled fluid-structural dynamic codes for stress analysis of reactor vessel internals under blowdown loading

    International Nuclear Information System (INIS)

    Krieg, R.; Schlechtendahl, E.G.

    1977-01-01

YAQUIR has been applied to large PWR blowdown problems and compared with LECK results. The structural model of CYLDY2 and the fluid model of YAQUIR have been coupled in the code STRUYA. First tests with the fluid-dynamic systems code FLUST have been successful. The incompressible-fluid version of the 3D coupled code FLUX for the HDR geometry was checked against some analytical test cases and was used for evaluation of the eigenfrequencies of the coupled system. Several test cases were run with the two-phase flow code SOLA-DF with satisfactory results. Remarkable agreement was found between YAQUIR results and experimental data obtained from shallow-water analogy experiments. A test for the investigation of nonequilibrium two-phase flow dynamics has been specified in some detail; it is to be performed early in 1978 in the water loop of the IRB. Good agreement was found between the natural frequency predictions for the core barrel obtained from CYLDY2 and STRUDL/DYNAL. Work started on improvement of the beam-mode treatment in CYLDY2; the name of this modified version will be CYLDY3. The fluid-dynamic code SING1, based on an advanced singularity method and applicable to a broad class of highly transient, incompressible 3D problems with negligible viscosity, has been developed and tested. It will be used in connection with the planned laboratory experiments to investigate the effect of the core structure on the blowdown process. Coupling of SING1 with structural dynamics is under way. (orig./RW) [de

  9. The Lipopolysaccharide-Induced Metabolome Signature in Arabidopsis thaliana Reveals Dynamic Reprogramming of Phytoalexin and Phytoanticipin Pathways

    Science.gov (United States)

    Finnegan, Tarryn; Steenkamp, Paul A.; Piater, Lizelle A.

    2016-01-01

Lipopolysaccharides (LPSs), as MAMP molecules, trigger the activation of signal transduction pathways involved in defence. Currently, plant metabolomics is providing new dimensions into understanding the intracellular adaptive responses to external stimuli. The effect of LPS on the metabolomes of Arabidopsis thaliana cells and leaf tissue was investigated over a 24 h period. Cellular metabolites and those secreted into the medium were extracted with methanol, and liquid chromatography coupled to mass spectrometry was used for quantitative and qualitative analyses. Multivariate statistical data analyses were used to extract interpretable information from the generated multidimensional LC-MS data. The results show that LPS perception triggered differential changes in the metabolomes of cells and leaves, leading to variation in the biosynthesis of specialised secondary metabolites. Time-dependent changes in metabolite profiles were observed, and biomarkers associated with the LPS-induced response were tentatively identified. These include the phytohormones salicylic acid and jasmonic acid, as well as the associated methyl esters and sugar conjugates. The induced defensive state resulted in increases in indole- and other glucosinolates, indole derivatives, camalexin, and cinnamic acid derivatives and other phenylpropanoids. These annotated metabolites indicate dynamic reprogramming of metabolic pathways that together create an enhanced defensive capacity. The results reveal new insights into the mode of action of LPS as an activator of plant innate immunity, broaden knowledge about the defence metabolite pathways involved in Arabidopsis responses to LPS, and identify specialised metabolites of functional importance that can be employed to enhance immunity against pathogen infection. PMID:27656890

  10. Cross-code gyrokinetic verification and benchmark on the linear collisionless dynamics of the geodesic acoustic mode

    Science.gov (United States)

    Biancalani, A.; Bottino, A.; Ehrlacher, C.; Grandgirard, V.; Merlo, G.; Novikau, I.; Qiu, Z.; Sonnendrücker, E.; Garbet, X.; Görler, T.; Leerink, S.; Palermo, F.; Zarzoso, D.

    2017-06-01

The linear properties of the geodesic acoustic modes (GAMs) in tokamaks are investigated by comparing analytical theory and gyrokinetic numerical simulations. The dependence on the value of the safety factor, the finite orbit width of the ions in relation to the radial mode width, magnetic-flux-surface shaping, and the electron/ion mass ratio are considered. Nonuniformities in the plasma profiles (such as density, temperature, and safety factor), electromagnetic effects, collisions, and the presence of minority species are neglected. Also, only linear simulations are considered, focusing on the local dynamics. We use three different gyrokinetic codes: the Lagrangian (particle-in-cell) code ORB5, the Eulerian code GENE, and the semi-Lagrangian code GYSELA. One of the main aims of this paper is to provide a detailed comparison of the numerical results and analytical theory in the regimes where this is possible. This helps to better understand the behavior of the linear GAM dynamics in these different regimes and the behavior of the codes, which is crucial in view of future work where more physics is present, and to delineate the regimes of validity of each specific analytical dispersion relation.

  11. The Lipopolysaccharide-Induced Metabolome Signature in Arabidopsis thaliana Reveals Dynamic Reprogramming of Phytoalexin and Phytoanticipin Pathways.

    Directory of Open Access Journals (Sweden)

    Tarryn Finnegan

Full Text Available Lipopolysaccharides (LPSs), as MAMP molecules, trigger the activation of signal transduction pathways involved in defence. Currently, plant metabolomics is providing new dimensions into understanding the intracellular adaptive responses to external stimuli. The effect of LPS on the metabolomes of Arabidopsis thaliana cells and leaf tissue was investigated over a 24 h period. Cellular metabolites and those secreted into the medium were extracted with methanol, and liquid chromatography coupled to mass spectrometry was used for quantitative and qualitative analyses. Multivariate statistical data analyses were used to extract interpretable information from the generated multidimensional LC-MS data. The results show that LPS perception triggered differential changes in the metabolomes of cells and leaves, leading to variation in the biosynthesis of specialised secondary metabolites. Time-dependent changes in metabolite profiles were observed, and biomarkers associated with the LPS-induced response were tentatively identified. These include the phytohormones salicylic acid and jasmonic acid, as well as the associated methyl esters and sugar conjugates. The induced defensive state resulted in increases in indole- and other glucosinolates, indole derivatives, camalexin, and cinnamic acid derivatives and other phenylpropanoids. These annotated metabolites indicate dynamic reprogramming of metabolic pathways that together create an enhanced defensive capacity. The results reveal new insights into the mode of action of LPS as an activator of plant innate immunity, broaden knowledge about the defence metabolite pathways involved in Arabidopsis responses to LPS, and identify specialised metabolites of functional importance that can be employed to enhance immunity against pathogen infection.

  12. An online handwritten signature verification system based on ...

    African Journals Online (AJOL)

    Administrateur

We present an online handwritten signature verification system. We model the handwritten signature with an analytical approach based on the Empirical Mode Decomposition (EMD). The system is provided with a training module and a signature database. The implemented evaluation protocol points out the interest of the adopted ...

  13. Quantum blind dual-signature scheme without arbitrator

    International Nuclear Information System (INIS)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-01-01

Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., an initial phase, a signing phase and a verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. The scheme does not rely on an arbitrator in the verification phase, as previous quantum signature schemes usually do. Security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or an attacker. The scheme offers a potential application for e-commerce or e-payment systems with current technology. (paper)

  14. Quantum blind dual-signature scheme without arbitrator

    Science.gov (United States)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-03-01

Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., an initial phase, a signing phase and a verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. The scheme does not rely on an arbitrator in the verification phase, as previous quantum signature schemes usually do. Security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or an attacker. The scheme offers a potential application for e-commerce or e-payment systems with current technology.

  15. Simulating Dynamic Stall in a 2D VAWT: Modeling strategy, verification and validation with Particle Image Velocimetry data

    International Nuclear Information System (INIS)

    Ferreira, C J Simao; Bijl, H; Bussel, G van; Kuik, G van

    2007-01-01

The implementation of wind energy conversion systems in the built environment has renewed interest in and research on Vertical Axis Wind Turbines (VAWT), which in this application present several advantages over Horizontal Axis Wind Turbines (HAWT). The VAWT has inherently unsteady aerodynamic behavior due to the variation of angle of attack with the angle of rotation, perceived velocity and, consequently, Reynolds number. The phenomenon of dynamic stall is thus an intrinsic effect of the operation of a Vertical Axis Wind Turbine at low tip speed ratios, having a significant impact on both loads and power. The complexity of the unsteady aerodynamics of the VAWT makes it extremely attractive to analyze using Computational Fluid Dynamics (CFD) models, in which an approximation of the continuity and momentum equations of the Navier-Stokes set is solved. The complexity of the problem and the need for new design approaches for VAWT for the built environment have driven the authors to focus their CFD modeling research on: comparing the results of commonly used turbulence models, namely URANS (Spalart-Allmaras and k-ε) and large eddy models (Large Eddy Simulation and Detached Eddy Simulation); verifying the sensitivity of the model to its grid refinement (space and time); and evaluating the suitability of using Particle Image Velocimetry (PIV) experimental data for model validation. The 2D model created represents the middle section of a single-bladed VAWT with infinite aspect ratio. The model simulates the experimental flow-field measurements made with Particle Image Velocimetry by Simao Ferreira et al. for a single-bladed VAWT. The results show the suitability of the PIV data for the validation of the model, the need for accurate simulation of the large eddies, and the sensitivity of the model to grid refinement.

  16. Length-scale dependent mechanical properties of Al-Cu eutectic alloy: Molecular dynamics based model and its experimental verification

    Science.gov (United States)

    Tiwary, C. S.; Chakraborty, S.; Mahapatra, D. R.; Chattopadhyay, K.

    2014-05-01

This paper attempts to gain an understanding of the effect of lamellar length scale on the mechanical properties of a two-phase metal-intermetallic eutectic structure. We first develop a molecular dynamics model for the in-situ grown eutectic interface, followed by a model of deformation of the Al-Al2Cu lamellar eutectic. Leveraging the insights obtained from the simulations on the behaviour of dislocations at different length scales of the eutectic, we present and explain experimental results on Al-Al2Cu eutectics with various lamellar spacings. The physics behind the mechanism is further quantified with the help of an atomic-level energy model for different length scales as well as different strains. An atomic-level energy partitioning of the lamellae and interface regions reveals that energy accumulates in the lamellae core due to dislocations irrespective of the length scale, whereas the interface accumulates more dislocation energy when the length scale is small; this trend reverses when the length scale grows beyond a critical size of about 80 nm.

  17. A verification plan of rotor dynamics of turbo-machine system supported by AMB for the GTHTR300 (step 1)

    International Nuclear Information System (INIS)

    Takizuka, Takakazu; Takada, Shoji; Kosugiyama, Shinichi; Kunitomi, Kazuhiko; Xing, Yan

    2003-01-01

A program for research and development on magnetic-bearing-suspended rotor dynamics was planned for the Gas Turbine High Temperature Reactor (GTHTR300) plant. The magnetic bearing that suspends the turbo-machine rotor of GTHTR300 is unique because the required load capacity is much larger than that of existing bearings. The turbo-compressor and generator rotors, each of which is suspended by two radial bearings, pass over the first and second bending-mode critical speeds, respectively. The rotor design fulfilled the standard vibration amplitude limit of 75 μm at the rated rotational speed by optimising the stiffness of the magnetic bearings. A test apparatus was designed in the program. The test apparatus is composed of two 1/3-scale test rotors, simulating the turbo-compressor and generator rotors, which are connected by either a flexible or a rigid coupling. Because the load capacity is almost in proportion to the projected area of the bearing, the load capacity of the test magnetic bearing is almost 1/10 of the actual one. The test rotors were designed so that their critical speeds and vibration modes match the actual ones. The test will verify the rotor design and demonstrate the rotor vibration control technology of the magnetic bearing. This paper outlines the test apparatus and the test plan for the magnetic-bearing-suspended rotor system. The present study was entrusted to the authors by the Ministry of Education, Culture, Sports, Science and Technology of Japan. (author)

  18. Experimental verification of a thermal equivalent circuit dynamic model on an extended range electric vehicle battery pack

    Science.gov (United States)

    Ramotar, Lokendra; Rohrauer, Greg L.; Filion, Ryan; MacDonald, Kathryn

    2017-03-01

A dynamic thermal battery model for hybrid and electric vehicles is developed. A thermal equivalent circuit model is created that aims to capture and understand heat propagation from the cells through the entire pack and to the environment, using a production vehicle battery pack for model validation. The inclusion of production hardware and the liquid battery thermal management system components in the model considers physical and geometric properties to calculate the thermal resistances of components (conduction, convection and radiation) along with their associated heat capacities. Various heat sources and sinks comprise the remaining model elements. Analog equivalent circuit simulations using PSpice are compared to experimental results to validate internal temperature nodes and the heat rates measured through various elements, which are then employed to refine the model further. Agreement with experimental results indicates that the proposed method allows a comprehensive real-time battery pack analysis at little computational expense compared to other types of computer-based simulation. Elevated road and ambient conditions in Mesa, Arizona are simulated on a parked vehicle with varying quiescent cooling rates to examine the effect on the diurnal battery temperature during longer-term static exposure. A typical daily driving schedule is also simulated and examined.
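The thermal-equivalent-circuit idea, temperatures as node voltages, heat flows as currents through thermal resistances into heat capacities, can be sketched with a two-node toy model. All parameter values below are assumptions for illustration; the validated production-pack model in the study is far more detailed:

```python
# Two-node thermal equivalent circuit: cell core and pack case (assumed values).
C_cell, C_case = 800.0, 5000.0   # J/K, lumped heat capacities
R_cc, R_ca = 0.5, 2.0            # K/W, cell->case and case->ambient resistances
Q_cell = 20.0                    # W, heat generated in the cells
T_amb = 35.0                     # degC, hot ambient (e.g. a parked vehicle)

dt, n_steps = 1.0, 7200          # 1 s explicit Euler steps, 2 h simulated
T_cell = T_case = T_amb
for _ in range(n_steps):
    q_cc = (T_cell - T_case) / R_cc        # heat flow cell -> case
    q_ca = (T_case - T_amb) / R_ca         # heat flow case -> ambient
    T_cell += dt * (Q_cell - q_cc) / C_cell
    T_case += dt * (q_cc - q_ca) / C_case

# Steady-state limits: T_cell -> T_amb + Q*(R_cc + R_ca), T_case -> T_amb + Q*R_ca.
print(round(T_cell, 2), round(T_case, 2))
```

The slow node (the case, with its large heat capacity) dominates the pack's diurnal response, which is why quiescent cooling rates matter for long static exposure.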

  19. A feature based comparison of pen and swipe based signature characteristics.

    Science.gov (United States)

    Robertson, Joshua; Guest, Richard

    2015-10-01

Dynamic Signature Verification (DSV) is a biometric modality that identifies anatomical and behavioral characteristics when an individual signs their name. Conventionally, signature data have been captured using pen/tablet apparatus. However, the use of other devices such as touch-screen tablets has expanded in recent years, affording the possibility of assessing biometric interaction on this new technology. To explore the potential of employing DSV techniques when a user signs or swipes with their finger, we report a study correlating pen- and finger-generated features. Investigating the stability of, and correlation between, a set of characteristic features recorded in participants' signatures and touch-based swipe gestures, a statistical analysis was conducted to assess consistency between capture scenarios. The results indicate that a range of static and dynamic features, such as rate of jerk, size, duration and the distance the pen traveled, can support interoperability between the two input methods within a potential biometric context. The data suggest, as a general principle, that the same underlying constructional mechanisms are evident in both. Copyright © 2015 Elsevier B.V. All rights reserved.
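To make the feature comparison concrete, here is a sketch of deriving static and dynamic features, such as size, path length, speed and jerk, from sampled (x, y, t) pen or finger data. The trajectory is synthetic and the feature definitions are illustrative assumptions, not the paper's exact feature set:

```python
import numpy as np

# Synthetic 2 s gesture sampled at an assumed 100 Hz.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 201)
x = np.cumsum(rng.normal(0.5, 0.2, t.size))   # drifting rightward stroke
y = np.cumsum(rng.normal(0.0, 0.2, t.size))

# Successive time derivatives of position: velocity, acceleration, jerk.
vx, vy = np.gradient(x, t), np.gradient(y, t)
ax, ay = np.gradient(vx, t), np.gradient(vy, t)
jx, jy = np.gradient(ax, t), np.gradient(ay, t)

features = {
    "duration": float(t[-1] - t[0]),
    "path_length": float(np.sum(np.hypot(np.diff(x), np.diff(y)))),
    "size": float((x.max() - x.min()) * (y.max() - y.min())),  # bounding-box area
    "mean_speed": float(np.mean(np.hypot(vx, vy))),
    "mean_jerk": float(np.mean(np.hypot(jx, jy))),
}
print(features)
```

Per-feature vectors like this, computed separately for pen and finger captures, are what a correlation or consistency analysis between the two capture scenarios would operate on.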

  20. Dynamic changes in spectral and spatial signatures of high frequency oscillations in rat hippocampi during epileptogenesis in acute and chronic stages

    Directory of Open Access Journals (Sweden)

    Pan-Pan Song

    2016-11-01

Objective: To analyze spectral and spatial signatures of high frequency oscillations (HFOs), which include ripples and fast ripples (FRs, > 200 Hz), by quantitatively assessing average and peak spectral power in a rat model at different stages of epileptogenesis. Methods: The lithium–pilocarpine model of temporal lobe epilepsy was used. The acute phase of epilepsy was assessed by recording intracranial electroencephalography (EEG) activity for 1 day after status epilepticus (SE). The chronic phase of epilepsy, including spontaneous recurrent seizures (SRSs), was assessed by recording EEG activity for 28 days after SE. Average and peak spectral power of five frequency bands of EEG signals in the CA1, CA3 and DG regions of the hippocampus were analyzed with wavelets and digital filters. Results: FRs occurred in the hippocampus in the animal model. Significant dynamic changes in the spectral power of FRs were identified in CA1 and CA3. The average spectral power of ripples increased at 20 min before SE (p < 0.05) and peaked at 10 min before diazepam injection. It decreased at 10 min after diazepam (p < 0.05) and returned to baseline after 1 hour (h). The average spectral power of FRs increased at 30 min before SE (p < 0.05) and peaked at 10 min before diazepam. It decreased at 10 min after diazepam (p < 0.05) and returned to baseline at 2 h after injection. The dynamic changes were similar between average and peak spectral power of FRs. Average and peak spectral power of both ripples and FRs in the chronic phase showed a gradual downward trend compared with normal rats 14 days after SE. Significance: The spectral power of HFOs may be utilized to distinguish between normal and pathologic HFOs. Ictal average and peak spectral power of FRs were two parameters for predicting acute epileptic seizures, which could be used as a new quantitative biomarker and early warning marker of seizure. Changes in interictal HFO power in the hippocampus at the chronic stage may be not
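A minimal sketch of the band-power computation underlying such analyses, using a naive DFT on a synthetic test tone rather than the paper's wavelet pipeline; the 250 Hz tone and band edges are illustrative.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Average spectral power of `signal` within [f_lo, f_hi] Hz via a naive DFT.
    (A real HFO analysis would use wavelets or a windowed FFT.)"""
    n = len(signal)
    powers = []
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            X = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            powers.append(abs(X) ** 2 / n)
    return sum(powers) / len(powers) if powers else 0.0

# A 250 Hz "fast ripple" test tone sampled at 1 kHz
fs = 1000.0
sig = [math.sin(2 * math.pi * 250.0 * t / fs) for t in range(512)]
fr_power = band_power(sig, fs, 200.0, 500.0)      # fast-ripple band (> 200 Hz)
ripple_power = band_power(sig, fs, 80.0, 200.0)   # ripple band
```

For the synthetic tone, essentially all power falls in the fast-ripple band, the kind of contrast the study tracks over time.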

  1. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

The reasons why verification of software products is necessary throughout the software life cycle are considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed.

  2. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  3. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  4. Review and Analysis of Cryptographic Schemes Implementing Threshold Signature

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-03-01

This work is devoted to the study of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were investigated. Different methods of generating and verifying threshold signatures were explored, e.g. those used in mobile agents, Internet banking and e-currency. The significance of the work lies in reducing the level of counterfeit electronic documents signed by a certain group of users.
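The Lagrange-interpolation building block common to many of the surveyed schemes can be sketched as Shamir-style (t, n) secret reconstruction over a prime field; the modulus and parameters are illustrative, and a real threshold signature combines this with the underlying signing algorithm.

```python
import random

P = 2 ** 127 - 1  # a Mersenne prime, used here as an illustrative field modulus

def eval_poly(coeffs, x):
    """Evaluate a polynomial (coeffs[0] is the constant term) at x, mod P."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, eval_poly(coeffs, x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
recovered = reconstruct(shares[:3])
```

Any three of the five shares suffice; fewer than three reveal nothing about the secret.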

  5. Improvement of a Quantum Proxy Blind Signature Scheme

    Science.gov (United States)

    Zhang, Jia-Lei; Zhang, Jian-Zhong; Xie, Shu-Cui

    2018-06-01

Improvement of a quantum proxy blind signature scheme is proposed in this paper. A six-qubit entangled state functions as the quantum channel. In our scheme, a trusted party, Trent, is introduced so as to avoid David's dishonest behavior; the receiver David verifies the signature with the help of Trent. The scheme uses the physical characteristics of quantum mechanics to implement message blinding, delegation, signature and verification. Security analysis proves that our scheme has the properties of undeniability, unforgeability and anonymity, and can resist some common attacks.

  6. Signature-based User Authentication

    OpenAIRE

    Hámorník, Juraj

    2015-01-01

This work addresses the lack of handwritten signature authentication in Windows. Its result is standalone software that allows users to log into Windows by writing their signature. We focus on the security of signature authentication and the best overall user experience. We implemented a signature authentication service that accepts a signature and returns a user access token if the signature is genuine. Signature authentication is done by comparing a given signature to signature patterns by their similarity. Si...

  7. Electronic Signature Policy

    Science.gov (United States)

Establishes the United States Environmental Protection Agency's approach to adopting electronic signature technology and best practices to ensure electronic signatures applied to official Agency documents are legally valid and enforceable.

  8. Lesson 6: Signature Validation

    Science.gov (United States)

    Checklist items 13 through 17 are grouped under the Signature Validation Process, and represent CROMERR requirements that the system must satisfy as part of ensuring that electronic signatures it receives are valid.

  9. Exotic signatures from supersymmetry

    International Nuclear Information System (INIS)

    Hall, L.J.

    1989-08-01

    Minor changes to the standard supersymmetric model, such as soft flavor violation and R parity violation, cause large changes in the signatures. The origin of these changes and the resulting signatures are discussed. 15 refs., 7 figs., 2 tabs

  10. Extending the similarity-based XML multicast approach with digital signatures

    DEFF Research Database (Denmark)

    Azzini, Antonia; Marrara, Stefania; Jensen, Meiko

    2009-01-01

This paper investigates the interplay between similarity-based SOAP message aggregation and digital signature application. An overview of the approaches resulting from the different orderings of the tasks of signature application, verification, similarity aggregation and splitting is provided. Depending on the intersection between similarity-aggregated and signed SOAP message parts, the paper discusses three different cases of signature application and sketches their applicability and performance implications.

  11. Blinding for unanticipated signatures

    NARCIS (Netherlands)

    D. Chaum (David)

    1987-01-01

    textabstractPreviously known blind signature systems require an amount of computation at least proportional to the number of signature types, and also that the number of such types be fixed in advance. These requirements are not practical in some applications. Here, a new blind signature technique

  12. Fair quantum blind signatures

    International Nuclear Information System (INIS)

    Tian-Yin, Wang; Qiao-Yan, Wen

    2010-01-01

    We present a new fair blind signature scheme based on the fundamental properties of quantum mechanics. In addition, we analyse the security of this scheme, and show that it is not possible to forge valid blind signatures. Moreover, comparisons between this scheme and public key blind signature schemes are also discussed. (general)

  13. Real Traceable Signatures

    Science.gov (United States)

    Chow, Sherman S. M.

    Traceable signature scheme extends a group signature scheme with an enhanced anonymity management mechanism. The group manager can compute a tracing trapdoor which enables anyone to test if a signature is signed by a given misbehaving user, while the only way to do so for group signatures requires revealing the signer of all signatures. Nevertheless, it is not tracing in a strict sense. For all existing schemes, T tracing agents need to recollect all N' signatures ever produced and perform RN' “checks” for R revoked users. This involves a high volume of transfer and computations. Increasing T increases the degree of parallelism for tracing but also the probability of “missing” some signatures in case some of the agents are dishonest.

  14. Continuous-variable quantum homomorphic signature

    Science.gov (United States)

    Li, Ke; Shang, Tao; Liu, Jian-wei

    2017-10-01

    Quantum cryptography is believed to be unconditionally secure because its security is ensured by physical laws rather than computational complexity. According to spectrum characteristic, quantum information can be classified into two categories, namely discrete variables and continuous variables. Continuous-variable quantum protocols have gained much attention for their ability to transmit more information with lower cost. To verify the identities of different data sources in a quantum network, we propose a continuous-variable quantum homomorphic signature scheme. It is based on continuous-variable entanglement swapping and provides additive and subtractive homomorphism. Security analysis shows the proposed scheme is secure against replay, forgery and repudiation. Even under nonideal conditions, it supports effective verification within a certain verification threshold.

  15. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  16. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluating the measurement system and determining systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  17. Unconditionally Secure Quantum Signatures

    Directory of Open Access Journals (Sweden)

    Ryan Amiri

    2015-08-01

Signature schemes, proposed in 1976 by Diffie and Hellman, have become ubiquitous across modern communications. They allow for the exchange of messages from one sender to multiple recipients, with the guarantees that messages cannot be forged or tampered with and that messages can also be forwarded from one recipient to another without compromising their validity. Signatures are different from, but no less important than, encryption, which ensures the privacy of a message. Commonly used signature protocols—signatures based on the Rivest–Shamir–Adleman (RSA) algorithm, the digital signature algorithm (DSA), and the elliptic curve digital signature algorithm (ECDSA)—are only computationally secure, similar to public key encryption methods. In fact, since these rely on the difficulty of finding discrete logarithms or factoring large primes, it is known that they will become completely insecure with the emergence of quantum computers. We may therefore see a shift towards signature protocols that will remain secure even in a post-quantum world. Ideally, such schemes would provide unconditional or information-theoretic security. In this paper, we aim to provide an accessible and comprehensive review of existing unconditionally secure signature schemes for signing classical messages, with a focus on unconditionally secure quantum signature schemes.
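To make the computational-security point concrete, a deliberately insecure textbook RSA signature with tiny primes: security rests entirely on the hardness of factoring n = p·q, exactly the assumption a quantum computer breaks. Real systems use padded RSA or ECDSA from vetted libraries, never raw exponentiation like this.

```python
import hashlib

# Textbook RSA with toy primes -- for illustration only.
p, q = 61, 53
n = p * q                  # 3233; factoring n recovers the private key
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (modular inverse of e)

def sign(message: bytes) -> int:
    """'Hash-then-sign': raise the (reduced) hash to the private exponent."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, sig: int) -> bool:
    """Raise the signature to the public exponent and compare hashes."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h

s = sign(b"sample record")
```

Anyone holding only (n, e) can verify but not forge, as long as factoring n is infeasible; with 61 and 53 it obviously is not, which is the point of the hedge.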

  18. Radar Signature Calculation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: The calculation, analysis, and visualization of the spatially extended radar signatures of complex objects such as ships in a sea multipath environment and...

  19. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  20. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to

  1. Electronic health records: what does your signature signify?

    Directory of Open Access Journals (Sweden)

    Victoroff MD Michael S

    2012-08-01

Electronic health records serve multiple purposes, including clinical communication, legal documentation, financial transaction capture, research and analytics. Electronic signatures attached to entries in EHRs have different logical and legal meanings for different users. Some of these are vestiges from historic paper formats that require reconsideration. Traditionally accepted functions of signatures, such as identity verification, attestation, consent, authorization and non-repudiation, can become ambiguous in the context of computer-assisted workflow processes that incorporate functions like logins, auto-fill and audit trails. This article exposes the incompatibility of expectations among typical users of electronically signed information.

  2. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  3. Modeling the lexical morphology of Western handwritten signatures.

    Directory of Open Access Journals (Sweden)

    Moises Diaz-Cabrera

A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of learning. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures: the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a Generalized Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwritten signatures.
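A sketch of the distribution-fitting step, restricted to the Gumbel special case (GEV with zero shape parameter) fitted by the method of moments; the "signature width" values are hypothetical stand-ins for the paper's morphological features.

```python
import math
import statistics

def fit_gumbel(samples):
    """Method-of-moments fit of a Gumbel distribution (GEV, shape = 0):
    beta = s * sqrt(6) / pi, mu = mean - gamma * beta (Euler-Mascheroni gamma)."""
    gamma = 0.5772156649015329
    beta = statistics.stdev(samples) * math.sqrt(6) / math.pi
    mu = statistics.mean(samples) - gamma * beta
    return mu, beta

def gumbel_cdf(x, mu, beta):
    """Cumulative distribution function of the fitted Gumbel model."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical per-signer "signature width" measurements (arbitrary units)
widths = [4.1, 5.3, 4.8, 6.0, 5.1, 4.5, 5.7, 4.9, 5.2, 5.5]
mu, beta = fit_gumbel(widths)
```

A full GEV fit (with a free shape parameter, e.g. via maximum likelihood) would follow the same pattern with one more parameter to estimate.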

  4. Dynamic Behavior of a SCARA Robot by using N-E Method for a Straight Line and Simulation of Motion by using Solidworks and Verification by Matlab/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2014-05-01

SCARA (Selective Compliant Assembly Robot Arm) robots of serial architecture are widely used in assembly and "pick-and-place" operations; it has been shown that the use of robots improves assembly accuracy and saves assembly time and cost as well. The most important condition for the choice of this kind of robot is its dynamic behavior for a given path; no closed-form solution for the dynamics of this important robot has been reported. This paper presents a study of the kinematics (forward and inverse, using D-H notation) and of the dynamics of the SCARA robot using the N-E method. A computer code is developed for trajectory generation by inverse kinematics, and it calculates the variation of the link torques for a straight-line, rest-to-rest path between two positions in a pick-and-place operation. The SCARA robot is constructed to achieve the pick-and-place operation using SolidWorks software, with verification by Matlab/Simulink. The simulation results are discussed; agreement between the two software packages is obtained.
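The closed-form inverse-kinematics step can be illustrated on a planar two-link arm standing in for the SCARA's first two revolute joints; the link lengths are assumed values, not the paper's robot parameters.

```python
import math

L1, L2 = 0.4, 0.3  # link lengths in meters (assumed for illustration)

def forward(theta1, theta2):
    """Forward kinematics: joint angles -> end-effector position."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def inverse(x, y):
    """Closed-form inverse kinematics (elbow-down branch)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp for numerical safety
    k1 = L1 + L2 * math.cos(theta2)
    k2 = L2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

x, y = forward(0.5, 0.8)
t1, t2 = inverse(x, y)   # round-trips back to the original joint angles
```

Sampling such inverse solutions along a straight line yields the joint trajectories that the Newton-Euler recursion then converts into link torques.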

  5. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  6. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  7. Formal Verification -26 ...

    Indian Academy of Sciences (India)

by testing of the components, and successful testing leads to the software being ... Formal verification is based on formal methods, which are mathematically based ... scenario under which a similar error could occur. There are various other ...

  8. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  9. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  10. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  11. Advanced Missile Signature Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Advanced Missile Signature Center (AMSC) is a national facility supporting the Missile Defense Agency (MDA) and other DoD programs and customers with analysis,...

  12. THE ELECTRONIC SIGNATURE

    Directory of Open Access Journals (Sweden)

    Voiculescu Madalina Irena

    2009-05-01

The article refers to the significance of the digital signature in electronic commerce. The Internet and electronic commerce open up many new opportunities for the consumer, yet the security (or perceived lack of security) of exchanging personal and financial data

  13. Digital signature feasibility study

    Science.gov (United States)

    2008-06-01

    The purpose of this study was to assess the advantages and disadvantages of using digital signatures to assist the Arizona Department of Transportation in conducting business. The Department is evaluating the potential of performing more electronic t...

  14. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  15. Physics Signatures at CLIC

    CERN Document Server

    Battaglia, Marco

    2001-01-01

A set of signatures for physics processes of potential interest for the CLIC programme at √s = 1-5 TeV is discussed. These signatures, which may correspond to the manifestation of different scenarios of new physics as well as to Standard Model precision tests, are proposed as benchmarks for the optimisation of the CLIC accelerator parameters and for a first definition of the required detector response.

  16. Hybrid Enrichment Verification Array: Module Characterization Studies

    Energy Technology Data Exchange (ETDEWEB)

    Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mace, Emily K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-03-01

    The work presented in this report is focused on the characterization and refinement of the Hybrid Enrichment Verification Array (HEVA) approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL, and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. The core of the HEVA methodology, the high-energy prompt gamma-ray signature, serves as an indirect method for the measurement of total neutron emission from the cylinder. A method for measuring the intrinsic efficiency of this “non-traditional” neutron signature and the results from a benchmark experiment are presented. Also discussed are potential perturbing effects on the non-traditional signature, including short-lived activation of materials in the HEVA module. Modeling and empirical results are presented to demonstrate that such effects are expected to be negligible for the envisioned implementation scenario. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  17. A Quantum Multi-Proxy Weak Blind Signature Scheme Based on Entanglement Swapping

    Science.gov (United States)

    Yan, LiLi; Chang, Yan; Zhang, ShiBin; Han, GuiHua; Sheng, ZhiWei

    2017-02-01

In this paper, we present a multi-proxy weak blind signature scheme based on quantum entanglement swapping of Bell states. In the scheme, proxy signers can complete the signature on behalf of the original signer, with the original signer's authority. It can be applied to electronic voting systems, electronic payment systems, etc. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. It guarantees not only unconditional security but also the anonymity of the message owner. The security analysis shows that the scheme satisfies the security features of a multi-proxy weak blind signature: signers cannot disavow their signatures, the signatures cannot be forged by others, and the message owner can be traced.

  18. Network-based Arbitrated Quantum Signature Scheme with Graph State

    Science.gov (United States)

    Ma, Hongling; Li, Fei; Mao, Ningyi; Wang, Yijun; Guo, Ying

    2017-08-01

Implementing an arbitrated quantum signature (QAS) through complex networks is an interesting cryptographic technology in the literature. In this paper, we propose an arbitrated quantum signature for multi-user-involved networks whose topological structures are established by the encoded graph state. The deterministic transmission of the shared keys is enabled by appropriate stabilizers performed on the graph state. The implementation of this scheme depends on the deterministic distribution of the multi-user-shared graph state, on which the encoded message can be processed in the signing and verifying phases. Four parties are involved: the signatory Alice, the verifier Bob, the arbitrator Trent, and a dealer who assists the legal participants in signature generation and verification. The security is guaranteed by the entanglement of the encoded graph state, which is cooperatively prepared by the legal participants in complex quantum networks.

  19. Sediment transport dynamics in the Central Himalaya: assessing during monsoon the erosion processes signature in the daily suspended load of the Narayani river

    Science.gov (United States)

    Morin, Guillaume; Lavé, Jérôme; Lanord, Christian France; Prassad Gajurel, Ananta

    2017-04-01

The evolution of mountainous landscapes is the result of competition between tectonic and erosional processes. In response to the creation of topography by tectonics, fluvial, glacial, and hillslope denudation processes erode topography, leading to rock exhumation and sediment redistribution. When trying to better document the links between climatic, tectonic, or lithologic controls in mountain range evolution, a detailed understanding of the influence of each erosion process in a given environment is fundamental. At the scale of a whole mountain range, a systematic survey and monitoring of all the geomorphologic processes at work can rapidly become difficult. An alternative approach is to study the characteristics and temporal evolution of the sediments exported out of the range. In central Himalaya, the Narayani watershed presents contrasting lithologic, geochemical and isotopic signatures of the outcropping rocks as well as of the erosional processes: this particular setting allows such an approach by partly untangling the myopic vision of spatial integration at the watershed scale. The acquisition and analysis of a new dataset on the daily suspended load concentration and geochemical characteristics at the mountain outlet of one of the largest Himalayan rivers (drainage area = 30000 km2) brings several important results on Himalayan erosion, and on climatic and process controls. 1. Based on discrete depth sampling and on daily surface sampling of the suspended load, associated with flow characterization through ADCP measurements, we were first able to integrate sediment flux across a river cross-section and over time. We estimate for the year 2010 an equivalent erosion rate of 1.8 +0.35/-0.2 mm/yr and, over the last 15 years, using past sediment load records from the DHM of Nepal, an equivalent erosion rate of 1.6 +0.3/-0.2 mm/yr. These rates are also in close agreement with the longer-term (~500 yrs) denudation rates of 1.7 mm
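The flux-to-erosion-rate conversion behind such estimates can be sketched with back-of-envelope arithmetic; the annual flux value below is an assumed illustration (chosen to land in the abstract's range), not the paper's measurement, and the rock density is a typical crustal value.

```python
# Convert an annual suspended-sediment flux to a basin-averaged erosion rate:
# rate = mass flux / (rock density * drainage area)
flux_t_per_yr = 1.4e8    # suspended load in tonnes/yr (assumed, illustrative)
area_km2 = 30000.0       # Narayani drainage area, from the abstract
rho_rock = 2.7           # rock density in t/m^3 (typical crustal value)

area_m2 = area_km2 * 1e6
erosion_m_per_yr = flux_t_per_yr / (rho_rock * area_m2)
erosion_mm_per_yr = erosion_m_per_yr * 1000.0
```

With these inputs the rate comes out near the 1.6-1.8 mm/yr reported; real estimates also correct for bedload and dissolved load, which this sketch ignores.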

  20. Signature of biased range in the non-dynamical Chern-Simons modified gravity and its measurements with satellite-satellite tracking missions: theoretical studies

    Science.gov (United States)

    Qiang, Li-E.; Xu, Peng

    2015-08-01

Having great accuracy in range and range-rate measurements, the GRACE mission and the planned GRACE follow-on mission can in principle be employed to place strong constraints on certain relativistic gravitational theories. In this paper, we work out the range observable of the non-dynamical Chern-Simons modified gravity for satellite-to-satellite tracking (SST) measurements. We find that a characteristic time-accumulating range signal appears in non-dynamical Chern-Simons gravity, which has no analogue in the standard parity-preserving metric theories of gravity. The magnitude of this Chern-Simons range signal will reach a few times cm for each free flight of these SST missions, where is the dimensionless post-Newtonian parameter of the non-dynamical Chern-Simons theory. Therefore, with the 12 years of data from the GRACE mission, one expects that the mass scale of the non-dynamical Chern-Simons gravity could be constrained to be larger than eV. For the GRACE FO mission, scheduled to be launched in 2017, the much stronger bound of eV is expected.

  1. Entanglement as a signature of quantum chaos.

    Science.gov (United States)

    Wang, Xiaoguang; Ghose, Shohini; Sanders, Barry C; Hu, Bambi

    2004-01-01

    We explore the dynamics of entanglement in classically chaotic systems by considering a multiqubit system that behaves collectively as a spin system obeying the dynamics of the quantum kicked top. In the classical limit, the kicked top exhibits both regular and chaotic dynamics depending on the strength of the chaoticity parameter kappa in the Hamiltonian. We show that the entanglement of the multiqubit system, considered for both the bipartite and the pairwise entanglement, yields a signature of quantum chaos. Whereas bipartite entanglement is enhanced in the chaotic region, pairwise entanglement is suppressed. Furthermore, we define a time-averaged entangling power and show that this entangling power changes markedly as kappa moves the system from being predominantly regular to being predominantly chaotic, thus sharply identifying the edge of chaos. When this entangling power is averaged over all states, it yields a signature of global chaos. The qualitative behavior of this global entangling power is similar to that of the classical Lyapunov exponent.

  2. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence, or ''AI'', concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  3. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  4. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
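The Monte Carlo procedure described above can be sketched minimally: propagate an assumed discharge-measurement error through the calculation of one simple signature and report a percentile interval. The flow and rainfall series, the multiplicative error model, and all numbers below are illustrative, not data from the Mahurangi or Brue catchments.

```python
# Hedged sketch of Monte Carlo uncertainty estimation for a hydrological
# signature: perturb the flow record with a multiplicative rating-curve
# error and report the resulting spread of the runoff ratio.
# Series and error magnitude are invented for illustration.
import random

random.seed(0)

flow_mm = [1.2, 0.8, 2.5, 4.0, 1.1, 0.9, 3.3, 2.2]   # daily flow, mm
rain_mm = [2.0, 1.5, 4.0, 6.5, 2.0, 1.8, 5.0, 3.5]   # daily rainfall, mm

def runoff_ratio(flow, rain):
    return sum(flow) / sum(rain)

samples = []
for _ in range(2000):
    # assumed +/-10% (1-sigma) multiplicative discharge uncertainty
    factor = random.gauss(1.0, 0.10)
    samples.append(runoff_ratio([q * factor for q in flow_mm], rain_mm))

samples.sort()
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(f"runoff ratio 95% interval: [{lo:.3f}, {hi:.3f}]")
```

The same perturb-and-recompute loop applies unchanged to any other signature (mean annual flood, recession shape, flow-duration percentiles); only the signature function is swapped.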

  5. Practical quantum digital signature

    Science.gov (United States)

    Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing

    2016-03-01

    Guaranteeing nonrepudiation, unforgeability, and transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signature (QDS) aims to provide information-theoretic security for this cryptographic task. However, to date, the previously proposed QDS protocols have been impractical due to various challenging problems, most importantly the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Moreover, our QDS protocol can be practically implemented over more than 100 km with current mature technology as used in quantum key distribution.

  6. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
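The transfer of a Weibull failure distribution "from elementary data to a full-scale structure" mentioned above rests on weakest-link volume scaling: a larger stressed volume samples more potential flaws, so its failure probability at a given stress rises. A minimal sketch follows; the Weibull modulus, reference strength, and volumes are illustrative assumptions, not material data for Cesic® or ZERODUR®.

```python
import math

# Hedged sketch of weakest-link (Weibull) scaling: failure probability of
# a brittle component of stressed volume V at stress s, referenced to
# coupon parameters (modulus m, characteristic strength s0 at volume V0).
# All parameter values are invented for illustration.

def weibull_failure_probability(stress_mpa, volume_mm3,
                                m=10.0, s0_mpa=300.0, v0_mm3=100.0):
    return 1.0 - math.exp(-(volume_mm3 / v0_mm3) * (stress_mpa / s0_mpa) ** m)

p_coupon = weibull_failure_probability(250.0, 100.0)         # coupon-sized
p_fullscale = weibull_failure_probability(250.0, 10000.0)    # 100x the volume
print(p_coupon, p_fullscale)
```

The comparison shows why proof testing and flaw-population statistics matter for ceramics: at identical stress, the full-scale part is far more likely to fail than the coupon from which the data came.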

  7. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  8. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  9. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  10. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. An introduction of two level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  11. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and was used for the verification of several non-trivial Java programs.

  12. Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation

    Science.gov (United States)

    Shi, Ronghua; Ding, Wanting; Shi, Jinjing

    2018-03-01

    A novel arbitrated quantum signature (AQS) scheme is proposed, motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithms are designed based on HA, which enables the scheme to rely less on computational complexity. It is unnecessary to recover the original messages when verifying signatures, since blind quantum computation is applied, which improves the simplicity and operability of our scheme. It is proved that the scheme can be deployed securely, and the extended AQS has extensive applications in e-payment systems, e-government, e-business, etc.

  13. Identification of the Process of Dynamic Stretching of Threads in Warp Knitting Technology Part II: Experimental Identification of the Process of Stretching Threads, with Verification of Rheological Models

    Directory of Open Access Journals (Sweden)

    Prążyńska Aleksandra

    2018-03-01

    Full Text Available The study is a continuation of the first part of the publication, concerning the theoretical analysis of sensitivity of rheological models of dynamically stretched thread. This part presents the experimental research on the characteristics of stretching forces as a function of time, in the context of comparing the obtained results with theoretical data.

  14. Spectral simulations and vibrational dynamics of the fluxional H+5 cation and its isotopologues: signatures of the shared-proton motions

    International Nuclear Information System (INIS)

    Prosmiti, Rita; Valdés, Álvaro; Delgado-Barrio, Gerardo

    2014-01-01

    The recent increase of interest in research on the H+5 cation and its isotopologues is due to the postulation of their presence, although still not detected, in the interstellar medium. There is no doubt, particularly in the light of the recent laboratory observations, that the spectroscopy of these systems is also a great challenge for theorists. Thus, we report the first fully converged coupled anharmonic quantum study of the vibrational dynamics of these highly fluxional cations, providing important information on their spectroscopy in a rigorous manner and opening perspectives for further investigations

  15. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  16. Signatures of the Invisible

    CERN Multimedia

    Strom, D

    2003-01-01

    On the Net it is possible to take a look at art from afar via Virtual Museums. One such exhibition was recently in the New York Museum of Modern Art's branch, PS1. Entitled 'Signatures of the Invisible' it was a collaborative effort between artists and physicists (1/2 page).

  17. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  18. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
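The data-flow analysis that bytecode verifiers perform can be illustrated on a toy, invented instruction set: execute instructions over abstract types instead of concrete values and reject any program whose stack usage is type-inconsistent. This is a sketch of the idea only, not the actual Java verifier or the algorithm of the record above.

```python
# Hedged sketch of abstract-interpretation bytecode verification: run the
# program over a stack of type names rather than values. The instruction
# set here is invented for illustration.

def verify(bytecode):
    stack = []  # abstract stack holding type names, not values
    for op, *args in bytecode:
        if op == "push_int":
            stack.append("int")
        elif op == "push_ref":
            stack.append("ref")
        elif op == "iadd":            # requires two ints, leaves one int
            if stack[-2:] != ["int", "int"]:
                return False
            stack.pop()
        elif op == "getfield":        # requires a reference on top
            if not stack or stack.pop() != "ref":
                return False
            stack.append("int")       # assume an int-typed field
        else:
            return False              # unknown instruction
    return True

ok = verify([("push_int",), ("push_int",), ("iadd",)])
bad = verify([("push_int",), ("push_ref",), ("iadd",)])
print(ok, bad)  # prints True False
```

Real verifiers must additionally merge abstract states at control-flow join points and iterate to a fixed point, which is exactly the iterative data-flow analysis the abstract refers to.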

  19. Underground verification of the large deflection performance of fibre reinforced shotcrete subjected to high stresses and convergence and to dynamic loading.

    CSIR Research Space (South Africa)

    Joughin, WC

    2002-04-01

    Full Text Available and polypropylene fibre reinforced shotcrete compared to mesh reinforced shotcrete in tunnels subject to high stresses and convergence and, possibly, to dynamic loading. In particular: • A direct comparison of the in situ performance of mesh reinforced shotcrete... with that of steel and polypropylene fibre reinforced shotcrete; • Confirmation that the performance of fibre reinforced shotcrete matches that of mesh reinforced shotcrete under large deformation; • A comparative basis for theoretical analysis...

  20. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50...). ...were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  1. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  2. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications to include creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine the expected reaction rates using transport theory in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection to evaluate moving source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
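The trade-off between vehicle speed and detection probability described above can be sketched with a simple counting-statistics decision rule: the dwell time over each container shrinks as speed grows, and a source is declared present when gross counts exceed background by some number of standard deviations. The count rates, window size, and threshold below are illustrative assumptions, not MPVS design values.

```python
import math

# Hedged sketch of the speed/time-gating trade-off: at vehicle speed v,
# the collimated detector views each container for roughly window / v
# seconds. Declare detection when net counts exceed k standard
# deviations of the background counts. All numbers are illustrative.

def detected(source_cps, background_cps, window_m, speed_m_per_s, k=3.0):
    dwell_s = window_m / speed_m_per_s
    net_counts = source_cps * dwell_s
    bkg_counts = background_cps * dwell_s
    return net_counts > k * math.sqrt(bkg_counts)

# ~2 mph is about 0.9 m/s
slow = detected(source_cps=50.0, background_cps=200.0,
                window_m=1.0, speed_m_per_s=0.9)
fast = detected(source_cps=50.0, background_cps=200.0,
                window_m=1.0, speed_m_per_s=20.0)
print(slow, fast)  # prints True False
```

Because net counts scale with dwell time while the background fluctuation scales with its square root, halving the speed improves the signal-to-noise ratio by a factor of sqrt(2), which is why a slow nominal speed was part of the design.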

  3. Identification of host response signatures of infection.

    Energy Technology Data Exchange (ETDEWEB)

    Branda, Steven S.; Sinha, Anupama; Bent, Zachary

    2013-02-01

    Biological weapons of mass destruction and emerging infectious diseases represent a serious and growing threat to our national security. Effective response to a bioattack or disease outbreak critically depends upon efficient and reliable distinguishing between infected vs healthy individuals, to enable rational use of scarce, invasive, and/or costly countermeasures (diagnostics, therapies, quarantine). Screening based on direct detection of the causative pathogen can be problematic, because culture- and probe-based assays are confounded by unanticipated pathogens (e.g., deeply diverged, engineered), and readily-accessible specimens (e.g., blood) often contain little or no pathogen, particularly at pre-symptomatic stages of disease. Thus, in addition to the pathogen itself, one would like to detect infection-specific host response signatures in the specimen, preferably ones comprised of nucleic acids (NA), which can be recovered and amplified from tiny specimens (e.g., fingerstick draws). Proof-of-concept studies have not been definitive, however, largely due to use of sub-optimal sample preparation and detection technologies. For purposes of pathogen detection, Sandia has developed novel molecular biology methods that enable selective isolation of NA unique to, or shared between, complex samples, followed by identification and quantitation via Second Generation Sequencing (SGS). The central hypothesis of the current study is that variations on this approach will support efficient identification and verification of NA-based host response signatures of infectious disease. To test this hypothesis, we re-engineered Sandia's sophisticated sample preparation pipelines, and developed new SGS data analysis tools and strategies, in order to pioneer use of SGS for identification of host NA correlating with infection. Proof-of-concept studies were carried out using specimens drawn from pathogen-infected non-human primates (NHP). This work provides a strong foundation for

  4. Nonintrusive verification attributes for excess fissile materials

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Eccleston, G.W.; Fearey, B.L.

    1997-10-01

    Under US initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The US has expressed the intent to place these materials under international inspections as soon as practicable. To support these commitments, members of the US technical community are examining a variety of nonintrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of the work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.), especially those related to classified materials and items. The authors are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status

  5. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during image acquisition. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed; among the classifiers evaluated, the cost-sensitive classifier produced the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.
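    The first stage of the pipeline above, alignment-based matching, can be illustrated with a minimal sketch. The alignment here is a simple centroid translation and the distance tolerance is an assumed value for illustration; the actual system also corrects orientation and feeds the resulting scores into fuzzy clustering and a classifier.

```python
import numpy as np

def align(minutiae, template):
    """Coarse alignment: translate so the minutiae centroids coincide
    (rotation correction omitted in this sketch)."""
    return minutiae - minutiae.mean(axis=0) + template.mean(axis=0)

def match_score(minutiae, template, tol=5.0):
    """Fraction of template minutiae with an aligned counterpart within `tol` pixels."""
    aligned = align(minutiae, template)
    hits = 0
    for t in template:
        d = np.linalg.norm(aligned - t, axis=1)
        if d.min() <= tol:
            hits += 1
    return hits / len(template)

template = np.array([[10., 10.], [20., 35.], [40., 15.], [55., 50.]])
probe = template + 7.0  # same print, shifted during acquisition
print(match_score(probe, template))  # 1.0: the translation is removed by alignment
```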

  6. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
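    The style of external hand verification described here, computing a quantity independently in a spreadsheet-like calculation and comparing it with the code's output, can be sketched for the simplest ingredient, radioactive decay over time. The half-life value and tolerance below are illustrative; RESRAD-BUILD's actual pathway equations come from its manual.

```python
import math

def decay_factor(half_life_y, t_y):
    """Fraction of activity remaining after t_y years."""
    return math.exp(-math.log(2) / half_life_y * t_y)

# Hand check, spreadsheet style: Co-60 (half-life ~5.27 y) after one half-life
lam = math.log(2) / 5.27               # decay constant, 1/y
hand = math.exp(-lam * 5.27)           # independent "spreadsheet" computation
assert abs(decay_factor(5.27, 5.27) - hand) < 1e-12
print(round(decay_factor(5.27, 5.27), 4))  # 0.5
```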

  7. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR was also demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  8. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation

  9. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation Refs, figs, tabs

  10. Performance evaluation of an improved optical computed tomography polymer gel dosimeter system for 3D dose verification of static and dynamic phantom deliveries

    International Nuclear Information System (INIS)

    Lopatiuk-Tirpak, O.; Langen, K. M.; Meeks, S. L.; Kupelian, P. A.; Zeidan, O. A.; Maryanski, M. J.

    2008-01-01

    The performance of a next-generation optical computed tomography scanner (OCTOPUS-5X) is characterized in the context of three-dimensional gel dosimetry. Large-volume (2.2 L), muscle-equivalent, radiation-sensitive polymer gel dosimeters (BANG-3) were used. Improvements in scanner design leading to shorter acquisition times are discussed. The spatial resolution, detectable absorbance range, and reproducibility are assessed. An efficient method for calibrating gel dosimeters using the depth-dose relationship is applied, with photon- and electron-based deliveries yielding equivalent results. A procedure involving a preirradiation scan was used to reduce the edge artifacts in reconstructed images, thereby increasing the useful cross-sectional area of the dosimeter by nearly a factor of 2. Dose distributions derived from optical density measurements using the calibration coefficient show good agreement with the treatment planning system simulations and radiographic film measurements. The feasibility of use for motion (four-dimensional) dosimetry is demonstrated on an example comparing dose distributions from static and dynamic delivery of a single-field photon plan. The capability to visualize three-dimensional dose distributions is also illustrated
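    The depth-dose calibration mentioned above can be sketched as a linear fit of known depth doses against measured optical densities. All numbers below are synthetic placeholders, not BANG-3 data; the real calibration uses measured depth-dose curves from the delivered beams.

```python
import numpy as np

# Hypothetical readings: optical density at several depths with known relative dose
depths = np.array([2., 5., 10., 15., 20.])          # cm (illustrative only)
known_dose = np.array([100., 86., 67., 52., 40.])   # cGy, from a depth-dose table (illustrative)
od = 0.004 * known_dose + 0.02                      # synthetic OD readings, linear gel response

# Calibration: fit dose as a linear function of optical density
slope, intercept = np.polyfit(od, known_dose, 1)    # polyfit returns highest degree first
dose_est = slope * od + intercept                   # doses recovered from OD alone
print(float(np.max(np.abs(dose_est - known_dose))))
```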

  11. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, yet none of the earlier quantum money constructions is known to possess it.

  12. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, yet none of the earlier quantum money constructions is known to possess it.

  13. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking for a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. The book outlines both theoretical and practical issues.

  14. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  15. Signatures of topological superconductivity

    Energy Technology Data Exchange (ETDEWEB)

    Peng, Yang

    2017-07-19

    The prediction and experimental discovery of topological insulators brought the importance of topology in condensed matter physics into the limelight. Topology hence acts as a new dimension along which more and more new states of matter start to emerge. One of these topological states of matter, namely topological superconductors, has come into focus because of its gapless excitations. These gapless excitations, especially in one-dimensional topological superconductors, are Majorana zero modes localized at the ends of the superconductor, and they exhibit exotic nonabelian statistics that can potentially be applied to fault-tolerant quantum computation. Given their highly interesting physical properties and potential applications to quantum computation, both theorists and experimentalists have spent great effort to realize topological superconductors and to detect Majoranas. In two projects within this thesis, we investigate properties of Majorana zero modes in realistic materials which are absent in simple theoretical models. We find that the superconducting proximity effect, an essential ingredient in all existing platforms for topological superconductors, plays a significant role in determining the localization property of the Majoranas. Strong proximity coupling between the normal system and the superconducting substrate can lead to strongly localized Majoranas, which can explain the observation in a recent experiment. Motivated by experiments in Molenkamp's group, we also look at realistic quantum spin Hall Josephson junctions, in which charge puddles acting as magnetic impurities are coupled to the helical edge states. We find that with this setup, the junction generically realizes an exotic 8π-periodic Josephson effect, which is absent in a pristine Josephson junction. In another two projects, we propose more pronounced signatures of Majoranas that are accessible with current experimental techniques. The first one is a transport measurement, which uses

  16. Modem Signature Analysis.

    Science.gov (United States)

    1982-10-01

    AD-A127 993, "Modem Signature Analysis," V. Edwards et al., PAR Technology Corp., New Hartford, NY, Oct 82, RADC-TR-82-269, contract F30602-80-C-0264, unclassified. ...as an indication of the class clustering and separation between different classes in the modem data base. It is apparent from the projection that the... that as the clusters disperse, the likelihood of a sample crossing the boundary into an adjacent region and causing a symbol decision error increases. As

  17. Modeling ground vehicle acoustic signatures for analysis and synthesis

    International Nuclear Information System (INIS)

    Haschke, G.; Stanfield, R.

    1995-01-01

    Security and weapon systems use acoustic sensor signals to classify and identify moving ground vehicles. Developing robust signal processing algorithms for this purpose is expensive, particularly in the presence of acoustic clutter or countermeasures. This paper proposes a parametric ground vehicle acoustic signature model to aid the system designer in understanding which signature features are important, developing corresponding feature extraction algorithms, and generating low-cost, high-fidelity synthetic signatures for testing. The authors have proposed computer-generated acoustic signatures of armored, tracked ground vehicles to deceive acoustic-sensor-based smart munitions. They have developed quantitative measures of how accurately a synthetic acoustic signature matches those produced by actual vehicles. This paper describes the parameters of the model used to generate these synthetic signatures and suggests methods for extracting these parameters from signatures of valid vehicle encounters. The model incorporates wide-bandwidth and narrow-bandwidth components that are modulated in a pseudo-random fashion to mimic the time dynamics of valid vehicle signatures. Narrow-bandwidth feature extraction techniques estimate the frequency, amplitude, and phase information contained in a single set of narrow frequency-band harmonics. Wide-bandwidth feature extraction techniques estimate the parameters of a correlated-noise-floor model. Finally, the authors propose a method of modeling the time dynamics of the harmonic amplitudes as a means of adding necessary time-varying features to the narrow-bandwidth signal components. The authors present results of applying this modeling technique to acoustic signatures recorded during encounters with one armored, tracked vehicle. Similar modeling techniques can be applied to security systems
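    A minimal sketch of such a parametric signature generator follows, assuming a harmonic series with randomized amplitudes plus a low-pass-filtered (hence correlated) noise floor. The fundamental frequency, harmonic count, and filter length are illustrative choices, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def synth_signature(f0, n_harm, fs, dur, noise_gain=0.1):
    """Synthetic tracked-vehicle signature: harmonics of an engine fundamental
    with randomized amplitudes, plus a correlated noise floor."""
    t = np.arange(int(fs * dur)) / fs
    sig = np.zeros_like(t)
    for k in range(1, n_harm + 1):
        amp = (1.0 / k) * (1 + 0.3 * rng.standard_normal())  # randomized harmonic amplitude
        sig += amp * np.sin(2 * np.pi * k * f0 * t + rng.uniform(0, 2 * np.pi))
    white = rng.standard_normal(t.size)
    corr = np.convolve(white, np.ones(8) / 8, mode="same")   # low-pass => correlated noise floor
    return sig + noise_gain * corr

x = synth_signature(f0=30.0, n_harm=5, fs=4000, dur=1.0)
spec = np.abs(np.fft.rfft(x))
peak_hz = int(np.argmax(spec[1:]) + 1)  # for a 1 s record the bin index equals Hz
print(peak_hz)
```

The spectral peak lands on one of the harmonics of the 30 Hz fundamental, as a feature extractor for the narrow-bandwidth component would expect.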

  18. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  19. Electronic Signature (eSig)

    Data.gov (United States)

    Department of Veterans Affairs — Beginning with the Government Paperwork Elimination Act of 1998 (GPEA), the Federal government has encouraged the use of electronic / digital signatures to enable...

  20. Expressiveness considerations of XML signatures

    DEFF Research Database (Denmark)

    Jensen, Meiko; Meyer, Christopher

    2011-01-01

    XML Signatures are used to protect XML-based Web Service communication against a broad range of attacks related to man-in-the-middle scenarios. However, due to the complexity of the Web Services specification landscape, the task of applying XML Signatures in a robust and reliable manner becomes more and more challenging. In this paper, we investigate this issue, describing how an attacker can still interfere with Web Services communication even in the presence of XML Signatures. Additionally, we discuss the interrelation of XML Signatures and XML Encryption, focussing on their security...

  1. Electronic Warfare Signature Measurement Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Electronic Warfare Signature Measurement Facility contains specialized mobile spectral, radiometric, and imaging measurement systems to characterize ultraviolet,...

  2. Privacy in wireless sensor networks using ring signature

    Directory of Open Access Journals (Sweden)

    Ashmita Debnath

    2014-07-01

    The veracity of a message from a sensor node must be verified in order to avoid a false reaction by the sink. This verification requires the authentication of the source node. The authentication process must also preserve privacy, such that neither the node nor the sensed object is endangered. In this work, a ring signature was proposed to authenticate the source node while preserving its spatial privacy. However, the other nodes acting as signers, and their number, must be chosen to preclude the possibility of a traffic analysis attack by an adversary. The spatial uncertainty increases with the number of signers but requires larger memory and greater communication overhead, a requirement that can breach the privacy of the sensed object. To determine the effectiveness of the proposed scheme, the adversary's location estimate of a sensor node, and the enhancement in location uncertainty provided by a ring signature, were evaluated. Simulation studies showed that the ring signature requires approximately four members from the same neighbor region as the source node to sustain the privacy of the node. Furthermore, the ring signature was found to have a small overhead and not to adversely affect the performance of the sensor network.
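    The trade-off between ring size and spatial uncertainty can be illustrated with a toy simulation in which an adversary guesses the source location as the centroid of the ring members. The region size and trial count below are arbitrary assumptions; the paper's evaluation is far more detailed.

```python
import random
import math

random.seed(1)

def ring_uncertainty(n_signers, region=100.0, trials=200):
    """Mean distance from the true source to the adversary's best guess
    (the ring centroid) when n_signers nodes from the same region co-sign."""
    total = 0.0
    for _ in range(trials):
        members = [(random.uniform(0, region), random.uniform(0, region))
                   for _ in range(n_signers)]
        src = members[0]  # the true source hides among the ring members
        cx = sum(p[0] for p in members) / n_signers
        cy = sum(p[1] for p in members) / n_signers
        total += math.dist(src, (cx, cy))
    return total / trials

u1, u4 = ring_uncertainty(1), ring_uncertainty(4)
print(u1 < u4)  # a lone signer is located exactly; co-signers blur the estimate
```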

  3. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  4. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  5. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we provide design engineers, who typically rely on experience and experiment during the design and development of SOVs, with a convenient method of design verification. First, we summarize a detailed procedure for designing SOVs for industrial applications: all of the design constraints are defined in the first step of the design, and the detailed design procedure is then presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method of verifying this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe that this verification process is useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.
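    The safety-factor check on the design attraction force can be sketched with the textbook air-gap force relation F = (NI)²·μ₀·A / (2g²) for an ideal core with no fringing. All coil and load numbers below are invented for illustration and are not taken from the paper.

```python
MU0 = 4e-7 * 3.141592653589793  # vacuum permeability, H/m

def attraction_force(n_turns, current, gap, area):
    """Magnetic attraction across an air gap (ideal core, no fringing):
    F = (N*I)^2 * mu0 * A / (2 * g^2), SI units."""
    return (n_turns * current) ** 2 * MU0 * area / (2 * gap ** 2)

required = 5.0  # N, spring preload plus friction the plunger must overcome (assumed)
f = attraction_force(n_turns=800, current=0.5, gap=1e-3, area=1e-4)
print(round(f / required, 2))  # → 2.01, the safety factor of the design attraction force
```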

  6. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The ever-increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, while those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  7. Signature change events: a challenge for quantum gravity?

    International Nuclear Information System (INIS)

    White, Angela; Weinfurtner, Silke; Visser, Matt

    2010-01-01

    Within the framework of either Euclidean (functional integral) quantum gravity or canonical general relativity the signature of the manifold is a priori unconstrained. Furthermore, recent developments in the emergent spacetime programme have led to a physically feasible implementation of (analogue) signature change events. This suggests that it is time to revisit the sometimes controversial topic of signature change in general relativity. Specifically, we shall focus on the behaviour of a quantum field defined on a manifold containing regions of different signature. We emphasize that regardless of the underlying classical theory, there are severe problems associated with any quantum field theory residing on a signature-changing background. (Such as the production of what is naively an infinite number of particles, with an infinite energy density.) We show how the problem of quantum fields exposed to finite regions of Euclidean-signature (Riemannian) geometry has similarities with the quantum barrier penetration problem. Finally we raise the question as to whether signature change transitions could be fully understood and dynamically generated within (modified) classical general relativity, or whether they require the knowledge of a theory of quantum gravity.

  8. Signatures of Mechanosensitive Gating.

    Science.gov (United States)

    Morris, Richard G

    2017-01-10

    The question of how mechanically gated membrane channels open and close is notoriously difficult to address, especially if the protein structure is not available. This perspective highlights the relevance of micropipette-aspirated single-particle tracking-used to obtain a channel's diffusion coefficient, D, as a function of applied membrane tension, σ-as an indirect assay for determining functional behavior in mechanosensitive channels. While ensuring that the protein remains integral to the membrane, such methods can be used to identify not only the gating mechanism of a protein, but also associated physical moduli, such as torsional and dilational rigidity, which correspond to the protein's effective shape change. As an example, three distinct D-versus-σ "signatures" are calculated, corresponding to gating by dilation, gating by tilt, and gating by a combination of both dilation and tilt. Both advantages and disadvantages of the approach are discussed.
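    One way a D-versus-σ signature for gating by dilation might be computed is sketched below, using the Saffman-Delbrück mobility as the diffusion model: the protein radius steps up at the gating tension, which lowers D. The viscosities, radii, and gating tension are assumed values for illustration only, not the perspective's parameters.

```python
import math

def sd_diffusion(radius_nm, eta_m=4e-9, eta_w=1e-3, kT=4.1e-21):
    """Saffman-Delbrück diffusion coefficient (m^2/s) for a membrane cylinder.
    eta_m: membrane surface viscosity (Pa*s*m), eta_w: water viscosity (Pa*s)."""
    a = radius_nm * 1e-9
    lsd = eta_m / (2 * eta_w)  # Saffman-Delbrück length
    return kT / (4 * math.pi * eta_m) * (math.log(lsd / a) - 0.5772)  # 0.5772: Euler's constant

def d_vs_sigma(sigma, sigma_gate=2e-3, r_closed=2.0, r_open=3.5):
    """Gating-by-dilation signature: the radius steps up once tension (N/m)
    exceeds the assumed gating tension sigma_gate."""
    return sd_diffusion(r_open if sigma >= sigma_gate else r_closed)

d_low, d_high = d_vs_sigma(1e-3), d_vs_sigma(4e-3)
print(d_low > d_high)  # dilation slows diffusion above the gating tension
```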

  9. Signatures de l'invisible

    CERN Multimedia

    CERN Press Office. Geneva

    2000-01-01

    "Signatures of the Invisible" is a unique collaboration between contemporary artists and contemporary physicists which has the potential to help redefine the relationship between science and art. "Signatures of the Invisible" is jointly organised by the London Institute, the world's largest college of art and design, and CERN*, the world's leading particle physics laboratory. 12 leading visual artists:

  10. An interpretation of signature inversion

    International Nuclear Information System (INIS)

    Onishi, Naoki; Tajima, Naoki

    1988-01-01

    An interpretation in terms of the cranking model is presented to explain why signature inversion occurs for positive γ of the axially asymmetric deformation parameter and emerges into specific orbitals. By introducing a continuous variable, the eigenvalue equation can be reduced to a one dimensional Schroedinger equation by means of which one can easily understand the cause of signature inversion. (author)

  11. Cell short circuit, preshort signature

    Science.gov (United States)

    Lurie, C.

    1980-01-01

    Short-circuit events observed in ground test simulations of DSCS-3 battery in-orbit operations are analyzed. Voltage signatures appearing in the data preceding the short-circuit event are evaluated. The ground test simulation is briefly described along with performance during reconditioning discharges. Results suggest that a characteristic signature develops prior to a shorting event.

  12. Ship Signature Management System : Functionality

    NARCIS (Netherlands)

    Arciszewski, H.F.R.; Lier, L. van; Meijer, Y.G.S.; Noordkamp, H.W.; Wassenaar, A.S.

    2010-01-01

    A signature of a platform is the manner in which the platform manifests itself to a certain type of sensor and how observable it is when such a sensor is used to detect the platform. Because many military platforms use sensors in different media, it is the total of its different signatures that

  13. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most suitable product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met: - An auditable quality assurance process complying with international development standards shall be developed and maintained. - A process of verification and validation (V and V) shall be implemented. This approach requires writing a report and/or executive summary of the V and V activities and defining a validated domain (the domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available. - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - the equations are solved correctly. The functional verification can be demonstrated through certification or a Quality Assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use.
The functional validation can

  14. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  15. Temporal logic runtime verification of dynamic systems

    CSIR Research Space (South Africa)

    Seotsanyana, M

    2010-07-01

Full Text Available This paper provides a novel framework that automatically and verifiably monitors these systems at runtime. The main aim of the framework is to assist the operator through witnesses and counterexamples that are generated during the execution of the system...

  16. Uncertainty in hydrological signatures for gauged and ungauged catchments

    Science.gov (United States)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed assessing the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
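The Monte Carlo propagation of rating-curve uncertainty into a gauged signature distribution can be sketched as follows. The power-law rating curve, the stage record, and all parameter distributions are invented for illustration; the paper's method samples feasible rating curves conditioned on gaugings.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stage (water level) record, metres; 365 daily values.
stage = rng.uniform(0.2, 2.0, size=365)

# Power-law rating curve Q = a * h**b. Sampling many feasible (a, b)
# pairs stands in for Monte Carlo sampling of multiple rating curves.
a = rng.normal(5.0, 0.5, size=1000)
b = rng.normal(1.6, 0.1, size=1000)

# One signature value (here: mean flow) per sampled rating curve gives
# the gauged signature uncertainty distribution.
signature = np.array([(ai * stage**bi).mean() for ai, bi in zip(a, b)])

lo, hi = np.percentile(signature, [2.5, 97.5])
print(f"mean-flow signature, 95% interval: [{lo:.2f}, {hi:.2f}] m3/s")
```

The same loop works for any signature function of the discharge series (e.g., autocorrelation or a low-flow quantile); only the per-curve statistic changes.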

  17. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.
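One common form of the eigenvalue comparison mentioned above is the reactivity difference between a calculated and a reference multiplication factor, quoted in pcm. A sketch with made-up k-effective values (not taken from the Shift validation):

```python
def reactivity_diff_pcm(k_calc: float, k_ref: float) -> float:
    """Reactivity difference between two multiplication factors,
    in pcm (per cent mille): 1e5 * (1/k_ref - 1/k_calc)."""
    return 1e5 * (1.0 / k_ref - 1.0 / k_calc)

# Hypothetical values: code prediction vs. a measured critical state.
print(round(reactivity_diff_pcm(1.00120, 1.00000), 1))  # 119.9
```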

  18. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint based identification is becoming more popular. The most widely used fingerprint representation is the minutiae based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter bank based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
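A filter-bank representation captures local ridge structure as response energies of oriented Gabor filters rather than as minutiae points. A crude numpy-only sketch of the idea; the filter parameters and the random test image are illustrative, not the study's actual configuration:

```python
import numpy as np

def gabor_kernel(theta, freq=0.1, sigma=4.0, size=33):
    """Real Gabor filter tuned to ridge orientation theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def ridge_features(image, n_orientations=8):
    """Filter-bank feature vector: response energy at each orientation.
    Circular convolution via the FFT keeps this numpy-only."""
    F = np.fft.fft2(image)
    feats = []
    for k in range(n_orientations):
        K = np.fft.fft2(gabor_kernel(np.pi * k / n_orientations), s=image.shape)
        resp = np.real(np.fft.ifft2(F * K))
        feats.append(resp.std())      # "ridge energy" at this orientation
    return np.array(feats)

# Matching reduces to a distance between fixed-length feature vectors.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
d = np.linalg.norm(ridge_features(img) - ridge_features(img))
print(d)  # 0.0 for identical prints
```

Because the feature vector has fixed length, matching avoids the point-correspondence problem that makes unregistered minutiae sets slow to compare.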

  19. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  20. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

1. Six years of experience in the planning and verification of dynamic IMRT with portal dosimetry; Seis anos de experiencia en la planificacion y verificacion de la IMRT dinamica con portal dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Molina Lopez, M. Y.; Pardo Perez, E.; Ruiz Maqueda, S.; Castro Novais, J.; Diaz Gavela, A. A.

    2013-07-01

The objective of this study is to review the IMRT verification method used during the six years of operation of the radiophysics and radiation protection service, analyzing the per-field evaluation parameters for the 718 IMRT treatments performed during this period. (Author)

  2. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  3. Unusual ISS Rate Signature

    Science.gov (United States)

    Laible, Michael R.

    2011-01-01

On November 23, 2011, International Space Station Guidance, Navigation, and Control reported an unusual pitch rate disturbance. These disturbances were an order of magnitude greater than nominal rates. The Loads and Dynamics team was asked to review and analyze current accelerometer data to investigate this disturbance. This paper covers the investigation process undertaken by the Loads and Dynamics group, detailing the accelerometers used and the analysis performed. The analysis included a Fourier transform of the data to identify the mode of interest. The frequency data were then compared with a modal analysis of the ISS system model. Once this analysis was complete and the disturbance quantified, a forcing function was produced to replicate the disturbance, allowing the Loads and Dynamics team to report the load limit values for the hundreds of interfaces on the ISS.
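The frequency-identification step (Fourier transform of accelerometer data, then peak-picking the dominant mode) can be sketched as below. The sampling rate, record length, and the 0.6 Hz mode are invented for illustration and are not the actual ISS telemetry parameters.

```python
import numpy as np

fs = 100.0                      # hypothetical accelerometer sample rate, Hz
t = np.arange(0, 60, 1 / fs)    # 60 s record

# Hypothetical signal: one structural mode at 0.6 Hz buried in noise.
rng = np.random.default_rng(0)
accel = 0.02 * np.sin(2 * np.pi * 0.6 * t) + 0.005 * rng.normal(size=t.size)

# Window, transform, and locate the dominant spectral peak (skip DC).
spec = np.abs(np.fft.rfft(accel * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
peak = freqs[np.argmax(spec[1:]) + 1]
print(f"dominant mode near {peak:.2f} Hz")
```

The identified frequency is then compared against the modes predicted by the system's modal analysis, as the abstract describes.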

  4. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code be evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
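A strong-sense verification benchmark in miniature: solve a problem with a known exact solution and confirm that the observed convergence order matches the scheme's theoretical order. This sketch uses a second-order finite-difference Poisson solver; it is an illustration of the methodology, not one of the suite's actual problems.

```python
import numpy as np

def poisson_error(n):
    """Max-norm error of a 2nd-order finite-difference solve of
    -u'' = pi^2 sin(pi x) on (0,1), u(0)=u(1)=0, exact u = sin(pi x)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)
    A = (np.diag(2 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x))
    return np.max(np.abs(u - np.sin(np.pi * x)))

# Halve the mesh spacing; a correct 2nd-order code halves the error
# twice over, so the observed order log2(e1/e2) should be close to 2.
e1, e2 = poisson_error(40), poisson_error(80)
order = np.log2(e1 / e2)
print(f"observed convergence order: {order:.2f}")
```

An acceptance criterion of the kind item (iii) describes might be, for example, that the observed order lie within a stated tolerance of the theoretical order.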

  5. Quantum Digital Signatures for Unconditional Safe Authenticity Protection of Medical Documentation

    Directory of Open Access Journals (Sweden)

    Arkadiusz Liber

    2015-12-01

Full Text Available Modern medical documentation appears most often in an online form which requires some digital methods to ensure its confidentiality, integrity and authenticity. The document authenticity may be secured with the use of a signature. A classical handwritten signature is directly related to its owner by his/her psychomotor character traits. Such a signature is also connected with the material it is written on, and a writing tool. Because of these properties, a handwritten signature reflects certain close material bonds between the owner and the document. In the case of modern digital signatures, the document authentication has a mathematical nature. The verification of the authenticity becomes the verification of a key instead of a human. Since 1994 it has been known that classical digital signature algorithms may not be safe because of Shor's factorization algorithm. To implement modern authenticity protection of medical data, some new types of algorithms should be used. One group of such algorithms is based on quantum computations. In this paper, the current state of knowledge of Quantum Digital Signature protocols is outlined, with their basic principles, phases and common elements such as transmission, comparison and encryption. Some of the most promising protocols for signing digital medical documentation that fulfill the requirements for QDS are also briefly described. We show that a QDS protocol with QKD components requires, for its implementation, equipment similar to that used for QKD, which is already commercially available. If properly implemented, it provides the shortest lifetime of qubits in comparison to other protocols. It can be used not only to sign classical messages but could probably also be adapted to implement unconditionally safe protection of medical documentation in the near future.
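The point that Shor's factorization algorithm undermines classical digital signatures can be seen in a toy RSA signature: the private key falls straight out of the factorization of the modulus. A deliberately insecure sketch (tiny primes, no padding; never use anything like this in practice):

```python
import hashlib

p, q = 61, 53                 # toy primes; real keys use 2048+ bit moduli
n, phi = p * q, (p - 1) * (q - 1)
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent: trivial once n is factored

def sign(msg: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(h, d, n)

def verify(msg: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(sig, e, n) == h

s = sign(b"patient record 42")
print(verify(b"patient record 42", s))   # True
verify(b"tampered record", s)            # almost surely False
```

Anyone who factors n into p and q can recompute d exactly as above, which is why a large-scale quantum computer running Shor's algorithm would forge such signatures, and why the QDS schemes surveyed here avoid relying on factoring.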

  6. Independent Verification Survey Report For Zone 1 Of The East Tennessee Technology Park In Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    King, David A.

    2012-01-01

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs)

  7. Initial Semantics for Strengthened Signatures

    Directory of Open Access Journals (Sweden)

    André Hirschowitz

    2012-02-01

Full Text Available We give a new general definition of arity, yielding the companion notions of signature and associated syntax. This setting is modular in the sense requested by Ghani and Uustalu: merging two extensions of syntax corresponds to building an amalgamated sum. These signatures are too general in the sense that we are not able to prove the existence of an associated syntax in this general context. So we have to select arities and signatures for which there exists the desired initial monad. For this, we follow a track opened by Matthes and Uustalu: we introduce a notion of strengthened arity and prove that the corresponding signatures have initial semantics (i.e. associated syntax). Our strengthened arities admit colimits, which allows the treatment of the λ-calculus with explicit substitution.

  8. Magnetic Signature Analysis & Validation System

    National Research Council Canada - National Science Library

    Vliet, Scott

    2001-01-01

The Magnetic Signature Analysis and Validation (MAGSAV) System is a mobile platform that is used to measure, record, and analyze the perturbations to the earth's ambient magnetic field caused by objects such as armored vehicles...

  9. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. The authors wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, they wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas, reducing the need for personnel to enter the vault. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, the EIVSystem continually monitors the vault, providing proof of changed status for objects stored within it. These data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help define the target area of an inventory when change has been shown to occur
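The camera-driven change detection such a system relies on can be approximated by simple frame differencing. A sketch with illustrative thresholds and a synthetic 8-bit vault image; the actual EIVSystem algorithm is not described in the abstract:

```python
import numpy as np

def scene_changed(ref, cur, pix_thresh=30, frac_thresh=0.01):
    """Flag the vault view as changed when more than frac_thresh of
    pixels differ from the reference frame by more than pix_thresh
    grey levels (8-bit grayscale images assumed)."""
    diff = np.abs(cur.astype(int) - ref.astype(int))
    return float((diff > pix_thresh).mean()) > frac_thresh

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (120, 160), dtype=np.uint8)  # reference frame
moved = ref.copy()
moved[40:80, 60:100] = 0          # an object "removed" from the shelf
print(scene_changed(ref, ref), scene_changed(ref, moved))  # False True
```

A "no change" result over an inventory period is exactly the evidence the abstract suggests could justify relaxed physical inventory requirements.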

  10. Woodward Effect Experimental Verifications

    Science.gov (United States)

    March, Paul

    2004-02-01

The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT as well as gravitational/inertial Wheeler-Feynman radiation reaction forces hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations or ``Woodward effect'' (W-E). Later, in collaboration with his former graduate student T. Mahood, Woodward pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  11. Verification of hypergraph states

    Science.gov (United States)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.
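The generalized controlled-Z on a hyperedge can be made concrete with a small state-vector calculation: applying CCZ on the hyperedge {0,1,2} to the product state |+>^3 yields a three-qubit hypergraph state. A numpy sketch:

```python
import numpy as np

n = 3
state = np.ones(2**n) / np.sqrt(2**n)   # |+>^n: uniform superposition

def apply_generalized_cz(state, hyperedge, n):
    """Flip the sign of every basis amplitude in which all qubits of
    the hyperedge are 1 (for a 3-qubit hyperedge this is the CCZ gate)."""
    out = state.copy()
    for idx in range(len(out)):
        bits = [(idx >> (n - 1 - q)) & 1 for q in range(n)]
        if all(bits[q] for q in hyperedge):
            out[idx] = -out[idx]
    return out

hyper = apply_generalized_cz(state, (0, 1, 2), n)
# All amplitudes stay +1/sqrt(8) except |111>, which becomes -1/sqrt(8).
print(hyper * np.sqrt(8))
```

An ordinary graph state uses only two-qubit edges in `hyperedge`; allowing larger hyperedges is exactly the generalization the abstract describes.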

  12. Verification of classified fissile material using unclassified attributes

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-01-01

    This paper reports on the most recent efforts of US technical experts to explore verification by IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. They believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated

  13. 21 CFR 11.50 - Signature manifestations.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Signature manifestations. 11.50 Section 11.50 Food... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.50 Signature manifestations. (a) Signed electronic...: (1) The printed name of the signer; (2) The date and time when the signature was executed; and (3...

  14. 76 FR 30542 - Adult Signature Services

    Science.gov (United States)

    2011-05-26

    ... POSTAL SERVICE 39 CFR Part 111 Adult Signature Services AGENCY: Postal Service\\TM\\. ACTION: Final..., Domestic Mail Manual (DMM[supreg]) 503.8, to add a new extra service called Adult Signature. This new service has two available options: Adult Signature Required and Adult Signature Restricted Delivery. DATES...

  15. 1 CFR 18.7 - Signature.

    Science.gov (United States)

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Signature. 18.7 Section 18.7 General Provisions... PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.7 Signature. The original and each duplicate original... stamped beneath the signature. Initialed or impressed signatures will not be accepted. Documents submitted...

  16. Attribute-Based Digital Signature System

    NARCIS (Netherlands)

    Ibraimi, L.; Asim, Muhammad; Petkovic, M.

    2011-01-01

    An attribute-based digital signature system comprises a signature generation unit (1) for signing a message (m) by generating a signature (s) based on a user secret key (SK) associated with a set of user attributes, wherein the signature generation unit (1) is arranged for combining the user secret

  17. Quantum messages with signatures forgeable in arbitrated quantum signature schemes

    International Nuclear Information System (INIS)

    Kim, Taewan; Choi, Jeong Woon; Jho, Nam-Su; Lee, Soojoon

    2015-01-01

    Even though a method to perfectly sign quantum messages has not been known, the arbitrated quantum signature scheme has been considered as one of the good candidates. However, its forgery problem has been an obstacle to the scheme becoming a successful method. In this paper, we consider one situation, which is slightly different from the forgery problem, that we use to check whether at least one quantum message with signature can be forged in a given scheme, although all the messages cannot be forged. If there are only a finite number of forgeable quantum messages in the scheme, then the scheme can be secured against the forgery attack by not sending forgeable quantum messages, and so our situation does not directly imply that we check whether the scheme is secure against the attack. However, if users run a given scheme without any consideration of forgeable quantum messages, then a sender might transmit such forgeable messages to a receiver and in such a case an attacker can forge the messages if the attacker knows them. Thus it is important and necessary to look into forgeable quantum messages. We show here that there always exists such a forgeable quantum message-signature pair for every known scheme with quantum encryption and rotation, and numerically show that there are no forgeable quantum message-signature pairs that exist in an arbitrated quantum signature scheme. (paper)

  18. SIGNATURE: A workbench for gene expression signature analysis

    Directory of Open Access Journals (Sweden)

    Chang Jeffrey T

    2011-11-01

    Full Text Available Abstract Background The biological phenotype of a cell, such as a characteristic visual image or behavior, reflects activities derived from the expression of collections of genes. As such, an ability to measure the expression of these genes provides an opportunity to develop more precise and varied sets of phenotypes. However, to use this approach requires computational methods that are difficult to implement and apply, and thus there is a critical need for intelligent software tools that can reduce the technical burden of the analysis. Tools for gene expression analyses are unusually difficult to implement in a user-friendly way because their application requires a combination of biological data curation, statistical computational methods, and database expertise. Results We have developed SIGNATURE, a web-based resource that simplifies gene expression signature analysis by providing software, data, and protocols to perform the analysis successfully. This resource uses Bayesian methods for processing gene expression data coupled with a curated database of gene expression signatures, all carried out within a GenePattern web interface for easy use and access. Conclusions SIGNATURE is available for public use at http://genepattern.genome.duke.edu/signature/.

  19. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies-the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW)-share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States-India, Pakistan and Israel-from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. 
The world is

  20. Implementation of QR Code and Digital Signature to Determine the Validity of KRS and KHS Documents

    Directory of Open Access Journals (Sweden)

    Fatich Fazlur Rochman

    2017-05-01

Full Text Available Universitas Airlangga students often find it difficult to verify the grades that appear on the Kartu Hasil Studi (KHS, Study Result Card) or the courses listed on the Kartu Rencana Studi (KRS, Study Plan Card) when the data in the university's system change. This complicated verification process arises because the printed KRS and KHS documents held by students are easier to counterfeit than the data in the system. Implementing digital signature and QR Code technology offers a solution that can prove the validity of a KRS or KHS. The KRS and KHS validation system was developed with a digital signature and a QR Code. A QR Code is a type of matrix code developed to allow its contents to be decoded at high speed, while the digital signature functions as a marker on the data to ensure that the data are authentic. The verification process is divided into two types: reading the digital signature and validating a printed document by scanning the data from its QR Code. Applying the system required adding the QR Code to the KRS and KHS, together with a corresponding readiness of human resources. 
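The sign-then-encode workflow the paper describes can be sketched as below. HMAC stands in here for the paper's public-key digital signature (a reasonable simplification when the university both issues and verifies the documents), and the key and all field names are illustrative:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"registrar-signing-key"   # hypothetical key held by the registrar

def make_qr_payload(record: dict) -> str:
    """Serialize the record, append an authentication tag, and return
    the string that would be encoded into the printed QR Code."""
    body = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(body).decode() + "." + tag

def verify_qr_payload(payload: str):
    """Return (valid, record) for a scanned QR Code payload."""
    b64, tag = payload.rsplit(".", 1)
    body = base64.urlsafe_b64decode(b64.encode())
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected), json.loads(body)

payload = make_qr_payload({"student_id": "12345", "course": "IF184", "grade": "A"})
ok, record = verify_qr_payload(payload)
tampered = payload[:-1] + ("0" if payload[-1] != "0" else "1")
print(ok, verify_qr_payload(tampered)[0])   # True False
```

This is why the printed document becomes as trustworthy as the system data: altering any signed field invalidates the tag, so a forged KHS fails the scan even though the paper itself is easy to reproduce.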

  1. Signature molecular descriptor : advanced applications.

    Energy Technology Data Exchange (ETDEWEB)

    Visco, Donald Patrick, Jr. (Tennessee Technological University, Cookeville, TN)

    2010-04-01

In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in high-throughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Also notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area, with serendipity as the mechanism for solution. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge into a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the molecule which is designed might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest on a given problem as well as how to go from an output of the algorithm to an actual chemical structure. This report

  2. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This is a living document that will track CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  3. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources, discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, identifying limitations and vulnerabilities along the way. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  4. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  5. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic losses. Furthermore, a malfunction in the control system of a surgical robot may cause the death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  6. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  7. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of an IMRT (intensity-modulated radiation therapy) plan used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (a multileaf intensity-modulating collimator), and the overall process of verifying the created plan. The aim of verification is, in particular, good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  8. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  9. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
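    The rule-based monitoring idea can be illustrated with a minimal sketch (not the paper's implementation): Horn clauses in implication form, encoded as (premises, conclusion) pairs, are applied by forward chaining to a fixpoint after each event of an expanding trace. The monitored property here is a hypothetical "every open must be followed by a close before the trace ends" obligation.

```python
# Illustrative forward-chaining monitor; rules and event names are
# hypothetical, chosen only to demonstrate the technique.

RULES = [
    ({"open"}, "pending_close"),                    # open raises an obligation
    ({"pending_close", "close"}, "ok"),             # close discharges it
    ({"pending_close", "end"}, "violation"),        # trace ends while pending
]

def forward_chain(facts):
    """Apply the rules repeatedly until no new fact can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def monitor(trace):
    """Consume events one by one; return a finite-trace verdict."""
    facts = set()
    for event in trace:
        step = forward_chain(facts | {event})
        if "violation" in step:
            return "violation"
        if "ok" in step:
            facts = set()  # obligation discharged; reset transient facts
        else:
            facts = {f for f in step if f == "pending_close"}
    # finite-trace semantics: an undischarged obligation is a violation
    return "violation" if "pending_close" in facts else "ok"
```

Because the rule set is fixed and the derived facts form a single flat state, the monitor's work per event is bounded, mirroring the single-state, fixed-rule-count property the abstract highlights.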

  10. MARATHON Verification (MARV)

    Science.gov (United States)

    2017-08-01

    variable business rules, unique entity behaviors, and extensive supply-demand relations. Senior leaders’ demand for animation and interactive...simulations provide dynamic animation / visualization / run-time plots. M4 has additional automated testing facilities thanks to the clojure.test library...marathon.vnv Clojure scripts) and software-assisted interactive forensic analysis (via Excel workbooks and the Clojure Read-Evaluate-Print Loop [REPL]) to

  11. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  12. Five Guidelines for Selecting Hydrological Signatures

    Science.gov (United States)

    McMillan, H. K.; Westerberg, I.; Branger, F.

    2017-12-01

    Hydrological signatures are index values derived from observed or modeled series of hydrological data such as rainfall, flow or soil moisture. They are designed to extract relevant information about hydrological behavior, for example to identify dominant processes and to determine the strength, speed and spatiotemporal variability of the rainfall-runoff response. Hydrological signatures play an important role in model evaluation. They allow us to test whether particular model structures or parameter sets accurately reproduce the runoff generation processes within the watershed of interest. Most modeling studies use a selection of different signatures to capture different aspects of the catchment response, for example evaluating the overall flow distribution as well as high- and low-flow extremes and flow timing. Such studies often choose their own set of signatures, or may borrow subsets of signatures used in multiple other works. The link between signature values and hydrological processes is not always straightforward, leading to uncertainty and variability in hydrologists' signature choices. In this presentation, we aim to encourage a more rigorous approach to hydrological signature selection, one that considers the ability of signatures to represent hydrological behavior and underlying processes for the catchment and application in question. To this end, we propose a set of guidelines for selecting hydrological signatures. We describe five criteria that any hydrological signature should conform to: Identifiability, Robustness, Consistency, Representativeness, and Discriminatory Power. We describe an example of the design process for a signature, assessing possible signature designs against the guidelines above. Due to their ubiquity, we chose a signature related to the Flow Duration Curve, selecting the FDC mid-section slope as a proposed signature to quantify overall catchment behavior and flashiness. We demonstrate how assessment against each guideline could be used to
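    As an illustration of the proposed signature, the FDC mid-section slope can be computed as the slope of the log-transformed flow duration curve between two exceedance probabilities. The 33%/66% bounds below are a common choice in the hydrological literature, not a prescription from this abstract:

```python
import numpy as np

def fdc_midslope(flows, lo=0.33, hi=0.66):
    """Mid-section slope of the flow duration curve: the slope of log-flow
    between two exceedance probabilities. Steep slopes indicate flashy
    catchments; flat slopes indicate damped, storage-dominated response."""
    q = np.sort(np.asarray(flows, dtype=float))[::-1]   # descending flows
    n = len(q)
    # Weibull plotting position: exceedance probability of each sorted flow
    p = np.arange(1, n + 1) / (n + 1)
    q_lo = np.interp(lo, p, q)   # flow exceeded `lo` fraction of the time
    q_hi = np.interp(hi, p, q)   # flow exceeded `hi` fraction of the time
    return (np.log(q_lo) - np.log(q_hi)) / (hi - lo)
```

A nearly constant flow series yields a slope near zero, while a strongly skewed series yields a large positive slope, which is the flashiness contrast the signature is meant to capture.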

  13. Persistence of social signatures in human communication.

    Science.gov (United States)

    Saramäki, Jari; Leicht, E A; López, Eduardo; Roberts, Sam G B; Reed-Tsochas, Felix; Dunbar, Robin I M

    2014-01-21

    The social network maintained by a focal individual, or ego, is intrinsically dynamic and typically exhibits some turnover in membership over time as personal circumstances change. However, the consequences of such changes on the distribution of an ego's network ties are not well understood. Here we use a unique 18-month dataset that combines mobile phone calls and survey data to track changes in the ego networks and communication patterns of students making the transition from school to university or work. Our analysis reveals that individuals display a distinctive and robust social signature, captured by how interactions are distributed across different alters. Notably, for a given ego, these social signatures tend to persist over time, despite considerable turnover in the identity of alters in the ego network. Thus, as new network members are added, some old network members either are replaced or receive fewer calls, preserving the overall distribution of calls across network members. This is likely to reflect the consequences of finite resources such as the time available for communication, the cognitive and emotional effort required to sustain close relationships, and the ability to make emotional investments.
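    The notion of a social signature can be made concrete with a small sketch (illustrative, not the study's actual pipeline): rank alters by call count, normalize to obtain a distribution over ranks, and compare the distributions from two periods with the Jensen-Shannon divergence, which is zero when the shape persists even if the alters themselves changed.

```python
from collections import Counter
import math

def social_signature(calls, n_ranks=5):
    """Fraction of calls to each alter, ranked most- to least-contacted,
    truncated/padded to n_ranks so signatures from different periods are
    comparable even when the alters themselves turn over."""
    counts = sorted(Counter(calls).values(), reverse=True)[:n_ranks]
    counts += [0] * (n_ranks - len(counts))
    total = sum(counts) or 1
    return [c / total for c in counts]

def jsd(p, q):
    """Jensen-Shannon divergence: symmetric, bounded measure of how much
    two rank distributions differ."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    def kl(x, y):
        return sum(a * math.log2(a / b) for a, b in zip(x, y) if a > 0)
    return (kl(p, m) + kl(q, m)) / 2
```

Two periods with entirely different alters but the same call-rank shape give a divergence of zero, matching the persistence result described above.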

  14. Corticosteroid receptors adopt distinct cyclical transcriptional signatures.

    Science.gov (United States)

    Le Billan, Florian; Amazit, Larbi; Bleakley, Kevin; Xue, Qiong-Yao; Pussard, Eric; Lhadj, Christophe; Kolkhof, Peter; Viengchareun, Say; Fagart, Jérôme; Lombès, Marc

    2018-05-07

    Mineralocorticoid receptors (MRs) and glucocorticoid receptors (GRs) are two closely related hormone-activated transcription factors that regulate major pathophysiologic functions. High homology between these receptors accounts for the cross-binding of their corresponding ligands, MR being activated by both aldosterone and cortisol and GR essentially activated by cortisol. Their coexpression and ability to bind similar DNA motifs highlight the need to investigate their respective contributions to overall corticosteroid signaling. Here, we decipher the transcriptional regulatory mechanisms that underlie selective effects of MRs and GRs on shared genomic targets in a human renal cellular model. Kinetic, serial, and sequential chromatin immunoprecipitation approaches were performed on the period circadian protein 1 (PER1) target gene, providing evidence that both receptors dynamically and cyclically interact at the same target promoter in a specific and distinct transcriptional signature. During this process, both receptors regulate the PER1 gene by binding as homo- or heterodimers to the same promoter region. Our results suggest a novel level of MR-GR target gene regulation, which should be considered for a better and integrated understanding of corticosteroid-related pathophysiology.

  15. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.
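    The likelihood-ratio test described above can be sketched for diagonal-covariance Gaussian user and background models. The Gaussian form is an illustrative model choice; the paper works with general fixed-length feature vectors:

```python
import numpy as np

def log_gauss(x, mean, var):
    """Log density of a diagonal-covariance Gaussian at feature vector x."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def verify(x, user_model, background_model, threshold=0.0):
    """Accept iff the log-likelihood ratio of the claimed user's model
    against the background (impostor) model exceeds the threshold."""
    llr = log_gauss(x, *user_model) - log_gauss(x, *background_model)
    return llr > threshold, llr
```

Sweeping the threshold trades false accepts against false rejects; the optimality result in the abstract says no other similarity measure on the same feature vectors can do better at any such operating point.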

  16. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  17. Digital Signature Schemes with Complementary Functionality and Applications

    OpenAIRE

    S. N. Kyazhin

    2012-01-01

    Digital signature schemes with additional functionality (an undeniable signature, a designated-confirmer signature, a blind signature, a group signature, a signature with additional protection) and examples of their application are considered. These schemes are more practical, effective and useful than ordinary digital signature schemes.

  18. Hybrid Control and Verification of a Pulsed Welding Process

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Larsen, Jesper Abildgaard; Izadi-Zamanabadi, Roozbeh

    Systems that we wish to control are becoming more and more complex, and classical control theory objectives, such as stability or sensitivity, are often not sufficient to cover their control objectives. In this paper it is shown how the dynamics of a pulsed welding process can be reformulated into a timed automaton hybrid setting, and subsequently properties such as reachability and deadlock absence are verified by the simulation and verification tool UPPAAL.

  19. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  20. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  1. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  2. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  3. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety-critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting, whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designers. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  4. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy and was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible, they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  5. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  6. Characterizing Resident Space Object Earthshine Signature Variability

    Science.gov (United States)

    Van Cor, Jared D.

    There are three major sources of illumination on objects in the near-Earth space environment: sunshine, moonshine, and Earthshine. For objects in this environment (satellites, orbital debris, etc.) known as Resident Space Objects (RSOs), the sun and the moon have consistently small illuminating solid angles and can be treated as point sources; this makes their incident illumination easily modeled. The Earth, on the other hand, has a large illuminating solid angle, is heterogeneous, and is in a constant state of change. The objective of this thesis was to characterize the impact and variability of observed RSO Earthshine on apparent magnitude signatures in the visible optical spectral region. A key component of this research was creating Earth object models incorporating the reflectance properties of the Earth. Two Earth objects were created: a homogeneous diffuse Earth object and a time-sensitive heterogeneous Earth object. The homogeneous diffuse Earth object has a reflectance equal to the average global albedo, a standard model used when modeling Earthshine. The time-sensitive heterogeneous Earth object was created with two material maps representative of the dynamic reflectance of the surface of the Earth, and a shell representative of the atmosphere. NASA's Moderate-resolution Imaging Spectroradiometer (MODIS) Earth-observing satellite product libraries, the MCD43C1 global surface BRDF map and the MOD06 global fractional cloud map, were utilized to create the material maps, and a hybridized version of the Empirical Line Method (ELM) was used to create the atmosphere. This dynamic Earth object was validated by comparing simulated color imagery of the Earth to that taken by NASA's Earth Polychromatic Imaging Camera (EPIC) located on the Deep Space Climate Observatory (DSCOVR), and by MODIS located on the Terra satellite. The time-sensitive heterogeneous Earth object deviated from MODIS imagery by a spectral radiance root mean square error (RMSE) of ±14.86 W/(m²·sr)

  7. Signature Pedagogy in Theatre Arts

    Science.gov (United States)

    Kornetsky, Lisa

    2017-01-01

    Critique in undergraduate theatre programs is at the heart of training actors at all levels. It is accepted as the signature pedagogy and is practiced in multiple ways. This essay defines critique and presents the case for why it is used as the single most important way that performers come to understand the language, values, and discourse of the…

  8. Motif signatures of transcribed enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios

    2017-09-14

    In mammalian cells, transcribed enhancers (TrEn) play important roles in the initiation of gene expression and the maintenance of gene expression levels in a spatiotemporal manner. One of the most challenging questions in biology today is how the genomic characteristics of enhancers relate to enhancer activities. This is particularly critical, as several recent studies have linked enhancer sequence motifs to specific functional roles. To date, only a limited number of enhancer sequence characteristics have been investigated, leaving space for exploring the enhancers' genomic code in a more systematic way. To address this problem, we developed a novel computational method, TELS, aimed at identifying predictive cell type/tissue-specific motif signatures. We used TELS to compile a comprehensive catalog of motif signatures for all known TrEn identified by the FANTOM5 consortium across 112 human primary cells and tissues. Our results confirm that distinct cell type/tissue-specific motif signatures characterize TrEn. These signatures successfully discriminate a) TrEn from random controls, a proxy of non-enhancer activity, and b) cell type/tissue-specific TrEn from enhancers expressed and transcribed in different cell types/tissues. TELS codes and datasets are publicly available at http://www.cbrc.kaust.edu.sa/TELS.
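    A toy analogue of motif-signature discovery, far simpler than TELS and offered only as a crude stand-in, ranks k-mers by the difference of their mean frequencies between enhancer-like and control sequences:

```python
from collections import Counter

def kmer_profile(seq, k=3):
    """Normalized k-mer frequency vector for one sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values()) or 1
    return {kmer: c / total for kmer, c in counts.items()}

def discriminative_kmers(positives, negatives, k=3, top=5):
    """Rank k-mers by mean frequency difference between two sequence
    classes -- a stand-in for a predictive motif-signature score."""
    def mean_profile(seqs):
        acc = Counter()
        for s in seqs:
            acc.update(kmer_profile(s, k))
        return {m: v / len(seqs) for m, v in acc.items()}
    pos, neg = mean_profile(positives), mean_profile(negatives)
    keys = set(pos) | set(neg)
    return sorted(keys, key=lambda m: pos.get(m, 0) - neg.get(m, 0),
                  reverse=True)[:top]
```

Real motif-signature methods work with curated motif models and statistical significance rather than raw frequency differences, but the input/output shape, sequences in, ranked discriminative patterns out, is the same.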

  9. Quark-Gluon Plasma Signatures

    CERN Document Server

    Vogt, Ramona

    1998-01-01

    Aspects of quark-gluon plasma signatures that can be measured by CMS are discussed. First the initial conditions of the system from minijet production are introduced, including shadowing effects. Color screening of the Upsilon family is then presented, followed by energy loss effects on charm and bottom hadrons, high-pT jets and global observables.

  10. Galaxy interactions : The HI signature

    NARCIS (Netherlands)

    Sancisi, R; Barnes, JE; Sanders, DB

    1999-01-01

    HI observations are an excellent tool for investigating tidal interactions. Ongoing major and minor interactions, which can lead to traumatic mergers or to accretion and the triggering of star formation, show distinct HI signatures. Interactions and mergers in the recent past can also be recognized.

  11. STAR-CCM+ Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-30

    The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general-purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light water reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multiphase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multiphase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.

  12. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the verification and validation (VV) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), in the development of the Korea Nuclear Instrumentation and Control System (KNICS). The SVV criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028 and BTP-14, and they have been considered as the acceptance framework to be provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), formal verification and theorem proving, and automated testing, are applied to the safety software, and automated SVV tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; the New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and the Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA), and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase

  13. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.

  14. Distinguishing signatures of determinism and stochasticity in spiking complex systems

    Science.gov (United States)

    Aragoneses, Andrés; Rubido, Nicolás; Tiana-Alsina, Jordi; Torrent, M. C.; Masoller, Cristina

    2013-01-01

    We describe a method to infer signatures of determinism and stochasticity in the sequence of apparently random intensity dropouts emitted by a semiconductor laser with optical feedback. The method uses ordinal time-series analysis to classify experimental data of inter-dropout-intervals (IDIs) in two categories that display statistically significant different features. Despite the apparent randomness of the dropout events, one IDI category is consistent with waiting times in a resting state until noise triggers a dropout, and the other is consistent with dropouts occurring during the return to the resting state, which have a clear deterministic component. The method we describe can be a powerful tool for inferring signatures of determinism in the dynamics of complex systems in noisy environments, at an event-level description of their dynamics.
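    Ordinal time-series analysis of inter-event intervals can be sketched as follows (an illustrative recipe, not the authors' exact classifier): each window of consecutive intervals is mapped to the permutation giving its rank order, and the resulting pattern histogram reveals determinism through missing or over-represented patterns.

```python
from collections import Counter
from itertools import permutations

def ordinal_patterns(series, order=3):
    """Map each overlapping window of `order` consecutive values to the
    permutation describing their rank order, and return the normalized
    histogram over all order! possible patterns."""
    counts = Counter(
        tuple(sorted(range(order), key=lambda i: window[i]))
        for window in zip(*(series[i:] for i in range(order)))
    )
    total = sum(counts.values())
    return {p: counts.get(p, 0) / total for p in permutations(range(order))}
```

For a purely stochastic interval sequence the histogram is close to flat over all patterns, whereas forbidden or strongly over-represented patterns, as in the IDI category with a clear deterministic component, signal determinism.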

  15. Unsupervised signature extraction from forensic logs

    NARCIS (Netherlands)

    Thaler, S.M.; Menkovski, V.; Petkovic, M.; Altun, Y.; Das, K.; Mielikäinen, T.; Malerba, D.; Stefanowski, J.; Read, J.; Žitnik, M.; Ceci, M.

    2017-01-01

    Signature extraction is a key part of forensic log analysis. It involves recognizing patterns in log lines such that log lines that originated from the same line of code are grouped together. A log signature consists of immutable parts and mutable parts. The immutable parts define the signature, and
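    The immutable/mutable split described above can be sketched with a deliberately naive heuristic that masks hex and digit runs as the mutable parts (an assumption for illustration; the paper's unsupervised method learns the split rather than hard-coding it):

```python
import re
from collections import defaultdict

def log_signature(line):
    """Naive signature: mask hex and digit runs (mutable parts),
    keeping the remaining text as the immutable template."""
    line = re.sub(r'0x[0-9a-fA-F]+', '<HEX>', line)
    return re.sub(r'\d+', '<NUM>', line)

def group_by_signature(lines):
    """Group log lines that collapse to the same signature."""
    groups = defaultdict(list)
    for line in lines:
        groups[log_signature(line)].append(line)
    return dict(groups)

logs = [
    "connection from 10.0.0.1 port 5050",
    "connection from 10.0.0.7 port 8080",
    "disk error at 0xdeadbeef",
]
assert len(group_by_signature(logs)) == 2  # two distinct signatures
```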

  16. 7 CFR 718.9 - Signature requirements.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Signature requirements. 718.9 Section 718.9... MULTIPLE PROGRAMS General Provisions § 718.9 Signature requirements. (a) When a program authorized by this chapter or Chapter XIV of this title requires the signature of a producer; landowner; landlord; or tenant...

  17. 42 CFR 424.36 - Signature requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Signature requirements. 424.36 Section 424.36... (CONTINUED) MEDICARE PROGRAM CONDITIONS FOR MEDICARE PAYMENT Claims for Payment § 424.36 Signature requirements. (a) General rule. The beneficiary's own signature is required on the claim unless the beneficiary...

  18. 17 CFR 12.12 - Signature.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Signature. 12.12 Section 12.12... General Information and Preliminary Consideration of Pleadings § 12.12 Signature. (a) By whom. All... document on behalf of another person. (b) Effect. The signature on any document of any person acting either...

  19. 25 CFR 213.10 - Lessor's signature.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Lessor's signature. 213.10 Section 213.10 Indians BUREAU... MEMBERS OF FIVE CIVILIZED TRIBES, OKLAHOMA, FOR MINING How to Acquire Leases § 213.10 Lessor's signature... thumbprint which shall be designated as “right” or “left” thumbmark. Such signatures must be witnessed by two...

  20. Signature effects in 2-qp rotational bands

    International Nuclear Information System (INIS)

    Jain, A.K.; Goel, A.

    1992-01-01

    The authors briefly review the progress in understanding the 2-qp rotational bands in odd-odd nuclei. Signature effects and the phenomenon of signature inversion are discussed. The Coriolis coupling appears to have all the ingredients to explain the inversion. Some recent work on signature dependence in 2-qp bands of even-even nuclei is also discussed; interesting features are pointed out

  1. 27 CFR 17.6 - Signature authority.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Signature authority. 17.6... PRODUCTS General Provisions § 17.6 Signature authority. No claim, bond, tax return, or other required... other proper notification of signature authority has been filed with the TTB office where the required...

  2. High-speed high-security signatures

    NARCIS (Netherlands)

    Bernstein, D.J.; Duif, N.; Lange, T.; Schwabe, P.; Yang, B.Y.

    2011-01-01

    This paper shows that a $390 mass-market quad-core 2.4GHz Intel Westmere (Xeon E5620) CPU can create 108000 signatures per second and verify 71000 signatures per second on an elliptic curve at a 2^128 security level. Public keys are 32 bytes, and signatures are 64 bytes. These performance figures

  3. Design and Implementation of a Mobile Voting System Using a Novel Oblivious and Proxy Signature

    Directory of Open Access Journals (Sweden)

    Shin-Yan Chiou

    2017-01-01

    Full Text Available Electronic voting systems can make the voting process much more convenient. However, in such systems, if a server signs blank votes before users vote, it may cause undue multivoting. Furthermore, if users vote before the signing of the server, voting information will be leaked to the server and may be compromised. Blind signatures could be used to prevent leaking voting information from the server; however, malicious users could produce noncandidate signatures for illegal usage at that time or in the future. To overcome these problems, this paper proposes a novel oblivious signature scheme with a proxy signature function to satisfy security requirements such as information protection, personal privacy, and message verification and to ensure that no one can cheat other users (including the server. We propose an electronic voting system based on the proposed oblivious and proxy signature scheme and implement this scheme in a smartphone application to allow users to vote securely and conveniently. Security analyses and performance comparisons are provided to show the capability and efficiency of the proposed scheme.
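    The blind-signature primitive mentioned in this abstract (the server signs without seeing the vote content) can be illustrated with a textbook RSA blind signature; this is a toy sketch with insecurely small parameters, not the authors' oblivious/proxy construction:

```python
import math

# Toy RSA key (illustrative only -- far too small to be secure).
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def blind(message, r):
    """Voter blinds the message before sending it to the signer."""
    return (message * pow(r, e, n)) % n

def sign(blinded):
    """Signer signs without learning the underlying message."""
    return pow(blinded, d, n)

def unblind(blind_sig, r):
    """Voter strips the blinding factor to get a valid signature."""
    return (blind_sig * pow(r, -1, n)) % n

message, r = 42, 7              # r must be coprime with n
assert math.gcd(r, n) == 1
sig = unblind(sign(blind(message, r)), r)
# The unblinded signature verifies as an ordinary RSA signature.
assert pow(sig, e, n) == message
```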

  4. Some Proxy Signature and Designated verifier Signature Schemes over Braid Groups

    OpenAIRE

    Lal, Sunder; Verma, Vandani

    2009-01-01

    Braid groups provide an alternative to number-theoretic public-key cryptography and can be implemented quite efficiently. The paper proposes five signature schemes based on braid groups: Proxy Signature, Designated Verifier, Bi-Designated Verifier, Designated Verifier Proxy Signature, and Bi-Designated Verifier Proxy Signature. We also discuss the security aspects of each of the proposed schemes.

  5. Dynamics

    CERN Document Server

    Goodman, Lawrence E

    2001-01-01

    Beginning text presents complete theoretical treatment of mechanical model systems and deals with technological applications. Topics include introduction to calculus of vectors, particle motion, dynamics of particle systems and plane rigid bodies, technical applications in plane motions, theory of mechanical vibrations, and more. Exercises and answers appear in each chapter.

  6. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  7. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
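    ASMOV combines lexical, structural and extensional matchers; a toy sketch of just the lexical component, here reduced to Jaccard similarity over label tokens (the actual algorithm's similarity measure is considerably more elaborate):

```python
def jaccard(label_a, label_b):
    """Token-level Jaccard similarity between two concept labels."""
    a, b = set(label_a.lower().split()), set(label_b.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical concept labels from two ontologies being aligned.
assert jaccard("blood pressure measurement",
               "measurement of blood pressure") == 0.75
```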

  8. Nonlinear control of magnetic signatures

    Science.gov (United States)

    Niemoczynski, Bogdan

    Magnetic properties of ferrite structures are known to cause fluctuations in Earth's magnetic field around the object. These fluctuations are known as the object's magnetic signature and are unique based on the object's geometry and material. It is a common practice to neutralize magnetic signatures periodically after certain time intervals, however there is a growing interest to develop real time degaussing systems for various applications. Development of real time degaussing system is a challenging problem because of magnetic hysteresis and difficulties in measurement or estimation of near-field flux data. The goal of this research is to develop a real time feedback control system that can be used to minimize magnetic signatures for ferrite structures. Experimental work on controlling the magnetic signature of a cylindrical steel shell structure with a magnetic disturbance provided evidence that the control process substantially increased the interior magnetic flux. This means near field estimation using interior sensor data is likely to be inaccurate. Follow up numerical work for rectangular and cylindrical cross sections investigated variations in shell wall flux density under a variety of ambient excitation and applied disturbances. Results showed magnetic disturbances could corrupt interior sensor data and magnetic shielding due to the shell walls makes the interior very sensitive to noise. The magnetic flux inside the shell wall showed little variation due to inner disturbances and its high base value makes it less susceptible to noise. This research proceeds to describe a nonlinear controller to use the shell wall data as an input. A nonlinear plant model of magnetics is developed using a constant tau to represent domain rotation lag and a gain function k to describe the magnetic hysteresis curve for the shell wall. 
The model is justified by producing hysteresis curves for multiple materials, matching experimental data using a particle swarm algorithm, and

  9. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  10. Spectral signature selection for mapping unvegetated soils

    Science.gov (United States)

    May, G. A.; Petersen, G. W.

    1975-01-01

    Airborne multispectral scanner data covering the wavelength interval from 0.40-2.60 microns were collected at an altitude of 1000 m above the terrain in southeastern Pennsylvania. Uniform training areas were selected within three sites from this flightline. Soil samples were collected from each site and a procedure developed to allow assignment of scan line and element number from the multispectral scanner data to each sampling location. These soil samples were analyzed on a spectrophotometer and laboratory spectral signatures were derived. After correcting for solar radiation and atmospheric attenuation, the laboratory signatures were compared to the spectral signatures derived from these same soils using multispectral scanner data. Both signatures were used in supervised and unsupervised classification routines. Computer-generated maps using the laboratory and multispectral scanner derived signatures resulted in maps that were similar to maps resulting from field surveys. Approximately 90% agreement was obtained between classification maps produced using multispectral scanner derived signatures and laboratory derived signatures.
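    The supervised classification step described above can be sketched as a minimum-distance classifier over reference spectral signatures (the band values below are hypothetical; the scanner data spanned 0.40-2.60 microns across many channels):

```python
import math

def euclidean(a, b):
    """Euclidean distance between two spectra."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(pixel, signatures):
    """Assign a pixel spectrum to the nearest reference signature."""
    return min(signatures, key=lambda name: euclidean(pixel, signatures[name]))

# Hypothetical mean reflectances in three bands for two soil classes.
signatures = {
    "soil_A": [0.12, 0.25, 0.40],
    "soil_B": [0.30, 0.45, 0.60],
}
assert classify([0.11, 0.26, 0.38], signatures) == "soil_A"
assert classify([0.33, 0.44, 0.58], signatures) == "soil_B"
```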

  11. A New Adaptive Structural Signature for Symbol Recognition by Using a Galois Lattice as a Classifier.

    Science.gov (United States)

    Coustaty, M; Bertet, K; Visani, M; Ogier, J

    2011-08-01

    In this paper, we propose a new approach for symbol recognition using structural signatures and a Galois lattice as a classifier. The structural signatures are based on topological graphs computed from segments which are extracted from the symbol images by using an adapted Hough transform. These structural signatures-that can be seen as dynamic paths which carry high-level information-are robust toward various transformations. They are classified by using a Galois lattice as a classifier. The performance of the proposed approach is evaluated based on the GREC'03 symbol database, and the experimental results we obtain are encouraging.

  12. Cosmological transitions with changes in the signature of the metric

    International Nuclear Information System (INIS)

    Sakharov, A.D.

    1984-01-01

    It is conjectured that there exist states of the physical continuum which include regions with different signatures of the metric and that the observed Universe and an infinite number of other Universes arose as a result of quantum transitions with a change in the signature of the metric. The Lagrangian in such a theory must satisfy conditions of non-negativity in the regions with even signature. Signature here means the number of time coordinates. The induced gravitational Lagrangian in a conformally invariant theory of Kaluza-Klein type evidently satisfies this requirement and leads to effective equations of the gravitational theory of macroscopic space identical to the equations of the general theory of relativity. It is suggested that in our Universe there exist in addition to the observable (macroscopic) time dimension two or some other even number of compactified time dimensions. It is suggested that the formation of a Euclidean region in the center of a black hole or in the cosmological contraction of the Universe (if it is predetermined by the dynamics) is a possible outcome of gravitational collapse

  13. Delay signatures in the chaotic intensity output of a quantum dot ...

    Indian Academy of Sciences (India)

    Journal of Physics, May 2016, pp. 1021–1030. Delay signatures in the chaotic intensity output ... Research in complex systems requires quantitative predictions of their dynamics, even ... used methods for estimating delay in complex dynamics are autocorrelation function ... Authors thank BRNS for its financial support.

  14. Infrared signatures for remote sensing

    International Nuclear Information System (INIS)

    McDowell, R.S.; Sharpe, S.W.; Kelly, J.F.

    1994-04-01

    PNL's capabilities for infrared and near-infrared spectroscopy include tunable-diode-laser (TDL) systems covering 300--3,000 cm⁻¹ at 2 laser. PNL also has a beam expansion source with a 12-cm slit, which provides a 3-m effective path for gases at ∼10 K, giving a Doppler width of typically 10 MHz; and long-path static gas cells (to 100 m). In applying this equipment to signatures work, the authors emphasize the importance of high spectral resolution for detecting and identifying atmospheric interferences; for identifying the optimum analytical frequencies; for deriving, by spectroscopic analysis, the molecular parameters needed for modeling; and for obtaining data on species and/or bands that are not in existing databases. As an example of such spectroscopy, the authors have assigned and analyzed the C-Cl stretching region of CCl₄ at 770--800 cm⁻¹. This is an important potential signature species whose IR absorption has remained puzzling because of the natural isotopic mix, extensive hot-band structure, and a Fermi resonance involving a nearby combination band. Instrument development projects include the IR sniffer, a small high-sensitivity, high-discrimination (Doppler-limited) device for fence-line or downwind monitoring that is effective even in regions of atmospheric absorption; preliminary work has achieved sensitivities at the low-ppb level. Other work covers trace species detection with TDLs, and FM-modulated CO₂ laser LIDAR. The authors are planning a field experiment to interrogate the Hanford tank farm for signature species from Rattlesnake Mountain, a standoff of ca. 15 km, to be accompanied by simultaneous ground-truthing at the tanks

  15. SIMMER-III code-verification. Phase 1

    International Nuclear Information System (INIS)

    Maschek, W.

    1996-05-01

    SIMMER-III is a computer code to investigate core disruptive accidents in liquid metal fast reactors, but it can also be used to investigate safety-related problems in other types of advanced reactors. The code is developed by PNC in cooperation with the European partners FZK, CEA and AEA-T. SIMMER-III is a two-dimensional, three-velocity-field, multiphase, multicomponent, Eulerian fluid-dynamics code coupled with a space-, time-, and energy-dependent neutron dynamics model. In order to model complex flow situations in a postulated disrupting core, mass and energy conservation equations are solved for 27 density components and 16 energy components, respectively. Three velocity fields (two liquid and one vapor) are modeled to simulate the relative motion of different fluid components. An additional static field takes into account the structures present in a reactor (pins, hexcans, vessel structures, internal structures, etc.). The neutronics is based on the discrete ordinates method (S_N method) coupled into a quasistatic dynamic model. The code assessment and verification of the fluid-dynamic/thermohydraulic parts of the code is performed in several steps in a joint effort of all partners. The results of the FZK contributions to the first assessment and verification phase are reported. (orig.)

  16. On-line signature verification using Gaussian Mixture Models and small-sample learning strategies

    Directory of Open Access Journals (Sweden)

    Gabriel Jaime Zapata-Zapata

    2016-01-01

    Full Text Available The article addresses the problem of training on-line signature verification systems when the number of training samples is small, since in most real scenarios the number of signatures available per user is very limited. The article evaluates nine different classification strategies based on Gaussian Mixture Models (GMM) and on the universal background model (UBM) strategy, which is designed to work under such small-sample conditions. The GMM learning strategies include the conventional Expectation-Maximization algorithm and a Bayesian approach based on variational learning. Signatures are characterized mainly in terms of the velocities and accelerations of the users' handwriting patterns. The results show that, when the system is evaluated in a genuine vs. impostor configuration, the GMM-UBM method is able to keep its accuracy above 93%, even when only 20% of the available samples (equivalent to 5 signatures) are used for training, while the combination of a Bayesian UBM with a Support Vector Machine (SVM), a model known as GMM-Supervector, achieves 99% accuracy when the training samples exceed 20. On the other hand, when a real environment is simulated in which no impostor samples are available and
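    The GMM-UBM decision rule in this record reduces to a log-likelihood ratio between the claimed user's model and the universal background model; a one-dimensional Gaussian sketch of that scoring step (a single hypothetical feature instead of full velocity/acceleration vectors, and single Gaussians instead of mixtures):

```python
import math
from statistics import NormalDist

def llr_score(samples, user_model, ubm):
    """Average log-likelihood ratio of feature samples under the
    claimed user's model versus the universal background model."""
    return sum(
        math.log(user_model.pdf(x)) - math.log(ubm.pdf(x)) for x in samples
    ) / len(samples)

# Hypothetical 1-D "pen velocity" statistics (illustrative values).
user = NormalDist(mu=2.0, sigma=0.5)   # claimed signer's model
ubm = NormalDist(mu=0.0, sigma=2.0)    # background population model

genuine = [1.8, 2.1, 2.2]
forgery = [-1.0, 0.2, -0.5]
# Positive score: accept as genuine; negative: reject as forgery.
assert llr_score(genuine, user, ubm) > 0 > llr_score(forgery, user, ubm)
```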

  17. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs, as it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends based on the SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the substantial time of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the equation parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions were fabricated using electron beam lithography and measured. The measurements of the fabricated devices have been compared to the derived models and show very good agreement. The matching can reach 100% by calibrating certain parameters in the model.

  18. Magnetotail processes and their ionospheric signatures

    Science.gov (United States)

    Ferdousi, B.; Raeder, J.; Zesta, E.; Murphy, K. R.; Cramer, W. D.

    2017-12-01

    In-situ observations in the magnetotail are sparse and limited to single-point measurements. In the ionosphere, on the other hand, there is a broad range of observations, including magnetometers, auroral imagers, and various radars. Since the ionosphere is to some extent a mirror of plasmasheet processes, it can be used as a monitor of magnetotail dynamics. Thus, it is of great importance to understand the coupling between the ionosphere and the magnetosphere in order to properly interpret ionosphere and ground observations in terms of magnetotail dynamics. For this purpose, the global magnetohydrodynamic model OpenGGCM is used to investigate magnetosphere-ionosphere coupling. One of the key processes in magnetotail dynamics is bursty bulk flows (BBFs), which are the major means by which momentum and energy are transferred through the magnetotail and down to the ionosphere. BBFs often manifest in the ionosphere as auroral streamers. This study focuses on mapping such flow bursts from the magnetotail to the ionosphere along the magnetic field lines for three states of the magnetotail: pre-substorm onset, through substorm expansion, and during steady magnetospheric convection (SMC) following the substorm. We find that the orientation of streamers in the ionosphere differs for different local times, and that, for both tail and ionospheric signatures, activity increases during the SMC configuration compared to pre-onset and quiet times. We also find that the background convection in the tail impacts the direction and deflection of the BBFs and the subsequent orientation of the auroral streamers in the ionosphere.

  19. Verification and nuclear material security

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2001-01-01

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  20. SHIELD verification and validation report

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation

  1. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error

  2. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive

  3. Hot-cell verification facility

    International Nuclear Information System (INIS)

    Eschenbaum, R.A.

    1981-01-01

    The Hot Cell Verification Facility (HCVF) was established as the test facility for the Fuels and Materials Examination Facility (FMEF) examination equipment. HCVF provides a prototypic hot cell environment to check the equipment for functional and remote operation. It also provides actual hands-on training for future FMEF Operators. In its two years of operation, HCVF has already provided data to make significant changes in items prior to final fabrication. It will also shorten the startup time in FMEF since the examination equipment will have been debugged and operated in HCVF

  4. Physical description of nuclear materials identification system (NMIS) signatures

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; Mullens, J.A.; Mattingly, J.K.; Valentine, T.E.

    2000-01-01

    This paper describes all time and frequency analysis parameters measured with a new correlation processor (capability up to 1 GHz sampling rates and up to five input data channels) for three input channels: (1) the 252Cf source ionization chamber; (2) a detection channel; and (3) a second detection channel. An intuitive and physical description of the various measured quantities is given, as well as a brief mathematical description and a brief description of how the data are acquired. If the full five-channel capability is used, the measured quantities increase in number but not in type. The parameters provided by this new processor can be divided into two general classes: time analysis signatures and their related frequency analysis signatures. The time analysis signatures include the number of times m pulses occur in a time interval that is triggered randomly, upon a detection event, or upon a source fission event. From the number of pulses in a time interval, the moments, factorial moments, and Feynman variance can be obtained. Recent implementations of third- and fourth-order time and frequency analysis signatures in this processor are also briefly described. Thus, this processor used with a timed source of input neutrons contains all of the information from a pulsed neutron measurement, one- and two-detector Rossi-α measurements, multiplicity measurements, and third- and fourth-order correlation functions. This processor, although originally designed for active measurements with a 252Cf interrogating source, has been successfully used passively (without the 252Cf source) for systems with inherent neutron sources such as fissile systems of plutonium. Data from active measurements with an 18.75 kg highly enriched uranium (93.2 wt% 235U) metal casting for storage are presented to illustrate some of the various time and frequency analysis parameters. This processor, which is a five-channel time correlation analyzer with time channel widths
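    Among the time-analysis signatures listed, the Feynman variance has a particularly compact form: the excess variance-to-mean ratio of counts per time gate. A sketch with illustrative gate counts (not measurement data):

```python
from statistics import mean, pvariance

def feynman_y(gate_counts):
    """Feynman-Y: variance-to-mean ratio of counts per gate, minus 1.
    Y = 0 for Poisson counts; Y > 0 signals correlated (e.g. fission
    chain) events."""
    m = mean(gate_counts)
    return pvariance(gate_counts, m) / m - 1.0

# Near-Poisson counts give Y near zero; clustered counts give Y > 0.
poisson_like = [4, 5, 3, 5, 4, 4, 5, 3, 4, 5]
clustered = [0, 9, 1, 8, 0, 9, 1, 8, 0, 9]
assert feynman_y(clustered) > feynman_y(poisson_like)
```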

  5. Signature Curves Statistics of DNA Supercoils

    OpenAIRE

    Shakiban, Cheri; Lloyd, Peter

    2004-01-01

    In this paper we describe the Euclidean signature curves for two dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...

  6. Keystroke Dynamics in the pre-Touchscreen Era

    Directory of Open Access Journals (Sweden)

    Nasir Ahmad

    2013-12-01

    Full Text Available Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realised via analyses of fingerprints or signature iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable, and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilise multiple variables, and a range of mathematical techniques, in order to extract individuals' typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view towards indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts.
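    The inter-key-interval features described above can be sketched as follows (hypothetical timings and a plain mean-absolute-deviation score; production systems add key-hold times, pressures, and statistical or neural models):

```python
def inter_key_intervals(timestamps):
    """Flight times between successive key presses (ms)."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def match_score(sample, template):
    """Mean absolute deviation between a typing sample and the
    enrolled template; lower means a closer match."""
    return sum(abs(s - t) for s, t in zip(sample, template)) / len(template)

# Enrolled template for a passphrase (hypothetical press times, ms).
template = inter_key_intervals([0, 120, 260, 430, 540])
genuine = inter_key_intervals([0, 125, 250, 445, 550])
impostor = inter_key_intervals([0, 60, 400, 460, 700])
assert match_score(genuine, template) < match_score(impostor, template)
```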

  7. Keystroke dynamics in the pre-touchscreen era.

    Science.gov (United States)

    Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A

    2013-12-19

    Biometric authentication seeks to measure an individual's unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints, signatures, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches, for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals' typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts.

  8. Keystroke dynamics in the pre-touchscreen era

    Science.gov (United States)

    Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.

    2013-01-01

    Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints, signatures, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches, for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals’ typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568

  9. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  10. Institute of Geophysics, Planetary Physics, and Signatures

    Data.gov (United States)

    Federal Laboratory Consortium — The Institute of Geophysics, Planetary Physics, and Signatures at Los Alamos National Laboratory is committed to promoting and supporting high quality, cutting-edge...

  11. On reliable discovery of molecular signatures

    Directory of Open Access Journals (Sweden)

    Björkegren Johan

    2009-01-01

    Full Text Available Abstract Background Molecular signatures are sets of genes, proteins, genetic variants or other variables that can be used as markers for a particular phenotype. Reliable signature discovery methods could yield valuable insight into cell biology and mechanisms of human disease. However, it is currently not clear how to control error rates such as the false discovery rate (FDR) in signature discovery. Moreover, signatures for cancer gene expression have been shown to be unstable, that is, difficult to replicate in independent studies, casting doubts on their reliability. Results We demonstrate that with modern prediction methods, signatures that yield accurate predictions may still have a high FDR. Further, we show that even signatures with low FDR may fail to replicate in independent studies due to limited statistical power. Thus, neither stability nor predictive accuracy is relevant when FDR control is the primary goal. We therefore develop a general statistical hypothesis testing framework that for the first time provides FDR control for signature discovery. Our method is demonstrated to be correct in simulation studies. When applied to five cancer data sets, the method was able to discover molecular signatures with 5% FDR in three cases, while two data sets yielded no significant findings. Conclusion Our approach enables reliable discovery of molecular signatures from genome-wide data with current sample sizes. The statistical framework developed herein is potentially applicable to a wide range of prediction problems in bioinformatics.
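The hypothesis-testing framework in this record is the authors' own; as a generic illustration of what FDR control over a list of per-gene tests looks like, the sketch below applies the standard Benjamini-Hochberg step-up procedure to a list of p-values (an assumption for illustration, not the paper's method).

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha.

    Sort p-values ascending, find the largest rank k with
    p_(k) <= alpha * k / m, and reject the k smallest."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank
    return sorted(order[:k])
```

In a signature-discovery setting the indices returned would be the genes (or other variables) admitted into the signature at the chosen FDR level.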

  12. Modeling the Thermal Signature of Natural Backgrounds

    National Research Council Canada - National Science Library

    Gamborg, Marius

    2002-01-01

    Two measuring stations have been established the purpose being to collect comprehensive databases of thermal signatures of background elements in addition to the prevailing meteorological conditions...

  13. An Arbitrated Quantum Signature Scheme without Entanglement

    International Nuclear Information System (INIS)

    Li Hui-Ran; Luo Ming-Xing; Peng Dai-Yuan; Wang Xiao-Jun

    2017-01-01

    Several quantum signature schemes have recently been proposed to realize secure signatures of quantum or classical messages. Arbitrated quantum signature, as one nontrivial scheme, has attracted great interest because of its usefulness and efficiency. Unfortunately, previous schemes cannot resist Trojan horse and denial-of-service (DoS) attacks, and lack unforgeability and non-repudiation. In this paper, we propose an improved arbitrated quantum signature to address these security issues, assuming an honest arbitrator. Our scheme uses qubit states rather than entangled states. More importantly, the qubit scheme can achieve unforgeability and non-repudiation. Our scheme is also secure against other known quantum attacks. (paper)

  14. Design And Implementation of Low Area/Power Elliptic Curve Digital Signature Hardware Core

    Directory of Open Access Journals (Sweden)

    Anissa Sghaier

    2017-06-01

    Full Text Available The Elliptic Curve Digital Signature Algorithm (ECDSA) is the elliptic-curve analog of the Digital Signature Algorithm (DSA). Because elliptic curves allow a small key compared with other public-key algorithms, ECDSA is the most suitable scheme for environments where processor power and storage are limited. This paper focuses on the hardware implementation of ECDSA over elliptic curves with the 163-bit key length recommended by the NIST (National Institute of Standards and Technology). It offers two services: signature generation and signature verification. The proposed processor integrates an ECC IP, a Secure Hash Standard 2 IP (SHA-2 IP) and a Random Number Generator IP (RNG IP). Thus, all IPs were optimized, and different types of RNG were implemented in order to choose the most appropriate one. A co-simulation was done to verify the ECDSA processor using MATLAB software. All modules were implemented on a Xilinx Virtex 5 ML 50 FPGA platform; they require respectively 9670 slices, 2530 slices and 18,504 slices. FPGA implementations generally represent the first step toward obtaining faster ASIC implementations. Further, the proposed design was also implemented in an ASIC CMOS 45-nm technology; it requires a 0.257 mm2 cell area, achieves a maximum frequency of 532 MHz and consumes 63.444 mW. Furthermore, in this paper, we analyze the security of our proposed ECDSA processor against attacks exploiting the absence of correctness checks for input points, and against restart attacks.
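The two services the processor offers, signature generation and verification, follow the standard ECDSA equations. The sketch below is a pure-Python, non-constant-time illustration over secp256k1 rather than the paper's B-163 hardware target, and is in no way a model of their IP cores; it only shows the sign/verify math.

```python
import hashlib, secrets

# secp256k1 domain parameters (illustrative stand-in for the paper's 163-bit curve)
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p1, p2):
    """Point addition on y^2 = x^3 + 7 over GF(P); None is the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        lam = 3 * x1 * x1 * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def ec_mul(k, point):
    """Double-and-add scalar multiplication (not constant time; demo only)."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

def hash_to_int(msg):
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

def sign(priv, msg):
    """Signature generation: r = (kG).x mod n, s = k^-1 (z + r d) mod n."""
    z = hash_to_int(msg)
    while True:
        k = secrets.randbelow(N - 1) + 1
        r = ec_mul(k, G)[0] % N
        s = pow(k, -1, N) * (z + r * priv) % N
        if r and s:
            return r, s

def verify(pub, msg, sig):
    """Signature verification: check (u1*G + u2*Q).x mod n == r."""
    r, s = sig
    if not (0 < r < N and 0 < s < N):
        return False
    z, w = hash_to_int(msg), pow(s, -1, N)
    point = ec_add(ec_mul(z * w % N, G), ec_mul(r * w % N, pub))
    return point is not None and point[0] % N == r
```

The input-point correctness checks the paper analyzes would correspond here to validating that a supplied public key actually lies on the curve before use; this sketch omits them.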

  15. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  16. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  17. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  18. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  19. New possibilities of digital luminescence radiography (DLR) and digital image processing for verification and portal imaging

    International Nuclear Information System (INIS)

    Zimmermann, J.S.; Blume, J.; Wendhausen, H.; Hebbinghaus, D.; Kovacs, G.; Eilf, K.; Schultze, J.; Kimmig, B.N.

    1995-01-01

    We developed a method, using digital luminescence radiography (DLR), not only for portal imaging of photon beams in excellent quality, but also for verification of electron beams. Furthermore, DLR was used as a basic instrument for image fusion of portal and verification films with simulation films, and for image processing in "beam's-eye-view" verification (BEVV) of rotating beams or conformation therapy. Digital radiographs of excellent quality are obtained for verification of photon and electron beams. In photon beams, the quality improvement vs. conventional portal imaging may be dramatic, even more so for high-energy beams (e.g., 15-MV photon beams) than for Co-60. In electron beams, excellent results may be easily obtained. By digital image fusion of one or more verification films with the simulation film or MRI planning film, more precise judgement even of small differences between simulation and verification films becomes possible. Using BEVV, it is possible to compare computer-aided simulation in rotating beams or conformation therapy with the actually applied treatment. The basic principle of BEVV is also suitable for dynamic multileaf collimation. (orig.)

  20. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  1. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  2. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  3. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  4. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  5. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  6. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study was conducted involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
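The decision rule stated in the abstract can be encoded directly. The function below is a hypothetical rendering, with the 25% threshold and the risk labels taken from the abstract; it is not the authors' published scoring tool.

```python
def fingerprint_verification_risk(dystrophy_pct, long_horizontal, long_vertical):
    """Predicted fingerprint verification outcome from the abstract's criteria:
    one major criterion (dystrophy area >= 25%) and two minor criteria
    (long horizontal lines, long vertical lines)."""
    if dystrophy_pct >= 25:           # major criterion present
        return "almost always fails"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:
        return "high risk of failure"
    if minors == 1:
        return "low risk of failure"
    return "almost always passes"
```

Such a rule-based rendering is what a clinic could apply at the bedside once the dystrophy area and line patterns have been assessed.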

  7. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  8. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  9. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  10. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

    Full Text Available Numerical simulation of the temperature field of cooled turbine blades is a required element of the gas turbine engine design process. Verification is usually performed on the basis of test results for a full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten-metal thermostat for verification of a thermal model of a cooled blade is proposed in this paper. The method allows obtaining local values of heat flux at each point of the blade surface within a single experiment. The error of determination of local heat transfer coefficients using this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals the zinc melting point. The verification of a thermal-hydraulic model of a high-pressure turbine blade, with cooling allowing asymmetric heat removal from the pressure and suction sides, was carried out using the developed method. An analysis of the heat transfer coefficients confirmed the high level of heat transfer at the leading edge, whose value is comparable with jet-impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge toward the pressure side.
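Because the blade wall is pinned at the zinc melting point, a local heat transfer coefficient follows from Newton's law of cooling, h = q / (T_gas - T_wall). The sketch below is an illustrative reduction with made-up values, not the paper's data pipeline.

```python
T_ZINC_MELT_C = 419.5  # blade outer surface temperature fixed by the zinc bath, deg C

def heat_transfer_coefficient(q_local_w_m2, t_gas_c):
    """Local h [W/(m^2 K)] from a measured local heat flux and the
    hot-gas temperature, with the wall held at the zinc melting point."""
    dt = t_gas_c - T_ZINC_MELT_C
    if dt <= 0:
        raise ValueError("gas must be hotter than the zinc melting point")
    return q_local_w_m2 / dt
```

The quoted 8% error bound on h would then be driven almost entirely by the uncertainty of the calorimetric heat-flux measurement, since the wall temperature is known exactly.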

  11. Detection of Damage in Operating Wind Turbines by Signature Distances

    Directory of Open Access Journals (Sweden)

    James F. Manwell

    2013-01-01

    Full Text Available Wind turbines operate in the atmospheric boundary layer and are subject to complex random loading. This precludes using a deterministic response of healthy turbines as the baseline for identifying the effect of damage on the measured response of operating turbines. In the absence of such a deterministic response, the stochastic dynamic response of the tower to a shutdown maneuver is found to be affected distinctively by damage, in contrast to wind. Such a dynamic response, however, cannot be established for the blades. As an alternative, the estimate of blade damage is sought through its effect on the third or fourth modal frequency, each found to be mostly unaffected by wind. To discern the effect of damage from the wind effect on these responses, a unified method of damage detection is introduced that accommodates different responses. In this method, the dynamic responses are transformed to surfaces via continuous wavelet transforms to accentuate the effect of wind or damage on the dynamic response. Regions of significant deviations between these surfaces are then isolated in their corresponding planes to capture the change signatures. The image distances between these change signatures are shown to produce consistent estimates of damage for both the tower and the blades in the presence of varying wind field profiles.
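The surface-and-distance idea can be sketched roughly as follows: transform a baseline response and a test response into wavelet surfaces, then take an image (L2) distance between them. This uses a simple Ricker wavelet and a plain L2 distance; the paper's actual wavelets, deviation-isolation step, and distance measure may differ, so treat this purely as an illustration of the pipeline.

```python
import math

def ricker(points, width):
    """Ricker (Mexican-hat) wavelet sampled at `points` positions."""
    amp = 2.0 / (math.sqrt(3.0 * width) * math.pi ** 0.25)
    out = []
    for i in range(points):
        t = i - (points - 1) / 2.0
        x = (t / width) ** 2
        out.append(amp * (1.0 - x) * math.exp(-x / 2.0))
    return out

def cwt(signal, widths):
    """Continuous wavelet transform as a widths x time surface (direct convolution)."""
    n = len(signal)
    surface = []
    for w in widths:
        wav = ricker(min(10 * int(w), n), w)
        m = len(wav)
        row = []
        for i in range(n):
            acc = 0.0
            for j, c in enumerate(wav):
                k = i + j - m // 2
                if 0 <= k < n:
                    acc += signal[k] * c
            row.append(acc)
        surface.append(row)
    return surface

def signature_distance(surf_a, surf_b):
    """Image (L2) distance between two wavelet surfaces of equal shape."""
    return math.sqrt(sum((x - y) ** 2
                         for ra, rb in zip(surf_a, surf_b)
                         for x, y in zip(ra, rb)))
```

A baseline response compared against itself gives distance zero, while a response whose frequency content shifts (as damage would cause) yields a clearly nonzero distance.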

  12. Abstraction of Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Wisniewski, Rafael; Sloth, Christoffer

    2011-01-01

    To enable formal verification of a dynamical system, given by a set of differential equations, it is abstracted by a finite state model. This allows for application of methods for model checking. Consequently, it opens the possibility of carrying out the verification of reachability and timing re...
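A drastically simplified, untimed version of such an abstraction can be sketched as follows: partition the state space into interval cells and add a transition wherever the vector field can cross a shared face. The paper's construction additionally produces timing information (a timed automaton), which this sketch omits; all names here are illustrative.

```python
def abstract_flow(f, cells):
    """Finite abstraction of the scalar system dx/dt = f(x) over interval
    cells [(lo, hi), ...]. A transition i -> j is added whenever the flow
    can cross the face shared by cells i and j (a conservative check)."""
    trans = {i: set() for i in range(len(cells))}
    for i, (lo, hi) in enumerate(cells):
        if i + 1 < len(cells) and f(hi) > 0:   # flow rightward at upper face
            trans[i].add(i + 1)
        if i > 0 and f(lo) < 0:                # flow leftward at lower face
            trans[i].add(i - 1)
        trans[i].add(i)                        # trajectory may remain in the cell
    return trans

def reachable(trans, start):
    """Cells reachable from `start` in the abstraction (plain BFS),
    enabling the reachability checks mentioned in the abstract."""
    seen, frontier = {start}, [start]
    while frontier:
        i = frontier.pop()
        for j in trans[i] - seen:
            seen.add(j)
            frontier.append(j)
    return seen
```

For the stable system dx/dt = -x, the abstraction correctly shows that trajectories drift toward the cells around the origin and never escape outward.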

  13. Verification and validation methodology of training simulators

    International Nuclear Information System (INIS)

    Hassan, M.W.; Khan, N.M.; Ali, S.; Jafri, M.N.

    1997-01-01

    A full scope training simulator comprising 109 plant systems of a 300 MWe PWR plant, contracted by the Pakistan Atomic Energy Commission (PAEC) from China, is near completion. The simulator is distinctive in that it will be ready prior to fuel loading. The models for the full scope training simulator have been developed under the APROS (Advanced PROcess Simulator) environment developed by the Technical Research Center (VTT) and Imatran Voima (IVO) of Finland. The replicated control room of the plant is contracted from the Shanghai Nuclear Engineering Research and Design Institute (SNERDI), China. The development of simulation models to represent all the systems of the target plant that contribute to plant dynamics and are essential for operator training has been carried out indigenously at PAEC. This multifunctional simulator is at present under extensive testing and will be interfaced with the control panels in March 1998 so as to realize a full scope training simulator. The validation of the simulator is a joint venture between PAEC and SNERDI. For the individual components and plant systems, the results have been compared against design data and PSAR results to confirm the faithfulness of the simulator to the physical plant systems. The reactor physics parameters have been validated against experimental results and benchmarks generated using design codes. Verification and validation in the integrated state has been performed against benchmark transients conducted using RELAP5/MOD2 for the complete spectrum of anticipated transients, covering the five well-known categories. (author)

  14. Automated radiotherapy treatment plan integrity verification

    Energy Technology Data Exchange (ETDEWEB)

    Yang Deshan; Moore, Kevin L. [Department of Radiation Oncology, School of Medicine, Washington University in Saint Louis, St. Louis, Missouri 63110 (United States)

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
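The rule-based checking and HTML summary described above can be sketched generically. The field names and rules below are hypothetical stand-ins, not PINNACLE's data model or the authors' PERL routines; the sketch only shows the pattern of evaluating named logical rules against collected plan data and summarizing pass/fail results.

```python
def run_plan_checks(plan, rules):
    """Evaluate each named rule against run-time plan data and build a
    minimal HTML report in the spirit of the QC tool's summary page.

    `plan`  - dict of collected plan data (hypothetical field names)
    `rules` - list of (name, predicate) pairs, predicate(plan) -> bool
    Returns (all_passed, html_report)."""
    rows, all_passed = [], True
    for name, check in rules:
        passed = bool(check(plan))
        all_passed = all_passed and passed
        rows.append(f"<tr><td>{name}</td><td>{'PASS' if passed else 'FAIL'}</td></tr>")
    return all_passed, "<table>" + "".join(rows) + "</table>"
```

In the real tool, the predicates encode clinical standards and logical consistency rules (e.g., conflicting optimization objectives); automating them removes the human-driven comparisons described in the abstract.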

  15. Automated radiotherapy treatment plan integrity verification

    International Nuclear Information System (INIS)

    Yang Deshan; Moore, Kevin L.

    2012-01-01

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.

  16. The verification basis of the ESPROSE.m code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Freeman, K.; Chen, X. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    An overall verification approach for the ESPROSE.m code is presented and implemented. The approach consists of a stepwise testing procedure from wave dynamics aspects to explosion coupling at the local level, and culminates with the consideration of propagating explosive events. Each step in turn consists of an array of analytical and experimental tests. The results indicate that, given the premixture composition, the prediction of energetics of large scale explosions in multidimensional geometries is within reach. The main need identified is for constitutive laws for microinteractions with reactor materials; however, reasonably conservative assessments are presently possible. (author)

  17. Verification of the Wind Response of a Stack Structure

    Directory of Open Access Journals (Sweden)

    D. Makovička

    2003-01-01

    Full Text Available This paper deals with verification analysis of the wind response of a power plant stack structure. Over a period of two weeks, the actual history of the dynamic response of the structure and the direction and intensity of the actual wind load were measured, recorded, and processed by computer. The resulting data were used to verify the design-stage assumptions for the structure, including the natural frequencies and modes assumed in the design and the dominant effect of other sources on the site. In conclusion, the standard requirements are compared with the actual results of the measurements and their extrapolation to the design load.

  18. Influence of fuel composition on the spent fuel verification by Self‑Interrogation Neutron Resonance Densitometry

    International Nuclear Information System (INIS)

    Rossa, Riccardo; Borella, Alessandro; Van der Meer, Klaas; Labeau, Pierre‑Etienne; Pauly, Nicolas

    2015-01-01

    The Self‑Interrogation Neutron Resonance Densitometry (SINRD) is a passive Non‑Destructive Assay (NDA) technique developed for the safeguards verification of spent nuclear fuel. The main goal of SINRD is the direct quantification of 239Pu by estimating the SINRD signature, which is the ratio between the neutron flux in the fast energy region and in the region close to the 0.3 eV resonance of 239Pu. The resonance region was chosen because the reduction of the neutron flux within 0.2-0.4 eV is due mainly to neutron absorption by 239Pu, and therefore the SINRD signature can be correlated to the 239Pu mass in the fuel assembly. This work provides an estimate of the influence of 239Pu and other nuclides on the SINRD signature. This assessment is performed with Monte Carlo simulations, introducing several nuclides into the fuel material composition and calculating the SINRD signature for each case. The reference spent fuel library developed by SCK CEN was used for the detailed fuel compositions of PWR 17x17 fuel assemblies with different initial enrichments, burnups, and cooling times. The results from the simulations show that the SINRD signature is mainly correlated to the 239Pu mass, with a significant influence from 235U. Moreover, the SINRD technique is largely insensitive to the cooling time of the assembly, while it is affected by the burnup and initial enrichment of the fuel. Apart from 239Pu and 235U, many other nuclides give minor contributions to the SINRD signature, especially at burnups higher than 20 GWd/tHM.
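    The SINRD signature itself is just a ratio of binned neutron fluxes. A minimal sketch, assuming a simple energy-binned flux array; the fast-region threshold and the toy flux values are illustrative assumptions, not values from the study:

```python
def sinrd_signature(energies_eV, flux, fast_min_eV=1e5, res_lo_eV=0.2, res_hi_eV=0.4):
    """Ratio of flux in the fast region to flux near the 0.3 eV 239Pu resonance.

    energies_eV[i] is the representative energy of bin i and flux[i] its flux.
    The 1e5 eV fast-region threshold is an illustrative assumption.
    """
    fast = sum(f for e, f in zip(energies_eV, flux) if e >= fast_min_eV)
    resonance = sum(f for e, f in zip(energies_eV, flux) if res_lo_eV <= e <= res_hi_eV)
    return fast / resonance

# Toy spectra: more 239Pu depresses the flux near 0.3 eV, raising the signature.
energies = [0.1, 0.3, 1.0, 1e3, 1e5, 1e6]
low_pu  = [5.0, 4.0, 3.0, 2.0, 6.0, 4.0]   # weak resonance absorption
high_pu = [5.0, 1.0, 3.0, 2.0, 6.0, 4.0]   # strong 0.3 eV absorption
```

    The hedged example reproduces the qualitative behaviour described above: the assembly with stronger resonance absorption yields the larger signature.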

  19. DNA methylation signatures of educational attainment

    Science.gov (United States)

    van Dongen, Jenny; Bonder, Marc Jan; Dekkers, Koen F.; Nivard, Michel G.; van Iterson, Maarten; Willemsen, Gonneke; Beekman, Marian; van der Spek, Ashley; van Meurs, Joyce B. J.; Franke, Lude; Heijmans, Bastiaan T.; van Duijn, Cornelia M.; Slagboom, P. Eline; Boomsma, Dorret I.; BIOS consortium

    2018-03-01

    Educational attainment is a key behavioural measure in studies of cognitive and physical health, and socioeconomic status. We measured DNA methylation at 410,746 CpGs (N = 4152) and identified 58 CpGs associated with educational attainment at loci characterized by pleiotropic functions shared with neuronal, immune and developmental processes. Associations overlapped with those for smoking behaviour, but remained after accounting for smoking at many CpGs: Effect sizes were on average 28% smaller and genome-wide significant at 11 CpGs after adjusting for smoking and were 62% smaller in never smokers. We examined sources and biological implications of education-related methylation differences, demonstrating correlations with maternal prenatal folate, smoking and air pollution signatures, and associations with gene expression in cis, dynamic methylation in foetal brain, and correlations between blood and brain. Our findings show that the methylome of lower-educated people resembles that of smokers beyond effects of their own smoking behaviour and shows traces of various other exposures.

  20. 21 CFR 11.70 - Signature/record linking.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Signature/record linking. 11.70 Section 11.70 Food... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.70 Signature/record linking. Electronic signatures and handwritten signatures executed to electronic records shall be linked to their respective...

  1. Data-driven property verification of grey-box systems by Bayesian experiment design

    NARCIS (Netherlands)

    Haesaert, S.; Van den Hof, P.M.J.; Abate, A.

    2015-01-01

    A measurement-based statistical verification approach is developed for systems with partly unknown dynamics. These grey-box systems are subject to identification experiments which, new in this contribution, enable accepting or rejecting system properties expressed in a linear-time logic. We employ a

  2. Signatures of non-adiabatic dynamics in the fine-structure state distributions of the OH(X̃/Ã) products in the B-band photodissociation of H2O

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Linsen [Key Laboratory of Mesoscopic Chemistry, School of Chemistry and Chemical Engineering, Institute of Theoretical and Computational Chemistry, Nanjing University, Nanjing 210093 (China); Xie, Daiqian, E-mail: dqxie@nju.edu.cn, E-mail: hguo@unm.edu [Key Laboratory of Mesoscopic Chemistry, School of Chemistry and Chemical Engineering, Institute of Theoretical and Computational Chemistry, Nanjing University, Nanjing 210093 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Guo, Hua, E-mail: dqxie@nju.edu.cn, E-mail: hguo@unm.edu [Department of Chemistry and Chemical Biology, University of New Mexico, Albuquerque, New Mexico 87131 (United States)

    2015-03-28

    A detailed quantum mechanical characterization of the photodissociation dynamics of H2O at 121.6 nm is presented. The calculations were performed using a full-dimensional wave packet method on coupled potential energy surfaces of all relevant electronic states. Our state-to-state model permits a detailed analysis of the OH(X̃/Ã) product fine-structure populations as a probe of the non-adiabatic dissociation dynamics. The calculated rotational state distributions of the two Λ-doublet levels of OH(X̃, v = 0) exhibit very different characteristics. The A′ states, produced mostly via the B̃→X̃ conical intersection pathway, have significantly higher populations than the A″ counterparts, which are primarily from the B̃→Ã Renner-Teller pathway. The former features a highly inverted and oscillatory rotational state distribution, while the latter has a smooth distribution with much less rotational excitation. In good agreement with experiment, the calculated total OH(X̃) rotational state distribution and anisotropy parameters show clear even-odd oscillations, which can be attributed to a quantum mechanical interference between waves emanating from the HOH and HHO conical intersections in the B̃→X̃ non-adiabatic pathway. On the other hand, the experiment-theory agreement for the OH(Ã) fragment is also satisfactory, although some small quantitative differences suggest remaining imperfections of the ab initio based potential energy surfaces.

  3. Signatures of non-adiabatic dynamics in the fine-structure state distributions of the OH(X~/A~) products in the B-band photodissociation of H2O

    International Nuclear Information System (INIS)

    Zhou, Linsen; Xie, Daiqian; Guo, Hua

    2015-01-01

    A detailed quantum mechanical characterization of the photodissociation dynamics of H2O at 121.6 nm is presented. The calculations were performed using a full-dimensional wave packet method on coupled potential energy surfaces of all relevant electronic states. Our state-to-state model permits a detailed analysis of the OH(X̃/Ã) product fine-structure populations as a probe of the non-adiabatic dissociation dynamics. The calculated rotational state distributions of the two Λ-doublet levels of OH(X̃, v = 0) exhibit very different characteristics. The A′ states, produced mostly via the B̃→X̃ conical intersection pathway, have significantly higher populations than the A″ counterparts, which are primarily from the B̃→Ã Renner-Teller pathway. The former features a highly inverted and oscillatory rotational state distribution, while the latter has a smooth distribution with much less rotational excitation. In good agreement with experiment, the calculated total OH(X̃) rotational state distribution and anisotropy parameters show clear even-odd oscillations, which can be attributed to a quantum mechanical interference between waves emanating from the HOH and HHO conical intersections in the B̃→X̃ non-adiabatic pathway. On the other hand, the experiment-theory agreement for the OH(Ã) fragment is also satisfactory, although some small quantitative differences suggest remaining imperfections of the ab initio based potential energy surfaces.

  4. Signatures of non-adiabatic dynamics in the fine-structure state distributions of the OH(X̃/Ã) products in the B-band photodissociation of H2O

    Science.gov (United States)

    Zhou, Linsen; Xie, Daiqian; Guo, Hua

    2015-03-01

    A detailed quantum mechanical characterization of the photodissociation dynamics of H2O at 121.6 nm is presented. The calculations were performed using a full-dimensional wave packet method on coupled potential energy surfaces of all relevant electronic states. Our state-to-state model permits a detailed analysis of the OH(X̃/Ã) product fine-structure populations as a probe of the non-adiabatic dissociation dynamics. The calculated rotational state distributions of the two Λ-doublet levels of OH(X̃, v = 0) exhibit very different characteristics. The A′ states, produced mostly via the B̃→X̃ conical intersection pathway, have significantly higher populations than the A″ counterparts, which are primarily from the B̃→Ã Renner-Teller pathway. The former features a highly inverted and oscillatory rotational state distribution, while the latter has a smooth distribution with much less rotational excitation. In good agreement with experiment, the calculated total OH(X̃) rotational state distribution and anisotropy parameters show clear even-odd oscillations, which can be attributed to a quantum mechanical interference between waves emanating from the HOH and HHO conical intersections in the B̃→X̃ non-adiabatic pathway. On the other hand, the experiment-theory agreement for the OH(Ã) fragment is also satisfactory, although some small quantitative differences suggest remaining imperfections of the ab initio based potential energy surfaces.

  5. Signatures of non-adiabatic dynamics in the fine-structure state distributions of the OH(X̃/Ã) products in the B-band photodissociation of H2O.

    Science.gov (United States)

    Zhou, Linsen; Xie, Daiqian; Guo, Hua

    2015-03-28

    A detailed quantum mechanical characterization of the photodissociation dynamics of H2O at 121.6 nm is presented. The calculations were performed using a full-dimensional wave packet method on coupled potential energy surfaces of all relevant electronic states. Our state-to-state model permits a detailed analysis of the OH(X̃/Ã) product fine-structure populations as a probe of the non-adiabatic dissociation dynamics. The calculated rotational state distributions of the two Λ-doublet levels of OH(X̃, v = 0) exhibit very different characteristics. The A' states, produced mostly via the B̃→X̃ conical intersection pathway, have significantly higher populations than the A″ counterparts, which are primarily from the B̃→Ã Renner-Teller pathway. The former features a highly inverted and oscillatory rotational state distribution, while the latter has a smooth distribution with much less rotational excitation. In good agreement with experiment, the calculated total OH(X̃) rotational state distribution and anisotropy parameters show clear even-odd oscillations, which can be attributed to a quantum mechanical interference between waves emanating from the HOH and HHO conical intersections in the B̃→X̃ non-adiabatic pathway. On the other hand, the experiment-theory agreement for the OH(Ã) fragment is also satisfactory, although some small quantitative differences suggest remaining imperfections of the ab initio based potential energy surfaces.

  6. 15 CFR 908.16 - Signature.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Signature. 908.16 Section 908.16 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National...

  7. 12 CFR 269b.731 - Signature.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Signature. 269b.731 Section 269b.731 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM CHARGES OF UNFAIR LABOR PRACTICES General Rules § 269b.731 Signature. The original of each document filed shall be...

  8. The Pedagogic Signature of the Teaching Profession

    Science.gov (United States)

    Kiel, Ewald; Lerche, Thomas; Kollmannsberger, Markus; Oubaid, Viktor; Weiss, Sabine

    2016-01-01

    Lee S. Shulman deplores that the field of education as a profession does not have a pedagogic signature, which he characterizes as a synthesis of cognitive, practical and moral apprenticeship. In this context, the following study has three goals: 1) In the first theoretical part, the basic problems of constructing a pedagogic signature are…

  9. Infrared ship signature analysis and optimisation

    NARCIS (Netherlands)

    Neele, F.P.

    2005-01-01

    The last decade has seen an increase in the awareness of the infrared signature of naval ships. New ship designs show that infrared signature reduction measures are being incorporated, such as exhaust gas cooling systems, relocation of the exhausts and surface cooling systems. Hull and

  10. Does Social Work Have a Signature Pedagogy?

    Science.gov (United States)

    Earls Larrison, Tara; Korr, Wynne S.

    2013-01-01

    This article contributes to discourse on signature pedagogy by reconceptualizing how our pedagogies are understood and defined for social work education. We critique the view that field education is social work's signature pedagogy and consider what pedagogies are distinct about the teaching and learning of social work. Using Shulman's…

  11. 48 CFR 4.102 - Contractor's signature.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Contractor's signature. 4.102 Section 4.102 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Contract Execution 4.102 Contractor's signature. (a) Individuals. A contract with an...

  12. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document, and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level, and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of individual sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system

  13. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
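    One simple, solver-independent verification that any equilibrium-chemistry result must pass is elemental mass balance: the computed species amounts must reproduce the element inventory fed in. A minimal sketch of that check follows; the H-O system and the mole numbers are purely illustrative, not output of the tool described above.

```python
def mass_balance_residuals(stoich, species_moles, element_moles_in):
    """Residual of each element balance; all should be ~0 at a valid equilibrium.

    stoich[i][j] = atoms of element j per formula unit of species i.
    """
    n_species = len(species_moles)
    return [
        sum(stoich[i][j] * species_moles[i] for i in range(n_species)) - b_j
        for j, b_j in enumerate(element_moles_in)
    ]

# Illustrative H-O system: species H2, O2, H2O; elements H, O.
stoich = [[2, 0], [0, 2], [2, 1]]
species_moles = [0.1, 0.05, 0.9]
element_moles_in = [2.0, 1.0]   # H = 2*0.1 + 2*0.9, O = 2*0.05 + 0.9
residuals = mass_balance_residuals(stoich, species_moles, element_moles_in)
```

    A full verification method would add further invariants (non-negative mole numbers, minimized Gibbs energy against perturbed solutions), but the balance residual is the cheapest necessary condition to automate.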

  14. Seismic verification of underground explosions

    International Nuclear Information System (INIS)

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 250 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper

  15. The verification of ethnographic data.

    Science.gov (United States)

    Pool, Robert

    2017-09-01

    Anthropologists are increasingly required to account for the data on which they base their interpretations and to make it available for public scrutiny and re-analysis. While this may seem straightforward (why not place our data in online repositories?), it is not. Ethnographic 'data' may consist of everything from verbatim transcripts ('hard data') to memories and impressions ('soft data'). Hard data can be archived and re-analysed; soft data cannot. The focus on hard 'objective' data contributes to the delegitimizing of the soft data that are essential for ethnographic understanding, and without which hard data cannot be properly interpreted. However, the credibility of ethnographic interpretation requires the possibility of verification. This could be achieved by obligatory, standardised forms of personal storage with the option for audit if required, and by being more explicit in publications about the nature and status of the data and the process of interpretation.

  16. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  17. Real time gamma-ray signature identifier

    Science.gov (United States)

    Rowland, Mark [Alamo, CA; Gosnell, Tom B [Moraga, CA; Ham, Cheryl [Livermore, CA; Perkins, Dwight [Livermore, CA; Wong, James [Dublin, CA

    2012-05-15

    A real time gamma-ray signature/source identification method and system that uses principal components analysis (PCA) to transform and substantially reduce one or more comprehensive spectral libraries of nuclear material types and configurations into concise representations/signatures, each representing and indexing an individual predetermined spectrum in principal component (PC) space. An unknown gamma-ray signature may then be compared against the representative signatures to find a match, or at least to characterize the unknown signature among all the entries in the library, with a single regression or simple projection into the PC space, substantially reducing processing time and computing resources and enabling real-time characterization and/or identification.
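    The core idea, compressing a spectral library into PC-space signatures and matching an unknown by a single projection, can be sketched with a one-component PCA computed by power iteration. The four-channel "spectra" below are toy data, not real gamma-ray spectra, and a production system would of course retain several components:

```python
import math

def top_principal_component(spectra, iters=100):
    """Mean vector and leading principal component via power iteration."""
    n, d = len(spectra), len(spectra[0])
    mean = [sum(s[j] for s in spectra) / n for j in range(d)]
    x = [[s[j] - mean[j] for j in range(d)] for s in spectra]
    v = [1.0] + [0.0] * (d - 1)          # deterministic start vector
    for _ in range(iters):
        xv = [sum(row[j] * v[j] for j in range(d)) for row in x]   # X v
        w = [sum(x[i][j] * xv[i] for i in range(n)) for j in range(d)]  # X^T X v
        norm = math.sqrt(sum(c * c for c in w)) or 1.0
        v = [c / norm for c in w]
    return mean, v

def signature(spectrum, mean, v):
    """Scalar PC-space signature of one spectrum."""
    return sum((spectrum[j] - mean[j]) * v[j] for j in range(len(v)))

library = [[9, 3, 3, 1], [5, 3, 3, 5], [1, 3, 3, 9]]   # toy library spectra
mean, pc = top_principal_component(library)
sigs = [signature(s, mean, pc) for s in library]

def identify(unknown):
    """Index of the library entry whose signature is closest to the unknown's."""
    s = signature(unknown, mean, pc)
    return min(range(len(sigs)), key=lambda i: abs(sigs[i] - s))
```

    Matching an unknown spectrum now costs one projection and one nearest-neighbour search over scalars, which is the speed advantage the patent describes.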

  18. DIGITAL SIGNATURE IN THE WAY OF LAW

    Directory of Open Access Journals (Sweden)

    Ruya Samlı

    2013-01-01

    Full Text Available A signature can be defined as a person's name or a special sign written to indicate authorship of a document or to confirm its contents. A person signs many times in his or her life, and a signature that is used thousands of times on everything from formal documents to exams is important to its owner. Signing legal documents, in particular, can have serious consequences: if a person's signature is forged, he or she can be placed in debt, donate an entire estate, be implicated in offences, or be bound by judicial acts. Today, because many operations are carried out in digital environments and over the Internet, the signing operation that provides identity validation must also be carried into the digital environment. In this paper, the concept of the digital signature, which was introduced for this reason, and its status in international practice and in Turkish law are investigated.

  19. Implementing Signature Neural Networks with Spiking Neurons.

    Science.gov (United States)

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to develop realistic Artificial Neural Networks (ANNs). Unlike traditional firing rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have discovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings providing novel elements of inspiration? This is an intriguing question for the research community and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work, we adapt the core concepts of the recently proposed Signature Neural Network paradigm, i.e., neural signatures to identify each unit in the network, local information contextualization during the processing, and multicoding strategies for information propagation regarding the origin and the content of the data, to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms have been used yet in the context of ANNs of spiking neurons. This paper provides a proof-of-concept for their applicability in such networks. Computer simulations show that a simple network model like the one discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces. 
As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the absence

  20. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  1. Analysis of Radar Doppler Signature from Human Data

    Directory of Open Access Journals (Sweden)

    M. ANDRIĆ

    2014-04-01

    Full Text Available This paper presents the results of time-domain (autocorrelation) and time-frequency (spectrogram) analyses of radar signals returned from moving human targets. When a radar signal falls on a human target moving toward or away from the radar, the signals reflected from different parts of the body are Doppler-shifted in proportion to the velocities of those parts. The moving parts of the body produce a characteristic Doppler signature. The main contribution comes from the torso, which determines the central Doppler frequency of the target. The motion of the arms and legs modulates the returned radar signal and generates sidebands around the central Doppler frequency, referred to as micro-Doppler signatures. Analyses of experimental data demonstrated that human motion signatures are extracted more reliably from the spectrogram. While the central Doppler frequency can be determined with both the autocorrelation and the spectrogram, extraction of the fundamental cadence frequency from the autocorrelation is unreliable when the target is in the presence of clutter. It was shown that the fundamental cadence frequency increases with increasingly dynamic movement of the people, and that the possibility of its extraction is proportional to the degree of synchronization of the movements of the persons in the group.
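    The central Doppler frequency described above is simply the dominant spectral line of the return. A minimal sketch on a synthetic signal: an 80 Hz "torso" Doppler line phase-modulated at a 4 Hz limb cadence. All numbers are illustrative, and a naive O(n²) DFT stands in for the paper's spectrogram analysis:

```python
import cmath
import math

def dominant_freq_hz(x, fs):
    """Frequency of the largest single-sided DFT line (naive O(n^2) DFT)."""
    n = len(x)
    best_k, best_mag = 1, -1.0
    for k in range(1, n // 2):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * fs / n

fs, n = 500, 500                       # 1 s of data -> 1 Hz resolution
torso_hz, cadence_hz, beta = 80.0, 4.0, 1.0   # illustrative Doppler parameters
x = [
    math.cos(2 * math.pi * torso_hz * t / fs
             + beta * math.sin(2 * math.pi * cadence_hz * t / fs))
    for t in range(n)
]
central_doppler = dominant_freq_hz(x, fs)
```

    The modulation places micro-Doppler sidebands at 80 ± 4m Hz, but for a modest modulation index the torso line dominates, so the peak search recovers the central Doppler frequency.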

  2. 48 CFR 804.101 - Contracting officer's signature.

    Science.gov (United States)

    2010-10-01

    ... signature. 804.101 Section 804.101 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.101 Contracting officer's signature. (a) If a... signature. ...

  3. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  4. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  5. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  6. Maximizing biomarker discovery by minimizing gene signatures

    Directory of Open Access Journals (Sweden)

    Chang Chang

    2011-12-01

Full Text Available Abstract Background The use of gene signatures can potentially be of considerable value in the field of clinical diagnosis. However, gene signatures defined with different methods can differ considerably even when applied to the same disease and the same endpoint. Previous studies have shown that the correct selection of subsets of genes from microarray data is key for the accurate classification of disease phenotypes, and a number of methods have been proposed for this purpose. However, these methods refine the subsets by considering each feature individually, and they do not confirm the association between the genes identified in each gene signature and the phenotype of the disease. We propose an innovative method termed Minimize Feature's Size (MFS), based on multiple-level similarity analyses and gene-disease associations for breast cancer endpoints, comparing classifier models generated in the second phase of the MicroArray Quality Control project (MAQC-II), in order to develop effective meta-analysis strategies that transform the MAQC-II signatures into a robust and reliable set of biomarkers for clinical applications. Results We analyzed the similarity of the multiple gene signatures within an endpoint and between the two breast cancer endpoints at the probe and gene levels. The results indicate that disease-related genes are preferentially selected as components of a gene signature, and that the gene signatures for the two endpoints could be interchangeable. The minimized signatures were built at the probe level by using MFS for each endpoint. By applying this approach, we generated much smaller gene signatures with predictive power similar to that of the gene signatures from MAQC-II. Conclusions Our results indicate that gene signatures of both large and small sizes can perform equally well in clinical applications.
Moreover, consistency and biological significance can be detected among different gene signatures, reflecting the

  7. High-fidelity simulation of turbofan engine. ; Verification and improvement of model's dynamical characteristics in linear operating range. Turbofan engine no koseito simulation. ; Senkei sado han'i ni okeru model dotokusei no kensho to seido kojo ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Yamane, H; Kagiyama, S [Defence Agency, Tokyo (Japan)

    1993-09-25

This paper describes applying pulse inputs to the fuel supply during trial operation of a turbofan engine, measuring the response, and calculating frequency characteristics and time constants to obtain the dynamic characteristics of the engine on the ground. The measured engine characteristics were compared with the characteristics obtained by numerically analyzing a mathematical simulation model, and the model was corrected to yield a high-accuracy simulation. An element model and a dynamics model were prepared in detail for the main engine components, such as the fans, compressor, combustor, and turbine, along a flow diagram from the air intake to the exhaust nozzle. The pulses were applied to the fuel supply by opening and closing an electromagnetic valve. Closing the valve for about 0.7 s produced a pulse-like change in flow rate and revealed differences in phase and trend between the engine and model characteristics at both high and low frequencies. To correct the model characteristics, the combustion delay time was set to 0.02 s, taking the delay associated with the heat capacity of the combustor into account. The improvement of the model was verified as its phase characteristics approached those of the engine. 13 refs., 17 figs., 2 tabs.
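The time-constant identification described above can be illustrated with a first-order lag: sample a step response and read off the time at which it reaches 63.2% of its final value. The sampling rate and the 0.02 s time constant below are chosen to echo the abstract's combustion-delay figure, but the data are synthetic, not the engine measurements.

```python
# Recover a first-order time constant from a sampled step response.
import math

def first_order_step(tau, dt, n):
    """y(t) = 1 - exp(-t/tau), sampled every dt seconds."""
    return [1.0 - math.exp(-i * dt / tau) for i in range(n)]

def estimate_tau(y, dt):
    """Time at which the response first reaches 63.2% of its final value."""
    final = y[-1]
    for i, v in enumerate(y):
        if v >= 0.632 * final:
            return i * dt
    return None

y = first_order_step(tau=0.02, dt=0.001, n=200)  # 0.02 s lag, 1 kHz sampling
print(round(estimate_tau(y, 0.001), 3))
```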

  8. From Anosov dynamics to hyperbolic attractors

    Indian Academy of Sciences (India)

The dynamics on the attractive sets of the self-oscillatory systems and for the original Anosov geodesic flow. The hyperbolic nature ... Hyperbolic theory is a branch of the theory of dynamical systems. ... Figure 5: verification of the hyperbolicity criterion.

  9. Real-time scene and signature generation for ladar and imaging sensors

    Science.gov (United States)

    Swierkowski, Leszek; Christie, Chad L.; Antanovskii, Leonid; Gouthas, Efthimios

    2014-05-01

    This paper describes development of two key functionalities within the VIRSuite scene simulation program, broadening its scene generation capabilities and increasing accuracy of thermal signatures. Firstly, a new LADAR scene generation module has been designed. It is capable of simulating range imagery for Geiger mode LADAR, in addition to the already existing functionality for linear mode systems. Furthermore, a new 3D heat diffusion solver has been developed within the VIRSuite signature prediction module. It is capable of calculating the temperature distribution in complex three-dimensional objects for enhanced dynamic prediction of thermal signatures. With these enhancements, VIRSuite is now a robust tool for conducting dynamic simulation for missiles with multi-mode seekers.
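The heart of a heat-diffusion signature solver like the one described is a repeated diffusion update of the temperature field. VIRSuite's solver is 3D; the sketch below shows only the core explicit finite-difference step, in 1D for brevity, with invented material and grid parameters.

```python
# One explicit finite-difference step of the heat equation T_t = alpha * T_xx
# with fixed (Dirichlet) end temperatures. 1D illustration only.

def diffuse_step(T, alpha, dx, dt):
    r = alpha * dt / (dx * dx)
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    return [T[0]] + [
        T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        for i in range(1, len(T) - 1)
    ] + [T[-1]]

T = [300.0, 300.0, 400.0, 300.0, 300.0]  # hot spot, temperatures in kelvin
for _ in range(20):
    T = diffuse_step(T, alpha=1e-6, dx=0.01, dt=25.0)
```

After a few steps the hot spot spreads and decays toward the fixed boundary temperature, which is the behaviour the 3D solver tracks for thermal-signature prediction.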

  10. Genomic Signatures of Sexual Conflict.

    Science.gov (United States)

    Kasimatis, Katja R; Nelson, Thomas C; Phillips, Patrick C

    2017-10-30

    Sexual conflict is a specific class of intergenomic conflict that describes the reciprocal sex-specific fitness costs generated by antagonistic reproductive interactions. The potential for sexual conflict is an inherent property of having a shared genome between the sexes and, therefore, is an extreme form of an environment-dependent fitness effect. In this way, many of the predictions from environment-dependent selection can be used to formulate expected patterns of genome evolution under sexual conflict. However, the pleiotropic and transmission constraints inherent to having alleles move across sex-specific backgrounds from generation to generation further modulate the anticipated signatures of selection. We outline methods for detecting candidate sexual conflict loci both across and within populations. Additionally, we consider the ability of genome scans to identify sexually antagonistic loci by modeling allele frequency changes within males and females due to a single generation of selection. In particular, we highlight the need to integrate genotype, phenotype, and functional information to truly distinguish sexual conflict from other forms of sexual differentiation. © The American Genetic Association 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
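The single-generation allele-frequency modelling mentioned in the abstract can be sketched with standard viability selection applied separately in each sex: a sexually antagonistic allele is pushed in opposite directions in males and females. The fitness values below are invented for illustration; the paper's actual genome-scan model is not reproduced.

```python
# One generation of sex-specific viability selection at a biallelic locus.

def after_selection(p, w_AA, w_Aa, w_aa):
    """Frequency of allele A after selection, assuming Hardy-Weinberg
    genotype frequencies before selection."""
    q = 1.0 - p
    w_bar = p * p * w_AA + 2 * p * q * w_Aa + q * q * w_aa
    return (p * p * w_AA + p * q * w_Aa) / w_bar

p0 = 0.5
# Allele A is favoured in males and disfavoured in females (invented values).
p_males = after_selection(p0, w_AA=1.10, w_Aa=1.05, w_aa=1.00)
p_females = after_selection(p0, w_AA=0.90, w_Aa=0.95, w_aa=1.00)
print(round(p_males, 4), round(p_females, 4))
```

The resulting male-female allele-frequency divergence is the kind of signal a genome scan for sexual conflict would look for, while the abstract cautions that such signals must be combined with phenotype and functional data.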

  11. Transcriptomic signatures in cartilage ageing

    Science.gov (United States)

    2013-01-01

    Introduction Age is an important factor in the development of osteoarthritis. Microarray studies provide insight into cartilage aging but do not reveal the full transcriptomic phenotype of chondrocytes such as small noncoding RNAs, pseudogenes, and microRNAs. RNA-Seq is a powerful technique for the interrogation of large numbers of transcripts including nonprotein coding RNAs. The aim of the study was to characterise molecular mechanisms associated with age-related changes in gene signatures. Methods RNA for gene expression analysis using RNA-Seq and real-time PCR analysis was isolated from macroscopically normal cartilage of the metacarpophalangeal joints of eight horses; four young donors (4 years old) and four old donors (>15 years old). RNA sequence libraries were prepared following ribosomal RNA depletion and sequencing was undertaken using the Illumina HiSeq 2000 platform. Differentially expressed genes were defined using Benjamini-Hochberg false discovery rate correction with a generalised linear model likelihood ratio test (P ageing cartilage. Conclusion There was an age-related dysregulation of matrix, anabolic and catabolic cartilage factors. This study has increased our knowledge of transcriptional networks in cartilage ageing by providing a global view of the transcriptome. PMID:23971731
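The Benjamini-Hochberg false discovery rate correction used to call differentially expressed genes can be stated in a few lines: sort the p-values, find the largest rank k with p_(k) <= k*alpha/m, and reject the k smallest. This is the standard procedure, shown on invented p-values (the study applies it to a generalised linear model likelihood ratio test).

```python
# Minimal Benjamini-Hochberg FDR correction.

def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank whose p-value clears the BH line
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])

print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.9]))
```

Note that a p-value can be rescued by a later rank clearing the line, which is why the procedure keeps the largest qualifying k rather than stopping at the first failure.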

  12. Three plasma metabolite signatures for diagnosing high altitude pulmonary edema

    Science.gov (United States)

    Guo, Li; Tan, Guangguo; Liu, Ping; Li, Huijie; Tang, Lulu; Huang, Lan; Ren, Qian

    2015-10-01

    High-altitude pulmonary edema (HAPE) is a potentially fatal condition, occurring at altitudes greater than 3,000 m and affecting rapidly ascending, non-acclimatized healthy individuals. However, the lack of biomarkers for this disease still constitutes a bottleneck in the clinical diagnosis. Here, ultra-high performance liquid chromatography coupled with Q-TOF mass spectrometry was applied to study plasma metabolite profiling from 57 HAPE and 57 control subjects. 14 differential plasma metabolites responsible for the discrimination between the two groups from discovery set (35 HAPE subjects and 35 healthy controls) were identified. Furthermore, 3 of the 14 metabolites (C8-ceramide, sphingosine and glutamine) were selected as candidate diagnostic biomarkers for HAPE using metabolic pathway impact analysis. The feasibility of using the combination of these three biomarkers for HAPE was evaluated, where the area under the receiver operating characteristic curve (AUC) was 0.981 and 0.942 in the discovery set and the validation set (22 HAPE subjects and 22 healthy controls), respectively. Taken together, these results suggested that this composite plasma metabolite signature may be used in HAPE diagnosis, especially after further investigation and verification with larger samples.
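The AUC figures quoted for the three-metabolite panel can be computed without any curve-drawing via the Mann-Whitney interpretation: the AUC is the probability that a randomly chosen case scores higher than a randomly chosen control. The scores below are invented; the paper reports AUC = 0.981 (discovery) and 0.942 (validation).

```python
# AUC of a diagnostic score via the Mann-Whitney statistic.

def auc(cases, controls):
    wins = 0.0
    for c in cases:
        for k in controls:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5  # ties count half
    return wins / (len(cases) * len(controls))

print(auc([0.9, 0.8, 0.7, 0.6], [0.65, 0.4, 0.3, 0.2]))
```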

  13. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  14. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  15. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  16. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  17. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  18. Reduction of a Ship's Magnetic Field Signatures

    CERN Document Server

    Holmes, John

    2008-01-01

    Decreasing the magnetic field signature of a naval vessel will reduce its susceptibility to detonating naval influence mines and the probability of a submarine being detected by underwater barriers and maritime patrol aircraft. Both passive and active techniques for reducing the magnetic signatures produced by a vessel's ferromagnetism, roll-induced eddy currents, corrosion-related sources, and stray fields are presented. Mathematical models of simple hull shapes are used to predict the levels of signature reduction that might be achieved through the use of alternate construction materials. Al

  19. Molecular signatures of thyroid follicular neoplasia

    DEFF Research Database (Denmark)

    Borup, R.; Rossing, M.; Henao, Ricardo

    2010-01-01

    The molecular pathways leading to thyroid follicular neoplasia are incompletely understood, and the diagnosis of follicular tumors is a clinical challenge. To provide leads to the pathogenesis and diagnosis of the tumors, we examined the global transcriptome signatures of follicular thyroid...... a mechanism for cancer progression, which is why we exploited the results in order to generate a molecular classifier that could identify 95% of all carcinomas. Validation employing public domain and cross-platform data demonstrated that the signature was robust and could diagnose follicular nodules...... and robust genetic signature for the diagnosis of FA and FC. Endocrine-Related Cancer (2010) 17 691-708...

  20. DIGITAL SIGNATURE IN THE WAY OF LAW

    OpenAIRE

    Ruya Samlı

    2013-01-01

A signature can be defined as a person's name or the special marks a person writes to indicate authorship of, or agreement with, a piece of writing. A person signs many times in his or her life. A signature that is used thousands of times, on everything from formal documents to exams, is important to its owner. In particular, signing in legal transactions can produce important consequences. If a person's signature is imitated by another person, he/she can be...

  1. Criticality meets learning: Criticality signatures in a self-organizing recurrent neural network.

    Science.gov (United States)

    Del Papa, Bruno; Priesemann, Viola; Triesch, Jochen

    2017-01-01

    Many experiments have suggested that the brain operates close to a critical state, based on signatures of criticality such as power-law distributed neuronal avalanches. In neural network models, criticality is a dynamical state that maximizes information processing capacities, e.g. sensitivity to input, dynamical range and storage capacity, which makes it a favorable candidate state for brain function. Although models that self-organize towards a critical state have been proposed, the relation between criticality signatures and learning is still unclear. Here, we investigate signatures of criticality in a self-organizing recurrent neural network (SORN). Investigating criticality in the SORN is of particular interest because it has not been developed to show criticality. Instead, the SORN has been shown to exhibit spatio-temporal pattern learning through a combination of neural plasticity mechanisms and it reproduces a number of biological findings on neural variability and the statistics and fluctuations of synaptic efficacies. We show that, after a transient, the SORN spontaneously self-organizes into a dynamical state that shows criticality signatures comparable to those found in experiments. The plasticity mechanisms are necessary to attain that dynamical state, but not to maintain it. Furthermore, onset of external input transiently changes the slope of the avalanche distributions - matching recent experimental findings. Interestingly, the membrane noise level necessary for the occurrence of the criticality signatures reduces the model's performance in simple learning tasks. Overall, our work shows that the biologically inspired plasticity and homeostasis mechanisms responsible for the SORN's spatio-temporal learning abilities can give rise to criticality signatures in its activity when driven by random input, but these break down under the structured input of short repeating sequences.
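The criticality signatures discussed above start from avalanche statistics: segment a binned activity trace into avalanches (maximal runs of non-zero bins) and check whether their sizes follow a power law. The sketch below uses a synthetic trace and the standard continuous maximum-likelihood approximation for the exponent; it is illustrative, not the SORN paper's analysis pipeline.

```python
# Avalanche extraction and a power-law exponent estimate (Hill/Clauset MLE,
# continuous approximation with the usual s_min - 0.5 offset).
import math

def avalanche_sizes(activity):
    """An avalanche is a maximal run of non-zero bins; its size is the
    total activity within the run."""
    sizes, current = [], 0
    for a in activity:
        if a > 0:
            current += a
        elif current > 0:
            sizes.append(current)
            current = 0
    if current > 0:
        sizes.append(current)
    return sizes

def powerlaw_alpha(sizes, s_min=1):
    """MLE exponent for P(s) ~ s^-alpha over the tail s >= s_min."""
    tail = [s for s in sizes if s >= s_min]
    return 1.0 + len(tail) / sum(math.log(s / (s_min - 0.5)) for s in tail)

trace = [0, 2, 3, 0, 0, 1, 0, 4, 1, 1, 0, 0, 6, 0]
print(avalanche_sizes(trace))
```

Experimental work typically reports exponents near 1.5 for avalanche sizes; a rigorous fit would also compare against alternative distributions rather than trusting the MLE alone.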

  2. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  3. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMCs pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far, however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMCs in an open and thus compositional interpretation....

  4. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    International audience; Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  5. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.

  6. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  7. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  8. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Full Text Available Abstract Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks.

  9. European Train Control System: A Case Study in Formal Verification

    Science.gov (United States)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
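A flavour of the parameter constraints proved for ETCS can be given with elementary kinematics: with maximum braking deceleration b, a train at speed v needs at least v^2/(2b) metres to stop, so the controller must start braking no later than that distance before the end of its movement authority. This is an illustrative simplification with invented numbers, not the exact constraint verified in KeYmaera (which also accounts for reaction time and disturbances).

```python
# Simplified ETCS-style braking condition: brake once the remaining distance
# to the movement-authority end no longer exceeds the stopping distance.

def must_brake(v, dist_to_authority_end, b):
    """True when the train must start braking now to stop in time."""
    return dist_to_authority_end <= v * v / (2.0 * b)

# ~300 km/h (83.3 m/s) with b = 0.7 m/s^2 gives a stopping distance of
# roughly 4,956 m.
print(must_brake(83.3, 5000.0, 0.7), must_brake(83.3, 4900.0, 0.7))
```

Formal verification, as in the paper, proves that such a condition guarantees collision freedom for all parameter values satisfying the derived constraints, not just for sampled numbers like these.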

  10. Integrated Aero–Vibroacoustics: The Design Verification Process of Vega-C Launcher

    Directory of Open Access Journals (Sweden)

    Davide Bianco

    2018-01-01

Full Text Available The verification of a space launcher at the design level is a complex issue because of (i) the lack of a detailed modeling capability of the acoustic pressure produced by the rocket; and (ii) the difficulties in applying deterministic methods to the large-scale metallic structures. In this paper, an innovative integrated design verification process is described, based on the bridging between a new semiempirical jet noise model and a hybrid finite-element method/statistical energy analysis (FEM/SEA) approach for calculating the acceleration produced at the payload and equipment level within the structure, vibrating under the external acoustic forcing field. The result is a verification method allowing for accurate prediction of the vibroacoustics in the launcher interior, using limited computational resources and without resorting to computational fluid dynamics (CFD) data. Some examples concerning the Vega-C launcher design are shown.

  11. Verification of the ECMWF ensemble forecasts of wind speed against analyses and observations

    DEFF Research Database (Denmark)

    Pinson, Pierre; Hagedorn, Renate

    2012-01-01

A framework for the verification of ensemble forecasts of near-surface wind speed is described. It is based on existing scores and diagnostic tools, though considering observations from synoptic stations as reference instead of the analysis. This approach is motivated by the idea of having a user-oriented view of verification, for instance with wind power applications in mind. The verification framework is applied specifically to the case of ECMWF ensemble forecasts over Europe. Dynamic climatologies are derived at the various stations, serving as a benchmark. The impact of observational uncertainty on scores and diagnostic tools is also considered. The interest of this framework is demonstrated by its application to the routine evaluation of ensemble forecasts and to the assessment of the quality improvements brought in by the recent change in horizontal resolution of the ECMWF ensemble...
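One of the standard scores used in this kind of ensemble verification is the continuous ranked probability score (CRPS), which for a finite ensemble reduces to a closed form: the mean absolute error of the members minus half the mean absolute difference between member pairs. The wind speeds below are invented; the abstract does not list its exact score set, so treat this as a representative example.

```python
# Empirical CRPS of an ensemble forecast against one observation:
# CRPS = mean|x_i - y| - 0.5 * mean|x_i - x_j|  (lower is better).

def crps(ensemble, obs):
    n = len(ensemble)
    term1 = sum(abs(x - obs) for x in ensemble) / n
    term2 = sum(abs(x - z) for x in ensemble for z in ensemble) / (2.0 * n * n)
    return term1 - term2

print(round(crps([4.0, 5.0, 6.0], 5.5), 4))  # invented wind speeds in m/s
```

Averaging this score over many station-forecast pairs, against observations rather than the analysis, is exactly the user-oriented evaluation the abstract advocates.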

  12. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

Originally, the aim of the Y43 project was to verify the TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. The verification calculations were therefore focused on the catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. The verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudo-reaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamics theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  13. On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature

    Science.gov (United States)

    Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar

Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. Recent developments in the biometrics area have led to smaller, faster and cheaper systems, which in turn has increased the number of possible application areas for biometric identity verification. Biometric data, being derived from human bodies (and especially when used to identify or verify those bodies), are considered personally identifiable information (PII). The collection, use and disclosure of biometric data — image or template — invokes rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance and profiling. Biometric data transmitted across networks and stored in various databases by others can also be stolen, copied, or otherwise misused in ways that can materially affect the individual involved. As biometric systems are vulnerable to replay, database and brute-force attacks, such potential attacks must be analysed before the systems are massively deployed in security applications. Along with security, the privacy of users is also an important factor: the line structures in palmprints carry personal characteristics, a person can be recognised from face images, and fake signatures can be practised by carefully studying the signature images available in a database. We propose a cryptographic approach that encrypts the images of palmprints, faces, and signatures with an advanced Hill cipher technique, hiding the information contained in the images and protecting them from the attacks mentioned above. So, during the feature extraction, the
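The Hill cipher at the core of the proposed scheme multiplies fixed-size blocks of pixel values by an invertible key matrix. The abstract's "advanced" variant is not specified, so the sketch below is the textbook version over bytes (mod 256), where the key is invertible exactly when its determinant is odd; the key matrix and pixel values are invented.

```python
# Textbook Hill cipher on byte pairs (mod 256). KEY has determinant 9 (odd),
# so it is invertible mod 256; KEY_INV was computed as det^-1 * adjugate.

KEY = [[3, 3], [2, 5]]           # det = 3*5 - 3*2 = 9
KEY_INV = [[29, 85], [142, 171]]  # KEY^-1 mod 256 (9^-1 mod 256 = 57)

def hill(block, key):
    """Multiply a 2-element block by a 2x2 key matrix, mod 256."""
    return [(key[0][0] * block[0] + key[0][1] * block[1]) % 256,
            (key[1][0] * block[0] + key[1][1] * block[1]) % 256]

pixels = [12, 200, 7, 31]  # invented image bytes
cipher = [b for i in range(0, len(pixels), 2)
          for b in hill(pixels[i:i + 2], KEY)]
plain = [b for i in range(0, len(cipher), 2)
         for b in hill(cipher[i:i + 2], KEY_INV)]
```

Decryption is just multiplication by the inverse key, so `plain` recovers `pixels` exactly. Note that the plain Hill cipher is linear and therefore weak against known-plaintext attacks, which is presumably why the paper uses an advanced variant.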

  14. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
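The allocation idea in the abstract can be caricatured with a greedy loop: given a residual-risk function per subsystem, spend each examination unit where it buys the largest marginal risk reduction. The failure probabilities, detection rates, and independence-style risk model below are invented placeholders, not the paper's verification risk function or its system topologies.

```python
# Greedy examination-effort allocation by marginal risk reduction.

def residual_risk(p_fail, detect, units):
    """Risk that a fault survives `units` examinations, each catching a
    fraction `detect` of the remaining faults (toy independence model)."""
    return p_fail * (1.0 - detect) ** units

def allocate(subsystems, budget):
    """Assign `budget` examination units one at a time to the subsystem
    with the largest marginal risk reduction."""
    units = {name: 0 for name in subsystems}
    for _ in range(budget):
        best = max(
            subsystems,
            key=lambda s: residual_risk(*subsystems[s], units[s])
                        - residual_risk(*subsystems[s], units[s] + 1))
        units[best] += 1
    return units

# Invented (failure probability, detection rate) per subsystem.
subsystems = {"steering": (0.10, 0.5), "ballast": (0.02, 0.5),
              "power": (0.30, 0.5)}
print(allocate(subsystems, 4))
```

With these numbers the riskiest subsystem absorbs most of the budget until diminishing returns make examining the next subsystem worthwhile, which is the qualitative behaviour a marginal-verification-risk method produces.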

  15. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables

  16. Tomotherapy: IMRT and tomographic verification

    International Nuclear Information System (INIS)

    Mackie, T.R.

    2000-01-01

include MLCs, and many clinics use them to replace 90% or more of the field-shaping requirements of conventional radiotherapy. Several academic centers are now treating patients with IMRT using conventional MLCs to modulate the field. IMRT using conventional MLCs has the advantage that the patient is stationary during the treatment and the MLCs can also be used in conventional practice. Nevertheless, tomotherapy using the Peacock system delivers the most conformal dose distributions of any commercial system to date. The biggest limitation of both the NOMOS Peacock tomotherapy system and conventional MLCs for IMRT delivery is the lack of treatment verification. In conventional few-field radiotherapy, one relied on portal images to determine whether the patient was set up correctly and the beams were correctly positioned. With IMRT, the image contrast is superimposed on the beam intensity variation. Conventional practice allowed for monitor unit calculation checks and point dosimeters placed on the patient's surface to verify that the treatment was properly delivered. With IMRT, hand calculation of monitor units is impossible, and dosimeters placed on the patient's surface are prone to error owing to high gradients in the beam intensity. NOMOS has developed a verification phantom in which multiple sheets of film are placed in a light-tight box that is irradiated with the same beam pattern used to treat the patient. The optical density of the films is adjusted, normalized, and calibrated, and then quantitatively compared with the dose calculated for the phantom delivery. However, this process is too laborious to be used for patient-specific QA. If IMRT becomes ubiquitous and can be shown to be useful for most treatment sites, then treatment units dedicated to IMRT delivery and verification need to be designed. Helical tomotherapy is such a redesign.
Helical tomotherapy is the delivery of a rotational fan beam while the patient is

  17. Verification of excess defense material

    International Nuclear Information System (INIS)

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-01-01

The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raise the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials.

  18. Dosimetric verification of IMRT plans

    International Nuclear Information System (INIS)

    Bulski, W.; Cheimicski, K.; Rostkowska, J.

    2012-01-01

Intensity modulated radiotherapy (IMRT) is a complex procedure requiring proper dosimetric verification. IMRT dose distributions are characterized by steep dose gradients which make it possible to spare organs at risk and allow for an escalation of the dose to the tumor. They require a large number of radiation beams (sometimes over 10). Fluence measurements for individual beams are not sufficient to evaluate the total dose distribution and to assure patient safety. The methods used at the Centre of Oncology in Warsaw are presented. In order to measure dose distributions in various cross-sections, film dosimeters were used (radiographic Kodak EDR2 films and radiochromic Gafchromic EBT films). The film characteristics were carefully examined. Several types of tissue-equivalent phantoms were developed. A methodology for comparing measured dose distributions against the distributions calculated by treatment planning systems (TPS) was developed and tested. The tolerance level for this comparison was set at 3% difference in dose and 3 mm in distance to agreement. The so-called gamma formalism was used. The results of these comparisons for a group of over 600 patients are presented. Agreement was found in 87% of cases. This film dosimetry methodology was used as a benchmark to test and validate the performance of commercially available 2D and 3D matrices of detectors (ionization chambers or diodes). The results of these validations are also presented. (authors)
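
The 3%/3 mm comparison described above can be sketched as a simplified one-dimensional gamma-index calculation. The dose profiles and the use of a global dose normalization are invented for illustration; this is not the Warsaw group's code or data.

```python
# Simplified 1D gamma-index comparison of a measured vs. calculated dose
# profile using 3% dose-difference / 3 mm distance-to-agreement criteria.
# A point passes the comparison when its gamma value is <= 1.

def gamma_index(positions, measured, calculated, dose_tol=0.03, dist_tol=3.0):
    """Return the gamma value at each measurement point (gamma <= 1 passes)."""
    norm = dose_tol * max(calculated)      # global dose normalization (assumption)
    gammas = []
    for xm, dm in zip(positions, measured):
        best = float("inf")
        for xc, dc in zip(positions, calculated):
            dose_term = ((dm - dc) / norm) ** 2
            dist_term = ((xm - xc) / dist_tol) ** 2
            best = min(best, (dose_term + dist_term) ** 0.5)
        gammas.append(best)
    return gammas

positions  = [0.0, 1.0, 2.0, 3.0, 4.0]          # mm
calculated = [1.00, 0.95, 0.80, 0.50, 0.20]     # relative dose from the TPS
measured   = [1.01, 0.94, 0.81, 0.49, 0.21]     # film measurement

gammas = gamma_index(positions, measured, calculated)
pass_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
```

A real evaluation would interpolate the calculated distribution between points and work in 2D or 3D; the nested search above is the essential structure.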

  19. Magnetic Signature of Brushless Electric Motors

    National Research Council Canada - National Science Library

    Clarke, David

    2006-01-01

    Brushless electric motors are used in a number of underwater vehicles. When these underwater vehicles are used for mine clearance operations the magnetic signature of the brushless motors is important...

  20. Magnetic signature surveillance of nuclear fuel

    International Nuclear Information System (INIS)

    Bernatowicz, H.; Schoenig, F.C.

    1981-01-01

Typical nuclear fuel material contains tramp ferromagnetic particles of random size and distribution. Also, selected amounts of paramagnetic or ferromagnetic material can be added at random or at known positions in the fuel material. The fuel material in its non-magnetic container is scanned along its length by magnetic susceptibility detecting apparatus, whereby susceptibility changes along its length are obtained and provide a unique signal waveform of the container of fuel material as a signature thereof. The output signature is stored. At subsequent times in its life the container is again scanned and the respective signatures obtained are compared with the initially obtained signature, any differences indicating alteration of or tampering with the fuel material. If the fuel material includes a paramagnetic additive, its effects can be cancelled out by taking two measurements along the container. (author)
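
The comparison step can be sketched as follows: a stored baseline susceptibility scan is compared point-by-point with a later rescan, and any position whose deviation exceeds a tolerance flags possible tampering. The waveforms and tolerance below are invented for illustration.

```python
# Hypothetical signature comparison: flag scan positions where a rescan
# deviates from the stored baseline by more than a fixed tolerance.

def compare_signatures(baseline, rescan, tolerance=0.05):
    """Return indices along the scan where the rescan deviates from baseline."""
    return [i for i, (b, r) in enumerate(zip(baseline, rescan))
            if abs(r - b) > tolerance]

baseline = [0.10, 0.12, 0.35, 0.11, 0.28, 0.10]   # susceptibility vs. position
intact   = [0.11, 0.12, 0.34, 0.10, 0.29, 0.10]   # rescan of untouched fuel
tampered = [0.11, 0.12, 0.34, 0.10, 0.05, 0.10]   # tramp particle removed

assert compare_signatures(baseline, intact) == []     # matches within tolerance
assert compare_signatures(baseline, tampered) == [4]  # deviation at position 4
```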

  1. Blind Quantum Signature with Blind Quantum Computation

    Science.gov (United States)

    Li, Wei; Shi, Ronghua; Guo, Ying

    2017-04-01

Blind quantum computation allows a client without quantum abilities to interact with a quantum server to perform an unconditionally secure computing protocol while protecting the client's privacy. Motivated by the confidentiality of blind quantum computation, a blind quantum signature scheme with a concise structure is designed. Unlike traditional signature schemes, the signing and verifying operations are performed through measurement-based quantum computation. Inputs of the blind quantum computation are securely controlled with multi-qubit entangled states. The unique signature of the transmitted message is generated by the signer without leaking information in imperfect channels, and the receiver can verify the validity of the signature using the quantum matching algorithm. The security is guaranteed by the entanglement of the quantum system for blind quantum computation. It provides a potential practical application for e-commerce in cloud computing and first-generation quantum computation.

  2. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as recent technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
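
The integrity-and-authenticity half of the problem can be illustrated with a minimal sketch: seal a captured evidence blob at acquisition time, then verify later that it has not changed. A keyed HMAC stands in here for the digital signature and secure time-stamp that PKIDEV actually employs; the key and data are invented.

```python
import hashlib
import hmac

def seal(evidence: bytes, key: bytes) -> str:
    """Produce an integrity tag for evidence at capture time."""
    return hmac.new(key, evidence, hashlib.sha256).hexdigest()

def verify(evidence: bytes, key: bytes, tag: str) -> bool:
    """Check that the evidence still matches the tag recorded at capture."""
    return hmac.compare_digest(seal(evidence, key), tag)

key = b"examiner-secret"                  # hypothetical examiner key
original = b"disk image, sector dump..."  # placeholder evidence bytes
tag = seal(original, key)

assert verify(original, key, tag)                   # untouched evidence passes
assert not verify(original + b"tampered", key, tag) # any change is detected
```

A public-key signature over the same digest would additionally bind the tag to a specific examiner's identity, which is the role PKI plays in the model.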

  3. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

Software engineering of scientific code is challenging for a number of reasons including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people for which we use bazaar for revision control, making good use of the strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process; not a discrete event performed once development has ceased. 
Much of the code verification is done via the "gold standard" of comparisons to analytical
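
A verification test of the kind described, an automated comparison of a numerical result against a known analytical solution, might look like the sketch below. The integrand and tolerance are illustrative, not an actual Fluidity test.

```python
import math

# Sketch of a "gold standard" verification check: a numerical integral is
# compared with its analytical value within a tolerance, so the check can
# run automatically on every commit.

def trapezoid(f, a, b, n):
    """Composite trapezoid rule on [a, b] with n intervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * total

numerical = trapezoid(math.sin, 0.0, math.pi, 1000)
analytical = 2.0                      # integral of sin(x) on [0, pi]
assert abs(numerical - analytical) < 1e-5
```

Real code-verification suites extend this pattern to convergence-rate checks: halving the mesh spacing and asserting that the error falls at the expected order.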

  4. Design of Service Net based Correctness Verification Approach for Multimedia Conferencing Service Orchestration

    Directory of Open Access Journals (Sweden)

    Cheng Bo

    2012-02-01

Full Text Available Multimedia conferencing is increasingly becoming a very important and popular application over the Internet. Because multimedia conferencing involves asynchronous communications and large, dynamically concurrent processes, achieving sufficient correctness guarantees is a relevant challenge, and supporting effective verification methods for multimedia conferencing service orchestration is an extremely difficult problem. In this paper, we first present the Business Process Execution Language (BPEL)-based conferencing service orchestration, and mainly focus on the service net based correctness verification approach for multimedia conferencing service orchestration, which can automatically translate the BPEL-based service orchestration into a corresponding Petri net model with the Petri Net Markup Language (PNML). We also present the BPEL service net reduction rules and multimedia conferencing service orchestration correctness verification algorithms. We perform the correctness analysis and verification using service net properties such as safeness, reachability and deadlocks, and also provide an automated support tool for the formal analysis and soundness verification of multimedia conferencing service orchestration scenarios. Finally, we give the comparison and evaluations.
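
The kind of property check performed on the translated Petri net can be sketched on a toy net: explore the reachability graph and report markings from which no transition can fire. The two-transition "call setup" net below is an invented example, not an actual BPEL translation.

```python
# Toy Petri-net reachability/deadlock check. A marking maps places to token
# counts; a transition fires when every pre-place holds enough tokens.

def enabled(marking, pre):
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# transitions: name -> (pre-places, post-places)
net = {
    "invite": ({"idle": 1}, {"ringing": 1}),
    "answer": ({"ringing": 1}, {"in_call": 1}),
}

def terminal_markings(initial):
    """Depth-first exploration of the reachability graph; collect markings
    with no enabled transition (potential deadlocks or proper end states)."""
    seen, stack, dead = set(), [initial], []
    while stack:
        m = stack.pop()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        succs = [fire(m, pre, post) for pre, post in net.values()
                 if enabled(m, pre)]
        if not succs:
            dead.append(m)
        stack.extend(succs)
    return dead

dead = terminal_markings({"idle": 1, "ringing": 0, "in_call": 0})
# the only terminal marking leaves the token in "in_call"
```

Soundness checks of the kind described amount to asserting that every terminal marking is an intended final state, not a stuck intermediate one.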

  5. Signature scheme based on bilinear pairs

    Science.gov (United States)

    Tong, Rui Y.; Geng, Yong J.

    2013-03-01

An identity-based signature scheme is proposed using bilinear pairing technology. The scheme uses the user's identity information, such as an email address, IP address, or telephone number, as the public key, which eliminates the cost of building and managing a public key infrastructure and, by using the CL-PKC framework to generate the user's private key, avoids the problem of the private key generating center forging signatures.

  6. Signature for the shape of the universe

    International Nuclear Information System (INIS)

    Gomero, G.I.; Reboucas, M.J.; Teixeira, A.F.F.

    2001-03-01

If the universe has a nontrivial shape (topology) the sky may show multiple correlated images of cosmic objects. These correlations can be couched in terms of distance correlations. We propose a statistical quantity which can be used to reveal the topological signature of any Robertson-Walker (RW) spacetime with nontrivial topology. We also show through computer-aided simulations how one can extract the topological signatures of flat, elliptic, and hyperbolic RW universes with nontrivial topology. (author)
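
The idea behind a distance-correlation signature can be sketched in one dimension: in a multiply connected space, separations between topological images of the same object repeat, so a histogram of pairwise distances develops an excess at the identification scale. The point set and period below are invented for illustration and are not the authors' proposed statistic.

```python
from itertools import combinations
from collections import Counter

def pair_separation_histogram(points, bin_width=1.0):
    """Histogram of pairwise separations, binned to bin_width."""
    counts = Counter()
    for a, b in combinations(points, 2):
        counts[round(abs(a - b) / bin_width) * bin_width] += 1
    return counts

# Three "objects" plus their topological images shifted by the period L = 10.
objects = [1.0, 2.0, 4.5]
images = objects + [x + 10.0 for x in objects]

hist = pair_separation_histogram(images)
# the period shows up as an excess of pairs at separation 10
```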

  7. Neutral signature Walker-VSI metrics

    International Nuclear Information System (INIS)

    Coley, A; McNutt, D; Musoke, N; Brooks, D; Hervik, S

    2014-01-01

    We will construct explicit examples of four-dimensional neutral signature Walker (but not necessarily degenerate Kundt) spaces for which all of the polynomial scalar curvature invariants vanish. We then investigate the properties of some particular subclasses of Ricci flat spaces. We also briefly describe some four-dimensional neutral signature Einstein spaces for which all of the polynomial scalar curvature invariants are constant. (paper)

  8. Tightly Secure Signatures From Lossy Identification Schemes

    OpenAIRE

    Abdalla , Michel; Fouque , Pierre-Alain; Lyubashevsky , Vadim; Tibouchi , Mehdi

    2015-01-01

In this paper, we present three digital signature schemes with tight security reductions in the random oracle model. Our first signature scheme is a particularly efficient version of the short exponent discrete log-based scheme of Girault et al. (J Cryptol 19(4):463–487, 2006). Our scheme has a tight reduction to the decisional short discrete logarithm problem, while still maintaining the non-tight reduction to the computational version of the problem upon which the or...

  9. Wave function of the Universe, preferred reference frame effects and metric signature transition

    International Nuclear Information System (INIS)

    Ghaffarnejad, Hossein

    2015-01-01

A gravitational model of a non-minimally coupled Brans-Dicke (BD) scalar field ϕ with a dynamical unit time-like four-vector field is used to study flat Robertson-Walker (RW) cosmology in the presence of the variable cosmological parameter V(ϕ) = Λϕ. The aim of the paper is to seek cosmological models which exhibit metric signature transition. The problem is studied in both classical and quantum cosmological approaches with large values of the BD parameter ω >> 1. The scale factor of the RW metric is obtained and describes a nonsingular inflationary universe in the Lorentzian signature sector. The Euclidean signature sector of our solution describes a re-collapsing universe and is obtained from analytic continuation of the Lorentzian sector. The dynamical vector field together with the BD scalar field are treated as a fluid with a time-dependent barotropic index. They have regular (dark) matter dominance in the Euclidean (Lorentzian) sector. We solved the Wheeler-DeWitt (WD) quantum wave equation of the cosmological system. Assuming a discrete non-zero ADM mass, we obtained solutions of the WD equation as simple harmonic quantum oscillator eigenfunctionals described by Hermite polynomials. Absolute values of these eigenfunctionals have nonzero values on the hypersurface on which the metric field has signature degeneracy. Our eigenfunctionals describe a nonzero probability for spacetime with Lorentzian (Euclidean) signature. Maximal probability corresponds to the ground state j = 0. (paper)

  10. Simulation and Experimental Validation of Electromagnetic Signatures for Monitoring of Nuclear Material Storage Containers

    International Nuclear Information System (INIS)

    Aker, Pamela M.; Bunch, Kyle J.; Jones, Anthony M.

    2013-01-01

Previous research at the Pacific Northwest National Laboratory (PNNL) has demonstrated that the low frequency electromagnetic (EM) response of a sealed metallic container interrogated with an encircling coil is a strong function of its contents and can be used to form a distinct signature which can confirm the presence of specific components without revealing hidden geometry or classified design information. Finite element simulations have recently been performed to further investigate this response for a variety of configurations composed of an encircling coil and a typical nuclear material storage container. Excellent agreement was obtained between simulated and measured impedance signatures of electrically conducting spheres placed inside an AT-400R nuclear container. Simulations were used to determine the effects of excitation frequency and the geometry of the encircling coil, nuclear container, and internal contents. The results show that it is possible to use electromagnetic models to evaluate the application of the EM signature technique to proposed versions of nuclear weapons containers which can accommodate restrictions imposed by international arms control and treaty verification legislation.

  11. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START may have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  12. Analysis of Nozzle Jet Plume Effects on Sonic Boom Signature

    Science.gov (United States)

    Bui, Trong

    2010-01-01

An axisymmetric full Navier-Stokes computational fluid dynamics (CFD) study was conducted to examine nozzle exhaust jet plume effects on the sonic boom signature of a supersonic aircraft. A simplified axisymmetric nozzle geometry, representative of the nozzle on the NASA Dryden NF-15B Lift and Nozzle Change Effects on Tail Shock (LaNCETS) research airplane, was considered. The highly underexpanded nozzle flow is found to provide significantly more reduction in the tail shock strength in the sonic boom N-wave pressure signature than perfectly expanded and overexpanded nozzle flows. A tail shock train in the sonic boom signature, similar to what was observed in the LaNCETS flight data, is observed for the highly underexpanded nozzle flow. The CFD results provide a detailed description of the nozzle flow physics involved in the LaNCETS nozzle at different nozzle expansion conditions and help in interpreting LaNCETS flight data as well as in the eventual CFD analysis of a full LaNCETS aircraft. The current study also provided important information on proper modeling of the LaNCETS aircraft nozzle. The primary objective of the current CFD research effort was to support the LaNCETS flight research data analysis effort by studying the detailed nozzle exhaust jet plume's imperfect expansion effects on the sonic boom signature of a supersonic aircraft. Figure 1 illustrates the primary flow physics present in the interaction between the exhaust jet plume shock and the sonic boom coming off of an axisymmetric body in supersonic flight. The steeper tail shock from a highly expanded jet plume reduces the dip of the sonic boom N-wave signature. A structured finite-volume compressible full Navier-Stokes CFD code was used in the current study. This approach is not limited by the simplifying assumptions inherent in previous sonic boom analysis efforts. 
Also, this study was the first known jet plume sonic boom CFD study in which the full viscous nozzle flow field was modeled, without

  13. Human Signatures for Personnel Detection

    Science.gov (United States)

    2010-09-14


  14. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.
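
The data-reduction step can be illustrated with a deliberately simple sketch: a feature is computed from the image behind the barrier, compared against an expected range, and only the yes/no verdict is released. The feature, threshold, and acceptance range below are invented; a real attribute would be chosen so that neither the feature value nor the range reveals sensitive object qualities.

```python
# Sketch of reducing an "image" (a 2D array of pixel intensities) to a
# single non-sensitive yes/no attribute behind an information barrier.

def verify_item(image, threshold=0.5, expected=(3, 6)):
    """Return True/False only; the image and feature value stay inside."""
    feature = sum(1 for row in image for px in row if px > threshold)
    lo, hi = expected
    return lo <= feature <= hi

real_item  = [[0.1, 0.9, 0.8],
              [0.2, 0.7, 0.6],
              [0.1, 0.0, 0.1]]
spoof_item = [[0.9, 0.9, 0.9],
              [0.9, 0.9, 0.9],
              [0.9, 0.9, 0.9]]

assert verify_item(real_item)       # feature count falls in expected range
assert not verify_item(spoof_item)  # spoof falls outside the range
```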

  15. A Black Hole Spectral Signature

    Science.gov (United States)

    Titarchuk, Lev; Laurent, Philippe

    2000-03-01

An accreting black hole is, by definition, characterized by the drain. Namely, the matter falls into a black hole much the same way as water disappears down a drain: matter goes in and nothing comes out. As this can only happen in a black hole, it provides a way to see ``a black hole'', a unique observational signature. The accretion proceeds almost in a free-fall manner close to the black hole horizon, where the strong gravitational field dominates the pressure forces. In this paper we present analytical calculations and Monte Carlo simulations of the specific features of X-ray spectra formed as a result of upscattering of the soft (disk) photons in the converging inflow (CI) into the black hole. The full relativistic treatment has been implemented to reproduce these spectra. We show that spectra in the soft state of black hole systems (BHS) can be described as the sum of a thermal (disk) component and the convolution of some fraction of this component with the CI upscattering spread (Green's) function. The latter boosted photon component is seen as an extended power law at energies much higher than the characteristic energy of the soft photons. We demonstrate the stability of the power spectral index over a wide range of the plasma temperature 0-10 keV and mass accretion rates (higher than 2 in Eddington units). We also demonstrate that the sharp high-energy cutoff occurs at energies of 200-400 keV, which are related to the average energy of electrons, m_ec^2, impinging upon the event horizon. The spectrum is practically identical to the standard thermal Comptonization spectrum when the CI plasma temperature is of the order of 50 keV (typical of the hard state of BHS). In this case one can see the effect of the bulk motion only at high energies, where there is an excess in the CI spectrum with respect to the pure thermal one. 
Furthermore we demonstrate that the change of spectral shapes from the soft X-ray state to the hard X-ray state is clearly to be

  16. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that transforms bytecode into static single assignment form.

  17. Supervised Multi-Authority Scheme with Blind Signature for IoT with Attribute Based Encryption

    Science.gov (United States)

    Nissenbaum, O. V.; Ponomarov, K. Y.; Zaharov, A. A.

    2018-04-01

This article proposes a three-party cryptographic scheme for verifying device attributes with a Supervisor and a Certification Authority (CA) for attribute-based encryption. Two options are suggested: using a message authentication code and using a digital signature. The first version is suitable for networks with one CA, and the second one for networks with several CAs, including dynamic systems. Augmenting this scheme with a blind signature is also proposed, to preserve the confidentiality of the device attributes from the CA. The introduction gives a definition and a brief historical overview of attribute-based encryption (ABE) and addresses the use of ABE in the Internet of Things.
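
The blinding idea, that the signer never sees the message it signs, can be illustrated with a textbook RSA blind signature using toy parameters. The numbers below are far too small for real security, and this is a generic sketch, not the scheme proposed in the article.

```python
# Textbook RSA blind signature with toy parameters (Python 3.8+ for
# modular inverse via pow(r, -1, n)).

n, e, d = 3233, 17, 2753          # toy RSA: p=61, q=53, d = e^-1 mod 3120
r = 7                             # blinding factor with gcd(r, n) = 1

m = 42                            # message (e.g. an attribute) to sign blindly
blinded = (m * pow(r, e, n)) % n          # requester blinds the message
blind_sig = pow(blinded, d, n)            # signer signs without seeing m
sig = (blind_sig * pow(r, -1, n)) % n     # requester removes the blinding

assert pow(sig, e, n) == m                # anyone can verify the signature
```

The unblinded `sig` equals the ordinary RSA signature on `m`, yet the signer only ever handled `blinded`, which is statistically independent of `m`.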

  18. Does Twitter trigger bursts in signature collections?

    Directory of Open Access Journals (Sweden)

    Rui Yamaguchi

Full Text Available INTRODUCTION: The quantification of social media impacts on societal and political events is a difficult undertaking. The Japanese Society of Oriental Medicine started a signature-collecting campaign to oppose a medical policy of the Government Revitalization Unit to exclude a traditional Japanese medicine, "Kampo," from the public insurance system. The signature count showed a series of aberrant bursts from November 26 to 29, 2009. In the same interval, the number of messages on Twitter including the keywords "Signature" and "Kampo" increased abruptly. Moreover, the number of messages on an Internet forum that discussed the policy and called for signatures showed a train of spikes. METHODS AND FINDINGS: In order to estimate the contributions of social media, we developed a statistical model with state-space modeling framework that distinguishes the contributions of multiple social media in time-series of collected public opinions. We applied the model to the time-series of signature counts of the campaign and quantified contributions of two social media, i.e., Twitter and an Internet forum, by the estimation. We found that a considerable portion (78%) of the signatures was affected by either of the social media throughout the campaign and the Twitter effect (26%) was smaller than the Forum effect (52%) in total, although Twitter probably triggered the initial two bursts of signatures. Comparisons of the estimated profiles of both effects suggested distinctions between the social media in terms of sustainable impact of messages or tweets. Twitter shows messages on various topics on a time-line; newer messages push out older ones. Twitter may diminish the impact of messages that are tweeted intermittently. CONCLUSIONS: The quantification of social media impacts is beneficial to better understand people's tendency and may promote developing strategies to engage public opinions effectively. Our proposed method is a promising tool to explore

  19. Does Twitter trigger bursts in signature collections?

    Science.gov (United States)

    Yamaguchi, Rui; Imoto, Seiya; Kami, Masahiro; Watanabe, Kenji; Miyano, Satoru; Yuji, Koichiro

    2013-01-01

The quantification of social media impacts on societal and political events is a difficult undertaking. The Japanese Society of Oriental Medicine started a signature-collecting campaign to oppose a medical policy of the Government Revitalization Unit to exclude a traditional Japanese medicine, "Kampo," from the public insurance system. The signature count showed a series of aberrant bursts from November 26 to 29, 2009. In the same interval, the number of messages on Twitter including the keywords "Signature" and "Kampo" increased abruptly. Moreover, the number of messages on an Internet forum that discussed the policy and called for signatures showed a train of spikes. In order to estimate the contributions of social media, we developed a statistical model with state-space modeling framework that distinguishes the contributions of multiple social media in time-series of collected public opinions. We applied the model to the time-series of signature counts of the campaign and quantified contributions of two social media, i.e., Twitter and an Internet forum, by the estimation. We found that a considerable portion (78%) of the signatures was affected by either of the social media throughout the campaign and the Twitter effect (26%) was smaller than the Forum effect (52%) in total, although Twitter probably triggered the initial two bursts of signatures. Comparisons of the estimated profiles of both effects suggested distinctions between the social media in terms of sustainable impact of messages or tweets. Twitter shows messages on various topics on a time-line; newer messages push out older ones. Twitter may diminish the impact of messages that are tweeted intermittently. The quantification of social media impacts is beneficial to better understand people's tendency and may promote developing strategies to engage public opinions effectively. Our proposed method is a promising tool to explore information hidden in social phenomena.
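
A greatly simplified stand-in for the decomposition can be sketched as a least-squares regression of daily signature counts on the two media's activity. The paper itself fits a state-space model; all numbers below are synthetic and chosen only to make the mechanics visible.

```python
import numpy as np

# Model daily signature counts as baseline + a*twitter + b*forum and
# estimate each medium's share of the non-baseline signatures.

twitter = np.array([0., 40., 10., 0., 5., 0., 0.])    # tweets per day
forum   = np.array([0., 10., 50., 60., 20., 5., 0.])  # forum posts per day
signatures = 100.0 + 2.0 * twitter + 4.0 * forum      # synthetic counts

X = np.column_stack([np.ones_like(twitter), twitter, forum])
coef, *_ = np.linalg.lstsq(X, signatures, rcond=None)  # [baseline, a, b]

effect_total = (signatures - coef[0]).sum()
twitter_share = coef[1] * twitter.sum() / effect_total
forum_share   = coef[2] * forum.sum() / effect_total
```

With noise-free synthetic data the regression recovers the generating coefficients exactly; a state-space formulation additionally lets the media effects vary over time.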

  20. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime
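
The template method described for the low-resolution technique can be sketched as follows: region sums from the first measurement form the stored template, and revisit measurements are compared against it within a tolerance. The spectra, region boundaries, and tolerance below are invented for illustration.

```python
# Sketch of a low-resolution spectral template: sum counts in a few broad
# channel regions at first verification, then compare revisit region sums
# against the stored template within a relative tolerance.

REGIONS = [(0, 4), (4, 8), (8, 12)]     # channel ranges forming the template

def template(spectrum):
    return [sum(spectrum[a:b]) for a, b in REGIONS]

def matches(spectrum, stored, rel_tol=0.10):
    return all(abs(s - t) <= rel_tol * t
               for s, t in zip(template(spectrum), stored))

initial = [5, 9, 14, 8, 30, 42, 38, 25, 12, 7, 4, 2]     # first measurement
stored = template(initial)                               # retained template

revisit = [6, 9, 13, 8, 29, 43, 37, 26, 11, 7, 4, 2]     # same item later
swapped = [5, 9, 14, 8, 10, 12, 11, 9, 12, 7, 4, 2]      # different item

assert matches(revisit, stored)      # continuity of knowledge maintained
assert not matches(swapped, stored)  # substitution is flagged
```

Storing only a few broad region sums, rather than the full spectrum, is what keeps the retained template non-sensitive.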