WorldWideScience

Sample records for rewrite based verification

  1. Specification and Verification of Web Applications in Rewriting Logic

    Science.gov (United States)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.
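
    As a rough illustration of the kind of model this abstract describes (not the authors' Maude formalization), the sketch below encodes browser states as terms, navigation actions as rewrite rules, and checks a simple reachability property by exhaustive search; the site map, page names, and the `reachable` helper are all invented for the example, and the search is only a crude stand-in for LTLR model checking.

```python
from collections import deque

# A browser state is a pair (history, current_page); the rewrite rules map a
# state to the set of successor states reachable in one step.
def rules(state):
    history, page = state
    succs = set()
    # rule: follow a link (hypothetical site map)
    site_map = {"login": ("account",), "account": ("logout",), "logout": ("login",)}
    for target in site_map.get(page, ()):
        succs.add((history + (page,), target))
    # rule: backward navigation pops the history stack
    if history:
        succs.add((history[:-1], history[-1]))
    # rule: page refresh leaves the state unchanged
    succs.add((history, page))
    return succs

def reachable(init, goal, max_states=10_000):
    """Exhaustively explore the rewrite relation and report whether a given
    page can ever become the current page (a crude reachability check)."""
    seen, frontier = {init}, deque([init])
    while frontier and len(seen) < max_states:
        state = frontier.popleft()
        if state[1] == goal:
            return True
        for s in rules(state):
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return False

print(reachable(((), "login"), goal="account"))  # True: account is reachable
```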

  2. Monotonic Set-Extended Prefix Rewriting and Verification of Recursive Ping-Pong Protocols

    DEFF Research Database (Denmark)

    Delzanno, Giorgio; Esparza, Javier; Srba, Jiri

    2006-01-01

    of messages) some verification problems become decidable. In particular we give an algorithm to decide control state reachability, a problem related to security properties like secrecy and authenticity. The proof is via a reduction to a new prefix rewriting model called Monotonic Set-extended Prefix rewriting...

  3. A dithienylethene-based rewritable hydrogelator.

    Science.gov (United States)

    van Herpt, Jochem T; Stuart, Marc C A; Browne, Wesley R; Feringa, Ben L

    2014-03-10

    Dithienylethene photochromic switching units have been incorporated into a hydrogelating system based on a tripeptide motif. The resulting hybrid system provided both a photochromic response and the ability to gelate water under acidic and neutral conditions. Fluorescence spectroscopy shows that the dithienylethene units are in sufficient proximity to each other to stack in gel fibers, with the tripeptide unit determining solubility. TEM measurements provided insight into the microscopic structure of the fibers formed.

  4. A Rewriting-Logic-Based Technique for Modeling Thermal Systems

    Directory of Open Access Journals (Sweden)

    Daniela Lepri

    2010-09-01

    Full Text Available This paper presents a rewriting-logic-based modeling and analysis technique for physical systems, with focus on thermal systems. The contributions of this paper can be summarized as follows: (i) providing a framework for modeling and executing physical systems, where both the physical components and their physical interactions are treated as first-class citizens; (ii) showing how heat transfer problems in thermal systems can be modeled in Real-Time Maude; (iii) giving the implementation in Real-Time Maude of a basic numerical technique for executing continuous behaviors in object-oriented hybrid systems; and (iv) illustrating these techniques with a set of incremental case studies using realistic physical parameters, with examples of simulation and model checking analyses.
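
    The abstract mentions a basic numerical technique for executing continuous behaviours. The fragment below is a minimal sketch of one such technique -- explicit Euler integration of heat flow between connected thermal objects -- written in Python rather than Real-Time Maude, with all parameter values invented for illustration.

```python
def euler_step(temps, conductances, capacities, dt):
    """One explicit-Euler step for a network of thermal objects.
    temps[i]           : current temperature of object i
    conductances[i][j] : thermal conductance of the connection i--j (0 if none)
    capacities[i]      : heat capacity of object i
    """
    n = len(temps)
    new = list(temps)
    for i in range(n):
        flow = sum(conductances[i][j] * (temps[j] - temps[i]) for j in range(n))
        new[i] = temps[i] + dt * flow / capacities[i]
    return new

# Two objects: a hot cup (360 K) cooling towards room air (295 K).
temps = [360.0, 295.0]
G = [[0.0, 0.5], [0.5, 0.0]]      # W/K, illustrative
C = [200.0, 10_000.0]             # J/K, illustrative
for _ in range(600):               # simulate 600 s with dt = 1 s
    temps = euler_step(temps, G, C, dt=1.0)
print(temps)  # cup temperature has moved towards the (slightly warmed) air
```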

  5. A Modular Rewriting Semantics for CML

    DEFF Research Database (Denmark)

    Chalub, Fabricio; Braga, Christiano de Oliveira

    2004-01-01

    This paper presents a modular rewriting semantics (MRS) specification for Reppy's Concurrent ML (CML), based on Peter Mosses' modular structural operational semantics specification for CML. A modular rewriting semantics specification for a programming language is a rewrite theory in rewriting logic...... of rewriting logic, and to verify CML programs using Maude's built-in LTL model checker. It is assumed that the reader is familiar with basic concepts of structural operational semantics and algebraic specifications....

  6. Modes of convergence for term graph rewriting

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2011-01-01

    Term graph rewriting provides a simple mechanism to finitely represent restricted forms of infinitary term rewriting. The correspondence between infinitary term rewriting and term graph rewriting has been studied to some extent. However, this endeavour is impaired by the lack of an appropriate...... counterpart of infinitary rewriting on the side of term graphs. We aim to fill this gap by devising two modes of convergence based on a partial order resp. a metric on term graphs. The thus obtained structures generalise corresponding modes of convergence that are usually studied in infinitary term rewriting....... We argue that this yields a common framework in which both term rewriting and term graph rewriting can be studied. In order to substantiate our claim, we compare convergence on term graphs and on terms. In particular, we show that the resulting infinitary calculi of term graph rewriting exhibit...

  7. Dist-Orc: A Rewriting-based Distributed Implementation of Orc with Formal Analysis

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Orc is a theory of orchestration of services that allows structured programming of distributed and timed computations. Several formal semantics have been proposed for Orc, including a rewriting logic semantics developed by the authors. Orc also has a fully fledged implementation in Java with functional programming features. However, as with descriptions of most distributed languages, there exists a fairly substantial gap between Orc's formal semantics and its implementation, in that: (i programs in Orc are not easily deployable in a distributed implementation just by using Orc's formal semantics, and (ii they are not readily formally analyzable at the level of a distributed Orc implementation. In this work, we overcome problems (i and (ii for Orc. Specifically, we describe an implementation technique based on rewriting logic and Maude that narrows this gap considerably. The enabling feature of this technique is Maude's support for external objects through TCP sockets. We describe how sockets are used to implement Orc site calls and returns, and to provide real-time timing information to Orc expressions and sites. We then show how Orc programs in the resulting distributed implementation can be formally analyzed at a reasonable level of abstraction by defining an abstract model of time and the socket communication infrastructure, and discuss the assumptions under which the analysis can be deemed correct. Finally, the distributed implementation and the formal analysis methodology are illustrated with a case study.

  8. Rewriting Modulo SMT

    Science.gov (United States)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.

    2013-01-01

    Combining symbolic techniques such as: (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism due to the system, and external non-determinism due to the environment. They are not amenable to finite-state model checking analysis because they typically are infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed automata methods.
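
    As a toy illustration of the combination described here (not the authors' Maude-and-reflection machinery), the sketch below pairs a control location with a symbolic constraint and lets a guarded rewrite rule fire only when an SMT solver finds the guard satisfiable; the Z3 Python bindings are assumed as the SMT back end, and the rule, guard, and state names are hypothetical.

```python
# A toy "rewriting modulo SMT" step: a symbolic state pairs a control location
# with a constraint over the environment, and a guarded rewrite rule applies
# only if constraint AND guard is satisfiable.
from z3 import Int, Solver, And, sat

x = Int("x")   # non-deterministic environment input, kept symbolic

def apply_rule(state, guard, new_loc):
    """Return the successor symbolic state if the guarded rule is enabled
    for some value of the environment, otherwise None."""
    loc, constraint = state
    solver = Solver()
    solver.add(And(constraint, guard))
    if solver.check() == sat:
        return (new_loc, And(constraint, guard))
    return None

init = ("idle", x >= 0)
print(apply_rule(init, guard=(x > 100), new_loc="alarm"))   # reachable: ('alarm', ...)
print(apply_rule(init, guard=(x < -5),  new_loc="error"))   # None: guard unsatisfiable
```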

  9. Modes of convergence for term graph rewriting

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2012-01-01

    Term graph rewriting provides a simple mechanism to finitely represent restricted forms of infinitary term rewriting. The correspondence between infinitary term rewriting and term graph rewriting has been studied to some extent. However, this endeavour is impaired by the lack of an appropriate...... counterpart of infinitary rewriting on the side of term graphs. We aim to fill this gap by devising two modes of convergence based on a partial order respectively a metric on term graphs. The thus obtained structures generalise corresponding modes of convergence that are usually studied in infinitary term...... rewriting. We argue that this yields a common framework in which both term rewriting and term graph rewriting can be studied. In order to substantiate our claim, we compare convergence on term graphs and on terms. In particular, we show that the modes of convergence on term graphs are conservative...

  10. Optical Rewritable Electronic Paper

    Science.gov (United States)

    Muravsky, Alexander; Murauski, Anatoli; Chigrinov, Vladimir; Kwok, Hoi-Sing

    We developed a new principle of electronic paper: a one-sided (for 2D images) or double-sided (for stereoscopic 3D images) light-printable, rewritable medium with polarization-dependent gray scale. It consists of one or two liquid crystal displays based on Optical Rewritable (ORW) technology, which is a development of rotational azo-dye photoalignment. Each ORW display uses bare plastic or polarizers as substrates. No conductor is required, as the image is formed by rewritable states of azimuthal direction, which results in a 2D pattern of the liquid crystal twist angle. The continuous grey-scale image maintains proper performance even when the device is bent. The simple construction provides durability and low cost, and the thin substrates minimize parallax for the 3D image. A fluorescent dye dopant in the liquid crystal partly absorbs light in the blue and re-emits it in the green spectral range, improving photopic reflection and enhancing the color of the ORW e-paper.

  11. Infinitary Rewriting - Theory and Applications

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2009-01-01

    Infinitary rewriting makes it possible to apply rewriting in order to obtain a formal model for such infinite derivations. The goal of this thesis is to comprehensively survey the field of infinitary term rewriting, to point out its shortcomings, and to try to overcome some of these shortcomings. The most...... that are used to formalise infinite reduction sequences: the well-established metric approach as well as an alternative approach using partial orders. Both methods together with the consequent infinitary versions of confluence and termination properties are analysed on an abstract level. Based on this, we argue....

  12. Multi-Context Rewriting Induction with Termination Checkers

    Science.gov (United States)

    Sato, Haruhiko; Kurihara, Masahito

    Inductive theorem proving plays an important role in the field of formal verification of systems. Rewriting induction (RI) is a method for inductive theorem proving proposed by Reddy. In order to obtain successful proofs, it is very important to choose appropriate contexts (such as the direction in which each equation should be oriented) when applying RI inference rules. If the choice is not appropriate, the procedure may diverge, or the user may have to come up with several lemmas to prove together with the main theorem. Therefore we have good reason to consider parallel execution of several instances of the rewriting induction procedure, each in charge of a distinct single context, in search of a successful proof. In this paper, we propose a new procedure, called multi-context rewriting induction, which efficiently simulates parallel execution of rewriting induction procedures in a single process, based on the idea of the multi-completion procedure. Through experiments with a well-known problem set, we discuss the effectiveness of the proposed procedure when searching along various contexts for a successful inductive proof.

  13. A correlation-based fingerprint verification system

    NARCIS (Netherlands)

    Bazen, A.M.; Gerez, Sabih H.; Veelenturf, L.P.J.; van der Zwaag, B.J.; Verwaaijen, G.T.B.

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates
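
    A minimal sketch of the core idea -- zero-mean normalized cross-correlation between a stored gray-scale template and patches of a probe image -- is given below; the array sizes and acceptance threshold are invented, and this is not the authors' template-selection procedure.

```python
import numpy as np

def normalized_correlation(template, patch):
    """Zero-mean normalized cross-correlation between two equally sized
    gray-scale arrays; 1.0 means a perfect match."""
    t = template - template.mean()
    p = patch - patch.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    return float((t * p).sum() / denom) if denom else 0.0

def verify(template, probe, threshold=0.6):
    """Slide the template over the probe and accept if the best correlation
    exceeds a (purely illustrative) threshold."""
    th, tw = template.shape
    best = -1.0
    for i in range(probe.shape[0] - th + 1):
        for j in range(probe.shape[1] - tw + 1):
            best = max(best, normalized_correlation(template, probe[i:i+th, j:j+tw]))
    return best >= threshold, best

rng = np.random.default_rng(0)
probe = rng.random((64, 64))
template = probe[10:26, 20:36].copy()      # template cut from the probe itself
print(verify(template, probe))              # (True, ~1.0)
```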

  14. Verification-based Software-fault Detection

    OpenAIRE

    Gladisch, Christoph David

    2011-01-01

    Software is used in many safety- and security-critical systems. Software development is, however, an error-prone task. In this dissertation new techniques for the detection of software faults (or software "bugs") are described which are based on a formal deductive verification technology. The described techniques take advantage of information obtained during verification and combine verification technology with deductive fault detection and test generation in a very unified way.

  15. Naphthalene based AIE active stimuli-responsive material as rewritable media for temporary communication

    Science.gov (United States)

    Pannipara, Mehboobali; Al-Sehemi, Abdullah G.; Kalam, Abul; Asiri, Abdullah M.

    2017-10-01

    Organic molecules having extended π-conjugated moieties are useful for creating 'dynamic' functional materials by modulating the photophysical properties and molecular packing through non-covalent interactions. Herein, we report the photoluminescence properties of a luminogen, NBA, exhibiting aggregation-induced emission (AIE) characteristics, synthesized by a Knoevenagel condensation reaction between 2-hydroxy naphthaldehyde and malononitrile. NBA emits strongly upon aggregation and in the solid state with a large Stokes shift, whereas it is non-emissive in pure solvents. The aggregation-induced emission behavior of the compound was studied in DMSO (good solvent)-water (poor solvent) mixtures with the water fraction (fw) ranging from 0% to 98%. The AIE property of the luminogen was further exploited for fabricating rewritable fluorescent paper substrates that find applications in security printing and data storage, where the written images or letters stored on the filter paper are invisible under normal light.

  16. Invention, Rewriting, Usurpation

    DEFF Research Database (Denmark)

    Jacobsen, Anders-Christian; Ulrich, Jörg; Brakke, David

    2011-01-01

    Conference volume from the conference Invention, Rewriting, Usurpation. Discursive Fights over Religious Traditions in Antiquity, Aarhus / Ebeltoft, May 30 - June 4, 2010.

  17. Privacy Preserving Iris Based Biometric Identity Verification

    Directory of Open Access Journals (Sweden)

    Przemyslaw Strzelczyk

    2011-08-01

    Full Text Available Iris biometrics is considered one of the most accurate and robust methods of identity verification. Individually unique iris features can be presented in a compact binary form easily compared with a reference template to confirm identity. However, when templates or features are disclosed, iris biometrics is no longer suitable for verification. Therefore, there is a need to perform iris feature matching without revealing the features themselves or the reference template. The paper proposes an extension of the standard iris-based verification protocol that introduces a feature and template locking mechanism, which guarantees that no sensitive information is exposed.

  18. Term Graph Rewriting and Parallel Term Rewriting

    Directory of Open Access Journals (Sweden)

    Andrea Corradini

    2011-02-01

    Full Text Available The relationship between Term Graph Rewriting and Term Rewriting is well understood: a single term graph reduction may correspond to several term reductions, due to sharing. It is also known that if term graphs are allowed to contain cycles, then one term graph reduction may correspond to infinitely many term reductions. We stress that this fact can be interpreted in two ways. According to the "sequential interpretation", a term graph reduction corresponds to an infinite sequence of term reductions, as formalized by Kennaway et al. using strongly converging derivations over the complete metric space of infinite terms. Instead, according to the "parallel interpretation", a term graph reduction corresponds to the parallel reduction of an infinite set of redexes in a rational term. We formalize the latter notion by exploiting the complete partial order of infinite and possibly partial terms, and we stress that this interpretation allows one to explain the result of reducing circular redexes in several approaches to term graph rewriting.

  19. Pattern graph rewrite systems

    Directory of Open Access Journals (Sweden)

    Aleks Kissinger

    2014-03-01

    Full Text Available String diagrams are a powerful tool for reasoning about physical processes, logic circuits, tensor networks, and many other compositional structures. Dixon, Duncan and Kissinger introduced string graphs, which are combinatoric representations of string diagrams, amenable to automated reasoning about diagrammatic theories via graph rewrite systems. In this extended abstract, we show how the power of such rewrite systems can be greatly extended by introducing pattern graphs, which provide a means of expressing infinite families of rewrite rules where certain marked subgraphs, called !-boxes ("bang boxes"), on both sides of a rule can be copied any number of times or removed. After reviewing the string graph formalism, we show how string graphs can be extended to pattern graphs and how pattern graphs and pattern rewrite rules can be instantiated to concrete string graphs and rewrite rules. We then provide examples demonstrating the expressive power of pattern graphs and how they can be applied to study interacting algebraic structures that are central to categorical quantum mechanics.

  20. Term rewriting with traversal functions

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); M.G.J. van den Brand (Mark); P. Klint (Paul)

    2003-01-01

    Term rewriting is an appealing technique for performing program analysis and program transformation. Tree (term) traversal is frequently used but is not supported by standard term rewriting. We extend many-sorted, first-order term rewriting with traversal functions that automate tree

  1. Verification of automata-based programs (supervised by Anatoly Shalyto)

    OpenAIRE

    Evgeny, Kurbatsky

    2008-01-01

    This paper describes a verification method for automata-based programs [1] based on symbolic model checking algorithms [2]. The author attempts to develop a verification method that can automate the process of verification and be useful for people unacquainted with model checking algorithms or tools.

  2. Triangulation in rewriting

    NARCIS (Netherlands)

    Oostrom, V. van; Zantema, Hans

    2012-01-01

    We introduce a process, dubbed triangulation, turning any rewrite relation into a confluent one. It is more direct than usual completion, in the sense that objects connected by a peak are directly oriented rather than their normal forms. We investigate conditions under which this process preserves

  3. Equational term graph rewriting

    NARCIS (Netherlands)

    Z.M. Ariola (Zena); J.W. Klop (Jan Willem)

    1995-01-01

    We present an equational framework for term graph rewriting with cycles. The usual notion of homomorphism is phrased in terms of the notion of bisimulation, which is well-known in process algebra and concurrency theory. Specifically, a homomorphism is a functional bisimulation. We prove

  4. Consent Based Verification System (CBSV)

    Data.gov (United States)

    Social Security Administration — CBSV is a fee-based service offered by SSA's Business Services Online (BSO). It is used by private companies to verify the SSNs of their customers and clients that...

  5. Partial order infinitary term rewriting

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    with the partial order model restricted to total terms. Hence, partial order convergence constitutes a conservative extension of metric convergence, which additionally offers a fine-grained distinction between different levels of divergence. In the second part, we focus our investigation on strong convergence...... of orthogonal systems. The main result is that the gap between the metric model and the partial order model can be bridged by extending the term rewriting system by additional rules. These extensions are the well-known Böhm extensions. Based on this result, we are able to establish that -- contrary...

  6. Trajectory Based Behavior Analysis for User Verification

    Science.gov (United States)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distribution in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for catching the pairwise relationship. Based on the pairwise relationship, we plug in any effective classification or clustering methods for the detection of unauthorized access. The method can also be applied for the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
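
    A much-simplified sketch of the scoring idea -- a Gaussian model over trajectory steps, standing in for the Markov chain with Gaussian transitions described above -- is given below; the trajectories, the isotropic-Gaussian simplification, and the per-step log-likelihood score are all illustrative assumptions.

```python
import math

def fit_gaussian_steps(trajectory):
    """Fit an isotropic Gaussian to the step vectors of a 2-D trajectory."""
    steps = [(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:])]
    n = len(steps)
    mx = sum(dx for dx, _ in steps) / n
    my = sum(dy for _, dy in steps) / n
    var = sum((dx - mx) ** 2 + (dy - my) ** 2 for dx, dy in steps) / (2 * n)
    return (mx, my, var + 1e-2)   # variance floor avoids degenerate fits

def log_likelihood(model, trajectory):
    """Average per-step log-likelihood of a trajectory under the model."""
    mx, my, var = model
    ll = 0.0
    for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:]):
        dx, dy = x2 - x1, y2 - y1
        ll += -((dx - mx) ** 2 + (dy - my) ** 2) / (2 * var) - math.log(2 * math.pi * var)
    return ll / (len(trajectory) - 1)

owner    = [(t, 0.5 * t) for t in range(50)]           # smooth, consistent strokes
intruder = [(t, (t * 37) % 11) for t in range(50)]     # jerky, erratic strokes
model = fit_gaussian_steps(owner)
print(log_likelihood(model, owner), log_likelihood(model, intruder))
# the owner's own trajectory scores far higher than the intruder's
```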

  7. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  8. A New Look at Generalized Rewriting in Type Theory

    Directory of Open Access Journals (Sweden)

    Matthieu Sozeau

    2009-01-01

    Full Text Available Rewriting is an essential tool for computer-based reasoning, both automated and assisted. This is because rewriting is a general notion that permits modeling a wide range of problems and provides a means to effectively solve them. In a proof assistant, rewriting can be used to replace terms in arbitrary contexts, generalizing the usual equational reasoning to reasoning modulo arbitrary relations. This can be done provided the necessary proofs that functions appearing in goals are congruent with respect to specific relations. We present a new implementation of generalized rewriting in the Coq proof assistant, making essential use of the expressive power of dependent types and the recently implemented type class mechanism. The new rewrite tactic improves on and generalizes previous versions by natively supporting higher-order functions, polymorphism and subrelations. The type class system inspired by Haskell provides a perfect interface between the user and the tactic, making it easily extensible.

  9. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)]

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the verification of their knowledge bases takes an important position. The conventional Petri net approach that has been studied recently in order to verify knowledge bases is found to be inadequate for verifying the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)
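
    As an illustrative stand-in for the enhanced colored Petri net analysis (using a plain place/transition net rather than a colored one), the sketch below turns two hypothetical knowledge-base rules into transitions and enumerates reachable markings; unreachable conclusion places would reveal rules that can never fire.

```python
from collections import deque

# Rules of a knowledge base become transitions, facts become tokens in places.
transitions = {
    "r1": ({"high_temp": 1, "high_pressure": 1}, {"alarm_A": 1}),
    "r2": ({"alarm_A": 1}, {"shutdown": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable_markings(initial):
    """Exhaustive reachability analysis of the (finite) marking graph."""
    seen = {frozenset(initial.items())}
    queue, out = deque([initial]), [initial]
    while queue:
        m = queue.popleft()
        for pre, post in transitions.values():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                key = frozenset(m2.items())
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
                    out.append(m2)
    return out

for m in reachable_markings({"high_temp": 1, "high_pressure": 1}):
    print(m)
```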

  10. Mapping Modular SOS to Rewriting Logic

    DEFF Research Database (Denmark)

    Braga, Christiano de Oliveira; Haeusler, Edward Hermann; Meseguer, José

    2003-01-01

    Modular SOS (MSOS) is a framework created to improve the modularity of structural operational semantics specifications, a formalism frequently used in the fields of programming languages semantics and process algebras. With the objective of defining formal tools to support the execution and verification of MSOS specifications, we have defined a mapping, named , from MSOS to rewriting logic (RWL), a logic which has been proposed as a logical and semantic framework. We have proven the correctness of and implemented it as a prototype, the MSOS-SL Interpreter, in the Maude system, a high...

  11. Lazy rewriting on eager machinery

    NARCIS (Netherlands)

    J.F.T. Kamperman; H.R. Walters (Pum)

    1994-01-01

    textabstractWe define Lazy Term Rewriting Systems and show that they can be realized by local adaptations of an eager implementation of conventional term rewriting systems. The overhead of lazy evaluation is only incurred when lazy evaluation is actually performed. Our method is modelled by a

  12. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered as an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  13. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures for reference and 5 signatures each from the original user, simple imposters and trained imposters as test signatures. The final system was tested with 50 participants and 3 references each. The test showed that system accuracy without imposters is 90.44897959% at threshold 44, with a rejection error (FNMR) of 5.2% and an acceptance error (FMR) of 4.35102%; with imposters, system accuracy is 80.1361% at threshold 27, with a rejection error (FNMR) of 15.6% and an acceptance error (average FMR) of 4.263946%, with details as follows: acceptance error of 0.391837%, acceptance error for simple imposters of 3.2%, and acceptance error for trained imposters of 9.2%.
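
    A minimal sketch of the matching stage -- dynamic time warping between 1-D feature sequences, with acceptance against the best-matching reference -- is shown below; the sequences and the threshold value are invented and unrelated to the thresholds reported above.

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between two
    1-D feature sequences (e.g. pen-position samples)."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def verify(reference_set, test, threshold=10.0):
    """Accept if the best DTW distance to any reference signature is below
    a (purely illustrative) threshold."""
    return min(dtw_distance(r, test) for r in reference_set) <= threshold

refs = [[0, 1, 3, 6, 6, 4, 2, 0], [0, 1, 4, 6, 5, 4, 2, 0]]
genuine = [0, 1, 3, 6, 5, 4, 2, 1]
forgery = [0, 5, 0, 5, 0, 5, 0, 5]
print(verify(refs, genuine), verify(refs, forgery))   # True False
```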

  14. Deciding Termination for Ancestor Match- Bounded String Rewriting Systems

    Science.gov (United States)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2005-01-01

    Termination of a string rewriting system can be characterized by termination on suitable recursively defined languages. This kind of termination criterion has been criticized for its lack of automation. In an earlier paper we have shown how to construct an automated termination criterion if the recursion is aligned with the rewrite relation. We have demonstrated the technique with Dershowitz's forward closure criterion. In this paper we show that a different approach is suitable when the recursion is aligned with the inverse of the rewrite relation. We apply this idea to Kurth's ancestor graphs and obtain ancestor match-bounded string rewriting systems. Termination is shown to be decidable for this class. The resulting method improves upon those based on match-boundedness or inverse match-boundedness.
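
    The decision procedure itself is beyond a short example, but the sketch below illustrates the underlying objects -- string rewriting systems and their derivations -- with a crude bounded check that follows one derivation and reports an obvious loop; the rule sets and the step bound are invented.

```python
# A string rewriting system is a list of (lhs, rhs) rules; a rewrite step
# replaces one occurrence of an lhs by the corresponding rhs.
rules_terminating = [("ab", "b")]                    # always shrinks the string
rules_looping     = [("ab", "ba"), ("ba", "ab")]     # obviously non-terminating

def successors(word, rules):
    """Yield all words reachable from `word` in one rewrite step."""
    for lhs, rhs in rules:
        start = 0
        while (i := word.find(lhs, start)) != -1:
            yield word[:i] + rhs + word[i + len(lhs):]
            start = i + 1

def bounded_termination_check(word, rules, max_steps=1000):
    """Follow one derivation; report 'terminated' if a normal form is hit,
    'possibly looping' if a word recurs or the bound runs out.  (A crude
    stand-in for the decision procedure of the abstract.)"""
    seen = {word}
    for _ in range(max_steps):
        nxt = next(successors(word, rules), None)
        if nxt is None:
            return "terminated"
        if nxt in seen:
            return "possibly looping"
        seen.add(nxt)
        word = nxt
    return "possibly looping"

print(bounded_termination_check("aab", rules_terminating))  # terminated
print(bounded_termination_check("ab",  rules_looping))      # possibly looping
```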

  15. Termination of dependently typed rewrite rules

    OpenAIRE

    Jouannaud, Jean-Pierre; Li, Jian-Qi

    2015-01-01

    Our interest is in automated termination proofs of higher-order rewrite rules in the presence of dependent types modulo a theory T on base types. We first describe an original transformation to a type discipline without type dependencies which preserves non-termination. Since the user must reason on expressions of the transformed language, we then introduce an extension of the computability path ordering CPO for comparing dependently typed expressions, named DCPO. Using the...

  16. NES++: number system for encryption based privacy preserving speaker verification

    Science.gov (United States)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (i.e., smartphones, Google Glass), privacy-preserving speaker verification receives much attention nowadays. Privacy-preserving speaker verification can be achieved in many different ways, such as fuzzy vaults and encryption. Encryption-based solutions are promising as cryptography rests on solid mathematical foundations and the security properties can be easily analyzed in a well-established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy-preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy-preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext so that the computation overhead is greatly reduced. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system and the packing technique. Our findings show that the proposed solution fills the gap between speaker verification and the encryption scheme very well, and the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption-based privacy-preserving speaker verification, and the privacy protection and accuracy rate are not affected.
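
    A sketch of the two ingredients reported here -- a fixed-point encoding that maps floats into a finite range, and packing several encoded numbers into one large-integer plaintext so a single asymmetric encryption covers many features -- is given below; the scale, slot width, and feature values are illustrative, and no actual Paillier encryption is performed.

```python
SCALE = 10**6          # fixed-point scale for floats (illustrative)
SLOT_BITS = 64         # bit width reserved for each packed number (illustrative)

def encode(x):
    """Map a float to a non-negative integer usable inside a finite
    algebraic structure (plain fixed-point, offset to avoid negatives)."""
    return int(round(x * SCALE)) + (1 << (SLOT_BITS - 1))

def decode(n):
    return (n - (1 << (SLOT_BITS - 1))) / SCALE

def pack(values):
    """Pack several encoded numbers into a single large integer plaintext,
    so one (expensive) asymmetric encryption covers many features."""
    packed = 0
    for v in values:
        packed = (packed << SLOT_BITS) | encode(v)
    return packed

def unpack(packed, count):
    mask = (1 << SLOT_BITS) - 1
    out = []
    for _ in range(count):
        out.append(decode(packed & mask))
        packed >>= SLOT_BITS
    return list(reversed(out))

features = [0.125, -3.5, 42.0]                 # e.g. speaker-model parameters
print(unpack(pack(features), len(features)))   # recovers the original floats
```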

  17. Android-Based Verification System for Banknotes

    Directory of Open Access Journals (Sweden)

    Ubaid Ur Rahman

    2017-11-01

    Full Text Available With the advancement in imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes losses to banks, traders, and individuals involved in financial transactions. Hence, efficient and reliable techniques for the detection of counterfeit banknotes are inevitably needed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing related tasks on these phones. In addition to this, smartphone users have increased greatly and numbers continue to increase. This is a great motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by the X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes are very different in terms of the printing process, the ingredients used in preparation of banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.
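
    A minimal sketch of the final classification stage (a support vector machine over per-note feature vectors) is shown below, using scikit-learn and synthetic stand-ins for the statistical and surface-roughness features; the feature columns, class means, and test note are invented.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# columns: [mean intensity, intensity std, surface-roughness score] (hypothetical)
genuine     = rng.normal(loc=[0.55, 0.10, 0.80], scale=0.03, size=(40, 3))
counterfeit = rng.normal(loc=[0.60, 0.18, 0.45], scale=0.03, size=(40, 3))

X = np.vstack([genuine, counterfeit])
y = np.array([1] * 40 + [0] * 40)          # 1 = genuine, 0 = counterfeit

# Train the SVM classifier on the labelled feature vectors.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)

test_note = np.array([[0.56, 0.11, 0.78]])  # feature vector of a note under test
print("genuine" if clf.predict(test_note)[0] == 1 else "counterfeit")
```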

  18. Rewriting Logic Semantics of a Plan Execution Language

    Science.gov (United States)

    Dowek, Gilles; Munoz, Cesar A.; Rocha, Camilo

    2009-01-01

    The Plan Execution Interchange Language (PLEXIL) is a synchronous language developed by NASA to support autonomous spacecraft operations. In this paper, we propose a rewriting logic semantics of PLEXIL in Maude, a high-performance logical engine. The rewriting logic semantics is by itself a formal interpreter of the language and can be used as a semantic benchmark for the implementation of PLEXIL executives. The implementation in Maude has the additional benefit of making available to PLEXIL designers and developers all the formal analysis and verification tools provided by Maude. The formalization of the PLEXIL semantics in rewriting logic poses an interesting challenge due to the synchronous nature of the language and the prioritized rules defining its semantics. To overcome this difficulty, we propose a general procedure for simulating synchronous set relations in rewriting logic that is sound and, for deterministic relations, complete. We also report on the finding of two issues at the design level of the original PLEXIL semantics that were identified with the help of the executable specification in Maude.

  19. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used for strengthening program transformation techniques. This paper considers the question of how the finite countermodel method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms trying to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of the problems.

  20. Verification: an enabler for model based data preparation

    Science.gov (United States)

    Schiavone, Patrick; Chagoya, Alexandre; Martin, Luc; Annezo, Vincent; Blanchemain, Alexis

    2013-06-01

    With technology node progress, the requirements on mask data preparation become more and more stringent. Standard long-range dose modulation starts showing difficulties in meeting the specifications in terms of correction accuracy, and so-called Model Based Data Preparation (MBDP) is gaining more and more interest in order to maintain the required pattern fidelity. This type of correction, which often includes a geometry change on top of the dose modulation, cannot be checked conventionally using standard Mask Rule Check software tools. A new methodology and software tool to perform verification after model-based e-beam proximity correction is presented to overcome this issue. A basic functionality is to do verification at the shot level, taking into account the possible movement of the edges as well as the dose assignment. A second brick allows going one step further: a model-based verification is performed over all the edges of the design, checking by simulation the deviation of the printed pattern from the target after correction. The verification tool is capable of identifying hot spots as well as deviations from the targeted design occurring with a very low frequency, making them almost impossible to spot without the systematic use of a verification tool. The verification can be inserted either in the maskshop flow or at the semiconductor manufacturer, as a help for improving the OPC flow or as a complementary check to be run with the OPC check.

  1. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    Full Text Available This work presents an efficient solution using a computer algebra system to perform linear temporal property verification for synchronous digital systems. The method is essentially based on both Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations by using Groebner bases. The computational experience reported in this work shows that the algebraic approach is a quite competitive checking method and a useful supplement to existing verification methods based on simulation.
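
    A toy instance of the Groebner-basis idea -- encoding a one-bit circuit and an assertion as polynomials and checking ideal membership by reduction against a Groebner basis -- is sketched below using SymPy; the circuit, the assertion, and the use of SymPy (rather than the authors' computer algebra machinery) are assumptions.

```python
from sympy import symbols, groebner, reduced

a, b, c = symbols("a b c")

# Circuit: c = a AND b, with field equations x*(x-1) = 0 forcing Boolean values.
circuit = [c - a * b, a * (a - 1), b * (b - 1), c * (c - 1)]

# Groebner basis of the ideal generated by the circuit polynomials.
G = groebner(circuit, a, b, c, order="lex")

# Assertion "c implies a", i.e. c*(1 - a) should vanish on every circuit model.
assertion = c * (1 - a)
_, remainder = reduced(assertion, list(G), a, b, c, order="lex")
print("assertion holds" if remainder == 0 else "assertion may fail")
```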

  2. Sensor-fusion-based biometric identity verification

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States)]; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.]

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  3. Optical secure image verification system based on ghost imaging

    Science.gov (United States)

    Wu, Jingjing; Haobogedewude, Buyinggaridi; Liu, Zhengjun; Liu, Shutian

    2017-09-01

    Ghost imaging can perform Fourier-space filtering by tailoring the configuration. We propose a novel optical secure image verification system based on this theory with the help of phase-matched filtering. In the verification process, the system key and the ID card, which contain the information of the correct image and the information to be verified, are put in the reference and the test paths, respectively. We demonstrate that the ghost imaging configuration can perform an incoherent correlation between the system key and the ID card. A correct verification manifests itself as a correlation peak in the ghost image. The primary image and the image to be verified are encrypted and encoded into pure phase masks beforehand for security. Multi-image secure verification can also be implemented in the proposed system.

  4. New photoresponsible polymers based on the polymerisable azo-diphenyldiacetylene (AZ-DPDA) liquid crystalline monomers for rewritable holograms (Conference Presentation)

    Science.gov (United States)

    Kim, Jinsoo; Ka, Jae-Won; Kim, Yun Ho; Kim, Yeong-Joon; Seo, Young Beom

    2017-02-01

    The development of high-performance and large-area photoresponsive materials for holograms has been one of the great challenges in realizing holographic 3D display technology that needs no special eyewear. Desirable hologram materials should provide high diffraction efficiency, fast response, high resolution, stable and reversible storage, and low energy consumption in the recording and reading processes, as well as easy mass production. Azobenzene-containing polymers have been recognized as one of the promising candidate materials for holography because they can be modulated effectively due to the photosensitivity and reversibility of azo moieties. In addition, polymer systems have several advantages such as simple fabrication, flexibility, thermal stability, and large-scale production. It has been reported that a highly birefringent azotolan-containing liquid crystalline polymer (LCP) film can induce a large change in refractive index upon exposure to actinic light. Analogously, we prepared new photochromic polymers based on polymerisable liquid crystalline acrylate monomers (RMs) containing azo and highly birefringent diphenyldiacetylene (DPDA) mesogenic units connected directly. Evaluation of the new polymers as rewritable hologram media will be discussed.

  5. A Method for Automatic Runtime Verification of Automata-Based Programs

    OpenAIRE

    Oleg, Stepanov; Anatoly, Shalyto

    2008-01-01

    Currently Model Checking is the only practically used method for verification of automata-based programs. However, current implementations of this method only allow verification of simple automata systems. We suggest using a different approach, runtime verification, for verification of automata systems. We discuss advantages and disadvantages of this approach, propose a method for automatic verification of automata-based programs which uses this approach and conduct experimental performance s...

  6. Lifting Term Rewriting Derivations in Constructor Systems by Using Generators

    Directory of Open Access Journals (Sweden)

    Adrián Riesco

    2015-01-01

    Full Text Available Narrowing is a procedure that was first studied in the context of equational E-unification and that has been used in a wide range of applications. The classic completeness result due to Hullot states that any term rewriting derivation starting from an instance of an expression can be "lifted" to a narrowing derivation, whenever the substitution employed is normalized. In this paper we adapt the generator-based extra-variables-elimination transformation used in functional-logic programming to overcome that limitation, so we are able to lift term rewriting derivations starting from arbitrary instances of expressions. The proposed technique is limited to left-linear constructor systems and to derivations reaching a ground expression. We also present a Maude-based implementation of the technique, using natural rewriting for the on-demand evaluation strategy.

  7. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    In this paper, we present a discriminative feature difference learning method for facial image based kinship verification. To transform the feature difference of an image pair to be discriminative for kinship verification, a linear transformation matrix for the feature difference between an image pair...... than the commonly used feature concatenation, leading to a low complexity. Furthermore, there is no positive semi-definite constraint on the transformation matrix, while there is in metric learning methods, leading to an easy solution for the transformation matrix. Experimental results on two public......

  8. Analytical learning and term-rewriting systems

    Science.gov (United States)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first order logical language and are variants of goal regression, such as the familiar explanation based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.

  9. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  10. A novel methodology for model-based OPC verification

    Science.gov (United States)

    Huang, Tengyen; Liao, ChunCheng; Chou, Ryan; Liao, Hung-Yueh; Schacht, Jochen

    2008-03-01

    Model-based optical proximity correction (OPC) is an indispensable production tool enabling the successful extension of photolithography down to the sub-80 nm regime. Commercial OPC software has established clear procedures to produce accurate OPC models at the best-focus condition. However, OPC models calibrated at best focus sometimes fail to prevent catastrophic circuit failures due to pattern shorts and opens caused by accidental shifts of dose/focus within the corners of the allowed process window. A novel model-based OPC verification methodology is presented in this work, which precisely pinpoints post-OPC photolithography failures in VLSI circuits through the entire lithographic process window. By applying a critical photolithography process window model in OPC verification software, we successfully uncovered all weak points of a design prior to tape-out, eliminating the high risk of circuit opens and shorts at the extreme corners of the lithographic process window in any complex circuit layout environment. The process-window-related information is usually not taken into consideration when running OPC verification procedures with models calibrated at the nominal process condition. Intensive review of the critical dimension (CD) and top-view SEM micrographs from the weak points indicates agreement between post-OPC simulation and measurements. Using a single highly accurate process window resist model provides a reliable OPC verification methodology when used in a field- or grid-based simulation engine, ensuring manufacturability within the largest possible process window for any modern critical design.

  11. Medical data transformation using rewriting

    OpenAIRE

    Ashish, Naveen; Toga, Arthur W.

    2015-01-01

    This paper presents a system for declaratively transforming medical subjects' data into a common data model representation. Our work is part of the “GAAIN” project on Alzheimer's disease data federation across multiple data providers. We present a general purpose data transformation system that we have developed by leveraging the existing state-of-the-art in data integration and query rewriting. In this work we have further extended the current technology with new formalisms that facilitate e...

  12. Rewriting, Ideology, and Poetics in Goldblatt's Translation of Mo Yan's 天堂蒜薹之歌 (The Garlic Ballads)

    OpenAIRE

    Du, Ping; Zhang, Lili

    2015-01-01

    In their article "Rewriting, Ideology, and Poetics in Goldblatt's Translation of Mo Yan's 天堂蒜薹之歌 (The Garlic Ballads)" Ping Du and Lili Zhang analyze Howard Goldblatt's translation of the novel in order to explore literary "rewriting" in translation. Du and Zhang posit that Goldblatt's translation reflects ideology in concealing, discarding, rewriting, and even losing some part in his translation. Further, they argue that the translation of the novel has beenperformed based on specific aspect...

  13. Specification and Verification of an Agent-Based Auction Service

    Science.gov (United States)

    Badica, Amelia; Badica, Costin

    In this chapter we propose a rigorous modelling and analysis of complex interactions occurring between providers and users of an agent-based English auction service. In our model several auctions initiated by different seller agents are carried out in parallel. Buyer agents can dynamically decide to register for participation in auctions that match their goals. Our approach is based on conceptualising these interactions by formal specification using FSP process algebra and formal verification using FLTL temporal logic.

  14. An ontology based trust verification of software license agreement

    Science.gov (United States)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a large document stating rights and obligations is displayed, which many people do not have the patience to read or understand. That may make users distrust the software. In this paper, we propose an ontology-based verification for software license agreements. First of all, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright law knowledge model, and can also work as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  15. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Full Text Available Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects becomes a more and more important issue. In this paper, a public hamming code based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the hamming code based watermark can be verified by using hamming code checking without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. They also show that the proposed method can improve security and achieve low distortion of the stego object.
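
    A sketch of the two named ingredients -- a Hamming(7,4) code for the watermark bits and LSB substitution into integer-quantized vertex coordinates -- is given below; representing the 3D model as a flat list of integers, and the particular watermark and coordinates, are simplifying assumptions.

```python
def hamming74_encode(d):                      # d: 4 data bits
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_check(c):                       # returns 0 if the codeword is intact
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    return s1 | (s2 << 1) | (s3 << 2)

def embed(coords, bits):
    """LSB-substitute one watermark bit into each of the first len(bits)
    quantized coordinates; remaining coordinates are left untouched."""
    stego = list(coords)
    for k, b in enumerate(bits):
        stego[k] = (stego[k] & ~1) | b
    return stego

def extract(coords, n):
    return [c & 1 for c in coords[:n]]

watermark = [1, 0, 1, 1]                      # 4 watermark bits per block
codeword = hamming74_encode(watermark)
vertices = [1042, 873, 991, 1200, 1103, 950, 1001, 777]   # quantized coords
stego = embed(vertices, codeword)

print("verified" if hamming74_check(extract(stego, 7)) == 0 else "tampered")
stego[3] ^= 1   # flipping any single embedded LSB makes the syndrome non-zero
print("verified" if hamming74_check(extract(stego, 7)) == 0 else "tampered")
```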

  16. Hamming Code Based Watermarking Scheme for 3D Model Verification

    OpenAIRE

    Jen-Tse Wang; Yi-Ching Chang; Chun-Yuan Yu; Shyr-Shen Yu

    2014-01-01

    Due to the explosive growth of the Internet and maturing of 3D hardware techniques, protecting 3D objects becomes a more and more important issue. In this paper, a public hamming code based fragile watermarking technique is proposed for 3D objects verification. An adaptive watermark is generated from each cover model by using the hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the hamming code ba...

  17. From infinitary term rewriting to cyclic term graph rewriting and back

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2011-01-01

    Cyclic term graph rewriting has been shown to be adequate for simulating certain forms of infinitary term rewriting. These forms are, however, quite restrictive, and it would be beneficial to lift these restrictions at least for a limited class of rewriting systems. In order to better understand...... the correspondences between infinite reduction sequences over terms and finite reductions over cyclic term graphs, we explore different variants of infinitary term graph rewriting calculi. To this end, we study different modes of convergence for term graph rewriting that generalise the modes of convergence usually...... considered in infinitary term rewriting. After discussing several different alternatives, we identify a complete semilattice on term graphs and derive from it a complete metric space on term graphs. Equipped with these structures, we can -- analogously to the term rewriting case -- define both a metric...

  18. The Rewrite Rule Machine Project

    Science.gov (United States)

    1989-05-01

    overlap among their instructions; that is, large, complex computations tend to be globally inhomogeneous. SIMD architectures can be very inefficient in... [Figure 1: Rewrite Rules for Fibonacci] ...a simple example, the Fibonacci function, as defined by the equations... This restriction can be relaxed to implement so-called perpetual processes.
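
    In the spirit of the report's Figure 1, the sketch below defines the Fibonacci function by rewrite rules over Peano-style terms and applies them until a normal form is reached; the tuple term representation and the innermost strategy are choices made for the example, not the Rewrite Rule Machine's execution model.

```python
# Terms are nested tuples; rewrite rules are (matcher, builder) pairs.
Z = ("0",)
def S(n): return ("s", n)
def fibo(n): return ("fibo", n)
def plus(a, b): return ("+", a, b)

RULES = [
    # fibo(0) -> 0
    (lambda t: t[0] == "fibo" and t[1] == Z,
     lambda t: Z),
    # fibo(s(0)) -> s(0)
    (lambda t: t[0] == "fibo" and t[1][0] == "s" and t[1][1] == Z,
     lambda t: S(Z)),
    # fibo(s(s(n))) -> fibo(s(n)) + fibo(n)
    (lambda t: t[0] == "fibo" and t[1][0] == "s" and t[1][1][0] == "s",
     lambda t: plus(fibo(t[1][1]), fibo(t[1][1][1]))),
    # 0 + b -> b
    (lambda t: t[0] == "+" and t[1] == Z,
     lambda t: t[2]),
    # s(a) + b -> s(a + b)
    (lambda t: t[0] == "+" and t[1][0] == "s",
     lambda t: S(plus(t[1][1], t[2]))),
]

def rewrite(term):
    """Normalize subterms first, then apply the first matching rule at the
    root, repeating until no rule applies."""
    if term[0] in ("s", "+", "fibo"):
        term = (term[0],) + tuple(rewrite(a) for a in term[1:])
    for matches, build in RULES:
        if matches(term):
            return rewrite(build(term))
    return term

def to_int(t): return 0 if t == Z else 1 + to_int(t[1])
def from_int(n): return Z if n == 0 else S(from_int(n - 1))

print(to_int(rewrite(fibo(from_int(7)))))   # 13
```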

  19. Obtaining a minimal set of rewrite rules

    CSIR Research Space (South Africa)

    Davel, M

    2005-11-01

    Full Text Available In this paper the authors describe a new approach to rewrite rule extraction and analysis, using Minimal Representation Graphs. This approach provides a mechanism for obtaining the smallest possible rule set – within a context-dependent rewrite rule...

  20. Retinal Verification Using a Feature Points-Based Biometric Pattern

    Directory of Open Access Journals (Sweden)

    M. Ortega

    2009-01-01

    Full Text Available Biometrics refers to identity verification of individuals based on physiological or behavioural characteristics. The typical authentication process consists of extracting a biometric pattern of the person and matching it against the stored pattern of the authorised user, obtaining a similarity value between the patterns. In this work an efficient method for person authentication is presented. The biometric pattern of the system is a set of feature points representing landmarks in the retinal vessel tree. The pattern extraction and matching steps are described, and an in-depth analysis of the performance of similarity metrics for the biometric system is presented. A database with retina image samples taken from users at different moments in time is used, thus simulating a hard, realistic verification environment. Even in this scenario, the system makes it possible to establish a wide confidence band for the metric threshold in which no errors are obtained for the training and test sets.
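
    A minimal Python sketch of the general idea of matching two feature-point patterns with a similarity score and threshold follows. The tolerance, point coordinates and threshold are assumptions, not values or the algorithm from the paper.

    ```python
    # Illustrative sketch only: compare two sets of retinal feature points by
    # counting pairs that fall within a small tolerance, then normalise the
    # count into a similarity score.
    from math import hypot

    def similarity(points_a, points_b, tol=5.0):
        matched, used = 0, set()
        for (xa, ya) in points_a:
            for j, (xb, yb) in enumerate(points_b):
                if j not in used and hypot(xa - xb, ya - yb) <= tol:
                    matched += 1
                    used.add(j)
                    break
        return matched / max(len(points_a), len(points_b))

    stored = [(10, 12), (40, 80), (65, 33), (90, 90)]   # enrolled pattern (toy)
    probe  = [(11, 13), (39, 81), (66, 35)]             # probe pattern (toy)
    accept = similarity(stored, probe) >= 0.7           # threshold chosen for illustration
    ```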

  1. Protocol-Based Verification of Message-Passing Parallel Programs

    DEFF Research Database (Denmark)

    López-Acosta, Hugo-Andrés; Marques, Eduardo R. B.; Martins, Francisco

    2015-01-01

    We present ParTypes, a type-based methodology for the verification of Message Passing Interface (MPI) programs written in the C programming language. The aim is to statically verify programs against protocol specifications, enforcing properties such as fidelity and absence of deadlocks. We develop... ...translated into a representation read by VCC, a software verifier for C. We successfully verified several MPI programs in a running time that is independent of the number of processes or other input parameters. This contrasts with alternative techniques, notably model checking and runtime verification, that suffer from the state-explosion problem or that otherwise depend on parameters to the program itself. We experimentally evaluated our approach against state-of-the-art tools for MPI to conclude that our approach offers a scalable solution.

  2. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by various semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is to model legislation by capturing its domain knowledge and specifying it in a generic way using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation through validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, the level of a model or the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics; pragmatic features and attributes can thus be determined that are relevant for evaluating models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a given feature for a specific model.

  3. Game-based verification and synthesis

    DEFF Research Database (Denmark)

    Vester, Steen

    problems for logics capable of expressing strategic abilities of players in games with both qualitative and quantitative objectives. A number of computational complexity results for model-checking and satisfiability problems in this domain are obtained. We also show how the technique of symmetry reduction...... can be extended to solve finitely-branching turn-based games more efficiently. Further, the novel concept of winning cores in parity games is introduced. We use this to develop a new polynomial-time under-approximation algorithm for solving parity games. Experimental results show that this algorithm...

  4. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Full Text Available Biometric-based identification/verification systems provide a solution to the security concerns of the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image-based). The minutiae-based algorithms depend upon the local discontinuities in the ridge flow pattern and are used when template size is important, while image-based matching algorithms use both the micro and macro features of a fingerprint and are used if a fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. The system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database: datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 × 100 and a threshold value of 700 (1000 being a perfect match).
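
    The following Python sketch illustrates, under stated assumptions, how a sub-sampled image comparison can be scaled to a 0-1000 score and thresholded at 700. It is only an analogue of the described idea, not the LabVIEW implementation; the images and seed are invented.

    ```python
    # Rough analogue of the image-based matching step: pseudo-randomly sub-sample
    # pixel positions, compare probe and template at those positions, and scale
    # the score to 1000 so that a threshold such as 700 can be applied.
    import random

    def match_score(template, probe, n_samples=1000, seed=42):
        rng = random.Random(seed)              # fixed seed -> reproducible sub-sampling
        h, w = len(template), len(template[0])
        hits = 0
        for _ in range(n_samples):
            y, x = rng.randrange(h), rng.randrange(w)
            hits += template[y][x] == probe[y][x]
        return round(1000 * hits / n_samples)

    template = [[(x + y) % 2 for x in range(100)] for y in range(100)]  # toy image
    probe    = [row[:] for row in template]
    probe[0][0] ^= 1                               # flip one pixel
    verified = match_score(template, probe) >= 700 # 1000 would be a perfect match
    ```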

  5. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  6. Bounded Semantics of CTL and SAT-Based Verification

    Science.gov (United States)

    Zhang, Wenhui

    Bounded model checking has been proposed as a complementary approach to BDD based symbolic model checking for combating the state explosion problem, esp. for efficient error detection. This has led to a lot of successful work with respect to error detection in the checking of LTL, ACTL (the universal fragment of CTL) and ACTL* properties by satisfiability testing. The use of bounded model checking for verification (in contrast to error detection) of LTL and ACTL properties has later also been studied. This paper studies the potentials and limitations of bounded model checking for the verification of CTL and CTL* formulas. On the theoretical side, we first provide a framework for discussion of bounded semantics, which serves as the basis for bounded model checking, then extend the bounded semantics of ACTL to a bounded semantics of CTL, and discuss the limitation of developing such a bounded semantics for CTL*. On the practical side, a deduction of a SAT-based bounded model checking approach for ACTL properties from the bounded semantics of CTL is demonstrated, and a comparison of such an approach with BDD-based model checking is presented based on experimental results.
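
    To convey the underlying idea of bounding the search depth, here is a small explicit-state reachability check in Python. Note that the paper itself works with SAT encodings of bounded semantics for CTL, so this is a simplified stand-in rather than the approach it describes; the transition system and property are toy assumptions.

    ```python
    # Explicit-state illustration of the bounded idea: check whether a state
    # satisfying `prop` is reachable within bound k.
    def bounded_ef(initial, successors, prop, k):
        frontier, seen = {initial}, {initial}
        for _ in range(k + 1):
            if any(prop(s) for s in frontier):
                return True                     # witness found within the bound
            frontier = {t for s in frontier for t in successors(s)} - seen
            seen |= frontier
            if not frontier:
                return False                    # state space exhausted: property unreachable
        return False                            # inconclusive at this bound

    succ = lambda s: {(s + 1) % 8, (s * 2) % 8}     # toy 8-state system
    assert bounded_ef(0, succ, lambda s: s == 5, k=4)
    ```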

  7. A scenario-based verification technique to assess the compatibility of collaborative business processes.

    OpenAIRE

    De Backer, Manu; Snoeck, Monique; Monsieur, Geert; Lemahieu, Wilfried; Dedene, Guido

    2009-01-01

    Successful E-Business is based on seamless collaborative business processes. Each partner in the collaboration specifies its own rules and interaction preconditions. The verification of the compatibility of collaborative business processes, based on local and global views, is a complex task, which is critical for the success of the cooperation. The verification of process compatibility should be a key element in the design of new business alliances, which makes this verification essential in ...

  8. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of a product design against the regulations related to that product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of a product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.
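
    A hypothetical Python sketch of the rule-checking step follows. The product attributes, rule texts and limits are invented for illustration; the actual system encodes regulations in an ontology-plus-rules knowledge base exposed through a Web service.

    ```python
    # Toy rule check: each regulation rule is a predicate over product attributes
    # plus a message reported on failure. All names and numbers are made up.
    product = {"cable_length_mm": 1800, "insulation_class": "B", "max_surface_temp_c": 95}

    rules = [
        ("cable length must not exceed 2000 mm",
         lambda p: p["cable_length_mm"] <= 2000),
        ("surface temperature must stay below 90 °C for insulation class B",
         lambda p: p["insulation_class"] != "B" or p["max_surface_temp_c"] < 90),
    ]

    violations = [msg for msg, ok in rules if not ok(product)]
    print(violations or "design conforms to the encoded regulations")
    ```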

  9. Efficient Data Integrity Verification Using CRC Based on HDFS in Cloud Storage

    Directory of Open Access Journals (Sweden)

    Xia Yun-Hao

    2017-01-01

    Full Text Available Data integrity verification is becoming a major challenge in cloud storage that cannot be ignored. Through a study of the CRC checksum algorithm and the data integrity verification mechanism of HDFS, this paper proposes an optimized variant of the CRC (Cyclic Redundancy Check) verification algorithm based on HDFS to improve the efficiency of data integrity verification in cloud storage. A new method is formulated to establish the deformational optimization and to accelerate the algorithm by studying the characteristics of checksum generation and checking, and the code is optimized for computational efficiency according to the data integrity verification mechanism of HDFS. A data integrity verification system based on Hadoop is designed to validate the proposed method. Experimental results demonstrate that the proposed HDFS-based CRC algorithm improves calculation efficiency and overall system resource utilization, and outperforms existing models in terms of accuracy and time.
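
    Independently of HDFS and of the paper's optimised variant, the basic CRC check it builds on can be sketched in Python as follows; the file path and chunk size are placeholders.

    ```python
    # Minimal illustration of CRC-based integrity checking: compute a running
    # CRC-32 over fixed-size chunks and compare it with a stored checksum.
    import zlib

    def crc32_of(path, chunk_size=512 * 1024):
        crc = 0
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                crc = zlib.crc32(chunk, crc)     # fold each chunk into the running CRC
        return crc & 0xFFFFFFFF

    def verify(path, stored_checksum):
        return crc32_of(path) == stored_checksum
    ```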

  10. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Biometric-based authentication systems provide solutions to the high-security problems that remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are well suited as biometric parameters but do not guarantee that the person is present and alive: a voice can be copied, a fingerprint can be picked from a glass onto synthetic skin, and in face recognition systems identical twins or a father and son may, due to genetic factors, have the same facial appearance. ECG does not have these problems. It cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometric verification system developed using Laboratory Virtual Instruments Engineering Workbench (LabVIEW) version 7.1 is discussed. Experiments were conducted on a laboratory database of 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
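
    The FAR and FRR figures quoted above are computed from genuine and impostor match scores; a minimal Python sketch, with made-up scores and threshold, is shown below.

    ```python
    # FAR = fraction of impostor comparisons wrongly accepted;
    # FRR = fraction of genuine comparisons wrongly rejected.
    def far_frr(genuine_scores, impostor_scores, threshold):
        frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
        far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
        return far, frr

    genuine  = [0.91, 0.84, 0.77, 0.66, 0.95]   # same-person comparisons (toy)
    impostor = [0.20, 0.35, 0.72, 0.15, 0.41]   # different-person comparisons (toy)
    print(far_frr(genuine, impostor, threshold=0.7))   # -> (0.2, 0.2) on this toy data
    ```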

  11. An XQDD-Based Verification Method for Quantum Circuits

    Science.gov (United States)

    Wang, Shiou-An; Lu, Chin-Yung; Tsai, I.-Ming; Kuo, Sy-Yen

    Synthesis of quantum circuits is essential for building quantum computers. It is important to verify that the circuits designed perform the correct functions. In this paper, we propose an algorithm which can be used to verify the quantum circuits synthesized by any method. The proposed algorithm is based on BDD (Binary Decision Diagram) and is called X-decomposition Quantum Decision Diagram (XQDD). In this method, quantum operations are modeled using a graphic method and the verification process is based on comparing these graphic diagrams. We also develop an algorithm to verify reversible circuits even if they have a different number of garbage qubits. In most cases, the number of nodes used in XQDD is less than that in other representations. In general, the proposed method is more efficient in terms of space and time and can be used to verify many quantum circuits in polynomial time.

  12. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2011-09-28

    ... ADMINISTRATION Consent Based Social Security Number Verification (CBSV) Service AGENCY: Social Security... Service. SUMMARY: We provide limited fee-based Social Security number (SSN) verification service to... CONTACT: Gerard R. Hart, Office of Public Service and Operations Support, Social Security Administration...

  13. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2016-01-01

    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting a Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  14. Values Education through Aggadic Stories: The Didactic Rewriter as Interpreter

    Science.gov (United States)

    Weinstein, Sara

    2016-01-01

    Didactic rewrites of aggadic stories are an important resource in values education. This study, geared primarily toward teachers involved in choosing curricular materials, investigates how the didactic rewriter actually becomes an interpreter, rather than a mere transmitter, of the original text. The personal values of the rewriters can influence…

  15. Parallel object-oriented term rewriting : the booleans

    NARCIS (Netherlands)

    Rodenburg, P.H.; Vrancken, J.L.M.

    As a first case study in parallel object-oriented term rewriting, we give two implementations of term rewriting algorithms for boolean terms, using the parallel object-oriented features of the language Pool-T. The term rewriting systems are specified in the specification formalism

  16. Shakespeare and the classics: metamorphoses and rewriting

    OpenAIRE

    Bebiano, Adriana

    2003-01-01

    This essay starts off from the idea that all creative writing is rewriting and appropriation of previous texts. It then focuses on Shakespeare’s use of Apuleius’ Golden Ass, in A Midsummer Night’s Dream, and of the encounter between Ulysses and Nausicaa in Odyssey, transformed into the encounter between Ferdinand and Miranda in The Tempest. Rewriting of old stories in a new configuration is seen as productive translation between cultures. Neil Gaiman’s graphic novel The Sandman is given as an...

  17. Aspects of the teaching practice in the horror narratives's revision and rewriting

    Directory of Open Access Journals (Sweden)

    Denise Moreira Gasparotto

    2015-09-01

    Full Text Available This study focused on revision and rewriting in a teacher's practice with 4th- and 5th-grade students of an elementary school in the northwest region of Paraná State, Brazil, considering the texts written in class in the Horror Story discursive genre. Conceiving rewriting as work, and drawing on the Bakhtin Circle's dialogic assumptions concerning dialogism, the responsiveness of written discourse and genres, as well as on Applied Linguistics studies of text revision and rewriting, we observed the guided practice of a teacher in the text revision and rewriting processes of students in this situation. The characterization of the Horror Story genre was our main focus, and the comments on rewriting were examined from this standpoint. Data collection was carried out during the second half of 2012, after a collaborative theoretical-methodological intervention with the teacher that provided theoretical input and guided discussions in order to support the understanding of the proposed work and to evaluate the ongoing actions. The results showed that: (a) the internalization of the theoretical-methodological assumptions happens when a single discursive genre is in focus; (b) guidance and support by the teacher are necessary for the formation process; (c) an improvement was observed in the students' writing; (d) proper revision and rewriting strategies for the analyzed genre need to be developed and improved; and (e) the work done on text revision and rewriting in this specific textual genre seemed to be more effective due to the needs arising from this enunciation.

  18. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, results in low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG1-1999, the Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. The measurement results strongly demonstrate that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.
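
    A toy Python sketch of the pixel-equivalent idea follows: calibrate millimetres per pixel from a reference length seen by the camera, then convert measured pixel distances on the rule into millimetres. All numbers and names are illustrative assumptions, not values from the paper.

    ```python
    # Calibrate mm-per-pixel from a known reference length, then express a
    # measured graduation distance in millimetres and compare with its nominal value.
    def pixel_equivalent(reference_length_mm, reference_length_px):
        return reference_length_mm / reference_length_px      # mm per pixel

    def graduation_error_mm(measured_px, nominal_mm, mm_per_px):
        return measured_px * mm_per_px - nominal_mm

    mm_per_px = pixel_equivalent(reference_length_mm=10.0, reference_length_px=412.3)
    print(graduation_error_mm(measured_px=4118.0, nominal_mm=100.0, mm_per_px=mm_per_px))
    ```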

  19. Verification and Planning Based on Coinductive Logic Programming

    Science.gov (United States)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution

  20. Increasing the rewriting speed of optical rewritable e-paper by selecting proper liquid crystals

    Science.gov (United States)

    Geng, Yu; Sun, Jiatong; Anatoli, Murauski; Vladimir, Chigrinov; Kwok Hoi, Sing

    2012-08-01

    The effect of the interaction between the liquid crystal (LC) and the photoalignment material on the speed of the optical rewriting process is investigated. The theoretical analysis shows that a smaller Frank elastic constant K22 of the liquid crystal corresponds to a larger twist angle, which gives rise to a higher rewriting speed. Six different LC cells with the same boundary conditions (one substrate covered with rubbed polyimide (PI) and the other with the photosensitive rewritable sulfuric dye 1 (SD1)) are tested experimentally under the same illumination intensity (450 nm, 80 mW/cm2). The results demonstrate that with a suitable liquid crystal, the LC optical rewriting speed for e-paper applications can be noticeably improved. The two well-known LC materials E7 (larger K22) and 5CB (smaller K22) require 11 s and 6 s, respectively, to change the alignment direction for writing image information.

  1. A Feature Subtraction Method for Image Based Kinship Verification under Uncontrolled Environments

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    The most fundamental problem of local feature based kinship verification methods is that a local feature can capture the variations of environmental conditions and the differences between two persons having a kin relation, which can significantly decrease the performance. To address this problem...... the feature distance between face image pairs with kinship and maximize the distance between non-kinship pairs. Based on the subtracted feature, the verification is realized through a simple Gaussian based distance comparison method. Experiments on two public databases show that the feature subtraction method...... outperforms or is comparable to state-of-the-art kinship verification methods. Copyright ©2015 by IEEE....

  2. On Graph Rewriting, Reduction and Evaluation

    DEFF Research Database (Denmark)

    Zerny, Ian

    2009-01-01

    We inter-derive two prototypical styles of graph reduction: reduction machines à la Turner and graph rewriting systems à la Barendregt. To this end, we adapt Danvy et al.'s mechanical program derivations from the world of terms to the world of graphs. We also inter-derive a graph evaluator....

  3. On Graph Rewriting, Reduction and Evaluation

    DEFF Research Database (Denmark)

    Zerny, Ian

    2010-01-01

    We inter-derive two prototypical styles of graph reduction: reduction machines à la Turner and graph rewriting systems à la Barendregt et al. To this end, we adapt Danvy et al.'s mechanical program derivations from the world of terms to the world of graphs. We also outline how to inter......-derive a third style of graph reduction: a graph evaluator....

  4. Match-bounded String Rewriting Systems

    Science.gov (United States)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2003-01-01

    We introduce a new class of automated proof methods for the termination of rewriting systems on strings. The basis of all these methods is to show that rewriting preserves regular languages. To this end, letters are annotated with natural numbers, called match heights. If the minimal height of all positions in a redex is h, then every position in the reduct will get height h+1. In a match-bounded system, match heights are globally bounded. Using recent results on deleting systems, we prove that rewriting by a match-bounded system preserves regular languages. Hence it is decidable whether a given rewriting system has a given match bound. We also provide a sufficient criterion for the absence of a match bound. The problem of the existence of a match bound is still open. Match-boundedness for all strings can be used as an automated criterion for termination, since match-bounded systems are terminating. This criterion can be strengthened by requiring match-boundedness only for a restricted set of strings, for instance the set of right hand sides of forward closures.
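
    The match-height annotation can be illustrated with a few lines of Python; the rewrite system {ab -> ba} below is a toy example chosen for brevity, not one taken from the paper.

    ```python
    # Annotate each letter with its match height (all start at 0); when a rule is
    # applied, every letter of the inserted right-hand side gets the minimal
    # height of the matched redex plus one.
    def annotate(s):
        return [(c, 0) for c in s]

    def rewrite_once(annotated, lhs, rhs):
        """Apply lhs -> rhs at the leftmost match and lift heights in the reduct."""
        letters = "".join(c for c, _ in annotated)
        i = letters.find(lhs)
        if i < 0:
            return None
        h = min(h for _, h in annotated[i:i + len(lhs)]) + 1
        return annotated[:i] + [(c, h) for c in rhs] + annotated[i + len(lhs):]

    s = annotate("aabb")
    while True:
        t = rewrite_once(s, "ab", "ba")
        if t is None:
            break
        s = t
    print(s)   # heights stay small here: this toy system is match-bounded
    ```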

  5. Mark formation modeling in optical rewritable recording

    NARCIS (Netherlands)

    Brusche, J.H.; Segal, A.; Vuik, C.; Urbach, H.P.

    2006-01-01

    In optical rewritable recording media, such as the Blu-ray Disc, amorphous marks are formed on a crystalline background of a phase-change layer, by means of short, high power laser pulses. In order to improve this data storage concept, it is of great importance to understand the mark formation

  6. Verification of the Simultaneous Local Extraction Method of Base and Thermal Resistance of Bipolar Transistors

    OpenAIRE

    Robert Setekera; Luuk Tiemeijer; Ramses van der Toorn

    2014-01-01

    In this paper an extensive verification of the extraction method (published earlier) that consistently accounts for self-heating and the Early effect to accurately extract both the base and thermal resistance of bipolar junction transistors is presented. The method verification is demonstrated on advanced RF SiGe HBTs, where the extracted results for the thermal resistance are compared with those from another published method that ignores the influence of the Early effect on the internal base...

  7. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Full Text Available Testing is a de facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness, and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.

  8. Tree automata-based refinement with application to Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivations...... underlying the Horn clauses. Experiments using linear constraint problems and the abstract domain of convex polyhedra show that the refinement technique is practical and that iteration of abstract interpretation with tree automata-based refinement solves many challenging Horn clause verification problems. We...... compare the results with other state of the art Horn clause verification tools....

  9. Termination of canonical context-sensitive rewriting and productivity of rewrite systems

    Directory of Open Access Journals (Sweden)

    Salvador Lucas

    2015-12-01

    Full Text Available Termination of programs, i.e., the absence of infinite computations, ensures the existence of normal forms for all initial expressions, thus providing an essential ingredient for the definition of a normalization semantics for functional programs. In lazy functional languages, though, infinite data structures are often delivered as the outcome of computations. For instance, the list of all prime numbers can be returned as a never-ending stream of numerical expressions or data structures. If such streams are allowed, requiring termination is hopeless. In this setting, the notion of productivity can be used to provide an account of computations with infinite data structures, as it "captures the idea of computability, of progress of infinite-list programs" (B.A. Sijtsma, On the Productivity of Recursive List Definitions, ACM Transactions on Programming Languages and Systems 11(4):633-649, 1989). However, in the realm of Term Rewriting Systems, which can be seen as (first-order, untyped, unconditional) functional programs, termination of Context-Sensitive Rewriting (CSR) has been shown to be equivalent to productivity of rewrite systems through appropriate transformations. In this way, tools for proving termination of CSR can be used to prove productivity. In term rewriting, CSR is the restriction of rewriting that arises when reductions are allowed on selected arguments of function symbols only. In this paper we show that well-known results about the computational power of CSR are useful for better understanding the existing connections between productivity of rewrite systems and termination of CSR, and also for obtaining more powerful techniques to prove productivity of rewrite systems.
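
    The restriction that CSR imposes can be illustrated with a small interpreter-style sketch in Python; the from/cons stream system and the replacement map below are textbook-style assumptions, not code from the paper.

    ```python
    # Context-sensitive rewriting: the replacement map mu lists, for each symbol,
    # the argument positions where rewriting is allowed. With mu(cons) = {1st},
    # the infinite unfolding of the stream's tail is forbidden, so CSR terminates.
    mu = {"cons": [0], "from": [0], "s": [0]}        # never rewrite inside a cons tail

    def step(t):
        """One CSR step: try the root, then only mu-allowed argument positions."""
        if isinstance(t, tuple) and t[0] == "from":   # rule: from(n) -> cons(n, from(s(n)))
            n = t[1]
            return ("cons", n, ("from", ("s", n)))
        if isinstance(t, tuple):
            for i in mu.get(t[0], []):
                r = step(t[i + 1])
                if r is not None:
                    return t[:i + 1] + (r,) + t[i + 2:]
        return None

    t = ("from", "0")
    while (nxt := step(t)) is not None:
        t = nxt
    print(t)     # ('cons', '0', ('from', ('s', '0'))): the forbidden tail stays unreduced
    ```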

  10. Resistive switching effect in the planar structure of all-printed, flexible and rewritable memory device based on advanced 2D nanocomposite of graphene quantum dots and white graphene flakes

    Science.gov (United States)

    Muqeet Rehman, Muhammad; Uddin Siddiqui, Ghayas; Kim, Sowon; Choi, Kyung Hyun

    2017-08-01

    Pursuit of the most appropriate materials and fabrication methods is essential for developing a reliable, rewritable and flexible memory device. In this study, we have proposed an advanced 2D nanocomposite of white graphene (hBN) flakes embedded with graphene quantum dots (GQDs) as the functional layer of a flexible memory device owing to their unique electrical, chemical and mechanical properties. Unlike the typical sandwich type structure of a memory device, we developed a cost effective planar structure, to simplify device fabrication and prevent sneak current. The entire device fabrication was carried out using printing technology followed by encapsulation in an atomically thin layer of aluminum oxide (Al2O3) for protection against environmental humidity. The proposed memory device exhibited attractive bipolar switching characteristics of high switching ratio, large electrical endurance and enhanced lifetime, without any crosstalk between adjacent memory cells. The as-fabricated device showed excellent durability for several bending cycles at various bending diameters without any degradation in bistable resistive states. The memory mechanism was deduced to be conductive filamentary; this was validated by illustrating the temperature dependence of bistable resistive states. Our obtained results pave the way for the execution of promising 2D material based next generation flexible and non-volatile memory (NVM) applications.

  11. Fusion of PCA-Based and LDA-Based Similarity Measures for Face Verification

    Directory of Open Access Journals (Sweden)

    Kittler Josef

    2010-01-01

    Full Text Available The problem of fusing similarity measure-based classifiers is considered in the context of face verification. The performance of face verification systems using different similarity measures in two well-known appearance-based representation spaces, namely Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), is experimentally studied. The study is performed for both manually and automatically registered face images. The experimental results confirm that our optimised Gradient Direction (GD) metric within the LDA feature space outperforms the other adopted metrics. Different methods of selection and fusion of the similarity measure-based classifiers are then examined. The experimental results demonstrate that the combined classifiers outperform any individual verification algorithm. In our studies, Support Vector Machines (SVMs) and Weighted Averaging of similarity measures appear to be the best fusion rules. Another interesting achievement of the work is that although features derived from the LDA approach lead to better results than those of the PCA algorithm for all the adopted scoring functions, fusing the PCA- and LDA-based scores improves the performance of the system.
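
    The simplest of the fusion rules mentioned above, weighted averaging of similarity scores, can be sketched in Python as follows; the weights, scores and acceptance threshold are assumptions, and an SVM could equally be trained on the same per-classifier scores.

    ```python
    # Weighted averaging of per-classifier similarity scores into one fused score.
    def fuse(scores, weights):
        assert len(scores) == len(weights)
        return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

    pca_score, lda_gd_score = 0.58, 0.81        # per-classifier similarity scores (toy)
    combined = fuse([pca_score, lda_gd_score], weights=[0.3, 0.7])
    accepted = combined >= 0.6                  # verification threshold (assumed)
    ```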

  12. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model...... is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification...... and implementation. Our approach has been developed in the context of the Coloured Petri Nets (CPNs) modelling language. We illustrate our approach by presenting a descriptive specification model of the Websocket protocol which is currently under development by the Internet Engineering Task Force (IETF), and we show...

  13. An Illumination Independent Face Verification Based on Gabor Wavelet and Supported Vector Machine

    Science.gov (United States)

    Zhang, Xingming; Liu, Dian; Chen, Jianfu

    Face verification technology is widely used in fields such as public safety and e-commerce. Owing to its insensitivity to illumination variation, a new illumination-invariant face verification method based on Gabor wavelets is presented in this paper. First, the ATICR method is used for lighting preprocessing of the images. Second, selected Gabor wavelet filters, chosen experimentally because different Gabor wavelet filters do not contribute equally to verification, are used to extract image features, whose dimensionality is then reduced by Principal Component Analysis. Finally, SVM classifiers are trained on the dimension-reduced data. Experimental results on the IFACE and NIRFACE databases indicate that the algorithm, named the "Selected Paralleled Gabor Method", achieves higher verification performance and better adaptability to variable illumination.

  14. Method and computer product to increase accuracy of time-based software verification for sensor networks

    Science.gov (United States)

    Foo Kune, Denis [Saint Paul, MN]; Mahadevan, Karthikeyan [Mountain View, CA]

    2011-01-25

    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.

  15. Preprocessing Based Verification of Multiparty Protocols with Honest Majority

    Directory of Open Access Journals (Sweden)

    Laud Peeter

    2017-10-01

    Full Text Available This paper presents a generic “GMW-style” method for turning passively secure protocols into protocols secure against covert attacks, adding relatively cheap offline preprocessing and post-execution verification phases. Our construction performs best with a small number of parties, and its main benefit is the total cost of the online and the offline phases. In the preprocessing phase, each party generates and shares a sufficient amount of verified multiplication triples that will be later used to assist that party’s proof. The execution phase, after which the computed result is already available to the parties, has only negligible overhead that comes from signatures on sent messages. In the postprocessing phase, the verifiers repeat the computation of the prover in secret-shared manner, checking that they obtain the same messages that the prover sent out during execution. The verification preserves the privacy guarantees of the original protocol. It is applicable to protocols doing computations over finite rings, even if the same protocol performs its computation over several distinct rings. We apply our verification method to the Sharemind platform for secure multiparty computations (SMC, evaluate its performance and compare it to other existing SMC platforms offering security against stronger than passive attackers.

  16. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  17. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  18. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  19. Complete Functional Verification

    OpenAIRE

    Bormann, Joerg (Dr.)

    2017-01-01

    The dissertation describes a practically proven, particularly efficient approach for the verification of digital circuit designs. The approach outperforms simulation based verification wrt. final circuit quality as well as wrt. required verification effort. In the dissertation, the paradigm of transaction based verification is ported from simulation to formal verification. One consequence is a particular format of formal properties, called operation properties. Circuit descriptions are verifi...

  20. Translation of State Machines from Equational Theories into Rewrite Theories with Tool Support

    National Research Council Canada - National Science Library

    ZHANG, Min; OGATA, Kazuhiro; NAKAMURA, Masaki

    2011-01-01

    This paper presents a strategy together with tool support for the translation of state machines from equational theories into rewrite theories, aiming at automatically generating rewrite theory specifications...

  1. Lifting infinite normal form definitions from term rewriting to term graph rewriting

    NARCIS (Netherlands)

    S.C.C. Blom (Stefan)

    2002-01-01

    Infinite normal forms are a way of giving semantics to non-terminating rewrite systems. The notion is a generalization of the Boehm tree in the lambda calculus. It was first introduced in [AB97] to provide semantics for a lambda calculus on terms with letrec. In that paper infinite

  2. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    Science.gov (United States)

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of their impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what designed originally. We explore the consequences of this on the operation of the scheme, on its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  3. Partial order infinitary term rewriting and Böhm trees

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2010-01-01

    We investigate an alternative model of infinitary term rewriting. Instead of a metric, a partial order on terms is employed to formalise (strong) convergence. We compare this partial order convergence of orthogonal term rewriting systems to the usual metric convergence of the corresponding Böhm...

  4. Parallel Execution of Multi Set Constraint Rewrite Rules

    DEFF Research Database (Denmark)

    Sulzmann, Martin; Lam, Edmund Soon Lee

    2008-01-01

    Multi-set constraint rewriting allows for a highly parallel computational model and has been used in a multitude of application domains such as constraint solving, agent specification etc. Rewriting steps can be applied simultaneously as long as they do not interfere with each other. We wish that ...

  5. Formal logic rewrite system bachelor in teaching mathematical informatics

    Science.gov (United States)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). The Bachelor system enables a constructivist approach to teaching and may therefore enhance the learning process in essential disciplines of hard informatics. It not only provides a detailed description of the formal rewrite process but can also demonstrate algorithmic principles for manipulating logic formulae.

  6. Characterizing Languages by Normalization and Termination in String Rewriting

    NARCIS (Netherlands)

    Ketema, J.; Simonsen, Jakob Grue; Yen, Hsu-Chun; Ibarra, Oscar H.

    We characterize sets of strings using two central properties from rewriting: normalization and termination. We recall the well-known result that any recursively enumerable set of strings can occur as the set of normalizing strings over a "small" alphabet if the rewriting system is allowed access to

  7. Developing Reading and Listening Comprehension Tests Based on the Sentence Verification Technique (SVT).

    Science.gov (United States)

    Royer, James M.

    2001-01-01

    Describes a team-based approach for creating Sentence Verification Technique (SVT) tests, a development procedure that allows teachers and other school personnel to develop comprehension tests from curriculum materials in use in their schools. Finds that if tests are based on materials that are appropriate for the population to be tested, the…

  8. A scenario-based verification technique to assess the compatibility of collaborative business processes

    NARCIS (Netherlands)

    De Backer, M.; Snoeck, M.; Monsieur, G.; Lemahieu, W.; Dedene, G.

    2009-01-01

    Successful E-Business is based on seamless collaborative business processes. Each partner in the collaboration specifies its own rules and interaction preconditions. The verification of the compatibility of collaborative business processes, based on local and global views, is a complex task, which

  9. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2013-09-12

    ... ADMINISTRATION Consent Based Social Security Number Verification (CBSV) Service AGENCY: Social Security Administration. ACTION: Notice of Revised Transaction Fee for CBSV Service. SUMMARY: We provide fee-based Social...-6401, , for more information about the CBSV service, visit our Internet site, Social Security Online...

  10. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified..., the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a "one... These approaches have been implemented in the UPPAAL tool and built as a tool chain, respectively. We try out the prototype verification tools on a number of examples and case studies. Experimental results indicate that these methods are viable, computationally feasible, and the tools are effective.

  11. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    Directory of Open Access Journals (Sweden)

    Jingzhen Li

    2017-01-01

    Full Text Available In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured by a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 samples. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 samples. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
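
    A rough Python sketch of template matching with a weighted Euclidean distance and a per-user adaptive threshold, in the spirit of the TATM description above, is given below; the feature values, weights and threshold rule are assumptions, not the paper's.

    ```python
    # Weighted Euclidean distance between a probe feature vector and the enrolled
    # template, with a threshold adapted to the spread of the user's own enrolment samples.
    from math import sqrt

    def weighted_euclidean(x, template, weights):
        return sqrt(sum(w * (a - b) ** 2 for a, b, w in zip(x, template, weights)))

    def adaptive_threshold(enrolment_samples, template, weights, slack=1.5):
        d = [weighted_euclidean(s, template, weights) for s in enrolment_samples]
        return slack * max(d)

    template  = [0.52, 0.48, 0.51]                       # mean S21-like features (toy)
    enrolment = [[0.50, 0.47, 0.52], [0.53, 0.49, 0.50]]
    weights   = [1.0, 0.8, 1.2]
    thr = adaptive_threshold(enrolment, template, weights)
    probe = [0.51, 0.46, 0.53]
    accepted = weighted_euclidean(probe, template, weights) <= thr
    ```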

  12. Time-Contrastive Learning Based DNN Bottleneck Features for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2017-01-01

    In this paper, we present a time-contrastive learning (TCL) based bottleneck (BN) feature extraction method for speech signals with an application to text-dependent (TD) speaker verification (SV). It is well-known that speech signals exhibit quasi-stationary behavior in and only in a short interval...... to discriminate speakers or pass-phrases or phones or a combination of them. In the context of speaker verification, speech data of fixed pass-phrases are used for TCL-BN training, while the pass-phrases used for TCL-BN training are excluded from being used for SV, so that the learned features can be considered...... generic. The method is evaluated on the RedDots Challenge 2016 database. Experimental results show that TCL-BN is superior to the existing speaker and pass-phrase discriminant BN features and the Mel-frequency cepstral coefficient feature for text-dependent speaker verification....

  13. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

    The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do...... not perform well in this setting. In this work we compare the performance of different noise reduction methods under different noise conditions in terms of speaker verification when the text is known and the system is trained on clean data (mis-matched conditions). We furthermore propose a new approach based...

  14. From "Somatic Scandals" to "A Constant Potential for Violence"? The Culture of Dissection, Brain-Based Learning, and the Rewriting/Rewiring of "The Child"

    Science.gov (United States)

    Baker, Bernadette

    2015-01-01

    Within educational research across Europe and the US, one of the most rapidly traveling discourses and highly funded pursuits of the moment is brain-based learning (BBL). BBL is an approach to curriculum and pedagogical decision-making that is located within the new field of educational neuroscience. In some strands of BBL research the structure…

  15. Ensemble-based approximation of observation impact using an observation-based verification metric

    Directory of Open Access Journals (Sweden)

    Matthias Sommer

    2016-07-01

    Full Text Available Knowledge on the contribution of observations to forecast accuracy is crucial for the refinement of observing and data assimilation systems. Several recent publications highlighted the benefits of efficiently approximating this observation impact using adjoint methods or ensembles. This study proposes a modification of an existing method for computing observation impact in an ensemble-based data assimilation and forecasting system and applies the method to a pre-operational, convective-scale regional modelling environment. Instead of the analysis, the modified approach uses observation-based verification metrics to mitigate the effect of correlation between the forecast and its verification norm. Furthermore, a peculiar property in the distribution of individual observation impact values is used to define a reliability indicator for the accuracy of the impact approximation. Applying this method to a 3-day test period shows that a well-defined observation impact value can be approximated for most observation types and the reliability indicator successfully depicts where results are not significant.

  16. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology) where the research is being carried out.

  17. An Improved Constraint-based system for the verification of security protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also allows one to detect flaws associated with partial runs

  18. Electronic portal images (EPIs) based position verification for the breast simultaneous integrated boost (SIB) technique

    NARCIS (Netherlands)

    Sijtsema, Nanna M; van Dijk-Peters, Femke B J; Langendijk, Johannes a; Maduro, John H; van 't Veld, Aart a

    Background and purpose: To develop a method based on electronic portal images (EPIs) for the position verification of breast cancer patients that are treated with a simultaneous integrated boost (SIB) technique. Method: 3D setup errors of the breast outline and the thoracic wall were determined from

  19. Compositional verification of knowledge-based systems: A case study for diagnostic reasoning

    NARCIS (Netherlands)

    Cornelissen, F.J.; Jonker, C.M.; Treur, J.

    1997-01-01

    In this paper a compositional verification method for models of knowledge-based systems is introduced. Required properties of the system are formally verified by deriving them from assumptions that themselves are properties of sub-components, which in their turn may be derived from assumptions on

  20. Verification Benchmarks to Assess the Implementation of Computational Fluid Dynamics Based Hemolysis Prediction Models.

    Science.gov (United States)

    Hariharan, Prasanna; D'Souza, Gavin; Horner, Marc; Malinauskas, Richard A; Myers, Matthew R

    2015-09-01

    As part of an ongoing effort to develop verification and validation (V&V) standards for using computational fluid dynamics (CFD) in the evaluation of medical devices, we have developed idealized flow-based verification benchmarks to assess the implementation of commonly cited power-law based hemolysis models in CFD. Verification process ensures that all governing equations are solved correctly and the model is free of user and numerical errors. To perform verification for power-law based hemolysis modeling, analytical solutions for the Eulerian power-law blood damage model (which estimates hemolysis index (HI) as a function of shear stress and exposure time) were obtained for Couette and inclined Couette flow models, and for Newtonian and non-Newtonian pipe flow models. Subsequently, CFD simulations of fluid flow and HI were performed using Eulerian and three different Lagrangian-based hemolysis models and compared with the analytical solutions. For all the geometries, the blood damage results from the Eulerian-based CFD simulations matched the Eulerian analytical solutions within ∼1%, which indicates successful implementation of the Eulerian hemolysis model. Agreement between the Lagrangian and Eulerian models depended upon the choice of the hemolysis power-law constants. For the commonly used values of power-law constants (α  = 1.9-2.42 and β  = 0.65-0.80), in the absence of flow acceleration, most of the Lagrangian models matched the Eulerian results within 5%. In the presence of flow acceleration (inclined Couette flow), moderate differences (∼10%) were observed between the Lagrangian and Eulerian models. This difference increased to greater than 100% as the beta exponent decreased. These simplified flow problems can be used as standard benchmarks for verifying the implementation of blood damage predictive models in commercial and open-source CFD codes. The current study only used power-law model as an illustrative example to emphasize the need
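
    The power-law blood damage models benchmarked above are commonly written as HI = C * tau**alpha * t**beta, with shear stress tau and exposure time t. The sketch below evaluates this form along a constant-shear Couette path. The constant C, the exponent values, and the mapping of alpha to the stress exponent follow widely used literature conventions and are assumptions, not inputs taken from the benchmark study.

```python
def hemolysis_index(tau_pa, t_s, C=3.62e-5, alpha=2.416, beta=0.785):
    """Percent hemolysis for shear stress tau (Pa) applied for exposure time t (s).

    Power-law form HI = C * tau**alpha * t**beta; the coefficients are common
    literature values (assumptions), not the benchmark study's inputs.
    """
    return C * tau_pa ** alpha * t_s ** beta

def couette_shear_stress(mu, U, h):
    """Shear stress in plane Couette flow of a Newtonian fluid: tau = mu * U / h."""
    return mu * U / h

mu = 3.5e-3    # assumed blood dynamic viscosity, Pa*s
U = 2.0        # moving-plate speed, m/s
h = 1.0e-4     # gap height, m
L = 0.05       # plate length, m

tau = couette_shear_stress(mu, U, h)   # 70 Pa
t_exposure = L / (U / 2.0)             # transit time at the mean fluid speed U/2
print(f"tau = {tau:.1f} Pa, HI = {hemolysis_index(tau, t_exposure):.4f} %")
```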

  1. Active pixel and photon counting imagers based on poly-Si TFTs: rewriting the rule book on large area flat panel x-ray devices

    Science.gov (United States)

    Antonuk, Larry E.; Koniczek, Martin; El-Mohri, Youcef; Zhao, Qihua

    2009-02-01

    The near-ubiquity of large area, active matrix, flat-panel imagers (AMFPIs) in medical x-ray imaging applications is a testament to the usefulness and adaptability of the relatively simple concept of array pixels based on a single amorphous silicon (a-Si:H) TFT coupled to a pixel storage capacitor. Interestingly, the fundamental advantages of a-Si:H thin film electronics (including compatibility with very large area processing, high radiation damage resistance, and continued development driven by interest in mainstream consumer products) are shared by the rapidly advancing technology of polycrystalline silicon (poly-Si) TFTs. Moreover, the far higher mobilities of poly-Si TFTs, compared to those of a- Si:H, facilitate the creation of faster and more complex circuits than are possible with a-Si:H TFTs, leading to the possibility of new classes of large area, flat panel imagers. Given recent progress in the development of initial poly-Si imager prototypes, the creation of increasingly sophisticated active pixel arrays offering pixel-level amplification, variable gain, very high frame rates, and excellent signal-to-noise performance under all fluoroscopic and radiographic conditions (including very low exposures and high spatial frequencies), appears within reach. In addition, it is conceivable that the properties of poly-Si TFTs could allow the development of large area imagers providing single xray photon counting capabilities. In this article, the factors driving the possible realization of clinically practical active pixel and photon counting imagers based on poly-Si TFTs are described and simple calculational estimates related to photon counting imagers are presented. Finally, the prospect for future development of such imagers is discussed.

  2. Within ARM's reach: compilation of left-linear rewrite systems via minimal rewrite systems

    NARCIS (Netherlands)

    W.J. Fokkink (Wan); J.F.T. Kamperman; H.R. Walters (Pum)

    1997-01-01

    A new compilation technique for left-linear term rewriting systems is presented, where rewrite rules are transformed into so-called minimal rewrite rules. These minimal rules have such a simple form that they can be viewed as instructions for an abstract rewriting machine (ARM).

  3. Rewriting Modulo β in the λΠ-Calculus Modulo

    Directory of Open Access Journals (Sweden)

    Ronan Saillard

    2015-07-01

    Full Text Available The lambda-Pi-calculus Modulo is a variant of the lambda-calculus with dependent types where beta-conversion is extended with user-defined rewrite rules. It is an expressive logical framework and has been used to encode logics and type systems in a shallow way. Basic properties such as subject reduction or uniqueness of types do not hold in general in the lambda-Pi-calculus Modulo. However, they hold if the rewrite system generated by the rewrite rules together with beta-reduction is confluent. But this is too restrictive. To handle the case where non-confluence comes from the interference between the beta-reduction and rewrite rules with lambda-abstraction on their left-hand side, we introduce a notion of rewriting modulo beta for the lambda-Pi-calculus Modulo. We prove that confluence of rewriting modulo beta is enough to ensure subject reduction and uniqueness of types. We achieve our goal by encoding the lambda-Pi-calculus Modulo into a Higher-Order Rewrite System (HRS). As a consequence, we also make the confluence results for HRSs available for the lambda-Pi-calculus Modulo.
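
    To make the idea of user-defined rewrite rules concrete, the toy engine below performs first-order rewriting with pattern variables. It deliberately omits binders, dependent types and beta-reduction, so it illustrates plain term rewriting rather than the lambda-Pi-calculus Modulo itself; the term encoding and rule set are assumptions chosen for brevity.

```python
# Toy first-order term rewriting engine (illustration only; the lambda-Pi-calculus
# Modulo additionally has binders, dependent types and beta-reduction, which this
# sketch does not model).  Terms: ('f', arg1, ...) for applications, plain strings
# for constants, and strings starting with '?' for rule variables.

def match(pattern, term, subst=None):
    """Return a substitution making `pattern` equal to `term`, or None."""
    if subst is None:
        subst = {}
    if isinstance(pattern, str) and pattern.startswith('?'):
        if pattern in subst:
            return subst if subst[pattern] == term else None
        return {**subst, pattern: term}
    if isinstance(pattern, str) or isinstance(term, str):
        return subst if pattern == term else None
    if len(pattern) != len(term) or pattern[0] != term[0]:
        return None
    for p, t in zip(pattern[1:], term[1:]):
        subst = match(p, t, subst)
        if subst is None:
            return None
    return subst

def substitute(term, subst):
    """Instantiate a rule right-hand side with the bindings found by match()."""
    if isinstance(term, str):
        return subst.get(term, term)
    return (term[0],) + tuple(substitute(a, subst) for a in term[1:])

def rewrite_once(term, rules):
    """Apply the first matching rule, at the root or inside any subterm."""
    for lhs, rhs in rules:
        s = match(lhs, term)
        if s is not None:
            return substitute(rhs, s)
    if not isinstance(term, str):
        for i, arg in enumerate(term[1:], start=1):
            new = rewrite_once(arg, rules)
            if new is not None:
                return term[:i] + (new,) + term[i + 1:]
    return None

# User-defined rules for Peano addition: plus(0, y) -> y ; plus(s(x), y) -> s(plus(x, y))
rules = [(('plus', '0', '?y'), '?y'),
         (('plus', ('s', '?x'), '?y'), ('s', ('plus', '?x', '?y')))]

t = ('plus', ('s', ('s', '0')), ('s', '0'))        # 2 + 1
while (nxt := rewrite_once(t, rules)) is not None:
    t = nxt
print(t)                                           # ('s', ('s', ('s', '0'))), i.e. 3
```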

  4. Elementary Particle Spectroscopy in Regular Solid Rewrite

    Science.gov (United States)

    Trell, Erik

    2008-10-01

    The Nilpotent Universal Computer Rewrite System (NUCRS) has operationalized the radical ontological dilemma of Nothing at All versus Anything at All down to the ground recursive syntax and principal mathematical realisation of this categorical dichotomy as such and so governing all its sui generis modalities, leading to fulfilment of their individual terms and compass when the respective choice sequence operations are brought to closure. Focussing on the general grammar, NUCRS by pure logic and its algebraic notations hence bootstraps Quantum Mechanics, aware that it "is the likely keystone of a fundamental computational foundation" also for e.g. physics, molecular biology and neuroscience. The present work deals with classical geometry where morphology is the modality, and ventures that the ancient regular solids are its specific rewrite system, in effect extensively anticipating the detailed elementary particle spectroscopy, and further on to essential structures at large both over the inorganic and organic realms. The geodetic antipode to Nothing is extension, with natural eigenvector the endless straight line which when deployed according to the NUCRS as well as Plotelemeian topographic prescriptions forms a real three-dimensional eigenspace with cubical eigenelements where observed quark-skewed quantum-chromodynamical particle events self-generate as an Aristotelean phase transition between the straight and round extremes of absolute endlessness under the symmetry- and gauge-preserving, canonical coset decomposition SO(3)×O(5) of Lie algebra SU(3). The cubical eigen-space and eigen-elements are the parental state and frame, and the other solids are a range of transition matrix elements and portions adapting to the spherical root vector symmetries and so reproducibly reproducing the elementary particle spectroscopy, including a modular, truncated octahedron nano-composition of the Electron which piecemeal enter into molecular structures or compressed to each

  5. Light Printing of Grayscale Pixel Images on Optical Rewritable Electronic Paper

    Science.gov (United States)

    Muravsky, Alexander; Murauski, Anatoli; Chigrinov, Vladimir; Kwok, Hoi-Sing

    2008-08-01

    We developed a new principle of electronic paper that can display two-dimensional (2D) or even stereoscopic three-dimensional (3D) images. We review the structure of the display unit, a light-printable rewritable medium with polarization-dependent gray scale. It consists of one or two liquid crystal displays based on optical rewritable (ORW) technology. The ORW display uses bare plastic or polarizers as substrates, and no conductor is required. A continuous gray-scale image on ORW e-paper maintains proper performance even when the device is bent. The image is changed by a light printer. We discuss and experimentally verify a possible design of the light printer device based on polarization rotation. Alternative designs based on twisted nematic (TN) or ferroelectric liquid crystal (FLC) operational elements are also suggested. The simple construction provides durability and low cost for the ORW e-paper concept, while the possibility of 3D images on ORW electronic paper is demonstrated in principle.

  6. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  7. Rewritable azobenzene polyester for polarization holographic data storage

    DEFF Research Database (Denmark)

    Kerekes, A; Sajti, Sz.; Loerincz, Emoeke

    2000-01-01

    Optical storage properties of thin azobenzene side-chain polyester films were examined by polarization holographic measurements. The new amorphous polyester film is a candidate material for a rewritable holographic memory system. Temporal formation of anisotropic and topographic...

  8. Versions of Homer: Translation, fan fiction, and other transformative rewriting

    Directory of Open Access Journals (Sweden)

    Shannon K. Farley

    2016-03-01

    Full Text Available This article posits a paradigm of transformative work that includes translation, adaptation, and fan fiction using the Homeric epics as a case study. A chronological discussion of translations, other literary rewritings, and fan fiction distinguishes each as belonging to its respective cultural system while participating in a common form of transformative rewriting. Such a close look at the distinctive ways that Homer has been rewritten throughout history helps us to make a scholarly distinction between the work of fan writers and the work of rewriters like Vergil and Alexander Pope. At the same time, discussing the ways in which the forms of their rewritings are similar gives a scholarly basis for arguing that fan fiction participates in the discourse of serious interpretive literature.

  9. Constructors, Sufficient Completeness, and Deadlock Freedom of Rewrite Theories

    Science.gov (United States)

    Rocha, Camilo; Meseguer, José

    Sufficient completeness has been thoroughly studied for equational specifications, where function symbols are classified into constructors and defined symbols. But what should sufficient completeness mean for a rewrite theory R = (Σ,E,R) with equations E and non-equational rules R describing concurrent transitions in a system? This work argues that a rewrite theory naturally has two notions of constructor: the usual one for its equations E, and a different one for its rules R. The sufficient completeness of constructors for the rules R turns out to be intimately related with deadlock freedom, i.e., R has no deadlocks outside the constructors for R. The relation between these two notions is studied in the setting of unconditional order-sorted rewrite theories. Sufficient conditions are given allowing the automatic checking of sufficient completeness, deadlock freedom, and other related properties, by propositional tree automata modulo equational axioms such as associativity, commutativity, and identity. They are used to extend the Maude Sufficient Completeness Checker from the checking of equational theories to that of both equational and rewrite theories. Finally, the usefulness of the proposed notion of constructors in proving inductive theorems about the reachability rewrite relation →_R associated with a rewrite theory R (and also about the joinability relation ↓_R) is both characterized and illustrated with an example.

  10. Feasibility of biochemical verification in a web-based smoking cessation study.

    Science.gov (United States)

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

    Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. The algorithm for automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
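
    The pattern-to-formula step described above can be pictured as filling temporal-logic templates per workflow pattern and conjoining the results into one logical specification. The templates below (sequence and exclusive choice) are illustrative assumptions, not the exact ones used by the cited system; G is read as "always", F as "eventually".

```python
# Illustrative workflow-pattern templates rendered as LTL formula strings
# (G = always, F = eventually, -> = implication).  The concrete templates used
# by the cited system are not reproduced here; these are assumptions.

def seq_pattern(a, b):
    """Sequence pattern: whenever task a completes, task b eventually runs."""
    return f"G({a} -> F({b}))"

def xor_pattern(cond, a, b):
    """Exclusive choice: the condition selects exactly one of the two branches."""
    return f"G(({cond} -> F({a})) & (!{cond} -> F({b})))"

def conjoin(formulas):
    """The workflow's logical specification is the conjunction of its pattern formulas."""
    return " & ".join(f"({f})" for f in formulas)

spec = conjoin([
    seq_pattern("ReceiveOrder", "CheckStock"),
    xor_pattern("InStock", "ShipOrder", "NotifyCustomer"),
    seq_pattern("ShipOrder", "SendInvoice"),
])
print(spec)
```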

  12. Time-Contrastive Learning Based DNN Bottleneck Features for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2017-01-01

    In this paper, we present a time-contrastive learning (TCL) based bottleneck (BN) feature extraction method for speech signals with an application to text-dependent (TD) speaker verification (SV). It is well-known that speech signals exhibit quasi-stationary behavior in and only in a short interval......, and the TCL method aims to exploit this temporal structure. More specifically, it trains deep neural networks (DNNs) to discriminate temporal events obtained by uniformly segmenting speech signals, in contrast to existing DNN based BN feature extraction methods that train DNNs using labeled data...... to discriminate speakers or pass-phrases or phones or a combination of them. In the context of speaker verification, speech data of fixed pass-phrases are used for TCL-BN training, while the pass-phrases used for TCL-BN training are excluded from being used for SV, so that the learned features can be considered...

  13. Streaming-based verification of XML signatures in SOAP messages

    DEFF Research Database (Denmark)

    Somorovsky, Juraj; Jensen, Meiko; Schwenk, Jörg

    2010-01-01

    approach for XML processing, the Web Services servers easily become a target of Denial-of-Service attacks. We present a solution for these problems: an external streaming-based WS-Security Gateway. Our implementation is capable of processing XML Signatures in SOAP messages using a streaming-based approach...

  14. Intelligent Tools for Planning Knowledge base Development and Verification

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  15. [Verification of Learning Effects by Team-based Learning].

    Science.gov (United States)

    Ono, Shin-Ichi; Ito, Yoshihisa; Ishige, Kumiko; Inokuchi, Norio; Kosuge, Yasuhiro; Asami, Satoru; Izumisawa, Megumi; Kobayashi, Hiroko; Hayashi, Hiroyuki; Suzuki, Takashi; Kishikawa, Yukinaga; Hata, Harumi; Kose, Eiji; Tabata, Kei-Ichi

    2017-11-01

    It has been recommended by the Central Council for Education that active learning methods, such as team-based learning (TBL) and problem-based learning (PBL), be introduced into university classes. Accordingly, for the past 3 years, we have implemented TBL in a medical therapeutics course for 4th-year students. Based upon our experience, TBL is characterized as follows: TBL needs fewer teachers than PBL to conduct a module; TBL enables both students and teachers to recognize and confirm the learning results from preparation and reviewing; and TBL fosters students' responsibility for themselves and their teams, and likely facilitates learning activities through peer assessment.

  16. Specification and Verification of GPGPU programs using Permission-based Separation logic

    OpenAIRE

    Huisman, Marieke; Mihelcic, M.

    2013-01-01

    Graphics Processing Units (GPUs) are increasingly used for general-purpose applications because of their low price, energy efficiency and enormous computing power. Considering the importance of GPU applications, it is vital that the behaviour of GPU programs can be specified and proven correct formally. This paper presents our ideas how to verify GPU programs written in OpenCL, a platform-independent low-level programming language. Our verification approach is modular, based on permission-bas...

  17. Acoustic-based proton range verification in heterogeneous tissue: simulation studies

    Science.gov (United States)

    Jones, Kevin C.; Nie, Wei; Chu, James C. H.; Turian, Julius V.; Kassaee, Alireza; Sehgal, Chandra M.; Avery, Stephen

    2018-01-01

    Acoustic-based proton range verification (protoacoustics) is a potential in vivo technique for determining the Bragg peak position. Previous measurements and simulations have been restricted to homogeneous water tanks. Here, a CT-based simulation method is proposed and applied to a liver and a prostate case to model the effects of tissue heterogeneity on the protoacoustic amplitude and time-of-flight range verification accuracy. For the liver case, posterior irradiation with a single proton pencil beam was simulated for detectors placed on the skin. In the prostate case, a transrectal probe measured the protoacoustic pressure generated by irradiation with five separate anterior proton beams. After calculating the proton beam dose deposition, each CT voxel’s material properties were mapped based on Hounsfield Unit values, and thermoacoustically generated acoustic wave propagation was simulated with the k-Wave MATLAB toolbox. By comparing the simulation results for the original liver CT to homogenized variants, the effects of heterogeneity were assessed. For the liver case, 1.4 cGy of dose at the Bragg peak generated 50 mPa of pressure (13 cm distal), an amplitude 2× lower than that simulated in a homogeneous water tank. Protoacoustic triangulation of the Bragg peak based on multiple detector measurements resulted in 0.4 mm accuracy for a δ-function proton pulse irradiation of the liver. For the prostate case, higher amplitudes are simulated (92–1004 mPa) for closer detectors. Extending protoacoustic range verification to heterogeneous tissue will result in decreased signal amplitudes relative to homogeneous water tank measurements, but accurate range verification is still expected to be possible.
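
    Time-of-flight range verification, as described above, reduces in its simplest form to multilateration of the acoustic source from detector arrival times. The sketch below assumes a single uniform speed of sound and noiseless arrival times, which sidesteps the tissue heterogeneity that the cited k-Wave simulations model; the detector layout and the 1540 m/s value are assumptions.

```python
import numpy as np

def locate_bragg_peak(detectors, arrival_times, c=1540.0):
    """Least-squares multilateration of the acoustic source (Bragg peak).

    detectors     : (N, 3) detector positions in metres
    arrival_times : (N,) arrival times in seconds, measured from the proton pulse
    c             : assumed uniform speed of sound in m/s (the cited simulations
                    model heterogeneous tissue, which this sketch ignores)
    """
    P = np.asarray(detectors, float)
    r = c * np.asarray(arrival_times, float)              # range to each detector
    # Linearise ||x - p_i||^2 = r_i^2 by subtracting the first detector's equation.
    A = -2.0 * (P[1:] - P[0])
    b = (r[1:] ** 2 - r[0] ** 2
         - np.sum(P[1:] ** 2, axis=1) + np.sum(P[0] ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Toy check: skin detectors around a known source position, noiseless times.
true_source = np.array([0.02, 0.01, 0.13])
detectors = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.02], [0.0, 0.1, 0.05],
                      [0.1, 0.1, 0.2], [-0.05, 0.05, 0.1]])
times = np.linalg.norm(detectors - true_source, axis=1) / 1540.0
print(locate_bragg_peak(detectors, times))                 # ~ [0.02, 0.01, 0.13]
```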

  18. Formal verification of software-based medical devices considering medical guidelines.

    Science.gov (United States)

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one

  19. Verification Based on Set-Abstraction Using the AIF Framework

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander

    The AIF framework is a novel method for analyzing advanced security protocols, web services, and APIs, based on a new abstract interpretation method. It consists of the specification language AIF and a translation/abstraction process that produces a set of first-order Horn clauses. These can...

  20. The Application of GeoRSC Based on Domestic Satellite in Field Remote Sensing Anomaly Verification

    Science.gov (United States)

    Gao, Ting; Yang, Min; Han, Haihui; Li, Jianqiang; Yi, Huan

    2016-11-01

    GeoRSC is a digital remote sensing survey system based on domestic satellites. Using it, we carried out a field application test of remote sensing anomaly verification in the Nachitai area of Qinghai. The field test checked the system installation, the stability of system operation, the efficiency of reading and displaying remote sensing images and vector data, the security of the data management system, and the accuracy of BeiDou navigation. The test data indicate that the hardware and software system can satisfy field remote sensing anomaly verification work, make the remote sensing survey workflow more convenient, and improve work efficiency. At the same time, some shortcomings of the system were found in the course of the experiment, and suggestions for improvement are given in combination with the practical work.

  1. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Directory of Open Access Journals (Sweden)

    Raquel Acero

    2016-11-01

    Full Text Available This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances can be created without the need to use a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  2. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances can be created without the need to use a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.
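
    The virtual-distance idea above can be illustrated with a simplified platform model: points measured by the AACMM in the platform frame are mapped to a fixed global frame through the transform of each indexed position, and distances are then taken between the transformed points. The pure 60-degree rotation used below is an assumption for illustration; the real IMP transform is a calibrated model driven by its capacitive sensors.

```python
import numpy as np

def platform_transform(position_index, step_deg=60.0):
    """Homogeneous transform for one of the six indexed platform positions.

    A pure rotation about the platform axis is assumed here; the real indexed
    metrology platform uses a calibrated model driven by its capacitive sensors.
    """
    a = np.deg2rad(step_deg * position_index)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    return T

def to_global(point_local, position_index):
    """Express a point measured by the AACMM (platform frame) in the fixed global frame."""
    p = np.append(np.asarray(point_local, float), 1.0)
    return (platform_transform(position_index) @ p)[:3]

def virtual_distance(p1_local, idx1, p2_local, idx2):
    """Distance between two virtual points measured from different platform positions."""
    return np.linalg.norm(to_global(p1_local, idx1) - to_global(p2_local, idx2))

# Toy usage: ball-bar sphere centres measured from platform positions 0 and 2,
# compared in the global frame to obtain a virtual reference distance (metres).
print(virtual_distance([0.30, 0.10, 0.05], 0, [0.45, 0.12, 0.05], 2))
```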

  3. Multimodal human verification using stereo-based 3D information, IR, and speech

    Science.gov (United States)

    Park, Changhan

    2007-04-01

    In this paper, we propose a personal verification method using 3D face information, infrared (IR) imagery, and speech to improve on the performance of single-biometric authentication. The false acceptance rate (FAR) and false rejection rate (FRR) have been a fundamental bottleneck of real-time personal verification. The proposed method uses principal component analysis (PCA) for face recognition and a hidden Markov model (HMM) for speech recognition, based on a stereo acquisition system with IR imagery. The 3D face information captures the face's depth and distance using the stereo system. The proposed system consists of eye detection, facial pose direction estimation, and PCA modules. An IR image of the human face presents its unique heat signature and can be used for recognition; here, IR images are used only to decide whether a human face is present. Fuzzy logic is used for the final verification decision. Experimental results show that the proposed system can reduce the FAR, which indicates that the method overcomes the limitation of single-biometric systems and provides stable person authentication in real time.

  4. Research on Linux Trusted Boot Method Based on Reverse Integrity Verification

    Directory of Open Access Journals (Sweden)

    Chenlin Huang

    2016-01-01

    Full Text Available Trusted computing aims to build a trusted computing environment for information systems with the help of the secure hardware TPM, which has been proved to be an effective way to counter network security threats. However, TPM chips are not yet widely deployed in most computing devices, which limits the applied scope of trusted computing technology. To solve the problem of missing trusted hardware on existing computing platforms, an alternative security hardware, USBKey, is introduced in this paper to simulate the basic functions of a TPM, and a new reverse USBKey-based integrity verification model is proposed to implement reverse integrity verification of the operating system boot process, which can achieve the effect of trusted boot of the operating system in end systems without TPMs. A Linux operating system booting method based on reverse integrity verification is designed and implemented in this paper, with which the integrity of data and executable files in the operating system is verified and protected during the trusted boot process phase by phase. It implements trusted boot of the operating system without a TPM and supports remote attestation of the platform. Enhanced by our method, the flexibility of trusted computing technology is greatly improved and it becomes possible for trusted computing to be applied in large-scale computing environments.
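
    At its core, phase-by-phase integrity verification of a boot chain amounts to hashing each stage's components and comparing the digests against protected reference values. The sketch below demonstrates that comparison on temporary files standing in for boot components; how the USBKey stores, signs, and releases the reference digests, and the "reverse" ordering of the cited model, are not reproduced here.

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk=1 << 20):
    """SHA-256 digest of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify_phase(paths, reference):
    """Return True only if every component of this boot phase matches its stored digest."""
    for p in paths:
        if sha256_of(p) != reference.get(p):
            print(f"integrity violation: {p}")
            return False
    return True

# Demo with temporary files standing in for boot components (kernel, initrd, init).
# In the cited design the reference digests would be stored on, and protected by,
# the USBKey rather than in a plain dictionary.
tmp = tempfile.mkdtemp()
reference = {}
for name, payload in [("vmlinuz", b"kernel image"), ("initrd.img", b"initial ramdisk"),
                      ("init", b"early userspace")]:
    path = os.path.join(tmp, name)
    with open(path, "wb") as f:
        f.write(payload)
    reference[path] = sha256_of(path)           # enrollment: record reference digests

paths = list(reference)
if verify_phase(paths[:2], reference) and verify_phase(paths[2:], reference):
    print("trusted boot chain verified")

# Tampering with a component is detected on the next verification pass.
with open(paths[0], "ab") as f:
    f.write(b" tampered")
print(verify_phase(paths[:2], reference))       # False
```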

  5. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Juntendo University, Hongo, Tokyo (Japan); Hongo, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Tsukuba University, Tsukuba, Ibaraki (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Hashimoto, H [Shonan Fujisawa Tokushukai Hospital, Fujisawa, Kanagawa (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: There have been few reports on independent dose verification for Tomotherapy. We evaluated the accuracy and the effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; the system implements a Clarkson-based dose calculation algorithm using the CT image dataset. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan was converted into MU and MLC position information at more finely segmented control points. The performance of the SMU was assessed by point dose measurements in non-IMRT and IMRT plans (simple target and mock prostate plans). Subsequently, 30 patients’ treatment plans for prostate were compared. Results: From the comparison, dose differences between the SMU and the measurement were within 3% for all cases in non-IMRT plans. In the IMRT plan for the simple target, the differences (Average±1SD) were −0.70±1.10% (SMU vs. TPS), −0.40±0.10% (measurement vs. TPS) and −1.20±1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40±0.60% (SMU vs. TPS), −0.50±0.90% (measurement vs. TPS) and −0.90±0.60% (measurement vs. SMU), respectively. For patients’ plans, the difference was −0.50±2.10% (SMU vs. TPS). Conclusion: Clarkson-based independent dose verification for Tomotherapy can be clinically available as a secondary check, with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
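
    Clarkson-style calculation decomposes the irregular field around the calculation point into equal angular sectors and averages a radius-dependent beam quantity over them. The sketch below averages an interpolated TMR table over sector radii and applies an assumed reference output; the table values, sector count, and output factor are illustrative assumptions, not Tomotherapy beam data, and the full SMU algorithm involves more factors than this.

```python
import numpy as np

def clarkson_average(radii_cm, tmr_table):
    """Average a radius-dependent beam quantity (here TMR at a fixed depth) over
    equal angular sectors of an irregular field, in the spirit of Clarkson
    sector integration.

    radii_cm  : distance from the calculation point to the field edge per sector
    tmr_table : (radius_cm, value) pairs for circular fields, linearly interpolated
    """
    rs, vals = zip(*tmr_table)
    return float(np.mean(np.interp(radii_cm, rs, vals)))

# Illustrative beam data (NOT measured Tomotherapy data): TMR at 10 cm depth
# for circular fields of increasing radius.
tmr_table = [(0.5, 0.60), (1.0, 0.66), (2.0, 0.72), (3.0, 0.75), (5.0, 0.78)]

# 36 sectors of 10 degrees describing an irregular field around the point.
radii = 1.0 + 0.8 * np.abs(np.sin(np.linspace(0, 2 * np.pi, 36, endpoint=False)))

tmr_eff = clarkson_average(radii, tmr_table)
mu = 100.0                  # monitor units
output_cgy_per_mu = 0.8     # assumed reference output, cGy/MU
print(f"effective TMR = {tmr_eff:.3f}, point dose ~ {mu * output_cgy_per_mu * tmr_eff:.1f} cGy")
```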

  6. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend Frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three-sector blocking in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB code. Results: Gamma index analysis of the film measurements in comparison with the TPS-calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.
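
    The gamma analysis mentioned above combines a dose-difference criterion with a distance-to-agreement criterion (here 1%/1 mm). A brute-force 1D version is sketched below on synthetic profiles; clinical film analysis is 2D and includes film calibration and registration, which this sketch omits.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.01, dta=1.0):
    """Brute-force 1D gamma index (global dose-difference fraction dd, DTA in mm).

    x_ref/d_ref   : positions (mm) and doses of the reference (TPS) profile
    x_eval/d_eval : positions and doses of the evaluated (film) profile
    Returns the gamma value at every reference point.
    """
    d_norm = dd * np.max(d_ref)                 # global dose normalisation
    gammas = np.empty(len(d_ref), dtype=float)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        cost = ((x_eval - xr) / dta) ** 2 + ((d_eval - dr) / d_norm) ** 2
        gammas[i] = np.sqrt(cost.min())
    return gammas

# Toy profiles: the "film" profile is the TPS profile shifted by 0.4 mm.
x = np.linspace(-30, 30, 241)                   # 0.25 mm grid
tps = 100 * np.exp(-(x / 12.0) ** 2)
film = 100 * np.exp(-((x - 0.4) / 12.0) ** 2)
g = gamma_1d(x, tps, x, film, dd=0.01, dta=1.0)
print(f"pass rate (gamma <= 1): {100 * np.mean(g <= 1):.1f} %")
```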

  7. (Re)Writing Civics in the Digital Age: The Role of Social Media in Student (Dis)Engagement

    Science.gov (United States)

    Portman Daley, Joannah

    2012-01-01

    (Re)Writing Civics in the Digital Age: The Role of Social Media in Student (Dis)Engagement addresses an important gap in the knowledge of civic rhetoric available in Rhetoric and Composition by using qualitative methods to explore the parameters of civic engagement through social media-based digital writing. With funding from URI's Office of…

  8. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.

  9. A methodology for model-based development and automated verification of software for aerospace systems

    Science.gov (United States)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significant higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking become more popular. The latter examine the whole state space and, consequently, result in a full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely yet used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance to the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the tool SCADE Suite (Esterel Technology), an integrated design environment that covers all the requirements for our methodology. The SCADE Suite is well established in avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  10. Development and verification of an agent-based model of opinion leadership.

    Science.gov (United States)

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model was performing, consistent with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The
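
    The kind of agent-based model verified above can be pictured as uncertain agents repeatedly seeking opinions from the colleague they perceive as most credible, with frequently consulted agents emerging as opinion resources over time. The toy loop below is an assumption-laden sketch for intuition only; it is not the cited model, its attributes, or its parameter sweep.

```python
import random

class Nurse:
    """Toy agent with the two attributes highlighted above: credibility and uncertainty."""
    def __init__(self, name, credibility, uncertainty):
        self.name = name
        self.credibility = credibility      # 0..1, as perceived by colleagues
        self.uncertainty = uncertainty      # 0..1, probability of seeking an opinion
        self.times_consulted = 0

def step(agents, rng):
    """One tick: uncertain agents seek the most credible colleague's opinion."""
    for seeker in agents:
        if rng.random() < seeker.uncertainty:
            resource = max((a for a in agents if a is not seeker),
                           key=lambda a: a.credibility)
            resource.times_consulted += 1
            # Being advised slightly reduces the seeker's uncertainty.
            seeker.uncertainty = max(0.05, seeker.uncertainty - 0.02)

rng = random.Random(1)
unit = [Nurse(f"N{i}", rng.random(), rng.uniform(0.3, 0.9)) for i in range(20)]
for _ in range(100):
    step(unit, rng)

leader = max(unit, key=lambda a: a.times_consulted)
print(f"emergent opinion resource: {leader.name}, consulted {leader.times_consulted} times")
```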

  11. VERIFICATION OF GRAPHEMES USING NEURAL NETWORKS IN AN HMM-BASED ON-LINE KOREAN HANDWRITING RECOGNITION SYSTEM

    NARCIS (Netherlands)

    So, S.J.; Kim, J.; Kim, J.H.

    2004-01-01

    This paper presents a neural network based verification method in an HMM-based on-line Korean handwriting recognition system. It penalizes unreasonable grapheme hypotheses and complements global and structural information to the HMM-based recognition system, which is intrinsically based on local

  12. Journalism's Rewriting of History in Reporting the Arab Spring

    DEFF Research Database (Denmark)

    Jørndrup, Hanne

    2012-01-01

    and circumstances that put Tunisia and Egypt on the Danish media’s agenda in the year before the Arab revolutions as a starting point. The central point of this comparison is to convey how journalism, while describing contemporary events of The Arab Spring, at the same time rewrites its own prior commentary...... on the region. Rewriting history in this way gives journalism a neutral and unassailable position as observer of events of world-wide importance, but it brings in its train other problems with staying true to both the readers and to unfolding events...

  13. Journalism's Rewriting of History in Reporting the Arab Spring

    DEFF Research Database (Denmark)

    Jørndrup, Hanne

    2012-01-01

    Investigation of journalism’s role as writer and rewriter of the record of political episodes of world importance is central to this article, which takes an empirical approach in choosing the Danish press coverage of The Arab Spring as its starting point. The article analyses how a number...... and circumstances that put Tunisia and Egypt on the Danish media’s agenda in the year before the Arab revolutions as a starting point. The central point of this comparison is to convey how journalism, while describing contemporary events of The Arab Spring, at the same time rewrites its own prior commentary...

  14. MRI-based treatment planning and dose delivery verification for intraocular melanoma brachytherapy.

    Science.gov (United States)

    Zoberi, Jacqueline Esthappan; Garcia-Ramirez, Jose; Hedrick, Samantha; Rodriguez, Vivian; Bertelsman, Carol G; Mackey, Stacie; Hu, Yanle; Gach, H Michael; Rao, P Kumar; Grigsby, Perry W

    2017-08-14

    Episcleral plaque brachytherapy (EPB) planning is conventionally based on approximations of the implant geometry with no volumetric imaging following plaque implantation. We have developed an MRI-based technique for EPB treatment planning and dose delivery verification based on the actual patient-specific geometry. MR images of 6 patients, prescribed 85 Gy over 96 hours from Collaborative Ocular Melanoma Study-based EPB, were acquired before and after implantation. Preimplant and postimplant scans were used to generate "preplans" and "postplans", respectively. In the preplans, a digital plaque model was positioned relative to the tumor, sclera, and nerve. In the postplans, the same plaque model was positioned based on the imaged plaque. Plaque position, point doses, percentage of tumor volume receiving 85 Gy (V100), and dose to 100% of tumor volume (Dmin) were compared between preplans and postplans. All isodose plans were computed using TG-43 formalism with no heterogeneity corrections. Shifts and tilts of the plaque ranged from 1.4 to 8.6 mm and 1.0 to 3.8 mm, respectively. V100 was ≥97% for 4 patients. Dmin for preplans and postplans ranged from 83 to 118 Gy and 45 to 110 Gy, respectively. Point doses for tumor apex and base were all found to decrease from the preimplant to the postimplant plan, with mean differences of 16.7 ± 8.6% and 30.5 ± 11.3%, respectively. By implementing MRI for EPB, we eliminate reliance on approximations of the eye and tumor shape and the assumption of idealized plaque placement. With MRI, one can perform preimplant as well as postimplant imaging, facilitating EPB treatment planning based on the actual patient-specific geometry and dose-delivery verification based on the imaged plaque position. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
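
    The plans above were computed with the TG-43 formalism and no heterogeneity corrections. In its point-source approximation the dose rate is D_dot(r) = S_K * Lambda * (r0/r)^2 * g_P(r) * phi_an(r), which the sketch below evaluates; the radial dose function values, source strength, and anisotropy factor used here are illustrative assumptions, not consensus data for any specific seed model or for the COMS plaque geometry.

```python
import numpy as np

def tg43_point_dose_rate(r_cm, S_K, Lambda, g_table, phi_an=1.0, r0=1.0):
    """TG-43 point-source approximation of the dose rate at distance r (cm):

        D_dot(r) = S_K * Lambda * (r0 / r)**2 * g_P(r) * phi_an(r)

    S_K in U (cGy*cm^2/h), Lambda in cGy/(h*U); g_P is interpolated from
    (radius, value) pairs.  The numbers below are illustrative, not consensus
    data for any specific seed model.
    """
    rs, gs = zip(*g_table)
    g = np.interp(r_cm, rs, gs)
    return S_K * Lambda * (r0 / r_cm) ** 2 * g * phi_an

# Illustrative low-energy seed parameters (assumptions).
g_table = [(0.5, 1.05), (1.0, 1.00), (2.0, 0.85), (3.0, 0.70), (5.0, 0.45)]
S_K, Lambda = 3.0, 0.96

for r in (0.5, 1.0, 2.0, 5.0):
    print(f"r = {r:.1f} cm : {tg43_point_dose_rate(r, S_K, Lambda, g_table):.3f} cGy/h")
```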

  15. Weak convergence and uniform normalization in infinitary rewriting

    DEFF Research Database (Denmark)

    Simonsen, Jakob Grue

    2010-01-01

    We study infinitary term rewriting systems containing finitely many rules. For these, we show that if a weakly convergent reduction is not strongly convergent, it contains a term that reduces to itself in one step (but the step itself need not be part of the reduction). Using this result, we prov...

  16. Rewriting Citizenship? Civic Education in Costa Rica and Argentina

    Science.gov (United States)

    Suarez, David F.

    2008-01-01

    To what degree are nations "rewriting" citizenship by expanding discussions of human rights, diversity and cultural pluralism in modern civic education, and what explains variation between countries? This study addresses these issues by analysing the intended content of civic education in Costa Rica and Argentina. Over time, civic…

  17. Politics of rewriting: what did Achebe really do?

    African Journals Online (AJOL)

    empire fights back".4 Suggestive of writing back (rewriting) as "fighting back" in the aftennath of the colossal encounter of ... novelty which is part of the theoretical concerns in this essay. The open-ended has ..... and writing English)? The fact is that the Whites attitude that the African is inhu- man is so fixed that the very ...

  18. 77 FR 59581 - Personal Identity Verification, Release and Handling of Restricted Information, Protection of the...

    Science.gov (United States)

    2012-09-28

    ... Personal Identity Verification, Release and Handling of Restricted Information, Protection of the Florida... under the rewrite project. FOR FURTHER INFORMATION CONTACT: Leigh Pomponio, NASA, Office of [email protected] . SUPPLEMENTARY INFORMATION: A. Background NASA published three proposed rules to make...

  19. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  20. In vivo dose verification method in catheter based high dose rate brachytherapy.

    Science.gov (United States)

    Jaselskė, Evelina; Adlienė, Diana; Rudžianskas, Viktoras; Urbonavičius, Benas Gabrielis; Inčiūra, Arturas

    2017-12-01

    In vivo dosimetry is a powerful tool for dose verification in radiotherapy. Its application in high dose rate (HDR) brachytherapy is usually limited to the estimation of gross errors, due to the inability of the dosimetry system/method to record the non-uniform dose distribution in steep dose gradient fields close to the radioactive source. In vivo dose verification in interstitial catheter based HDR brachytherapy is crucial since the treatment is performed by inserting the radioactive source at certain positions within catheters that are pre-implanted into the tumour. We propose an in vivo dose verification method for this type of brachytherapy treatment which is based on the comparison between experimentally measured dose values and theoretical dose values calculated at well-defined locations corresponding to dosemeter positions in the catheter. Dose measurements were performed using TLD 100-H rods (6 mm long, 1 mm diameter) inserted in certain sequences into an additionally pre-implanted dosimetry catheter. The adjustment of dosemeter positioning in the catheter was performed using reconstructed CT scans of the patient with pre-implanted catheters. Doses to three Head&Neck cancer patients and one Breast cancer patient were measured during several randomly selected treatment fractions. It was found that the average experimental dose error varied from 4.02% to 12.93% during independent in vivo dosimetry control measurements for the selected Head&Neck cancer patients, and from 7.17% to 8.63% for the Breast cancer patient. The average experimental dose error was below the AAPM recommended margin of 20% and did not exceed the measurement uncertainty of 17.87% estimated for this type of dosemeter. A tendency of slightly increasing average dose error was observed in each subsequent treatment fraction of the same patient. It was linked to changes in the theoretically estimated dosemeter positions due to possible organ movement between different treatment fractions, since catheter reconstruction was

  1. SU-E-T-278: Realization of Dose Verification Tool for IMRT Plan Based On DPM

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Jinfeng; Cao, Ruifen; Dai, Yumei; Pei, Xi; Hu, Liqin [Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui (China); LIN, Hui [Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui (China); School of Electronic Science and Application Physics, Hefei University of Technology, Hefei, Anhui (China); Zhang, Jun [University of Science and Technology of China, Hefei, Anhui (China)

    2014-06-01

Purpose: To build a Monte Carlo dose verification tool for IMRT plans by implementing an irradiation source model into the DPM code, and to extend the ability of DPM to calculate arbitrary incident angles and irregular, inhomogeneous fields. Methods: The virtual source and the energy spectrum unfolded from accelerator measurement data were combined with optimized intensity maps to calculate the dose distribution of the irradiated irregular, inhomogeneous field. The irradiation source model of the accelerator was substituted by a grid-based surface source. The contour and the intensity distribution of the surface source were optimized by the ARTS (Accurate/Advanced Radiotherapy System) optimization module based on the tumor configuration. The weight of each emitter was decided by the grid intensity. The direction of each emitter was decided by the combination of the virtual source and the emitter's emitting position. The photon energy spectrum unfolded from the accelerator measurement data was adjusted by compensating for the contaminant electron source. For verification, measured data and a realistic clinical IMRT plan were compared with the DPM dose calculations. Results: The regular field was verified by comparison with the measured data, and the differences were acceptable (<2% inside the field, 2–3 mm in the penumbra). The dose calculation of the irregular field by DPM simulation was also compared with that of FSPB (Finite Size Pencil Beam), and the passing rate of the gamma analysis was 95.1% for peripheral lung cancer. The regular field and the irregular rotational field were all within the permitted error range. The computing time for regular fields was less than 2 h, and the peripheral lung cancer test took 160 min. Through parallel processing, the adapted DPM could complete the calculation of an IMRT plan within half an hour. Conclusion: The adapted parallelized DPM code with the irradiation source model is faster than classic Monte Carlo codes. Its computational accuracy and

  2. PLM-based Approach for Design Verification and Validation using Manufacturing Process Knowledge

    Directory of Open Access Journals (Sweden)

    Luis Toussaint

    2010-02-01

Full Text Available Out of 100 hours of engineering work, only 20 are dedicated to real engineering and 80 are spent on what is considered routine activity. Readjusting the ratio of innovative vs. routine work is a considerable challenge in the product lifecycle management (PLM) strategy. Therefore, the main objective is to develop an approach to accelerate routine processes in engineering design. The proposed methodology, called FabK, consists of capturing manufacturing knowledge and applying it to the design verification and validation of new engineering designs. The approach is implemented in a Web-based PLM prototype and a Computer Aided Design system. A series of experiments from an industrial case study is introduced to provide significant results.

  3. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  4. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

Since it is difficult to measure absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (object500 connex3, Stratasys, USA). Elemental inks used as the 3D printer material were selected to correspond to mouse tissue. To represent the lung, the selected material was partially combined with an air layer. To verify material equivalence, the phantom material was simply compared with a super-flex bolus in terms of photon attenuation characteristics. In the case of the lung, the Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and verified its photon attenuation characteristics in practice. The fabricated phantom shows tissue equivalence as well as geometry similar to that of a live mouse. As 3D printing techniques continue to advance, 3D printer based small preclinical animal phantoms should increase the reliability of absorbed dose verification in small animals for preclinical studies.

  5. Truth in Complex Adaptive Systems Models Should BE Based on Proof by Constructive Verification

    Science.gov (United States)

    Shipworth, David

    It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. `Emergent' properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.

  6. Rewriting and suppressing UMLS terms for improved biomedical term identification

    Directory of Open Access Journals (Sweden)

    Hettne Kristina M

    2010-03-01

Full Text Available Abstract Background Identification of terms is essential for biomedical text mining. We concentrate here on the use of vocabularies for term identification, specifically the Unified Medical Language System (UMLS). To make the UMLS more suitable for biomedical text mining we implemented and evaluated nine term rewrite and eight term suppression rules. The rules rely on UMLS properties that have been identified in previous work by others, together with an additional set of new properties discovered by our group during our work with the UMLS. Our work complements the earlier work in that we measure the impact of the different rules on the number of terms identified in a MEDLINE corpus. The number of uniquely identified terms and their frequency in MEDLINE were computed before and after applying the rules. The 50 most frequently found terms together with a sample of 100 randomly selected terms were evaluated for every rule. Results Five of the nine rewrite rules were found to generate additional synonyms and spelling variants that correctly corresponded to the meaning of the original terms and seven out of the eight suppression rules were found to suppress only undesired terms. Using the five rewrite rules that passed our evaluation, we were able to identify 1,117,772 new occurrences of 14,784 rewritten terms in MEDLINE. Without the rewriting, we recognized 651,268 terms belonging to 397,414 concepts; with rewriting, we recognized 666,053 terms belonging to 410,823 concepts, which is an increase of 2.8% in the number of terms and an increase of 3.4% in the number of concepts recognized. Using the seven suppression rules, a total of 257,118 undesired terms were suppressed in the UMLS, notably decreasing its size. 7,397 terms were suppressed in the corpus. Conclusions We recommend applying the five rewrite rules and seven suppression rules that passed our evaluation when the UMLS is to be used for biomedical term identification in MEDLINE. A software
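
    A minimal sketch of the rewrite-and-count workflow described above, in Python. The two rewrite rules and the tiny term list are illustrative placeholders, not the actual UMLS rules evaluated in the study; the point is only to show how term variants are generated and how recognition counts are compared before and after rewriting.

```python
# Illustrative sketch: generate rewritten term variants and compare how many
# corpus occurrences are recognized before and after rewriting.
import re

def rule_strip_possessive(term):
    """Rewrite rule (illustrative): drop a possessive, e.g. "Alzheimer's" -> "Alzheimer"."""
    return re.sub(r"'s\b", "", term)

def rule_drop_nos(term):
    """Rewrite rule (illustrative): remove a trailing ", NOS" qualifier."""
    return re.sub(r",\s*NOS$", "", term, flags=re.IGNORECASE)

REWRITE_RULES = [rule_strip_possessive, rule_drop_nos]

def rewritten_variants(term):
    """Original term plus every variant produced by the rewrite rules."""
    variants = {term}
    for rule in REWRITE_RULES:
        variants |= {rule(t) for t in variants}
    return variants

def count_recognized(vocabulary, corpus_terms):
    """Corpus occurrences covered by the vocabulary (exact, case-insensitive match)."""
    vocab = {t.lower() for t in vocabulary}
    return sum(1 for t in corpus_terms if t.lower() in vocab)

original_terms = ["Alzheimer's disease, NOS", "fever"]
corpus = ["alzheimer's disease", "fever", "fever", "alzheimer's disease, nos"]

expanded_terms = set().union(*(rewritten_variants(t) for t in original_terms))
print("recognized before rewriting:", count_recognized(original_terms, corpus))
print("recognized after  rewriting:", count_recognized(expanded_terms, corpus))
```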

  7. Runtime Verification Based on Executable Models: On-the-Fly Matching of Timed Traces

    Directory of Open Access Journals (Sweden)

    Mikhail Chupilko

    2013-03-01

Full Text Available Runtime verification is checking whether a system execution satisfies or violates a given correctness property. A procedure that automatically, and typically on the fly, verifies conformance of the system's behavior to the specified property is called a monitor. Nowadays, a variety of formalisms are used to express properties of the observed behavior of computer systems, and many methods have been proposed to construct monitors. However, advanced formalisms and methods are frequently not needed, because an executable model of the system is available. The original purpose and structure of the model are unimportant; what is required is that the system and its model have similar sets of interfaces. In this case, monitoring is carried out as follows. Two "black boxes", the system and its reference model, are executed in parallel and stimulated with the same input sequences; the monitor dynamically captures their output traces and tries to match them. The main problem is that a model is usually more abstract than the real system, both in terms of functionality and timing. Therefore, trace-to-trace matching is not straightforward and must allow the system to produce events in a different order or even miss some of them. The paper studies on-the-fly conformance relations for timed systems (i.e., systems whose inputs and outputs are distributed along the time axis). It also suggests a practice-oriented methodology for creating and configuring monitors for timed systems based on executable models. The methodology has been successfully applied to a number of industrial simulation-based hardware verification projects.
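
    A minimal sketch, in Python, of the kind of on-the-fly trace matching described above: reference-model events and system events are matched within a bounded reordering window and a timing tolerance. The event format, the tolerance parameters, and the conformance verdict rule are illustrative assumptions, not the authors' tool.

```python
# Illustrative on-the-fly monitor: system events are matched against events
# announced by a more abstract reference model, tolerating bounded reordering
# and bounded timing deviation.
from collections import deque

class TimedTraceMonitor:
    def __init__(self, time_tolerance=2.0, window=5):
        self.expected = deque()          # events announced by the reference model
        self.time_tolerance = time_tolerance
        self.window = window             # how far ahead reordering is allowed
        self.verdict = True

    def model_event(self, name, t):
        self.expected.append((name, t))

    def system_event(self, name, t):
        # Search a bounded window of expected events for a compatible match.
        for i, (exp_name, exp_t) in enumerate(list(self.expected)[: self.window]):
            if exp_name == name and abs(exp_t - t) <= self.time_tolerance:
                del self.expected[i]
                return True
        self.verdict = False             # unexpected or badly timed event
        return False

monitor = TimedTraceMonitor()
for name, t in [("req", 0.0), ("ack", 1.0), ("done", 3.0)]:
    monitor.model_event(name, t)
for name, t in [("ack", 1.5), ("req", 0.4), ("done", 3.2)]:   # reordered but in-window
    monitor.system_event(name, t)
print("conformant:", monitor.verdict and not monitor.expected)
```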

  8. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

Full Text Available The broad use of the handwritten signature for personal verification in financial institutions creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria to judge a signature verification tool in banking and other financial institutions.
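
    A hedged sketch of the feature-matching step described above, using OpenCV's SIFT implementation and Lowe's ratio test. SURF is omitted because it is not available in all OpenCV builds, and the synthetic "signature" images and the match-count comparison are illustrative stand-ins for the real enrolment database and decision rule.

```python
# Illustrative SIFT-based matching step (requires opencv-python >= 4.4).
import cv2
import numpy as np

def synthetic_signature(seed):
    """Draw a random polyline as a stand-in for a scanned, pre-processed signature."""
    rng = np.random.default_rng(seed)
    img = np.full((200, 400), 255, dtype=np.uint8)
    pts = (rng.random((12, 2)) * [390, 190]).astype(np.int32).reshape(-1, 1, 2)
    cv2.polylines(img, [pts], isClosed=False, color=0, thickness=2)
    return img

def good_match_count(img_a, img_b, ratio=0.75):
    """Number of SIFT matches passing Lowe's ratio test between two images."""
    sift = cv2.SIFT_create()
    _, des_a = sift.detectAndCompute(img_a, None)
    _, des_b = sift.detectAndCompute(img_b, None)
    if des_a is None or des_b is None or len(des_b) < 2:
        return 0
    matches = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
    return sum(1 for pair in matches
               if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance)

reference = synthetic_signature(seed=1)   # template stored for the customer
claimed = synthetic_signature(seed=1)     # signature presented for verification
forgery = synthetic_signature(seed=7)     # a different writer's signature

# A real verifier would threshold these counts (threshold chosen on training data).
print("genuine match count:", good_match_count(reference, claimed))
print("forgery match count:", good_match_count(reference, forgery))
```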

  9. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    Science.gov (United States)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: a) use of a scenario based neodeterministic approach (NDSHA) for the calculation of the seismic input, and b) control of the numerical modeling of an existing building, using free vibration measurements of the real structure. The key point of this approach is the strict collaboration, from the seismic input definition to the monitoring of the response of the building in the calculation phase, of the seismologist and the civil engineer. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then, the realistic values of spectral acceleration, which include the appropriate amplification obtained through the modeling of a "scenario" input to be applied to the final model, can be selected. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced by taking the spectra from the national codes (i.e. NTC 2008, for Italy). The task of the verifier engineer is to act so that the solution of the verification is conservative and realistic. We show some examples of the application of the procedure to some relevant (e.g. schools) buildings of the Trieste Province. The adoption of the scenario input has given in most of the cases an increase of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the increase of elements to reinforce is reasonable, especially considering the important reduction of the risk level.

  10. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Azmy, Yousry [North Carolina State Univ., Raleigh, NC (United States); Wang, Yaqi [North Carolina State Univ., Raleigh, NC (United States)

    2013-12-20

The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code’s numerical solution and its reference counterpart. The latter is either analytic or very fine-mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory’s Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.
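
    A minimal sketch of the mesh-refinement convergence check described in the first verification phase: the observed order of accuracy is estimated from integral error norms on successively refined meshes. The error values below are illustrative, not results from the transport code.

```python
# Observed order of accuracy from pairs of (mesh size, error norm):
# p = log(E_coarse / E_fine) / log(h_coarse / h_fine).
import math

h = [0.2, 0.1, 0.05, 0.025]                  # characteristic mesh sizes (illustrative)
E = [4.1e-3, 1.05e-3, 2.6e-4, 6.6e-5]        # integral error norms vs. reference (illustrative)

for (h1, e1), (h2, e2) in zip(zip(h, E), zip(h[1:], E[1:])):
    p = math.log(e1 / e2) / math.log(h1 / h2)
    print(f"h: {h1} -> {h2}   observed order ~ {p:.2f}")
```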

  11. Performance evaluation of wavelet-based face verification on a PDA recorded database

    Science.gov (United States)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions, by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios when communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.

  12. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Because blood is easily accessible, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many non-significant species at orders of magnitude higher concentrations. The extreme requirements of measurement sensitivity, dynamic range and specificity make the method development extremely challenging. Current clinical protein biomarker measurement primarily relies on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of high-quality protein antibodies is both expensive and time-consuming. The limited capability of assay multiplexing also makes the measurement an extremely low-throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  13. Halochromic Isoquinoline with Mechanochromic Triphenylamine: Smart Fluorescent Material for Rewritable and Self-Erasable Fluorescent Platform.

    Science.gov (United States)

    Hariharan, Palamarneri Sivaraman; Mothi, Ebrahim M; Moon, Dohyun; Anthony, Savarimuthu Philip

    2016-12-07

Halochromic isoquinoline-attached mechanochromic triphenylamines, N-phenyl-N-(4-(quinolin-2-yl)phenyl)benzenamine (PQPBA) and tris(4-(quinolin-2-yl)phenyl)amine (TQPA), are smart fluorescent materials that exhibit thermo/mechanochromism and tunable solid-state fluorescence; their unusual halochromic response in a PMMA matrix has been used for fabricating rewritable and self-erasable fluorescent platforms. PQPBA and TQPA showed strong fluorescence in solution (Φf = 0.9290 (PQPBA) and 0.9160 (TQPA)) and moderate solid-state fluorescence (Φf = 20% (PQPBA) and 17% (TQPA)). Interestingly, they exhibited a rare temperature-dependent (0-100 °C) positive fluorescence enhancement via activation of radiative vibrational transitions. The deaggregation of PQPBA and TQPA in the PMMA polymer matrix strongly enhanced the fluorescence intensity and yielded strongly blue-fluorescent thin films (Φf = 58% (PQPBA) and 54% (TQPA)). The halochromic isoquinoline has been exploited to demonstrate reversible off-on fluorescence switching by acid (TFA (trifluoroacetic acid)/HCl) and base (NH3) treatment in both solids and PMMA thin films. Importantly, a rewritable and self-erasable fluorescent platform has been achieved by making use of the unusual fluorescence responses of PQPBA/TQPA to TFA/HCl after exposure to NH3. Single crystal and powder X-ray diffraction (PXRD) studies provided insight into the solid-state fluorescence and the external stimuli-induced fluorescence changes.

  14. Fingerprint verification for smart-card holders based on an optical image encryption scheme

    Science.gov (United States)

    Suzuki, Hiroyuki; Yamaya, Taiga; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2003-11-01

Fingerprint verification for smart card holders is one of the methods able to identify smart card holders with a high level of security. However, an ingenious implementation is needed to execute it quickly and safely in the embedded processor, because of its computational burden and the limited performance of the smart card. For this purpose, we propose a hybrid method which is a combination of personal identification number (PIN) verification with a smart card and an optical fingerprint verification method. The result of a preliminary computer simulation to evaluate the proposed system shows that the false acceptance rate is completely zero, though the false rejection rate is slightly inferior to that of a conventional fingerprint verification system.

  15. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  16. Fast 3D dosimetric verifications based on an electronic portal imaging device using a GPU calculation engine

    OpenAIRE

    Zhu, Jinhan; Chen, Lixin; Chen, Along; Luo, Guangwen; Deng, Xiaowu; Liu, Xiaowei

    2015-01-01

    Purpose To use a graphic processing unit (GPU) calculation engine to implement a fast 3D pre-treatment dosimetric verification procedure based on an electronic portal imaging device (EPID). Methods The GPU algorithm includes the deconvolution and convolution method for the fluence-map calculations, the collapsed-cone convolution/superposition (CCCS) algorithm for the 3D dose calculations and the 3D gamma evaluation calculations. The results of the GPU-based CCCS algorithm were compared to tho...

  17. Verification methodology manual for SystemVerilog

    CERN Document Server

    Bergeron, Janick; Hunter, Alan

    2006-01-01

    SystemVerilog is a unified language that serves both design and verification engineers by including RTL design constructs, assertions and a rich set of verification constructs. This book is based upon best verification practices by ARM, Synopsys and their customers. It is useful for those involved in the design or verification of a complex chip.

  18. Computer-aided diagnosis of mammographic masses using geometric verification-based image retrieval

    Science.gov (United States)

    Li, Qingliang; Shi, Weili; Yang, Huamin; Zhang, Huimao; Li, Guoxin; Chen, Tao; Mori, Kensaku; Jiang, Zhengang

    2017-03-01

Computer-Aided Diagnosis targets masses in mammograms, an important indicator of breast cancer. The use of retrieval systems in breast examination is increasing gradually. In this respect, the method of exploiting the vocabulary tree framework and the inverted file for mammographic mass retrieval has been shown to achieve high accuracy and excellent scalability. However, it treats the features in each image merely as visual words and ignores their spatial configuration, which greatly affects retrieval performance. To overcome this drawback, we introduce a geometric verification method for mammographic mass retrieval. First of all, we obtain corresponding matched features based on the vocabulary tree framework and the inverted file. After that, we capture the local similarity of deformations in the local regions by constructing circle regions around the corresponding pairs. Meanwhile, we segment each circle to express the geometric relationship of the local matches in the area and generate a strict spatial encoding. Finally, we judge whether the matched features are correct or not by verifying whether all spatial encodings satisfy geometric consistency. Experiments show the promising results of our approach.

  19. Percentile-based neighborhood precipitation verification and its application to a landfalling tropical storm case with radar data assimilation

    Science.gov (United States)

    Zhu, Kefeng; Yang, Yi; Xue, Ming

    2015-11-01

The traditional threat score based on fixed thresholds for precipitation verification is sensitive to intensity forecast bias. In this study, the neighborhood precipitation threat score is modified by defining the thresholds in terms of the percentiles of overall precipitation instead of fixed threshold values. The impact of intensity forecast bias on the calculated threat score is reduced. The method is tested with the forecasts of a tropical storm that re-intensified after making landfall and caused heavy flooding. The forecasts are produced with and without radar data assimilation. The forecast with assimilation of both radial velocity and reflectivity produces precipitation patterns that better match observations but has a large positive intensity bias. When using fixed thresholds, the neighborhood threat scores fail to yield high scores for forecasts whose patterns match observations well, due to the large intensity bias. In contrast, the percentile-based neighborhood method yields the highest score for the forecast with the best pattern match and the smallest position error. The percentile-based method also yields scores that are more consistent with object-based verifications, which are less sensitive to intensity bias, demonstrating the potential value of percentile-based verification.
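
    A minimal sketch of the percentile-based thresholding idea: the fixed rain threshold is replaced by the corresponding percentile of each field before a simple threat score is computed, so an overall intensity bias no longer penalizes a well-placed forecast. The synthetic fields and the plain (non-neighborhood) threat score are illustrative simplifications.

```python
# Illustrative comparison of fixed-threshold vs. percentile-based threat scores
# on a synthetic, well-placed but intensity-biased forecast.
import numpy as np

def threat_score(fcst, obs, fcst_thr, obs_thr):
    f = fcst >= fcst_thr
    o = obs >= obs_thr
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    return hits / (hits + false_alarms + misses)

rng = np.random.default_rng(0)
obs = rng.gamma(shape=2.0, scale=5.0, size=(200, 200))     # "observed" rain (mm)
fcst = 1.5 * obs + rng.normal(0, 2.0, obs.shape)           # biased but well-placed forecast

fixed_thr = 20.0                                           # mm
pct = 100.0 * np.mean(obs < fixed_thr)                     # same exceedance frequency
ts_fixed = threat_score(fcst, obs, fixed_thr, fixed_thr)
ts_pct = threat_score(fcst, obs, np.percentile(fcst, pct), np.percentile(obs, pct))
print(f"fixed-threshold TS: {ts_fixed:.3f}   percentile-based TS: {ts_pct:.3f}")
```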

  20. Simple thermal to thermal face verification method based on local texture descriptors

    Science.gov (United States)

    Grudzien, A.; Palka, Norbert; Kowalski, M.

    2017-08-01

Biometrics is a science that studies and analyzes the physical structure of the human body and the behaviour of people. Biometrics has found many applications, ranging from border control systems and forensic systems for criminal investigations to access control systems. Unique identifiers, also referred to as modalities, are used to distinguish individuals. One of the most common and natural human identifiers is the face. As a result of decades of investigation, face recognition has achieved a high level of maturity; however, recognition in the visible spectrum is still challenging due to illumination effects and new ways of spoofing. One of the alternatives is recognition of the face in different parts of the light spectrum, e.g. in the infrared spectrum. Thermal infrared offers new possibilities for human recognition due to its specific properties as well as mature equipment. In this paper we present a subject verification methodology using facial images in the thermal range. The study is focused on local feature extraction methods and on similarity metrics. We present a comparison of two local texture-based descriptors for thermal 1-to-1 face recognition.
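
    The abstract does not name the two descriptors that were compared, so the sketch below uses a basic local binary pattern (LBP) histogram with a chi-square distance as an illustrative stand-in for a local texture descriptor in 1-to-1 verification; the synthetic images and any decision threshold are assumptions.

```python
# Illustrative LBP-histogram verification: a lower chi-square distance between
# the enrolled image and the probe indicates a more likely genuine claim.
import numpy as np

def lbp_image(img):
    """Basic 8-neighbour LBP code for each interior pixel."""
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=np.int32)
    for bit, (dy, dx) in enumerate(shifts):
        neigh = img[1 + dy : img.shape[0] - 1 + dy, 1 + dx : img.shape[1] - 1 + dx]
        code += (neigh >= c).astype(np.int32) << bit
    return code

def lbp_histogram(img):
    hist = np.bincount(lbp_image(img).ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def chi_square(h1, h2, eps=1e-10):
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

rng = np.random.default_rng(3)
y, x = np.mgrid[0:120, 0:120]
enrolled = (128 + 60 * np.sin(x / 6.0) * np.cos(y / 9.0)).astype(np.uint8)      # stand-in face
probe_same = np.clip(enrolled + rng.normal(0, 4, enrolled.shape), 0, 255).astype(np.uint8)
probe_other = (128 + 60 * np.sin(x / 3.0 + 1.0) * np.cos(y / 4.0)).astype(np.uint8)

for name, probe in [("same subject", probe_same), ("different subject", probe_other)]:
    d = chi_square(lbp_histogram(enrolled), lbp_histogram(probe))
    print(f"{name}: chi-square distance = {d:.3f}   (lower = more similar)")
```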

  1. Fusion of hand vein, iris and fingerprint for person identity verification based on Bayesian theory

    Science.gov (United States)

    Li, Xiuyan; Liu, Tiegen; Deng, Shichao; Wang, Yunxin

    2009-11-01

Biometric identification is an important guarantee for social security. In recent years, with social and economic development, greater accuracy and safety of identification have been required. Person identity verification systems that use a single biometric exhibit inherent limitations in accuracy, user acceptance and universality. The limitations of unimodal biometric systems can be overcome by using multimodal biometric systems, which combine the conclusions of a number of unrelated biometric indicators. Addressing the limitations of unimodal biometric identification, a recognition algorithm for multimodal biometric fusion based on hand vein, iris and fingerprint is proposed. To verify person identity, the hand vein, iris and fingerprint images were first preprocessed. The region of interest (ROI) of the hand vein image was obtained and filtered to reduce image noise. Multiresolution analysis theory was utilized to extract the texture information of the hand vein. The iris image was preprocessed through iris localization, eyelid detection, image normalization and image enhancement, and then the iris feature code was extracted from the detail images obtained using the wavelet transform. The texture feature information representing the fingerprint pattern was extracted after filtering and image enhancement. Bayes' theorem was employed to realize fusion at the matching score level and the fused recognition result was finally obtained. The experimental results showed that the recognition performance of the proposed fusion method was clearly higher than that of the single-biometric recognition algorithms, verifying the efficiency of the proposed method for biometrics.
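
    A minimal sketch of score-level fusion with Bayes' theorem, as described above: each modality's genuine and impostor score distributions are modelled as Gaussians with illustrative parameters (a real system would estimate them from training data), and the matcher scores are assumed conditionally independent.

```python
# Illustrative Bayesian fusion of three matcher scores at the matching score level.
import math

def gaussian_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

# (genuine mean, genuine std, impostor mean, impostor std) per modality -- assumed values
MODELS = {
    "hand_vein":   (0.80, 0.08, 0.45, 0.12),
    "iris":        (0.85, 0.05, 0.40, 0.10),
    "fingerprint": (0.78, 0.10, 0.50, 0.15),
}

def posterior_genuine(scores, prior_genuine=0.5):
    """P(genuine | scores), assuming conditionally independent matchers."""
    like_genuine = like_impostor = 1.0
    for modality, s in scores.items():
        mg, sg, mi, si = MODELS[modality]
        like_genuine *= gaussian_pdf(s, mg, sg)
        like_impostor *= gaussian_pdf(s, mi, si)
    num = like_genuine * prior_genuine
    return num / (num + like_impostor * (1.0 - prior_genuine))

claim = {"hand_vein": 0.74, "iris": 0.81, "fingerprint": 0.69}
print(f"P(genuine | scores) = {posterior_genuine(claim):.4f}")
```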

  2. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI. The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  3. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Science.gov (United States)

    Yin, Peili; Wang, Jianhua; Lu, Chunxia

    2017-08-01

    Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  4. Journalism's Rewriting of History in Reporting the Arab Spring

    DEFF Research Database (Denmark)

    Jørndrup, Hanne

    2012-01-01

    of historical references to, in particular, European revolutionary history from Eastern Europe in 1989, are woven into the journalistic descriptions of events in Tunisia and Egypt. But the analysis also reflects on journalism’s own historical precedents in that field. Therefore, this paper takes the topics...... and circumstances that put Tunisia and Egypt on the Danish media’s agenda in the year before the Arab revolutions as a starting point. The central point of this comparison is to convey how journalism, while describing contemporary events of The Arab Spring, at the same time rewrites its own prior commentary...

  5. Convex polyhedral abstractions, specialisation and property-based predicate splitting in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    We present an approach to constrained Horn clause (CHC) verification combining three techniques: abstract interpretation over a domain of convex polyhedra, specialisation of the constraints in CHCs using abstract interpretation of query-answer transformed clauses, and refinement by splitting...... predicates. The purpose of the work is to investigate how analysis and transformation tools developed for constraint logic programs (CLP) can be applied to the Horn clause verification problem. Abstract interpretation over convex polyhedra is capable of deriving sophisticated invariants and when used...... in conjunction with specialisation for propagating constraints it can frequently solve challenging verification problems. This is a contribution in itself, but refinement is needed when it fails, and the question of how to refine convex polyhedral analyses has not been studied much. We present a refinement...

  6. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

Highlights: • An example of the life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence and ensure the reliability of a nuclear FPGA-based safety system, life cycle processes of disciplined specification and implementation of the design, as well as verification and validation (V&V) in accordance with regulations, are needed. A specific example of how to conduct the life cycle development process and V&V on an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for a PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and to provide input data to the under-test FPGA-based CHR protection system and a verified C code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  7. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    Science.gov (United States)

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-12-01

The Microgravity active vibration isolation system (MAIS) is a device to reduce on-orbit vibration and to provide a lower gravity level for certain scientific experiments. The MAIS system is made up of a stator and a floater: the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to reduce the vibration transmitted from the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits and a central controller embedded with the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module is added for electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from the accelerometers, measures the displacement between stator and floater from position sensitive detectors, and computes the Lorentz force current for each actuator so as to eliminate the vibration of the scientific payload while avoiding collisions between the stator and the floater. This is a motion control technique in 6 degrees of freedom (6-DOF), and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the chance to join the DLR 27th parabolic flight campaign and perform experiments to verify the 6-DOF control technique. The experiment results validate that the 6-DOF motion control technique is effective, and the vibration isolation performance matches what we expected based on theoretical analysis and simulation. The MAIS has been planned for the Chinese manned spacecraft for many microgravity scientific experiments, and the verification on parabolic flights is very important for its upcoming mission. Additionally, we also tested some additional functions enabled by electromagnetic suspension in microgravity, such as automatic catching and locking and operation in fault mode. The parabolic flights produced much useful data for these experiments.
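
    A very simplified, single-axis sketch of the control idea described above: the gap between stator and floater is measured, and a soft Lorentz-force command re-centres the floater while transmitting only a small fraction of the stator vibration. The mass, the gains and the reduction to one axis are illustrative assumptions; the real MAIS controls six degrees of freedom with eight actuators and also uses accelerometer feedback.

```python
# Illustrative single-axis isolation loop: the suspended floater only feels the
# actuator force, so soft gap-centring gains attenuate the 5 Hz stator vibration.
import numpy as np

dt, T = 0.001, 10.0
t = np.arange(0.0, T, dt)
stator = 0.001 * np.sin(2 * np.pi * 5.0 * t)            # stator position: 1 mm at 5 Hz
stator_acc = -(2 * np.pi * 5.0) ** 2 * stator            # corresponding acceleration

m = 2.0                 # floater mass in kg (assumed)
kp, kd = 8.0, 3.0       # soft gap-centring and damping gains (assumed)

x, v = 0.0, 0.0         # floater position and velocity
floater_acc = np.empty_like(t)
gap = np.empty_like(t)
for i in range(len(t)):
    gap[i] = x - stator[i]            # position-sensor reading (stator-floater gap error)
    force = -kp * gap[i] - kd * v     # Lorentz-force command
    a = force / m                     # the suspended floater only feels the actuator
    v += a * dt
    x += v * dt
    floater_acc[i] = a

print(f"rms stator  acceleration: {np.sqrt(np.mean(stator_acc**2)):.4f} m/s^2")
print(f"rms floater acceleration: {np.sqrt(np.mean(floater_acc**2)):.4f} m/s^2")
print(f"max |gap| (must stay small to avoid contact): {np.max(np.abs(gap)) * 1000:.3f} mm")
```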

  8. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    Science.gov (United States)

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-09-01

The Microgravity active vibration isolation system (MAIS) is a device to reduce on-orbit vibration and to provide a lower gravity level for certain scientific experiments. The MAIS system is made up of a stator and a floater: the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to reduce the vibration transmitted from the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits and a central controller embedded with the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module is added for electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from the accelerometers, measures the displacement between stator and floater from position sensitive detectors, and computes the Lorentz force current for each actuator so as to eliminate the vibration of the scientific payload while avoiding collisions between the stator and the floater. This is a motion control technique in 6 degrees of freedom (6-DOF), and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the chance to join the DLR 27th parabolic flight campaign and perform experiments to verify the 6-DOF control technique. The experiment results validate that the 6-DOF motion control technique is effective, and the vibration isolation performance matches what we expected based on theoretical analysis and simulation. The MAIS has been planned for the Chinese manned spacecraft for many microgravity scientific experiments, and the verification on parabolic flights is very important for its upcoming mission. Additionally, we also tested some additional functions enabled by electromagnetic suspension in microgravity, such as automatic catching and locking and operation in fault mode. The parabolic flights produced much useful data for these experiments.

  9. Addressable configurations of DNA nanostructures for rewritable memory.

    Science.gov (United States)

    Chandrasekaran, Arun Richard; Levchenko, Oksana; Patel, Dhruv S; MacIsaac, Molly; Halvorsen, Ken

    2017-11-02

    DNA serves as nature's information storage molecule, and has been the primary focus of engineered systems for biological computing and data storage. Here we combine recent efforts in DNA self-assembly and toehold-mediated strand displacement to develop a rewritable multi-bit DNA memory system. The system operates by encoding information in distinct and reversible conformations of a DNA nanoswitch and decoding by gel electrophoresis. We demonstrate a 5-bit system capable of writing, erasing, and rewriting binary representations of alphanumeric symbols, as well as compatibility with 'OR' and 'AND' logic operations. Our strategy is simple to implement, requiring only a single mixing step at room temperature for each operation and standard gel electrophoresis to read the data. We envision such systems could find use in covert product labeling and barcoding, as well as secure messaging and authentication when combined with previously developed encryption strategies. Ultimately, this type of memory has exciting potential in biomedical sciences as data storage can be coupled to sensing of biological molecules. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
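
    A minimal sketch of the 5-bit write/erase/rewrite and OR/AND logic described above, with each symbol stored as five on/off states (standing in for the nanoswitch conformations). The particular A=1..Z=26 mapping is an illustrative assumption, not the paper's encoding.

```python
# Illustrative 5-bit memory operations: write, erase, rewrite, and OR/AND logic.

def write(letter):
    """Encode a letter as a 5-bit pattern (tuple of 0/1)."""
    value = ord(letter.upper()) - ord("A") + 1           # A -> 1 ... Z -> 26 (assumed mapping)
    return tuple((value >> i) & 1 for i in range(4, -1, -1))

def read(bits):
    value = int("".join(map(str, bits)), 2)
    return chr(value + ord("A") - 1) if 1 <= value <= 26 else "?"

def erase(_bits):
    return (0, 0, 0, 0, 0)                               # reset all five switches

def bitwise(op, a, b):
    return tuple(op(p, q) for p, q in zip(a, b))

memory = write("K")
print("written  :", memory, "->", read(memory))
memory = erase(memory)
memory = write("R")                                      # rewrite after erasing
print("rewritten:", memory, "->", read(memory))

a, b = write("C"), write("E")                            # 00011 and 00101
or_bits = bitwise(lambda p, q: p | q, a, b)
and_bits = bitwise(lambda p, q: p & q, a, b)
print("C OR  E  :", or_bits, "->", read(or_bits))
print("C AND E  :", and_bits, "->", read(and_bits))
```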

  10. Important Points on Rewriting Old Stories for Children

    Directory of Open Access Journals (Sweden)

    Maryam Jalali

    2015-12-01

    Full Text Available Various works with different genres can be applied to rewrite classic literature. Versified stories, unreal fictions and real stories are identified as texts which can be rewritten in a new texture. The first rewritten texts have been published long ago. The authors and researchers have been more knowledgeable in this regard particularly in the last two decades in order to carry out a remarkable development in this field. It should be considered that there are talented authors who realize the capability of these texts while analyzing them. As a matter of fact, this capability is referred to being recreated and turning into a new texture and structure. It seems as if these myths are supposed to step into a new universe where it is founded on written speech. This recreation is emerged because the society needs to take a look at its past in a modern version because modern children are going to read them. Keywords: Rewriting, Old Story, children’s literature, contemporary

  11. A feasibility study of independent verification of dose calculation for Vero4DRT using a Clarkson-based algorithm.

    Science.gov (United States)

    Yamashita, Mikiko; Takahashi, Ryo; Kokubo, Masaki; Takayama, Kenji; Tanabe, Hiroaki; Sueoka, Masaki; Ishii, Masao; Tachibana, Hidenobu

    2018-01-25

Dose verification for a gimbal-mounted image-guided radiotherapy system, Vero4DRT (Mitsubishi Heavy Industries Ltd., Tokyo, Japan), is usually carried out by pretreatment measurement. Independent verification calculations using Monte Carlo methods for Vero4DRT have been published. As the Clarkson method is faster and easier to use than measurement and Monte Carlo methods, we evaluated the accuracy of an independent calculation verification program and its feasibility as a secondary check for Vero4DRT. Computed tomography (CT)-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, 3-dimensional (3D) conformal stereotactic body radiotherapy (SBRT) for the lung, and intensity-modulated radiation therapy (IMRT) for the prostate. Differences between the treatment planning system (TPS) and the Clarkson-based independent dose verification software were computed, and confidence limits (CLs, mean ± 2 standard deviation %) for Vero4DRT were compared with the CLs for the C-arm linear accelerators in the previous study. The CLs for conventional irradiation, SBRT, and IMRT were 2.2 ± 3.5% (CL for the C-arm linear accelerators: 2.4 ± 5.3%), 1.1 ± 1.7% (-0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%), and -0.5 ± 2.5% (-0.1 ± 3.6%), respectively. The dose disagreement between the TPS and the CT-based independent dose verification software was less than the 5% action level of the American Association of Physicists in Medicine (AAPM) Task Group 114 (TG114). The CLs for the gimbal-mounted Vero4DRT were similar to the deviations for C-arm linear accelerators. Copyright © 2017 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
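
    A minimal sketch of the confidence-limit computation described above: CL = mean ± 2 SD of the per-plan dose differences, compared against the 5% action level of AAPM TG-114. The difference values are synthetic, and the |mean| + 2 SD comparison is one common convention, used here as an assumption.

```python
# Illustrative confidence-limit (CL) computation from per-plan dose differences (%).
import numpy as np

rng = np.random.default_rng(42)
diff_percent = rng.normal(loc=1.0, scale=1.8, size=30)   # (TPS - independent)/TPS * 100, synthetic

mean, sd = diff_percent.mean(), diff_percent.std(ddof=1)
print(f"confidence limit: {mean:.1f} ± {2 * sd:.1f} %")
print("within 5% action level:", abs(mean) + 2 * sd <= 5.0)
```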

  12. Type-Based Automated Verification of Authenticity in Asymmetric Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten; Kobayashi, Naoki; Sun, Yunde

    2011-01-01

    Gordon and Jeffrey developed a type system for verification of asymmetric and symmetric cryptographic protocols. We propose a modified version of Gordon and Jeffrey's type system and develop a type inference algorithm for it, so that protocols can be verified automatically as they are, without any...

  13. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    Science.gov (United States)

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to give support to the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…

  14. Design and Mechanical Evaluation of a Capacitive Sensor-Based Indexed Platform for Verification of Portable Coordinate Measuring Instruments

    Science.gov (United States)

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-01

    During the last years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced costs and operational advantages as compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification of geometrical parameter procedures of PCMMs. The IMP allows us to fix the calibrated gauge object and move the measuring instrument in such a way that it is possible to cover most of the instrument working volume, reducing the time and operator fatigue to carry out these types of procedures. PMID:24451458

  15. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which...... pattern and gain of the entire antenna including support and satellite structure with an appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...... the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate Measurement technique, and then accurate calculation of the radiation...

  16. Analysis-Based Verification: A Programmer-Oriented Approach to the Assurance of Mechanical Program Properties

    Science.gov (United States)

    2010-05-27

verifying analyses and allow users to understand how the tool reached its conclusions. Bandera [30] is a system that extracts models from Java source for...verification by a model checker and maps verifier outputs back to the original source code. Bandera represents, similar to drop-sea, an effort to...establish an effective architecture for assurance but is focused on model checking rather than program analysis. Similar to our work, Bandera, and other

  17. Formal Verification of Firmware-Based System-on-Chip Modules

    OpenAIRE

    Villarraga, Carlos

    2017-01-01

    In current practices of system-on-chip (SoC) design a trend can be observed to integrate more and more low-level software components into the system hardware at different levels of granularity. The implementation of important control functions and communication structures is frequently shifted from the SoC’s hardware into its firmware. As a result, the tight coupling of hardware and software at a low level of granularity raises substantial verification challenges since the conventional practi...

  18. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    Science.gov (United States)

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.

  19. Film based verification of calculation algorithms used for brachytherapy planning-getting ready for upcoming challenges of MBDCA

    Directory of Open Access Journals (Sweden)

    Grzegorz Zwierzchowski

    2016-08-01

Full Text Available Purpose: A well-known defect of TG-43 based algorithms used in brachytherapy is the lack of information about interaction cross-sections, which are determined not only by electron density but also by atomic number. The TG-186 recommendations, with the use of a model-based dose calculation algorithm (MBDCA), accurate tissue segmentation, and the structures' elemental composition, continue to create difficulties in brachytherapy dosimetry. For the clinical use of new algorithms, it is necessary to introduce reliable and repeatable methods of treatment planning system (TPS) verification. The aim of this study is the verification of the calculation algorithm used in the TPS for shielded vaginal applicators, as well as the development of verification procedures for current and further use, based on the film dosimetry method. Material and methods: Calibration data were collected by separately irradiating 14 sheets of Gafchromic® EBT film with doses from 0.25 Gy to 8.0 Gy using an HDR 192Ir source. Standard vaginal cylinders of three diameters were used in a water phantom. Measurements were performed without any shields and with three shield combinations. Gamma analyses were performed using the VeriSoft® package. Results: The calibration curve was fitted as a third-degree polynomial. For all diameters of the unshielded cylinder and for all shield combinations, gamma analyses were performed and showed that over 90% of the analyzed points meet the gamma criteria (3%, 3 mm). Conclusions: Gamma analysis showed good agreement between the dose distributions calculated by the TPS and those measured with Gafchromic films, thus showing the viability of using film dosimetry in brachytherapy.
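
    A minimal sketch of the calibration step described in the Methods: a third-degree polynomial is fitted to dose versus net film response and then used to convert a measured response back to dose. The response values are synthetic stand-ins for Gafchromic EBT readings.

```python
# Illustrative third-degree polynomial film calibration (dose as a function of response).
import numpy as np

doses = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])      # Gy
net_od = 0.30 * np.log1p(doses) + 0.01 * doses                    # synthetic film response

coeffs = np.polyfit(net_od, doses, deg=3)       # fit dose as a cubic polynomial of response
calibration = np.poly1d(coeffs)

measured_response = 0.42                         # example reading from an irradiated film
print(f"reconstructed dose: {calibration(measured_response):.2f} Gy")
```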

  20. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    Science.gov (United States)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  1. Modular Implementation of Programming Languages and a Partial-Order Approach to Infinitary Rewriting

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2012-01-01

    In this dissertation we investigate two independent areas of research. In the first part, we develop techniques for implementing programming languages in a modular fashion. Within this problem domain, we focus on operations on typed abstract syntax trees with the goal of developing a framework...... that facilitates the definition, manipulation and composition of such operations. The result of our work is a comprehensive combinator library that provides these facilities. What sets our approach apart is the use of recursion schemes derived from tree automata in order to implement operations on abstract syntax...... trees. The second part is concerned with infinitary rewriting, a field that studies transfinite rewrite sequences. We extend the established theory of infinitary rewriting in two ways: (1) a novel approach to convergence in infinitary rewriting that replaces convergence in a metric space with the limit...

  2. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I; Mans, A; Mijnheer, B; Herk, M van; Gonzalez, P [Netherlands Cancer Institute - Antoni van Leeuwenhoek, Amsterdam, Noord-Holland (Netherlands)

    2015-06-15

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼ 2.5 fps). After a portal image is acquired, the software searches for “hot spots” in the reconstructed 3D dose distribution. A hot spot is defined in this study as a 4 cm³ cube in which the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated resulting in a linac halt. The software has been tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf position error or a large monitor unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot spot detection, takes about 220 ms on a dual hexacore Intel Xeon X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments.
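
    The hot-spot criterion described above (a 4 cm³ cube whose average cumulative reconstructed dose exceeds the average planned dose by at least 20% and 50 cGy) lends itself to a compact sketch. The Python code below is a simplified illustration on a regular dose grid and is not the authors' software; the grid resolution, array names, and example doses are assumptions.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def detect_hot_spot(recon_gy, planned_gy, voxel_mm=(2.5, 2.5, 2.5),
                          cube_cm3=4.0, rel_excess=0.20, abs_excess_gy=0.50):
          """True if any ~cube_cm3 cube has a mean reconstructed dose exceeding the
          mean planned dose by both rel_excess and abs_excess_gy."""
          voxel_cm3 = np.prod(voxel_mm) / 1000.0                  # mm^3 -> cm^3
          side = max(1, int(round((cube_cm3 / voxel_cm3) ** (1.0 / 3.0))))
          mean_recon = uniform_filter(recon_gy, size=side)        # cube averages
          mean_plan = uniform_filter(planned_gy, size=side)
          excess = mean_recon - mean_plan
          mask = (excess >= rel_excess * mean_plan) & (excess >= abs_excess_gy)
          return bool(mask.any())

      # After each portal frame, the cumulative reconstruction would be re-checked:
      planned = np.full((40, 40, 40), 2.0)                        # 2 Gy everywhere
      recon = planned.copy()
      recon[18:24, 18:24, 18:24] += 0.9                           # gross local overdose
      print("halt beam:", detect_hot_spot(recon, planned))        # -> True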

  3. New analysis tools and processes for mask repair verification and defect disposition based on AIMS images

    Science.gov (United States)

    Richter, Rigo; Poortinga, Eric; Scheruebl, Thomas

    2009-10-01

    Using AIMS™ to qualify repairs of defects on photomasks is an industry standard. AIMS™ images match the lithographic imaging performance without the need for wafer prints. Utilization of this capability by photomask manufacturers has risen due to the increased complexity of layouts incorporating RET and phase shift technologies. Tighter specifications by end-users have pushed AIMS™ analysis to now include CD performance results in addition to the traditional intensity performance results. Discussed is a new Repair Verification system for automated analysis of AIMS™ images. Newly designed user interfaces and algorithms guide users through predefined analysis routines so as to minimize errors. There are two main routines discussed: one allows multiple reference sites along with a test/defect site within a single image of repeating features; the second compares a test/defect measurement image with a reference measurement image. Three evaluation methods possible with the compared images are discussed in the context of providing thorough analysis capability. This paper highlights new functionality for AIMS™ analysis. Using structured analysis processes and innovative analysis tools leads to highly efficient and more reliable result reporting of repair verification analysis.

  4. Acrylonitrile Butadiene Styrene (ABS) plastic based low cost tissue equivalent phantom for verification dosimetry in IMRT.

    Science.gov (United States)

    Kumar, Rajesh; Sharma, S D; Deshpande, Sudesh; Ghadi, Yogesh; Shaiju, V S; Amols, H I; Mayya, Y S

    2009-12-17

    A novel IMRT phantom was designed and fabricated using Acrylonitrile Butadiene Styrene (ABS) plastic. Physical properties of ABS plastic related to radiation interaction and dosimetry were compared with commonly available phantom materials for dose measurements in radiotherapy. The ABS IMRT phantom has provisions to hold various types of detectors such as ion chambers, radiographic/radiochromic films, TLDs, MOSFETs, and gel dosimeters. Measurements related to pre-treatment dose verification in IMRT of carcinoma of the prostate were carried out using the ABS and Scanditronics-Wellhoffer RW3 IMRT phantoms for five different cases. Point dose data were acquired using an ionization chamber and TLD discs, while Gafchromic EBT and radiographic EDR2 films were used for generating 2-D dose distributions. Treatment planning system (TPS) calculated and measured doses in the ABS plastic and RW3 IMRT phantoms were in agreement within ±2%. The dose values at a point in a given patient acquired using the ABS and RW3 phantoms were found comparable within 1%. Fluence maps and dose distributions of these patients generated by the TPS and measured in the ABS IMRT phantom were also found comparable both numerically and spatially. This study indicates that the ABS plastic IMRT phantom is a tissue equivalent phantom and dosimetrically it is similar to solid/plastic water IMRT phantoms. Though this material has been demonstrated for IMRT dose verification, it can also be used as a tissue equivalent phantom material for other dosimetry purposes in radiotherapy.

  5. Algebraic verification of a distributed summation algorithm

    OpenAIRE

    Groote, Jan Friso; Springintveld, J.G.

    1996-01-01

    In this note we present an algebraic verification of Segall's Propagation of Information with Feedback (PIF) algorithm. This algorithm serves as a nice benchmark for verification exercises (see [2, 13, 8]). The verification is based on the methodology presented in [7] and demonstrates its applicability to distributed algorithms.

  6. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, still remain important issues to be resolved, and have become a global research point of interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The stages of the proposed methodology are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit tests – Very high speed integrated circuit Hardware Description Language (VHDL) tests and modified condition decision coverage (MC/DC) tests, module tests – MATLAB/Simulink co-simulation tests, and integration tests – FPGA hardware test beds. To prove the adequacy of the proposed
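
    To make the bistable fixed-setpoint trip concrete, the following Python sketch (in place of VHDL) models the kind of behaviour such a module is verified against: a comparator with a fixed setpoint and a hysteresis band that latches a trip signal. The setpoint, hysteresis value, and test trace are hypothetical and are not taken from the paper.

      class BistableTrip:
          """Behavioural model of a fixed-setpoint bistable trip: trips when the
          process variable reaches the setpoint and resets only once it falls
          below setpoint - hysteresis."""

          def __init__(self, setpoint=110.0, hysteresis=2.0):
              self.setpoint = setpoint        # hypothetical trip setpoint (% power)
              self.hysteresis = hysteresis    # reset band below the setpoint
              self.tripped = False

          def step(self, value):
              if value >= self.setpoint:
                  self.tripped = True
              elif value < self.setpoint - self.hysteresis:
                  self.tripped = False
              return self.tripped

      # A unit-test style check of the kind exercised during MC/DC unit testing
      trip = BistableTrip()
      trace = [100, 109, 111, 109, 107]       # rises above the setpoint, then falls
      assert [trip.step(v) for v in trace] == [False, False, True, True, False]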

  7. A Simple Visual Ethanol Biosensor Based on Alcohol Oxidase Immobilized onto Polyaniline Film for Halal Verification of Fermented Beverage Samples

    Directory of Open Access Journals (Sweden)

    Bambang Kuswandi

    2014-01-01

    Full Text Available A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide, the latter oxidizing the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto the PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films were scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%–0.8%, with a correlation coefficient (r) of 0.996. The detection limit of the biosensor was 0.001%, with a reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks, in good agreement with the standard method (gas chromatography). Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, which can be useful to the Muslim community for halal verification.
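
    The reported linear response (0.01%–0.8% ethanol, r = 0.996) corresponds to a simple least-squares calibration of the colour-channel signal against concentration. The sketch below shows such a fit and a blank-based estimate of the detection limit; all numbers are made-up placeholders, not the paper's data.

      import numpy as np

      # Hypothetical colour-intensity readings (e.g. a channel value extracted with
      # ImageJ) for a series of ethanol standards given in % (v/v)
      conc = np.array([0.01, 0.05, 0.1, 0.2, 0.4, 0.6, 0.8])
      signal = np.array([4.1, 11.8, 22.5, 43.9, 89.2, 133.6, 176.1])

      slope, intercept = np.polyfit(conc, signal, 1)      # linear calibration curve
      r = np.corrcoef(conc, signal)[0, 1]                 # correlation coefficient

      blank_sd = np.std([0.9, 1.2, 1.0, 1.1], ddof=1)     # replicate blank readings
      lod = 3 * blank_sd / slope                          # 3-sigma detection limit

      print(f"signal = {slope:.1f} * conc + {intercept:.1f}, r = {r:.4f}")
      print(f"estimated detection limit: {lod:.4f} % ethanol")
      unknown = 60.0                                      # reading for an unknown sample
      print(f"unknown sample: {(unknown - intercept) / slope:.3f} % ethanol")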

  8. Fast 3D dosimetric verifications based on an electronic portal imaging device using a GPU calculation engine.

    Science.gov (United States)

    Zhu, Jinhan; Chen, Lixin; Chen, Along; Luo, Guangwen; Deng, Xiaowu; Liu, Xiaowei

    2015-04-11

    To use a graphic processing unit (GPU) calculation engine to implement a fast 3D pre-treatment dosimetric verification procedure based on an electronic portal imaging device (EPID). The GPU algorithm includes the deconvolution and convolution method for the fluence-map calculations, the collapsed-cone convolution/superposition (CCCS) algorithm for the 3D dose calculations and the 3D gamma evaluation calculations. The results of the GPU-based CCCS algorithm were compared to those of Monte Carlo simulations. The planned and EPID-based reconstructed dose distributions in overridden-to-water phantoms and the original patients were compared for 6 MV and 10 MV photon beams in intensity-modulated radiation therapy (IMRT) treatment plans based on dose differences and gamma analysis. The total single-field dose computation time was less than 8 s, and the gamma evaluation for a 0.1-cm grid resolution was completed in approximately 1 s. The results of the GPU-based CCCS algorithm exhibited good agreement with those of the Monte Carlo simulations. The gamma analysis indicated good agreement between the planned and reconstructed dose distributions for the treatment plans. For the target volume, the differences in the mean dose were less than 1.8%, and the differences in the maximum dose were less than 2.5%. For the critical organs, minor differences were observed between the reconstructed and planned doses. The GPU calculation engine was used to boost the speed of 3D dose and gamma evaluation calculations, thus offering the possibility of true real-time 3D dosimetric verification.
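
    The dose engine above couples a fluence-map step with a collapsed-cone convolution/superposition calculation. Purely as an illustration of the underlying convolution idea (not the CCCS algorithm and not GPU code), the sketch below convolves a 2D fluence map with a dose-deposition kernel via FFTs; the field size, grid, and Gaussian kernel are assumptions.

      import numpy as np

      def convolve_fluence(fluence, kernel):
          """FFT-based (periodic) convolution of a fluence map with a centred
          dose-deposition kernel; a toy stand-in for one superposition step."""
          K = np.fft.rfft2(np.fft.ifftshift(kernel), s=fluence.shape)
          return np.fft.irfft2(np.fft.rfft2(fluence) * K, s=fluence.shape)

      # 160 x 160 grid at 2.5 mm spacing with a 10 x 10 cm open field
      n, dx = 160, 0.25                                   # pixels, cm per pixel
      x = (np.arange(n) - n / 2) * dx
      X, Y = np.meshgrid(x, x)
      fluence = ((np.abs(X) <= 5) & (np.abs(Y) <= 5)).astype(float)

      # Crude isotropic scatter kernel (a real CCCS kernel is angular and polyenergetic)
      kernel = np.exp(-(X ** 2 + Y ** 2) / (2 * 0.5 ** 2))
      kernel /= kernel.sum()                              # unit integral -> relative dose

      dose = convolve_fluence(fluence, kernel)
      print("maximum relative dose:", round(float(dose.max()), 3))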

  9. A novel method for sub-arc VMAT dose delivery verification based on portal dosimetry with an EPID.

    Science.gov (United States)

    Cools, Ruud A M; Dirkx, Maarten L P; Heijmen, Ben J M

    2017-11-01

    The EPID-based sub-arc verification of VMAT dose delivery requires synchronization of the acquired electronic portal images (EPIs) with the VMAT delivery, that is, establishment of the start- and stop-MU of the acquired images. To realize this, published synchronization methods propose the use of logging features of the linac or dedicated hardware solutions. In this study, we developed a novel, software-based synchronization method that only uses information inherently available in the acquired images. The EPIs are continuously acquired during pretreatment VMAT delivery and converted into Portal Dose Images (PDIs). Sub-arcs of approximately 10 MU are then defined by combining groups of sequentially acquired PDIs. The start- and stop-MUs of measured sub-arcs are established in a synchronization procedure, using only dosimetric information in measured and predicted PDIs. Sub-arc verification of a VMAT dose delivery is based on comparison of measured sub-arc PDIs with synchronized, predicted sub-arc PDIs, using γ-analyses. To assess the accuracy of this new method, measured and predicted PDIs were compared for 20 clinically applied VMAT prostate cancer plans. The sensitivity of the method for detection of delivery errors was investigated using VMAT deliveries with intentionally inserted, small perturbations (25 error scenarios; leaf gap deviations ≤ 1.5 mm, leaf motion stops during ≤ 15 MU, linac output error ≤ 2%). For the 20 plans, the average failed pixel rates (FPR) for full-arc and sub-arc dose QA were 0.36% ± 0.26% (1 SD) and 0.64% ± 0.88%, based on 2%/2 mm and 3%/3 mm γ-analyses, respectively. Small systematic perturbations of up to 1% output error and 1 mm leaf offset were detected using full-arc QA. Sub-arc QA was able to detect positioning errors in three leaves only during approximately 20 MU and small dose delivery errors during approximately 40 MU. In an ROC analysis, the area under the curve (AUC) for the combined full-arc/sub-arc approach was
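
    The synchronization step above assigns each acquired portal dose image a start- and stop-MU and then combines sequential images into sub-arcs of approximately 10 MU. A minimal grouping sketch is shown below, assuming the cumulative MU at each frame has already been established by the dosimetric synchronization; the frame count and MU values are illustrative only.

      def group_into_subarcs(cumulative_mu, subarc_mu=10.0):
          """Group sequentially acquired frame indices into sub-arcs of roughly
          subarc_mu monitor units, given the cumulative MU at each frame."""
          subarcs, current, start_mu = [], [], 0.0
          for frame, mu in enumerate(cumulative_mu):
              current.append(frame)
              if mu - start_mu >= subarc_mu:
                  subarcs.append((start_mu, mu, current))
                  current, start_mu = [], mu
          if current:                                     # remainder of the arc
              subarcs.append((start_mu, cumulative_mu[-1], current))
          return subarcs

      # Illustrative cumulative MU values for 12 frames of a VMAT arc
      cum_mu = [1.8, 4.1, 6.5, 9.2, 11.7, 14.4, 17.0, 19.9, 22.4, 25.3, 28.1, 30.0]
      for start, stop, frames in group_into_subarcs(cum_mu):
          print(f"sub-arc {start:5.1f}-{stop:5.1f} MU: frames {frames}")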

  10. Diffusion-weighted MRI for verification of electroporation-based treatments

    DEFF Research Database (Denmark)

    Mahmood, Faisal; Hansen, Rasmus H; Agerholm-Larsen, Birgit

    2011-01-01

    such a tissue reaction represents a great clinical benefit since, in case of target miss, retreatment can be performed immediately. We propose diffusion-weighted magnetic resonance imaging (DW-MRI) as a method to monitor EP tissue, using the concept of the apparent diffusion coefficient (ADC). We hypothesize...... that the plasma membrane permeabilization induced by EP changes the ADC, suggesting that DW-MRI constitutes a noninvasive and quick means of EP verification. In this study we performed in vivo EP in rat brains, followed by DW-MRI using a clinical MRI scanner. We found a pulse amplitude-dependent increase...... in the ADC following EP, indicating that (1) DW-MRI is sensitive to the EP-induced changes and (2) the observed changes in ADC are indeed due to the applied electric field....
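
    In its simplest two-b-value form, the ADC analysis referred to above reduces to ADC = ln(S(b1)/S(b2)) / (b2 - b1) per voxel. A minimal sketch with made-up signal values and b-values is shown below; it is not the scanner's fitting routine.

      import numpy as np

      def adc_map(s_low, s_high, b_low=0.0, b_high=1000.0):
          """Two-point apparent diffusion coefficient (mm^2/s) from DW images
          acquired at b-values b_low and b_high (s/mm^2)."""
          s_low = np.asarray(s_low, dtype=float)
          s_high = np.asarray(s_high, dtype=float)
          return np.log(s_low / s_high) / (b_high - b_low)

      # Illustrative voxel signals before and after electroporation (arbitrary units)
      s0 = np.array([900.0, 880.0, 910.0])        # b = 0 images
      s_b_pre = np.array([420.0, 415.0, 430.0])   # b = 1000 before EP
      s_b_post = np.array([380.0, 370.0, 385.0])  # b = 1000 after EP (stronger attenuation)

      adc_pre = adc_map(s0, s_b_pre)
      adc_post = adc_map(s0, s_b_post)
      print("ADC before EP [mm^2/s]:", np.round(adc_pre, 5))
      print("ADC after  EP [mm^2/s]:", np.round(adc_post, 5))
      print(f"mean ADC increase: {(adc_post - adc_pre).mean():.2e} mm^2/s")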

  11. Stability-Based Hybrid Automata for Safety Verification Using Continuation Methods

    Science.gov (United States)

    Uth, Peter

    The rapid development of increasingly autonomous systems has advanced the challenge of safety assurance beyond the capabilities of existing methods. Therefore, new means of verification and validation are required to ensure the safe operation of emerging systems. Numerical continuation characterizes system behavior as parameters are varied and can be used to facilitate a bifurcation analysis, where equilibria and their stability properties are identified. This paper introduces hybrid stability automata--system models constructed using numerical continuation that capture stability properties within a series of dynamic modes. These automata readily support safety analyses by explicitly defining stable, i.e. safe, regions of the operational envelope. The processes to create hybrid stability automata for one-dimensional and multi-dimensional systems are discussed and prototypical examples are presented. Example safety analyses using hybrid stability automata are demonstrated on the space shuttle reentry dynamics.

  12. Horn clause verification with convex polyhedral abstraction and tree automata-based refinement

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    given infeasible traces have been eliminated, using a recent optimised algorithm for tree automata determinisation. We also show how we can introduce disjunctive abstractions selectively by splitting states in the tree automaton. The approach is independent of the abstract domain and constraint theory......In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivations...... rather than just transition systems; secondly, we show how algorithms manipulating tree automata interact with abstract interpretations, establishing progress in refinement and generating refined clauses that eliminate causes of imprecision. We show how to derive a refined set of Horn clauses in which...

  13. A filtering approach based on Gaussian-powerlaw convolutions for local PET verification of proton radiotherapy.

    Science.gov (United States)

    Parodi, Katia; Bortfeld, Thomas

    2006-04-21

    Because proton beams activate positron emitters in patients, positron emission tomography (PET) has the potential to play a unique role in the in vivo verification of proton radiotherapy. Unfortunately, the PET image is not directly proportional to the delivered radiation dose distribution. Current treatment verification strategies using PET therefore compare the actual PET image with full-blown Monte Carlo simulations of the PET signal. In this paper, we describe a simpler and more direct way to reconstruct the expected PET signal from the local radiation dose distribution near the distal fall-off region, which is calculated by the treatment planning programme. Under reasonable assumptions, the PET image can be described as a convolution of the dose distribution with a filter function. We develop a formalism to derive the filter function analytically. The main concept is the introduction of 'Q' functions defined as the convolution of a Gaussian with a powerlaw function. Special Q functions are the Gaussian itself and the error function. The convolution of two Q functions is another Q function. By fitting elementary dose distributions and their corresponding PET signals with Q functions, we derive the Q function approximation of the filter. The new filtering method has been validated through comparisons with Monte Carlo calculations and, in one case, with measured data. While the basic concept is developed under idealized conditions assuming that the absorbing medium is homogeneous near the distal fall-off region, a generalization to inhomogeneous situations is also described. As a result, the method can determine the distal fall-off region of the PET signal, and consequently the range of the proton beam, with millimetre accuracy. Quantification of the produced activity is possible. In conclusion, the PET activity resulting from a proton beam treatment can be determined by locally filtering the dose distribution as obtained from the treatment planning system. The
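
    The central computation described above is a convolution of the planned dose with a filter built from 'Q' functions, each a Gaussian convolved with a power law. The sketch below builds such a filter numerically and applies it to a toy 1D depth-dose fall-off; the Gaussian width, power-law exponent, filter support, and dose profile are illustrative assumptions, not the analytically fitted filter of the paper.

      import numpy as np

      def q_function(z, sigma, power, z_cut):
          """Numerical 'Q' function: a unit-area Gaussian of width sigma convolved
          with a one-sided, truncated power law sampled on the same grid z."""
          dz = z[1] - z[0]
          gauss = np.exp(-z ** 2 / (2 * sigma ** 2))
          gauss /= gauss.sum() * dz
          plaw = np.where((z <= 0) & (z >= -z_cut), np.abs(z) ** power, 0.0)
          return np.convolve(plaw, gauss, mode="same") * dz

      # Toy 1D depth-dose with a distal fall-off around z = 0 mm (0.1 mm sampling)
      z = np.arange(-300, 301) * 0.1
      dose = 1.0 / (1.0 + np.exp(z / 1.5))

      filt = q_function(z, sigma=4.0, power=0.5, z_cut=15.0)
      pet = np.convolve(dose, filt, mode="same") * 0.1     # predicted PET-like profile
      pet /= pet.max()

      # Compare where each profile drops to half of its maximum
      print("dose 50% position:", round(float(z[np.argmin(np.abs(dose - 0.5))]), 1), "mm")
      print("PET  50% position:", round(float(z[np.argmin(np.abs(pet - 0.5))]), 1), "mm")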

  14. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    Verification Games: Crowd-Sourced Formal Verification. University of Washington, final technical report, March 2016. Dates covered: June 2012 – September 2015. Contract number FA8750... Abstract: Over the more than three years of the project Verification Games: Crowd-sourced

  15. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  16. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  17. Rewriting Grocio. The innate natural law of Diego Vincenzo Vidania

    Directory of Open Access Journals (Sweden)

    José María Iñurritegui Rodríguez

    2015-04-01

    Full Text Available This work attends to a singular manuscript composed by the Aragonese jurist Diego Vincenzo Vidania in 1712 and entitled The innate natural law in the minds of men and its effects. The paper maintains that its pages contain one of the earliest —and certainly most original— attempts, undertaken from Hispanic cultural positions, to block the discourse of the rationalist natural-law school that developed in the 17th century: the veiled procedure of rewriting William Grotius' De juris naturalis principiis enchiridion, with the aim of assembling it with a possible Catholic reading of his brother Hugo's De jure belli ac pacis, which could well serve as an element of containment against the irruption of a new perception of human sociability released from any religious conception. The paper further claims, by restoring the text to its context, that the assignment of the manuscript's subject, the vocabulary used in its writing, the materials it drew on, and the stimulus that prompted its redaction are inseparable from a genuine chapter of the Querelle des Anciens et Modernes which, at that particular moment, was being played out in the kingdom of Naples, to which Diego Vidania was linked as Cappellano maggiore and Prefetto of its University.

  18. Rewriting the Metabolic Blueprint: Advances in Pathway Diversification in Microorganisms

    Directory of Open Access Journals (Sweden)

    Gazi Sakir Hossain

    2018-02-01

    Full Text Available Living organisms have evolved over millions of years to fine tune their metabolism to create efficient pathways for producing metabolites necessary for their survival. Advancement in the field of synthetic biology has enabled the exploitation of these metabolic pathways for the production of desired compounds by creating microbial cell factories through metabolic engineering, thus providing sustainable routes to obtain value-added chemicals. Following the past success in metabolic engineering, there is increasing interest in diversifying natural metabolic pathways to construct non-natural biosynthesis routes, thereby creating possibilities for producing novel valuable compounds that are non-natural or without elucidated biosynthesis pathways. Thus, the range of chemicals that can be produced by biological systems can be expanded to meet the demands of industries for compounds such as plastic precursors and new antibiotics, most of which can only be obtained through chemical synthesis currently. Herein, we review and discuss novel strategies that have been developed to rewrite natural metabolic blueprints in a bid to broaden the chemical repertoire achievable in microorganisms. This review aims to provide insights on recent approaches taken to open new avenues for achieving biochemical production that are beyond currently available inventions.

  19. Anachronism and the rewriting of history: the South Africa case

    Directory of Open Access Journals (Sweden)

    Georgi Verbeeck

    2006-04-01

    Full Text Available The use and abuse of anachronism is often seen as the quintessence of the writing of history. Historians tend to conceive it as the hard core of their métier to avoid anachronism. It designates a confusion in the order of time, especially the mistake of placing an event, attitude, or circumstance too early. The awareness of historical anachronism is omnipresent in times of a radical rewriting of history, in particular as a result of political transformation. History reflects the needs and ambitions of a political context, and the sense of what is deemed historically significant does not remain unaffected by this. Chronology and anachronism are essential to particular conceptions of history, and if history is in the process of being rewritten, they are the first items to be addressed by the defenders of the old system and the advocates of a new discourse. In political debates on the use or abuse of history, anachronism is often seen as ultimate proof of the (un)reliability of new insights and conceptions. As anachronism is defined as a way of transferring contemporary sets of values, assumptions and interpretative categories, every political reorientation inevitably provokes a discussion on that level. If a ‘new nation’ is in search of a ‘new past’, a new reflection on the basic categories of historical thinking becomes necessary. The changing discourses in South African historiography since the end of Apartheid serve here as an illuminating example.

  20. Rewritable three-dimensional holographic data storage via optical forces

    Energy Technology Data Exchange (ETDEWEB)

    Yetisen, Ali K., E-mail: ayetisen@mgh.harvard.edu [Harvard Medical School and Wellman Center for Photomedicine, Massachusetts General Hospital, 65 Landsdowne Street, Cambridge, Massachusetts 02139 (United States); Harvard-MIT Division of Health Sciences and Technology, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States); Montelongo, Yunuen [Department of Chemistry, Imperial College London, South Kensington Campus, London SW7 2AZ (United Kingdom); Butt, Haider [Nanotechnology Laboratory, School of Engineering Sciences, University of Birmingham, Birmingham B15 2TT (United Kingdom)

    2016-08-08

    The development of nanostructures that can be reversibly arranged and assembled into 3D patterns may enable optical tunability. However, current dynamic recording materials such as photorefractive polymers cannot be used to store information permanently while also retaining configurability. Here, we describe the synthesis and optimization of a silver nanoparticle doped poly(2-hydroxyethyl methacrylate-co-methacrylic acid) recording medium for reversibly recording 3D holograms. We theoretically and experimentally demonstrate organizing nanoparticles into 3D assemblies in the recording medium using optical forces produced by the gradients of standing waves. The nanoparticles in the recording medium are organized by multiple nanosecond laser pulses to produce reconfigurable slanted multilayer structures. We demonstrate the capability of producing rewritable optical elements such as multilayer Bragg diffraction gratings, 1D photonic crystals, and 3D multiplexed optical gratings. We also show that 3D virtual holograms can be reversibly recorded. This recording strategy may have applications in reconfigurable optical elements, data storage devices, and dynamic holographic displays.

  1. The 3x+1 Problem as a String Rewriting System

    Directory of Open Access Journals (Sweden)

    Joseph Sinyor

    2010-01-01

    Full Text Available The 3x+1 problem can be viewed, starting with the binary form for any n ∈ N, as a string of “runs” of 1s and 0s, using methodology introduced by Błażewicz and Pettorossi in 1983. A simple system of two unary operators rewrites the length of each run, so that each new string represents the next odd integer on the 3x+1 path. This approach enables the conjecture to be recast as two assertions. (I) Every odd n ∈ N lies on a distinct 3x+1 trajectory between two Mersenne numbers (2^k − 1) or their equivalents, in the sense that every integer of the form (4m + 1) with m being odd is equivalent to m, because both yield the same successor. (II) If (2^k − 1) → (2^j − 1) for any k, j > 0, then j < k; that is, the 3x+1 function expressed as a map of k's is monotonically decreasing, thereby ensuring that the function terminates for every integer.
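
    The run-length view of the 3x+1 map can be illustrated directly. The Python sketch below iterates the odd-to-odd step n -> (3n + 1) / 2^v, where 2^v is the largest power of two dividing 3n + 1, and prints the binary run lengths at each step; it illustrates the strings being rewritten, not the paper's two unary operators.

      def run_lengths(n):
          """Lengths of maximal runs of equal bits in the binary form of n,
          most significant run first, e.g. 27 = 0b11011 -> [2, 1, 2]."""
          bits = bin(n)[2:]
          runs, count = [], 1
          for prev, cur in zip(bits, bits[1:]):
              if cur == prev:
                  count += 1
              else:
                  runs.append(count)
                  count = 1
          runs.append(count)
          return runs

      def next_odd(n):
          """Next odd integer on the 3x+1 path."""
          m = 3 * n + 1
          while m % 2 == 0:
              m //= 2
          return m

      n = 7
      while n != 1:
          print(f"{n:4d}  binary {bin(n)[2:]:>8}  run lengths {run_lengths(n)}")
          n = next_odd(n)
      print("   1  reached: the trajectory terminates")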

  2. A Graph Rewriting Approach for Transformational Design of Digital Systems

    NARCIS (Netherlands)

    Huijs, C.

    1996-01-01

    Transformational design integrates design and verification. It combines “correctness by construction” and design creativity by the use of pre-proven behaviour preserving transformations as design steps. The formal aspects of this methodology are hidden in the transformations. A constraint is the

  3. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, and safety is a key property for CPSs applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of the CPS is described in an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed into an HP based on this definition. The safety of the CPS is verified by feeding the HP to KeYmaera. The advantage of the approach is that it models CPSs intuitively and verifies their safety rigorously, avoiding state space explosion.

  4. Environmental Technology Verification: Test Report of Mobile Source Selective Catalytic Reduction--Nett Technologies, Inc., BlueMAX 100 version A urea-based selective catalytic reduction technology

    Science.gov (United States)

    Nett Technologies’ BlueMAX 100 version A Urea-Based SCR System utilizes a zeolite catalyst coating on a cordierite honeycomb substrate for heavy-duty diesel nonroad engines for use with commercial ultra-low–sulfur diesel fuel. This environmental technology verification (ETV) repo...

  5. Design Considerations and Experimental Verification of a Rail Brake Armature Based on Linear Induction Motor Technology

    Science.gov (United States)

    Sakamoto, Yasuaki; Kashiwagi, Takayuki; Hasegawa, Hitoshi; Sasakawa, Takashi; Fujii, Nobuo

    This paper describes the design considerations and experimental verification of an LIM rail brake armature. In order to generate power and maximize the braking force density despite the limited area between the armature and the rail and the limited space available for installation, we studied a design method that is suitable for designing an LIM rail brake armature; we considered adoption of a ring winding structure. To examine the validity of the proposed design method, we developed a prototype ring winding armature for the rail brakes and examined its electromagnetic characteristics in a dynamic test system with roller rigs. By repeating various tests, we confirmed that unnecessary magnetic field components, which were expected to be present under high speed running condition or when a ring winding armature was used, were not present. Further, the necessary magnetic field component and braking force attained the desired values. These studies have helped us to develop a basic design method that is suitable for designing the LIM rail brake armatures.

  6. Foxing the child : the cultural transmission of pedagogical norms and values in Dutch rewritings of literary classics for children 1850-1950

    NARCIS (Netherlands)

    Parlevliet, Sanne

    2012-01-01

    This article examines the reciprocity between children's literature and educational ideals in Dutch rewritings of international literary classics published for children between 1850 and 1950. It analyses the assumed pedagogical power of rewritings of international literary classics for children from

  7. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

    In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields. The three fields comprise a gas field, a continuous liquid field, and an entrained liquid field. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer, as well as momentum transfer. The fluid/structure interaction generally includes both heat and momentum transfer. Assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between these flow conditions. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect the code stability. A further study would be required to enhance the code capability in this regard.

  8. Graph-based Operational Semantics of a Lazy Functional Languages

    DEFF Research Database (Denmark)

    Rose, Kristoffer Høgsbro

    1992-01-01

    Presents Graph Operational Semantics (GOS): a semantic specification formalism based on structural operational semantics and term graph rewriting. Demonstrates the method by specifying the dynamic ...

  9. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and a cost-sensitive classifier was found to produce better results. The system has been evaluated on a fingerprint database, and the experimental results show that the system achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  10. Like Water for Chocolate: The Rewriting of the Female Experience and Its Parallels in Philippine History

    OpenAIRE

    Marikit Tara Alto Uychoco

    2012-01-01

    This article focuses on Laura Esquivel’s Like Water for Chocolate and reads the novel using the literary theories of the “new mestiza,” postcolonial theories, feminist theories, and historiographic metafiction. It seeks to find out how this novel rewrites the female experience of the Mexican Revolution, and the various techniques used in the rewriting of history. It reads the novel from a “new mestiza” feminist perspective, which enables the Filipina reader to find commonalities in the Mexic...

  11. MARATHON Verification (MARV)

    Science.gov (United States)

    2017-08-01

    1 and MARATHON 2 output. Due to a limitation of MARATHON 2, specifically the inability to extend or otherwise address limitations in ProModel, MAJ ...2010-2012 rewrite of the initial limited 2010 VBA port of MARATHON 2 by (then) MAJ Alex MacCalman. M3 was developed and used in studies from the time

  12. Evaluation of DVH-based treatment plan verification in addition to gamma passing rates for head and neck IMRT

    NARCIS (Netherlands)

    Visser, Ruurd; Wauben, David J. L.; de Groot, Martijn; Steenbakkers, Roel J. H. M.; Bijl, Henk P.; Godart, Jeremy; van t Veld, Aart; Langendijk, Johannes A.; Korevaar, Erik W.

    Background and purpose: Treatment plan verification of intensity modulated radiotherapy (IMRT) is generally performed with the gamma index (GI) evaluation method, which is difficult to extrapolate to clinical implications. Incorporating Dose Volume Histogram (DVH) information can compensate for

  13. The relative importance of managerial competencies for predicting the perceived job performance of Broad-Based Black Economic Empowerment verification practitioners

    Directory of Open Access Journals (Sweden)

    Barbara M. Seate

    2016-02-01

    Full Text Available Orientation: There is a need for the growing Broad-Based Black Economic Empowerment (B-BBEE) verification industry to assess competencies and determine skills gaps for the management of the verification practitioners' perceived job performance. Knowing which managerial competencies are important for different managerial functions is vital for developing and improving training and development programmes. Research purpose: The purpose of this study was to determine the managerial capabilities that are required of B-BBEE verification practitioners in order to improve their perceived job performance. Motivation for the study: The growing number of B-BBEE verification practitioners calls for more focused training and development. Generating such a training and development programme demands empirical research into the relative importance of managerial competencies. Research approach, design and method: A quantitative design using the survey approach was adopted. A questionnaire was administered to a stratified sample of 87 B-BBEE verification practitioners. Data were analysed using the Statistical Package for the Social Sciences (version 22.0) and Smart Partial Least Squares software. Main findings: The results of the correlation analysis revealed strong and positive associations between technical skills, interpersonal skills, compliance to standards and ethics, managerial skills and perceived job performance. Results of the regression analysis showed that managerial skills, compliance to standards and ethics, and interpersonal skills were statistically significant in predicting perceived job performance. However, technical skills were insignificant in predicting perceived job performance. Practical/managerial implications: The study has shown that the B-BBEE verification industry, insofar as the technical skills of the practitioners are concerned, does have suitably qualified staff with the requisite educational qualifications. At the

  14. Thermal Analysis of the Driving Component Based on the Thermal Network Method in a Lunar Drilling System and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Dewei Tang

    2017-03-01

    Full Text Available The main task of the third Chinese lunar exploration project is to obtain soil samples that are greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for the core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component, based on the thermal network method (TNM), was established and the model was solved using the quasi-Newton method. A vacuum test platform was built and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized, based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the “safe working times” of every mode are given.
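
    The thermal network model above reduces to a set of nonlinear node balance equations (conduction between nodes plus radiation to a sink) that can be solved with a quasi-Newton method. The sketch below sets up a toy three-node network and solves it with SciPy's Broyden quasi-Newton root finder; the conductances, dissipation, emissivities, and sink temperature are invented for illustration and are not the paper's parameters.

      import numpy as np
      from scipy.optimize import root

      SIGMA = 5.670e-8                      # Stefan-Boltzmann constant, W/(m^2 K^4)

      # Toy 3-node network: motor winding -> housing -> mounting flange
      G = np.array([[0.0, 0.8, 0.0],        # linear conductances G_ij in W/K
                    [0.8, 0.0, 1.5],
                    [0.0, 1.5, 0.0]])
      Q = np.array([12.0, 0.0, 0.0])        # internal dissipation per node, W
      eps_a = np.array([0.0, 0.02, 0.05])   # emissivity x radiating area, m^2
      T_sink = 250.0                        # radiative sink temperature, K

      def residual(T):
          """Steady-state energy balance at each node (zero at the solution)."""
          cond_in = (G * (T[None, :] - T[:, None])).sum(axis=1)
          rad_out = eps_a * SIGMA * (T ** 4 - T_sink ** 4)
          return Q + cond_in - rad_out

      sol = root(residual, x0=np.full(3, 300.0), method="broyden1", tol=1e-8)
      print("converged:", sol.success)
      print("node temperatures [K]:", np.round(sol.x, 1))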

  15. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    to the high complexity of both the dynamical system and the specification. Therefore, there is a need for methods capable of verifying complex specifications of complex systems. The verification of high dimensional continuous dynamical systems is the key to verifying general systems. In this thesis......, an abstraction approach is taken to the verification problem. A method is developed for abstracting continuous dynamical systems by timed automata. This method is based on subdividing the state space into cells by use of subdivisioning functions that are decreasing along the vector field. To allow....... It is shown that dual decomposition can be applied on the problem of generating barrier certificates, resulting in a compositional formulation of the safety verification problem. This makes the barrier certificate method applicable to the verification of high dimensional systems, but at the cost...
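
    The barrier certificate condition mentioned above can be checked symbolically for simple systems: a candidate B(x) must be non-positive on the initial set, positive on the unsafe set, and non-increasing along the vector field (non-positive Lie derivative). The sketch below checks the Lie-derivative condition for a 2D linear system and a quadratic candidate; the dynamics and the candidate are illustrative and are not taken from the thesis's case studies.

      import sympy as sp

      x1, x2 = sp.symbols("x1 x2", real=True)

      # Illustrative stable linear dynamics dx/dt = f(x)
      f = sp.Matrix([-x1 + 2 * x2, -3 * x2])

      # Candidate barrier certificate; the safe region is roughly {B <= 0}
      B = x1**2 + x1 * x2 + 2 * x2**2 - 4

      # Lie derivative of B along f: dB/dt = grad(B) . f
      lie = sp.expand((sp.Matrix([B]).jacobian([x1, x2]) * f)[0])
      print("L_f B =", lie)

      # Here -L_f B is a homogeneous quadratic form, so dB/dt <= 0 everywhere
      # exactly when its coefficient matrix is positive semidefinite.
      Qf = sp.Matrix([[sp.Rational(1, 2) * sp.diff(-lie, a, b) for b in (x1, x2)]
                      for a in (x1, x2)])
      print("dB/dt <= 0 everywhere:", all(ev >= 0 for ev in Qf.eigenvals()))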

  16. Verification of the new detection method for irradiated spices based on microbial survival by collaborative blind trial

    Energy Technology Data Exchange (ETDEWEB)

    Miyahara, M. [National Institute of Health Sciences, 1-18-1 Kamiyoga, Setagaya-Ku, Tokyo (Japan)], E-mail: mfuruta@b.s.osakafu-u.ac.jp; Furuta, M. [Osaka Prefecture University, 1-2 Gakuen-Cho, Naka-ku, Sakai, 599-8570 Osaka (Japan); Takekawa, T. [Nuclear Fuel Industries Ltd., 950-1 Asashiro-Nishi, Kumatori-Cho, Sennan-Gun, Osaka (Japan); Oda, S. [Japan Food Research Laboratories, 52-1 Motoyoyogi-Cho, Sibuya-Ku, Tokyo (Japan); Koshikawa, T. [Japan Radio Isotope Association, 121-19 Toriino, Koka, Shiga (Japan); Akiba, T. [Japan Food Hygiene Association, 2-5-47 Tadao, Machida, Tokyo (Japan); Mori, T. [Tokyo Kenbikyo-In Foundation, 44-1 Nihonbashi, Hakozaki-Cho, Chuo-Ku, Tokyo (Japan); Mimura, T. [Japan Oilstuff Inspector's Corporation, 26-1 Kaigandori 5-Chome, Naka-Ku, Yokohama (Japan); Sawada, C. [Japan Frozen Foods Inspection Corp., 2-13-45 Fukuura, Kanazawa-Ku, Yokohama (Japan); Yamaguchi, T. [Japan Electron Beam Irradiation Service Co., Ltd., 4-16 Midorigahara, Tukuba, Ibaraki (Japan); Nishioka, S. [Mycotoxin Inspection Corp., 15 Daikokufuto, Turumi-Ku, Yokohama (Japan); Tada, M. [Chugoku Gakuen University, 83 Niwase, Okayama (Japan)

    2009-07-15

    An irradiation detection method using the difference of the radiation sensitivity of the heat-treated microorganisms was developed as one of the microbiological detection methods of the irradiated foods. This detection method is based on the difference of the viable cell count before and after heat treatment (70 °C and 10 min). The verification by collaborative blind trial of this method was done by nine inspecting agencies in Japan. The samples used for this trial were five kinds of spices consisting of non-irradiated, 5 kGy irradiated, and 7 kGy irradiated black pepper, allspice, oregano, sage, and paprika, respectively. As a result of this collaboration, a high percentage (80%) of the correct answers was obtained for irradiated black pepper and allspice. However, the method was less successful for irradiated oregano, sage, and paprika. It might be possible to use this detection method for preliminary screening of the irradiated foods but further work is necessary to confirm these findings.

  17. Verification of the new detection method for irradiated spices based on microbial survival by collaborative blind trial

    Science.gov (United States)

    Miyahara, M.; Furuta, M.; Takekawa, T.; Oda, S.; Koshikawa, T.; Akiba, T.; Mori, T.; Mimura, T.; Sawada, C.; Yamaguchi, T.; Nishioka, S.; Tada, M.

    2009-07-01

    An irradiation detection method using the difference of the radiation sensitivity of the heat-treated microorganisms was developed as one of the microbiological detection methods of the irradiated foods. This detection method is based on the difference of the viable cell count before and after heat treatment (70 °C and 10 min). The verification by collaborative blind trial of this method was done by nine inspecting agencies in Japan. The samples used for this trial were five kinds of spices consisting of non-irradiated, 5 kGy irradiated, and 7 kGy irradiated black pepper, allspice, oregano, sage, and paprika, respectively. As a result of this collaboration, a high percentage (80%) of the correct answers was obtained for irradiated black pepper and allspice. However, the method was less successful for irradiated oregano, sage, and paprika. It might be possible to use this detection method for preliminary screening of the irradiated foods but further work is necessary to confirm these findings.

  18. Systems, methods and apparatus for verification of knowledge-based systems

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  19. Re/Writing the Subject: A Contribution to Post-Structuralist Theory in Mathematics Education

    Science.gov (United States)

    Roth, Wolff-Michael

    2012-01-01

    This text, occasioned by a critical reading of "Mathematics Education and Subjectivity" (Brown, 2011) and constituting a response to the book, aims at contributing to the building of (post-structuralist) theory in mathematics education. Its purpose was to re/write two major positions that "Mathematics Education and Subjectivity" articulates:…

  20. Rewritable Painting Realized from Ambient-Sensitive Fluorescence of ZnO Nanoparticles

    Science.gov (United States)

    Liu, Kai-Kai; Shan, Chong-Xin; He, Gao-Hang; Wang, Ruo-Qiu; Dong, Lin; Shen, De-Zhen

    2017-02-01

    Paper, as one of the most important information carriers, has contributed greatly to the development and transmission of human civilization. Meanwhile, the production and utilization of paper have created a serious problem for the environmentally sustainable development of modern society. Therefore, a simple and green route to realize rewritable painting on paper is urgently demanded. Herein, a simple route to rewritable painting on copy paper has been demonstrated by using eco-friendly ZnO nanoparticles (NPs) as fluorescent ink, and vinegar and soda, which are frequently used in the kitchen, as erasing and neutralizing agents. Words or patterns written using the ZnO NPs as ink can be erased by vinegar vapour within five seconds, and after a neutralizing process in an ambient of soda vapour, the paper can be used for writing again. It is worth noting that the resolution and precision of the patterns produced via the above route degrade little after ten rewriting cycles, and the quality of the patterns produced using the ZnO NPs as ink fades little after storage for several months, which promises versatile potential applications of the rewriting route proposed in this paper.

  1. A Female Interrogative Reader: The Adolescent Jane Austen Reads and Rewrites (His)tory.

    Science.gov (United States)

    Reid-Walsh, Jacqueline

    1992-01-01

    Argues that Jane Austen's unpublished juvenile work "The History of England" has considerable relevance to twentieth-century high-school English classrooms. Notes that the work humorously shows the gender bias of traditional history texts because it is a "woman-centered" rewriting. (RS)

  2. Injecting Formal Verification in FMI-based Co-Simulations of Cyber-Physical Systems

    DEFF Research Database (Denmark)

    Couto, Luis Diogo; Basagiannis, Stylianos; Ridouane, El Hassan

    2017-01-01

    This publication is part of the Horizon 2020 project: Integrated Tool chain for model-based design of CPSs (INTO-CPS), project/GA number 644047.

  3. Static and Completion Analysis for Planning Knowledge Base Development and Verification

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  4. Experimental Verification of the Strain Non-Uniformity Index (SNI) based Failure Prediction

    Science.gov (United States)

    Dhumal, D. A.; Kulkarni, Pratik; Date, P. P.; Nandedkar, V. M.

    2016-08-01

    Formability of the sheet metal depends upon the uniformity of strain distribution, which depends on material properties, tooling and process parameters. Nakazima Test was conducted to study the strain distribution and establish the forming limits of AA 6016. The experimental conditions were simulated using AUTOFORM 5.2 Plus software and the failure predicted using the SNI based methodology. The failure predictions were correlated with the state of the experimentally deformed Nakazima samples, and also with the FLD based forming limits. The failure prediction from the SNI based methodology was found to correlate well with the state of the experimental Nakazima sample.

  5. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification (SW). Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods differ both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model-checking methods are discussed and described, and the pros and cons of each method are emphasized. The article also considers a classification of testing techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in code verified using static methods. Attention is paid to the identification of dependencies within the framework of abstract interpretation, and an overview and analysis of inference tools is given. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, together with the kinds of tools that can be applied when using dynamic analysis methods. Based on this work a conclusion is drawn, which describes the most relevant problems of the analysis techniques, methods of their solutions and

  6. Design and Verification of Behaviour-Based Systems Realising Task Sequences

    OpenAIRE

    Armbrust, Christopher

    2015-01-01

    Since their invention in the 1980s, behaviour-based systems have become very popular among roboticists. Their component-based nature facilitates the distributed implementation of systems, fosters reuse, and allows for early testing and integration. However, the distributed approach necessitates the interconnection of many components into a network in order to realise complex functionalities. This network is crucial to the correct operation of the robotic system. There are few sound design tec...

  7. Formal Verification of Complex Systems based on SysML Functional Requirements

    Science.gov (United States)

    2014-12-23

    are the most critical in ensuring that the designed system satisfies its safety requirements (Tumer, Stone, & Bell, 2003; Stone, Tumer, & Stock, 2005; ...Kurtoglu & Tumer, 2008; Tumer & Smidts, 2011), this paper aims at addressing this challenge using the system-oriented SysML-based modeling approach... Journal of Engineering Design, 20(1), 83–104. Kurtoglu, T., & Tumer, I. Y. (2008). A graph-based fault identification and propagation framework for

  8. Phthalocyanine molecules with extremely strong two-photon absorption for 3D rewritable optical information storage

    Science.gov (United States)

    Drobizhev, M.; Makarov, N. S.; Rebane, A.; Wolleb, H.; Spahni, H.

    2006-08-01

    Phthalocyanines (Pcs) show exceptional stability against high temperatures (up to 900 °C for certain metallophthalocyanines), harsh chemical environments (strong acids and bases), γ-radiation (up to 100 MRad) and neutron radiation (up to 10^19 thermal neutrons/cm^2). On the other hand, Pcs exhibit a number of unique physical properties, including semi-conductivity, photoconductivity, large linear and nonlinear optical coefficients, and, in the case of non-symmetrical metal-free Pcs, the ability to photo-switch between two different forms. This has led to an advancement of phthalocyanine-based prototype field-effect transistors, gas- and photo-sensors, solar cells, optical power limiters, and optical memory devices (CDs). For increasing the capacity of information carriers, it has been suggested to use the effect of simultaneous two-photon absorption (2PA), which allows information to be written and read in many layers, thus resulting in Terabyte (TB) disks. Our estimation of the signal-to-noise ratio shows, however, that for fast (MB/s) processing the molecular 2PA cross section must be extremely large, σ_II > 10^3-10^4 GM (1 GM = 10^-50 cm^4 s), which has not yet been achieved in any photochromic material. In this paper we demonstrate, for the first time, that some specially designed non-symmetric metal-free phthalocyanines are almost ideally suited for TB rewritable memory due to their extremely high, resonantly enhanced, 2PA cross section (~10^4 GM) in the near-IR region and their intrinsic ability of reversible photo-tautomerization at lowered (~100 K) temperatures. We discuss how the special technical requirements, such as short-pulse laser excitation and a lowered working temperature, can be satisfied for space and terrestrial applications.

  9. Ground-based multispectral measurements for airborne data verification in non-operating open pit mine "Kremikovtsi"

    Science.gov (United States)

    Borisova, Denitsa; Nikolov, Hristo; Petkov, Doyno

    2013-10-01

    The impact of the mining industry and metal production on the environment is evident all over the world. In our research we focus on the impact of the already non-operating ferrous "Kremikovtsi" open pit mine and the related waste dumps and tailings, which we consider to be the major factor responsible for pollution of a densely populated region in Bulgaria. The approach adopted is based on correct estimation of the distribution of iron oxides inside the open pit mine and the neighbouring regions, which is considered in this case to be the key issue for assessing the ecological state of soils, vegetation and water. For this study the foremost source of data is airborne imagery, which, combined with ground-based in-situ and laboratory data, was used for verification of the environmental variables and thus in the process of assessing the present environmental status influenced by previous mining activities. The percentage of iron content was selected as the main indicator of metal pollution, since it can be reliably identified in the multispectral data used in this study and because iron compounds are widely present in most minerals, rocks and soils. In our research the number of samples from every source (air, field, lab) was chosen so as to be statistically sound. In order to establish a relationship between the degree of soil pollution and the multispectral data, 40 soil samples were collected during a field campaign in the study area, together with GPS measurements, for two types of laboratory measurements: first, chemical and mineralogical analysis, and second, non-destructive spectroscopy. In this work, multispectral satellite data from the Landsat TM/ETM+ instruments and from ALI/OLI (Operational Land Imager) were used for verification of environmental variables over large areas. Ground-based (laboratory and in-situ) spectrometric measurements were performed using the designed and constructed in Remote

  10. Model-Based Verification and Validation of the SMAP Uplink Processes

    Science.gov (United States)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  11. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification.

    Science.gov (United States)

    Sager, Jennifer E; Yu, Jingjing; Ragueneau-Majlessi, Isabelle; Isoherranen, Nina

    2015-11-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms "PBPK" and "physiologically based pharmacokinetic model" to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines. Copyright © 2015 by The American Society for Pharmacology and Experimental Therapeutics.

  12. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, J [Taishan Medical University, Taian, Shandong (China); Washington University in St Louis, St Louis, MO (United States); Li, H. Harlod; Zhang, T; Yang, D [Washington University in St Louis, St Louis, MO (United States); Ma, F [Taishan Medical University, Taian, Shandong (China)

    2015-06-15

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and thus are inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip-limiting parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
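
    As a rough illustration of the pipeline summarized above (background/noise removal, Gaussian high-pass, CLAHE, entropy-maximizing parameter selection), a minimal sketch is given below. It uses scikit-image and SciPy, replaces the interior-point optimizer with a naive grid search, and the parameter ranges, the percentile-based background removal and the helper names (enhance, auto_enhance) are assumptions for illustration only.

```python
# Hedged sketch of an automatic contrast-enhancement pipeline of the kind
# described above; parameters and the grid search are placeholders, not the
# authors' interior-point optimization.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import exposure, measure

def enhance(image, sigma, clip_limit, kernel_size):
    img = image.astype(float)
    img = np.clip(img - np.percentile(img, 5), 0, None)   # crude background/noise removal
    img = img - gaussian_filter(img, sigma)                # high-pass: subtract smoothed image
    img = (img - img.min()) / (np.ptp(img) + 1e-12)        # rescale to [0, 1] for CLAHE
    return exposure.equalize_adapthist(img, kernel_size=kernel_size,
                                       clip_limit=clip_limit)

def auto_enhance(image):
    """Pick the parameter combination that maximizes the entropy of the result."""
    best = None
    for sigma in (5, 10, 20):
        for clip in (0.01, 0.02, 0.05):
            for ks in (32, 64):
                out = enhance(image, sigma, clip, ks)
                score = measure.shannon_entropy(out)
                if best is None or score > best[0]:
                    best = (score, out)
    return best[1]
```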

  13. Verification of Emulated Channels in Multi-Probe Based MIMO OTA Testing Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; Nielsen, Jesper Ødum

    2013-01-01

    Standardization work for MIMO OTA testing methods is currently ongoing, where a multi-probe anechoic chamber based solution is an important candidate. In this paper, the probes located on an OTA ring are used to synthesize a plane wave field in the center of the OTA ring. This paper investigates...

  14. Measurement Verification of Plane Wave Synthesis Technique Based on Multi-probe MIMO-OTA Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; Nielsen, Jesper Ødum

    2012-01-01

    Standardization work for MIMO OTA testing methods is currently ongoing, where a multi-probe anechoic chamber based solution is an important candidate. In this paper, the probes located on an OTA ring are used to synthesize a plane wave field in the center of the OTA ring. This paper investigates...

  15. Compositional Verification of Knowledge-Based Systems: a Case Study for Diagnostic Reasoning

    NARCIS (Netherlands)

    Cornelissen, F.; Treur, J.; Jonker, C.M.

    2001-01-01

    When designing complex knowledge-based systems, it is often hard to guarantee that the specification of a system that has been designed actually fulfills the needs, i.e., whether it satisfies the design requirements. Especially for critical applications, for example in aerospace domains, there is a

  16. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object...

  17. Verification of a characterization method of the laser-induced selective activation based on industrial lasers

    DEFF Research Database (Denmark)

    Zhang, Yang; Hansen, Hans Nørgaard; Tang, Peter T.

    2013-01-01

    In this article, laser-induced selective activation (LISA) for subsequent autocatalytic copper plating is performed by several types of industrial scale lasers, including a Nd:YAG laser, a UV laser, a fiber laser, a green laser, and a short pulsed laser. Based on analysis of all the laser-machine...

  18. Performance verification of an epithermal neutron flux monitor using accelerator-based BNCT neutron sources

    Science.gov (United States)

    Guan, X.; Murata, I.; Wang, T.

    2017-09-01

    The performance of an epithermal neutron flux monitor developed for boron neutron capture therapy (BNCT) is verified by Monte Carlo simulations using accelerator-based neutron sources (ABNSs). The results indicate that the developed epithermal neutron flux monitor works well and can be used efficiently in practical applications to measure the epithermal neutron fluxes of ABNSs with high accuracy.

  19. Marine induction studies based on sea surface scalar magnetic field measurements. A concept and its verification

    Science.gov (United States)

    Kuvshinov, A. V.; Poedjono, B.; Matzka, J.; Olsen, N.; Pai, S.; Samrock, F.

    2013-12-01

    Most marine EM studies are based on sea-bottom measurements, which are expensive and logistically demanding. We propose a low-cost and easy-to-deploy magnetic survey concept which exploits sea surface measurements. It is assumed that the exciting source can be described by a plane wave. The concept is based on responses that relate variations of the scalar magnetic field at the survey sites to variations of the horizontal magnetic field at a base site. It can be shown that these scalar responses are a mixture of standard tipper responses and elements of the horizontal magnetic tensor and thus can be used to probe the electrical conductivity of the subsoil. This opens an avenue for sea-surface induction studies, which so far were believed very difficult to conduct with conventional approaches based on vector measurements. We performed realistic 3-D model studies in which the target region was Oahu Island and its surroundings, and the USGS-operated Honolulu geomagnetic observatory was chosen as the base site. We compare the predicted responses with the responses estimated from scalar data collected at a few locations around Oahu Island by the unmanned, autonomous, wave- and solar-powered 'Wave Glider' developed and operated by Liquid Robotics Oil and Gas/Schlumberger. The marine robotic observation platform is equipped with a towed Overhauser magnetometer (validated by USGS). The studies show an encouraging agreement between predictions and experiment in both components of the scalar response at all locations, and we consider this a proof of the suggested concept.

  20. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
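
    To make the flavour of such a monitor concrete, here is a toy forward-chaining sketch: rules are Horn clauses in implication form, facts are added as the finite trace expands, and the rule base is saturated after every event. The two rules, the event names and the verdict printed at each step are illustrative assumptions and do not reproduce the authors' FLTL translation.

```python
# Toy forward-chaining monitor over an expanding trace; illustrative only.
# Each rule: (frozenset of body atoms, head atom).  Informal property:
# "every 'request' must eventually be followed by a 'grant'".
RULES = [
    (frozenset({"request"}), "pending"),
    (frozenset({"pending", "grant"}), "satisfied"),
]

def forward_chain(facts):
    """Saturate the fact set: apply rules until a fixpoint is reached."""
    changed = True
    while changed:
        changed = False
        for body, head in RULES:
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

facts = set()
for step, event in enumerate(["request", "tick", "grant"], start=1):
    facts.add(event)               # the trace expands by one observed event
    forward_chain(facts)
    still_open = "pending" in facts and "satisfied" not in facts
    print(f"after event {step}: obligation open = {still_open}")
```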

  1. Specification, Synthesis, and Verification of Software-based Control Protocols for Fault-Tolerant Space Systems

    Science.gov (United States)

    2016-08-16

    This material is based on research sponsored by the Air Force Research Laboratory under agreement number... evaluated by the classifier C to produce the class label b, which depends on the task to be completed. [Figure 2: A Classifier-in-the-Loop System] ...human scalp. Using only EEG has the benefit of requiring the user to wear only one piece of headgear to allow interaction with the device. However

  2. Channel Verification Results for the SCME models in a Multi-Probe Based MIMO OTA Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; S. Ashta, Jagjit

    2013-01-01

    MIMO OTA testing methodologies are being intensively investigated by CTIA and 3GPP, where various MIMO test methods have been proposed which vary widely in how they emulate the propagation channels. Inter-lab/inter-technique OTA performance comparison testing for MIMO devices is ongoing in CTIA, ...... correlation, spatial correlation and cross polarization ratio, can be accurately reproduced in a multi-probe anechoic chamber based MIMO OTA setup.

  3. On the Viscoelastic Parameters of Gussasphalt Mixture Based on Modified Burgers Model: Deviation and Experimental Verification

    OpenAIRE

    Faxiang Xie; Dengjing Zhang; Ao Zhou; Bohai Ji; Lin Chen

    2017-01-01

    Viscoelasticity is an important characteristic of gussasphalt mixtures. The aim of this study is to find the correct viscoelastic material parameters of the novel gussasphalt applied in the 4th Yangtze River Bridge based on the modified Burgers model. This study firstly derives the explicit Prony series form of the shear relaxation modulus of viscoelastic material from Laplace transformation, to fulfill the parameter inputting requirements of commonly used finite element software suites. Seco...

  4. Automatic Fixture Design Based on Formal Knowledge Representation, Design Synthesis and Verification

    OpenAIRE

    Gmeiner, Thomas

    2015-01-01

    Automating the design and configuration processes of fixture devices can increase the flexibility and degree of automation of manufacturing systems. This thesis presents an automatic fixture design system integrated with a reconfigurable fixture device that addresses shortcomings of existing solutions. The system is based on a formal ontology for knowledge representation and semantic reasoning, a spatial grammar for the synthesis of new fixture candidates, a tool-fixture interference analysis...

  5. Mechatronics design and experimental verification of an electric-vehicle-based hybrid thermal management system

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Hung

    2016-02-01

    Full Text Available In this study, an electric-vehicle-based thermal management system was designed for dual energy sources. An experimental platform developed in a previous study was modified. Regarding the mechanical components, a heat exchanger with a radiator, proportional valve, coolant pipes, and coolant pump was appropriately integrated. Regarding the electric components, two heaters emulating waste heat were controlled using two programmable power supply machines. A rapid-prototyping controller with two temperature inputs and three outputs was designed. Rule-based control strategies were coded to maintain optimal temperatures for the emulated proton exchange membrane fuel cells and lithium batteries. To evaluate the heat power of dual energy sources, driving cycles, energy management control, and efficiency maps of energy sources were considered for deriving time-variant values. The main results are as follows: (a) an advanced mechatronics platform was constructed; (b) a driving cycle simulation was successfully conducted; and (c) coolant temperatures reached their optimal operating ranges when the proportional valve, radiator, and coolant pump were sequentially controlled. The benefits of this novel electric-vehicle-based thermal management system are (a) high-efficiency operation of energy sources, (b) low occupied volume integrated with energy sources, and (c) higher electric vehicle traveling mileage. This system will be integrated with real energy sources and a real electric vehicle in the future.

  6. Formal Verification of Safety Buffers for State-Based Conflict Detection and Resolution

    Science.gov (United States)

    Herencia-Zapana, Heber; Jeannin, Jean-Baptiste; Munoz, Cesar A.

    2010-01-01

    The information provided by global positioning systems is never totally exact, and there are always errors when measuring position and velocity of moving objects such as aircraft. This paper studies the effects of these errors in the actual separation of aircraft in the context of state-based conflict detection and resolution. Assuming that the state information is uncertain but that bounds on the errors are known, this paper provides an analytical definition of a safety buffer and sufficient conditions under which this buffer guarantees that actual conflicts are detected and solved. The results are presented as theorems, which were formally proven using a mechanical theorem prover.
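
    The buffer idea can be illustrated with a toy 2D horizontal-separation check: if each measured position is within eps of the true position, then enlarging the required separation D by 2*eps when testing the measured states guarantees that every actual loss of separation is flagged. The sketch below is our own illustration of that inequality, with made-up numbers; it is not the paper's formal definition, which also accounts for velocity errors and for resolution.

```python
# Toy buffered conflict detection check; numbers and the 2D setting are
# illustrative assumptions only.
import math

def measured_conflict(p1, p2, D, eps):
    """Conservative conflict test on measured 2D positions (distances in NM)."""
    dist = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    return dist < D + 2 * eps        # buffered separation threshold

# Measured positions 4.9 NM apart, required separation 5 NM, 0.1 NM error bound:
print(measured_conflict((0.0, 0.0), (4.9, 0.0), D=5.0, eps=0.1))  # True -> flag conflict
```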

  7. A model based design framework for safety verification of a semi-autonomous inspection drone

    OpenAIRE

    Mcaree, O.; Aitken, J. M.; Veres, S.M.

    2016-01-01

    In this paper, we present a model based design approach to the development of a semi-autonomous control system for an inspection drone. The system is tasked with maintaining a set distance from the target being inspected and a constant relative pose, allowing the operator to manoeuvre the drone around the target with ease. It is essential that the robustness of the autonomous behaviour be thoroughly verified prior to actual implementation, as this will involve the flight of a large multi-roto...

  8. Software-defined Radio Based Wireless Tomography: Experimental Demonstration and Verification

    Energy Technology Data Exchange (ETDEWEB)

    Bonior, Jason D [ORNL; Hu, Zhen [Tennessee Technological University; Guo, Terry N. [Tennessee Technological University; Qiu, Robert C. [Tennessee Technological University; Browning, James P. [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Wicks, Michael C. [University of Dayton Research Institute

    2015-01-01

    This letter presents an experimental demonstration of software-defined-radio-based wireless tomography using computer-hosted radio devices called Universal Software Radio Peripheral (USRP). This experimental brief follows our vision and previous theoretical study of wireless tomography that combines wireless communication and RF tomography to provide a novel approach to remote sensing. Automatic data acquisition is performed inside an RF anechoic chamber. Semidefinite relaxation is used for phase retrieval, and the Born iterative method is utilized for imaging the target. Experimental results are presented, validating our vision of wireless tomography.

  9. On the Viscoelastic Parameters of Gussasphalt Mixture Based on Modified Burgers Model: Deviation and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Faxiang Xie

    2017-01-01

    Full Text Available Viscoelasticity is an important characteristic of gussasphalt mixtures. The aim of this study is to find the correct viscoelastic material parameters of the novel gussasphalt applied in the 4th Yangtze River Bridge based on the modified Burgers model. The study first derives, via the Laplace transformation, the explicit Prony series form of the shear relaxation modulus of the viscoelastic material, to meet the parameter input requirements of commonly used finite element software suites. Second, a uniaxial penetration creep experiment on the gussasphalt mixtures is conducted; by fitting the creep compliance, the viscoelastic parameters characterized by the modified Burgers model are obtained. Third, based on the viscoelastic test data of the asphalt mixtures, the Prony series formula derived in this study is verified through finite element simulation. The comparison of the relative errors between the finite element simulation and the theoretical calculation confirms the reliability of the Prony series formulas deduced in this research. Finally, a stress-correcting method is proposed, which can significantly improve the accuracy of model parameter identification and reduce the relative error between the finite element simulation and the experimental data.
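
    For reference, the generic Prony series that such a derivation targets, and the normalized form typically entered into finite element codes, are written below; the number of terms and the coefficient values obtained from the modified Burgers parameters are specific to the paper and are not reproduced here.

```latex
G(t) \;=\; G_\infty + \sum_{i=1}^{N} G_i\, e^{-t/\tau_i},
\qquad
g_i = \frac{G_i}{G_0}, \quad G_0 = G_\infty + \sum_{i=1}^{N} G_i
```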

  10. Validation of a deformable image registration technique for cone beam CT-based dose verification.

    Science.gov (United States)

    Moteabbed, M; Sharp, G C; Wang, Y; Trofimov, A; Efstathiou, J A; Lu, H-M

    2015-01-01

    As radiation therapy evolves toward more adaptive techniques, image guidance plays an increasingly important role, not only in patient setup but also in monitoring the delivered dose and adapting the treatment to patient changes. This study aimed to validate a method for evaluation of delivered intensity modulated radiotherapy (IMRT) dose based on multimodal deformable image registration (DIR) for prostate treatments. A pelvic phantom was scanned with CT and cone-beam computed tomography (CBCT). Both images were digitally deformed using two realistic patient-based deformation fields. The original CT was then registered to the deformed CBCT, resulting in a secondary deformed CT. The registration quality was assessed as the ability of the DIR method to recover the artificially induced deformations. The primary and secondary deformed CT images as well as the vector fields were compared to evaluate the efficacy of the registration method and its suitability for dose calculation. plastimatch, a free and open-source software package, was used for deformable image registration. A B-spline algorithm with optimized parameters was used to achieve the best registration quality. Geometric image evaluation was performed through voxel-based Hounsfield unit (HU) and vector field comparison. For dosimetric evaluation, IMRT treatment plans were created and optimized on the original CT image and recomputed on the two warped images to be compared. The dose volume histograms were compared for the warped structures that were identical in both warped images. This procedure was repeated for the phantom with full, half-full, and empty bladder. The results indicated mean HU differences of up to 120 between registered and ground-truth deformed CT images. However, when the CBCT intensities were calibrated using a region of interest (ROI)-based calibration curve, these differences were reduced by up to 60%. Similarly, the mean differences in average vector field lengths decreased from 10.1 to 2

  11. Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach

    Directory of Open Access Journals (Sweden)

    Xu Zhi

    2018-01-01

    Full Text Available Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks have been limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed that is based on the symbolic model checking tool NuSMV. A dual PID system is used as an example system, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.

  12. Development and verification of a compact TDC-based data acquisition system for space applications

    Energy Technology Data Exchange (ETDEWEB)

    Losekamm, Martin [Physics Department E18, Technische Universitaet Muenchen (Germany); Institute of Astronautics, Technische Universitaet Muenchen (Germany); Gaisbauer, Dominic; Konorov, Igor; Paul, Stephan; Poeschl, Thomas [Physics Department E18, Technische Universitaet Muenchen (Germany)

    2015-07-01

    Advances in solid-state detectors, in particular those for the detection of photons, have made their application in space systems increasingly attractive in recent years. The use of, for example, silicon photomultipliers (SiPMs) paired with a suitable scintillating material allows the development of compact and lightweight particle detectors. The Antiproton Flux in Space experiment (AFIS) intends to measure the flux of antiprotons trapped in Earth's magnetosphere aboard a nanosatellite using an active-target tracking detector consisting of plastic scintillating fibers read out by SiPMs. In order to implement a large number of detector channels while adhering to the given space, mass and power constraints, the development of a compact TDC-based data acquisition system was proposed. This talk presents a current prototype featuring 900 channels, real-time multi-channel temperature measurement and bias regulation. Possible alternative applications as well as the next steps in the development are also discussed.

  13. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)

    1998-01-01

    Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model and experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al. are compared. All data predicted are inside the ±10% error band, with the mean averaged error being below 4% using the slightly modified analytical solution. The solution obtained is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental data base obtained by Liu and Theofanous. The data are predicted within a ±30% error band. (author)

  14. Verification for Different Contrail Parameterizations Based on Integrated Satellite Observation and ECMWF Reanalysis Data

    Directory of Open Access Journals (Sweden)

    Jinglin Zhang

    2017-01-01

    Full Text Available Aviation-induced clouds, termed contrails, play an increasingly important role in climate change and make a significant contribution to anthropogenic climate forcing by affecting cirrus coverage at the intersection of the troposphere and stratosphere. In this paper, we propose a novel automatic contrail detection method based on Himawari-8 stationary satellite imagery and two kinds of potential contrail coverage (PCC1 and PCC2) from the contrail parameterizations in ECHAM4 and HadGEM2. In addition, we propose a new climatological index called contrail occurrence and persistence (COP). Using algorithm identification (AI) and artificial visual inspection (AVI), COP measured from Himawari-8 stationary satellite imagery is related to upper-tropospheric relative humidity over ice (RHI), computed from the ECMWF reanalysis data, by simple linear regression. Similarly, we compared the linear correlation between COP and the PCC fractions and found that PCC1 corresponds better with COP than PCC2.

  15. Experimental Verification of Electric Drive Technologies Based on Artificial Intelligence Tools

    Science.gov (United States)

    Rubaai, Ahmed; Kankam, David (Technical Monitor)

    2003-01-01

    A laboratory implementation of a fuzzy logic-tracking controller using a low cost Motorola MC68HC11E9 microprocessor is described in this report. The objective is to design the most optimal yet practical controller that can be implemented and marketed, and which gives respectable performance, even when the system loads, inertia and parameters are varying. A distinguishing feature of this work is the by-product goal of developing a marketable, simple, functional and low cost controller. Additionally, real-time nonlinearities are not ignored, and a mathematical model is not required. A number of components have been designed, built and tested individually, and in various combinations of hardware and software segments. These components have been integrated with a brushless motor to constitute the drive system. A microprocessor-based FLC is incorporated to provide robust speed and position control. Design objectives that are difficult to express mathematically can be easily incorporated in a fuzzy logic-based controller by linguistic information (in the form of fuzzy IF-THEN rules). The theory and design are tested in the laboratory using a hardware setup. Several test cases have been conducted to confirm the effectiveness of the proposed controller. The results indicate excellent tracking performance for both speed and position trajectories. For the purpose of comparison, a bang-bang controller has been tested. The fuzzy logic controller performs significantly better than the traditional bang-bang controller. The bang-bang controller has been shown to be relatively inaccurate and lacking in robustness. Description of the implementation hardware system is also given.

  16. Verification of Compressible and Incompressible Computational Fluid Dynamics Codes and Residual-based Mesh Adaptation

    Science.gov (United States)

    Choudhary, Aniruddha

    Code verification is the process of ensuring, to the degree possible, that there are no algorithm deficiencies and coding mistakes (bugs) in a scientific computing simulation. In this work, techniques are presented for performing code verification of boundary conditions commonly used in compressible and incompressible Computational Fluid Dynamics (CFD) codes. Using a compressible CFD code, this study assesses the subsonic inflow (isentropic and fixed-mass), subsonic outflow, supersonic outflow, no-slip wall (adiabatic and isothermal), and inviscid slip-wall. The use of simplified curved surfaces is proposed for easier generation of manufactured solutions during the verification of certain boundary conditions involving many constraints. To perform rigorous code verification, general grids with mixed cell types at the verified boundary are used. A novel approach is introduced to determine manufactured solutions for boundary condition verification when the velocity field is constrained to be divergence-free during the simulation in an incompressible CFD code. Order of accuracy testing using the Method of Manufactured Solutions (MMS) is employed here for code verification of the major components of an open-source, multiphase flow code - MFIX. The presence of two-phase governing equations and a modified SIMPLE-based algorithm requiring divergence-free flows makes the selection of manufactured solutions more involved than for single-phase, compressible flows. Code verification is performed here on 2D and 3D, uniform and stretched meshes for incompressible, steady and unsteady, single-phase and two-phase flows using the two-fluid model of MFIX. In a CFD simulation, truncation error (TE) is the difference between the continuous governing equation and its discrete approximation. Since TE can be shown to be the local source term for the discretization error, TE is proposed as the criterion for determining which regions of the computational mesh should be refined/coarsened. For mesh
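
    The MMS workflow mentioned above can be illustrated with a minimal, self-contained example that is not tied to MFIX: choose a manufactured solution, derive the forcing term symbolically, and compute the observed order of accuracy from errors on two meshes. The operator, coefficients and error values below are assumptions chosen purely for illustration.

```python
# Minimal illustration of the Method of Manufactured Solutions (MMS) for a
# 1D steady advection-diffusion operator; not the MFIX verification suite.
import math
import sympy as sp

x = sp.symbols('x')
c, nu = 1.0, 0.1                       # chosen advection speed and diffusivity
u_m = sp.sin(sp.pi * x)                # manufactured solution

# Source term forcing u_m to satisfy  c*u' - nu*u'' = S(x)
S = sp.simplify(c * sp.diff(u_m, x) - nu * sp.diff(u_m, x, 2))
print("manufactured source term:", S)

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Observed order of accuracy from discretization errors on two meshes."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# Hypothetical error norms measured against u_m on meshes of spacing h and h/2:
print("observed order:", observed_order(4.0e-3, 1.0e-3))   # ~2 -> second order
```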

  17. Damping Characterization of Friction Energy Dissipation for Particle Systems Based on Powder Mechanics and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Wangqiang XIAO

    2014-04-01

    Full Text Available We established a friction energy dissipation model for particle damping based on powder mechanics. We investigated the influence of the geometric features of the damper on the damping characteristics; the geometric features studied were the depth and length of the rectangular particle container. The work done by the frictional force between the particle layers and the effect of the particle filling rate on the vibration damping characteristics were also explored. We analyzed the friction energy dissipation model and the relationship between the particle filling rate and the vibration damping. The experimental results show good agreement with the friction energy dissipation model, which verifies the proposed simulation prediction. The results show that particle damping technology can greatly dissipate the structure's kinetic energy, and that the vibration reduction effect of particle damping depends mainly on the interaction of the particles near the top. A proper filling rate of the particle system results in an optimal vibration reduction effect, which will provide engineering applications with theoretical guidance and design criteria.

  18. Verification of Flood Wave Propagation Model of the Caspian Sea Based on the Satellite Altimetry Data

    Science.gov (United States)

    Lebedev, Sergey

    In this research, a simple flood-wave propagation model based on the Saint-Venant equations was used; these equations provide a good way to describe problems concerning flood-wave propagation in open channels. For the solution of this task the Caspian Sea was approximated as a channel with a rectangular section, whose axis coincides with the sea's longitudinal axis or with the location of descending pass 092 of the satellites TOPEX/Poseidon and Jason-1/2. Altimetric measurements from these satellites permit a more exact definition of the empirical parameters of the flood wave (propagation speed, amplitude, etc.) which form the solution of the model. They also allow an estimate of effective evaporation, which in this approach can be considered as the integrated difference between sea surface heights in the previous and subsequent cycles of altimetric measurements. The results of the calculations show good agreement between the model and values calculated by other researchers. It is shown that the interannual variability of flood-wave speed in the North Caspian correlates well with the interannual variability of the Caspian Sea level. This study was supported by a grant from the Russian Foundation for Basic Research.
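
    For context, the 1D Saint-Venant equations underlying such a model are usually written as below (standard textbook form for a rectangular channel; the paper's specific simplifications and empirically fitted parameters are not reproduced here).

```latex
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = 0,
\qquad
\frac{\partial Q}{\partial t}
  + \frac{\partial}{\partial x}\!\left(\frac{Q^{2}}{A}\right)
  + g A \frac{\partial h}{\partial x}
  = g A \left(S_0 - S_f\right)
```
    Here A = Bh is the wetted cross-section of a channel of width B, Q the discharge, h the water depth, S_0 the bed slope and S_f the friction slope.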

  19. Implementation and verification of different ECC mitigation designs for BRAMs in flash-based FPGAs

    Science.gov (United States)

    Yang, Zhen-Lei; Wang, Xiao-Hui; Zhang, Zhan-Gang; Liu, Jie; Su, Hong

    2016-04-01

    Embedded RAM blocks (BRAMs) in field programmable gate arrays (FPGAs) are susceptible to single event effects (SEEs) induced by environmental factors such as cosmic rays, heavy ions, alpha particles and so on. As technology scales, the issue will be more serious. In order to tackle this issue, two different error correcting codes (ECCs), the shortened Hamming codes and shortened BCH codes, are investigated in this paper. The concrete design methods of the codes are presented. Also, the codes are both implemented in flash-based FPGAs. Finally, the synthesis report and simulation results are presented in the paper. Moreover, heavy-ion experiments are performed, and the experimental results indicate that the error cross-section of the device using the shortened Hamming codes can be reduced by two orders of magnitude compared with the device without mitigation, and no errors are discovered in the experiments for the device using the shortened BCH codes. Supported by National Natural Science Foundation of China (11079045, 11179003 and 11305233)
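
    As a reminder of the ECC principle behind both designs, the sketch below encodes four data bits with a textbook Hamming(7,4) code and corrects a single injected bit flip via the syndrome. The generator and parity-check matrices are the standard ones; the shortened Hamming and BCH codes actually implemented for the BRAM word width are not reproduced here.

```python
# Textbook Hamming(7,4) single-error correction in GF(2); illustrative only.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],      # generator matrix (systematic form)
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],      # parity-check matrix, H @ G.T = 0 (mod 2)
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data4):
    return (np.array(data4) @ G) % 2

def correct(word7):
    s = (H @ word7) % 2                          # syndrome
    if s.any():                                  # non-zero -> locate the flipped bit
        col = next(i for i in range(7) if np.array_equal(H[:, i], s))
        word7 = word7.copy()
        word7[col] ^= 1
    return word7

cw = encode([1, 0, 1, 1])
cw[2] ^= 1                                       # inject a single-bit upset
assert np.array_equal(correct(cw), encode([1, 0, 1, 1]))
```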

  20. Verification of ceramic structures

    NARCIS (Netherlands)

    Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.

    2012-01-01

    In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and

  1. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  2. The SeaHorn Verification Framework

    Science.gov (United States)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
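
    To give a flavour of the Horn-clause intermediate form, the safety of a small loop such as x := 0; while (*) x := x + 1; assert(x >= 0) can be encoded as constrained Horn clauses over an unknown inductive invariant Inv, as sketched below; this generic example is ours and is not taken from the SeaHorn paper.

```latex
x = 0 \;\Rightarrow\; \mathit{Inv}(x),
\qquad
\mathit{Inv}(x) \;\Rightarrow\; \mathit{Inv}(x+1),
\qquad
\mathit{Inv}(x) \;\Rightarrow\; x \ge 0
```
    A Horn-clause solver discharges the verification condition by exhibiting a model for Inv, for instance Inv(x) ≡ x ≥ 0.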

  3. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  4. Termination Proofs for String Rewriting Systems via Inverse Match-Bounds

    Science.gov (United States)

    Butler, Ricky (Technical Monitor); Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2004-01-01

    Annotating a letter by a number, one can record information about its history during a reduction. A string rewriting system is called match-bounded if there is a global upper bound on these numbers. In earlier papers we established match-boundedness as a strong sufficient criterion for both termination and preservation of regular languages. We now show that string rewriting systems whose inverse (left- and right-hand sides exchanged) is match-bounded also have exceptional properties, but slightly different ones. Inverse match-bounded systems effectively preserve context-free languages; their sets of normalized strings and their sets of immortal strings are effectively regular. These sets of strings can be used to decide the normalization, termination and uniform termination problems of inverse match-bounded systems. We also show that the termination problem is decidable in linear time, and that a certain strong reachability problem is decidable, thus solving two open problems of McNaughton's.
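
    A minimal sketch of the annotation idea described above follows: every letter carries a number ("height"), and letters written by a rule receive one plus the minimum height occurring in the matched redex; a system is match-bounded if these heights never exceed a fixed bound. The rule set and the bounded derivation below are illustrative assumptions, not a decision procedure for match-boundedness.

```python
# Annotated ("match height") string rewriting; illustrative sketch only.

def rewrite_step(word, rules):
    """word is a list of (letter, height) pairs; apply the first applicable rule."""
    for lhs, rhs in rules:
        n = len(lhs)
        for i in range(len(word) - n + 1):
            window = word[i:i + n]
            if [c for c, _ in window] == list(lhs):
                h = 1 + min(k for _, k in window)        # match-height annotation
                return word[:i] + [(c, h) for c in rhs] + word[i + n:]
    return None                                          # normal form reached

def max_height(start, rules, max_steps=100):
    """Largest annotation seen along one (bounded) derivation from `start`."""
    word, top = [(c, 0) for c in start], 0
    for _ in range(max_steps):
        nxt = rewrite_step(word, rules)
        if nxt is None:
            break
        word = nxt
        top = max(top, max(h for _, h in word))
    return top

# Example system {ab -> ba} applied to "aabb":
print(max_height("aabb", [("ab", "ba")]))
```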

  5. Alignment Verification in the Early Stage of Service Design

    OpenAIRE

    Tapandjieva, Gorica; Filipponi, Matteo; Wegmann, Alain

    2017-01-01

    Verification is a costly task, sometimes burdensome and tedious, requiring a strong formal background. To reduce the effort and cost invested in verification, we developed a model-driven approach for automatic verification of service properties, performed in the early service design phase. Our approach is based on SEAM, a service modeling method, and it incorporates a verification system called Leon. With our approach service designers do not need substantial understanding of specific formal and ver...

  6. Mansfield as (post)colonial-modernist: rewriting the contract with death

    OpenAIRE

    Wilson, Janet M

    2013-01-01

    Mansfield’s rewriting of ‘the contract with death’ explores the implications for her late work of her vow to commemorate in prose the life of her brother, Leslie Heron Beauchamp, and so restore him metaphorically to life. Drawing on recent studies of late Victorian and early twentieth-century cultural practices of the occult, telepathy and mediumship through seances, especially after World War One, the article suggests that Mansfield turned to alternative cultural and communicative transmissi...

  7. Journalism’s Rewriting of History in Reporting the Arab Spring

    OpenAIRE

    Hanne Jørndrup

    2012-01-01

    Investigation of journalism’s role as writer and rewriter of the record of political episodes of world importance is central to this article, which takes an empirical approach in choosing the Danish press coverage of The Arab Spring as its starting point. The article analyses how a number of historical references to, in particular, European revolutionary history from Eastern Europe in 1989, are woven into the journalistic descriptions of events in Tunisia and Egypt. But the ana...

  8. A Method Based on Active Appearance Model and Gradient Orientation Pyramid of Face Verification as People Age

    Directory of Open Access Journals (Sweden)

    Ji-Xiang Du

    2014-01-01

    Full Text Available Face verification in the presence of age progression is an important problem that has not been widely addressed. In this paper, we propose to use the active appearance model (AAM) and a gradient orientation pyramid (GOP) feature representation for this problem. First, we apply the AAM to the dataset and generate the AAM images; we then compute the gradient orientation representation on a hierarchical (pyramid) model, which constitutes the GOP appearance. When combined with a support vector machine (SVM), experimental results show that our approach has excellent performance on two public-domain face aging datasets: FGNET and MORPH. Second, we compare the performance of the proposed methods with a number of related face verification methods; the results show that the new approach is more robust and performs better.
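
    A minimal sketch of the gradient orientation pyramid idea follows: at each pyramid level the image is reduced to a unit gradient-direction field, and the per-level fields can then be compared (e.g. via cosine similarity) before classification with an SVM. The number of levels, the smoothing and the downsampling choices here are assumptions, not the exact construction used in the paper.

```python
# Gradient orientation pyramid sketch; levels, sigma and zoom are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def gradient_orientation_pyramid(image, levels=3, eps=1e-6):
    pyramid, img = [], image.astype(float)
    for _ in range(levels):
        gy, gx = np.gradient(img)
        mag = np.sqrt(gx ** 2 + gy ** 2) + eps
        pyramid.append(np.stack([gx / mag, gy / mag], axis=-1))  # unit orientations
        img = zoom(gaussian_filter(img, sigma=1.0), 0.5)         # smooth + downsample
    return pyramid

# Two face images can then be compared level by level, e.g. via the cosine of
# the angle between corresponding orientation vectors, before an SVM decides
# "same person" vs. "different person".
```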

  9. An automatic and robust point cloud registration framework based on view-invariant local feature descriptors and transformation consistency verification

    Science.gov (United States)

    Cheng, Xu; Li, Zhongwei; Zhong, Kai; Shi, Yusheng

    2017-11-01

    This paper presents an automatic and robust framework for simultaneously registering pairwise point clouds and identifying the correctness of registration results. Given two partially overlapping point clouds with arbitrary initial positions, a view-invariant local feature descriptor is utilized to build sparse correspondence. A geometry constraint sample consensus (GC-SAC) algorithm is proposed to prune correspondence outliers and obtain an optimal 3D transformation hypothesis. Furthermore, by measuring the similarity between the estimated local and global transformations, a transformation consistency verification method is presented to efficiently detect potential registration failures. Our method provides reliable registration correctness verification even when two point clouds are only roughly registered. Experimental results demonstrate that our framework exhibits high levels of effectiveness and robustness for automatic registration.

  10. Journalism’s Rewriting of History in Reporting the Arab Spring

    Directory of Open Access Journals (Sweden)

    Hanne Jørndrup

    2012-05-01

    Full Text Available Investigation of journalism’s role as writer and rewriter of the record of political episodes of world importance is central to this article, which takes an empirical approach in choosing the Danish press coverage of The Arab Spring as its starting point. The article analyses how a number of historical references to, in particular, European revolutionary history from Eastern Europe in 1989, are woven into the journalistic descriptions of events in Tunisia and Egypt. But the analysis also reflects on journalism’s own historical precedents in that field. Therefore, this paper takes the topics and circumstances that put Tunisia and Egypt on the Danish media’s agenda in the year before the Arab revolutions as a starting point. The central point of this comparison is to convey how journalism, while describing contemporary events of The Arab Spring, at the same time rewrites its own prior commentary on the region. Rewriting history in this way gives journalism a neutral and unassailable position as observer of events of world-wide importance, but it brings in its train other problems with staying true to both the readers and to unfolding events.

  11. Design and implementation of embedded hardware accelerator for diagnosing HDL-CODE in assertion-based verification environment

    Directory of Open Access Journals (Sweden)

    C. U. Ngene

    2013-08-01

    Full Text Available The use of assertions for monitoring the designer's intention in a hardware description language (HDL) model is gaining popularity, as it helps the designer to observe internal errors at the output ports of the device under verification. During verification, assertions are synthesised and the generated data are represented in tabular form. The amount of data generated can be enormous depending on the size of the code and the number of modules that constitute the code. Furthermore, manually inspecting these data and diagnosing the module with a functional violation is a time-consuming process which negatively affects the overall product development time. To locate the module with a functional violation within an acceptable diagnostic time, the data processing and analysis procedure must be accelerated. In this paper a multi-array processor (hardware accelerator) was designed and implemented in a Virtex6 field programmable gate array (FPGA), and it can be integrated into the verification environment. The design was captured in very high speed integrated circuit HDL (VHDL), synthesised with the Xilinx design suite ISE 13.1 and simulated with Xilinx ISIM. The multi-array processor (MAP) executes three logical operations (AND, OR, XOR) and a one's compaction operation on arrays of data in parallel. An improvement in processing and analysis time was recorded, compared to the manual procedure, after the multi-array processor was integrated into the verification environment. It was also found that the multi-array processor, which was developed as an Intellectual Property (IP) core, can also be used in applications where output responses and a golden model that are represented in the form of matrices must be compared for searching, recognition and decision-making.

  12. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    Directory of Open Access Journals (Sweden)

    Boeken Kruger Arto E

    2008-05-01

    Full Text Available We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU toxicity and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months, with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation if the same dose constraints for the organs at risk are used.

  13. Improved method for coliform verification.

    OpenAIRE

    Diehl, J D

    1991-01-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay.

  14. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint based identification is becoming more popular. The most widely used fingerprint representation is the minutiae based representation. The main drawback with this representation is that it does not utilize a significant component of the rich discriminatory information available in the fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter bank based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
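
    As a rough illustration of the filter bank idea, the sketch below builds a small bank of oriented Gabor filters and turns the block-wise response energy of a grayscale fingerprint image (a 2D NumPy array) into a fixed-length feature vector compared by Euclidean distance. The kernel size, ridge frequency, grid size and decision threshold are placeholder assumptions, not the parameters of the system implemented in the study.

      import numpy as np
      from scipy.signal import convolve2d

      def gabor_kernel(theta, freq=0.1, sigma=4.0, size=17):
          """Real part of a Gabor filter tuned to ridge frequency `freq` and orientation `theta`."""
          half = size // 2
          y, x = np.mgrid[-half:half + 1, -half:half + 1]
          xr = x * np.cos(theta) + y * np.sin(theta)
          yr = -x * np.sin(theta) + y * np.cos(theta)
          return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

      def filterbank_features(img, n_orientations=8, grid=8):
          """Filter the image with a bank of oriented Gabor filters and return the
          block-wise standard deviation of each response as a fixed-length feature vector."""
          feats = []
          for k in range(n_orientations):
              resp = convolve2d(img, gabor_kernel(np.pi * k / n_orientations), mode="same")
              h, w = resp.shape
              for i in range(0, h - grid + 1, grid):
                  for j in range(0, w - grid + 1, grid):
                      feats.append(resp[i:i + grid, j:j + grid].std())
          return np.asarray(feats)

      def verify(enrolled, query, threshold=0.5):
          """Accept the claim if the normalized feature distance falls below a threshold."""
          return np.linalg.norm(enrolled - query) / len(enrolled) < threshold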

  15. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey...

  16. TFE Verification Program

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  17. Parking Space Verification

    DEFF Research Database (Denmark)

    Høg Peter Jensen, Troels; Thomsen Schmidt, Helge; Dyremose Bodin, Niels

    2018-01-01

    With the number of privately owned cars increasing, the issue of locating an available parking space becomes apparent. This paper deals with the verification of vacant parking spaces, by using a vision based system looking over parking areas. In particular the paper proposes a binary classifier system, based on a Convolutional Neural Network, that is capable of determining if a parking space is occupied or not. A benchmark database consisting of images captured from different parking areas, under different weather and illumination conditions, has been used to train and test the system. The system shows promising performance on the database with an accuracy of 99.71% overall and is robust to the variations in parking areas and weather conditions.
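
    A minimal sketch of a binary occupied/vacant classifier of the kind described is given below, written with PyTorch; the layer sizes and the 64×64 input patch are assumptions for illustration, not the authors' architecture.

      import torch
      import torch.nn as nn

      class ParkingSpaceNet(nn.Module):
          """Small CNN that maps a 64x64 RGB patch of one parking space to occupied/vacant."""
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
                  nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
              )
              self.classifier = nn.Linear(64 * 8 * 8, 2)  # two classes: vacant / occupied

          def forward(self, x):
              x = self.features(x)
              return self.classifier(x.flatten(1))

      # Toy forward pass on a batch of cropped parking-space patches.
      model = ParkingSpaceNet()
      patches = torch.rand(4, 3, 64, 64)
      pred = model(patches).argmax(dim=1)  # 0 = vacant, 1 = occupied (labels assumed)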

  18. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
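
    A standard ingredient of such convergence analysis is the observed order of accuracy computed from solutions on three systematically refined grids; the sketch below shows this textbook calculation under the assumption of a constant refinement ratio, and is not tied to any particular ASC code.

      import math

      def observed_order(f_coarse, f_medium, f_fine, refinement_ratio=2.0):
          """Observed order of accuracy p from the same quantity computed on three grids,
          assuming a constant refinement ratio r between successive grids:
              p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
          """
          return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(refinement_ratio)

      # Example: a second-order scheme should give p close to 2.
      print(observed_order(1.040, 1.010, 1.0025))  # ~2.0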

  19. Baghouse filtration products verification

    Energy Technology Data Exchange (ETDEWEB)

    Mycock, J.C.; Turner, J.H.; VanOsdell, D.W.; Farmer, J.R.; Brna, T.G.

    1998-11-01

    The paper introduces EPA's Air Pollution Control Technology Verification (APCT) program and then focuses on the immediate objective of the program: laboratory performance verification of cleanable filter media intended for the control of fine particulate emissions. Data collected during the laboratory verification testing, which simulates operation in full-scale fabric filters, will be used to show expected performance for collection of particles ≤ 2.5 micrometers in diameter.

  20. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications

    Directory of Open Access Journals (Sweden)

    Kobayashi Hiroki

    2012-04-01

    Full Text Available Abstract Background Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. Results We applied our protocol of predicting target proteins combining in silico screening and experimental verification for incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins, whose expression can be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved successfully, and we newly found 3 incednine-binding proteins. Conclusions This study revealed that our proposed protocol of predicting target protein combining in silico screening and experimental verification is useful, and provides new insight into a strategy for identifying target proteins of small molecules.

  1. Program Verification and System Dependability

    Science.gov (United States)

    Jackson, Michael

    Formal verification of program correctness is a long-standing ambition, recently given added prominence by a “Grand Challenge” project. Major emphases have been on the improvement of languages for program specification and program development, and on the construction of verification tools. The emphasis on tools commands general assent, but while some researchers focus on narrow verification aimed only at program correctness, others want to pursue wide verification aimed at the larger goal of system dependability. This paper presents an approach to system dependability based on problem frames and suggests how this approach can be supported by formal software tools. Dependability is to be understood and evaluated in the physical and human problem world of a system. The complexity and non-formal nature of the problem world demand the development and evolution of normal designs and normal design practices for specialised classes of systems and subsystems. The problem frames discipline of systems analysis and development that can support normal design practices is explained and illustrated. The role of formal reasoning in achieving dependability is discussed and some conceptual, linguistic and software tools are suggested.

  2. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  3. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  4. The Politics of Rewriting History: New History Textbooks and Curriculum Materials in Russia

    Science.gov (United States)

    Zajda, Joseph; Zajda, Rea

    2003-07-01

    The collapse of communism in Russia in 1991 necessitated, among other things, the rewriting of school history textbooks, which had been dominated by Marxist-Leninist interpretations of historical events. The aim of this article is to evaluate the new postcommunist history taught in upper secondary schools, giving particular attention to how the models for Russian identity presented in the new textbooks redefine legitimate culture for students. Attention will also be given to the multiple perspectives on history that textbooks and other curriculum materials emphasize; these new methods contrast with the grand narrative that dominated the study of history before 1991.

  5. Rewritable Optical Storage with a Spiropyran Doped Liquid Crystal Polymer Film.

    Science.gov (United States)

    Petriashvili, Gia; De Santo, Maria Penelope; Devadze, Lali; Zurabishvili, Tsisana; Sepashvili, Nino; Gary, Ramla; Barberi, Riccardo

    2016-03-01

    Rewritable optical storage has been obtained in a spiropyran doped liquid crystal polymer film. Pictures can be recorded on films upon irradiation with UV light passing through a grayscale mask and they can be rapidly erased using visible light. Films present improved photosensitivity and optical contrast, good resistance to photofatigue, and high spatial resolution. These photochromic films work as a multifunctional, dynamic photosensitive material with a real-time image recording feature. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Verification is experimentation!

    NARCIS (Netherlands)

    Brinksma, Hendrik

    2001-01-01

    The formal verification of concurrent systems is usually seen as an example par excellence of the application of mathematical methods to computer science. Although the practical application of such verification methods will always be limited by the underlying forms of combinatorial explosion, recent

  7. Verification Is Experimentation!

    NARCIS (Netherlands)

    Brinksma, Hendrik

    2000-01-01

    The formal verification of concurrent systems is usually seen as an example par excellence of the application of mathematical methods to computer science. Although the practical application of such verification methods will always be limited by the underlying forms of combinatorial explosion, recent

  8. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available The performance of innovative environmental technologies can be verified by qualified third parties called "Verification Bodies". The "Statement of Verification" delivered at the end of the ETV process can be used as evidence that the claims made about...

  9. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  10. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
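
    To give the flavour of such algorithms, the sketch below implements a generic fault-tolerant averaging step (discard the f smallest and f largest remote clock readings and average the rest). It is an illustrative textbook-style function, not the specific algorithm verified with EHDM.

      def fault_tolerant_average(readings, f):
          """One resynchronization step: given clock readings gathered from all replicas,
          discard the f smallest and f largest values (which bounds the influence of up
          to f Byzantine-faulty clocks) and return the mean of the remainder."""
          if len(readings) <= 2 * f:
              raise ValueError("need more than 2f readings to tolerate f faults")
          trimmed = sorted(readings)[f:len(readings) - f]
          return sum(trimmed) / len(trimmed)

      # Example with n = 7 clocks, two of which report arbitrary (possibly malicious) values.
      print(fault_tolerant_average([10.02, 9.98, 10.01, 10.00, 9.99, 55.0, -3.0], f=2))  # ~10.0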

  11. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others]

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  12. Girls on the Borderline: Rewriting the Rite of Passage Film

    OpenAIRE

    Steiner, Esther

    2011-01-01

    Girl protagonists in rite of passage films regularly come to be burdened with a sobering maturity that sees them acquire a dysphoric subjective position under an oppressive patriarchal paradigm. According to Oedipal logics, both genders, in extricating themselves from the imaginary fullness of the maternal bond, come to be subjects of lack, but culturally entrenched patriarchal fictions concur in fostering masculine narcissism at the expense of the feminine. This practice-based research asks ...

  13. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning

    Science.gov (United States)

    Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J.

    2010-08-01

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC-based
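
    The 2%/1 mm and ±4%/2 mm figures refer to combined dose-difference/distance-to-agreement (gamma-style) criteria. The sketch below computes a simplified 1D gamma index purely for illustration; it is not the film and ionization chamber analysis actually performed in the study.

      import numpy as np

      def gamma_1d(positions, dose_ref, dose_eval, dd=0.02, dta=1.0):
          """Simplified 1D gamma index: for every reference point, search all evaluated
          points for the minimum combined dose-difference / distance-to-agreement metric.
          dd is the dose criterion (fraction of the max reference dose), dta is in mm."""
          dmax = dose_ref.max()
          gammas = []
          for x_r, d_r in zip(positions, dose_ref):
              dist2 = ((positions - x_r) / dta) ** 2
              dose2 = ((dose_eval - d_r) / (dd * dmax)) ** 2
              gammas.append(np.sqrt(np.min(dist2 + dose2)))
          return np.array(gammas)

      # A point "passes" when gamma <= 1 for the chosen criteria (e.g. 2%/1 mm).
      x = np.linspace(0, 50, 251)                  # positions in mm
      ref = np.exp(-((x - 25) / 8) ** 2)           # toy measured profile
      ev = np.exp(-((x - 25.4) / 8) ** 2)          # toy calculated profile, shifted 0.4 mm
      pass_rate = (gamma_1d(x, ref, ev) <= 1).mean()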

  14. The Pi-0-2-Completeness of most of the Properties of Rewriting You Care About (and Productivity)

    DEFF Research Database (Denmark)

    Simonsen, Jakob Grue

    2009-01-01

    Most of the standard pleasant properties of term rewriting systems are undecidable; to wit: local confluence, confluence, normalization, termination, and completeness. Mere undecidability is insufficient to rule out a number of possibly useful properties: For instance, if the set of normalizing...... of being a productive specification of a stream) complete for the class $\Pi^0_2$. Thus, there is neither a program that can enumerate the set of rewriting systems enjoying any one of the properties, nor is there a program enumerating the set of systems that do not. For normalization and termination we...

  15. WE-EF-303-03: A New Aperture-Based Imaging System for Prompt-Gamma Range Verification of Proton Beam Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Ready, J; Pak, R [UC Berkeley, Berkeley, CA (United States); Mihailescu, L [Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Vetter, K [UC Berkeley, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2015-06-15

    Purpose: To develop and characterize a novel aperture-based imaging system for high-energy gamma-rays. This collimated system will provide 2-dimensional imaging capability for verification of proton beam range and Bragg peak dose via prompt-gamma detection. Methods: A multi-knife-edge slit collimator has been designed, constructed, and characterized via simulations and experimental measurements. The 20×20×7.5 cm³ tungsten collimator and accompanying LSO scintillation detector were simulated using the TOPAS Geant4-based Monte Carlo package. Iterative reconstruction methods were combined with point response functions to characterize the imaging performance of the system. The response of the system has begun to be characterized experimentally as well, using 2.6 MeV gamma-rays from Th-228. Results: Both simulation and experimental results indicate that this collimated system provides 2-D imaging capability in the energy range of interest for prompt-gamma dose verification. In the current configuration, with collimator to source distance of 13 cm, image reconstruction of point sources resulted in spatial resolution (FWHM) of approximately 4 mm in both x- and y-directions in the imaging plane. The accuracy of positioning the point sources is less than 1 mm. Conclusion: This work has characterized, via simulation and measurements, a novel multi-knife-edge slit collimator in front of a more conventional position-sensitive LSO scintillator detector. The multi-slit pattern is designed to increase detection efficiency and provide spatial information in 2-dimensions -- an improvement over a single-slit collimator design. The thickness and density of the collimator will allow this detection system to perform well in an environment with high gamma flux, while ultimately providing peak determination accuracy on the order of 1 mm. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number: DE-NA0000979
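
    The iterative reconstruction mentioned above is commonly an MLEM-style update that combines measured detector counts with the system (point-response) matrix; the generic iteration below is a sketch of that principle, not the group's actual reconstruction code.

      import numpy as np

      def mlem(system_matrix, counts, n_iter=50):
          """Generic MLEM reconstruction: system_matrix[i, j] is the probability that a
          gamma emitted in image pixel j is recorded in detector bin i; counts are the
          measured detector data. Returns the estimated emission image (flattened)."""
          image = np.ones(system_matrix.shape[1])          # flat initial estimate
          sensitivity = system_matrix.sum(axis=0)          # per-pixel sensitivity
          for _ in range(n_iter):
              expected = system_matrix @ image             # forward projection
              ratio = counts / np.maximum(expected, 1e-12) # measured / expected
              image *= (system_matrix.T @ ratio) / np.maximum(sensitivity, 1e-12)
          return image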

  16. Kleene Algebra and Bytecode Verification

    Science.gov (United States)

    2016-04-27

    Bytecode 2005 Preliminary Version. Kleene Algebra and Bytecode Verification, Lucja Kot and Dexter Kozen, Department of Computer Science, Cornell... first-order methods that inductively annotate program points with abstract values. In [6] we introduced a second-order approach based on Kleene algebra... form a left-handed Kleene algebra. The dataflow labeling is not achieved by inductively labeling the program with abstract values, but rather by

  17. Good grief: Lord of the Flies as a post-war rewriting of salvation history

    Directory of Open Access Journals (Sweden)

    M. van Vuuren

    2004-07-01

    Full Text Available Golding’s Lord of the Flies, first published in 1954, reflects a bleak sense of post-war pessimism. But with undue attention focused on its portrayal of original sin and the problem of evil, readings have often remained reductive. In this article it is argued that the novel’s symbolic narrative is polysemic and, when it is read as anagogic myth, may be seen to span Judaeo-Christian Heilsgeschichte or salvation history, rewriting its chapters of creation, Fall, the problem of evil, the failure of law, the hope of salvation, the mission of a messianic figure, and – in the clearest departure from the Biblical narrative – an ambiguous representation of his return. This study examines the novel’s often paradoxical symbolism using Frye’s phases of anagogic myth, with its poles of apocalyptic and demonic imagery. It traces the relation of symbols to their counterparts in Biblical narratives, drawn in particular from the symbolic writings of the origin and end of humanity, to elucidate Golding’s bleak but certainly not hopeless rewriting of the salvation story for a post-faith readership.

  18. Like Water for Chocolate: The Rewriting of the Female Experience and Its Parallels in Philippine History

    Directory of Open Access Journals (Sweden)

    Marikit Tara Alto Uychoco

    2012-06-01

    Full Text Available This article focuses on Laura Esquivel’s Like Water for Chocolate and reads the novel using the literary theories of the “new mestiza,” postcolonial theories, feminist theories, and historiographic metafiction. It seeks to find out how this novel rewrites the female experience of the Mexican Revolution, and the various techniques used in the rewriting of history. It reads the novel from a “new mestiza” feminist perspective, which enables the Filipina reader to find commonalities in the Mexican woman’s struggle in Mexican history and society, and finds ways to help her appreciate the Filipina’s struggle in Philippine history and society. The theories of historiographic metafiction are grounded in Linda Hutcheon’s theories about historiography, or the writing of history, and metafiction, or fiction that makes us aware of the craft of fiction. The theories regarding the “new mestiza” consciousness are from Gloria Anzaldua. This is a feminist theory that is contextualized on the historic oppression of women during Spanish colonization and its resulting patriarchal structures in society, and how women can seek to free themselves from such residual structures. Finally, the article touches upon a Filipina feminist perspective on the novel and what it signifies for the Philippine female experience.

  19. History and the Popular: Rewriting National Origins at the Argentine Bicentenary

    Directory of Open Access Journals (Sweden)

    Catriona McAllister

    2016-04-01

    Full Text Available This article explores two texts that offer a self-conscious, metafictional rewriting of Argentina’s founding revolution in May 1810 at the time of the nation’s Bicentenary. It aims to draw out the political focus of both texts (a novel by Washington Cucurto and a play by Manuel Santos Iñurrieta) by analysing the ways in which they draw on heavily politicized historical discourses in their fictional appropriations of this moment of origin. This analysis leads to the emergence of two very different ideas of the popular in both works, one closely related to Peronist discourse and the other entwined with the Marxist concept of the proletariat. This article therefore argues for the need to reconsider the definitions of the relationship between literature and history that emerge from postmodernist theory, definitions which centre on the epistemological relationship between ‘fiction’ and ‘fact’. Instead, it proposes a foregrounding of public discourses of history, often employed as political tools, in order to perceive a far more detailed engagement with the political in literary texts that rewrite history. This article is published under a CC-BY license https://creativecommons.org/licenses/by/3.0/

  20. Guidelines for Formal Verification Systems

    Science.gov (United States)

    1989-04-01

    This document explains the requirements for formal verification systems that are candidates for the NCSC’s Endorsed Tools List (ETL). This document...is primarily intended for developers of verification systems to use in the development of production-quality formal verification systems. It explains...the requirements and the process used to evaluate formal verification systems submitted to the NCSC for endorsement.

  1. Clinical evaluation of 3D/3D MRI-CBCT automatching on brain tumors for online patient setup verification - A step towards MRI-based treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Buhl, Sune K.; Kristensen, Brian H.; Behrens, Claus F. (Dept. of Oncology, Copenhagen Univ. Hospital, DK-2730 Herlev (Denmark)), E-mail: sukrbu01@heh.regionh.dk; Duun-Christensen, Anne K. (Dept. of Informatics and Mathematical Modeling, Technical Univ. of Denmark, DK-2800 Kgs. Lyngby (Denmark))

    2010-10-15

    Background. Magnetic Resonance Imaging (MRI) is often used in modern day radiotherapy (RT) due to superior soft tissue contrast. However, treatment planning based solely on MRI is restricted due to e.g. the limitations of conducting online patient setup verification using MRI as reference. In this study 3D/3D MRI-Cone Beam CT (CBCT) automatching for online patient setup verification was investigated. Material and methods. Initially, a multi-modality phantom was constructed and used for a quantitative comparison of CT-CBCT and MRI-CBCT automatching. Following the phantom experiment three patients undergoing postoperative radiotherapy for malignant brain tumors received a weekly CBCT. In total 18 scans were matched with both CT and MRI as reference. The CBCT scans were acquired using a Clinac iX 2300 linear accelerator (Varian Medical Systems) with an On-Board Imager (OBI). Results. For the phantom experiment CT-CBCT and MRI-CBCT automatching resulted in similar results. A significant difference was observed only in the longitudinal direction where MRI-CBCT resulted in the best match (mean and standard deviations of 1.85±2.68 mm for CT and -0.05±2.55 mm for MRI). For the clinical experiment the absolute differences in couch shift coordinates acquired from MRI-CBCT and CT-CBCT automatching were ≤2 mm in the vertical direction and ≤3 mm in the longitudinal and lateral directions. For yaw rotation differences up to 3.3 degrees were observed. Mean values and standard deviations were 0.8±0.6 mm, 1.5±1.2 mm and 1.2±1.2 mm for the vertical, longitudinal and lateral directions, respectively and 1.95±1.12 degrees for the rotation (n=17). Conclusion. It is feasible to use MRI as reference when conducting 3D/3D CBCT automatching for online patient setup verification. For both the phantom and clinical experiment MRI-CBCT performed similarly to CT-CBCT automatching and significantly better in the longitudinal direction for the phantom experiment.

  2. Clinical application of in vivo treatment delivery verification based on PET/CT imaging of positron activity induced at high energy photon therapy

    Science.gov (United States)

    Janek Strååt, Sara; Andreassen, Björn; Jonsson, Cathrine; Noz, Marilyn E.; Maguire, Gerald Q., Jr.; Näfstadius, Peder; Näslund, Ingemar; Schoenahl, Frederic; Brahme, Anders

    2013-08-01

    The purpose of this study was to investigate in vivo verification of radiation treatment with high energy photon beams using PET/CT to image the induced positron activity. The measurements of the positron activation induced in a preoperative rectal cancer patient and a prostate cancer patient following 50 MV photon treatments are presented. A total dose of 5 and 8 Gy, respectively, were delivered to the tumors. Imaging was performed with a 64-slice PET/CT scanner for 30 min, starting 7 min after the end of the treatment. The CT volume from the PET/CT and the treatment planning CT were coregistered by matching anatomical reference points in the patient. The treatment delivery was imaged in vivo based on the distribution of the induced positron emitters produced by photonuclear reactions in tissue mapped on to the associated dose distribution of the treatment plan. The results showed that spatial distribution of induced activity in both patients agreed well with the delivered beam portals of the treatment plans in the entrance subcutaneous fat regions but less so in blood and oxygen rich soft tissues. For the preoperative rectal cancer patient however, a 2 ± (0.5) cm misalignment was observed in the cranial-caudal direction of the patient between the induced activity distribution and treatment plan, indicating a beam patient setup error. No misalignment of this kind was seen in the prostate cancer patient. However, due to a fast patient setup error in the PET/CT scanner a slight mis-position of the patient in the PET/CT was observed in all three planes, resulting in a deformed activity distribution compared to the treatment plan. The present study indicates that the induced positron emitters by high energy photon beams can be measured quite accurately using PET imaging of subcutaneous fat to allow portal verification of the delivered treatment beams. Measurement of the induced activity in the patient 7 min after receiving 5 Gy involved count rates which were about

  3. Voltage verification unit

    Science.gov (United States)

    Martin, Edward J [Virginia Beach, VA

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all of which can be accomplished without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  4. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  5. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  6. Abstraction and Learning for Infinite-State Compositional Verification

    Directory of Open Access Journals (Sweden)

    Dimitra Giannakopoulou

    2013-09-01

    Full Text Available Despite many advances that enable the application of model checking techniques to the verification of large systems, the state-explosion problem remains the main challenge for scalability. Compositional verification addresses this challenge by decomposing the verification of a large system into the verification of its components. Recent techniques use learning-based approaches to automate compositional verification based on the assume-guarantee style reasoning. However, these techniques are only applicable to finite-state systems. In this work, we propose a new framework that interleaves abstraction and learning to perform automated compositional verification of infinite-state systems. We also discuss the role of learning and abstraction in the related context of interface generation for infinite-state components.
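
    The assume-guarantee style reasoning referred to above is usually presented through the classical rule below (written in LaTeX), where the assumption A about the environment of component M1 is what the learning loop tries to construct automatically.

      % Classical (non-circular) assume-guarantee rule used in learning-based
      % compositional verification: if M1 satisfies P under assumption A, and M2
      % discharges A, then the composition satisfies P.
      \[
      \frac{\langle A \rangle\, M_1\, \langle P \rangle \qquad
            \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
           {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
      \]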

  7. Development of digital device based work verification system for cooperation between main control room operators and field workers in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min, E-mail: jewellee@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Lee, Hyun Chul, E-mail: leehc@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Ha, Jun Su, E-mail: junsu.ha@kustar.ac.ae [Department of Nuclear Engineering, Khalifa University of Science Technology and Research, Abu Dhabi P.O. Box 127788 (United Arab Emirates); Seong, Poong Hyun, E-mail: phseong@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)

    2016-10-15

    Highlights: • A digital device-based work verification and cooperation support system was developed. • Requirements were derived by interviewing field operators with experience of mobile-based work support systems. • The usability of the proposed system was validated by conducting questionnaire surveys. • The proposed system will be useful if the manual or the set of guidelines is well constructed. - Abstract: Digital technologies have been applied in the nuclear field to check task results, monitor events and accidents, and transmit/receive data. The results of using digital devices have proven that these devices can provide high accuracy and convenience for workers, allowing them to obtain obvious positive effects by reducing their workloads. In this study, as one step forward, a digital device-based cooperation support system, the nuclear cooperation support and mobile documentation system (Nu-COSMOS), is proposed to support communication between main control room (MCR) operators and field workers by verifying field workers’ work results in nuclear power plants (NPPs). The proposed system consists of a mobile-based information storage system, which supports field workers by providing various functions that make workers more trusted by MCR operators and improve the efficiency of meetings, and a large-screen-based information sharing system, which supports meetings by allowing both sides to share one medium. The usability of this system was estimated by interviewing field operators working in nuclear power plants and experts who have experience working as operators. A survey to estimate the usability of the suggested system and the suitability of the functions of the system for field work was conducted with 35 subjects who have experience in field work or with support system development-related research. The usability test was conducted using the system usability scale (SUS), which is widely used in industrial usability evaluation. Using questionnaires

  8. Testing Equation Method Modification for Demanding Energy Measurements Verification

    Directory of Open Access Journals (Sweden)

    Elena Kochneva

    2016-01-01

    Full Text Available The paper is devoted to mathematical approaches for the verification of measurements received from Automatic Meter Reading Systems. Reliability of metering data can be improved by application of a new approach named the Energy Flow Problem. The paper considers a method for the verification of demanding energy measurements based on the analysis of groups of verification expressions. Bad data detection and estimate accuracy calculation are presented using Automatic Meter Reading system data from a fragment of the Russian power system.
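
    The underlying idea can be illustrated with a toy residual check: each verification expression states that the energy flowing into a node must match the energy flowing out, within losses and measurement tolerance, and large residuals flag bad data. The node names, flow values and threshold below are hypothetical, not taken from the paper.

      def balance_residual(flows_in, flows_out, losses=0.0):
          """Residual of one verification expression: energy into a node minus energy out
          minus estimated losses. Ideally close to zero for consistent measurements."""
          return sum(flows_in) - sum(flows_out) - losses

      def flag_bad_data(expressions, threshold=0.05):
          """expressions: dict mapping an expression name to (flows_in, flows_out, losses).
          Returns the names whose relative residual exceeds the threshold."""
          suspects = []
          for name, (fin, fout, loss) in expressions.items():
              base = max(sum(fin), 1e-9)
              if abs(balance_residual(fin, fout, loss)) / base > threshold:
                  suspects.append(name)
          return suspects

      # Toy example: the second group hides a metering error of roughly 10%.
      groups = {
          "substation_A": ([120.0, 80.3], [199.9], 0.5),
          "substation_B": ([60.0], [66.2], 0.2),
      }
      print(flag_bad_data(groups))  # ['substation_B']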

  9. Machine Code Verification of a Tiny ARM Hypervisor

    OpenAIRE

    Dam, Mads; Guanciale, Roberto; Nemati, Hamed

    2013-01-01

    Hypervisors are low level execution platforms that provide isolated partitions on shared resources, allowing one to design secure systems without using dedicated hardware devices. A key requirement of this kind of solution is the formal verification of the software trusted computing base, preferably at the binary level. We accomplish a detailed verification of an ARMv7 tiny hypervisor, proving its correctness at the machine code level. We present our verification strategy, which mixes the usage of ...

  10. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  11. Gender verification in competitive sports.

    Science.gov (United States)

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    these problems remain with the current laboratory based gender verification test, polymerase chain reaction based testing of the SRY gene, the main candidate for male sex determination. Thus, this 'advance' in fact still fails to address the fundamental inequities of laboratory based gender verification tests. The IAAF considered the issue in 1991 and 1992, and concluded that gender verification testing was not needed. This was thought to be especially true because of the current use of urine testing to exclude doping: voiding is observed by an official in order to verify that a sample from a given athlete has actually come from his or her urethra. That males could masquerade as females in these circumstances seems extraordinarily unlikely. Screening for gender is no longer undertaken at IAAF competitions.

  12. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  13. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process. We focus on the verification of scheduling analysis parameters. This proposal is part of a process based on Model Driven Engineering to automate the Verification and Validation of the software on board satellites. This process is implemented in a software control unit of the energy particle detector which is a payload of the Solar Orbiter mission. A scheduling analysis model and its verification model are generated from the design model. The verification is defined as constraints in the form of Finite Timed Automata. When the system is deployed on target, the verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any of the constraints is not satisfied by the on-target evidence, the scheduling analysis is not valid.
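
    As a toy illustration of checking scheduling-analysis parameters against evidence gathered at instrumented points, the sketch below compares observed task response times with the worst-case values assumed by the analysis; the task names and numbers are invented for the example and do not come from the paper.

      def check_timing_constraints(observed, analysis):
          """observed: dict task -> list of measured response times (ms) from instrumented points.
          analysis: dict task -> worst-case response time assumed by the scheduling analysis.
          The analysis is invalid if any observation exceeds its assumed bound."""
          violations = {}
          for task, samples in observed.items():
              worst_seen = max(samples)
              if worst_seen > analysis[task]:
                  violations[task] = (worst_seen, analysis[task])
          return violations

      # Hypothetical evidence from three instrumented tasks.
      evidence = {"acq": [1.2, 1.4, 1.3], "proc": [4.9, 5.6, 5.1], "tx": [0.7, 0.8]}
      assumed = {"acq": 2.0, "proc": 5.0, "tx": 1.0}
      print(check_timing_constraints(evidence, assumed))  # {'proc': (5.6, 5.0)} -> not valid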

  14. True Nations and Half People: Rewriting Nationalism in Alasdair Gray's Poor Things

    Directory of Open Access Journals (Sweden)

    David Leishman

    2013-11-01

    Full Text Available This article seeks to explore the apparent contradictions between postmodernism and political nationalism in Alasdair Gray's novel Poor Things. While Gray himself has spoken out in favour of an independent Scottish republic, his ironic, self-referential fiction has often been characterised as a mode of writing whose irreconcilable paradoxes work against political engagement. This issue is studied as regards nationalism, particularly as Poor Things raises the question of how nations are constructed through their literature. Since Poor Things abounds in imagery of hybridity and duality, it is argued that any presumption of wholeness and unicity in the nation is necessarily to be treated with caution. However, through a study of the rival political discourses that permeate Poor Things, it appears that Scottish nationalism is not necessarily incompatible with a politicised form of postmodernist writing. Indeed, Poor Things' key themes of authorial power, contradictory discourses and rewriting are particularly pertinent to the question of nationalism.

  15. Symbolic Evaluation Graphs and Term Rewriting — A General Methodology for Analyzing Logic Programs

    DEFF Research Database (Denmark)

    Giesl, J.; Ströder, T.; Schneider-Kamp, P.

    2013-01-01

    There exist many powerful techniques to analyze termination and complexity of term rewrite systems (TRSs). Our goal is to use these techniques for the analysis of other programming languages as well. For instance, approaches to prove termination of definite logic programs by a transformation...... to TRSs have been studied for decades. However, a challenge is to handle languages with more complex evaluation strategies (such as Prolog, where predicates like the cut influence the control flow). We present a general methodology for the analysis of such programs. Here, the logic program is first...... information on the termination or complexity of the original logic program. More information can be found in the full paper [1]. © 2013 Springer-Verlag....

  16. Formal Verification at System Level

    Science.gov (United States)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results obtained on SysML based system level functional formal verification in an ESA/ESTEC study, carried out in collaboration between INTECS and La Sapienza University of Roma. The study focuses on SysML based system level functional requirements techniques.

  17. Nuclear disarmament verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  18. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  19. Safe Neighborhood Computation for Hybrid System Verification

    Directory of Open Access Journals (Sweden)

    Yi Deng

    2015-01-01

    Full Text Available For the design and implementation of engineering systems, performing model-based analysis can disclose potential safety issues at an early stage. The analysis of hybrid system models is in general difficult due to the intrinsic complexity of hybrid dynamics. In this paper, a simulation-based approach to formal verification of hybrid systems is presented.

  20. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  1. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    Science.gov (United States)

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors.

  2. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  3. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specificati...

  4. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  5. Grip-pattern verification for a smart gun

    NARCIS (Netherlands)

    Shang, X.; Groenland, J.P.J.; Groenland, J.P.J.; Veldhuis, Raymond N.J.

    In the biometric verification system of a smart gun, the rightful user of the gun is recognized based on grip-pattern recognition. It was found that the verification performance of grip-pattern recognition degrades strongly when the data for training and testing the classifier, respectively, have

  6. SU-G-BRB-11: On the Sensitivity of An EPID-Based 3D Dose Verification System to Detect Delivery Errors in VMAT Treatments

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, P; Olaciregui-Ruiz, I; Mijnheer, B; Mans, A; Rozendaal, R [Netherlands Cancer Institute - Antoni van Leeuwenhoek, Amsterdam, Noord-Holland (Netherlands)

    2016-06-15

    Purpose: To investigate the sensitivity of an EPID-based 3D dose verification system to detect delivery errors in VMAT treatments. Methods: For this study 41 EPID-reconstructed 3D in vivo dose distributions of 15 different VMAT plans (H&N, lung, prostate and rectum) were selected. To simulate the effect of delivery errors, their TPS plans were modified by: 1) scaling of the monitor units by ±3% and ±6% and 2) systematic shifting of leaf bank positions by ±1mm, ±2mm and ±5mm. The 3D in vivo dose distributions were then compared to the unmodified and modified treatment plans. To determine the detectability of the various delivery errors, we made use of a receiver operator characteristic (ROC) methodology. True positive and false positive rates were calculated as a function of the γ-parameters γmean, γ1% (near-maximum γ) and the PTV dose parameter ΔD50 (i.e. D50(EPID)-D50(TPS)). The ROC curve is constructed by plotting the true positive rate vs. the false positive rate. The area under the ROC curve (AUC) then serves as a measure of the performance of the EPID dosimetry system in detecting a particular error; an ideal system has AUC=1. Results: The AUC ranges for the machine output errors and systematic leaf position errors were [0.64 – 0.93] and [0.48 – 0.92] respectively using γmean, [0.57 – 0.79] and [0.46 – 0.85] using γ1% and [0.61 – 0.77] and [0.48 – 0.62] using ΔD50. Conclusion: For the verification of VMAT deliveries, the parameter γmean is the best discriminator for the detection of systematic leaf position errors and monitor unit scaling errors. Compared to γmean and γ1%, the parameter ΔD50 performs worse as a discriminator in all cases.
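
    The ROC construction used above can be reproduced in a few lines: sweep a threshold over the discriminating parameter (for example γmean), compute true and false positive rates against the known error/no-error labels, and integrate. The scores and labels below are invented placeholders, not the study's data.

      import numpy as np

      def roc_auc(scores, labels):
          """scores: discriminating parameter per delivery (e.g. gamma-mean); labels: 1 if a
          delivery error was introduced, 0 otherwise. Returns (fpr, tpr, auc) by threshold sweep."""
          scores, labels = np.asarray(scores), np.asarray(labels)
          thresholds = np.sort(np.unique(scores))[::-1]
          tpr, fpr = [], []
          for t in thresholds:
              flagged = scores >= t
              tpr.append((flagged & (labels == 1)).sum() / max((labels == 1).sum(), 1))
              fpr.append((flagged & (labels == 0)).sum() / max((labels == 0).sum(), 1))
          fpr = np.array([0.0] + fpr + [1.0])
          tpr = np.array([0.0] + tpr + [1.0])
          return fpr, tpr, np.trapz(tpr, fpr)

      # Invented gamma-mean values: introduced errors tend to give larger gamma-mean.
      scores = [0.31, 0.45, 0.52, 0.60, 0.72, 0.80]
      labels = [0, 0, 1, 0, 1, 1]
      _, _, auc = roc_auc(scores, labels)  # AUC close to 1 would indicate a good discriminator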

  7. First clinical investigation of a 4D maximum likelihood reconstruction for 4D PET-based treatment verification in ion beam therapy.

    Science.gov (United States)

    Gianoli, Chiara; De Bernardi, Elisabetta; Ricotti, Rosalinda; Kurz, Christopher; Bauer, Julia; Riboldi, Marco; Baroni, Guido; Debus, Jürgen; Parodi, Katia

    2017-06-01

    In clinical applications of Positron Emission Tomography (PET)-based treatment verification in ion beam therapy (PT-PET), detection and interpretation of inconsistencies between Measured PET and Expected PET are mostly limited by Measured PET noise, due to low count statistics, and by Expected PET bias, especially due to inaccurate washout modelling in off-line implementations. In this work, a recently proposed 4D Maximum Likelihood (ML) reconstruction algorithm which considers Measured PET and Expected PET as two different motion phases of a 4D dataset is assessed on clinical 4D PET-CT datasets acquired after carbon ion therapy. The 4D ML reconstruction algorithm estimates: (1) Measured PET of enhanced image quality with respect to the conventional Measured PET, thanks to the exploitation of Expected PET; (2) the deformation field mapping the Expected PET onto the Measured PET as a measure of the occurred displacements. Results demonstrate the desired sensitivity to inconsistencies due to breathing motion and/or setup modification, robustness to noise in different count statistics scenarios, but a limited sensitivity to Expected PET washout inaccuracy. The 4D ML reconstruction algorithm supports clinical 4D PT-PET in ion beam therapy. The limited sensitivity to washout inaccuracy can be detected and potentially overcome. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Verification of rapid method for estimation of added food colorant type in boiled sausages based on measurement of cross section color

    Science.gov (United States)

    Jovanović, J.; Petronijević, R. B.; Lukić, M.; Karan, D.; Parunović, N.; Branković-Lazić, I.

    2017-09-01

    During the previous development of a chemometric method for estimating the amount of added colorant in meat products, it was noticed that the natural colorant most commonly added to boiled sausages, E 120, has different CIE-LAB behavior compared to artificial colors that are used for the same purpose. This has opened the possibility of transforming the developed method into a method for identifying the addition of natural or synthetic colorants in boiled sausages based on the measurement of the color of the cross-section. After recalibration of the CIE-LAB method using linear discriminant analysis, verification was performed on 76 boiled sausages, of either frankfurters or Parisian sausage types. The accuracy and reliability of the classification was confirmed by comparison with the standard HPLC method. Results showed that the LDA + CIE-LAB method can be applied with high accuracy, 93.42 %, to estimate food color type in boiled sausages. Natural orange colors can give false positive results. Pigments from spice mixtures had no significant effect on CIE-LAB results.
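
    A minimal sketch of the classification step, assuming cross-section colour measurements are available as (L*, a*, b*) triples with known colorant type; it uses scikit-learn's LinearDiscriminantAnalysis and randomly generated placeholder measurements rather than the study's data.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      # illustrative CIE-LAB cross-section measurements (L*, a*, b*), not the study's data
      rng = np.random.default_rng(1)
      natural   = rng.normal([62, 18, 14], [3, 2, 2], size=(40, 3))   # e.g. E 120
      synthetic = rng.normal([58, 24, 10], [3, 2, 2], size=(36, 3))   # artificial colorants
      X = np.vstack([natural, synthetic])
      y = np.array([0] * len(natural) + [1] * len(synthetic))

      lda = LinearDiscriminantAnalysis()
      acc = cross_val_score(lda, X, y, cv=5).mean()
      print(f"cross-validated classification accuracy: {acc:.1%}")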

  9. Experimental Verification on Remote Detectability of Concealed Radioactive Material Based on the Plasma Discharge Delay Time using High-Power Millimeter-Wave

    Science.gov (United States)

    Kim, Dongsung; Yu, Dongho; Sawant, Ashwini; Choe, Mun Seok; Lee, Ingeun; Choi, Eunmi

    2016-10-01

    We experimentally demonstrate a method for remote detection of a radioactive source through plasma breakdown induced by a high-power millimeter-wave source, a gyrotron. The number of free electrons near a radioactive source is much higher than without it (roughly 10 particles/cm3), owing to the interaction of air molecules with the strong gamma rays emitted by the radioactive material. The RF beam is focused in ambient air, and the plasma discharge occurs after a random delay time, defined as the interval between the RF pulse and the fluorescent light emitted by the plasma. We observed that the delay time decreased significantly, due to the high density of free electrons in the Ar plasma, in the presence of a Co60 radioactive source. This delay-time measurement technique is 1000 times more sensitive than the detectable-mass-equation method for identifying the presence of a radioactive source remotely. It is the first experimental verification of radioactive material detection using a high-power gyrotron. This study shows that remote detection of radioactive material based on the analysis of precise delay time measurements could be feasible using a high-power millimeter/THz wave gyrotron. NRF-2013R1A1A2061062, NRF-2012-Global Ph.D. Fellowship Program.

  10. Writer identification and verification

    NARCIS (Netherlands)

    Schomaker, Lambert; Ratha, N; Govindaraju, V

    2008-01-01

    Writer identification and verification have gained increased interest recently, especially in the fields of forensic document examination and biometrics. Writer identification assigns a handwriting to one writer out of a set of writers. It determines whether or not a given handwritten text has in

  11. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  12. Dramatic Rewritings of the Spanish Golden Age Theater of Cervantes´s La fuerza de la sangre

    Directory of Open Access Journals (Sweden)

    Juan Manuel Escudero Baztán

    2013-12-01

    Full Text Available This paper analyzes the Golden Age Spanish theater recreations of Cervantes’s exemplary novel La fuerza de la sangre. Specifically, the paper reviews three important stages in these recreations: La fuerza de la sangre by Guillén de Castro, El agravio satisfecho by Castillo Solórzano, and No hay cosa como callar by Calderón de la Barca. The different rewritings indicate a close relationship between the three dramatic texts through intertextuality and other influences.

  13. Toward a multi-scale computational model of arterial adaptation in hypertension: verification of a multi-cell agent based model.

    Science.gov (United States)

    Thorne, Bryan C; Hayenga, Heather N; Humphrey, Jay D; Peirce, Shayn M

    2011-01-01

    Agent-based models (ABMs) represent a novel approach to study and simulate complex mechano chemo-biological responses at the cellular level. Such models have been used to simulate a variety of emergent responses in the vasculature, including angiogenesis and vasculogenesis. Although not used previously to study large vessel adaptations, we submit that ABMs will prove equally useful in such studies when combined with well-established continuum models to form multi-scale models of tissue-level phenomena. In order to couple agent-based and continuum models, however, there is a need to ensure that each model faithfully represents the best data available at the relevant scale and that there is consistency between models under baseline conditions. Toward this end, we describe the development and verification of an ABM of endothelial and smooth muscle cell responses to mechanical stimuli in a large artery. A refined rule-set is proposed based on a broad literature search, a new scoring system for assigning confidence in the rules, and a parameter sensitivity study. To illustrate the utility of these new methods for rule selection, as well as the consistency achieved with continuum-level models, we simulate the behavior of a mouse aorta during homeostasis and in response to both transient and sustained increases in pressure. The simulated responses depend on the altered cellular production of seven key mitogenic, synthetic, and proteolytic biomolecules, which in turn control the turnover of intramural cells and extracellular matrix. These events are responsible for gross changes in vessel wall morphology. This new ABM is shown to be appropriately stable under homeostatic conditions, insensitive to transient elevations in blood pressure, and responsive to increased intramural wall stress in hypertension.
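
    A toy sketch of the agent-based idea, in which a rule set adjusts each cell's biomolecule production in response to wall stress; the agent class, rules and rate constants are hypothetical illustrations, not the refined rule-set of the paper.

      import random

      class SmoothMuscleCell:
          """Toy agent: adjusts production of a growth factor based on wall-stress rules."""
          def __init__(self, baseline=1.0):
              self.production = baseline                 # arbitrary units

          def apply_rules(self, wall_stress, homeostatic_stress=1.0):
              # rule 1 (high confidence): sustained over-stress upregulates production
              if wall_stress > 1.2 * homeostatic_stress:
                  self.production *= 1.05
              # rule 2 (lower confidence): under-stress slowly downregulates production
              elif wall_stress < 0.8 * homeostatic_stress:
                  self.production *= 0.98

      def simulate(pressure_factor, steps=100):
          cells = [SmoothMuscleCell() for _ in range(50)]
          for _ in range(steps):
              stress = pressure_factor * random.uniform(0.95, 1.05)   # noisy wall stress
              for cell in cells:
                  cell.apply_rules(stress)
          return sum(c.production for c in cells) / len(cells)

      print("normotensive:", round(simulate(1.0), 2))
      print("hypertensive:", round(simulate(1.4), 2))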

  14. Simulator Semantics for System Level Formal Verification

    Directory of Open Access Journals (Sweden)

    Toni Mancini

    2015-09-01

    Full Text Available Many simulation-based Bounded Model Checking approaches to System Level Formal Verification (SLFV) have been devised. Typically such approaches exploit the capability of simulators to save computation time by saving and restoring the state of the system under simulation. However, even though such approaches aim at (bounded) formal verification, as a matter of fact the simulator behaviour is not formally modelled, and the proof of correctness of the proposed approaches basically relies on the intuitive notion of simulator behaviour. This gap makes it hard to check whether the optimisations introduced to speed up the simulation omit checking relevant behaviours of the system under verification. The aim of this paper is to fill the above gap by presenting a formal semantics for simulators.

  15. Constraint specialisation in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    We present a method for specialising the constraints in constrained Horn clauses with respect to a goal. We use abstract interpretation to compute a model of a query–answer transformed version of a given set of clauses and a goal. The constraints from the model are then used to compute a specialised version of the clauses, strengthening the constraints underlying the clauses. Experimental results on verification problems show that this is an effective transformation, both in our own verification tools (based on a convex polyhedra analyser) and as a pre-processor to other Horn clause verification tools.

  16. Software Verification of Orion Cockpit Displays

    Science.gov (United States)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is being developed to take humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient creation of scripts. This framework standardized the way user inputs are coded and simulated in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.

  17. Towards Verification and Validation for Increased Autonomy

    Science.gov (United States)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation goes over the work we have performed over the last few years on verification and validation of the next generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then moves on to identify the characteristics of ACAS X that are related to autonomy and to discuss the challenges that autonomy poses for V&V. All work presented has already been published.

  18. Hollow fiber-based liquid-liquid-liquid micro-extraction with osmosis: I. Theoretical simulation and verification.

    Science.gov (United States)

    Wu, Qian; Wu, Dapeng; Geng, Xuhui; Shen, Zheng; Guan, Yafeng

    2012-07-27

    In this study, osmosis in hollow fiber-based liquid-liquid-liquid micro-extraction (HF-LLLME) was validated and utilized to improve the enrichment factor of extraction. When a donor phase (sample solution) with higher ionic strength than the acceptor phase (extraction phase) was used, osmosis was established from the acceptor phase, through the organic membrane, to the donor phase. The mass flux expression for analytes across the organic membrane was established based on a convective-diffusive kinetic model, and the kinetic process for HF-LLLME with osmosis was simulated. Simulation results indicated that osmosis from the acceptor phase to the donor phase can increase the enrichment factor of HF-LLLME, accelerate the extraction process, and even result in the distribution ratio of analytes between donor and acceptor phases exceeding their partition coefficient. This phenomenon was verified by experimental extraction data with six organic acids and four organic bases as model analytes. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Design and Algorithm Verification of a Gyroscope-Based Inertial Navigation System for Small-Diameter Spaces in Multilateral Horizontal Drilling Applications

    Directory of Open Access Journals (Sweden)

    Tao Li

    2015-12-01

    Full Text Available In recent years horizontal drilling (HD) has become increasingly important in oil and gas exploration because it can increase the production per well and can effectively rework old and marginal vertical wells. The key element of successful HD is accurate navigation of the drill bit with advanced measurement-while-drilling (MWD) tools. The size of the MWD tools is not significantly restricted in vertical wells because there is enough space for their installation in traditional well drilling, but the diameter of devices for HD must be restricted to less than 30 mm for some applications, such as lateral drilling from existing horizontal wells. Therefore, it is essential to design miniature devices for lateral HD applications. Additionally, magnetometers in traditional MWD devices are easily susceptible to complex downhole interferences, and gyroscopes have been previously suggested as the best avenue to replace magnetometers for azimuth measurements. The aim of this paper is to propose a miniature gyroscope-based MWD system, referred to as a miniature gyroscope-based while-drilling (MGWD) system. A prototype of such an MGWD system is presented. The device consists of a two-axis gyroscope and a three-axis accelerometer. Miniaturization design approaches for MGWD are proposed. In addition, MGWD data collection software is designed to provide real-time data display and navigation algorithm verification. A fourth-order autoregressive (AR) model is introduced for stochastic noise modeling of the gyroscope and accelerometer data. Zero velocity and position are injected into a Kalman filter as a system reference to update the system states, which can effectively improve the state observability of the MGWD system and decrease estimation errors. Nevertheless, the azimuth of the proposed MGWD system is not observable in the Kalman filter, and reliable azimuth estimation remains a problem.
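
    A minimal sketch of the zero-velocity update idea mentioned above: a 1-D Kalman filter integrates a biased accelerometer and uses the pseudo-measurement "velocity = 0" during stationary periods to estimate the bias. The state model, noise values and bias are illustrative assumptions, not the MGWD system's parameters.

      import numpy as np

      # State x = [velocity, accel bias]; a ZUPT observes velocity only.
      dt = 0.01
      F = np.array([[1.0, -dt], [0.0, 1.0]])        # velocity integrates (accel - bias)
      B = np.array([[dt], [0.0]])
      H = np.array([[1.0, 0.0]])
      Q = np.diag([1e-4, 1e-6]); R = np.array([[1e-3]])

      x = np.zeros((2, 1)); P = np.eye(2)
      rng = np.random.default_rng(2)
      true_bias = 0.05                               # m/s^2, illustrative

      for k in range(2000):
          accel_meas = true_bias + rng.normal(0, 0.02)   # stationary tool: true accel = 0
          x = F @ x + B * accel_meas                     # predict
          P = F @ P @ F.T + Q
          if k % 100 == 0:                               # periodic zero-velocity update
              y = 0.0 - (H @ x)                          # innovation
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)
              x = x + K * y
              P = (np.eye(2) - K @ H) @ P

      print(f"estimated accel bias: {x[1, 0]:.3f} m/s^2 (true {true_bias})")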

  20. Towards agent-based modelling and verification of collaborative business processes : An approach centred on interactions and behaviours

    NARCIS (Netherlands)

    Stuit, M.; Szirbik, N.

    2009-01-01

    This paper presents the process-oriented aspects of a formal and visual agent-based business process modeling language. The language is of use for (networks of) organizations that elect or envisage multi-agent systems for the support of collaborative business processes. The paper argues that the

  1. Design and verification of focal plane assembly thermal control system of one space-based astronomy telescope

    Science.gov (United States)

    Yang, Wen-gang; Fan, Xue-wu; Wang, Chen-jie; Wang, Ying-hao; Feng, Liang-jie; Du, Yun-fei; Ren, Guo-rui; Wang, Wei; Li, Chuang; Gao, Wei

    2015-10-01

    A space-based astronomy telescope will observe astronomical objects whose brightness is lower than 23rd magnitude. To ensure the telescope performance, very low system noise requirements demand an extremely low CCD operating temperature (lower than -65°C). Because the satellite will be launched into a low Earth orbit, unavoidable external space heat fluxes will result in a high radiator sink temperature (higher than -65°C). Passive measures alone cannot meet the focal plane cooling specification, so active cooling technologies must be utilized. Based on a detailed analysis of the thermal environment of the telescope and the thermal characteristics of the focal plane assembly (FPA), an active cooling system based on thermo-electric coolers (TECs) and a heat rejection system (HRS) based on a flexible heat pipe and radiator have been designed. The power consumption of the TECs depends on the heat-pumping requirements and their hot-side temperature. The heat rejection capability of the HRS depends mainly on the radiator size and temperature. To balance TEC power consumption against the radiator size requirement, the thermal design of the FPA must be optimized. Parasitic heat loads on the detector are minimized to reduce the heat-pumping demands of the TECs and their power consumption. The thermal resistance of the heat rejection system is minimized to reject the heat dissipated by the TECs from the hot side to the radiator efficiently. The size and surface coating of the radiator are optimized to reconcile heat rejection requirements and system constraints. Based on the above work, a transient thermal analysis of the FPA was performed. An FPA prototype model has been developed and a thermal vacuum/balance test has been accomplished. From the test, the temperatures of key parts and the working parameters of the TECs in extreme cases have been acquired. Test results show that the CCD can be controlled below -65°C and all parts worked well during the test. All of this verified the thermal design of the FPA, and some lessons learned will be presented in this paper.

  2. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    Almost all the studies are based on either National Center for Environmental Prediction (NCEP), USA, final analysis fields (NCEP FNL) or the reanalysis data used as initial and lateral boundary conditions for driving the mesoscale model. Here we present a mesoscale model forecast verification and intercomparison study ...

  3. Thermal-hydraulics verification of a coarse-mesh OpenFOAM-based solver for a Sodium Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Bonet López, M.

    2015-07-01

    A multiphysics platform based on OpenFOAM, capable of performing multidimensional analysis of a nuclear reactor, has recently been developed at the Paul Scherrer Institute in Switzerland. One of the main objectives of this project is to verify the part of the code responsible for the thermal-hydraulic analysis of the reactor. To carry out simulations, this part of the code uses a coarse-mesh approximation based on the equations of a porous medium. The other objective is therefore to demonstrate that this method is applicable to the analysis of a sodium-cooled fast nuclear reactor, focusing on its capacity to predict the heat transfer between a subassembly and the empty space between subassemblies of the reactor core. (Author)

  4. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2017-11-23

    Modern model-based approaches to cardiac safety and efficacy assessment require accurate drug concentration-effect relationship establishment. Thus, knowledge of the active concentration of drugs in heart tissue is desirable along with inter-subject variability influence estimation. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The models were described with literature-derived parameters and written in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model feasibility. The proposed structure can be tested with the goal of improving the patient-specific model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
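
    A minimal sketch of a compartmental cardiac PBPK structure solved as a system of ODEs with SciPy, lumping the three intracellular layers into one compartment; all rate constants and compartment names are illustrative assumptions, not the fitted model of the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Illustrative first-order rate constants (1/h); the published model uses
      # physiologically derived flows, partition coefficients and active transport.
      k = {"plasma_ecw": 2.0, "ecw_plasma": 1.5,
           "ecw_peri": 0.3, "peri_ecw": 0.2,
           "ecw_cell": 1.0, "cell_ecw": 0.4, "cell_met": 0.1}

      def rhs(t, y):
          plasma, peri, ecw, cell = y
          d_plasma = -k["plasma_ecw"] * plasma + k["ecw_plasma"] * ecw
          d_peri   =  k["ecw_peri"] * ecw - k["peri_ecw"] * peri
          d_ecw    = (k["plasma_ecw"] * plasma - k["ecw_plasma"] * ecw
                      - k["ecw_peri"] * ecw + k["peri_ecw"] * peri
                      - k["ecw_cell"] * ecw + k["cell_ecw"] * cell)
          d_cell   =  k["ecw_cell"] * ecw - k["cell_ecw"] * cell - k["cell_met"] * cell
          return [d_plasma, d_peri, d_ecw, d_cell]

      sol = solve_ivp(rhs, (0, 12), y0=[1.0, 0.0, 0.0, 0.0], dense_output=True)
      t = np.linspace(0, 12, 5)
      print(np.round(sol.sol(t).T, 3))   # columns: plasma, pericardial, ECW, intracellular (lumped)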

  5. Verification of pharmacogenetics-based warfarin dosing algorithms in Han-Chinese patients undertaking mechanic heart valve replacement.

    Science.gov (United States)

    Zhao, Li; Chen, Chunxia; Li, Bei; Dong, Li; Guo, Yingqiang; Xiao, Xijun; Zhang, Eryong; Qin, Li

    2014-01-01

    To study the performance of pharmacogenetics-based warfarin dosing algorithms in the initial and the stable warfarin treatment phases in a cohort of Han-Chinese patients undertaking mechanic heart valve replacement. We searched PubMed, Chinese National Knowledge Infrastructure and Wanfang databases to select pharmacogenetics-based warfarin dosing models. Patients with mechanic heart valve replacement were consecutively recruited between March 2012 and July 2012. The predicted warfarin dose of each patient was calculated and compared with the observed initial and stable warfarin doses. The percentage of patients whose predicted dose fell within 20% of their actual therapeutic dose (percentage within 20%), and the mean absolute error (MAE) were utilized to evaluate the predictive accuracy of all the selected algorithms. A total of 8 algorithms, including the Du, Huang, Miao, Wei, Zhang, Lou, Gage, and International Warfarin Pharmacogenetics Consortium (IWPC) models, were tested in 181 patients. The MAE of the Gage, IWPC and 6 Han-Chinese pharmacogenetics-based warfarin dosing algorithms was less than 0.6 mg/day, and the percentage within 20% exceeded 45% in all of the selected models in both the initial and the stable treatment stages. When patients were stratified according to the warfarin dose range, all of the equations demonstrated better performance in the ideal-dose range (1.88-4.38 mg/day) than in the low-dose range (<1.88 mg/day) in this cohort of Han-Chinese patients undertaking mechanic heart valve replacement.
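
    A minimal sketch of the two evaluation metrics used above, the mean absolute error and the percentage of patients predicted within ±20% of the actual dose; the dose values are made-up placeholders, not patient data.

      import numpy as np

      def dosing_metrics(predicted, observed):
          """MAE (mg/day) and percentage of patients predicted within ±20% of actual dose."""
          predicted = np.asarray(predicted, dtype=float)
          observed = np.asarray(observed, dtype=float)
          mae = np.mean(np.abs(predicted - observed))
          within_20 = np.mean(np.abs(predicted - observed) <= 0.2 * observed) * 100
          return mae, within_20

      # illustrative stable warfarin doses in mg/day (not the study's patients)
      observed  = [3.1, 2.5, 4.0, 1.9, 3.6, 2.8]
      predicted = [2.8, 2.9, 3.7, 2.4, 3.4, 2.6]
      mae, pct20 = dosing_metrics(predicted, observed)
      print(f"MAE = {mae:.2f} mg/day, within 20% = {pct20:.0f}%")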

  6. Transport Mechanisms and Quality Changes During Frying of Chicken Nuggets--Hybrid Mixture Theory Based Modeling and Experimental Verification.

    Science.gov (United States)

    Bansal, Harkirat S; Takhar, Pawan S; Alvarado, Christine Z; Thompson, Leslie D

    2015-12-01

    Hybrid mixture theory (HMT) based 2-scale fluid transport relations of Takhar, coupled with a multiphase heat transfer equation, were solved to model water, oil and gas movement during frying of chicken nuggets. A chicken nugget was treated as a heterogeneous material consisting of a meat core with a wheat-based coating. The coupled heat and fluid transfer equations were solved using the finite element method. Numerical simulations resulted in data on spatial and temporal profiles for moisture, rate of evaporation, temperature, oil, pore pressure, pressure in various phases, and coefficient of elasticity. Results showed that most of the oil stayed in the outer 1.5 mm of the coating region. Temperature values greater than 100 °C were observed in the coating after 30 s of frying. A negative gage pore pressure (p(w) − p(g) < 0) in the hydrophilic matrix reflects p(w) < p(g), which further results in negative pore pressure. The coefficient of elasticity was highest at the surface (2.5 × 10(5) Pa) for the coating and at the interface of the coating and core (6 × 10(5) Pa). A kinetics equation for color change obtained from experiments was coupled with the HMT-based model to predict the color (L, a, and b) as a function of frying time. © 2015 Institute of Food Technologists®

  7. Automatic fault detection in solar thermal systems based on rule verification and simulation

    Science.gov (United States)

    Maltais Larouche, Simon

    Solar hot water systems are often considered to lower the energy costs and greenhouse gas emissions related to the production of domestic hot water. Although the capital costs associated with solar domestic hot water systems are decreasing each year, they are still significantly higher than conventional solutions, and these extra costs are compensated by reduced energy bills. In order to be economically viable, these systems must then deliver a satisfactory performance over their useful lifetime. Unfortunately, it is not uncommon for solar hot water systems to encounter issues which result in a reduction of energy savings and/or their useful life span. These issues often result from poor design, careless installation, and a lack of maintenance. Furthermore, it is frequent that the system's owners stay unaware of a failure for an extended period of time, since these systems are generally equipped with auxiliary heating designed to meet the entire heat load. Thus, a system could be underperforming or out of service for months or even years without any noticeable symptoms from the hot water consumer point of view. In this respect, it is important to find solutions to automatically warn a system's owner or manager in case of a system failure. This thesis present an original automatic fault detection method based on two levels and developed specifically for solar hot water systems. The first level monitors a system's operating conditions (e.g. temperatures, flowrates, pressures, etc.) through a rule-base algorithm. In the second level, the solar circuit and auxiliary heater daily performances are evaluated using TRNSYS simulations and compared with the measured performance in order to determine if there is a significant discrepancy. The method was assessed using three years of operation data from a solar hot water system composed of 11 evacuated tubes of a total area of 35.5 m 2 located at l'Accueil Bonneau in Montreal, Canada. The validation was also used to determine

  8. Rewriting age to overcome misaligned age and gender norms in later life.

    Science.gov (United States)

    Morelock, Jeremiah C; Stokes, Jeffrey E; Moorman, Sara M

    2017-01-01

    In this paper we suggest that older adults undergo a misalignment between societal age norms and personal lived experience, and attempt reconciliation through discursive strategies: They rewrite how they frame chronological age as well as their subjective relations to it. Using a sample of 4041 midlife and older adults from the 2004-2006 wave of the National Survey of Midlife Development in the United States (MIDUS II), we explore associations of age and gender with subjective age and at what age respondents felt people enter later life. Our results confirm that as men and women age, they push up the age at which they think people enter later life, and slow down subjective aging (there is a growing gap between subjective and chronological age). Relations between a person's age and at what age they think people enter later life were stronger for men than for women. For every year they get older, men push up when they think people enter later life by 0.24 years, women by 0.16 years. Age norms surrounding the transition to later life may be more prominent for men than for women, and the difference in their tendencies to push up when they mark entry into later life may be a reflection of this greater prominence. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Jamal Mahjoub’s The Carrier as a Re-writing of Shakespeare’s Othello

    Directory of Open Access Journals (Sweden)

    Yousef Awad

    2017-07-01

    Full Text Available This paper examines how Arab British novelist Jamal Mahjoub appropriates and interpolates Shakespeare’s Othello. Specifically, this paper argues that Mahjoub’s historical novel The Carrier (1998 re-writes Shakespeare’s Othello in a way that enables the novelist to comment on some of the themes that remain unexplored in Shakespeare’s masterpiece. Mahjoub appropriates tropes, motifs and episodes from Shakespeare’s play which include places like Cyprus and Aleppo, Othello’s identity, abusive/foul language, animalistic imagery, and motifs like the eye, sorcery/witchcraft, the storm and adventurous travels. Unlike Othello’s fabled and mythical travels and adventures, Mahjoub renders Rashid al-Kenzy’s as realistic and true to life in a way that highlights his vulnerability. In addition, the ill-fated marriage between Othello and Desdemona is adapted in Mahjoub’s novel in the form of a Platonic love that is founded on a scientific dialogue between Rashid al-Kenzy and Sigrid Heinesen, a poet and philosopher woman from Jutland. In this way, Desdemona’s claim that she sees Othello’s visage in his mind, a claim that is strongly undermined by Othello’s irrationality, jealousy and belief in superstitions during the course of the play, is emphasized and foregrounded in Mahjoub’s novel.

  10. Formal Verification of UML Profil

    DEFF Research Database (Denmark)

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar

    2011-01-01

    The Unified Modeling Language (UML) is based on the Model Driven Development (MDD) approach, which captures the system functionality using a platform-independent model (PIM) and appropriate domain-specific languages. In UML-based system notations, the structural view is modeled by class, component and object diagrams, and the behavioral view is modeled by activity, use case, state, and sequence diagrams. However, UML does not provide a formal syntax, so its semantics is not formally definable; to assure correctness, semantic reasoning must be incorporated into the development process through verification, specification and refinement. The motivation of our research is to simplify the structural view and to suggest formal techniques and methods best suited to UML-based system development. We investigate the tools and methods that are broadly used for the formal...

  11. Formal verification of an oral messages algorithm for interactive consistency

    Science.gov (United States)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
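
    For readers unfamiliar with the underlying algorithm, the following toy Python simulation sketches the classical OM(m) recursion of Lamport, Shostak and Pease on which the verified Interactive Consistency algorithm is based; it assumes deterministic traitors that invert every value they relay, and it is not the formal specification discussed in the paper.

      from collections import Counter

      def send(value, sender, receiver, traitors):
          # traitorous nodes flip every value they relay; loyal nodes relay faithfully
          return (1 - value) if sender in traitors else value

      def om(m, commander, lieutenants, value, traitors):
          """Return the value each lieutenant decides on after OM(m)."""
          received = {lt: send(value, commander, lt, traitors) for lt in lieutenants}
          if m == 0:
              return received
          decided = {}
          for lt in lieutenants:
              others = [o for o in lieutenants if o != lt]
              # each other lieutenant o acts as commander of OM(m-1), relaying its value
              relayed = {lt: received[lt]}
              for o in others:
                  sub = om(m - 1, o, [x for x in lieutenants if x != o], received[o], traitors)
                  relayed[o] = sub[lt]
              decided[lt] = Counter(relayed.values()).most_common(1)[0][0]   # majority vote
          return decided

      # 4 nodes, 1 traitor (node 3), loyal commander = node 0, m = 1
      result = om(1, 0, [1, 2, 3], value=1, traitors={3})
      print(result)   # loyal lieutenants 1 and 2 agree on the commander's value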

  12. Model-based design and experimental verification of a monitoring concept for an active-active electromechanical aileron actuation system

    Science.gov (United States)

    Arriola, David; Thielecke, Frank

    2017-09-01

    Electromechanical actuators have become a key technology for the onset of power-by-wire flight control systems in the next generation of commercial aircraft. The design of robust control and monitoring functions for these devices capable to mitigate the effects of safety-critical faults is essential in order to achieve the required level of fault tolerance. A primary flight control system comprising two electromechanical actuators nominally operating in active-active mode is considered. A set of five signal-based monitoring functions are designed using a detailed model of the system under consideration which includes non-linear parasitic effects, measurement and data acquisition effects, and actuator faults. Robust detection thresholds are determined based on the analysis of parametric and input uncertainties. The designed monitoring functions are verified experimentally and by simulation through the injection of faults in the validated model and in a test-rig suited to the actuation system under consideration, respectively. They guarantee a robust and efficient fault detection and isolation with a low risk of false alarms, additionally enabling the correct reconfiguration of the system for an enhanced operational availability. In 98% of the performed experiments and simulations, the correct faults were detected and confirmed within the time objectives set.

  13. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  14. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can effect system verification in many ways . Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at individual level very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  15. GRAVITY Science Verification

    Science.gov (United States)

    Mérand, A.; Berger, J.-P.; de Wit, W.-J.; Eisenhauer, F.; Haubois, X.; Paumard, T.; Schoeller, M.; Wittkowski, M.; Woillez, J.; Wolff, B.

    2017-12-01

    In the time between successfully commissioning an instrument and before offering it in the Call for Proposals for the first time, ESO gives the community at large an opportunity to apply for short Science Verification (SV) programmes. In 2016, ESO offered SV time for the second-generation Very Large Telescope Interferometer instrument GRAVITY. In this article we describe the selection process, outline the range of science cases covered by the approved SV programmes, and highlight some of the early scientific results.

  16. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation.

    Science.gov (United States)

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why in this study we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. Model checking results agree well with the simulation results, which fully supports the proposed framework.
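
    A minimal sketch of a continuous-time Markov chain (Gillespie) simulation of the basic SEIR worm model mentioned above; the rates are illustrative and the quarantine extension of the SEIDQR(S/I) model is omitted for brevity.

      import random

      def seir_gillespie(S, E, I, R, beta=0.3, sigma=0.2, gamma=0.1, t_end=200.0, seed=0):
          rng = random.Random(seed)
          N, t = S + E + I + R, 0.0
          while t < t_end and (E + I) > 0:
              rates = [beta * S * I / N,   # S -> E  (exposure)
                       sigma * E,          # E -> I  (becomes infectious)
                       gamma * I]          # I -> R  (recovery / patching)
              total = sum(rates)
              if total == 0:
                  break
              t += rng.expovariate(total)                  # time to next event
              r, acc, event = rng.uniform(0, total), 0.0, 0
              for i, rate in enumerate(rates):
                  acc += rate
                  if r <= acc:
                      event = i
                      break
              if event == 0:   S, E = S - 1, E + 1
              elif event == 1: E, I = E - 1, I + 1
              else:            I, R = I - 1, R + 1
          return S, E, I, R

      print(seir_gillespie(S=990, E=5, I=5, R=0))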

  17. Verification of Radicals Formation in Ethanol-Water Mixture Based Solution Plasma and Their Relation to the Rate of Reaction.

    Science.gov (United States)

    Sudare, Tomohito; Ueno, Tomonaga; Watthanaphanit, Anyarat; Saito, Nagahiro

    2015-12-03

    Our previous research demonstrated that using an ethanol-water mixture as the liquid medium for the synthesis of gold nanoparticles by the solution plasma process (SPP) could lead to a reaction rate ∼35.2 times faster than that in pure water. This drastic change was observed when a small amount of ethanol was added to the system, that is, at an ethanol mole fraction (χethanol) of 0.089. Beyond this composition, the reaction rate decreased continuously. To better understand what happens in the ethanol-water mixture-based SPP, in this study the effect of the ethanol content on radical formation in the system was verified. We focused on detecting the magnetic resonance of electronic spins using electron spin resonance spectroscopy to determine the type and quantity of the generated radicals at each χethanol. Results indicated that ethanol radicals were generated in the ethanol-water mixtures and exhibited a maximum quantity at a χethanol of 0.089. The relationship between the ethanol radical yield and the rate of reaction, along with a possible mechanism responsible for the observed phenomenon, is discussed in this paper.

  18. Experimental verification of sensing capability of an electromagnetic induction system for an MR fluid damper-based control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, H J; Jang, D D [Department of Civil and Environmental Engineering, KAIST, 305-701, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Cho, S W [Samsung SDS Co., Ltd., Yeoksam-dong, Gangnam-gu, Seoul 135-918 (Korea, Republic of); Koo, J H [Department of Mechanical and Manufacturing Engineering, Miami University, Oxford, Ohio 45056 (United States)], E-mail: hjung@kaist.ac.kr

    2009-02-01

    This paper investigates the sensing capability of an Electromagnetic Induction (EMI) system that is incorporated in a vibration control system based on MR fluid dampers. The EMI system, consisting of permanent magnets and coils, converts the reciprocal motion (kinetic energy) of the MR damper into electrical energy (electromotive force or emf). According to Faraday's law of electromagnetic induction, the emf signal produced by the EMI is proportional to the velocity of the motion. Thus, the induced voltage (emf) signal is able to provide the necessary measurement information (i.e., relative velocity across the damper). In other words, the EMI can act as a sensor in the MR damper system. In order to evaluate the proposed concept of the EMI sensor, an EMI system was constructed and integrated into an MR damper system. The emf signal was experimentally compared with the velocity signal by conducting a series of shaking table tests. The results show that the induced emf voltage signal agreed well with the relative velocity.

  19. AFLATOXIN B1 IN CORN: DIRECT VERIFICATION OF CONTAMINATION THROUGH AN AUTOMATIC COMPUTERIZED SYSTEM BASED ON THE FLUORESCENCE

    Directory of Open Access Journals (Sweden)

    I. Dragoni

    2009-09-01

    Full Text Available “Aflaflesh” is a computer-based instrument, designed by combining a visual data acquisition system with sophisticated image acquisition and analysis software. This system allows the contamination of corn by AFB1 to be checked on a representative sample (5-10 kg), using the fluorescence emitted under UV light when the grain is contaminated. To optimize the use of this control equipment, a total of 80 samples were analyzed in two phases, comparing the results obtained by chemical analysis (HPLC) to those obtained using “Aflaflesh”. Initially the study was set up to correlate the number of contaminated grains to the ppb values read by the official method, HPLC; the second step was to correlate ppb values to the number of pixels of contaminated grain surface read by the “Aflaflesh” instrument. The apparatus was then calibrated through a statistical analysis of the results obtained, to allow a direct reading of the AFB1 concentrations in a short period of time (15 min) without the assistance of specialized personnel.

  20. Model of Flood Wave Propagation on the Caspian Sea and its Verification Based on the Satellite Altimetry Data

    Science.gov (United States)

    Lebedev, S. A.

    2010-12-01

    In this research a simple flood wave propagation model was based on the Saint-Venant equations, which represent a good way to describe problems concerning flood wave propagation in open channels. For the solution of this task the Caspian Sea was approximated as a channel with a rectangular cross-section. The channel axis coincided with the longitudinal axis of the sea, or with the location of descending pass 092 of the satellites TOPEX/Poseidon (T/P) and Jason-1/2 (J1/2). Altimetric measurements from these satellites permit the empirical parameters of the flood wave (propagation speed, amplitude, et al.), which constitute the solution of the model, to be defined more exactly. They also allow effective evaporation to be estimated; in this approach it can be considered as the integrated difference between the sea surface heights of the previous and subsequent cycles of altimetric measurements. The results of the calculations have confirmed good conformity between values calculated by other researchers and the model. It is shown that the interannual variability of flood wave speed in the North Caspian was well correlated with the interannual variability of the Caspian Sea level.
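
    A heavily reduced sketch of the idea: advecting a sea-level anomaly along a rectangular channel with a first-order upwind scheme and a uniform loss term standing in for effective evaporation. This is a linearised caricature under assumed, illustrative numbers, not the Saint-Venant implementation of the study.

      import numpy as np

      L, nx = 1.0e6, 200                 # channel length (m) and grid points
      dx = L / nx
      c = 2.0                            # wave propagation speed (m/s), illustrative
      evap = 1.0e-8                      # uniform loss rate (1/s), illustrative
      dt = 0.8 * dx / c                  # CFL-stable time step

      x = np.linspace(0, L, nx)
      h = 0.5 * np.exp(-((x - 1.0e5) / 4.0e4) ** 2)    # initial level anomaly (m)

      for _ in range(150):
          h[1:] -= c * dt / dx * (h[1:] - h[:-1])      # first-order upwind advection
          h *= (1.0 - evap * dt)                        # evaporation loss

      print(f"peak anomaly {h.max():.3f} m at x = {x[h.argmax()]/1e3:.0f} km")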

  1. Building a multi-FPGA-based emulation framework to support networks-on-chip design and verification

    Science.gov (United States)

    Liu, Yangfan; Liu, Peng; Jiang, Yingtao; Yang, Mei; Wu, Kejun; Wang, Weidong; Yao, Qingdong

    2010-10-01

    In this article, we present a highly scalable, flexible hardware-based network-on-chip (NoC) emulation framework, through which NoCs built upon various types of network topologies, routing algorithms, switching protocols and flow control schemes can be explored, compared, and validated with injected or self-generated traffic from both real-life and synthetic applications. This high degree of scalability and flexibility is achieved due to the field programmable gate array (FPGA) design choices made at both functional and physical levels. At the functional level, a NoC system to be emulated can be partitioned into two parts: (i) the processing cores and (ii) the network. Each part is mapped onto a different FPGA so that when there is any change to be made to any one of these parts, only the corresponding FPGA needs to be reconfigured and the rest of the FPGAs will be left untouched. At the physical level, two levels of interconnects are adopted to mimic NoC on-chip communications: high bandwidth and low latency parallel on-board wires, and high-speed serial multigigabit transceivers available in FPGAs. The latter is particularly important as it helps the proposed NoC emulation platform scale well with the size increase of the NoCs.

  2. Gender verification of female athletes.

    Science.gov (United States)

    Elsas, L J; Ljungqvist, A; Ferguson-Smith, M A; Simpson, J L; Genel, M; Carlson, A S; Ferris, E; de la Chapelle, A; Ehrhardt, A A

    2000-01-01

    The International Olympic Committee (IOC) officially mandated gender verification for female athletes beginning in 1968 and continuing through 1998. The rationale was to prevent masquerading males and women with "unfair, male-like" physical advantage from competing in female-only events. Visual observation and gynecological examination had been tried on a trial basis for two years at some competitions leading up to the 1968 Olympic Games, but these invasive and demeaning processes were jettisoned in favor of laboratory-based genetic tests. Sex chromatin and more recently DNA analyses for Y-specific male material were then required of all female athletes immediately preceding IOC-sanctioned sporting events, and many other international and national competitions following the IOC model. On-site gender verification has since been found to be highly discriminatory, and the cause of emotional trauma and social stigmatization for many females with problems of intersex who have been screened out from competition. Despite compelling evidence for the lack of scientific merit for chromosome-based screening for gender, as well as its functional and ethical inconsistencies, the IOC persisted in its policy for 30 years. The coauthors of this manuscript have worked with some success to rescind this policy through educating athletes and sports governors regarding the psychological and physical nature of sexual differentiation, and the inequities of genetic sex testing. In 1990, the International Amateur Athletics Federation (IAAF) called for abandonment of required genetic screening of women athletes, and by 1992 had adopted a fairer, medically justifiable model for preventing only male "impostors" in international track and field. At the recent recommendation of the IOC Athletes Commission, the Executive Board of the IOC has finally recognized the medical and functional inconsistencies and undue costs of chromosome-based methods. In 1999, the IOC ratified the abandonment of on

  3. Verification of organ doses calculated by a dose monitoring software tool based on Monte Carlo Simulation in thoracic CT protocols.

    Science.gov (United States)

    Guberina, Nika; Suntharalingam, Saravanabavaan; Naßenstein, Kai; Forsting, Michael; Theysohn, Jens; Wetter, Axel; Ringelstein, Adrian

    2017-01-01

    Background The importance of monitoring of the radiation dose received by the human body during computed tomography (CT) examinations is not negligible. Several dose-monitoring software tools emerged in order to monitor and control dose distribution during CT examinations. Some software tools incorporate Monte Carlo Simulation (MCS) and allow calculation of effective dose and organ dose apart from standard dose descriptors. Purpose To verify the results of a dose-monitoring software tool based on MCS in assessment of effective and organ doses in thoracic CT protocols. Material and Methods Phantom measurements were performed with thermoluminescent dosimeters (TLD LiF:Mg,Ti) using two different thoracic CT protocols of the clinical routine: (I) standard CT thorax (CTT); and (II) CTT with high-pitch mode, P = 3.2. Radiation doses estimated with MCS and measured with TLDs were compared. Results Inter-modality comparison showed an excellent correlation between MCS-simulated and TLD-measured doses ((I) after localizer correction r = 0.81; (II) r = 0.87). The following effective and organ doses were determined: (I) (a) effective dose = MCS 1.2 mSv, TLD 1.3 mSv; (b) thyroid gland = MCS 2.8 mGy, TLD 2.5 mGy; (c) thymus = MCS 3.1 mGy, TLD 2.5 mGy; (d) bone marrow = MCS 0.8 mGy, TLD 0.9 mGy; (e) breast = MCS 2.5 mGy, TLD 2.2 mGy; (f) lung = MCS 2.8 mGy, TLD 2.7 mGy; (II) (a) effective dose = MCS 0.6 mSv, TLD 0.7 mSv; (b) thyroid gland = MCS 1.4 mGy, TLD 1.8 mGy; (c) thymus = MCS 1.4 mGy, TLD 1.8 mGy; (d) bone marrow = MCS 0.4 mGy, TLD 0.5 mGy; (e) breast = MCS 1.1 mGy, TLD 1.1 mGy; (f) lung = MCS 1.2 mGy, TLD 1.3 mGy. Conclusion Overall, in thoracic CT protocols, organ doses simulated by the dose-monitoring software tool were coherent to those measured by TLDs. Despite some challenges, the dose-monitoring software was capable of an accurate dose calculation.
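
    A small sketch of the inter-modality comparison, computing relative differences and a Pearson correlation from the standard-protocol organ doses quoted above; the correlation over these five organ values alone is only illustrative of the method and does not reproduce the published r values, which were computed over all measurement points.

      import numpy as np

      # standard CT thorax protocol (I): organ doses in mGy, as quoted in the abstract
      organs = ["thyroid", "thymus", "bone marrow", "breast", "lung"]
      mcs = np.array([2.8, 3.1, 0.8, 2.5, 2.8])   # Monte Carlo simulated
      tld = np.array([2.5, 2.5, 0.9, 2.2, 2.7])   # TLD measured

      r = np.corrcoef(mcs, tld)[0, 1]
      rel_diff = 100 * (mcs - tld) / tld
      for organ, d in zip(organs, rel_diff):
          print(f"{organ:11s}: {d:+6.1f} %")
      print(f"Pearson r = {r:.2f}")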

  4. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  5. Application of the cloze procedure to evaluate comprehension and demonstrate rewriting of pharmacy educational materials.

    Science.gov (United States)

    Miller, Michael J; DeWitt, Jane E; McCleeary, Erin M; O'Keefe, Kelly J

    2009-04-01

    Written materials are commonly used to communicate pharmacy-relevant information to patients. However, they are often composed at a level that limits comprehension, mitigating a well-intended effect. To (1) use the cloze procedure (a test designed to assess reading comprehension) to evaluate an individual's understanding of a pharmacy-relevant educational pamphlet; (2) compare results of the cloze procedure with the reading comprehension component of the Short Test of Functional Health Literacy in Adults (S-TOFHLA); and (3) use results to demonstrate rewriting of the educational pamphlet. The cloze procedure was applied to a pharmacy-relevant educational pamphlet describing safe medication practices. A total of 162 subjects were recruited from university faculty, staff, and students; a local adult literacy center; and community senior centers. Subjects completed a background interview, the S-TOFHLA, and the cloze procedure for the pharmacy-relevant educational pamphlet. S-TOFHLA and cloze procedure scores were described and compared. Cloze procedure responses were used to demonstrate revision of the pamphlet. Of the 154 subjects analyzed, mean +/- SD age was 56.5 +/- 20.4 years. Subjects were predominantly white (93.5%), female (71.4%), and college graduates (42.2%). Mean score on the S-TOFHLA was 92.1%. A majority (95.5%, 147/154) of subjects demonstrated adequate functional health literacy. In contrast, mean score on the cloze procedure was 53.3%. Internal consistencies of the S-TOFHLA and the cloze procedure were 0.92 and 0.90, respectively. Scores on the cloze procedure and the S-TOFHLA were highly correlated (r = 0.71). In this educated, health-literate sample, a majority did not understand the pharmacy-relevant educational pamphlet despite adequate performance on a standard measure of health literacy. The cloze procedure can be used to assess comprehension of educational materials, solicit feedback from intended users, and guide the revision of educational materials.
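
    A minimal sketch of a cloze test, assuming the usual every-nth-word deletion and exact-match scoring (real scoring schemes often also accept synonyms); the passage and answers are invented placeholders.

      import re

      def make_cloze(text, n=5):
          """Delete every nth word; return the gapped text and the deleted words."""
          words = text.split()
          blanks = {i: words[i] for i in range(n - 1, len(words), n)}
          shown = [("_____" if i in blanks else w) for i, w in enumerate(words)]
          return " ".join(shown), blanks

      def score_cloze(blanks, answers):
          normalize = lambda w: re.sub(r"\W", "", w).lower()
          correct = sum(normalize(answers.get(i, "")) == normalize(w) for i, w in blanks.items())
          return 100.0 * correct / len(blanks)

      passage = ("Take this medicine with food twice a day and do not stop "
                 "taking it until your doctor tells you to stop")
      cloze_text, blanks = make_cloze(passage, n=5)
      print(cloze_text)
      answers = {i: w for i, w in blanks.items()}   # a perfect respondent
      print(f"score: {score_cloze(blanks, answers):.0f}%")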

  6. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a user's fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: Behavioral (signature verification, keystroke dynamics, etc.) and Physiological

  7. SU-E-T-24: A Simple Correction-Based Method for Independent Monitor Unit (MU) Verification in Monte Carlo (MC) Lung SBRT Plans

    Energy Technology Data Exchange (ETDEWEB)

    Pokhrel, D; Badkul, R; Jiang, H; Estes, C; Kumar, P; Wang, F [UniversityKansas Medical Center, Kansas City, KS (United States)

    2014-06-01

    Purpose: Lung SBRT uses hypo-fractionated doses in small non-IMRT fields with tissue-heterogeneity corrected plans. An independent MU verification is mandatory for safe and effective delivery of the treatment plan. This report compares planned MUs obtained from the iPlan XVMC algorithm against a spreadsheet-based hand-calculation using the most commonly used simple TMR-based method. Methods: Treatment plans of 15 patients who underwent MC-based lung SBRT to 50 Gy in 5 fractions prescribed such that PTV V100% = 95% were studied. The ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1 to 106.5 cc (average = 48.6 cc). MC SBRT plans were generated using a combination of non-coplanar conformal arcs/beams with the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX equipped with micro-MLCs and a 6 MV-SRS (1000 MU/min) beam. These plans were re-computed using the heterogeneity-corrected Pencil-Beam (PB-hete) algorithm without changing any beam parameters, such as MLCs/MUs. The dose ratio PB-hete/MC gave beam-by-beam inhomogeneity correction factors (ICFs): Individual Correction. For the independent second check, MC MUs were verified using TMR-based hand-calculation with an average ICF (Average Correction), since TMR-based hand-calculation alone systematically underestimated MC MUs by ∼5%. Also, the first 10 MC plans were verified with an ion-chamber measurement using a homogeneous phantom. Results: For both beams/arcs, the mean PB-hete dose was systematically overestimated by 5.5±2.6% and the mean hand-calculated MU was systematically underestimated by 5.5±2.5% compared to XVMC. With individual correction, mean hand-calculated MUs matched XVMC within -0.3±1.4%/0.4±1.4% for beams/arcs, respectively. After an average 5% correction, hand-calculated MUs matched XVMC within 0.5±2.5%/0.6±2.0% for beams/arcs, respectively. Only a small dependence on tumor volume (TV)/field size (FS) was observed. Ion-chamber measurements were within ±3.0%. Conclusion: PB-hete overestimates dose to lung tumor relative to
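
    A generic sketch of a TMR-based secondary MU check with an average inhomogeneity correction factor applied, in the spirit of the comparison above; the formula is the standard point-dose form and all beam data (TMR, output factor, ICF) are illustrative assumptions, not the clinic's spreadsheet or commissioning values.

      def mu_check(dose_cGy, tmr, output_factor, dose_rate_cGy_per_MU=1.0,
                   inverse_square=1.0, icf=1.0):
          """MU = D / (DR_ref * Sc,p * TMR * ISF * ICF)."""
          return dose_cGy / (dose_rate_cGy_per_MU * output_factor * tmr
                             * inverse_square * icf)

      dose_per_beam = 250.0          # cGy to the isocentre from one beam (illustrative)
      tmr = 0.78                     # TMR at the effective depth, illustrative
      scp = 0.96                     # total output (scatter) factor for a small field
      mu_uncorrected = mu_check(dose_per_beam, tmr, scp)
      mu_corrected = mu_check(dose_per_beam, tmr, scp, icf=0.95)   # ~5% average correction

      print(f"TMR-based MU (no correction): {mu_uncorrected:.0f}")
      print(f"TMR-based MU (avg ICF 0.95):  {mu_corrected:.0f}")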

  8. Task-specific style verification

    Science.gov (United States)

    Pataki, Norbert; Cséri, Tamás; Szügyi, Zalán

    2012-09-01

    Programming antipatterns are commonly used patterns that make code unnecessarily complex and unmaintainable. Nevertheless, beginner programmers, such as students, often use them. Usage of antipatterns should be eliminated from source code. Many antipatterns can be detected at compilation time with an appropriate parser tool. In this paper we argue for a new lint-like tool that detects typical programming antipatterns and is extensible to task-specific verifications. This tool was mainly developed to evaluate students' programs; however, it can be used in industrial projects as well. Our approach is based on pattern matching on the abstract syntax tree provided by the Clang parser. We present our description language that specifies the antipatterns.
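
    A small sketch of AST-based antipattern detection in the spirit described above, using Python's ast module as a stand-in for the paper's Clang-based C++ tooling; the single rule flags the classic "== True" comparison antipattern.

      import ast

      class RedundantBoolCompare(ast.NodeVisitor):
          def __init__(self):
              self.findings = []

          def visit_Compare(self, node):
              # flag equality comparisons against the boolean literals True/False
              for op, comp in zip(node.ops, node.comparators):
                  if isinstance(op, (ast.Eq, ast.NotEq)) and \
                     isinstance(comp, ast.Constant) and isinstance(comp.value, bool):
                      self.findings.append(f"line {node.lineno}: comparison against "
                                           f"{comp.value!r} is redundant")
              self.generic_visit(node)

      source = """
      def check(flag):
          if flag == True:      # antipattern: should be 'if flag:'
              return 1
          return 0
      """
      checker = RedundantBoolCompare()
      checker.visit(ast.parse(source))
      print("\n".join(checker.findings) or "no antipatterns found")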

  9. Formal Verification for a Next-Generation Space Shuttle

    Science.gov (United States)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  10. Image-guided method for TLD-based in vivo rectal dose verification with endorectal balloon in proton therapy for prostate cancer

    Energy Technology Data Exchange (ETDEWEB)

    Hsi, Wen C.; Fagundes, Marcio; Zeidan, Omar [ProCure Proton Therapy Center, Oklahoma City, Oklahoma 73142 (United States); Hug, Eugen [ProCure Proton Therapy Centers, New York, New York 10016 (United States); Schreuder, Niek [ProCure Training and Development Center, Bloomington, Indiana 47404 (United States)

    2013-05-15

    Purpose: To present a practical image-guided method to position an endorectal balloon that improves in vivo thermoluminescent dosimeter (TLD) measurements of rectal doses in proton therapy for prostate cancer. Methods: TLDs were combined with endorectal balloons to measure dose at the anterior rectal wall during daily proton treatment delivery. Radiopaque metallic markers were employed as surrogates for balloon position reproducibility in rotation and translation. The markers were utilized to guide the balloon orientation during daily treatment employing orthogonal x-ray image-guided patient positioning. TLDs were placed at the 12 o'clock position on the anterior balloon surface at the midprostatic plane. Markers were placed at the 3 and 9 o'clock positions on the balloon to align it with respect to the planned orientation. The balloon rotation along its stem axis, referred to as roll, causes TLD displacement along the anterior-posterior direction. The magnitude of TLD displacement is revealed by the separation distance between markers at opposite sides of the balloon on sagittal x-ray images. Results: A total of 81 in vivo TLD measurements were performed on six patients. Eighty-three percent of all measurements (65 TLD readings) were within +5% and -10% of the planning dose with a mean of -2.1% and a standard deviation of 3.5%. Examination of marker positions on in-room x-ray images for measured doses between -10% and -20% of the planned dose revealed a strong correlation between balloon roll and TLD displacement posteriorly from the planned position. The magnitude of the roll was confirmed by separations of 10-20 mm between the markers, which could be corrected by manually adjusting the balloon position and verified by a repeat x-ray image prior to proton delivery. This approach could properly correct the balloon roll, resulting in TLD positioning within 2 mm along the anterior-posterior direction. Conclusions: Our results show that image-guided TLD-based

  11. Acridizinium-Substituted Dendrimers As a New Potential Rewritable Optical Data Storage Material for Blu-ray

    DEFF Research Database (Denmark)

    Lohse, Brian; Vestberg, Robert; Ivanov, Mario T.

    2008-01-01

    . This provides an alternative chromophore for rewritable optical data storage media to the existing dye materials such as azo, cyanine, and phthalocyanine dyes for Blu-ray recording. The compound was initially tested in ethanol, showing good reversible properties and photoinduced degree of dimerization....... The (acridizinium) 12-bis-MPA dendrimer was cast on a quartz plate, using poly(vinylpyrrolidone) as a matrix, in order to simulate conditions found in DVD discs for existing dyes. The film showed good transmission, stability, and mechanical properties. Through gray scale recording it may be possible to store more...

  12. Verification and enhancement high resolution layers 2012 for Bulgaria

    Science.gov (United States)

    Dimitrov, Ventzeslav; Lubenov, Todor

    Production of high-resolution layers (HRL) is a substantial part of the pan-European component of the GMES/Copernicus initial operations (GIO) land monitoring service. The focus of this paper is on the results of the implementation of the HRL verification and enhancement tasks for Bulgarian territory. For the reference year 2012, five HRLs on land cover characteristics were produced by service providers through sophisticated classification of multi-sensor and multi-temporal satellite images: imperviousness, forests, grasslands, wetlands and permanent water bodies. As a result of the verification, systematic classification errors relevant to the subsequent enhancement procedure were identified. The verification was carried out through visual inspection of stratified samples in the HRL using reliable reference spatial data sets, checking for commission and omission errors. The applied procedure included three major parts, the first two obligatory: a general overview of data quality, look-and-feel control of critical strata, and statistically based quantitative verification. The enhancement task consisted of correcting the errors revealed by the verification, giving as a result the final enhanced HRL products. Stratification schemes, evaluation grades by strata and HRL from the look-and-feel verification, and accuracy values from the statistical verification are presented. The types and quantities of mistakes removed during the enhancement are structured and summarised. Results show that all HRLs except the grasslands layer meet the 85% accuracy requirements.
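
    For readers unfamiliar with the terms, commission and omission error rates fall directly out of the sample confusion counts; a minimal sketch with hypothetical verification samples (reference label versus HRL label) is:

      # Commission/omission error rates from verification samples (illustrative data).
      from collections import Counter

      # (reference label, HRL label) for a stratified sample of points
      samples = [("forest", "forest"), ("forest", "grassland"), ("grassland", "grassland"),
                 ("water", "water"), ("forest", "forest"), ("grassland", "forest")]

      mapped = Counter(h for _, h in samples)        # how often each class was mapped
      reference = Counter(r for r, _ in samples)     # how often each class truly occurs
      correct = Counter(r for r, h in samples if r == h)

      for cls in sorted(set(mapped) | set(reference)):
          commission = 1 - correct[cls] / mapped[cls] if mapped[cls] else float("nan")
          omission = 1 - correct[cls] / reference[cls] if reference[cls] else float("nan")
          print(f"{cls:10s} commission={commission:.2f} omission={omission:.2f}")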

  13. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work...... is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from...... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components....

  14. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  15. Formal Verification of Distributed Algorithms (Dagstuhl Seminar 13141)

    OpenAIRE

    Charron-Bost, Bernadette; Merz, Stephan; Rybalchenko, Andrey; Widder, Josef

    2013-01-01

    The Dagstuhl Seminar 13141 "Formal Verification of Distributed Algorithms" brought together researchers from the areas of distributed algorithms, model checking, and semi-automated proofs with the goal to establish a common base for approaching the many open problems in verification of distributed algorithms. In order to tighten the gap between the involved communities, who have been quite separated in the past, the program contained tutorials on the basics of the concerned fields. In addi...

  16. Secure Image Hash Comparison for Warhead Verification

    Energy Technology Data Exchange (ETDEWEB)

    Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.

    2014-06-06

    The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.
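
    As a hedged illustration of the perceptual-hashing idea reviewed here (not of any warhead verification system), the sketch below computes a simple average hash of a grey-scale image and compares two hashes by Hamming distance; tolerance to small perturbations is exactly the property the paper argues is at odds with cryptographic security:

      # A minimal average-hash ("aHash") sketch; values and sizes are illustrative.
      import numpy as np

      def average_hash(image, hash_size=8):
          """image: 2D numpy array of grey values; returns a flat boolean array."""
          h, w = image.shape
          # crude block-mean downsampling to hash_size x hash_size
          small = image[:h - h % hash_size, :w - w % hash_size]
          small = small.reshape(hash_size, h // hash_size, hash_size, w // hash_size).mean(axis=(1, 3))
          return (small > small.mean()).ravel()

      def hamming(h1, h2):
          return int(np.count_nonzero(h1 != h2))

      rng = np.random.default_rng(0)
      img = rng.random((64, 64))
      noisy = img + 0.01 * rng.standard_normal(img.shape)
      print(hamming(average_hash(img), average_hash(noisy)))   # small distance expected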

  17. Overview of Code Verification

    Science.gov (United States)

    1983-01-01

    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  18. Learning a Genetic Measure for Kinship Verification Using Facial Images

    Directory of Open Access Journals (Sweden)

    Lu Kou

    2015-01-01

    Full Text Available Motivated by the key observation that children generally resemble their parents more than other persons with respect to facial appearance, distance metric (similarity learning has been the dominant choice for state-of-the-art kinship verification via facial images in the wild. Most existing learning-based approaches to kinship verification, however, are focused on learning a genetic similarity measure in a batch learning manner, leading to less scalability for practical applications with ever-growing amount of data. To address this, we propose a new kinship verification approach by learning a sparse similarity measure in an online fashion. Experimental results on the kinship datasets show that our approach is highly competitive to the state-of-the-art alternatives in terms of verification accuracy, yet it is superior in terms of scalability for practical applications.
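
    The paper's specific sparse online formulation is not reproduced here; as a hedged stand-in for the general idea of learning a pair-similarity measure one labelled pair at a time, the sketch below trains a per-feature logistic model on absolute feature differences using synthetic data:

      # Toy online learning of a pair-verification measure (illustrative only).
      import numpy as np

      rng = np.random.default_rng(5)
      d, lr = 16, 0.1
      w, b = np.zeros(d), 0.0

      def p_kin(x, y):
          z = -w @ np.abs(x - y) + b              # closer pairs -> higher kin probability
          z = float(np.clip(z, -60.0, 60.0))      # keep exp() well-behaved
          return 1.0 / (1.0 + np.exp(-z))

      for _ in range(3000):                       # stream of labelled pairs (synthetic)
          kin = rng.random() < 0.5
          x = rng.standard_normal(d)
          y = x + 0.3 * rng.standard_normal(d) if kin else rng.standard_normal(d)
          err = float(kin) - p_kin(x, y)          # online gradient step on the log-loss
          w += lr * err * (-np.abs(x - y))        # note the sign: weights act on -|x - y|
          b += lr * err

      x = rng.standard_normal(d)
      print(p_kin(x, x + 0.3 * rng.standard_normal(d)) > p_kin(x, rng.standard_normal(d)))  # usually True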

  19. Reconfigurable system design and verification

    CERN Document Server

    Hsiung, Pao-Ann; Huang, Chun-Hsian

    2009-01-01

    Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains. The text e

  20. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    methods in the verification task. Today formal verification is finding increasing acceptance ... approaches that are major research issues in formal verification research today. There are four articles in this issue, which show the different flavours of approaches to formal methods in verification. The first paper by Supratik ...

  1. Formal Verification, Engineering and Business Value

    Directory of Open Access Journals (Sweden)

    Ralf Huuck

    2012-12-01

    Full Text Available How to apply automated verification technology such as model checking and static program analysis to millions of lines of embedded C/C++ code? How to package this technology in a way that it can be used by software developers and engineers who might have no background in formal verification? And how to convince business managers to actually pay for such software? This work addresses a number of those questions. Based on our own experience of developing and distributing the Goanna source code analyzer for detecting software bugs and security vulnerabilities in C/C++ code, we explain the underlying technology of model checking, static analysis and SMT solving, as well as the steps involved in creating industrial-proof tools.

  2. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing the requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  3. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  4. Spatial Verification Using Wavelet Transforms: A Review

    CERN Document Server

    Weniger, Michael; Friederichs, Petra

    2016-01-01

    Due to the emergence of new high resolution numerical weather prediction (NWP) models and the availability of new or more reliable remote sensing data, the importance of efficient spatial verification techniques is growing. Wavelet transforms offer an effective framework to decompose spatial data into separate (and possibly orthogonal) scales and directions. Most wavelet based spatial verification techniques have been developed or refined in the last decade and concentrate on assessing forecast performance (i.e. forecast skill or forecast error) on distinct physical scales. Particularly during the last five years, a significant growth in meteorological applications could be observed. However, a comparison with other scientific fields such as feature detection, image fusion, texture analysis, or facial and biometric recognition, shows that there is still a considerable, currently unused potential to derive useful diagnostic information. In order to tap the full potential of wavelet analysis, we revise the stat...
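
    A hedged sketch of the basic operation behind such techniques, namely decomposing the forecast-minus-observation field into scales and summarising the error energy per scale, using the PyWavelets package and synthetic stand-in fields (no real NWP data):

      # Scale-separated forecast error via a 2D wavelet decomposition (illustrative).
      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      obs = rng.random((64, 64))
      fcst = obs + 0.1 * rng.standard_normal((64, 64))   # forecast = obs + noise

      err_coeffs = pywt.wavedec2(fcst - obs, "haar", level=3)

      # Energy of the error on each scale, coarsest first
      energies = [float(np.sum(err_coeffs[0] ** 2))]
      for detail in err_coeffs[1:]:
          energies.append(float(sum(np.sum(d ** 2) for d in detail)))
      print(energies)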

  5. Program verification using symbolic game semantics

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar

    2014-01-01

    We introduce a new symbolic representation of algorithmic game semantics, and show how it can be applied for efficient verification of open (incomplete) programs. The focus is on an Algol-like programming language which contains the core ingredients of imperative and functional languages...... of game semantics to that of corresponding symbolic representations. In this way programs with infinite data types, such as integers, can be expressed as finite-state symbolic-automata although the standard automata representation is infinite-state, i.e. the standard regular-language representation has...... infinite summations. Moreover, in this way significant reductions of the state space of game semantics models are obtained. This enables efficient verification of programs by our prototype tool based on symbolic game models, which is illustrated with several examples....

  6. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, model checking is not yet commonly used in industry, as it typically requires formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  7. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  8. Multiple-wavelength double random phase encoding with CCD-plane sparse-phase multiplexing for optical information verification.

    Science.gov (United States)

    Chen, Wen

    2015-12-20

    A novel method is proposed by using multiple-wavelength double random phase encoding (MW-DRPE) with CCD-plane sparse-phase multiplexing for optical information verification. Two different strategies are applied to conduct sparse-phase multiplexing in the CCD plane. The results demonstrate that large capacity can be achieved for optical multiple-image verification. The proposed optical verification strategy is implemented based on optical encoding, and the keys generated by optical encryption can further guarantee the safety of the designed optical multiple-image verification system. The proposed method provides a novel alternative for DRPE-based optical information verification.
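
    A minimal NumPy sketch of classical single-wavelength DRPE, just to make the underlying optical transform concrete; the multiple-wavelength encoding and CCD-plane sparse-phase multiplexing of the proposed method are considerably more involved and are not modelled here:

      # Classical DRPE: two random phase masks, one in the input plane and one in the
      # Fourier plane. Illustrative single-wavelength version only.
      import numpy as np

      rng = np.random.default_rng(2)
      img = rng.random((128, 128))                          # stand-in for the input image

      phase1 = np.exp(2j * np.pi * rng.random(img.shape))   # input-plane mask (key 1)
      phase2 = np.exp(2j * np.pi * rng.random(img.shape))   # Fourier-plane mask (key 2)

      encrypted = np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

      # Decryption with the correct keys recovers the image (up to numerical error)
      decrypted = np.abs(np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(phase2)) * np.conj(phase1))
      print(np.allclose(decrypted, img, atol=1e-10))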

  9. Rewriting the Matrix of Life. Biomedia Between Ecological Crisis and Playful Actions

    Directory of Open Access Journals (Sweden)

    Christoph Neubert, Serjoscha Wiemer

    2014-09-01

    Full Text Available The paper discusses concepts of ‘nature’ and ‘life’ as subjected to historical changes. The 21st century seems to be obsessed with ‘life’ and ‘nature’, which are reconfigured as objects of simulation practices and of a multitude of technoscientific enterprises as well as of political struggle. The historical influences and epistemological shifts of systems thinking are significant within two distinctive and interwoven fields: on the one hand, the discourse of environmentalism with the paradigm of ecological crises, centered on ideas of resource management, sustainability, the general idea of an ‘endangered nature’ and the interconnectedness of global politics and individual actions; on the other hand, the optimistic promises of artificial life, with synthetic biology and digital cyborg technologies as its avant-garde, which are very much driven by the idea of technoscientific mastery to surpass nature’s ‘weakness’ and by desires to improve ‘life’ and even to refashion ‘life itself’. In the field of historical ecology, concepts of systems thinking are traced back to the middle of the 19th century, where ecological thought emerged at the intersections of biology and geography. Meandering between vitalistic, holistic, and mechanistic concepts, between living and non-living elements, systems ecology finally substitutes ‘nature’, which in turn is re-established in its new ‘gestalt’ as a computer-simulated world model since the early 1970s. Resurrected as an interrelation of system variables at the level of global simulations, ‘nature’ strikes one as a zombie. As a second turning point in the rewriting of the matrix of life, we will discuss the advance of ‘games’ since the early 1970s, with ‘Game of Life’ (‘Life’) as a significant landmark. When ‘life’ becomes ‘Life’, it is by computerized modeling in terms of dynamic processes. Computer games can be thought of as instances of

  10. "If you thought this story sour, sweeten it with your own telling" - a feminist poetics of rewriting in Susan Price's Ghost dance

    Directory of Open Access Journals (Sweden)

    Sanna Lehtonen

    2010-01-01

    Full Text Available The attempts to challenge conventional gendered discourses in children's fantasy have often resulted in feminist rewritings of earlier stories. Ghost dance (1994) by the English author Susan Price is a novel that reflects a specific feminist poetics of rewriting: metafictional passages highlight the constructedness of the narrative and at the end readers are invited to tell their own versions of the story. Moreover, the rewriting freely combines and recontextualises elements from different source texts and reformulates them to create a narrative that challenges conventional discourses of gender. While this poetics has an appeal from a feminist perspective, the play with cross-cultural intertexts and gender becomes more complex when the novel is examined in a postcolonialist framework in relation to ethnicity and the issue of cultural appropriation. Ghost dance is situated in a setting that has a real-world equivalent (Russia), involves characters that are identified with names of real-world ethnic groups (Lapps (Sámi), Russian), and mixes elements from Russian wonder tales, Nordic mythology and an Ojibwe legend. The novel does not aim at historical accuracy in its representations, nor is it a direct retelling of any of the pre-texts, but combines motifs, themes, names, characters and settings freely from each source. In this textual melting pot, the protagonist Shingebiss is, on one level, a revision of the witch Baba Yaga, but also described as a Lappish shaman with an Ojibwe name. To rewrite gendered discourses, certain elements from the pre-texts are chosen and others left out – the question is, then, what effects does this recontextualisation have on the representation of ethnicity? Or, are the feminist rewriting strategies actually a form of cultural appropriation?

  11. Formal Verification of Architectural Patterns in Support of Dependable Distributed Systems

    Science.gov (United States)

    2005-07-01

    Report documentation page for the extended abstract 'Formal Verification of Architectural Patterns in Support of Dependable Distributed Systems' (July 2005; reporting period 00-00-2005 to 00-00-2005; contact: itd.nrl.navy.mil). Keywords: architectural patterns, dependable software, component-based development, formal verification.

  12. Pedestrian Detection Based on Adaptive Selection of Visible Light or Far-Infrared Light Camera Image by Fuzzy Inference System and Convolutional Neural Network-Based Verification.

    Science.gov (United States)

    Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung

    2017-07-08

    A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.
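
    A hedged, heavily simplified sketch of the selection step only: fuzzy memberships for two plausible inputs (ambient illumination and the gap between body and background temperature) are combined by simple rules into a preference for the visible-light or FIR candidate. The membership shapes, rules and inputs are assumptions for illustration; the paper's FIS and the CNN verification stage are far richer.

      # Toy fuzzy selection between a visible-light and an FIR pedestrian candidate.
      # Membership shapes, rules and thresholds are illustrative assumptions.
      def tri(x, a, b, c):
          """Triangular membership function on [a, c] peaking at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def select_camera(illumination_lux, temp_gap_celsius):
          dark = tri(illumination_lux, -1, 0, 50)            # membership "scene is dark"
          bright = tri(illumination_lux, 20, 500, 100000)    # membership "scene is bright"
          low_gap = tri(temp_gap_celsius, -1, 0, 4)          # body ~ background temperature
          high_gap = tri(temp_gap_celsius, 2, 10, 60)

          fir_score = max(dark, high_gap)                    # "dark OR clear thermal contrast"
          visible_score = max(bright, low_gap)               # "bright OR weak thermal contrast"
          return "FIR" if fir_score >= visible_score else "visible"

      print(select_camera(illumination_lux=5, temp_gap_celsius=8))      # night, clear gap -> FIR
      print(select_camera(illumination_lux=2000, temp_gap_celsius=1))   # day, hot background -> visible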

  13. Adopting model checking techniques for clinical guidelines verification.

    Science.gov (United States)

    Bottrighi, Alessio; Giordano, Laura; Molino, Gianpaolo; Montani, Stefania; Terenziani, Paolo; Torchio, Mauro

    2010-01-01

    Clinical guidelines (GLs) are assuming a major role in the medical area, in order to guarantee the quality of medical assistance and to optimize medical treatments within healthcare organizations. The verification of properties of the GL (e.g., the verification of GL correctness with respect to several criteria) is a demanding task, which may be enhanced through the adoption of advanced Artificial Intelligence techniques. In this paper, we propose a general and flexible approach to address such a task. Our approach to GL verification is based on the integration of a computerized GL management system with a model-checker. We propose a general methodology, and we instantiate it by loosely coupling GLARE, our system for acquiring, representing and executing GLs, with the model-checker SPIN. We have carried out an in-depth analysis of the types of properties that can be effectively verified using our approach, and we have completed an overview of the usefulness of the verification task at the different stages of the GL life-cycle. In particular, experimentation on a GL for ischemic stroke has shown that the automatic verification of properties in the model checking approach is able to discover inconsistencies in the GL that cannot be detected in advance by hand. Our approach thus represents a further step in the direction of general and flexible automated GL verification, which also meets usability requirements.

  14. De la primera lengua a la traducción literaria: Itinerarios de evaluación y reescritura creativa / From mother tongue teaching to literary translation: Assessing and creative rewriting

    Directory of Open Access Journals (Sweden)

    Jorge J. SÁNCHEZ IGLESIAS

    2012-03-01

    Full Text Available The distinguishing characteristics of First Language training - a vaguely defined area of study which is not usually explored in literary publications - will form the basis of our investigation. We will consider two sets of activities, both linked to evaluation and rewriting, which will allow us to explore the concepts of intentional reading and deautomisation of the writing process. These notions seem especially useful for encouraging creativity in language use, a highly important skill in literary translation. Based on the results of a few initial experiments involving the rewriting of literary texts, we can conclude that novice writers are intuitively aware of tone and style. These notions could therefore provide an excellent focus for literary translation training.

  15. ALMA Science Verification

    Science.gov (United States)

    Hills, R.

    2011-01-01

    As many of you are aware, ALMA has reached a very exciting point in the construction phase. After a year of testing the basic functionality of antennas and small arrays at the Chajnantor site at 5000m, we are now able to run full observations of scientific targets using at least 8 antennas and 4 receiver bands. We recently had a series of reviews of all aspects of the ALMA Project, resulting in a consensus that we will be ready to issue a Call for Proposals for Early Science projects at the end of the first quarter of 2011, with an expectation of starting these Early Science observations toward the end of 2011. ALMA Science Verification is the process by which we will demonstrate that the data that will be produced by ALMA during Early Science is valid. This is done by running full "end to end" tests of ALMA as a telescope. We will observe objects for which similar data are already available for other telescopes. This allows us to make direct quantitative comparisons of all aspects of the data cubes, in order to determine whether the ALMA instrumentation or software is introducing any artifacts.

  16. Tags and seals for arms control verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.

    1990-09-18

    Tags and seals have long been recognized as important tools in arms control. The trend in control of armaments is to limit militarily significant equipment that is capable of being verified through direct and cooperative means, chiefly on-site inspection or monitoring. Although this paper will focus on the CFE treaty, the role of tags and seals for other treaties will also be addressed. Published technology and concepts will be reviewed, based on open sources. Arms control verification tags are defined as unique identifiers designed to be tamper-revealing; in that respect, seals are similar, being used as indicators of unauthorized access. Tamper-revealing tags might be considered as single-point markers, seals as two-point couplings, and nets as volume containment. The functions of an arms control tag can be considered to be two-fold: to provide field verification of the identity of a treaty-limited item (TLI), and to have a means of authentication of the tag and its tamper-revealing features. Authentication could take place in the field or be completed elsewhere. For CFE, the goal of tags and seals can be to reduce the overall cost of the entire verification system.

  17. VERIFICATION OF STATISTICAL CLOUDINESS ESTIMATIONS FOR EUROPE

    Directory of Open Access Journals (Sweden)

    Zoltán Imecs

    2012-03-01

    Full Text Available Verification of statistical cloudiness estimations for Europe. The climate forcing induced by cloud cover is one of the main uncertain aspects of climate change predictions. In the case of cloudiness, even the sign of the trends is not consistent within a given region. In this sense, further investigation of the behaviour of cloudiness is warranted. In this study a statistical estimation of total cloudiness is elaborated using the method of instrumental variables. For this analysis, surface-observed monthly mean cloudiness data were used for the period 1973-1996. In the second part of the study the results are verified against an independent satellite-retrieved data series for the period 2005-2011. Based on the verification it can be concluded that the applied statistical estimation is able to reproduce the measured values with an RMSE of 7.3%; the difference between the measured and predicted changes of cloudiness is 1.44%, and a stronger decrease of cloudiness was found in the real data than the estimation had indicated. The main difference between the observed and predicted values is evident in the distribution of frequencies, which shows a shift towards lower values in the observed data that is not reproduced in the estimated values. In the geographical distribution of the sign of the estimation errors, a difference is detected between water surfaces and continental regions.
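
    The verification statistics mentioned above (RMSE and the mean difference between estimated and observed cloudiness) reduce to a few lines; the sketch below uses hypothetical monthly mean values in percent, not the study's data:

      # RMSE and mean bias between estimated and observed total cloudiness (in %).
      import numpy as np

      observed = np.array([62.0, 58.5, 55.0, 49.0, 51.5, 60.0])
      estimated = np.array([60.5, 59.0, 57.0, 50.5, 53.0, 58.0])

      diff = estimated - observed
      rmse = float(np.sqrt(np.mean(diff ** 2)))
      bias = float(np.mean(diff))
      print(f"RMSE = {rmse:.2f} %, mean bias = {bias:.2f} %")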

  18. Runtime Verification of C Memory Safety

    Science.gov (United States)

    Roşu, Grigore; Schulte, Wolfram; Şerbănuţă, Traian Florin

    C is the most widely used imperative systems implementation language. While C provides types and high-level abstractions, its design goal has been to provide the highest performance, which often requires low-level access to memory. As a consequence, C supports arbitrary pointer arithmetic, casting, and explicit allocation and deallocation. These operations are difficult to use, resulting in programs that often have software bugs like buffer overflows and dangling pointers that cause security vulnerabilities. We say a C program is memory safe if at runtime it never goes wrong with such a memory access error. Based on standards for writing “good” C code, this paper proposes strong memory safety as the least restrictive formal definition of memory safety amenable for runtime verification. We show that although verification of memory safety is in general undecidable, even when restricted to closed, terminating programs, runtime verification of strong memory safety is a decision procedure for this class of programs. We verify strong memory safety of a program by executing the program using a symbolic, deterministic definition of the dynamic semantics. A prototype implementation of these ideas shows the feasibility of this approach.
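
    As a hedged illustration of the runtime-checking idea (not of the paper's C semantics), the sketch below models a checked heap in which every allocation carries bounds and a liveness flag, so that out-of-bounds and use-after-free accesses are caught as they happen:

      # Toy runtime monitor in the spirit of memory-safety checking (illustrative only).
      class CheckedHeap:
          def __init__(self):
              self._blocks = {}          # address -> [size, alive, storage]
              self._next = 1

          def malloc(self, size):
              addr = self._next
              self._next += size
              self._blocks[addr] = [size, True, [0] * size]
              return addr

          def free(self, addr):
              self._blocks[addr][1] = False      # mark dead; later use is an error

          def store(self, addr, offset, value):
              size, alive, data = self._blocks[addr]
              if not alive:
                  raise RuntimeError("use after free")
              if not 0 <= offset < size:
                  raise RuntimeError("out-of-bounds store")
              data[offset] = value

      heap = CheckedHeap()
      p = heap.malloc(4)
      heap.store(p, 3, 42)        # fine
      try:
          heap.store(p, 4, 7)     # buffer overflow is caught at runtime
      except RuntimeError as e:
          print(e)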

  19. Verification of micro-beam irradiation

    Science.gov (United States)

    Li, Qiongge; Juang, Titania; Beth, Rachel; Chang, Sha; Oldham, Mark

    2015-01-01

    Micro-beam Radiation Therapy (MRT) is an experimental radiation therapy with provocative experimental data indicating potential for improved efficacy in some diseases. Here we demonstrate a comprehensive micro-beam verification method utilizing high-resolution (50 µm) PRESAGE/Micro-Optical-CT 3D dosimetry. A small cylindrical PRESAGE dosimeter was irradiated by a novel compact Carbon-Nano-Tube (CNT) field-emission-based MRT system. Percentage Depth Dose (PDD), Peak-to-Valley Dose Ratio (PVDR) and beam width (FWHM) data were obtained and analyzed from a three-strip irradiation experiment. A fast dose drop-off with depth, a beam width preserved with depth (the averaged FWHM across the three beams remains constant (405.3 µm, sigma = 13.2 µm) between depths of 3.0-14.0 mm), and a high PVDR (increasing with depth from 6.3 at 3.0 mm to 8.6 at 14.0 mm) were observed during this verification process. Operating procedures such as precise dosimeter mounting, robust mechanical motions (especially rotation) and stray-light artifact management were optimized and developed to achieve a more accurate dosimetric verification method.

  20. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
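
    A hedged sketch of the likelihood-ratio decision rule described above, using independent Gaussian score models for the genuine and imposter populations over an acquired subset of modalities; the study fits a much richer joint model over all 12 scores, and every parameter below is a made-up placeholder:

      # Likelihood-ratio verification from a subset of biometric similarity scores.
      # Gaussian score models and all parameters are illustrative assumptions.
      import math

      GENUINE = {"finger": (0.80, 0.10), "iris": (0.85, 0.08)}   # (mean, std) per modality
      IMPOSTER = {"finger": (0.30, 0.12), "iris": (0.25, 0.10)}

      def log_pdf(x, mean, std):
          return -0.5 * math.log(2 * math.pi * std ** 2) - (x - mean) ** 2 / (2 * std ** 2)

      def log_likelihood_ratio(scores):
          """scores: dict of modality -> observed similarity score."""
          return sum(log_pdf(s, *GENUINE[m]) - log_pdf(s, *IMPOSTER[m]) for m, s in scores.items())

      THRESHOLD = 0.0   # tuned in practice to meet FAR/FRR targets
      observed = {"finger": 0.74, "iris": 0.81}
      llr = log_likelihood_ratio(observed)
      print(f"LLR = {llr:.2f} -> {'accept' if llr > THRESHOLD else 'reject'}")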

  2. The fairy tale: recent interpretations, female characters and contemporary rewriting. Considerations about an “irresistible” genre

    Directory of Open Access Journals (Sweden)

    Susanna Barsotti

    2015-07-01

    Full Text Available Since ancient times, the fairy tale has caught the imagination of human beings everywhere in the world. Its appeal reaches us also through reinterpretations and constant contaminations across different media, from oral telling to writing, from cinema to theatre, from advertising to animation. This article highlights the key features of a much-analysed genre in light of the most recent studies, and follows it along the new routes it has embarked on in our time. Special attention is then paid to the female presence and to the origin and evolution that she, in her most varied personifications – the innocent girl persecuted by the fairy and by the witch – has undergone up to the latest rewritings of the fairy tales.

  3. FCJ-180 Spotify has Added an Event to your Past: (Rewriting the Self through Facebook’s Autoposting Apps

    Directory of Open Access Journals (Sweden)

    Tanya Kant

    2015-08-01

    Full Text Available Drawing on in-depth interviews with sixteen Facebook users, this paper presents a series of vignettes that explore cross-platform Facebook apps as ‘tools’ for self-writing, self-expression and identity performance. The paper argues that the capacity of apps to write in the user’s stead – at times without the user’s knowledge or explicit consent – works to intervene in and on occasion disrupt users’ staged self-performances to their ‘invisible audience’ (Sauter, 2013) on Facebook. Furthermore, if such instances of automated self-writing are treated as performative, apps hold the constitutional capacity to actively rewrite, regulate and even constitute the self to suit the logic of the ‘like economy’ (Gerlitz and Helmond, 2013), in ways that transcend the boundaries of Facebook.

  4. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    Science.gov (United States)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  5. Predicting SMT Solver Performance for Software Verification

    Directory of Open Access Journals (Sweden)

    Andrew Healy

    2017-01-01

    Full Text Available The Why3 IDE and verification system facilitates the use of a wide range of Satisfiability Modulo Theories (SMT) solvers through a driver-based architecture. We present Where4: a portfolio-based approach to discharge Why3 proof obligations. We use data analysis and machine learning techniques on static metrics derived from program source code. Our approach benefits software engineers by providing a single utility to delegate proof obligations to the solvers most likely to return a useful result. It does this in a time-efficient way using existing Why3 and solver installations - without requiring low-level knowledge about SMT solver operation from the user.
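
    As a hedged sketch of the portfolio idea (predicting which solver to try first from static metrics of a proof obligation), the snippet below ranks solvers by a nearest-neighbour lookup over a tiny, made-up training set; Where4 itself uses learned regression models integrated with Why3, and the metrics, timings and records here are all invented for illustration:

      # Toy portfolio selection: pick the solver that was fastest on the most similar
      # previously seen proof obligation. All data below is made up.
      import math

      # (static metric vector, {solver: solve time in seconds, inf = timeout})
      history = [
          ((12, 3, 0), {"Alt-Ergo": 0.4, "Z3": 0.2, "CVC4": math.inf}),
          ((80, 9, 4), {"Alt-Ergo": math.inf, "Z3": 5.1, "CVC4": 2.3}),
          ((25, 5, 1), {"Alt-Ergo": 0.9, "Z3": 1.8, "CVC4": 1.1}),
      ]

      def rank_solvers(metrics):
          _, timings = min(history, key=lambda item: math.dist(item[0], metrics))
          return sorted(timings, key=timings.get)   # fastest first on the nearest neighbour

      print(rank_solvers((20, 4, 1)))   # prints a fastest-first ranking for this toy data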

  6. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  7. Fifty years of progress in speaker verification

    Science.gov (United States)

    Rosenberg, Aaron E.

    2004-10-01

    The modern era in speaker recognition started about 50 years ago at Bell Laboratories with the controversial invention of the voiceprint technique for speaker identification based on expert analysis of speech spectrograms. Early speaker recognition research concentrated on finding acoustic-phonetic features effective in discriminating speakers. The first truly automatic text dependent speaker verification systems were based on time contours or templates of speaker specific acoustic features. An important element of these systems was the ability to time warp sample templates with model templates in order to provide useful comparisons. Most modern text dependent speaker verification systems are based on statistical representations of acoustic features analyzed as a function of time over specified utterances, most particularly the hidden Markov model (HMM) representation. Modern text independent systems are based on vector quantization representations and, more recently, on Gaussian mixture model (GMM) representations. An important ingredient of statistically based systems is likelihood ratio decision techniques making use of speaker background models. Some recent research has shown how to extract higher level features based on speaking behavior and combine them with lower level acoustic features for improved performance. The talk will present these topics in historical order showing the evolution of techniques.

  8. SU-F-T-287: A Preliminary Study On Patient Specific VMAT Verification Using a Phosphor-Screen Based Geometric QA System (Raven QA)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M; Yi, B [University of Maryland School of Medicine, Baltimore, MD (United States); Wong, J; Ding, K [Johns Hopkins University, Baltimore, MD (United States)

    2016-06-15

    Purpose: The RavenQA system (LAP Laser, Germany) is a QA device with a phosphor-screen detector for performing the QA tasks of TG-142. This study tested whether it is feasible to use the system for patient-specific QA of Volumetric Modulated Arc Therapy (VMAT). Methods: Water-equivalent material (5 cm) is attached to the front of the detector plate of the RavenQA for dosimetry purposes. The plate is then attached to the gantry to synchronize the movement between the detector and the gantry. Since the detector moves together with the gantry, the ‘Reset gantry to 0’ function of the Eclipse planning system (Varian, CA) is used to simulate the measurement situation when calculating the dose to the detector plate. The same gantry setup is used when delivering the treatment beam for feasibility-test purposes. Cumulative dose is acquired for each arc. The optical scatter component of each image captured by the CCD camera is corrected by deconvolving a 2D spatially invariant optical scatter kernel (OSK). We assume that the OSK is a 2D isotropic point spread function that decreases with the inverse square of the radius from the center. Results: Three VMAT plans, including head & neck, whole pelvis and abdomen-pelvis, were tested. Setup time for measurements was less than 5 minutes. Passing rates of absolute gamma were 99.3%, 98.2% and 95.9%, respectively, for the 3%/3mm criteria and 96.2%, 97.1% and 86.4% for the 2%/2mm criteria. The abdomen-pelvis plan has long treatment fields (37 cm), which are longer than the detector plate (25 cm); this plan showed a relatively lower passing rate than the other plans. Conclusion: An algorithm for IMRT/VMAT verification using the RavenQA has been developed and tested. The model of a spatially invariant OSK works well for deconvolution purposes. This demonstrates that the RavenQA can be used for patient-specific verification of VMAT. This work is funded in part by a Maryland Industrial Partnership Program grant to University of Maryland and to JPLC who owns the
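
    A hedged sketch of that correction step: build an inverse-square kernel, blur a synthetic 'true' image with it, and remove the blur again by regularised Fourier deconvolution. The kernel shape, grid size and regularisation constant are illustrative assumptions, not the RavenQA values.

      # Removing a spatially invariant optical scatter kernel (OSK) by Fourier deconvolution.
      import numpy as np

      def inverse_square_kernel(size, eps=1.0):
          y, x = np.indices((size, size)) - size // 2
          k = 1.0 / (x ** 2 + y ** 2 + eps)       # isotropic, ~1/r^2 fall-off
          return k / k.sum()

      size = 128
      true_dose = np.zeros((size, size))
      true_dose[48:80, 48:80] = 1.0
      kernel = inverse_square_kernel(size)

      # Simulated captured image: true signal blurred by the OSK (circular convolution)
      K = np.fft.fft2(np.fft.ifftshift(kernel))
      captured = np.real(np.fft.ifft2(np.fft.fft2(true_dose) * K))

      # Deconvolution with a small regularisation to avoid division by ~0
      recovered = np.real(np.fft.ifft2(np.fft.fft2(captured) * np.conj(K) / (np.abs(K) ** 2 + 1e-6)))
      print(float(np.abs(recovered - true_dose).max()))   # residual error of the deconvolution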

  9. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    Science.gov (United States)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the

  10. Nearest-Neighbor Estimation for ROC Analysis under Verification Bias.

    Science.gov (United States)

    Adimari, Gianfranco; Chiogna, Monica

    2015-05-01

    For a continuous-scale diagnostic test, the receiver operating characteristic (ROC) curve is a popular tool for displaying the ability of the test to discriminate between healthy and diseased subjects. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the test result and other characteristics of the subjects. Estimators of the ROC curve based only on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias, in particular under the assumption that the true disease status, if missing, is missing at random (MAR). MAR assumption means that the probability of missingness depends on the true disease status only through the test result and observed covariate information. However, the existing methods require parametric models for the (conditional) probability of disease and/or the (conditional) probability of verification, and hence are subject to model misspecification: a wrong specification of such parametric models can affect the behavior of the estimators, which can be inconsistent. To avoid misspecification problems, in this paper we propose a fully nonparametric method for the estimation of the ROC curve of a continuous test under verification bias. The method is based on nearest-neighbor imputation and adopts generic smooth regression models for both the probability that a subject is diseased and the probability that it is verified. Simulation experiments and an illustrative example show the usefulness of the new method. Variance estimation is also discussed.
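
    A hedged, much-simplified sketch of the idea: for unverified subjects the missing disease label is imputed from the nearest verified subject (here 1-NN on the test score alone), and the AUC is then computed on the completed data. The paper's estimator uses smoother nearest-neighbour regression over covariates as well; everything below is synthetic.

      # 1-nearest-neighbour imputation of missing disease status, then a simple AUC.
      import numpy as np

      rng = np.random.default_rng(4)
      n = 400
      disease = rng.random(n) < 0.3
      score = rng.normal(loc=np.where(disease, 1.0, 0.0), scale=1.0)       # diagnostic test
      verified = rng.random(n) < np.clip(0.2 + 0.6 * (score > 0.5), 0, 1)  # MAR-style verification

      labels = disease.astype(float)
      for i in np.where(~verified)[0]:
          j = np.argmin(np.abs(score[verified] - score[i]))                # nearest verified subject
          labels[i] = disease[np.where(verified)[0][j]]

      # AUC as the probability that a diseased subject outscores a non-diseased one
      pos, neg = score[labels == 1], score[labels == 0]
      auc = (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()
      print(round(float(auc), 3))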

  11. Bone Marrow Stromal Antigen 2 Is a Novel Plasma Biomarker and Prognosticator for Colorectal Carcinoma: A Secretome-Based Verification Study

    Directory of Open Access Journals (Sweden)

    Sum-Fu Chiang

    2015-01-01

    Full Text Available Background. The cancer cell secretome has been recognized as a valuable reservoir for identifying novel serum/plasma biomarkers for different cancers, including colorectal cancer (CRC). This study aimed to verify four CRC cell-secreted proteins (tumor-associated calcium signal transducer 2/trophoblast cell surface antigen 2 (TACSTD2/TROP2), tetraspanin-6 (TSPAN6), bone marrow stromal antigen 2 (BST2), and tumor necrosis factor receptor superfamily member 16 (NGFR)) as potential plasma CRC biomarkers. Methods. The study population comprises 152 CRC patients and 152 controls. Target protein levels in plasma and tissue samples were assessed by ELISA and immunohistochemistry, respectively. Results. Among the four candidate proteins examined by ELISA in a small sample set, only BST2 showed significantly elevated plasma levels in CRC patients versus controls. Immunohistochemical analysis revealed the overexpression of BST2 in CRC tissues, and higher BST2 expression levels correlated with poorer 5-year survival (46.47% versus 65.57%; p=0.044). Further verification confirmed the elevated plasma BST2 levels in CRC patients (2.35 ± 0.13 ng/mL) versus controls (1.04 ± 0.03 ng/mL) (p<0.01), with an area under the ROC curve (AUC) of 0.858, comparable to that of CEA (0.867). Conclusion. BST2, a membrane protein selectively detected in the CRC cell secretome, may be a novel plasma biomarker and prognosticator for CRC.

  12. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while reducing simulation and debugging overheads.

  13. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  14. Formal verification of mathematical software

    Science.gov (United States)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  15. Thoughts on Verification of Nuclear Disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Dunlop, W H

    2007-09-26

    It is my pleasure to be here today to participate in this Conference. My thanks to the organizers for preparing such an interesting agenda on a very difficult topic. My effort in preparing my presentation was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48. And as many of you know Lawrence Livermore National Laboratory is now, as of Oct 1st, under contract to the Lawrence Livermore National Security LLC. There has been a long history of how to view verification of arms control agreements. The basis for verification during the days of SALT was that verification would be based on each country's national technical means. For treaties dealing with strategic missiles this worked well as the individual items subject to verification were of such a size that they were visible to the National Technical Means available at the time. And it was felt that the counting of missiles and launchers could be verified by our National Technical Means. For nuclear testing treaties the use of seismic measurements developed into a capability that was reasonably robust for all but the smallest of nuclear tests. However, once we had the Threshold Test Ban Treaty, there was a significant problem in that the fidelity of the measurements was not sufficient to determine if a test was slightly above the 150 kt limit or slightly below the 150 kt limit. This led some in the US to believe that the Soviet Union was not living up to the TTBT agreement. An on-site verification protocol was negotiated in 1988 and 1989 that allowed the US to make hydrodynamic yield measurements on Soviet tests above 50 kt yield and regional seismic measurements on all tests above 35 kt of yield; and the Soviets to make the same type of measurements on US tests to ensure that they were not over 150 kt. These on-site measurements were considered reasonably intrusive. Again the measurement capability was

  16. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    Science.gov (United States)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT; ULTRASONIC AQUEOUS CLEANING SYSTEMS, SMART SONIC CORPORATION, SMART SONIC

    Science.gov (United States)

    This report is a product of the U.S. EPA's Environmental Technology Verification (ETV) Program and is focused on the Smart Sonics Ultrasonic Aqueous Cleaning Systems. The verification is based on three main objectives. (1) The Smart Sonic Aqueous Cleaning Systems, Model 2000 and...

  18. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    This paper presents work package WP4.1 of the RobustRails research project. The work package aims at suggesting a methodology for efficient development and verification of safe and robust railway control systems. Over the next 10 years all Danish railway signalling systems are going to be completely replaced with modern, computer based railway control systems based on the European standard ERTMS/ETCS [3, 4] by the Danish Signaling Programme [1]. The purpose of these systems is to control the railway traffic such that unsafe situations, like train collisions... can be combined and used for an efficient development and verification of new fail-safe systems. The expected result is a methodology for using domain-specific, formal languages, techniques and tools for more efficient development and verification of robust software for railway control systems...

  19. State of the Art: Signature Biometrics Verification

    Directory of Open Access Journals (Sweden)

    Nourddine Guersi

    2010-04-01

    Full Text Available This paper presents a comparative analysis of the performance of three estimation algorithms: Expectation Maximization (EM, Greedy EM Algorithm (GEM and Figueiredo-Jain Algorithm (FJ - based on the Gaussian mixture models (GMMs for signature biometrics verification. The simulation results have shown significant performance achievements. The test performance of EER=5.49 % for "EM", EER=5.04 % for "GEM" and EER=5.00 % for "FJ", shows that the behavioral information scheme of signature biometrics is robust and has a discriminating power, which can be explored for identity authentication.
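    As a rough illustration of GMM-based verification scoring (the record compares EM, GEM and FJ purely as fitting algorithms), the sketch below fits a Gaussian mixture to genuine-signature feature vectors and accepts or rejects a questioned signature on an average log-likelihood threshold. scikit-learn's GaussianMixture (plain EM) stands in for the estimators studied in the paper; the feature extraction step, the number of components and the threshold value are placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_claimant_model(genuine_features, n_components=4, seed=0):
    """Fit a GMM to feature vectors extracted from genuine signatures."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag",
                          random_state=seed)
    gmm.fit(genuine_features)      # plain EM; GEM or FJ would replace this step
    return gmm

def verify(gmm, questioned_features, threshold=-25.0):
    """Accept if the mean log-likelihood of the questioned signature's
    feature vectors under the claimant's GMM exceeds a tuned threshold."""
    score = gmm.score(questioned_features)   # mean per-sample log-likelihood
    return score, score >= threshold
```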

  20. Lucy Snowe : première réécriture de Jane Eyre Lucy Snowe: the First Rewriting of Jane Eyre

    Directory of Open Access Journals (Sweden)

    Elise Ouvrard

    2009-10-01

    Full Text Available While reading Villette, published by Charlotte Brontë in 1853, one cannot help thinking of Jane Eyre, the eponymous heroine of the novel published by the same author in 1847. The link means much more than the thematic and stylistic unity of a writer’s works. Lucy Snowe, the heroine of Villette, constitutes in fact the rewriting of Jane Eyre. Such rewriting is characterized by continuity as can be shown by the physical appearance, the strength of character and the progression of both heroines. However, there would be no interest in reproducing Jane Eyre identically and the outcome of the two novels presents a clear rupture, a rupture which has to be deciphered in order to understand the purpose of Charlotte Brontë in creating the filiation between Lucy Snowe and Jane Eyre.

  1. Vindicating Sycorax’s Anti-colonial Voice. An Overview of Some Postcolonial Re-Writings from The Tempest to Indigo

    Directory of Open Access Journals (Sweden)

    Xiana Vázquez Bouzó

    2016-10-01

    Full Text Available The aim of this essay is to place the Shakespearean character Sycorax as a symbol of anti-colonial and anti-patriarchal resistance. Throughout the analysis of this figure in The Tempest and its re-writings, I suggest a change from the theories that turned Caliban into an anti-imperial symbol towards a consideration of Sycorax for this role. I analyse the possibilities that this character opens in terms of re-writing, as well as the relation of the figure of the witch with her community. I also compare the ideas that Caliban personifies (including sexual violence) with those represented by Sycorax (the struggle against imperial and patriarchal forces). I ultimately defend that Sycorax better fits the position of resistance symbol, since the struggles against masculine dominance must be addressed at the same level as those against imperialist oppressions.

  2. The use of an aSi-based EPID for routine absolute dosimetric pre-treatment verification of dynamic IMRT fields.

    Science.gov (United States)

    Van Esch, Ann; Depuydt, Tom; Huyskens, Dominique Pierre

    2004-05-01

    In parallel with the increased use of intensity modulated radiation treatment (IMRT) fields in radiation therapy, flat panel amorphous silicon (aSi) detectors are becoming the standard for online portal imaging at the linear accelerator. In order to minimise the workload related to the quality assurance of the IMRT fields, we have explored the possibility of using a commercially available aSi portal imager for absolute dosimetric verification of the delivery of dynamic IMRT fields. We investigated the basic dosimetric characteristics of an aSi portal imager (aS500, Varian Medical Systems), using an acquisition mode especially developed for portal dose (PD) integration during delivery of a (static or dynamic) radiation field. Secondly, the dose calculation algorithm of a commercially available treatment planning system (Cadplan, Varian Medical Systems) was modified to allow prediction of the PD image, i.e. to compare the intended fluence distribution with the fluence distribution as actually delivered by the dynamic multileaf collimator. Absolute rather than relative dose prediction was applied. The PD image prediction was compared to the corresponding acquisition for several clinical IMRT fields by means of the gamma evaluation method. The acquisition mode is accurate in integrating all PD over a wide range of monitor units, provided detector saturation is avoided. Although the dose deposition behaviour in the portal image detector is not equivalent to the dose to water measurements, it is reproducible and self-consistent, lending itself to quality assurance measurements. Gamma evaluations of the predicted versus measured PD distribution were within the pre-defined acceptance criteria for all clinical IMRT fields, i.e. allowing a dose difference of 3% of the local field dose in combination with a distance to agreement of 3 mm.
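    The gamma evaluation referred to here is a standard dose-comparison metric; the record gives no code, so the following numpy sketch computes a 1-D gamma index and pass rate with a 3% local dose difference and 3 mm distance-to-agreement, mirroring the acceptance criteria quoted above. The array names, grid spacing and synthetic profiles are illustrative only.

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, spacing_mm=1.0, dd=0.03, dta_mm=3.0):
    """1-D gamma index (local dose-difference criterion), one value per
    reference point; the pass rate is the fraction with gamma <= 1."""
    n = len(ref_dose)
    pos = np.arange(n) * spacing_mm
    gamma = np.empty(n)
    for i in range(n):
        dose_term = (eval_dose - ref_dose[i]) / (dd * ref_dose[i])   # local 3%
        dist_term = (pos - pos[i]) / dta_mm                          # 3 mm DTA
        gamma[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gamma

# Example: pass rate for two synthetic predicted/measured profiles.
ref = 100.0 * np.exp(-np.linspace(-2, 2, 81) ** 2)
ev = ref * 1.02                       # 2% systematic difference
g = gamma_1d(ref, ev, spacing_mm=0.5)
print("gamma pass rate:", (g <= 1.0).mean())
```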

  3. Ultrasonic verification of composite structures

    NARCIS (Netherlands)

    Pelt, Maurice; de Boer, Robert Jan; Schoemaker, Christiaan; Sprik, Rudolf

    2014-01-01

    Ultrasonic Verification is a new method for monitoring large surface areas of CFRP by ultrasound with few sensors. The echo response of a transmitted pulse through the structure is compared with the response of an earlier obtained reference signal to calculate a fidelity parameter.
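    The record does not define the fidelity parameter; one plausible, commonly used choice is the normalized cross-correlation between the current echo response and the stored baseline, as in this hedged sketch (signal names are illustrative).

```python
import numpy as np

def fidelity(reference: np.ndarray, current: np.ndarray) -> float:
    """Normalized cross-correlation of two echo responses; 1.0 means the
    structure responds exactly as in the baseline measurement, lower
    values indicate a change (e.g. damage) along the propagation path."""
    r = reference - reference.mean()
    c = current - current.mean()
    return float(np.dot(r, c) / (np.linalg.norm(r) * np.linalg.norm(c)))
```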

  4. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples: a t...

  5. Private Verification for FPGA Bitstreams

    Science.gov (United States)

    2017-03-20

    security risks. Keywords: Trust, Privacy, Hardware Trojan, Hardware Security, ASIC, FPGA, Bitstream. Many effective verification... devices, but also to integrate PV-Bit, other Graf Research tools, and other commercial EDA tools into our overarching forward design trust flow philosophy

  6. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present...

  7. Guidance for the verification and validation of neural networks

    CERN Document Server

    Pullum, L; Darrah, M

    2007-01-01

    Guidance for the Verification and Validation of Neural Networks is a supplement to the IEEE Standard for Software Verification and Validation, IEEE Std 1012-1998. Born out of a need by the National Aeronautics and Space Administration's safety- and mission-critical research, this book compiles over five years of applied research and development efforts. It is intended to assist the performance of verification and validation (V&V) activities on adaptive software systems, with emphasis given to neural network systems. The book discusses some of the difficulties with trying to assure adaptive systems in general, presents techniques and advice for the V&V practitioner confronted with such a task, and based on a neural network case study, identifies specific tasking and recommendations for the V&V of neural network systems.

  8. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
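    Cone-of-influence reduction itself is a small graph computation: keep only the variables on which the checked property transitively depends. The sketch below, with illustrative names, shows the core idea on a variable dependency map; the customized COI and rule-based reductions developed in the paper are of course richer than this.

```python
from collections import deque

def cone_of_influence(deps, property_vars):
    """deps: dict mapping each state variable to the set of variables its
    next-state function reads. Returns the set of variables that can
    influence the property, i.e. those that must be kept in the model."""
    keep = set(property_vars)
    queue = deque(property_vars)
    while queue:
        v = queue.popleft()
        for u in deps.get(v, ()):          # variables v depends on
            if u not in keep:
                keep.add(u)
                queue.append(u)
    return keep

# Example: the property only mentions 'alarm', so 'motor' can be dropped.
deps = {"alarm": {"sensor"}, "sensor": {"input"}, "motor": {"cmd"}}
print(cone_of_influence(deps, {"alarm"}))   # {'alarm', 'sensor', 'input'}
```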

  9. Toward Automatic Verification of Goal-Oriented Flow Simulations

    Science.gov (United States)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
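    For readers unfamiliar with adjoint-weighted residuals, the standard error estimate behind this approach (written here in generic notation, which is not taken from the record) relates the error in an output functional J to the fine-space residual of the coarse solution weighted by the discrete adjoint:

```latex
% Output error estimated from the coarse solution u_H injected into the fine
% space (u_H^h), the fine-space residual R_h, and the adjoint solution \psi_h:
\delta J \;\equiv\; J(u_h) - J\!\left(u_H^{h}\right)
\;\approx\; -\,\psi_h^{\top} R_h\!\left(u_H^{h}\right),
\qquad
\eta_k \;=\; \bigl|\,\psi_{h,k}\, R_{h,k}\!\left(u_H^{h}\right)\bigr|
```

    The local indicators eta_k are what drive the mesh adaptation toward the cells that contribute most to the output error.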

  10. Inorganic scintillator detectors for real-time verification during brachytherapy

    Science.gov (United States)

    Kertzscher, G.; Beddar, S.

    2017-05-01

    Widespread use of real-time dose measurement technology to verify brachytherapy (BT) treatments is currently limited because only a few detectors exhibit the large dynamic range and signal intensities that are required to accurately report the data. Inorganic scintillator detectors (ISDs) are promising for real-time BT verification because they can exhibit large signal intensities. Luminescence properties of ISDs based on ruby, Y2O3:Eu and CsI:Tl were compared with BCF-60 plastic scintillators to determine their potential for BT verification. Measurements revealed that ISDs can exhibit signal intensities 1800 times larger than BCF-60 and that the Čerenkov and fluorescence light contamination is negligible. The favourable luminescence properties of ISDs open the possibility to manufacture simplified detector systems that can lead to more widespread real-time verification during BT treatment deliveries.

  11. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2000-01-01

    Full Text Available The methodology of system requirements verification presented in this paper is a proposition of a practical procedure for reducing some negatives of the specification of requirements. The main problem that is considered is to create a complete description of the system requirements without any negatives. Verification of the initially defined requirements is based on the coloured Petri nets. Those nets are useful for testing some properties of system requirements such as completeness, consistency and optimality. An example of the lift controller is presented.

  12. Formal verification of a set of memory management units

    Science.gov (United States)

    Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.

    1992-01-01

    This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.

  13. Automatic quality verification of the TV sets

    Science.gov (United States)

    Marijan, Dusica; Zlokolica, Vladimir; Teslic, Nikola; Pekovic, Vukota; Temerinac, Miodrag

    2010-01-01

    In this paper we propose a methodology for TV set verification, intended for detecting picture quality degradation and functional failures within a TV set. In the proposed approach we compare the TV picture captured from a TV set under investigation with the reference image for the corresponding TV set in order to assess the captured picture quality and therefore, assess the acceptability of TV set quality. The methodology framework comprises a logic block for designing the verification process flow, a block for TV set quality estimation (based on image quality assessment) and a block for generating the defect tracking database. The quality assessment algorithm is a full-reference intra-frame approach which aims at detecting various digital specific-TV-set picture degradations, coming from TV system hardware and software failures, and erroneous operational modes and settings in TV sets. The proposed algorithm is a block-based scheme which incorporates the mean square error and a local variance between the reference and the tested image. The artifact detection algorithm is shown to be highly robust against brightness and contrast changes in TV sets. The algorithm is evaluated by performance comparison with the other state-of-the-art image quality assessment metrics in terms of detecting TV picture degradations, such as illumination and contrast change, compression artifacts, picture misalignment, aliasing, blurring and other types of degradations that are due to defects within the TV set video chain.
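    As a hedged sketch of a block-based metric of this kind (the exact weighting and thresholds used in the paper are not reproduced here), the code below computes, per block, the mean-square error and the difference of local variances between the reference and captured frames, and flags blocks exceeding illustrative thresholds.

```python
import numpy as np

def block_artifacts(ref, test, block=16, mse_thr=25.0, var_thr=50.0):
    """Return a boolean map of blocks whose MSE or local-variance difference
    against the reference frame exceeds the given (illustrative) thresholds."""
    h, w = ref.shape
    bh, bw = h // block, w // block
    flags = np.zeros((bh, bw), dtype=bool)
    for i in range(bh):
        for j in range(bw):
            r = ref[i*block:(i+1)*block, j*block:(j+1)*block].astype(float)
            t = test[i*block:(i+1)*block, j*block:(j+1)*block].astype(float)
            mse = np.mean((r - t) ** 2)
            dvar = abs(r.var() - t.var())
            flags[i, j] = (mse > mse_thr) or (dvar > var_thr)
    return flags
```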

  14. Gender verification of female Olympic athletes.

    Science.gov (United States)

    Dickinson, Barry D; Genel, Myron; Robinowitz, Carolyn B; Turner, Patricia L; Woods, Gary L

    2002-10-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Problems include invalid screening tests, failure to understand the problems of intersex, the discriminatory singling out of women based only on laboratory results, and the stigmatization and emotional trauma experienced by individuals screened positive. Genuine sex-impostors have not been uncovered by laboratory-based genetic testing; however, gender verification procedures have resulted in substantial harm to a number of women athletes born with relatively rare genetic abnormalities. Individuals with sex-related genetic abnormalities raised as females have no unfair physical advantage and should not be excluded or stigmatized, including those with 5-alpha-steroid-reductase deficiency, partial or complete androgen insensitivity, and chromosomal mosaicism. In 1990, the International Amateur Athletics Federation (IAAF) called for ending genetic screening of female athletes and in 1992 adopted an approach designed to prevent only male impostors from competing. The IAAF recommended that the "medical delegate" have the ultimate authority in all medical matters, including the authority to arrange for the determination of the gender of the competitor if that approach is judged necessary. The new policy advocated by the IAAF, and conditionally adopted by the International Olympic Committee, protects the rights and privacy of athletes while safeguarding fairness of competition, and the American Medical Association recommends that it become the permanent approach.

  15. Clinical Implementation of a Model-Based In Vivo Dose Verification System for Stereotactic Body Radiation Therapy–Volumetric Modulated Arc Therapy Treatments Using the Electronic Portal Imaging Device

    Energy Technology Data Exchange (ETDEWEB)

    McCowan, Peter M., E-mail: pmccowan@cancercare.mb.ca [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Asuni, Ganiyu [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Van Uytven, Eric [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); VanBeek, Timothy [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); McCurdy, Boyd M.C. [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba (Canada); Loewen, Shaun K. [Department of Oncology, University of Calgary, Calgary, Alberta (Canada); Ahmed, Naseer; Bashir, Bashir; Butler, James B.; Chowdhury, Amitava; Dubey, Arbind; Leylek, Ahmet; Nashed, Maged [CancerCare Manitoba, Winnipeg, Manitoba (Canada)

    2017-04-01

    Purpose: To report findings from an in vivo dosimetry program implemented for all stereotactic body radiation therapy patients over a 31-month period and discuss the value and challenges of utilizing in vivo electronic portal imaging device (EPID) dosimetry clinically. Methods and Materials: From December 2013 to July 2016, 117 stereotactic body radiation therapy–volumetric modulated arc therapy patients (100 lung, 15 spine, and 2 liver) underwent 602 EPID-based in vivo dose verification events. A developed model-based dose reconstruction algorithm calculates the 3-dimensional dose distribution to the patient by back-projecting the primary fluence measured by the EPID during treatment. The EPID frame-averaging was optimized in June 2015. For each treatment, a 3%/3-mm γ comparison between our EPID-derived dose and the Eclipse AcurosXB–predicted dose to the planning target volume (PTV) and the ≥20% isodose volume were performed. Alert levels were defined as γ pass rates <85% (lung and liver) and <80% (spine). Investigations were carried out for all fractions exceeding the alert level and were classified as follows: EPID-related, algorithmic, patient setup, anatomic change, or unknown/unidentified errors. Results: The percentages of fractions exceeding the alert levels were 22.6% for lung before frame-average optimization and 8.0% for lung, 20.0% for spine, and 10.0% for liver after frame-average optimization. Overall, mean (± standard deviation) planning target volume γ pass rates were 90.7% ± 9.2%, 87.0% ± 9.3%, and 91.2% ± 3.4% for the lung, spine, and liver patients, respectively. Conclusions: Results from the clinical implementation of our model-based in vivo dose verification method using on-treatment EPID images is reported. The method is demonstrated to be valuable for routine clinical use for verifying delivered dose as well as for detecting errors.

  16. Towards formal verification of ToolBus scripts

    NARCIS (Netherlands)

    Fokkink, W.; Klint, P.; Lisser, B.; Usenko, Y.S.

    2008-01-01

    ToolBus allows one to connect tools via a software bus. Programming is done using the scripting language Tscript, which is based on the process algebra ACP. Tscript was originally designed to enable formal verification, but this option has so far not been explored in any detail. We present a method

  17. Optimal decision fusion for verification of face sequences

    NARCIS (Netherlands)

    Tao, Q.; Veldhuis, Raymond N.J.; Veldhuis, R.N.J.; Cronie, H.S.

    2007-01-01

    Face sequence contains more information of the user than a single face image. In this paper, optimal decision fusion is proposed to verify the face sequences, based on the original verification system for a single face image. We show by experiments that optimal decision fusion is a simple but

  18. Combinational Logic-Level Verification using Boolean Expression Diagrams

    DEFF Research Database (Denmark)

    Hulgaard, Henrik; Williams, Poul Frederick; Andersen, Henrik Reif

    1997-01-01

    of BDDs. This paper demonstrates that BEDs are well suited for solving the combinational logic-level verification problem which is, given two combinational circuits, to determine whether they implement the same Boolean functions. Based on all combinational circuits in the ISCAS 85 and LGSynth 91...

  19. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  20. Development of genomic tools for verification of hybrids and selfed ...

    African Journals Online (AJOL)

    It also proposes a rapid, high-throughput genomic DNA extraction protocol which is a modification of an existing extraction protocol adapted to the pace required for the DNA-based verification of often large population sizes. Three polymorphic simple sequence repeat (SSR) markers were selected from a total of 125 and ...

  1. Using Graph Transformations and Graph Abstractions for Software Verification

    NARCIS (Netherlands)

    Zambon, Eduardo; Ehrig, Hartmut; Rensink, Arend; Rozenberg, Grzegorz; Schurr, Andy

    In this abstract we present an overview of our intended approach for the verification of software written in imperative programming languages. This approach is based on model checking of graph transition systems (GTS), where each program state is modeled as a graph and the exploration engine is

  2. Using Graph Transformations and Graph Abstractions for Software Verification

    NARCIS (Netherlands)

    Corradini, Andrea; Zambon, Eduardo; Rensink, Arend

    In this paper we describe our intended approach for the verification of software written in imperative programming languages. We base our approach on model checking of graph transition systems, where each state is a graph and the transitions are specified by graph transformation rules. We believe

  3. On the validation of SPDM task verification facility

    NARCIS (Netherlands)

    Ma, Ou; Wang, Jiegao; Misra, Sarthak; Liu, Michael

    This paper describes a methodology for validating a ground-based, hardware-in-the-loop, space-robot simulation facility. This facility, called ‘‘SPDM task verification facility,’’ is being developed by the Canadian Space Agency for the purpose of verifying the contact dynamics performance of the

  4. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates...

  5. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates...

  6. BProVe: Tool support for business process verification

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

    This demo introduces BProVe, a tool supporting automated verification of Business Process models. BProVe analysis is based on a formal operational semantics defined for the BPMN 2.0 modelling language, and is provided as a freely accessible service that uses open standard formats as input data...

  7. The Power of Love: Rewriting the Romance in Isabel Allende's The House of the Spirits and Eva Luna

    Directory of Open Access Journals (Sweden)

    Frances Jane P. Abao

    2000-12-01

    Full Text Available Despite its ongoing popularity with women readers, romantic fiction has traditionally been regarded as an instrument of women's oppression, largely due to its reinforcement and even glorification of sexual stereotypes and bourgeois values. Latin American writer Isabel Allende's novels The House of the Spirits and Eva Luna both contain a number of the elements and conventions of romantic fiction, including distinct similarities to the two acknowledged foundations of this genre: Charlotte Brontë's Jane Eyre and Emily Brontë's Wuthering Heights.However, The House of the Spirits and Eva Luna can also be read as rewritings of the genre of romantic fiction. In these two texts, Isabel Allende appropriates and then reworks certain conventions of romantic fiction in order to portray her notion of "fulfilling egalitarian relationships" between men and women. Nevertheless, despite these feminist revisions, Allende's rewritten romances do retain the "wish-fulfillment" element-or ideal-of romantic fiction, its depiction of women's fantasy of feminine values being appreciated and validated within heterosexual romantic relationships in the real world.

  8. Japanese Exploration to Solar System Small Bodies: Rewriting a Planetary Formation Theory with Astromaterial Connection (Invited)

    Science.gov (United States)

    Yano, H.

    2013-12-01

    Three decades ago, Japan's deep space exploration started with Sakigake and Suisei, twin flyby probes to P/Halley. Since then, the Solar System small bodies have remained one of the focused destinations of Japanese solar system studies even today. Only one year after the Halley armada launch, the very first meeting was held for an asteroid sample return mission at ISAS, which, after 25 years, materialized as the successful Earth return of Hayabusa, an engineering verification mission for sample return from the surface of an NEO for the first time in history. Launched in 2003 and returned in 2010, Hayabusa became the first to visit a sub-km, rubble-pile potentially hazardous asteroid in near Earth space. Its returned samples solved the S-type asteroid - ordinary chondrite paradox by proving space weathering evidence at the sub-micron scale. Between the Halley missions and Hayabusa, the SOCCER concept using the M-V rocket was jointly studied by ISAS and NASA; yet it was not realized due to insufficient delta-V for intact capture by decelerating the flyby/encounter velocity at a cometary coma. SOCCER later became reality as Stardust, the NASA Discovery mission for cometary coma dust sample return in 1999-2006. Japan has collected the second largest collection of Antarctic meteorites and micrometeorites in the world, and astromaterial scientists are eager to collaborate with space missions. Japan also enjoys a long history of collaboration between professional astronomers and high-end amateur observers in the observational study of asteroids, comets and meteors. Building on these academic foundations, Japan places an emphasis on a programmatic approach to sample returns from Solar System small bodies in its future prospects. The immediate follow-on to Hayabusa is the Hayabusa-2 mission, a sample return with an artificial impactor from 1999 JU3, a C-type NEO, in 2014-2020. Following the successful demonstration of the deep space solar sail technique by IKAROS in 2010-2013, the solar power sail is a deep

  9. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  10. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  11. Verification and validation of control system software

    Energy Technology Data Exchange (ETDEWEB)

    Munro, J.K. Jr.; Kisner, R.A. (Oak Ridge National Lab., TN (USA)); Bhadtt, S.C. (Electric Power Research Inst., Palo Alto, CA (USA))

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  12. Monitoring/Verification using DMS: TATP Example

    Energy Technology Data Exchange (ETDEWEB)

    Stephan Weeks, Kevin Kyle, Manuel Manard

    2008-05-30

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  13. Monitoring/Verification Using DMS: TATP Example

    Energy Technology Data Exchange (ETDEWEB)

    Kevin Kyle; Stephan Weeks

    2008-03-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  14. Formal verification of AI software

    Science.gov (United States)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  15. A study of applications scribe frame data verifications using design rule check

    Science.gov (United States)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data generally is generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables of scanner alignment, wafer inspection and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer owned tooling) business or new technology development, there is no effective verification method for the scribe frame data, and we take a lot of time to work on verification. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We present the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data, and this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that by use of pattern matching and DRC verification our new method can yield speed improvements of more than 12 percent compared to the conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.

  16. Automated Verification of Quantum Protocols using MCMAS

    Directory of Open Access Journals (Sweden)

    F. Belardinelli

    2012-07-01

    Full Text Available We present a methodology for the automated verification of quantum protocols using MCMAS, a symbolic model checker for multi-agent systems. The method is based on the logical framework developed by D'Hondt and Panangaden for investigating epistemic and temporal properties, built on the model for Distributed Measurement-based Quantum Computation (DMC), an extension of the Measurement Calculus to distributed quantum systems. We describe the translation map from DMC to interpreted systems, the typical formalism for reasoning about time and knowledge in multi-agent systems. Then, we introduce dmc2ispl, a compiler into the input language of the MCMAS model checker. We demonstrate the technique by verifying the Quantum Teleportation Protocol, and discuss the performance of the tool.

  17. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs.
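    A minimal numerical reading of "comparing the nominal margin to its uncertainty" is sketched below, under an assumed normal distribution for the margin uncertainty (the paper does not state the distribution or these parameter values; both are illustrative).

```python
import math

def mov_reliability(nominal_margin: float, margin_sigma: float) -> float:
    """Probability that the true design margin is positive, assuming the
    margin uncertainty is normally distributed about the nominal value."""
    z = nominal_margin / margin_sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Example: a 20% nominal margin with a 10% (1-sigma) uncertainty.
print(f"estimated reliability: {mov_reliability(0.20, 0.10):.4f}")  # ~0.9772
```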

  18. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  19. PROCEED and Crowd-sourced Formal Verification

    Science.gov (United States)

    2011-11-07

    VA, November 7, 2011. PROCEED and Crowd-sourced Formal Verification (CSFV). Approved for Public Release, Distribution Unlimited. The Problem: application specific functions... Are there fundamental

  20. Formal Verification of Mathematical Software. Volume 2

    Science.gov (United States)

    1990-05-01

    RADC-TR-90-53, Vol I (of two), Final Technical Report, May 1990, AD-A223 633. FORMAL VERIFICATION OF MATHEMATICAL SOFTWARE. ...copies of this report unless contractual obligations or notices on a specific document require that it be returned. ...1 May 1986; Contract Expiration Date: 31 July 1989. Short Title of Work: Formal Verification of SDI Mathematical Software. Period of Work Covered: May 86

  1. Automatic Verification of Autonomous Robot Missions

    Science.gov (United States)

    2014-01-01

    for a mission related to the search for a biohazard. Keywords: mobile robots, formal verification, performance guarantees, automatic translation. ...Formal verification of systems is critical when failure creates a high cost, such as life or death scenarios. A variety of... PARS: Process algebras are specification languages that allow for formal verification of concurrent systems. Process Algebra for Robot

  2. Verification of the thermal insulation properties and determination the optimal position of the reflective thermal insulation layer in the wood based envelope

    National Research Council Canada - National Science Library

    Martin Labovský; Martin Lopušniak

    2016-01-01

    To achieve a thinner wood-based envelope, it is necessary to look for an alternative thermal insulation material which will have the best possible thermal insulation properties while maintaining affordability...

  3. Automatic Methods and Tools for the Verification of Real Time Systems

    National Research Council Canada - National Science Library

    Henzinger, T. A

    1997-01-01

    .... Symbolic verification methods are based either on deductive reasoning, using proof rules for symbolic logics, or on algorithmic analysis, using model checking procedures that operate on symbolic representations of state sets.

  4. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data......, and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  5. Verification of the thermal insulation properties and determination the optimal position of the reflective thermal insulation layer in the wood based envelope

    Directory of Open Access Journals (Sweden)

    Labovský Martin

    2016-06-01

    Full Text Available To achieve a thinner wood-based envelope, it is necessary to look for an alternative thermal insulation material which will have the best possible thermal insulation properties while maintaining affordability. One such material is a reflective thermal insulation layer, but it is necessary to verify its thermal insulation properties and determine its optimal position in the wood-based envelope.

  6. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

    Full Text Available Subject of Research. The paper presents a semi-automatic speaker verification system based on the comparison of formant values, statistics of phone lengths and melodic characteristics as well. Due to the development of speech technology, there is now an increased interest in expert speaker verification systems which have high reliability and low labour intensiveness thanks to the automation of data processing for the expert analysis. System Description. We present a description of a novel system analyzing similarity or distinction of speaker voices based on comparing statistics of phone lengths, formant features and melodic characteristics. The characteristic feature of the proposed system, based on fusion of methods, is a weak correlation between the analyzed features, which leads to a decrease in the error rate of speaker recognition. The system advantage is the possibility to carry out rapid analysis of recordings since the processes of data preprocessing and decision making are automated. We describe the functioning methods as well as the fusion of methods used to combine their decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including the Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of all used methods. Practical Significance. Experimental results have shown that the proposed system is applicable for the speaker recognition task in the course of phonoscopic examination.
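    The fusion step can be pictured as a weighted combination of normalized per-method scores; the sketch below is only illustrative (the weights, normalization and decision threshold of the actual system are not given in the record, and the weighting toward the formant method merely reflects the reliability ordering reported above).

```python
def fuse_scores(formant, phone_length, melodic,
                weights=(0.5, 0.25, 0.25), threshold=0.6):
    """Combine three per-method similarity scores (each already normalized
    to [0, 1]) into a single fused score and a same-speaker decision."""
    fused = (weights[0] * formant
             + weights[1] * phone_length
             + weights[2] * melodic)
    return fused, fused >= threshold

score, same_speaker = fuse_scores(0.82, 0.64, 0.71)
print(score, same_speaker)
```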

  7. Token-Aware Completion Functions for Elastic Processor Verification

    Directory of Open Access Journals (Sweden)

    Sudarshan K. Srinivasan

    2009-01-01

    Full Text Available We develop a formal verification procedure to check that elastic pipelined processor designs correctly implement their instruction set architecture (ISA specifications. The notion of correctness we use is based on refinement. Refinement proofs are based on refinement maps, which—in the context of this problem—are functions that map elastic processor states to states of the ISA specification model. Data flow in elastic architectures is complicated by the insertion of any number of buffers in any place in the design, making it hard to construct refinement maps for elastic systems in a systematic manner. We introduce token-aware completion functions, which incorporate a mechanism to track the flow of data in elastic pipelines, as a highly automated and systematic approach to construct refinement maps. We demonstrate the efficiency of the overall verification procedure based on token-aware completion functions using six elastic pipelined processor models based on the DLX architecture.

  8. Unified and Modular Modeling and Functional Verification Framework of Real-Time Image Signal Processors

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2016-01-01

    Full Text Available In the VLSI industry, image signal processing algorithms are developed and evaluated using software models before implementation of RTL and firmware. After the finalization of the algorithm, software models are used as a golden reference model for the image signal processor (ISP) RTL and firmware development. In this paper, we describe the unified and modular modeling framework of image signal processing algorithms used for different applications such as ISP algorithm development, reference for hardware (HW) implementation, reference for firmware (FW) implementation, and bit-true certification. The universal verification methodology (UVM) based functional verification framework of image signal processors using software reference models is described. Further, IP-XACT based tools for automatic generation of functional verification environment files and model map files are described. The proposed framework is developed both with a host interface and with a core using the virtual register interface (VRI) approach. This modeling and functional verification framework is used in real-time image signal processing applications including cellphones, smart cameras, and image compression. The main motivation behind this work is to propose the most efficient, reusable, and automated framework for modeling and verification of image signal processor (ISP) designs. The proposed framework shows better results, and significant improvement is observed in product verification time, verification cost, and quality of the designs.

  9. Turbulence Modeling Verification and Validation

    Science.gov (United States)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important
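
    For readers unfamiliar with the closure problem described above, the following equations (standard RANS notation, not taken from this paper) show the Reynolds stress term that a turbulence model must supply and the Boussinesq eddy-viscosity approximation underlying many linear RANS models:

        \[
        \tau_{ij}^{R} = -\rho\,\overline{u_i' u_j'},
        \qquad
        -\rho\,\overline{u_i' u_j'} \;\approx\;
        \mu_t\!\left(\frac{\partial \bar{u}_i}{\partial x_j}
                   + \frac{\partial \bar{u}_j}{\partial x_i}\right)
        - \frac{2}{3}\,\rho\,k\,\delta_{ij},
        \qquad
        k = \tfrac{1}{2}\,\overline{u_k' u_k'}.
        \]

    Here \(\mu_t\) is the modeled eddy viscosity and \(k\) the turbulent kinetic energy; linear RANS models differ mainly in how they compute \(\mu_t\).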

  10. Signature verification: A comprehensive study of the hidden signature method

    Directory of Open Access Journals (Sweden)

    Putz-Leszczyńska Joanna

    2015-09-01

    Full Text Available Many handwritten signature verification algorithms have been developed in order to distinguish between genuine signatures and forgeries. An important group of these methods is based on dynamic time warping (DTW). The traditional use of DTW for signature verification consists in forming a misalignment score between the verified signature and a set of template signatures. The right selection of template signatures has a big impact on that verification. In this article, we describe our proposition for replacing the template signatures with the hidden signature, an artificial signature created by minimizing the mean misalignment between itself and the signatures from the enrollment set. We present several hidden signature estimation methods together with a comprehensive comparison. The hidden signature opens a number of new possibilities for signature analysis. We apply statistical properties of the hidden signature to normalize the error signal of the verified signature and use the misalignment on the normalized errors as the verification basis. As a result, we achieve satisfactory error rates that allow creating an on-line system ready to operate in a real-world environment.
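
    Since the verification score in this family of methods is a DTW misalignment, the sketch below shows textbook dynamic time warping between two feature sequences (one-dimensional here for brevity). It is generic DTW, not the hidden-signature estimation procedure from the article.

        import math
        from typing import Sequence

        def dtw_cost(a: Sequence[float], b: Sequence[float]) -> float:
            """Cumulative cost of the best monotonic alignment between a and b."""
            n, m = len(a), len(b)
            d = [[math.inf] * (m + 1) for _ in range(n + 1)]
            d[0][0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    step = abs(a[i - 1] - b[j - 1])          # local distance
                    d[i][j] = step + min(d[i - 1][j],        # insertion
                                         d[i][j - 1],        # deletion
                                         d[i - 1][j - 1])    # match
            return d[n][m]

        # A verified signature is accepted when its (normalised) DTW cost against
        # the template or hidden signature falls below a decision threshold.
        score = dtw_cost([0.0, 1.0, 2.0], [0.0, 1.1, 1.9, 2.0])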

  11. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    Energy Technology Data Exchange (ETDEWEB)

    Folkerts, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); University of California, San Diego, La Jolla, CA (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States); Tian, Z; Gu, X; Jia, X; Jiang, S [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command line-based GPU applications to perform a MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resulting web app is powerful, easy to use, and able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a “delivered dose” calculation and the corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute the plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.
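
    The gamma index referred to above is a standard dose-comparison metric. Purely to illustrate the formula (the web application itself computes 3D gamma maps on the GPU), the sketch below evaluates a per-point gamma for 1D dose profiles under commonly used 3%/3 mm criteria; the profile values are hypothetical.

        import numpy as np

        def gamma_index_1d(dose_ref, dose_eval, positions_mm,
                           dose_crit=0.03, dist_crit_mm=3.0):
            """Per-point gamma for two 1D dose profiles sampled at positions_mm.
            dose_crit is a fraction of the reference maximum (global normalisation);
            a point passes the test when its gamma is <= 1."""
            dose_ref = np.asarray(dose_ref, dtype=float)
            dose_eval = np.asarray(dose_eval, dtype=float)
            positions_mm = np.asarray(positions_mm, dtype=float)
            dd = dose_crit * dose_ref.max()
            gammas = np.empty_like(dose_ref)
            for i, (x_r, d_r) in enumerate(zip(positions_mm, dose_ref)):
                dist2 = ((positions_mm - x_r) / dist_crit_mm) ** 2
                dose2 = ((dose_eval - d_r) / dd) ** 2
                gammas[i] = np.sqrt((dist2 + dose2).min())
            return gammas

        g = gamma_index_1d([1.0, 2.0, 3.0], [1.02, 1.95, 3.05],
                           positions_mm=[0.0, 1.0, 2.0])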

  12. The MODUS approach to formal verification

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Soler, José; Berger, Michael Stübert

    2014-01-01

    Background: Software reliability is of great importance for the development of embedded systems that are often used in applications that have safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely...... in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution...... Methods/Approach: This paper will describe the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice of the existing open-source model...

  13. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    Full Text Available The Multifunction Vehicle Bus (MVB) is a critical component of the Train Communication Network (TCN), which is widely used in modern train systems. Ensuring the security of the MVB has become an important issue, and traditional testing cannot guarantee system correctness. This paper addresses the modeling and verification of the MVB system. Petri net and model checking methods are used to verify the MVB system. A Hierarchical Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of the MVB. Synchronous and asynchronous methods are proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical logic errors in the system are found. Experimental results show the efficiency of our methods.
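
    Model checking of a protocol model ultimately reduces to exhaustive state-space exploration. The toy sketch below shows an explicit-state breadth-first reachability check for a "bad" state with counterexample extraction; it only illustrates the general idea and is unrelated to the M3C platform or the actual Master Transfer protocol model.

        from collections import deque

        def find_counterexample(initial, successors, is_bad):
            """Breadth-first search over an explicit transition relation.
            Returns a path from the initial state to a bad state, or None."""
            parent = {initial: None}
            queue = deque([initial])
            while queue:
                state = queue.popleft()
                if is_bad(state):
                    path = []
                    while state is not None:
                        path.append(state)
                        state = parent[state]
                    return list(reversed(path))
                for nxt in successors(state):
                    if nxt not in parent:
                        parent[nxt] = state
                        queue.append(nxt)
            return None

        # Hypothetical example: states count how many devices consider
        # themselves bus master; more than one is the error condition.
        path = find_counterexample(0,
                                   successors=lambda n: [n + 1] if n < 2 else [],
                                   is_bad=lambda n: n > 1)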

  14. Verification of L-band SAR calibration

    Science.gov (United States)

    Larson, R. W.; Jackson, P. L.; Kasischke, E.

    1985-01-01

    Absolute calibration of a digital L-band SAR system to an accuracy of better than 3 dB has been verified. This was accomplished with a calibration signal generator that produces the phase history of a point target. This signal relates calibration values to various SAR data sets. Values of radar cross-section (RCS) of reference reflectors were obtained using a derived calibration relationship for the L-band channel on the ERIM/CCRS X-C-L SAR system. Calibrated RCS values were compared to known RCS values of each reference reflector for verification and to obtain an error estimate. The calibration was based on the radar response to 21 calibrated reference reflectors.

  15. A (re)escrita de textos em livros didáticos de língua portuguesa = The (re)writing of texts in Portuguese didactic books

    Directory of Open Access Journals (Sweden)

    Adair Vieira Gonçalves

    2013-02-01

    Full Text Available Starting from the epistemological framework of Bronckart's (2003) sociodiscursive interactionism (ISD), from the theoretical-methodological contributions of Schneuwly and Dolz (2004) to mother-tongue teaching, and from conceptions of dialogical rewriting (GONÇALVES; BAZARIM, 2009), we investigate how two textbook collections recommended by the Ministry of Education (MEC) approach the rewriting of genres in Primary Education (Ensino Fundamental), cycle II. To do so, we rely on Bronckart's (2003) layered model of the text, which covers the textual infrastructure and the actional, discursive, and linguistic-discursive components. The collection Português – uma proposta de letramento focuses on discursive capacities to the detriment of actional and linguistic-discursive capacities, and its writing/rewriting tasks do not belong to the same genre within a thematic unit. The collection Linguagens no século XXI, by contrast, addresses all three language capacities analyzed and, above all, favors the writing/rewriting of the same genre within the textbook unit.

  16. Revisiting and Rewriting Early Career Encounters: Reconstructing One "Identity Defining" Moment

    Science.gov (United States)

    Yoo, Joanne

    2011-01-01

    There has been much research conducted into the effects of early career experiences on future practice. The research indicates that early career academics are particularly susceptible to burnout, as they are still developing their professional knowledge base, and are therefore more reliant on their theoretical knowledge or idealism to interpret…

  17. Verification of EPA's "Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    Energy Technology Data Exchange (ETDEWEB)

    Stagich, B. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-03-29

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems in obtaining solutions and to ensure that the equations are programmed correctly.
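
    The abstract does not reproduce the calculator's equations, and the sketch below is not them; it only shows, under simplifying assumptions, the generic algebra behind a risk-based PRG for a single exposure pathway: if risk scales linearly with concentration through an intake factor and a carcinogenic slope factor, the PRG is the concentration at which the target risk is reached. All parameter values are hypothetical.

        # Illustrative only: generic single-pathway, risk-based PRG rearrangement.
        # The actual EPA PRG calculator combines multiple exposure pathways,
        # radioactive decay, and site-specific parameters.

        def simple_prg(target_risk: float,
                       intake_factor: float,            # intake per unit soil concentration over the exposure period
                       slope_factor: float) -> float:   # lifetime cancer risk per unit intake
            """Concentration at which risk = target_risk, assuming
            risk = concentration * intake_factor * slope_factor."""
            return target_risk / (intake_factor * slope_factor)

        prg = simple_prg(target_risk=1e-6, intake_factor=2.0e2, slope_factor=5.0e-8)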

  18. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input data bases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
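
    For reference, and as standard solution-verification practice rather than code excerpted from the described tool, the sketch below shows how Richardson extrapolation yields an observed order of accuracy and a fine-grid error estimate from three systematically refined grid solutions with a constant refinement ratio.

        import math

        def observed_order(f_fine: float, f_medium: float, f_coarse: float,
                           r: float) -> float:
            """Observed order of accuracy p from three grid solutions
            (fine, medium, coarse) with constant refinement ratio r > 1."""
            return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

        def richardson_increment(f_fine: float, f_medium: float,
                                 r: float, p: float) -> float:
            """Richardson correction: the exact solution is estimated as
            f_fine + this value, so its magnitude estimates the fine-grid error."""
            return (f_fine - f_medium) / (r**p - 1.0)

        # Example: solutions on grids refined by a factor of 2.
        p = observed_order(0.9713, 0.9702, 0.9658, r=2.0)       # p == 2.0
        err = richardson_increment(0.9713, 0.9702, r=2.0, p=p)  # ~3.7e-4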

  19. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  20. Verification of electricity savings through energy-efficient train management - Energy data base for traction units - Annex 5; Verifizierung der Stromeinsparung durch energieeffizientes Zugsmanagement - Anhang 5: Energiedatenbank Traktion

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, M.; Lerjen, M.; Menth, S. [emkamatik GmbH, Wettingen (Switzerland); Luethi, M. [Swiss Federal Institute of Technology (ETHZ), Institute for Transport Planning and Systems (IVT), Zuerich (Switzerland); Tuchschmid, M. [SBB AG, BahnUmwelt-Center, 3000 Bern (Switzerland)

    2009-11-15

    This appendix to a final report for the Swiss Federal Office of Energy (SFOE) examines how various data sources on the energy consumption of the SBB's traction units can be combined into an energy data base. In this way, the considerable amount of work previously involved in combining and correlating data can be avoided. The aims pursued in the realisation of the traction data base are examined and discussed. The data base will provide the basis for detailed manual analysis of energy consumption within the framework of the overall efforts to save electricity through efficient train management.