WorldWideScience

Sample records for base verification based

  1. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach that has recently been studied for verifying knowledge bases is inadequate for large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  2. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach that has recently been studied for verifying knowledge bases is inadequate for large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  3. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.
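
    As a concrete illustration of the likelihood-ratio decision rule described above, here is a minimal sketch (not the authors' implementation) assuming fixed-length feature vectors and Gaussian user and background models; all means, covariances, and the acceptance threshold are hypothetical.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical background (world) model and user model over 4-D feature vectors.
bg_mean, bg_cov = np.zeros(4), np.eye(4)
user_mean, user_cov = np.array([0.8, -0.5, 0.3, 1.1]), 0.4 * np.eye(4)

def likelihood_ratio(x):
    """LR = p(x | claimed user) / p(x | background population)."""
    return multivariate_normal.pdf(x, user_mean, user_cov) / \
           multivariate_normal.pdf(x, bg_mean, bg_cov)

def verify(x, threshold=1.0):
    """Accept the identity claim when the likelihood ratio exceeds the threshold."""
    return likelihood_ratio(x) >= threshold

genuine = rng.multivariate_normal(user_mean, user_cov)
impostor = rng.multivariate_normal(bg_mean, bg_cov)
print("genuine accepted:", verify(genuine))
print("impostor accepted:", verify(impostor))
```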

  4. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  5. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is a necessary step in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach that has recently been studied for verifying knowledge bases is inadequate for large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. By applying this tool to the knowledge base of a nuclear power plant, it is shown that it can successfully check most of the anomalies that can occur in a knowledge base
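
    For readers unfamiliar with the kinds of anomalies such verification tools target, the following toy sketch (independent of ECPN and Design/CPN) checks a hypothetical rule base for redundant rules, conflicting conclusions, and rules that can never fire, using simple forward chaining; all rule names and facts are invented.

```python
# Toy rule base: each rule maps a frozenset of conditions to a conclusion.
RULES = {
    "R1": (frozenset({"high_pressure", "high_temp"}), "alarm_A"),
    "R2": (frozenset({"high_pressure", "high_temp"}), "alarm_A"),   # redundant with R1
    "R3": (frozenset({"high_pressure"}), "not_alarm_A"),            # conflicts with R1
    "R4": (frozenset({"never_observed_fact"}), "alarm_B"),          # possibly unreachable
}
KNOWN_FACTS = {"high_pressure", "high_temp"}

def check_redundancy(rules):
    seen, findings = {}, []
    for name, body in rules.items():
        if body in seen:
            findings.append(f"redundant: {name} duplicates {seen[body]}")
        seen[body] = name
    return findings

def forward_chain(rules, facts):
    """Fire rules until no new conclusions are derivable (reachability analysis)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conds, concl in rules.values():
            if conds <= derived and concl not in derived:
                derived.add(concl)
                changed = True
    return derived

def check_conflicts_and_reachability(rules, facts):
    derived = forward_chain(rules, facts)
    findings = [f"conflict: both {c} and not_{c} derivable"
                for c in derived if f"not_{c}" in derived]
    findings += [f"unreachable: {name} can never fire"
                 for name, (conds, _) in rules.items() if not conds <= derived]
    return findings

for issue in check_redundancy(RULES) + check_conflicts_and_reachability(RULES, KNOWN_FACTS):
    print(issue)
```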

  6. Simulation-based MDP verification for leading-edge masks

    Science.gov (United States)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based mask MDP verification
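
    A drastically simplified 1-D sketch of what a simulation-based mask check does (not the TrueModel/CDP software): the mask process is stood in for by a Gaussian blur, printed edges are found by thresholding, and the edge placement error plus a crude dose-margin proxy are compared against hypothetical tolerances.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Intended 1-D mask pattern: a 60 nm feature on a 1 nm grid (hypothetical numbers).
x = np.arange(400)                      # nm
target = ((x >= 170) & (x < 230)).astype(float)

# Crude stand-in for a mask process model: Gaussian blur of the written pattern.
printed = gaussian_filter1d(target, sigma=12.0)

def edge_positions(profile, threshold=0.5):
    """Linear-interpolated positions where the profile crosses the print threshold."""
    crossings = []
    for i in range(len(profile) - 1):
        a, b = profile[i], profile[i + 1]
        if (a - threshold) * (b - threshold) < 0:
            crossings.append(i + (threshold - a) / (b - a))
    return crossings

target_edges = edge_positions(target)
printed_edges = edge_positions(printed)
epe = [abs(p - t) for p, t in zip(printed_edges, target_edges)]

# Dose-margin proxy: CD change per 1% threshold (dose) change.
cd = lambda thr: np.subtract(*edge_positions(printed, thr)[::-1])
dose_margin = abs(cd(0.495) - cd(0.505)) / 0.01   # nm CD change per 1% dose

EPE_TOL_NM, MARGIN_TOL = 2.0, 4.0                 # hypothetical tolerances
print(f"EPE per edge (nm): {np.round(epe, 2)} -> {'HOTSPOT' if max(epe) > EPE_TOL_NM else 'ok'}")
print(f"CD sensitivity (nm per % dose): {dose_margin:.2f} -> "
      f"{'HOTSPOT' if dose_margin > MARGIN_TOL else 'ok'}")
```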

  7. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for the iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...
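
    To illustrate the verification mechanism that the algorithm incorporates (this is not the authors' full Interval-Passing implementation), the sketch below reconstructs a nonnegative sparse signal from a 0/1 measurement matrix by repeatedly applying two verification rules: a check whose residual is zero verifies all of its unresolved variables as zero, and a check with a single unresolved variable determines that variable directly. The matrix and signal are toy examples.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sparse 0/1 measurement matrix (LDPC-like) and a nonnegative sparse signal.
A = (rng.random((15, 30)) < 0.15).astype(float)
x_true = np.zeros(30)
x_true[rng.choice(30, size=4, replace=False)] = rng.uniform(1.0, 5.0, size=4)
y = A @ x_true

def verification_decode(A, y, max_iters=50):
    """Iteratively verify variables from zero-residual and degree-1 checks."""
    n = A.shape[1]
    x_hat = np.full(n, np.nan)                    # NaN = not yet verified
    for _ in range(max_iters):
        progress = False
        for i, yi in enumerate(y):
            unverified = [j for j in np.flatnonzero(A[i]) if np.isnan(x_hat[j])]
            residual = yi - np.sum(A[i] * np.nan_to_num(x_hat))
            if unverified and np.isclose(residual, 0.0):
                x_hat[unverified] = 0.0           # zero check: all neighbors are zero
                progress = True
            elif len(unverified) == 1:
                x_hat[unverified[0]] = residual   # degree-1 check: value is determined
                progress = True
        if not progress:
            break
    return np.nan_to_num(x_hat)

x_hat = verification_decode(A, y)
print("recovered correctly:", np.allclose(x_hat, x_true))
```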

  8. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun, Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of a product design against the regulations related to that product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of a product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed to ensure flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.
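
    A minimal sketch of the "ontology plus rules" idea (the actual system uses a formal ontology and Web services, which are not reproduced here): the product design is a dictionary standing in for ontology individuals, and regulation rules are plain predicates evaluated against it; all property names and limits are hypothetical.

```python
# Stand-in for ontology individuals: a product design described by typed properties.
design = {
    "class": "PressureVessel",
    "wall_thickness_mm": 11.0,
    "design_pressure_mpa": 1.6,
    "material": "SA-516-70",
}

# Regulation rules as (rule id, human-readable text, predicate over the design).
RULES = [
    ("REG-001", "Wall thickness must be at least 12 mm for design pressure above 1.5 MPa",
     lambda d: d["design_pressure_mpa"] <= 1.5 or d["wall_thickness_mm"] >= 12.0),
    ("REG-002", "Material must be an approved pressure-vessel steel",
     lambda d: d["material"] in {"SA-516-70", "SA-387-11"}),
]

def verify_design(design, rules):
    """Return the list of violated regulation rules for a given product design."""
    return [(rid, text) for rid, text, pred in rules if not pred(design)]

for rid, text in verify_design(design, RULES):
    print(f"violation {rid}: {text}")
```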

  9. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun, Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of a product design against the regulations related to that product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of a product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed to ensure flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  10. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces UVM briefly and presents a set of tips and advice applicable at different stages of the verification process-cycle.
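
    Although UVM itself is a SystemVerilog library, the coverage-driven flow described above can be sketched in a few lines of Python: constrained-random stimuli are generated, a scoreboard compares the DUT against a reference model, and functional coverage bins track which cases have been exercised until coverage closes. The toy DUT (a 4-bit saturating adder) and the coverage bins are invented for illustration.

```python
import random

def dut_saturating_add(a, b):
    """Toy device under test: 4-bit saturating adder."""
    return min(a + b, 15)

def reference_model(a, b):
    return min(a + b, 15)

coverage_bins = {"no_overflow": False, "exact_saturation": False, "overflow": False}

def sample_coverage(a, b):
    s = a + b
    coverage_bins["no_overflow"] |= s < 15
    coverage_bins["exact_saturation"] |= s == 15
    coverage_bins["overflow"] |= s > 15

random.seed(0)
errors, iterations = 0, 0
# Keep generating constrained-random stimuli until all coverage bins are hit.
while not all(coverage_bins.values()):
    a, b = random.randint(0, 15), random.randint(0, 15)
    if dut_saturating_add(a, b) != reference_model(a, b):   # scoreboard check
        errors += 1
    sample_coverage(a, b)
    iterations += 1

print(f"coverage closed after {iterations} stimuli, {errors} mismatches")
```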

  11. Groebner Bases Based Verification Solution for SystemVerilog Concurrent Assertions

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2014-01-01

    This work applies polynomial ring algebra to perform SystemVerilog assertion verification over digital circuit systems. The method is based on Groebner bases theory and sequential property checking. We define a constrained subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on the algebraic representations using Groebner bases for checking concurrent SVAs. Case studies show that computer algebra can provide canonical symbolic representations for both assertions and circuit designs and can act as a novel solver engine from the viewpoint of symbolic computation.

  12. Wavelet-based verification of the quantitative precipitation forecast

    Science.gov (United States)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using the two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localizations and associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by the two indices for the scale and the localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially-localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object" oriented verification methods, as the latter tend to represent strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further developments of the wavelet-based methods, especially towards a goal of identifying a weak physical process contributing to forecast error, are also pointed out.
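
    A minimal sketch of a wavelet-based comparison in the spirit of the method above, assuming the PyWavelets package: two synthetic precipitation fields are decomposed with a 2-D wavelet transform and their coefficients are compared per scale level; the fields and the per-scale RMSE score are illustrative choices, not the authors' exact diagnostics.

```python
import numpy as np
import pywt   # PyWavelets

rng = np.random.default_rng(2)

# Synthetic 64x64 "observed" and "forecast" precipitation fields (mm): one localized
# rain cell, with the forecast cell displaced by a few grid points.
def rain_cell(cx, cy, size=64, amp=20.0, width=4.0):
    y, x = np.mgrid[0:size, 0:size]
    return amp * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * width ** 2))

observed = rain_cell(30, 30) + rng.gamma(0.3, 0.5, (64, 64))
forecast = rain_cell(36, 33) + rng.gamma(0.3, 0.5, (64, 64))

# Decompose both fields; each level groups wavelet modes of one spatial scale,
# and the coefficient index carries the localization information.
LEVELS = 4
obs_coeffs = pywt.wavedec2(observed, "haar", level=LEVELS)
fcs_coeffs = pywt.wavedec2(forecast, "haar", level=LEVELS)

def per_scale_rmse(c_obs, c_fcs):
    """RMSE of wavelet coefficients per decomposition level (coarse to fine)."""
    scores = []
    for lvl_o, lvl_f in zip(c_obs[1:], c_fcs[1:]):          # skip approximation part
        diff = np.concatenate([(o - f).ravel() for o, f in zip(lvl_o, lvl_f)])
        scores.append(float(np.sqrt(np.mean(diff ** 2))))
    return scores

for level, rmse in enumerate(per_scale_rmse(obs_coeffs, fcs_coeffs), start=1):
    print(f"scale level {level} (coarse->fine): coefficient RMSE = {rmse:.2f}")
```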

  13. A knowledge-base verification of NPP expert systems using extended Petri nets

    International Nuclear Information System (INIS)

    Kwon, Il Won; Seong, Poong Hyun

    1995-01-01

    The verification phase of the knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assumed error incidence to be limited to rule pairs only. In addition, we consider certainty factors in checking, because most knowledge bases have certainty factors

  14. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    This work presents an efficient solution using a computer algebra system to perform linear temporal property verification for synchronous digital systems. The method is essentially based on both Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations by using Groebner bases. The computational results in this work show that the algebraic approach is a quite competitive checking method and will be a useful supplement to the existing verification methods based on simulation.
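
    The Groebner-basis reduction step can be tried directly with SymPy (an independent sketch, not the authors' framework): circuit gates and Boolean field constraints become polynomials, and an assertion holds exactly when its violation polynomial reduces to zero modulo a Groebner basis of that ideal.

```python
from sympy import symbols, groebner, reduced

a, b, c = symbols("a b c")

# Circuit: c = AND(a, b), encoded as a polynomial, plus Boolean constraints x**2 - x.
circuit = [c - a * b, a**2 - a, b**2 - b, c**2 - c]
G = groebner(circuit, c, a, b, order="lex")

def assertion_holds(violation_poly):
    """An assertion holds iff its violation polynomial lies in the circuit ideal,
    i.e., its remainder modulo the Groebner basis is zero."""
    _, remainder = reduced(violation_poly, list(G), c, a, b, order="lex")
    return remainder == 0

# Assertion 1: (a and b) -> c.  Violation polynomial: a*b*(1 - c).
print("a & b |-> c :", assertion_holds(a * b * (1 - c)))    # expected True

# Assertion 2: a -> c.  Violation polynomial: a*(1 - c).
print("a     |-> c :", assertion_holds(a * (1 - c)))        # expected False
```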

  15. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
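
    The recursive-split compensation scheme referred to above (the widely reported strategy of the winning MIT team, which offered the finder $2000 and halved the reward at each step up the referral chain) can be sketched as follows; the referral chain and dollar amounts are hypothetical.

```python
def referral_payments(chain, finder_reward=2000.0, split=0.5):
    """Recursive-split compensation: the finder receives the full reward and each
    ancestor up the referral chain receives `split` times the reward of the person
    they recruited (mirroring the widely reported winning Red Balloon strategy)."""
    payments, reward = {}, finder_reward
    for person in reversed(chain):          # finder first, root recruiter last
        payments[person] = reward
        reward *= split
    return payments

# Hypothetical referral chain: alice recruited bob, who recruited carol,
# who found the balloon (i.e., reported the sought information).
chain = ["alice", "bob", "carol"]
payments = referral_payments(chain)
for person, amount in payments.items():
    print(f"{person}: ${amount:.2f}")
print(f"total paid: ${sum(payments.values()):.2f} (bounded by twice the finder reward)")
```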

  16. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently

  17. A Scala DSL for RETE-Based Runtime Verification

    Science.gov (United States)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.

  18. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers how the finite countermodel method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of them.

  19. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for the design and verification of Field Programmable Gate Array (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C systems design, development, verification and validation (V and V) are discussed considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic designs for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms, used in FPGA-based reactor trip systems, is proposed. The results of applying test-based techniques to the assessment of FPGA electronic design algorithms for the reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  20. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.; Delp, Edward J.; Wong, Ping W.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 x 44 piezoresistive elements is used to measure the grip pattern. An interface has been

  1. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 × 44 piezoresistive elements is used to measure the grip pattern. An interface has been

  2. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  3. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  4. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Biometric-based identification/verification systems provide a solution to the security concerns of the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometrics. Fingerprint biometric systems are either minutiae-based or pattern learning (image) based. The minutiae-based algorithm depends upon the local discontinuities in the ridge flow pattern and is used when template size is important, while the image-based matching algorithm uses both the micro and macro features of a fingerprint and is used if a fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for usage as a database. Datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 × 100 and a threshold value of 700 (1000 being the perfect match).

  5. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Testing is a de-facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness, and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault-detection.

  6. Game-based verification and synthesis

    DEFF Research Database (Denmark)

    Vester, Steen

    ... corresponds directly to a program for the corresponding entity of the system. A strategy for a player which ensures that the player wins no matter how the other players behave then corresponds to a program ensuring that the specification of the entity is satisfied no matter how the other entities ... and the environment behaves. Synthesis of strategies in games can thus be used for automatic generation of correct-by-construction programs from specifications. We consider verification and synthesis problems for several well-known game-based models. This includes both model-checking problems and satisfiability ... can be extended to solve finitely-branching turn-based games more efficiently. Further, the novel concept of winning cores in parity games is introduced. We use this to develop a new polynomial-time under-approximation algorithm for solving parity games. Experimental results show that this algorithm ...

  7. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  8. Protocol-Based Verification of Message-Passing Parallel Programs

    DEFF Research Database (Denmark)

    López-Acosta, Hugo-Andrés; Eduardo R. B. Marques, Eduardo R. B.; Martins, Francisco

    2015-01-01

    We present ParTypes, a type-based methodology for the verification of Message Passing Interface (MPI) programs written in the C programming language. The aim is to statically verify programs against protocol specifications, enforcing properties such as fidelity and absence of deadlocks. We develo...

  9. An ontology based trust verification of software license agreement

    Science.gov (United States)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a large document appears stating the rights and obligations, which many users lack the patience to read or understand. That may make users distrust the software. In this paper, we propose an ontology based trust verification for Software License Agreements. First of all, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as a part of a generalized copyright law knowledge model, and can also work as a visualization of software licenses. Based on this proposed ontology, a software license oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  10. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA-modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one

  11. Design of Service Net based Correctness Verification Approach for Multimedia Conferencing Service Orchestration

    Directory of Open Access Journals (Sweden)

    Cheng Bo

    2012-02-01

    Multimedia conferencing is increasingly becoming a very important and popular application over the Internet. Due to the complexity of asynchronous communications and the need to handle large and dynamically concurrent processes in multimedia conferencing, achieving sufficient correctness guarantees and supporting effective verification methods for multimedia conferencing service orchestration is an extremely difficult and challenging problem. In this paper, we first present the Business Process Execution Language (BPEL) based conferencing service orchestration, and mainly focus on the service net based correctness verification approach for multimedia conferencing service orchestration, which can automatically translate the BPEL-based service orchestration into a corresponding Petri net model with the Petri Net Markup Language (PNML). We also present the BPEL service net reduction rules and the correctness verification algorithms for multimedia conferencing service orchestration. We perform the correctness analysis and verification using service net properties such as safeness, reachability and deadlocks, and also provide an automated support tool for the formal analysis and soundness verification of multimedia conferencing service orchestration scenarios. Finally, we give the comparison and evaluations.

  12. A Feature Subtraction Method for Image Based Kinship Verification under Uncontrolled Environments

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    The most fundamental problem of local feature based kinship verification methods is that a local feature can capture the variations of environmental conditions and the differences between two persons having a kin relation, which can significantly decrease the performance. To address this problem...... the feature distance between face image pairs with kinship and maximize the distance between non-kinship pairs. Based on the subtracted feature, the verification is realized through a simple Gaussian based distance comparison method. Experiments on two public databases show that the feature subtraction method...

  13. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    In this paper, we present a discriminative feature difference learning method for facial image based kinship verification. To transform the feature difference of an image pair to be discriminative for kinship verification, a linear transformation matrix for the feature difference between an image pair...... than the commonly used feature concatenation, leading to a low complexity. Furthermore, there is no positive semi-definite constraint on the transformation matrix, while there is in metric learning methods, leading to an easy solution for the transformation matrix. Experimental results on two public databases show that the proposed method combined with an SVM classification method outperforms or is comparable to state-of-the-art kinship verification methods. © Springer International Publishing AG, Part of Springer Science+Business Media...
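
    A reduced sketch of the feature-difference-plus-classifier pipeline described in the two kinship verification records above, built with scikit-learn on synthetic descriptors instead of real face features and omitting the learned transformation matrix: kin pairs are simulated with smaller, structured feature differences, and an SVM separates kin from non-kin difference vectors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
DIM, N_PAIRS = 32, 400

def make_pairs(kin):
    """Synthetic face-descriptor pairs; kin pairs differ by small structured noise."""
    base = rng.normal(size=(N_PAIRS, DIM))
    scale = 0.3 if kin else 1.0
    other = base * (0.8 if kin else 0.0) + rng.normal(scale=scale, size=(N_PAIRS, DIM))
    return base, other

def difference_features(a, b):
    return np.abs(a - b)          # element-wise feature difference

Xa_kin, Xb_kin = make_pairs(kin=True)
Xa_non, Xb_non = make_pairs(kin=False)
X = np.vstack([difference_features(Xa_kin, Xb_kin),
               difference_features(Xa_non, Xb_non)])
y = np.concatenate([np.ones(N_PAIRS), np.zeros(N_PAIRS)])   # 1 = kin, 0 = not kin

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="linear").fit(X_train, y_train)
print(f"verification accuracy on synthetic pairs: {clf.score(X_test, y_test):.2f}")
```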

  14. Feasibility of biochemical verification in a web-based smoking cessation study.

    Science.gov (United States)

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

    Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique for the verification of people trajectories

  16. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

    The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do not perform well in this setting. In this work we compare the performance of different noise reduction methods under different noise conditions in terms of speaker verification when the text is known and the system is trained on clean data (mis-matched conditions). We furthermore propose a new approach based on dictionary-based noise reduction and compare it to the baseline methods.
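
    As a point of reference for the classical baselines mentioned above, here is a minimal spectral-subtraction sketch (not the proposed dictionary-based method): the noise magnitude spectrum is estimated from an assumed noise-only segment and subtracted from the noisy STFT with half-wave rectification; the signal, over-subtraction factor, and segment boundaries are all synthetic assumptions.

```python
import numpy as np
from scipy.signal import stft, istft

rng = np.random.default_rng(4)
FS = 16000

# Synthetic "speech" (a chirp-like tone burst) embedded in white noise; the first
# 0.25 s is assumed to contain noise only, which is what the estimator relies on.
t = np.arange(0, 1.0, 1 / FS)
clean = 0.5 * np.sin(2 * np.pi * (200 + 300 * t) * t) * (t > 0.3)
noise = 0.1 * rng.standard_normal(len(t))
noisy = clean + noise

f, tt, Z = stft(noisy, fs=FS, nperseg=512)
noise_mag = np.mean(np.abs(Z[:, tt < 0.25]), axis=1, keepdims=True)

# Spectral subtraction with half-wave rectification (floor at zero), keeping the
# noisy phase. The over-subtraction factor is a hypothetical tuning value.
alpha = 1.5
mag = np.maximum(np.abs(Z) - alpha * noise_mag, 0.0)
Z_enhanced = mag * np.exp(1j * np.angle(Z))
_, enhanced = istft(Z_enhanced, fs=FS, nperseg=512)

def snr_db(reference, signal):
    n = min(len(reference), len(signal))
    err = reference[:n] - signal[:n]
    return 10 * np.log10(np.sum(reference[:n] ** 2) / np.sum(err ** 2))

print(f"SNR before: {snr_db(clean, noisy):.1f} dB, after: {snr_db(clean, enhanced):.1f} dB")
```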

  17. Comparison of megavoltage position verification for prostate irradiation based on bony anatomy and implanted fiducials

    International Nuclear Information System (INIS)

    Nederveen, Aart J.; Dehnad, Homan; Heide, Uulke A. van der; Moorselaar, R. Jeroen A. van; Hofman, Pieter; Lagendijk, Jan J.W.

    2003-01-01

    Purpose: The patient position during radiotherapy treatment of prostate cancer can be verified with the help of portal images acquired during treatment. In this study we quantify the clinical consequences of the use of image-based verification based on the bony anatomy and the prostate target itself. Patients and methods: We analysed 2025 portal images and 23 computed tomography (CT) scans from 23 patients with prostate cancer. In all patients gold markers were implanted prior to CT scanning. Statistical data for both random and systematic errors were calculated for displacements of bones and markers and we investigated the effectiveness of an off-line correction protocol. Results: Standard deviations for systematic marker displacement are 2.4 mm in the lateral (LR) direction, 4.4 mm in the anterior-posterior (AP) direction and 3.7 mm in the caudal-cranial direction (CC). Application of off-line position verification based on the marker positions results in a shrinkage of the systematic error to well below 1 mm. Position verification based on the bony anatomy reduces the systematic target uncertainty to 50% in the AP direction and in the LR direction. No reduction was observed in the CC direction. For six out of 23 patients we found an increase of the systematic error after application of bony anatomy-based position verification. Conclusions: We show that even if correction based on the bony anatomy is applied, considerable margins have to be set to account for organ motion. Our study highlights that for individual patients the systematic error can increase after application of bony anatomy-based position verification, whereas the population standard deviation will decrease. Off-line target-based position verification effectively reduces the systematic error to well below 1 mm, thus enabling significant margin reduction
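
    The systematic and random error statistics quoted above follow a standard decomposition that can be reproduced in a few lines on synthetic displacement data (not the study's measurements): the systematic component is the spread of per-patient mean displacements and the random component is the pooled spread of the residuals. The margin recipe at the end is the commonly cited van Herk formula, included for context rather than taken from this paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic marker displacements (mm) in one direction: 20 patients, 15 fractions each.
N_PATIENTS, N_FRACTIONS = 20, 15
true_systematic_sd, true_random_sd = 3.0, 2.0
patient_offsets = rng.normal(0.0, true_systematic_sd, N_PATIENTS)
displacements = patient_offsets[:, None] + rng.normal(0.0, true_random_sd,
                                                      (N_PATIENTS, N_FRACTIONS))

patient_means = displacements.mean(axis=1)
group_mean = patient_means.mean()                              # overall systematic shift M
Sigma = patient_means.std(ddof=1)                              # systematic error SD
sigma = np.sqrt(np.mean(displacements.var(axis=1, ddof=1)))    # random error SD (pooled RMS)

print(f"group mean M = {group_mean:+.1f} mm, systematic SD = {Sigma:.1f} mm, "
      f"random SD = {sigma:.1f} mm")

# A common population margin recipe (van Herk) for comparison.
print(f"CTV-to-PTV margin estimate: {2.5 * Sigma + 0.7 * sigma:.1f} mm")
```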

  18. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method of the steel rule, based on manual operation and reading, brings about low precision and low efficiency. A machine vision based verification system of the steel rule is designed referring to JJG1-1999 Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method of pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.
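
    The pixel-equivalent calibration mentioned here reduces to a simple ratio, sketched below with hypothetical numbers rather than the paper's calibration data: a reference length imaged by the camera yields millimetres per pixel, which converts measured graduation spacings into physical errors checked against an assumed tolerance.

```python
# Calibration: a certified gauge of known length spans a measured number of
# pixels in the image (hypothetical values).
GAUGE_LENGTH_MM = 10.000
GAUGE_LENGTH_PX = 2481.6
pixel_equivalent = GAUGE_LENGTH_MM / GAUGE_LENGTH_PX    # mm per pixel

def graduation_error_mm(measured_px, nominal_mm):
    """Deviation of a measured graduation interval from its nominal length."""
    return measured_px * pixel_equivalent - nominal_mm

# Hypothetical measured pixel distances for three 1 mm graduation intervals.
for nominal, measured_px in [(1.0, 249.1), (1.0, 247.6), (1.0, 248.4)]:
    err_um = graduation_error_mm(measured_px, nominal) * 1000
    verdict = "PASS" if abs(err_um) <= 50 else "FAIL"     # +/-0.05 mm tolerance (assumed)
    print(f"nominal {nominal:.0f} mm: error = {err_um:+.1f} um -> {verdict}")
```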

  19. A method of knowledge base verification for nuclear power plant expert systems using extended Petri Nets

    International Nuclear Information System (INIS)

    Kwon, I. W.; Seong, P. H.

    1996-01-01

    The adoption of expert systems mainly as operator supporting systems is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. The verification phase of the knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assume error incidence to be limited to rule pairs only. In addition, we consider certainty factors in checking, because most knowledge bases have certainty factors. 8 refs., 2 figs., 4 tabs. (author)

  20. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.

    Science.gov (United States)

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-10

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer's forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
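
    A simplified sketch of a threshold-adaptive template matching verifier of the kind described above: the weighting of the Euclidean distance and the threshold-adaptation rule below are assumptions for illustration, not the authors' exact algorithm, and the |S21| measurement groups are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)
N_SAMPLES = 21                       # sample data per measurement group, as in the paper

# Synthetic enrollment data: 30 |S21| measurement groups from the genuine user.
user_groups = 0.5 + 0.05 * rng.standard_normal((30, N_SAMPLES))
template = user_groups.mean(axis=0)
# Weights emphasize stable sample points (low enrollment variance) -- an assumption.
weights = 1.0 / (user_groups.var(axis=0) + 1e-9)
weights /= weights.sum()

def weighted_euclidean(x, t, w):
    return np.sqrt(np.sum(w * (x - t) ** 2))

# Threshold adapted from the enrollment data itself: mean intra-user distance
# plus a multiple of its spread (the adaptation rule here is an assumption).
enroll_dists = np.array([weighted_euclidean(g, template, weights) for g in user_groups])
threshold = enroll_dists.mean() + 3.0 * enroll_dists.std()

def verify(sample_group):
    return weighted_euclidean(sample_group, template, weights) <= threshold

genuine_probe = 0.5 + 0.05 * rng.standard_normal(N_SAMPLES)
impostor_probe = 0.62 + 0.05 * rng.standard_normal(N_SAMPLES)
print("genuine accepted :", verify(genuine_probe))
print("impostor accepted:", verify(impostor_probe))
```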

  1. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  2. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  3. Online 3D EPID-based dose verification: Proof of concept.

    Science.gov (United States)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took
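
    The dose-comparison step common to these three records can be sketched on synthetic dose grids (not the clinical system): the mean dose in the target and non-target volumes and the near-maximum dose D2 in the non-target volume are compared between planned and reconstructed distributions, and the beam would be halted if any deviation exceeded an assumed tolerance.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic planned and "reconstructed" 3D dose grids (Gy) plus a target mask.
shape = (40, 40, 40)
zz, yy, xx = np.mgrid[0:shape[0], 0:shape[1], 0:shape[2]]
target_mask = (xx - 20) ** 2 + (yy - 20) ** 2 + (zz - 20) ** 2 < 8 ** 2
planned = np.where(target_mask, 2.0, 0.4) * np.exp(-((xx - 20) ** 2) / 800.0)
reconstructed = planned * (1.0 + 0.02 * rng.standard_normal(shape))   # 2% noise

def d_near_max(dose, mask, percent=2.0):
    """Near-maximum dose Dx: dose received by the hottest x% of the volume."""
    return np.percentile(dose[mask], 100.0 - percent)

nontarget_mask = (~target_mask) & (planned >= 0.10)   # non-target volume >= 10 cGy

checks = {
    "target mean dose": (planned[target_mask].mean(), reconstructed[target_mask].mean()),
    "non-target mean dose": (planned[nontarget_mask].mean(), reconstructed[nontarget_mask].mean()),
    "non-target D2": (d_near_max(planned, nontarget_mask), d_near_max(reconstructed, nontarget_mask)),
}

TOLERANCE = 0.05   # 5% relative deviation before halting the beam (assumed value)
halt = False
for name, (plan_val, recon_val) in checks.items():
    dev = abs(recon_val - plan_val) / plan_val
    print(f"{name}: planned {plan_val:.2f} Gy, reconstructed {recon_val:.2f} Gy "
          f"({dev:.1%} deviation)")
    halt |= dev > TOLERANCE
print("HALT LINAC" if halt else "delivery within tolerance")
```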

  4. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  5. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects becomes a more and more important issue. In this paper, a public Hamming code based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the Hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming code based watermark can be verified by using Hamming code checking without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. It also shows that the proposed method can improve security and achieve low distortion of the stego object.
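
    A small end-to-end sketch of the Hamming-code-based fragile watermarking idea (simplified relative to the paper, with integer-quantized vertex coordinates and an invented watermark-generation rule): 4 watermark bits per group are derived from the coordinates' higher-order bits, encoded as Hamming(7,4) codewords, embedded by LSB substitution, and later re-derived to verify integrity.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hamming(7,4): data bits d1..d4, parity bits at positions 1, 2, 4.
def hamming74_encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

# Cover model: vertex coordinates quantized to integers (hypothetical 3D mesh).
vertices = rng.integers(0, 2**12, size=(70, 3))
coords = vertices.reshape(-1)                # 210 embeddable coordinates

# Adaptive watermark derived from the cover itself: 4 bits per group of 7 coords,
# taken from the coordinates' higher-order bits (a simplified stand-in).
def watermark_bits(group):
    return [int(b) for b in format(int(group.sum()) % 16, "04b")]

def embed(coords):
    wm = coords.copy()
    for i in range(0, len(coords) - 6, 7):
        code = hamming74_encode(watermark_bits(coords[i:i + 7] >> 1))
        wm[i:i + 7] = (coords[i:i + 7] & ~1) | code      # LSB substitution
    return wm

def verify(coords):
    for i in range(0, len(coords) - 6, 7):
        expected = hamming74_encode(watermark_bits(coords[i:i + 7] >> 1))
        if list(coords[i:i + 7] & 1) != expected:
            return False
    return True

stego = embed(coords)
print("untampered model verified:", verify(stego))        # expected True
tampered = stego.copy()
tampered[10] += 3                                          # modify one coordinate
print("tampered model verified:  ", verify(tampered))      # expected False
```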

  6. Evaluation of DVH-based treatment plan verification in addition to gamma passing rates for head and neck IMRT

    International Nuclear Information System (INIS)

    Visser, Ruurd; Wauben, David J.L.; Groot, Martijn de; Steenbakkers, Roel J.H.M.; Bijl, Henk P.; Godart, Jeremy; Veld, Aart A. van’t; Langendijk, Johannes A.; Korevaar, Erik W.

    2014-01-01

    Background and purpose: Treatment plan verification of intensity modulated radiotherapy (IMRT) is generally performed with the gamma index (GI) evaluation method, which is difficult to extrapolate to clinical implications. Incorporating Dose Volume Histogram (DVH) information can compensate for this. The aim of this study was to evaluate DVH-based treatment plan verification in addition to the GI evaluation method for head and neck IMRT. Materials and methods: Dose verifications of 700 subsequent head and neck cancer IMRT treatment plans were categorised according to gamma and DVH-based action levels. Fractionation dependent absolute dose limits were chosen. The results of the gamma- and DVH-based evaluations were compared to the decision of the medical physicist and/or radiation oncologist for plan acceptance. Results: Nearly all treatment plans (99.7%) were accepted for treatment according to the GI evaluation combined with DVH-based verification. Two treatment plans were re-planned according to DVH-based verification, which would have been accepted using the GI evaluation alone. DVH-based verification increased insight into dose delivery to patient-specific structures, increasing confidence that the treatment plans were clinically acceptable. Moreover, DVH-based action levels clearly distinguished the role of the medical physicist and radiation oncologist within the Quality Assurance (QA) procedure. Conclusions: DVH-based treatment plan verification complements the GI evaluation method, improving head and neck IMRT-QA
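
    To make the DVH-based action levels concrete, here is a small sketch on synthetic PTV dose samples (all values hypothetical): DVH metrics such as D95 and V(1.90 Gy) are computed for the planned and delivered dose, and the plan is flagged for review when the D95 deviation exceeds an assumed absolute dose limit.

```python
import numpy as np

rng = np.random.default_rng(9)

def dose_at_volume(dose_in_structure, volume_percent):
    """Dx: minimum dose received by the hottest x% of the structure (e.g., D95)."""
    return np.percentile(dose_in_structure, 100.0 - volume_percent)

def volume_at_dose(dose_in_structure, dose_gy):
    """Vx: fraction of the structure receiving at least dose_gy (a point on the DVH)."""
    return (dose_in_structure >= dose_gy).mean()

# Hypothetical planned and delivered PTV dose samples (Gy) for one fraction.
planned_ptv = rng.normal(2.00, 0.03, 5000)
delivered_ptv = planned_ptv * 0.975 + rng.normal(0.0, 0.02, 5000)   # ~2.5% underdose

d95_plan, d95_meas = dose_at_volume(planned_ptv, 95), dose_at_volume(delivered_ptv, 95)
ACTION_LEVEL_GY = 0.04          # assumed fraction-dependent absolute dose limit

print(f"PTV D95: planned {d95_plan:.3f} Gy, delivered {d95_meas:.3f} Gy")
print(f"PTV V(1.90 Gy): planned {volume_at_dose(planned_ptv, 1.90):.1%}, "
      f"delivered {volume_at_dose(delivered_ptv, 1.90):.1%}")
print("ACTION: physicist/oncologist review" if abs(d95_meas - d95_plan) > ACTION_LEVEL_GY
      else "within DVH-based action level")
```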

  7. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    Directory of Open Access Journals (Sweden)

    Jingzhen Li

    2017-01-01

    Full Text Available In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer’s forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.

  8. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    Science.gov (United States)

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer’s forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
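
    A minimal sketch of matching by weighted Euclidean distance with an adaptively chosen threshold, in Python; the weighting scheme and the rule for deriving the threshold from enrolment statistics are assumptions for illustration, not the published TATM definition.

        import numpy as np

        def weighted_euclidean(sample, template, weights):
            """Weighted Euclidean distance between a measured S21 vector and a stored template."""
            return float(np.sqrt(np.sum(weights * (sample - template) ** 2)))

        def tatm_verify(sample, template, weights, enrol_distances, k=1.5):
            """Accept if the distance is below a threshold adapted to the subject's
            enrolment (intra-subject) distance statistics; k is a hypothetical tuning factor."""
            threshold = np.mean(enrol_distances) + k * np.std(enrol_distances)
            return weighted_euclidean(sample, template, weights) <= threshold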

  9. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.

  10. Regional MLEM reconstruction strategy for PET-based treatment verification in ion beam radiotherapy

    International Nuclear Information System (INIS)

    Gianoli, Chiara; Riboldi, Marco; Fattori, Giovanni; Baselli, Giuseppe; Baroni, Guido; Bauer, Julia; Debus, Jürgen; Parodi, Katia; De Bernardi, Elisabetta

    2014-01-01

    In ion beam radiotherapy, PET-based treatment verification provides a consistency check of the delivered treatment with respect to a simulation based on the treatment plan. In this work the region-based MLEM reconstruction algorithm is proposed as a new evaluation strategy in PET-based treatment verification. The comparative evaluation is based on reconstructed PET images in selected regions, which are automatically identified on the expected PET images according to homogeneity in activity values. The strategy was tested on numerical and physical phantoms, simulating mismatches between the planned and measured β+ activity distributions. The region-based MLEM reconstruction was demonstrated to be robust against noise, and the sensitivity of the strategy was comparable to three voxel units, corresponding to 6 mm in the numerical phantoms. The robustness of the region-based MLEM evaluation outperformed the voxel-based strategies. The potential of the proposed strategy was also retrospectively assessed on patient data and further clinical validation is envisioned. (paper)
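
    For reference, the voxel-based MLEM update on which such a region-based variant builds can be written as follows (a sketch in generic notation; the exact regional formulation is the authors' and is not reproduced here):

        \lambda_j^{(k+1)} \;=\; \frac{\lambda_j^{(k)}}{\sum_i a_{ij}} \sum_i a_{ij}\, \frac{y_i}{\sum_{j'} a_{ij'}\, \lambda_{j'}^{(k)}}

    where y_i is the number of coincidences measured in line of response i, a_{ij} is the system matrix element linking voxel j to line of response i, and λ_j is the estimated activity in voxel j; in a region-based variant the index j is assumed to run over the automatically identified regions instead of individual voxels.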

  11. Content-based Image Hiding Method for Secure Network Biometric Verification

    Directory of Open Access Journals (Sweden)

    Xiangjiu Che

    2011-08-01

    Full Text Available For secure biometric verification, most existing methods embed biometric information directly into the cover image, but content correlation analysis between the biometric image and the cover image is often ignored. In this paper, we propose a novel biometric image hiding approach based on the content correlation analysis to protect the network-based transmitted image. By using principal component analysis (PCA), the content correlation between the biometric image and the cover image is firstly analyzed. Then, based on the particle swarm optimization (PSO) algorithm, some regions of the cover image are selected to represent the biometric image, in which the cover image can carry partial content of the biometric image. As a result of the correlation analysis, the unrepresented part of the biometric image is embedded into the cover image by using the discrete wavelet transform (DWT). Combined with the human visual system (HVS) model, this approach makes the hiding result perceptually invisible. The extensive experimental results demonstrate that the proposed hiding approach is robust against some common frequency and geometric attacks; it also provides an effective protection for the secure biometric verification.

  12. Generalization of information-based concepts in forecast verification

    Science.gov (United States)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, and the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
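
    For a binary event, the two scores mentioned above are commonly written as follows (a sketch of the standard definitions, not the paper's decomposition):

        \mathrm{BS} \;=\; \frac{1}{N}\sum_{n=1}^{N} (p_n - o_n)^2, \qquad
        \mathrm{IGN} \;=\; -\frac{1}{N}\sum_{n=1}^{N} \log_2 p_n(o_n)

    where p_n is the forecast probability of the event in case n, o_n ∈ {0, 1} is the corresponding observation, and p_n(o_n) is the probability the forecast assigned to the outcome that actually occurred; the second-order-approximation statement refers to a Taylor expansion of the logarithmic score.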

  13. Horn clause verification with convex polyhedral abstraction and tree automata-based refinement

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivations...... underlying the Horn clauses. Experiments using linear constraint problems and the abstract domain of convex polyhedra show that the refinement technique is practical and that iteration of abstract interpretation with tree automata-based refinement solves many challenging Horn clause verification problems. We...... compare the results with other state-of-the-art Horn clause verification tools....

  14. Knowledge-based verification of clinical guidelines by detection of anomalies.

    Science.gov (United States)

    Duftschmid, G; Miksch, S

    2001-04-01

    As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches for the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general and in the domain of knowledge-based systems (KBS) in particular, a common strategy to examine a system for potential defects consists in its verification. The focus of this work is to present an approach, which helps to ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language called Asbru to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general in order to allow its application to several other guideline representation formats.

  15. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    Science.gov (United States)

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  16. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model...... is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification...... how this model can be refined to target both verification and implementation....

  17. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  18. Novel Verification Method for Timing Optimization Based on DPSO

    Directory of Open Access Journals (Sweden)

    Chuandong Chen

    2018-01-01

    Full Text Available Timing optimization for logic circuits is one of the key steps in logic synthesis. Existing research results are mainly based on various intelligence algorithms. Hence, they are neither comparable with timing optimization data collected by the mainstream electronic design automation (EDA) tool nor able to verify the superiority of intelligence algorithms over the EDA tool in terms of optimization ability. To address these shortcomings, a novel verification method is proposed in this study. First, a discrete particle swarm optimization (DPSO) algorithm was applied to optimize the timing of the mixed polarity Reed-Muller (MPRM) logic circuit. Second, the Design Compiler (DC) algorithm was used to optimize the timing of the same MPRM logic circuit through special settings and constraints. Finally, the timing optimization results of the two algorithms were compared based on MCNC benchmark circuits. The timing optimization results obtained using DPSO are compared with those obtained from DC, and DPSO demonstrates an average reduction of 9.7% in the timing delays of critical paths for a number of MCNC benchmark circuits. The proposed verification method directly ascertains whether the intelligence algorithm has a better timing optimization ability than DC.
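
    A minimal binary (discrete) PSO sketch in Python, with a placeholder cost standing in for the critical-path delay a timing engine would report; the parameter values, bit encoding and cost function are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)

        def critical_path_delay(bits):
            """Placeholder cost: a weighted sum standing in for the delay reported by a
            timing engine for the circuit configuration encoded by `bits`."""
            weights = np.linspace(1.0, 2.0, bits.size)
            return float(weights @ bits)

        def dpso(n_bits=16, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
            """Binary PSO: real-valued velocities, bit positions re-sampled through a sigmoid."""
            x = rng.integers(0, 2, size=(n_particles, n_bits))
            v = rng.normal(0.0, 1.0, size=(n_particles, n_bits))
            pbest = x.copy()
            pbest_cost = np.array([critical_path_delay(p) for p in x])
            gbest = pbest[np.argmin(pbest_cost)].copy()
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = (rng.random(x.shape) < 1.0 / (1.0 + np.exp(-v))).astype(int)
                cost = np.array([critical_path_delay(p) for p in x])
                better = cost < pbest_cost
                pbest[better], pbest_cost[better] = x[better], cost[better]
                gbest = pbest[np.argmin(pbest_cost)].copy()
            return gbest, float(pbest_cost.min())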

  19. E-Visas Verification Schemes Based on Public-Key Infrastructure and Identity Based Encryption

    OpenAIRE

    Najlaa A. Abuadhmah; Muawya Naser; Azman Samsudin

    2010-01-01

    Problem statement: A visa is a very important travel document and an essential requirement at the point of entry of any country being visited. However, such an important document is still handled manually, which affects the accuracy and efficiency of visa processing. Work on e-visas is almost unexplored. Approach: This study provided a detailed description of a newly proposed e-visa verification system prototyped based on RFID technology. The core technology of the proposed e-visa...

  20. An Improved Constraint-Based System for the Verification of Security Protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov [30]. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also allows the detection of flaws associated with partial

  1. An Improved Constraint-based system for the verification of security protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs

  2. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    AFRL-RV-PS-TR-2018-0008: Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Author: Norman Fitz-Coy. Contract FA9453-15-1-0315.

  3. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2013-09-12

    ... developed CBSV as a user-friendly, internet-based application with safeguards that protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to users in a secure manner, CBSV provides us with cost and workload management benefits. New Information...

  4. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H [University Medical Center Mannheim, University of Heidelberg, Mannheim, Baden-Wuerttemberg (Germany)

    2015-06-15

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed an excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan

  5. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    International Nuclear Information System (INIS)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H

    2015-01-01

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed an excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan
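
    The gamma criteria quoted in these records combine a dose-difference and a distance-to-agreement tolerance; for reference, the standard gamma index is (a sketch of the usual definition, not specific to this detector):

        \Gamma(\mathbf{r}_m, \mathbf{r}_c) \;=\; \sqrt{\frac{\lVert \mathbf{r}_c - \mathbf{r}_m \rVert^2}{\Delta d^2} \;+\; \frac{\bigl(D_c(\mathbf{r}_c) - D_m(\mathbf{r}_m)\bigr)^2}{\Delta D^2}}, \qquad
        \gamma(\mathbf{r}_m) \;=\; \min_{\mathbf{r}_c} \Gamma(\mathbf{r}_m, \mathbf{r}_c)

    where D_m and D_c are the measured and computed dose distributions, Δd is the distance tolerance (e.g. 3 mm) and ΔD the dose tolerance (e.g. 3% of a reference dose); a point passes when γ ≤ 1, and a quoted passing rate such as (99.3±1.2)% is the fraction of evaluated points with γ ≤ 1.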

  6. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for run-time verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to a large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
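
    A minimal sketch, in Python, of the kind of trace properties the abstract mentions (duplicate and out-of-order messages); the message fields and the checking logic are illustrative assumptions, since MESA itself expresses such properties in TraceContract's DSL.

        from dataclasses import dataclass

        @dataclass
        class Message:
            msg_id: str   # hypothetical unique message identifier
            seq: int      # hypothetical per-stream sequence number

        class TraceMonitor:
            """Checks a live stream of messages for duplicates and sequence regressions."""
            def __init__(self):
                self.seen_ids = set()
                self.max_seq = None

            def check(self, msg):
                errors = []
                if msg.msg_id in self.seen_ids:
                    errors.append(f"duplicate message {msg.msg_id}")
                if self.max_seq is not None and msg.seq < self.max_seq:
                    errors.append(f"out-of-order message {msg.msg_id} (seq {msg.seq} < {self.max_seq})")
                self.seen_ids.add(msg.msg_id)
                self.max_seq = msg.seq if self.max_seq is None else max(self.max_seq, msg.seq)
                return errors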

  7. Research on Linux Trusted Boot Method Based on Reverse Integrity Verification

    Directory of Open Access Journals (Sweden)

    Chenlin Huang

    2016-01-01

    Full Text Available Trusted computing aims to build a trusted computing environment for information systems with the help of the secure hardware TPM, which has been proved to be an effective way to counter network security threats. However, TPM chips are not yet widely deployed in most computing devices, thus limiting the applied scope of trusted computing technology. To solve the problem of lacking trusted hardware in existing computing platforms, an alternative security hardware, USBKey, is introduced in this paper to simulate the basic functions of TPM, and a new reverse USBKey-based integrity verification model is proposed to implement reverse integrity verification of the operating system boot process, which can achieve the effect of a trusted boot of the operating system in end systems without TPMs. A Linux operating system booting method based on reverse integrity verification is designed and implemented in this paper, with which the integrity of data and executable files in the operating system is verified and protected during the trusted boot process phase by phase. It implements the trusted boot of the operating system without a TPM and supports remote attestation of the platform. Enhanced by our method, the flexibility of trusted computing technology is greatly improved and it is possible for trusted computing to be applied in large-scale computing environments.
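
    A minimal sketch of the measurement step underlying such phase-by-phase integrity verification, in Python; the manifest format and the use of SHA-256 are assumptions, and a real implementation would keep the reference digests and the comparison on the security hardware rather than in the operating system being verified.

        import hashlib
        from pathlib import Path

        def file_digest(path):
            """SHA-256 digest of a boot component (kernel image, initrd, executable, ...)."""
            return hashlib.sha256(Path(path).read_bytes()).hexdigest()

        def verify_stage(manifest):
            """Compare measured digests with reference values for one boot phase.

            manifest: dict mapping file path -> expected hex digest.
            Returns per-file pass/fail results; any failure should stop the boot."""
            return {path: file_digest(path) == expected for path, expected in manifest.items()}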

  8. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or integrate the models in a test-bench environment because it requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated into an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time to verify the functionality of registers.
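
    A minimal sketch, in Python, of the spreadsheet-to-register-description idea: a simple CSV register list is turned into a small XML file. The column names and XML tags here are simplifications for illustration; a real flow would emit the full IP-XACT schema and feed it to a commercial register-model generator.

        import csv
        import xml.etree.ElementTree as ET

        def csv_to_register_xml(csv_path, xml_path):
            """Translate rows of (name, offset, width, access) into a minimal register map XML."""
            root = ET.Element("registerMap")
            with open(csv_path, newline="") as f:
                for row in csv.DictReader(f):
                    reg = ET.SubElement(root, "register")
                    for field in ("name", "offset", "width", "access"):
                        ET.SubElement(reg, field).text = row[field]
            ET.ElementTree(root).write(xml_path)

        # Example spreadsheet row (CSV header: name,offset,width,access):
        #   CTRL,0x00,32,RW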

  9. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying the accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies.

  10. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying the accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies.

  11. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C system of NPP. • RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) system in nuclear power plant (NPP) implemented on programmable logic controllers (PLCs) plays a vital role in safe operation of the plant. The challenges such as fast obsolescence, the vulnerability to cyber-attack, and other related issues of software systems have currently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments still remain important issues to be resolved, and have now become a global point of research interest. In this work, we proposed systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition decision coverage (MC/DC) test, module test – MATLAB/Simulink Co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed

  12. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jaecheon; Heo, Gyunyoung

    2017-01-01

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C system of NPP. • RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) system in nuclear power plant (NPP) implemented on programmable logic controllers (PLCs) plays a vital role in safe operation of the plant. The challenges such as fast obsolescence, the vulnerability to cyber-attack, and other related issues of software systems have currently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments still remain important issues to be resolved, and have now become a global point of research interest. In this work, we proposed systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition decision coverage (MC/DC) test, module test – MATLAB/Simulink Co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed

  13. Time-Contrastive Learning Based DNN Bottleneck Features for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2017-01-01

    In this paper, we present a time-contrastive learning (TCL) based bottleneck (BN) feature extraction method for speech signals with an application to text-dependent (TD) speaker verification (SV). It is well-known that speech signals exhibit quasi-stationary behavior in and only in a short interval......, and the TCL method aims to exploit this temporal structure. More specifically, it trains deep neural networks (DNNs) to discriminate temporal events obtained by uniformly segmenting speech signals, in contrast to existing DNN based BN feature extraction methods that train DNNs using labeled data...... to discriminate speakers or pass-phrases or phones or a combination of them. In the context of speaker verification, speech data of fixed pass-phrases are used for TCL-BN training, while the pass-phrases used for TCL-BN training are excluded from being used for SV, so that the learned features can be considered...
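
    A minimal sketch of the time-contrastive labelling idea, in Python: frames are grouped into uniform segments and the segment index (modulo a class count) becomes the training label, with no human annotation involved. The segment length and class count here are arbitrary choices for illustration.

        import numpy as np

        def tcl_labels(n_frames, segment_len=10, n_classes=6):
            """Assign each frame the index of its uniform segment, modulo n_classes.

            A DNN trained to predict these labels from the frame features learns the
            short-term temporal structure of speech; its bottleneck layer then serves
            as the TCL-BN feature extractor."""
            return (np.arange(n_frames) // segment_len) % n_classes

        # Example: labels for a 35-frame utterance
        # tcl_labels(35) -> [0]*10 + [1]*10 + [2]*10 + [3]*5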

  14. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Biometric based authentication systems provide solutions to the problems in high security which remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good biometric parameters but do not guarantee that the person is present and alive. A voice can be copied, a fingerprint can be picked from a glass onto synthetic skin, and in face recognition systems, due to genetic factors, identical twins or father and son may have the same facial appearance. ECG does not have these problems. It cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometric verification system, developed using Laboratory Virtual Instrument Engineering Workbench (LabVIEW) version 7.1, is discussed. Experiments were conducted on a database stored in the laboratory of 20 individuals having 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
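
    For reference, a small Python sketch of how FRR and FAR are computed from genuine and impostor matching scores; the accept-if-distance-below-threshold rule is an assumption about the matcher.

        import numpy as np

        def far_frr(genuine_scores, impostor_scores, threshold):
            """FAR: fraction of impostor attempts accepted; FRR: fraction of genuine attempts rejected.

            Scores are treated as distances (smaller = more similar), so an attempt is
            accepted when its score is at or below the threshold."""
            genuine = np.asarray(genuine_scores, dtype=float)
            impostor = np.asarray(impostor_scores, dtype=float)
            far = float(np.mean(impostor <= threshold))
            frr = float(np.mean(genuine > threshold))
            return far, frr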

  15. Convex polyhedral abstractions, specialisation and property-based predicate splitting in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    We present an approach to constrained Horn clause (CHC) verification combining three techniques: abstract interpretation over a domain of convex polyhedra, specialisation of the constraints in CHCs using abstract interpretation of query-answer transformed clauses, and refinement by splitting...... in conjunction with specialisation for propagating constraints it can frequently solve challenging verification problems. This is a contribution in itself, but refinement is needed when it fails, and the question of how to refine convex polyhedral analyses has not been studied much. We present a refinement...... technique based on interpolants derived from a counterexample trace; these are used to drive a property-based specialisation that splits predicates, leading in turn to more precise convex polyhedral analyses. The process of specialisation, analysis and splitting can be repeated, in a manner similar...

  16. Type-Based Automated Verification of Authenticity in Asymmetric Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten; Kobayashi, Naoki; Sun, Yunde

    2011-01-01

    Gordon and Jeffrey developed a type system for verification of asymmetric and symmetric cryptographic protocols. We propose a modified version of Gordon and Jeffrey's type system and develop a type inference algorithm for it, so that protocols can be verified automatically as they are, without any...... type annotations or explicit type casts. We have implemented a protocol verifier SpiCa based on the algorithm, and confirmed its effectiveness....

  17. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communications in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.

  18. Simulation-based design process for the verification of ITER remote handling systems

    International Nuclear Information System (INIS)

    Sibois, Romain; Määttä, Timo; Siuko, Mikko; Mattila, Jouni

    2014-01-01

    Highlights: •Verification and validation process for ITER remote handling system. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around simulation lifecycle management system. •Verification and validation roadmap for digital modelling phase. -- Abstract: The work behind this paper takes place in the EFDA's European Goal Oriented Training programme on Remote Handling (RH) “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V and V) of ITER RH system requirements using digital mock-ups (DMU). The purpose of this project is to study and develop an efficient approach to using DMUs in the V and V process of ITER RH system design utilizing a System Engineering (SE) framework. Complex engineering systems such as ITER facilities lead to a substantial rise in cost when manufacturing a full-scale prototype. In the V and V process for ITER RH equipment, physical tests are a requirement to ensure the compliance of the system with the required operation. Therefore it is essential to virtually verify the developed system before starting the prototype manufacturing phase. This paper gives an overview of the current trends in using digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and V and V processes, in order to increase their cost efficiency and reliability.

  19. Model-based verification and validation of the SMAP uplink processes

    Science.gov (United States)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  20. Simulation based mask defect repair verification and disposition

    Science.gov (United States)

    Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo

    2009-10-01

    As the industry moves towards sub-65nm technology nodes, the mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real, and among the real defects, which defects should be repaired and how to verify the post-repair defects. In this paper, we address the challenges in mask defect verification and disposition, in particular in post-repair defect verification, with an efficient methodology using SEM mask defect images and optical inspection mask defect images (only for verification of phase and transmission related defects). We will demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. The SEM image was taken for each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation-based Mask Defect Disposition), has been used in this study. The software is used to extract edges from the mask SEM images and convert them into polygons to save in GDSII format. Then, the converted polygons from the SEM images were filled with the correct tone to form mask patterns and were merged back into the original GDSII design file. This merge is for the purpose of contour simulation, since normally the SEM images cover only a small area (~1 μm) and accurate simulation requires including a larger area for the optical proximity effect. With a lithography process model, the resist contour of the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such a complicated model is not available, a simple

  1. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, the vulnerability to cyber-attack, and other related issues of software systems have currently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware related benefits. Generally, in FPGA design verification, designers make use of verification techniques by writing test benches, which involves various stages of verification activities at register-transfer level (RTL), gate level, and place and route. Writing the test benches is considerably time-consuming and requires a lot of effort to achieve the desired results. Furthermore, performing the verification at each stage is a major bottleneck and demands many activities and much time. In addition, verification is conceivably the most difficult and complicated aspect of any design. Therefore, in view of this, this work applied an integrated verification approach to the verification of an FPGA-based I and C system in an NPP that simultaneously verified the whole set of design modules using MATLAB/Simulink HDL co-simulation models. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is not an exception. Therefore, in this work, we introduced and discussed how the application of an integrated verification technique to the verification and testing of an FPGA-based I and C system design in an NPP can facilitate the verification processes and verify the entire set of design modules of the system simultaneously using MATLAB/Simulink HDL co-simulation models. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks.

  2. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, the vulnerability to cyber-attack, and other related issues of software systems have currently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware related benefits. Generally, in FPGA design verification, designers make use of verification techniques by writing test benches, which involves various stages of verification activities at register-transfer level (RTL), gate level, and place and route. Writing the test benches is considerably time-consuming and requires a lot of effort to achieve the desired results. Furthermore, performing the verification at each stage is a major bottleneck and demands many activities and much time. In addition, verification is conceivably the most difficult and complicated aspect of any design. Therefore, in view of this, this work applied an integrated verification approach to the verification of an FPGA-based I and C system in an NPP that simultaneously verified the whole set of design modules using MATLAB/Simulink HDL co-simulation models. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is not an exception. Therefore, in this work, we introduced and discussed how the application of an integrated verification technique to the verification and testing of an FPGA-based I and C system design in an NPP can facilitate the verification processes and verify the entire set of design modules of the system simultaneously using MATLAB/Simulink HDL co-simulation models. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks.

  3. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example on life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence and ensure the reliability of a nuclear FPGA-based safety system, life cycle processes of disciplined specification and implementation of the design, as well as verification and validation (V&V) according to regulations, are needed. A specific example of how to conduct the life cycle development process and V&V of an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for the PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and provide input data to the under-test FPGA-based CHR protection system and a verified C code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  4. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan; Ford, Eric C., E-mail: eford@uw.edu [Department of Radiation Oncology, University of Washington, 1959 N. E. Pacific Street, Seattle, Washington 98195 (United States)

    2015-09-15

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4 point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction and is complemented by rules-based and Bayesian network plan checking.

  5. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    Science.gov (United States)

    2012-04-01

    MSG-054: Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondée sur le risque des processus de vérification, de validation, et d’accréditation/d’acceptation). Only report front-matter is available in this record.

  6. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. These goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...
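
    UVM itself is a SystemVerilog methodology; as a language-neutral illustration of the coverage-driven loop described here (random legal stimuli, self-checking scoreboard, coverage monitor deciding when the goal is met), a small Python sketch with a hypothetical DUT follows.

```python
# Illustrative coverage-driven verification loop (UVM is SystemVerilog; this
# Python sketch only mirrors the flow: random stimulus -> DUT -> scoreboard,
# with a coverage monitor deciding when to stop). The DUT is hypothetical.
import random

def dut_saturating_add(a, b):                 # device under test: 4-bit saturating adder
    return min(a + b, 15)

def reference_model(a, b):                    # golden model used by the scoreboard
    return min(a + b, 15)

coverage_bins = set()                         # functional coverage: which results were exercised
goal = set(range(16))
failures = 0

while coverage_bins != goal:
    a, b = random.randint(0, 15), random.randint(0, 15)   # legal random stimulus
    result = dut_saturating_add(a, b)
    if result != reference_model(a, b):       # scoreboard check
        failures += 1
    coverage_bins.add(result)                 # coverage monitor

print(f"coverage {len(coverage_bins)}/16 bins, {failures} mismatches")
```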

  7. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was evaluated on 900 signatures belonging to 50 participants, with 3 signatures per participant used as references and 5 test signatures each from the original user, simple impostors, and trained impostors. The final system was tested with 50 participants and 3 references each. Without impostors, the system accuracy is 90.449% at threshold 44, with a rejection error rate (FNMR) of 5.2% and an acceptance error rate (FMR) of 4.351%. With impostors, the system accuracy is 80.136% at threshold 27, with a rejection error rate (FNMR) of 15.6% and an average acceptance error rate (FMR) of 4.264%, with details as follows: 0.392% acceptance errors, 3.2% acceptance errors for simple impostors, and 9.2% acceptance errors for trained impostors.
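
    The matching step pairs extracted time-series features with dynamic time warping (DTW). A generic DTW distance with a threshold decision is sketched below; the feature extraction and the threshold value are not the paper's, just placeholders.

```python
# Minimal dynamic time warping (DTW) distance between two 1-D feature sequences,
# as used when matching a test signature against a reference. Feature extraction
# (pressure, x/y trajectories, etc.) is omitted; sequences and the threshold are
# illustrative only.
import numpy as np

def dtw_distance(s, t):
    n, m = len(s), len(t)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

reference = np.array([0.1, 0.4, 0.8, 0.6, 0.2])
test      = np.array([0.1, 0.3, 0.7, 0.7, 0.5, 0.2])
threshold = 1.0                                   # acceptance threshold (illustrative)
print("accept" if dtw_distance(reference, test) <= threshold else "reject")
```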

  8. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    Science.gov (United States)

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
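
    One of the two checks mentioned, cumulative signal checking, amounts to comparing running sums of predicted and measured frame signals as the delivery proceeds. The sketch below shows that idea only; the frame data and the 10% tolerance are illustrative, not the published system's parameters.

```python
# Simplified cumulative-signal check in the spirit of the real-time EPID system:
# compare running sums of predicted and measured frame signals and flag the
# delivery once the relative deviation exceeds a tolerance. All values are
# illustrative assumptions.
import numpy as np

predicted_frames = np.full(100, 1.00)             # predicted integral signal per frame
measured_frames  = np.full(100, 1.12)             # e.g. a ~12% over-delivery
tolerance = 0.10

cum_pred = cum_meas = 0.0
for k, (p, m) in enumerate(zip(predicted_frames, measured_frames), start=1):
    cum_pred += p
    cum_meas += m
    deviation = abs(cum_meas - cum_pred) / cum_pred
    if k > 10 and deviation > tolerance:          # wait a few frames to reduce noise trips
        print(f"gross-error flag at frame {k}: {deviation:.1%} cumulative deviation")
        break
```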

  9. BAVP: Blockchain-Based Access Verification Protocol in LEO Constellation Using IBE Keys

    OpenAIRE

    Wei, Songjie; Li, Shuai; Liu, Peilong; Liu, Meilin

    2018-01-01

    LEO constellation has received intensive research attention in the field of satellite communication. The existing centralized authentication protocols traditionally used for MEO/GEO satellite networks cannot accommodate LEO satellites with frequent user connection switching. This paper proposes a fast and efficient access verification protocol named BAVP by combining identity-based encryption and blockchain technology. Two different key management schemes with IBE and blockchain, respectively...

  10. Comparison of carina-based versus bony anatomy-based registration for setup verification in esophageal cancer radiotherapy.

    Science.gov (United States)

    Machiels, Mélanie; Jin, Peng; van Gurp, Christianne H; van Hooft, Jeanin E; Alderliesten, Tanja; Hulshof, Maarten C C M

    2018-03-21

    To investigate the feasibility and geometric accuracy of carina-based registration for CBCT-guided setup verification in esophageal cancer IGRT, compared with current practice bony anatomy-based registration. Included were 24 esophageal cancer patients with 65 implanted fiducial markers, visible on planning CTs and follow-up CBCTs. All available CBCT scans (n = 236) were rigidly registered to the planning CT with respect to the bony anatomy and the carina. Target coverage was visually inspected and marker position variation was quantified relative to both registration approaches; the variation of systematic (Σ) and random errors (σ) was estimated. Automatic carina-based registration was feasible in 94.9% of the CBCT scans, with an adequate target coverage in 91.1% compared to 100% after bony anatomy-based registration. Overall, Σ (σ) in the LR/CC/AP direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm using the bony anatomy registration compared to 3.3(3.0)/3.6(2.6)/3.9(3.1) mm for the carina. Mid-thoracic placed markers showed a non-significant but smaller Σ in CC and AP direction when using the carina-based registration. Compared with a bony anatomy-based registration, carina-based registration for esophageal cancer IGRT results in inadequate target coverage in 8.9% of cases. Furthermore, large Σ and σ, requiring larger anisotropic margins, were seen after carina-based registration. Only for tumors entirely confined to the mid-thoracic region the carina-based registration might be slightly favorable.
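
    The Σ and σ values quoted per axis are conventionally estimated from per-patient setup errors: Σ as the standard deviation of the patient mean errors and σ as the root mean square of the per-patient standard deviations. A small sketch with synthetic displacements (not the study data) follows.

```python
# Conventional estimation of group systematic (Sigma) and random (sigma) setup
# error components from per-patient marker displacements, as quoted per axis in
# the abstract. The displacement data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
# displacements[p] = per-fraction CC displacements (mm) for patient p
displacements = [rng.normal(loc=rng.normal(0, 3), scale=2.5, size=10) for _ in range(24)]

patient_means = np.array([d.mean() for d in displacements])
patient_sds   = np.array([d.std(ddof=1) for d in displacements])

Sigma = patient_means.std(ddof=1)          # systematic component: SD of patient means
sigma = np.sqrt(np.mean(patient_sds**2))   # random component: RMS of patient SDs
print(f"Sigma = {Sigma:.1f} mm, sigma = {sigma:.1f} mm")
```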

  11. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu; Computing In High Energy and Nuclear Physics

    2007-01-01

    In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 computers in a network. At this system scale, software and hardware failures are quite frequent. To minimize system downtime, the Trigger-DAQ control system shall include advanced verification and diagnostics facilities. The operator should use tests and the expertise of the TDAQ and detector developers in order to diagnose and recover from errors, if possible automatically. The TDAQ control system is built as a distributed tree of controllers, where the behavior of each controller is defined in a rule-based language allowing easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system with different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  12. Android-Based Verification System for Banknotes

    Directory of Open Access Journals (Sweden)

    Ubaid Ur Rahman

    2017-11-01

    Full Text Available With the advancement in imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes loss to banks, traders, and individuals involved in financial transactions. Hence, efficient and reliable techniques for the detection of counterfeit banknotes are needed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing related tasks on these phones. In addition, the number of smartphone users has grown greatly and continues to increase. This is a great motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and the surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by the X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered, since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes are very different in terms of the printing process, the ingredients used in preparation of banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.
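
    The final stage described here is a support vector machine operating on per-note features. A minimal scikit-learn sketch with made-up feature values (the real pipeline extracts them from smartphone images of the serial-number and flag regions) is shown below.

```python
# Sketch of the classification stage: an SVM separating genuine from counterfeit
# notes using a few per-note features (e.g. intensity statistics and a roughness
# measure). Feature values and their scales are hypothetical.
import numpy as np
from sklearn.svm import SVC

X = np.array([   # [mean_intensity, intensity_std, roughness_index]
    [0.62, 0.11, 0.80], [0.60, 0.12, 0.78], [0.64, 0.10, 0.82],   # genuine
    [0.55, 0.20, 0.40], [0.53, 0.22, 0.45], [0.57, 0.19, 0.38],   # counterfeit
])
y = np.array([1, 1, 1, 0, 0, 0])     # 1 = genuine, 0 = counterfeit

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
query = np.array([[0.61, 0.12, 0.79]])
print("genuine" if clf.predict(query)[0] == 1 else "counterfeit")
```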

  13. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  14. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  15. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Energy Technology Data Exchange (ETDEWEB)

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  16. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    International Nuclear Information System (INIS)

    Fuangrod, Todsaporn; Woodruff, Henry C.; O’Connor, Daryl J.; Uytven, Eric van; McCurdy, Boyd M. C.; Kuncic, Zdenka; Greer, Peter B.

    2013-01-01

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy

  17. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    International Nuclear Information System (INIS)

    Tachibana, H; Tachibana, R

    2015-01-01

    Purpose: Lung SBRT planning has shifted to a volume prescription technique. However, point dose agreement is still what is verified by independent dose verification at the secondary check. Volume dose verification is more strongly affected by the inhomogeneity correction than the point dose verification currently used for the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with / without the correction and the SMU was assessed. Results: In the point dose evaluation at the center of the GTV, the difference showed a systematic shift (4.5% ± 1.9%) in comparison with the AC with inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account

  18. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    Science.gov (United States)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  19. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    Science.gov (United States)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  20. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to capture and preserve digital evidence securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one side, it is very challenging for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, no previous work proposes a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
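
    The core cryptographic idea, binding each evidence item to a hash, a timestamp, and an authentication tag, can be sketched in a few lines. The HMAC below stands in for the PKI digital signature and trusted time-stamping that the model actually prescribes; the key and labels are hypothetical.

```python
# Integrity-protection sketch in the spirit of PKIDEV: hash each evidence item,
# bind it to a timestamp, and authenticate the record. A keyed HMAC stands in
# for a real PKI signature and trusted time-stamping authority.
import hashlib, hmac, json, time

SIGNING_KEY = b"collector-device-secret"        # stand-in for a private key

def seal_evidence(payload: bytes, label: str) -> dict:
    record = {
        "label": label,
        "sha256": hashlib.sha256(payload).hexdigest(),
        "timestamp": int(time.time()),
    }
    msg = json.dumps(record, sort_keys=True).encode()
    record["tag"] = hmac.new(SIGNING_KEY, msg, hashlib.sha256).hexdigest()
    return record

def verify_evidence(payload: bytes, record: dict) -> bool:
    body = {k: v for k, v in record.items() if k != "tag"}
    msg = json.dumps(body, sort_keys=True).encode()
    tag_ok = hmac.compare_digest(
        record["tag"], hmac.new(SIGNING_KEY, msg, hashlib.sha256).hexdigest())
    return tag_ok and hashlib.sha256(payload).hexdigest() == record["sha256"]

evidence = b"disk image bytes ..."
rec = seal_evidence(evidence, "seized-laptop-sda")
print(verify_evidence(evidence, rec))           # True; any tampering flips this to False
```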

  1. The research for the design verification of nuclear power plant based on VR dynamic plant

    International Nuclear Information System (INIS)

    Wang Yong; Yu Xiao

    2015-01-01

    This paper studies a new method of design verification based on a VR plant, in order to verify and validate that the plant design conforms to the requirements of accident emergency response. The VR dynamic plant is built from the 3D design model and digital maps composed of a GIS system and indoor maps, and is driven by the analysis data of a design analyzer. The VR plant can present both the operating and accident conditions of the power plant. Based on the VR dynamic plant, this paper simulates the execution of accident procedures, the development of accidents, the evacuation planning of people and so on, to ensure that the plant design will not cause adverse effects. Besides design verification, the simulation results can also be used for optimization of the accident emergency plan, training on the accident plan, and emergency accident treatment. (author)

  2. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend Frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three blocked sectors in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP, Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house written MATLAB code. Results: Gamma index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film.
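
    The calibration step described here maps film response to dose with a third-degree polynomial fitted to the 0-12 Gy exposures. A minimal fit-and-apply sketch with synthetic calibration points (not the published data) is given below.

```python
# Fitting a third-degree polynomial calibration curve (net optical density ->
# dose) and applying it to a measured film reading, as described in the
# abstract. Calibration points below are synthetic placeholders.
import numpy as np

dose_gy = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0])        # delivered doses
net_od  = np.array([0.04, 0.08, 0.15, 0.27, 0.46, 0.60, 0.70])  # film response

coeffs = np.polyfit(net_od, dose_gy, deg=3)     # dose as a cubic function of response
calibration = np.poly1d(coeffs)

measured_od = 0.33
print(f"estimated dose = {calibration(measured_od):.2f} Gy")
```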

  3. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    International Nuclear Information System (INIS)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor

    2013-01-01

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend Frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three blocked sectors in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP, Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house written MATLAB code. Results: Gamma index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film

  4. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Juntendo University, Hongo, Tokyo (Japan); Hongo, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Tsukuba University, Tsukuba, Ibaraki (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Hashimoto, H [Shonan Fujisawa Tokushukai Hospital, Fujisawa, Kanagawa (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: There have been few reports on independent dose verification for Tomotherapy. We evaluated the accuracy and the effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; it implements a Clarkson-based dose calculation algorithm using the CT image dataset. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan information was converted into MU and MLC-position information at more finely segmented control points. The performance of the SMU was assessed by a point dose measurement in non-IMRT and IMRT plans (simple target and mock prostate plans). Subsequently, 30 patient treatment plans for prostate were compared. Results: From the comparison, dose differences between the SMU and the measurement were within 3% for all cases in non-IMRT plans. In the IMRT plan for the simple target, the differences (Average±1SD) were −0.70±1.10% (SMU vs. TPS), −0.40±0.10% (measurement vs. TPS) and −1.20±1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40±0.60% (SMU vs. TPS), −0.50±0.90% (measurement vs. TPS) and −0.90±0.60% (measurement vs. SMU), respectively. For patients’ plans, the difference was −0.50±2.10% (SMU vs. TPS). Conclusion: A Clarkson-based independent dose verification for Tomotherapy can be clinically available as a secondary check, with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED)
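
    A Clarkson-style calculation of the kind implemented in such secondary-check software splits the irregular field around the calculation point into sectors, looks up a TMR for each sector's equivalent radius, and averages the contributions. The sketch below shows only that sector-integration idea; the TMR table, sector radii, MU, and reference output are hypothetical placeholders, not the SMU implementation.

```python
# Minimal Clarkson-style sector integration: each sector contributes the TMR of
# a circular field with its radius, and the contributions are averaged to form
# an effective TMR for the irregular field. All numbers are illustrative.
import numpy as np

tmr_radius_cm = np.array([1.0, 2.0, 3.0, 4.0, 5.0])       # equivalent circle radii
tmr_value     = np.array([0.70, 0.76, 0.80, 0.83, 0.85])  # TMR at a fixed depth

def tmr_for_radius(r):
    return np.interp(r, tmr_radius_cm, tmr_value)

# radius from the calculation point to the field edge in each of N sectors
sector_radii = np.array([2.1, 2.4, 3.0, 3.5, 3.2, 2.8, 2.3, 2.0])

tmr_eff = tmr_for_radius(sector_radii).mean()   # sector-averaged TMR
mu, dose_per_mu_ref = 250.0, 0.01               # monitor units and reference output (Gy/MU)
print(f"independent point dose ~ {mu * dose_per_mu_ref * tmr_eff:.2f} Gy")
```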

  5. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach

  6. Quality verification at Arkansas Nuclear One using performance-based concepts

    International Nuclear Information System (INIS)

    Cooper, R.M.

    1990-01-01

    Performance-based auditing is beginning to make an impact within the nuclear industry. Its use provides performance assessments of the operating plant. In the past, this company, along with most other nuclear utilities, performed compliance-based audits. These audits focused on paper reviews of past activities that were completed in weeks or months. This type of audit did not provide a comprehensive assessment of the effectiveness of an activity's performance, nor was it able to identify any performance problems that may have occurred. To address this shortcoming, a comprehensive overhaul of quality assurance (QA) assessment programs was developed. The first major change was to develop a technical specification (tech spec) audit program, with the objective of auditing each tech spec line item every 5 yr. To achieve performance-based results within the tech spec audit program, a tech spec surveillance program was implemented whose goal is to observe 75% of the tech-spec required tests every 5 yr. The next major change was to develop a QA surveillance program that would provide surveillance coverage for the remainder of the plant not covered by the tech spec surveillance program. One other improvement was to merge the QA/quality control (QC) functions into one nuclear quality group. The final part of the quality verification effort is trending of the quality performance-based data (including US Nuclear Regulatory Commission (NRC) violations)

  7. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm

  8. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator supporting systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems have been developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that suggest traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest a heuristic analysis for the validation process to show that the reasoning path is correct
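
    Two of the anomaly classes such a checker automates, contradictory rules and circular rule chains, can be illustrated directly on an if-then rule list; the Petri-net machinery is what makes the checks systematic at scale. The rule base below is hypothetical and unrelated to COKEP's internals.

```python
# Toy illustration of two knowledge-base anomaly checks a tool like COKEP
# automates: contradictory rules (same premises, conflicting conclusions) and
# circular rule chains. The rule base is a made-up example.
rules = [
    ({"high_temp", "low_flow"}, "raise_alarm"),
    ({"high_temp", "low_flow"}, "suppress_alarm"),   # contradicts the rule above
    ({"raise_alarm"}, "notify_operator"),
    ({"notify_operator"}, "raise_alarm"),            # circular with the rule above
]

conflicts = {("raise_alarm", "suppress_alarm")}

# 1. contradictory rules: identical premises, conflicting conclusions
for i, (prem_a, concl_a) in enumerate(rules):
    for prem_b, concl_b in rules[i + 1:]:
        if prem_a == prem_b and (concl_a, concl_b) in conflicts:
            print(f"contradiction: {sorted(prem_a)} -> {concl_a} vs {concl_b}")

# 2. circular chains: two single-premise rules that derive each other
derives = {(next(iter(p)), c) for p, c in rules if len(p) == 1}
for a, b in derives:
    if (b, a) in derives and a < b:
        print(f"circular rules: {a} <-> {b}")
```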

  9. Verification and Planning Based on Coinductive Logic Programming

    Science.gov (United States)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Whereas induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and Co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and Co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution
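
    The key operational rule quoted above, that a goal succeeds if it unifies with an ancestor call, can be illustrated outside of a logic-programming engine. The Python sketch below only mimics that coinductive success rule on a cyclic "stream" structure; it is not co-SLD resolution itself, and the stream predicate is an illustrative example.

```python
# Toy illustration of the coinductive success rule of co-SLD resolution: a call
# succeeds if it matches one of its ancestor calls, so cyclic (rational)
# structures are accepted instead of looping forever. Not a logic engine.

def is_stream(term, ancestors=()):
    """stream(X) :- X = [H|T], stream(T).  Coinductively, a cyclic list succeeds."""
    if id(term) in ancestors:          # goal "unifies" with an ancestor call -> succeed
        return True
    if not (isinstance(term, list) and len(term) == 2):
        return False                   # expect cons cells encoded as [head, tail]
    return is_stream(term[1], ancestors + (id(term),))

ones = [1, None]
ones[1] = ones                         # rational infinite term: ones = [1 | ones]
print(is_stream(ones))                 # True under coinduction (would loop inductively)
print(is_stream([1, [2, []]]))         # finite list fails the coinductive stream check
```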

  10. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    Full Text Available We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of the reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.
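
    A simple way to picture the use of unimodal reliability information in fusion is to weight each modality's score by its estimated reliability. The sketch below is only that generic idea with made-up scores, reliabilities, and threshold; it is not the paper's estimator or fusion rule.

```python
# Sketch of reliability-weighted fusion for a bimodal verification system: each
# modality contributes a match score and an estimate of its own reliability, and
# the fused score down-weights the less reliable channel. All values are
# illustrative.
def fuse(scores, reliabilities):
    total = sum(reliabilities)
    return sum((r / total) * s for r, s in zip(reliabilities, scores))

face_score,   face_reliability   = 0.42, 0.55    # borderline, low-confidence face match
speech_score, speech_reliability = 0.81, 0.90    # strong, high-confidence speech match

fused = fuse([face_score, speech_score], [face_reliability, speech_reliability])
decision = "accept" if fused >= 0.5 else "reject"
print(f"fused score = {fused:.2f} -> {decision}")
```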

  11. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    Science.gov (United States)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

    When designing a time management algorithm for DVEs, researchers are often slowed down by the distraction of implementing the trivial but fundamental details of simulation and verification. Therefore, a platform that already implements these details is desirable. However, this has not been achieved in any published work to our knowledge. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the platform introduces only a small overhead, and that its efficient performance allows researchers to focus solely on improving their algorithm designs.

  12. Design Verification Enhancement of FPGA-based Plant Protection System Trip Logics for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jae Cheon; Heo, Gyun Young

    2016-01-01

    As part of strengthening the application of FPGA technology and finding solutions to its challenges in NPPs, the International Atomic Energy Agency (IAEA) has indicated interest by co-sponsoring the Topical Group on FPGA Applications in NPPs (TG-FAN), which has met seven times to date in the form of an annual workshop (the International Workshop on the Application of FPGAs in NPPs) held since 2008. The workshops have attracted significant interest and a broad representation of stakeholders, such as regulators, utilities, research organizations, system designers, and vendors from various countries, who converge to discuss current issues regarding instrumentation and control (I and C) systems as well as FPGA applications. Two of the many technical issues identified by the group are the lifecycle of FPGA-based platforms, systems, and applications, and the methods and tools for V and V. Therefore, in this work, several design steps are employed that involve a model-based systems engineering process as well as a MATLAB/SIMULINK model, which lead to the enhancement of design verification. The verified and validated design output works correctly and effectively. In conclusion, the model-based systems engineering approach and the structured step-by-step design modeling techniques, including the SIMULINK model utilized in this work, have shown how FPGA PPS trip logic design verification can be enhanced. If these design approaches are employed in the design of FPGA-based I and C systems, the designs can be easily verified and validated
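
    The kind of trip-logic behavior verified in such a model-based flow can be illustrated language-neutrally: a coincidence vote over redundant channels checked exhaustively against its specification, as one might do in simulation before HDL implementation. The 2-out-of-4 scheme and setpoint below are generic illustrations, not the PPS design described in the paper.

```python
# Language-neutral sketch of a plant-protection trip logic and an exhaustive
# check of it, mirroring the behavior-level verification done with models before
# the FPGA implementation. The 2-out-of-4 voting scheme and setpoint are
# hypothetical.
from itertools import product

SETPOINT = 110.0                                   # % power, illustrative

def channel_trip(measurement: float) -> bool:
    return measurement > SETPOINT

def trip_2oo4(channels) -> bool:                   # 2-out-of-4 coincidence vote
    return sum(channel_trip(c) for c in channels) >= 2

# exhaustive check: trip exactly when >= 2 channels exceed the setpoint
for states in product([100.0, 120.0], repeat=4):   # below / above setpoint per channel
    expected = sum(s > SETPOINT for s in states) >= 2
    assert trip_2oo4(states) == expected
print("2oo4 trip logic verified over all 16 channel combinations")
```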

  13. Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.

    Science.gov (United States)

    Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David

    2013-12-01

    Recently, a new biometric identifier, namely the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce false rejections without greatly increasing false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.

  14. On marker-based parentage verification via non-linear optimization.

    Science.gov (United States)

    Boerner, Vinzent

    2017-06-15

    Parentage verification by molecular markers is mainly based on short tandem repeat markers. Single nucleotide polymorphisms (SNPs) as bi-allelic markers have become the markers of choice for genotyping projects. Thus, the subsequent step is to use SNP genotypes for parentage verification as well. Recently developed algorithms, such as evaluating opposing homozygous SNP genotypes, have drawbacks, for example the inability to reject all animals in a sample of potential parents. This paper describes an algorithm for parentage verification by constrained regression which overcomes the latter limitation and proves to be very fast and accurate even when the number of SNPs is as low as 50. The algorithm was tested on a sample of 14,816 animals with 50, 100 and 500 SNP genotypes randomly selected from 40k genotypes. The samples of putative parents of these animals contained either five random animals, or four random animals and the true sire. Parentage assignment was performed by ranking of regression coefficients, or by setting a minimum threshold for regression coefficients. The assignment quality was evaluated by the power of assignment (P[Formula: see text]) and the power of exclusion (P[Formula: see text]). If the sample of putative parents contained the true sire and parentage was assigned by coefficient ranking, P[Formula: see text] and P[Formula: see text] were both higher than 0.99 for the 500 and 100 SNP genotypes, and higher than 0.98 for the 50 SNP genotypes. When parentage was assigned by a coefficient threshold, P[Formula: see text] was higher than 0.99 regardless of the number of SNPs, but P[Formula: see text] decreased from 0.99 (500 SNPs) to 0.97 (100 SNPs) and 0.92 (50 SNPs). If the sample of putative parents did not contain the true sire and parentage was rejected using a coefficient threshold, the algorithm achieved a P[Formula: see text] of 1 (500 SNPs), 0.99 (100 SNPs) and 0.97 (50 SNPs). The algorithm described here is easy to implement
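
    A rough picture of ranking putative parents by regression coefficients can be given with non-negative least squares as a stand-in for the paper's constrained regression; the genotype data below are synthetic, and the true sire would typically (not necessarily) receive the largest coefficient.

```python
# Sketch of marker-based parentage checking by constrained regression: regress
# the offspring's SNP allele counts on the candidate parents' genotypes with
# non-negative coefficients and rank candidates by coefficient size. SciPy's
# nnls is used as a stand-in for the paper's method; genotypes are synthetic.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_snps, n_candidates = 100, 5
candidates = rng.integers(0, 3, size=(n_snps, n_candidates)).astype(float)

true_sire = candidates[:, 2]
dam_like  = rng.integers(0, 3, size=n_snps)
offspring = np.clip((true_sire + dam_like) / 2.0 + rng.normal(0, 0.3, n_snps), 0, 2)

coefficients, _ = nnls(candidates, offspring)
ranked = np.argsort(coefficients)[::-1]
print("candidate ranking (best first):", ranked)
print("coefficients:", coefficients.round(2))
```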

  15. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    International Nuclear Information System (INIS)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J; Watkins, W

    2016-01-01

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them with EPID-acquired 1024×768 resolution frames captured at ∼8.5 Hz from a Varian Truebeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real-time and utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm² and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software, with

  16. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States); Watkins, W

    2016-06-15

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them with EPID-acquired 1024×768 resolution frames captured at ∼8.5 Hz from a Varian Truebeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real-time and utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm² and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software, with

  17. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    Science.gov (United States)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service model. Customization in SaaS is usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions, and then proposes a verification algorithm to ensure that each customization step will not have an unpredictable influence on the system and will follow the related rules defined by the SaaS provider.

  18. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Science.gov (United States)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC with quad-core 3.2 GHz CPUs and a 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
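
    The matching rule the protocol ultimately evaluates, acceptance when the Hamming distance between fixed-size binary templates is below a threshold, is easy to state in the plain domain. The sketch below shows only that plaintext rule with hypothetical templates and threshold; THRIVE evaluates it under threshold homomorphic encryption and never exposes templates this way.

```python
# Plain-domain sketch of the Hamming-distance matching rule that THRIVE
# evaluates under encryption: accept when fewer than THRESHOLD bits differ
# between the stored and query binary templates. Values are illustrative.
import numpy as np

rng = np.random.default_rng(7)
enrolled = rng.integers(0, 2, size=256)            # 256-bit binary template
query = enrolled.copy()
flip = rng.choice(256, size=20, replace=False)     # simulate intra-user variation
query[flip] ^= 1

hamming = int(np.count_nonzero(enrolled != query))
THRESHOLD = 64                                     # accept if fewer than 64 bits differ
print(f"Hamming distance = {hamming} -> {'accept' if hamming < THRESHOLD else 'reject'}")
```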

  19. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. Readers are enabled with this simulation-only layer to maintain verification productivity by abstracting away the physical details of the FPGA fabric.  Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  20. Film based verification of calculation algorithms used for brachytherapy planning-getting ready for upcoming challenges of MBDCA

    Directory of Open Access Journals (Sweden)

    Grzegorz Zwierzchowski

    2016-08-01

    Full Text Available Purpose: A well-known defect of the TG-43-based algorithms used in brachytherapy is the lack of information about interaction cross-sections, which are determined not only by electron density but also by atomic number. The TG-186 recommendations, with the use of MBDCA (model-based dose calculation algorithms), accurate tissue segmentation, and the structures' elemental composition, continue to create difficulties in brachytherapy dosimetry. For the clinical use of new algorithms, it is necessary to introduce reliable and repeatable methods of treatment planning system (TPS) verification. The aim of this study is the verification of the calculation algorithm used in the TPS for shielded vaginal applicators, as well as developing verification procedures for current and further use, based on the film dosimetry method. Material and methods: Calibration data were collected by separately irradiating 14 sheets of Gafchromic® EBT films with doses from 0.25 Gy to 8.0 Gy using an HDR 192Ir source. Standard vaginal cylinders of three diameters were used in the water phantom. Measurements were performed without any shields and with three shield combinations. Gamma analyses were performed using the VeriSoft® package. Results: The calibration curve was determined as a third-degree polynomial. For all used diameters of the unshielded cylinder and for all shield combinations, gamma analyses were performed and showed that over 90% of the analyzed points meet the gamma criteria (3%, 3 mm). Conclusions: Gamma analysis showed good agreement between dose distributions calculated using the TPS and measured by Gafchromic films, thus showing the viability of using film dosimetry in brachytherapy.
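
    As a hedged illustration of the calibration step, the sketch below fits a third-degree polynomial calibration curve with NumPy. Only the dose levels (0.25–8.0 Gy) come from the record above; the net optical density values are made-up placeholders.

```python
import numpy as np

# Calibration doses from the record (Gy); netOD values are hypothetical placeholders
doses = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
net_od = np.array([0.03, 0.06, 0.11, 0.20, 0.34, 0.45, 0.54])

# Third-degree polynomial calibration curve: dose as a function of netOD
coeffs = np.polyfit(net_od, doses, deg=3)
dose_from_netod = np.poly1d(coeffs)

print(round(float(dose_from_netod(0.25)), 2))   # interpolated dose in Gy
```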

  1. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences - both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies, the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  2. Verification of gamma knife based fractionated radiosurgery with newly developed head-thorax phantom

    International Nuclear Information System (INIS)

    Bisht, Raj Kishor; Kale, Shashank Sharad; Natanasabapathi, Gopishankar; Singh, Manmohan Jit; Agarwal, Deepak; Garg, Ajay; Rath, Goura Kishore; Julka, Pramod Kumar; Kumar, Pratik; Thulkar, Sanjay; Sharma, Bhawani Shankar

    2016-01-01

    Objective: Purpose of the study is to verify the Gamma Knife Extend™ system (ES) based fractionated stereotactic radiosurgery with newly developed head-thorax phantom. Methods: Phantoms are extensively used to measure radiation dose and verify treatment plan in radiotherapy. A human upper body shaped phantom with thorax was designed to simulate fractionated stereotactic radiosurgery using Extend™ system of Gamma Knife. The central component of the phantom aids in performing radiological precision test, dosimetric evaluation and treatment verification. A hollow right circular cylindrical space of diameter 7.0 cm was created at the centre of this component to place various dosimetric devices using suitable adaptors. The phantom is made of poly methyl methacrylate (PMMA), a transparent thermoplastic material. Two sets of disk assemblies were designed to place dosimetric films in (1) horizontal (xy) and (2) vertical (xz) planes. Specific cylindrical adaptors were designed to place thimble ionization chamber inside phantom for point dose recording along xz axis. EBT3 Gafchromic films were used to analyze and map radiation field. The focal precision test was performed using 4 mm collimator shot in phantom to check radiological accuracy of treatment. The phantom head position within the Extend™ frame was estimated using encoded aperture measurement of repositioning check tool (RCT). For treatment verification, the phantom with inserts for film and ion chamber was scanned in reference treatment position using X-ray computed tomography (CT) machine and acquired stereotactic images were transferred into Leksell Gammaplan (LGP). A patient treatment plan with hypo-fractionated regimen was delivered and identical fractions were compared using EBT3 films and in-house MATLAB codes. Results: RCT measurement showed an overall positional accuracy of 0.265 mm (range 0.223 mm–0.343 mm). Gamma index analysis across fractions exhibited close agreement between LGP and film

  3. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease specific biomarkers has become a major undertaking in the biomedical research field as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker as an indicator to a specific biological or pathological process is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Due to its easy accessibility, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many non-significant species at orders of magnitude higher concentrations. The extreme requirements of measurement sensitivity, dynamic range and specificity make the method development extremely challenging. The current clinical protein biomarker measurement primarily relies on antibody based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high quality protein antibody is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement an extremely low-throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  4. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna including support and satellite structure with an appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...

  5. The relative importance of managerial competencies for predicting the perceived job performance of Broad-Based Black Economic Empowerment verification practitioners

    Directory of Open Access Journals (Sweden)

    Barbara M. Seate

    2016-04-01

    Full Text Available Orientation: There is a need for the growing Broad-Based Black Economic Empowerment (B-BBEE) verification industry to assess competencies and determine skills gaps for the management of the verification practitioners’ perceived job performance. Knowing which managerial competencies are important for different managerial functions is vital for developing and improving training and development programmes. Research purpose: The purpose of this study was to determine the managerial capabilities that are required of the B-BBEE verification practitioners, in order to improve their perceived job performance. Motivation for the study: The growing number of the B-BBEE verification practitioners calls for more focused training and development. Generating such a training and development programme demands empirical research into the relative importance of managerial competencies. Research approach, design and method: A quantitative design using the survey approach was adopted. A questionnaire was administered to a stratified sample of 87 B-BBEE verification practitioners. Data were analysed using the Statistical Package for Social Sciences (version 22.0) and Smart Partial Least Squares software. Main findings: The results of the correlation analysis revealed that there were strong and positive associations between technical skills, interpersonal skills, compliance to standards and ethics, managerial skills and perceived job performance. Results of the regression analysis showed that managerial skills, compliance to standards and ethics and interpersonal skills were statistically significant in predicting perceived job performance. However, technical skills were insignificant in predicting perceived job performance. Practical/managerial implications: The study has shown that the B-BBEE verification industry, insofar as the technical skills of the practitioners are concerned, does have suitably qualified staff with the requisite educational qualifications. At

  6. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  7. Tree automata-based refinement with application to Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivation...... compare the results with other state of the art Horn clause verification tools....

  8. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

    The Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates the V and V process followed, document submission requirements in each stage, V and V activities, the checklist used for reviews in each stage and reports

  9. BAVP: Blockchain-Based Access Verification Protocol in LEO Constellation Using IBE Keys

    Directory of Open Access Journals (Sweden)

    Songjie Wei

    2018-01-01

    Full Text Available LEO constellation has received intensive research attention in the field of satellite communication. The existing centralized authentication protocols traditionally used for MEO/GEO satellite networks cannot accommodate LEO satellites with frequent user connection switching. This paper proposes a fast and efficient access verification protocol named BAVP by combining identity-based encryption and blockchain technology. Two different key management schemes with IBE and blockchain, respectively, are investigated, which further enhance the authentication reliability and efficiency in LEO constellation. Experiments on OPNET simulation platform evaluate and demonstrate the effectiveness, reliability, and fast-switching efficiency of the proposed protocol. For LEO networks, BAVP surpasses the well-known existing solutions with significant advantages in both performance and scalability which are supported by theoretical analysis and simulation results.

  10. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available The handwritten signature is broadly utilized for personal verification in financial institutions, which ensures the necessity for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transactions’ sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria for judging a signature verification tool in banking and other financial institutions.

  11. Computer-aided diagnosis of mammographic masses using geometric verification-based image retrieval

    Science.gov (United States)

    Li, Qingliang; Shi, Weili; Yang, Huamin; Zhang, Huimao; Li, Guoxin; Chen, Tao; Mori, Kensaku; Jiang, Zhengang

    2017-03-01

    Computer-Aided Diagnosis of masses in mammograms is an important indicator of breast cancer. The use of retrieval systems in breast examination is increasing gradually. In this respect, the method of exploiting the vocabulary tree framework and the inverted file in mammographic mass retrieval has been proved to offer high accuracy and excellent scalability. However, it considers the features in each image only as visual words and ignores the spatial configuration of features, which greatly affects the retrieval performance. To overcome this drawback, we introduce a geometric verification method for the retrieval of mammographic masses. First of all, we obtain corresponding matched features based on the vocabulary tree framework and the inverted file. After that, we capture the local similarity characteristics of deformations in the local regions by constructing circle regions around corresponding pairs. Meanwhile, we segment the circle to express the geometric relationship of local matches in the area and generate the spatial encoding strictly. Finally, we judge whether the matched features are correct or not by verifying whether all spatial encodings satisfy geometric consistency. Experiments show the promising results of our approach.

  12. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provides assistance to the subject...

  13. Verification test for on-line diagnosis algorithm based on noise analysis

    International Nuclear Information System (INIS)

    Tamaoki, T.; Naito, N.; Tsunoda, T.; Sato, M.; Kameda, A.

    1980-01-01

    An on-line diagnosis algorithm was developed and its verification test was performed using a minicomputer. This algorithm identifies the plant state by analyzing various system noise patterns, such as power spectral densities, coherence functions etc., in three procedure steps. Each obtained noise pattern is examined by using the distances from its reference patterns prepared for various plant states. Then, the plant state is identified by synthesizing each result with an evaluation weight. This weight is determined automatically from the reference noise patterns prior to on-line diagnosis. The test was performed with 50 MW (th) Steam Generator noise data recorded under various controller parameter values. The algorithm performance was evaluated based on a newly devised index. The results obtained with one kind of weight showed the algorithm efficiency under the proper selection of noise patterns. Results for another kind of weight showed the robustness of the algorithm to this selection. (orig.)
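
    As a hedged sketch of the pattern-distance idea in this record, the snippet below scores each candidate plant state by weighted distances between measured and reference noise patterns. The function names, the Euclidean distance, and the dictionary layout are assumptions, not the original minicomputer implementation.

```python
import numpy as np

def identify_state(measured, references, weights):
    """Identify the plant state from noise patterns.

    measured   : dict  pattern_name -> 1D array (e.g. a PSD or coherence curve)
    references : dict  pattern_name -> dict state_name -> 1D reference array
    weights    : dict  pattern_name -> evaluation weight (determined beforehand
                 from the reference patterns)
    """
    states = next(iter(references.values())).keys()
    scores = {state: 0.0 for state in states}
    for name, pattern in measured.items():
        for state, ref in references[name].items():
            distance = np.linalg.norm(pattern - ref)     # distance from reference pattern
            scores[state] += weights[name] * distance    # weighted synthesis across patterns
    return min(scores, key=scores.get)                   # smallest combined distance wins
```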

  14. A novel method for sub-arc VMAT dose delivery verification based on portal dosimetry with an EPID.

    Science.gov (United States)

    Cools, Ruud A M; Dirkx, Maarten L P; Heijmen, Ben J M

    2017-11-01

    The EPID-based sub-arc verification of VMAT dose delivery requires synchronization of the acquired electronic portal images (EPIs) with the VMAT delivery, that is, establishment of the start- and stop-MU of the acquired images. To realize this, published synchronization methods propose the use of logging features of the linac or dedicated hardware solutions. In this study, we developed a novel, software-based synchronization method that only uses information inherently available in the acquired images. The EPIs are continuously acquired during pretreatment VMAT delivery and converted into Portal Dose Images (PDIs). Sub-arcs of approximately 10 MU are then defined by combining groups of sequentially acquired PDIs. The start- and stop-MUs of measured sub-arcs are established in a synchronization procedure, using only dosimetric information in measured and predicted PDIs. Sub-arc verification of a VMAT dose delivery is based on comparison of measured sub-arc PDIs with synchronized, predicted sub-arc PDIs, using γ-analyses. To assess the accuracy of this new method, measured and predicted PDIs were compared for 20 clinically applied VMAT prostate cancer plans. The sensitivity of the method for detection of delivery errors was investigated using VMAT deliveries with intentionally inserted, small perturbations (25 error scenarios; leaf gap deviations ≤ 1.5 mm, leaf motion stops during ≤ 15 MU, linac output error ≤ 2%). For the 20 plans, the average failed pixel rates (FPR) for full-arc and sub-arc dose QA were 0.36% ± 0.26% (1 SD) and 0.64% ± 0.88%, based on 2%/2 mm and 3%/3 mm γ-analyses, respectively. Small systematic perturbations of up to 1% output error and 1 mm leaf offset were detected using full-arc QA. Sub-arc QA was able to detect positioning errors in three leaves only during approximately 20 MU and small dose delivery errors during approximately 40 MU. In an ROC analysis, the area under the curve (AUC) for the combined full-arc/sub-arc approach was
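
    The γ-analysis underlying the full-arc and sub-arc comparisons can be illustrated with a small, generic 2D gamma computation; this is not the authors' code. It uses a brute-force search window and wrap-around shifts for brevity, and the low-dose threshold is an assumption.

```python
import numpy as np

def gamma_fail_rate(reference, evaluated, pixel_mm, dose_crit=0.03, dist_mm=3.0,
                    threshold=0.1):
    """Global 2D gamma analysis returning the failed-pixel rate in percent.

    reference, evaluated : 2D dose/PDI arrays on the same grid (pixel_mm spacing).
    dose_crit is relative to the reference maximum; pixels below
    threshold*max are ignored. Edge wrap-around and sub-pixel interpolation
    are ignored for brevity.
    """
    dmax = reference.max()
    dd = dose_crit * dmax
    r = int(np.ceil(dist_mm / pixel_mm))                  # search radius in pixels
    gamma_sq = np.full_like(reference, np.inf, dtype=float)

    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            dist2 = (dy * pixel_mm) ** 2 + (dx * pixel_mm) ** 2
            if dist2 > dist_mm ** 2:
                continue
            shifted = np.roll(evaluated, (dy, dx), axis=(0, 1))
            g = (reference - shifted) ** 2 / dd ** 2 + dist2 / dist_mm ** 2
            gamma_sq = np.minimum(gamma_sq, g)

    gamma = np.sqrt(gamma_sq)
    mask = reference > threshold * dmax
    return 100.0 * np.count_nonzero(gamma[mask] > 1.0) / np.count_nonzero(mask)
```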

  15. Validation Of Critical Knowledge-Based Systems

    Science.gov (United States)

    Duke, Eugene L.

    1992-01-01

    Report discusses approach to verification and validation of knowledge-based systems. Also known as "expert systems". Concerned mainly with development of methodologies for verification of knowledge-based systems critical to flight-research systems; e.g., fault-tolerant control systems for advanced aircraft. Subject matter also has relevance to knowledge-based systems controlling medical life-support equipment or commuter railroad systems.

  16. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  17. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  18. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    International Nuclear Information System (INIS)

    Magazzù, G; Borgese, G; Costantino, N; Fanucci, L; Saponara, S; Incandela, J

    2013-01-01

    In many research fields such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility to have a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the set-up of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performances when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results will be shown to underline the usefulness of this design and verification approach that can be applied to any new digital protocol development for smart detectors in physics experiments.

  19. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    Science.gov (United States)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility to have a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the set-up of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performances when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results will be shown to underline the usefulness of this design and verification approach that can be applied to any new digital protocol development for smart detectors in physics experiments.

  20. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
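
    A minimal forward-chaining monitor illustrates the general mechanism described here: Horn clauses in implication form are saturated to a fixed point as trace events arrive. The rule encoding and API below are illustrative assumptions and do not reproduce the paper's FLTL translation.

```python
# Minimal forward-chaining monitor over propositional Horn clauses
# (a sketch of the general technique, not the paper's tool).

class Monitor:
    def __init__(self, rules):
        # rules: list of (body_atoms, head_atom) pairs, i.e. body -> head
        self.rules = [(frozenset(body), head) for body, head in rules]
        self.facts = set()

    def observe(self, *events):
        """Add observed events for the current trace step and saturate."""
        self.facts.update(events)
        changed = True
        while changed:                                  # forward chaining to a fixed point
            changed = False
            for body, head in self.rules:
                if body <= self.facts and head not in self.facts:
                    self.facts.add(head)
                    changed = True
        return "violation" not in self.facts            # True while the trace is still good

# Example rule: observing both 'request' and 'deny' in the trace derives a violation.
rules = [({"request", "deny"}, "violation")]
m = Monitor(rules)
print(m.observe("request"))   # True  (no violation yet)
print(m.observe("deny"))      # False (violation derived)
```

    Because each monitor only saturates a fixed rule set against a growing fact base, it avoids the branching structure of tableaux-based constructions, which is the point the abstract makes.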

  1. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    OpenAIRE

    Jin-Won Park; Sung Bum Pan; Yongwha Chung; Daesung Moon

    2009-01-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification i...

  2. Graph-based specification and verification for aspect-oriented languages

    NARCIS (Netherlands)

    Staijen, T.

    2010-01-01

    Aspect-oriented software development aims at improving separation of concerns at all levels in the software development life-cycle, from architecture to code implementation. In this thesis we strive to develop verification methods specifically for aspect-oriented programming languages. For this

  3. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, R; Kamima, T [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: To investigate the effect of the trajectory log files from the linear accelerator on Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse with the three techniques of step and shoot (SS), sliding window (SW) and RapidArc (RA) was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection capability showed that the plans for HN IMRT were the most sensitive, and a 0.2 mm systematic error produced a 0.7% dose deviation on average. The MLC random errors did not affect the dose error. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. From the viewpoint of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED)
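
    A hedged sketch of the secondary-check idea follows: compare planned and logged MLC positions and, as in the error-detection test above, inject systematic or random offsets. Parsing of Varian trajectory log files and resampling onto a common control-point grid are not shown, and all names are assumptions.

```python
import numpy as np

def mlc_deviation(plan_positions, log_positions):
    """Compare planned and delivered MLC positions.

    plan_positions, log_positions : arrays of shape (control_points, leaves)
    in mm, already resampled onto the same control-point grid.
    Returns the maximum and root-mean-square absolute deviation in mm.
    """
    diff = np.abs(np.asarray(log_positions, dtype=float) -
                  np.asarray(plan_positions, dtype=float))
    return float(diff.max()), float(np.sqrt(np.mean(diff ** 2)))

def inject_errors(log_positions, systematic_mm=0.0, random_mm=0.0, seed=0):
    """Return a copy of the delivered positions with added systematic and
    random offsets, mimicking the error-detection test described above."""
    rng = np.random.default_rng(seed)
    noisy = np.asarray(log_positions, dtype=float) + systematic_mm
    if random_mm:
        noisy = noisy + rng.normal(0.0, random_mm, size=noisy.shape)
    return noisy
```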

  4. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    International Nuclear Information System (INIS)

    Lips, Irene M; Dehnad, Homan; Gils, Carla H van; Boeken Kruger, Arto E; Heide, Uulke A van der; Vulpen, Marco van

    2008-01-01

    We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with a severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation if the same dose constraints for the organs at risk will be used

  5. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    Directory of Open Access Journals (Sweden)

    Boeken Kruger Arto E

    2008-05-01

    Full Text Available Abstract We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with a severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation if the same dose constraints for the organs at risk will be used.

  6. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. At the same time, while it improves avionics system integration, it increases the complexity of system testing, so the method of IMA system testing needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, it is difficult to isolate failures in an IMA system. Therefore, the critical problem that IMA system verification faces is how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily test the whole system, but for a complex one it is hard to test a huge, integrated avionics system completely. This paper therefore proposes using compositional verification theory in IMA system testing, reducing the testing process and improving efficiency, and consequently economizing the costs of IMA system integration.

  7. 4D offline PET-based treatment verification in ion beam therapy. Experimental and clinical evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kurz, Christopher

    2014-06-12

    Due to the accessible sharp dose gradients, external beam radiotherapy with protons and heavier ions enables a highly conformal adaptation of the delivered dose to arbitrarily shaped tumour volumes. However, this high conformity is accompanied by an increased sensitivity to potential uncertainties, e.g., due to changes in the patient anatomy. Additional challenges are imposed by respiratory motion, which not only leads to rapid changes of the patient anatomy, but, in the case of actively scanned ion beams, also to the formation of dose inhomogeneities. Therefore, it is highly desirable to verify the actual application of the treatment and to detect possible deviations with respect to the planned irradiation. At present, the only clinically implemented approach for a close-in-time verification of single treatment fractions is based on detecting the distribution of β⁺-emitters formed in nuclear fragmentation reactions during the irradiation by means of positron emission tomography (PET). For this purpose, a commercial PET/CT (computed tomography) scanner has been installed directly next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). Up to the present, the application of this treatment verification technique is, however, still limited to static target volumes. This thesis aimed at investigating the feasibility and performance of PET-based treatment verification under consideration of organ motion. In experimental irradiation studies with moving phantoms, not only the practicability of PET-based treatment monitoring for moving targets, using a commercial PET/CT device, could be shown for the first time, but also the potential of this technique to detect motion-related deviations from the planned treatment with sub-millimetre accuracy. The first application to four exemplary hepato-cellular carcinoma patient cases under substantially more challenging clinical conditions indicated potential for improvement by taking organ motion into

  8. 4D offline PET-based treatment verification in ion beam therapy. Experimental and clinical evaluation

    International Nuclear Information System (INIS)

    Kurz, Christopher

    2014-01-01

    Due to the accessible sharp dose gradients, external beam radiotherapy with protons and heavier ions enables a highly conformal adaptation of the delivered dose to arbitrarily shaped tumour volumes. However, this high conformity is accompanied by an increased sensitivity to potential uncertainties, e.g., due to changes in the patient anatomy. Additional challenges are imposed by respiratory motion, which not only leads to rapid changes of the patient anatomy, but, in the case of actively scanned ion beams, also to the formation of dose inhomogeneities. Therefore, it is highly desirable to verify the actual application of the treatment and to detect possible deviations with respect to the planned irradiation. At present, the only clinically implemented approach for a close-in-time verification of single treatment fractions is based on detecting the distribution of β⁺-emitters formed in nuclear fragmentation reactions during the irradiation by means of positron emission tomography (PET). For this purpose, a commercial PET/CT (computed tomography) scanner has been installed directly next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). Up to the present, the application of this treatment verification technique is, however, still limited to static target volumes. This thesis aimed at investigating the feasibility and performance of PET-based treatment verification under consideration of organ motion. In experimental irradiation studies with moving phantoms, not only the practicability of PET-based treatment monitoring for moving targets, using a commercial PET/CT device, could be shown for the first time, but also the potential of this technique to detect motion-related deviations from the planned treatment with sub-millimetre accuracy. The first application to four exemplary hepato-cellular carcinoma patient cases under substantially more challenging clinical conditions indicated potential for improvement by taking organ motion into

  9. SU-C-207A-04: Accuracy of Acoustic-Based Proton Range Verification in Water

    International Nuclear Information System (INIS)

    Jones, KC; Sehgal, CM; Avery, S; Vander Stappen, F

    2016-01-01

    Purpose: To determine the accuracy and dose required for acoustic-based proton range verification (protoacoustics) in water. Methods: Proton pulses with 17 µs FWHM and instantaneous currents of 480 nA (5.6 × 10⁷ protons/pulse, 8.9 cGy/pulse) were generated by a clinical, hospital-based cyclotron at the University of Pennsylvania. The protoacoustic signal generated in a water phantom by the 190 MeV proton pulses was measured with a hydrophone placed at multiple known positions surrounding the dose deposition. The background random noise was measured. The protoacoustic signal was simulated to compare to the experiments. Results: The maximum protoacoustic signal amplitude at 5 cm distance was 5.2 mPa per 1 × 10⁷ protons (1.6 cGy at the Bragg peak). The background random noise of the measurement was 27 mPa. Comparison between simulation and experiment indicates that the hydrophone introduced a delay of 2.4 µs. For acoustic data collected with a signal-to-noise ratio (SNR) of 21, deconvolution of the protoacoustic signal with the proton pulse provided the most precise time-of-flight range measurement (standard deviation of 2.0 mm), but a systematic error (−4.5 mm) was observed. Conclusion: Based on water phantom measurements at a clinical hospital-based cyclotron, protoacoustics is a potential technique for measuring the proton Bragg peak range with 2.0 mm standard deviation. Simultaneous use of multiple detectors is expected to reduce the standard deviation, but calibration is required to remove systematic error. Based on the measured background noise and protoacoustic amplitude, an SNR of 5.3 is projected for a deposited dose of 2 Gy.
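
    The time-of-flight range estimate reduces to multiplying the (delay-corrected) acoustic arrival time by the speed of sound in water. The sketch below assumes a nominal sound speed and uses the 2.4 µs hydrophone delay reported above; the deconvolution of the signal with the proton pulse is not shown.

```python
SPEED_OF_SOUND_WATER = 1.48  # mm/us, approximate nominal value near room temperature

def bragg_peak_distance(time_of_flight_us, detector_delay_us=2.4):
    """Distance from the hydrophone to the acoustic source (Bragg peak region),
    estimated from the measured arrival time in microseconds."""
    return (time_of_flight_us - detector_delay_us) * SPEED_OF_SOUND_WATER

# Example: a 36.2 us arrival time corresponds to roughly 50 mm
print(round(bragg_peak_distance(36.2), 1))
```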

  10. WE-DE-201-11: Sensitivity and Specificity of Verification Methods Based On Total Reference Air Kerma (TRAK) Or On User Provided Dose Points for Graphically Planned Skin HDR Brachytherapy

    International Nuclear Information System (INIS)

    Damato, A; Devlin, P; Bhagwat, M; Buzurovic, I; Hansen, J; O’Farrell, D; Cormack, R

    2016-01-01

    Purpose: To investigate the sensitivity and specificity of a novel verification methodology for image-guided skin HDR brachytherapy plans using a TRAK-based reasonableness test, compared to a typical manual verification methodology. Methods: Two methodologies were used to flag treatment plans necessitating additional review due to a potential discrepancy of 3 mm between planned dose and clinical target in the skin. Manual verification was used to calculate the discrepancy between the average dose to points positioned at time of planning representative of the prescribed depth and the expected prescription dose. Automatic verification was used to calculate the discrepancy between TRAK of the clinical plan and its expected value, which was calculated using standard plans with varying curvatures, ranging from flat to cylindrically circumferential. A plan was flagged if a discrepancy >10% was observed. Sensitivity and specificity were calculated using as a criterion for a true positive that >10% of plan dwells had a distance to prescription dose >1 mm different than prescription depth (3 mm + size of applicator). All HDR image-based skin brachytherapy plans treated at our institution in 2013 were analyzed. Results: 108 surface applicator plans to treat skin of the face, scalp, limbs, feet, hands or abdomen were analyzed. Median number of catheters was 19 (range, 4 to 71) and median number of dwells was 257 (range, 20 to 1100). Sensitivity/specificity were 57%/78% for manual and 70%/89% for automatic verification. Conclusion: A check based on expected TRAK value is feasible for irregularly shaped, image-guided skin HDR brachytherapy. This test yielded higher sensitivity and specificity than a test based on the identification of representative points, and can be implemented with a dedicated calculation code or with pre-calculated lookup tables of ideally shaped, uniform surface applicators.
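
    A minimal sketch of the automatic reasonableness test described here: compute the total reference air kerma (TRAK) of the clinical plan from the source air-kerma strength and the dwell times, and flag the plan when it deviates from the expected value by more than the 10% tolerance. Units, function names, and the way the expected value is obtained are assumptions.

```python
def trak_cGy_cm2(air_kerma_strength_cGy_cm2_per_h, dwell_times_s):
    """Total reference air kerma: source air-kerma strength times total dwell time."""
    total_hours = sum(dwell_times_s) / 3600.0
    return air_kerma_strength_cGy_cm2_per_h * total_hours

def flag_plan(clinical_trak, expected_trak, tolerance=0.10):
    """Flag the plan for additional review if the relative TRAK discrepancy
    exceeds the tolerance (10% in the record above)."""
    return abs(clinical_trak - expected_trak) / expected_trak > tolerance
```

    The expected TRAK would come from pre-calculated standard plans (or lookup tables) of ideally shaped surface applicators, as the conclusion above suggests.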

  11. WE-DE-201-11: Sensitivity and Specificity of Verification Methods Based On Total Reference Air Kerma (TRAK) Or On User Provided Dose Points for Graphically Planned Skin HDR Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Damato, A; Devlin, P; Bhagwat, M; Buzurovic, I; Hansen, J; O’Farrell, D; Cormack, R [Harvard Medical School, Boston, MA (United States)

    2016-06-15

    Purpose: To investigate the sensitivity and specificity of a novel verification methodology for image-guided skin HDR brachytherapy plans using a TRAK-based reasonableness test, compared to a typical manual verification methodology. Methods: Two methodologies were used to flag treatment plans necessitating additional review due to a potential discrepancy of 3 mm between planned dose and clinical target in the skin. Manual verification was used to calculate the discrepancy between the average dose to points positioned at time of planning representative of the prescribed depth and the expected prescription dose. Automatic verification was used to calculate the discrepancy between TRAK of the clinical plan and its expected value, which was calculated using standard plans with varying curvatures, ranging from flat to cylindrically circumferential. A plan was flagged if a discrepancy >10% was observed. Sensitivity and specificity were calculated using as a criterion for a true positive that >10% of plan dwells had a distance to prescription dose >1 mm different than prescription depth (3 mm + size of applicator). All HDR image-based skin brachytherapy plans treated at our institution in 2013 were analyzed. Results: 108 surface applicator plans to treat skin of the face, scalp, limbs, feet, hands or abdomen were analyzed. Median number of catheters was 19 (range, 4 to 71) and median number of dwells was 257 (range, 20 to 1100). Sensitivity/specificity were 57%/78% for manual and 70%/89% for automatic verification. Conclusion: A check based on expected TRAK value is feasible for irregularly shaped, image-guided skin HDR brachytherapy. This test yielded higher sensitivity and specificity than a test based on the identification of representative points, and can be implemented with a dedicated calculation code or with pre-calculated lookup tables of ideally shaped, uniform surface applicators.

  12. DEVELOPMENT OF ENRICHMENT VERIFICATION ASSAY BASED ON THE AGE AND 235U AND 238U ACTIVITIES OF THE SAMPLES

    International Nuclear Information System (INIS)

    AL-YAMAHI, H.; EL-MONGY, S.A.

    2008-01-01

    Development of enrichment verification methods is the backbone of the nuclear materials safeguards skeleton. In this study, the 235U percentage of depleted, natural and very slightly enriched uranium samples was estimated based on the sample age and the measured activities of 235U and 238U. HpGe and NaI spectrometry were used for the sample assay. An equation was derived to correlate the sample age and the 235U and 238U activities with the enrichment percentage (E%). The E% values calculated by the deduced equation and the target E% values were found to be similar, within a 0.58–1.75% bias in the case of the HpGe measurements. The correlation between them was found to be very sharp. The activity was also calculated based on the measured sample count rate and the efficiency at the gamma energies of interest. The correlation between the E% and the 235U activity was estimated and found to be linearly sharp. The results obtained by NaI were found to be less accurate than those obtained by HpGe. The bias in the case of the NaI assay was in the range from 6.398% to 22.8% for E% verification
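
    For illustration only, enrichment can be estimated from the measured isotope activities via their specific activities, neglecting 234U. This is a textbook-style sketch, not the age-dependent equation derived in the record above, and the specific-activity constants are approximate.

```python
# Approximate specific activities (Bq per gram)
SA_U235 = 8.0e4   # ~80 kBq/g for 235U
SA_U238 = 1.24e4  # ~12.4 kBq/g for 238U

def enrichment_percent(a235_bq, a238_bq):
    """235U mass fraction (in percent) from measured isotope activities,
    neglecting the 234U contribution."""
    m235 = a235_bq / SA_U235
    m238 = a238_bq / SA_U238
    return 100.0 * m235 / (m235 + m238)

# Natural uranium check: an activity ratio A235/A238 of ~0.046 gives ~0.71 %
print(round(enrichment_percent(0.046, 1.0), 2))
```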

  13. Operational verification of a blow out preventer utilizing fiber Bragg grating based strain gauges

    Science.gov (United States)

    Turner, Alan L.; Loustau, Philippe; Thibodeau, Dan

    2015-05-01

    Ultra-deep water BOP (Blowout Preventer) operation poses numerous challenges in obtaining accurate knowledge of current system integrity and component condition- a salient example is the difficulty of verifying closure of the pipe and shearing rams during and after well control events. Ascertaining the integrity of these functions is currently based on a manual volume measurement performed with a stop watch. Advances in sensor technology now permit more accurate methods of BOP condition monitoring. Fiber optic sensing technology and particularly fiber optic strain gauges have evolved to a point where we can derive a good representation of what is happening inside a BOP by installing sensors on the outside shell. Function signatures can be baselined to establish thresholds that indicate successful function activation. Based on this knowledge base, signal variation over time can then be utilized to assess degradation of these functions and subsequent failure to function. Monitoring the BOP from the outside has the advantage of gathering data through a system that can be interfaced with risk based integrity management software and/or a smart monitoring system that analyzes BOP control redundancies without the requirement of interfacing with OEM control systems. The paper will present the results of ongoing work on a fully instrumented 13-½" 10,000 psi pipe ram. Instrumentation includes commonly used pressure transducers, accelerometers, flow meters, and optical strain gauges. Correlation will be presented between flow, pressure, acceleration signatures and the fiber optic strain gauge's response as it relates to functional verification and component level degradation trending.

  14. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    International Nuclear Information System (INIS)

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I; Mans, A; Mijnheer, B; Herk, M van; Gonzalez, P

    2015-01-01

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼ 2.5 fps). After a portal image is acquired, the software seeks for “hot spots” in the reconstructed 3D dose distribution. A hot spot is in this study defined as a 4 cm³ cube where the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated resulting in a linac halt. The software has been tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf position error or a large monitor unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot spot detection, takes about 220 ms on a dual hexacore Intel Xeon 25 X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments

  15. Accelerating SystemVerilog UVM Based VIP to Improve Methodology for Verification of Image Signal Processing Designs Using HW Emulator

    OpenAIRE

    Jain, Abhishek; Gupta, Piyush Kumar; Gupta, Dr. Hima; Dhar, Sachish

    2014-01-01

    In this paper we present the development of Acceleratable UVCs from standard UVCs in SystemVerilog and their usage in UVM based Verification Environment of Image Signal Processing designs to increase run time performance. This paper covers development of Acceleratable UVCs from standard UVCs for internal control and data buses of ST imaging group by partitioning of transaction-level components and cycle-accurate signal-level components between the software simulator and hardware accelerator r...

  16. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype of a robust biometric system for verification. The system uses features of the human hand extracted with the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images. The system is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
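
    A hedged sketch of descriptor-based palmprint matching with OpenCV follows. Because SURF is only available in the non-free opencv-contrib build, ORB is used here as a stand-in; the match-count threshold and the ratio-test constant are assumptions, not values from the paper.

```python
import cv2

def palmprint_match(img_path_a, img_path_b, min_good_matches=40):
    """Decide whether two palmprint images belong to the same hand by counting
    ratio-test-filtered descriptor matches (ORB as a stand-in for SURF)."""
    img_a = cv2.imread(img_path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(img_path_b, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des_a, des_b, k=2)

    # Lowe's ratio test keeps only distinctive matches
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    return len(good) >= min_good_matches
```

    In practice the decision threshold would be tuned on a development set to reach error rates of the order quoted above (FAR/FRR well below 1%).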

  17. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  18. Simple thermal to thermal face verification method based on local texture descriptors

    Science.gov (United States)

    Grudzien, A.; Palka, Norbert; Kowalski, M.

    2017-08-01

    Biometrics is a science that studies and analyzes the physical structure of the human body and the behaviour of people. Biometrics has found many applications, ranging from border control systems and forensic systems for criminal investigations to systems for access control. Unique identifiers, also referred to as modalities, are used to distinguish individuals. One of the most common and natural human identifiers is the face. As a result of decades of investigation, face recognition has achieved a high level of maturity; however, recognition in the visible spectrum is still challenging due to illumination aspects or new ways of spoofing. One of the alternatives is recognition of the face in different parts of the light spectrum, e.g. in the infrared spectrum. Thermal infrared offers new possibilities for human recognition due to its specific properties as well as mature equipment. In this paper we present a scheme for subject verification using facial images in the thermal range. The study is focused on local feature extraction methods and on similarity metrics. We present a comparison of two local texture-based descriptors for thermal 1-to-1 face recognition.
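
    As a hedged example of a local texture descriptor used for 1-to-1 verification, the sketch below compares uniform LBP histograms of two thermal face images with a chi-square distance. The descriptor parameters and the acceptance threshold are assumptions, and LBP is only one representative of the descriptor family this study compares.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(image, points=8, radius=1):
    """Uniform LBP histogram of a grayscale thermal face image (2D array)."""
    lbp = local_binary_pattern(image, points, radius, method="uniform")
    n_bins = points + 2                          # uniform patterns plus the 'other' bin
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

def verify_face(probe, gallery, threshold=0.25):
    """1-to-1 verification: accept if the chi-square distance between the
    two LBP histograms is below the (assumed) threshold."""
    h1, h2 = lbp_histogram(probe), lbp_histogram(gallery)
    chi2 = 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + 1e-10))
    return chi2 < threshold
```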

  19. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical issue aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
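
    As a concrete illustration of the solution-verification step, the short Python sketch below computes the observed order of accuracy from results on three systematically refined grids, together with a Richardson-extrapolated estimate of the exact value and of the fine-grid error. This is the generic textbook formulation rather than code from GBS, and the sample numbers are placeholders.

      # Hedged sketch: observed order of accuracy and Richardson extrapolation
      # from a scalar quantity computed on three grids refined by a constant ratio r.
      import math

      def observed_order(f_coarse, f_medium, f_fine, r=2.0):
          """Observed order p of the discretization from three grid levels."""
          return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

      def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
          """Zero-spacing estimate and the implied numerical error on the fine grid."""
          f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
          return f_exact, abs(f_exact - f_fine)

      p = observed_order(1.1050, 1.0512, 1.0253, r=2.0)      # ~1.05, i.e. first order
      f_exact, fine_grid_error = richardson_extrapolate(1.0512, 1.0253, p)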

  20. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  1. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminative auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
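
    For readers unfamiliar with the underlying operation, the Python sketch below performs a plain numerical phase-only correlation between a lock pattern and a key pattern and thresholds the correlation peak. This is the basic building block that the dual (two-step) scheme above extends, not the authors' nonlinear encoding, and the threshold value is an illustrative assumption.

      # Hedged sketch: numerical phase-only correlation (POC) for verification.
      import numpy as np

      def phase_only_correlation(lock, key):
          """Correlate two 2D arrays using only the phase of their spectra."""
          cross = np.fft.fft2(lock) * np.conj(np.fft.fft2(key))
          cross /= np.abs(cross) + 1e-12        # keep the phase, discard the magnitude
          return np.real(np.fft.ifft2(cross))

      def verify(lock, key, threshold=0.1):
          """Accept the key only if the correlation peak exceeds the preset threshold."""
          peak = float(phase_only_correlation(lock, key).max())
          return peak >= threshold, peak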

  2. Consent Based Verification System (CBSV)

    Data.gov (United States)

    Social Security Administration — CBSV is a fee-based service offered by SSA's Business Services Online (BSO). It is used by private companies to verify the SSNs of their customers and clients that...

  3. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  4. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  5. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    International Nuclear Information System (INIS)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-01-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. Odometry is widely used as a method based on internal sensors, but it suffers from cumulative errors. In methods using a laser range sensor (LRS), which is a kind of external sensor, the estimation accuracy is affected by the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization to integrate the LRS measurement data and the odometry information, where their relative weightings are balanced adaptively according to the number of available LRS measurement data. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10 scale vehicle. The verification is conducted in situations where the vehicle position cannot be localized uniquely along a certain direction using the LRS measurement data only. We achieve accurate localization even in such a situation by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF). (paper)
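
    To make the adaptive weighting idea concrete, the Python sketch below sets up a toy one-dimensional moving-horizon problem: odometry increments and LRS position fixes enter a weighted least-squares cost, and the weight of each LRS term grows with the number of scan points available at that step. The vehicle model, the weighting rule and all constants are illustrative assumptions rather than the authors' formulation.

      # Hedged 1D sketch of MHE-style fusion of odometry increments and LRS fixes.
      import numpy as np

      def mhe_1d(x0_prior, odom_increments, lrs_positions, lrs_counts,
                 w_prior=1.0, w_odom=1.0, w_lrs_max=10.0):
          """Weighted least-squares estimate of positions x_0..x_N over the horizon."""
          N = len(odom_increments)
          rows, rhs, weights = [], [], []
          row = np.zeros(N + 1); row[0] = 1.0            # arrival-cost prior on x_0
          rows.append(row); rhs.append(x0_prior); weights.append(w_prior)
          for k, d in enumerate(odom_increments):        # odometry: x_{k+1} - x_k ≈ d_k
              row = np.zeros(N + 1); row[k + 1], row[k] = 1.0, -1.0
              rows.append(row); rhs.append(d); weights.append(w_odom)
          for k, (z, n) in enumerate(zip(lrs_positions, lrs_counts)):
              if z is None:                              # no usable scan at this step
                  continue
              row = np.zeros(N + 1); row[k] = 1.0        # LRS fix: x_k ≈ z_k
              rows.append(row); rhs.append(z)
              weights.append(w_lrs_max * n / (n + 100.0))  # weight grows with point count
          w = np.sqrt(np.array(weights))[:, None]
          x, *_ = np.linalg.lstsq(np.array(rows) * w, np.array(rhs) * w[:, 0], rcond=None)
          return x                                       # smoothed horizon estimate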

  6. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
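
    The dose/distance-to-agreement criterion recommended above is commonly evaluated as a gamma index. The Python sketch below computes a one-dimensional gamma distribution for a measured profile against a planned profile under a 3%/3 mm criterion with global normalisation; it is a minimal illustration and not the analysis software used in this study.

      # Hedged sketch: 1D gamma index with a dose-difference / distance-to-agreement
      # criterion (e.g. 3%/3 mm); gamma <= 1 at a point means that point passes.
      import numpy as np

      def gamma_1d(positions_mm, measured, planned, dose_tol=0.03, dist_tol_mm=3.0):
          """Gamma value at each measured point against the planned profile."""
          d_ref = planned.max()                          # global dose normalisation
          gammas = np.empty(len(measured), dtype=float)
          for i, (x_m, d_m) in enumerate(zip(positions_mm, measured)):
              dose_term = (d_m - planned) / (dose_tol * d_ref)
              dist_term = (x_m - positions_mm) / dist_tol_mm
              gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
          return gammas

      # passing_rate = 100.0 * (gamma_1d(x_mm, measured, planned) <= 1.0).mean()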

  7. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving trace anonymity, we employ a computer-assisted verification tool based on a theorem prover.

  8. Biometric Subject Verification Based on Electrocardiographic Signals

    Science.gov (United States)

    Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)

    2014-01-01

    A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.
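
    The two-phase scheme above reduces to building a composite beat per person and scoring it against another composite beat with a graph-comparison metric. The Python sketch below uses normalised cross-correlation of R-peak-aligned, equal-length heart cycles as one plausible metric; the patent does not single out this particular metric, and the acceptance threshold is an illustrative assumption.

      # Hedged sketch: composite PQRST template construction and comparison.
      import numpy as np

      def composite_beat(beats):
          """Average a set of equal-length, R-peak-aligned heart cycles."""
          return np.mean(np.asarray(beats, dtype=float), axis=0)

      def similarity(reference, candidate):
          """Normalised cross-correlation in [-1, 1]; 1 means identical shape."""
          r = reference - reference.mean()
          c = candidate - candidate.mean()
          return float(np.dot(r, c) / (np.linalg.norm(r) * np.linalg.norm(c) + 1e-12))

      def verify(enrolled_beats, candidate_beats, threshold=0.95):
          """Accept the asserted identity if the composite beats are similar enough."""
          return similarity(composite_beat(enrolled_beats),
                            composite_beat(candidate_beats)) >= threshold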

  9. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...

  10. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition. The operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  11. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in distributed control systems. To improve the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, in order to solve the state space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains online, given the real-time property of the train control system. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters to avoid collision between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both the normal and the abnormal condition influenced by the packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time feature of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic time Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor the high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.

  12. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  13. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    International Nuclear Information System (INIS)

    Lee, Se Ho; Lee, Seung Wook; Han, Su Chul; Park, Seung Woo

    2016-01-01

    Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (object500 connex3, Stratasys, USA). Elemental inks used as the 3D printer materials were selected to correspond to mouse tissue. To represent the lung, the selected material was partially used with an air layer. In order to verify material equivalence, a super-flex bolus was simply compared in terms of photon attenuation characteristics. In the case of the lung, the Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom by using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as similar geometry to a live mouse. As 3D printing techniques continue to advance, 3D printer based small preclinical animal phantoms would increase the reliability of absorbed dose verification in small animals for preclinical studies.

  14. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

    Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (object500 connex3, Stratasys, USA). Elemental inks used as the 3D printer materials were selected to correspond to mouse tissue. To represent the lung, the selected material was partially used with an air layer. In order to verify material equivalence, a super-flex bolus was simply compared in terms of photon attenuation characteristics. In the case of the lung, the Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom by using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as similar geometry to a live mouse. As 3D printing techniques continue to advance, 3D printer based small preclinical animal phantoms would increase the reliability of absorbed dose verification in small animals for preclinical studies.

  15. Component-Based Development of Runtime Observers in the COMDES Framework

    DEFF Research Database (Denmark)

    Guan, Wei; Li, Gang; Angelov, Christo K.

    2013-01-01

    Formal verification methods, such as exhaustive model checking, are often infeasible because of high computational complexity. Runtime observers (monitors) provide an alternative, light-weight verification method, which offers a non-exhaustive but still feasible approach to monitoring system behavior against formally specified properties. This paper presents a component-based design method for runtime observers in the context of the COMDES framework—a component-based framework for distributed embedded systems and its supporting tools. Therefore, runtime verification is facilitated by model...

  16. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 1: Interim report

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study, that is, it demonstrates the feasibility of applying formal analysis via application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are tantamount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog-based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria For Protection Systems For Nuclear Power Generating Stations (ANSI N42.7-1972).

  17. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    Science.gov (United States)

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems of complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head and neck clinical cases, previously planned with the Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatching between calculated control points and the detection grid in the verification process were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom to obtain a more reliable DVH in the patient CT was also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre

  18. SAT-Based Software Certification

    National Research Council Canada - National Science Library

    Chaki, Sagar

    2006-01-01

    ... predicate abstraction and validated by generating and proving verification conditions. In addition, the first part of the report proposes the use of theorem provers based on Boolean propositional satisfiability (SAT...

  19. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    International Nuclear Information System (INIS)

    Azmy, Yousry; Wang, Yaqi

    2013-01-01

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code's numerical solution and its reference counterpart. The latter is either analytic or very fine-mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory's Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.

  20. Biometrics based authentication scheme for session initiation protocol

    OpenAIRE

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to smart card stolen attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, password and smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when co...

  1. Wu’s Characteristic Set Method for SystemVerilog Assertions Verification

    Directory of Open Access Journals (Sweden)

    Xinyan Gao

    2013-01-01

    Full Text Available We propose a verification solution based on characteristic set of Wu’s method towards SystemVerilog assertion checking over digital circuit systems. We define a suitable subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on the algebraic representations using characteristic set of polynomial system. This symbolic algebraic approach is a useful supplement to the existent verification methods based on simulation.

  2. Grip-Pattern Verification for Smart Gun Based on Maximum-Pairwise Comparison and Mean-Template Comparison

    NARCIS (Netherlands)

    Shang, X.; Veldhuis, Raymond N.J.

    2008-01-01

    In our biometric verification system for a smart gun, the rightful user of the gun is authenticated by grip-pattern recognition. In this work, verification is done using two types of comparison methods. One is mean-template comparison, where the matching score between a test image and

  3. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  4. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  5. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2011-09-28

    ... protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to the business community in a secure manner, CBSV provides us with cost and workload management benefits. New Information: To use CBSV, interested parties must pay a one- time non-refundable...

  6. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  7. Biometrics based authentication scheme for session initiation protocol.

    Science.gov (United States)

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to smart card stolen attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, password and smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared to other related protocols.

  8. A new approach to hand-based authentication

    Science.gov (United States)

    Amayeh, G.; Bebis, G.; Erol, A.; Nicolescu, M.

    2007-04-01

    Hand-based authentication is a key biometric technology with a wide range of potential applications both in industry and government. Traditionally, hand-based authentication is performed by extracting information from the whole hand. To account for hand and finger motion, guidance pegs are employed to fix the position and orientation of the hand. In this paper, we consider a component-based approach to hand-based verification. Our objective is to investigate the discrimination power of different parts of the hand in order to develop a simpler, faster, and possibly more accurate and robust verification system. Specifically, we propose a new approach which decomposes the hand into different regions, corresponding to the fingers and the back of the palm, and performs verification using information from certain parts of the hand only. Our approach operates on 2D images acquired by placing the hand on a flat lighting table. Using a part-based representation of the hand allows the system to compensate for hand and finger motion without using any guidance pegs. To decompose the hand into different regions, we use a robust methodology based on morphological operators which does not require detecting any landmark points on the hand. To capture the geometry of the back of the palm and the fingers in sufficient detail, we employ high-order Zernike moments which are computed using an efficient methodology. The proposed approach has been evaluated on a database of 100 subjects with 10 images per subject, illustrating promising performance. Comparisons with related approaches using the whole hand for verification illustrate the superiority of the proposed approach. Moreover, qualitative comparisons with state-of-the-art approaches indicate that the proposed approach has comparable or better performance.

  9. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on the system, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly system based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for the drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by ''smart'' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings.

  10. Method for secure electronic voting system: face recognition based approach

    Science.gov (United States)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used for face feature characterization in texture format, followed by the chi-square statistic for image classification. Two parallel systems, based on smartphone and web applications, are developed for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification. A class-specific threshold is associated with face verification to control its security level. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
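
    A minimal version of this pipeline can be written in a few lines: block-wise LBP histograms as the texture feature and a chi-square distance for the 1-to-1 comparison, which is one common reading of the classification step described above. In the Python sketch below, the grid size, the decision threshold and the helper names are illustrative assumptions, not the authors' implementation, and face detection and the smartphone/web front ends are not shown.

      # Hedged sketch: LBP histogram features + chi-square comparison for verification.
      import numpy as np

      def lbp_image(gray):
          """8-neighbour local binary pattern codes for a 2D grayscale image."""
          g = gray.astype(np.int32)
          c = g[1:-1, 1:-1]
          shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
          codes = np.zeros_like(c)
          for bit, (dy, dx) in enumerate(shifts):
              nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
              codes |= (nb >= c).astype(np.int32) << bit
          return codes

      def lbp_histogram(gray, grid=(8, 8)):
          """Concatenated, normalised LBP histograms over a grid of image blocks."""
          codes = lbp_image(gray)
          h, w = codes.shape
          feats = []
          for by in range(grid[0]):
              for bx in range(grid[1]):
                  block = codes[by * h // grid[0]:(by + 1) * h // grid[0],
                                bx * w // grid[1]:(bx + 1) * w // grid[1]]
                  hist, _ = np.histogram(block, bins=256, range=(0, 256))
                  feats.append(hist / (hist.sum() + 1e-12))
          return np.concatenate(feats)

      def verify(enrolled_gray, probe_gray, threshold=2.5):
          """Chi-square distance between LBP features; small distance => same person."""
          h1, h2 = lbp_histogram(enrolled_gray), lbp_histogram(probe_gray)
          dist = float(np.sum((h1 - h2) ** 2 / (h1 + h2 + 1e-12)))
          return dist <= threshold, dist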

  11. Development of digital device based work verification system for cooperation between main control room operators and field workers in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min, E-mail: jewellee@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Lee, Hyun Chul, E-mail: leehc@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Ha, Jun Su, E-mail: junsu.ha@kustar.ac.ae [Department of Nuclear Engineering, Khalifa University of Science Technology and Research, Abu Dhabi P.O. Box 127788 (United Arab Emirates); Seong, Poong Hyun, E-mail: phseong@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)

    2016-10-15

    Highlights: • A digital device-based work verification and cooperation support system was developed. • Requirements were derived by interviewing field operators having experiences with mobile-based work support systems. • The usability of the proposed system was validated by conducting questionnaire surveys. • The proposed system will be useful if the manual or the set of guidelines is well constructed. - Abstract: Digital technologies have been applied in the nuclear field to check task results, monitor events and accidents, and transmit/receive data. The results of using digital devices have proven that these devices can provide high accuracy and convenience for workers, allowing them to obtain obvious positive effects by reducing their workloads. In this study, as one step forward, a digital device-based cooperation support system, the nuclear cooperation support and mobile documentation system (Nu-COSMOS), is proposed to support communication between main control room (MCR) operators and field workers by verifying field workers’ work results in nuclear power plants (NPPs). The proposed system consists of a mobile based information storage system to support field workers by providing various functions to make workers more trusted by MCR operators; also to improve the efficiency of meeting, and a large screen based information sharing system supports meetings by allowing both sides to share one medium. The usability of this system was estimated by interviewing field operators working in nuclear power plants and experts who have experience working as operators. A survey to estimate the usability of the suggested system and the suitability of the functions of the system for field working was conducted for 35 subjects who have experience in field works or with support system development-related research. The usability test was conducted using the system usability scale (SUS), which is widely used in industrial usability evaluation. Using questionnaires

  12. Development of digital device based work verification system for cooperation between main control room operators and field workers in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Lee, Hyun Chul; Ha, Jun Su; Seong, Poong Hyun

    2016-01-01

    Highlights: • A digital device-based work verification and cooperation support system was developed. • Requirements were derived by interviewing field operators having experiences with mobile-based work support systems. • The usability of the proposed system was validated by conducting questionnaire surveys. • The proposed system will be useful if the manual or the set of guidelines is well constructed. - Abstract: Digital technologies have been applied in the nuclear field to check task results, monitor events and accidents, and transmit/receive data. The results of using digital devices have proven that these devices can provide high accuracy and convenience for workers, allowing them to obtain obvious positive effects by reducing their workloads. In this study, as one step forward, a digital device-based cooperation support system, the nuclear cooperation support and mobile documentation system (Nu-COSMOS), is proposed to support communication between main control room (MCR) operators and field workers by verifying field workers’ work results in nuclear power plants (NPPs). The proposed system consists of a mobile based information storage system to support field workers by providing various functions to make workers more trusted by MCR operators; also to improve the efficiency of meeting, and a large screen based information sharing system supports meetings by allowing both sides to share one medium. The usability of this system was estimated by interviewing field operators working in nuclear power plants and experts who have experience working as operators. A survey to estimate the usability of the suggested system and the suitability of the functions of the system for field working was conducted for 35 subjects who have experience in field works or with support system development-related research. The usability test was conducted using the system usability scale (SUS), which is widely used in industrial usability evaluation. Using questionnaires

  13. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer based system). These software modules can be either own-design systems or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan.

  14. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others]

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  15. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    International Nuclear Information System (INIS)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.

    1996-01-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising

  16. Geothermal-resource verification for Air Force bases

    Energy Technology Data Exchange (ETDEWEB)

    Grant, P.R. Jr.

    1981-06-01

    This report summarizes the various types of geothermal energy, reviews some legal uncertainties of the resource, and then describes a methodology to evaluate geothermal resources for applications at US Air Force bases. Estimates suggest that exploration costs will be $50,000 to $300,000, which, if favorable, would lead to drilling a $500,000 exploration well. Successful identification and development of a geothermal resource could meet all of the base's fixed-system needs with an inexpensive, renewable energy source.

  17. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the US. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  18. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  19. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and less contamination-sensitive characteristics. In this paper, we aim to provide a convenient design verification method for SOVs to design engineers who depend on their experience and experiments during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Secondly, we suggest a verification method for this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured based on this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe that this verification process is novel and useful for saving time and expense during the development of SOVs, because verification tests with manufactured specimens may be partly substituted by this verification methodology.

  20. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    Science.gov (United States)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: a) use of a scenario-based neodeterministic approach (NDSHA) for the calculation of the seismic input, and b) control of the numerical modeling of an existing building, using free-vibration measurements of the real structure. The key point of this approach is the strict collaboration between the seismologist and the civil engineer, from the definition of the seismic input to the monitoring of the response of the building in the calculation phase. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then, the realistic values of spectral acceleration, which include the appropriate amplification obtained through the modeling of a "scenario" input to be applied to the final model, can be selected. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced by taking the spectra from the national codes (i.e. NTC 2008, for Italy). The task of the verifying engineer is to act so that the outcome of the verification is conservative and realistic. We show some examples of the application of the procedure to some relevant buildings (e.g. schools) of the Trieste Province. The adoption of the scenario input has, in most cases, increased the number of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the increase of elements to reinforce is reasonable, especially considering the important reduction of the risk level.

  1. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness test verification of measuring software has been a thorny issue hindering the development of Gear Measuring Instruments (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole-process simulation of workpiece measurement is implemented by the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking the involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master on a GMI. The experimental results indicate consistency of the tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing complex workpiece-measuring software without calibrated artifacts.

  2. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, the enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework was developed, and a cost-sensitive classifier was found to produce the best results. The system was evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. Such a system plays an important role in forensic and civilian applications.

  3. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first for the state of the database object (correct or incorrect) and the second for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with larger completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
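    To make the fusion step concrete, the sketch below assumes each module reports a probability that the database object is correct and a probability that its road model is applicable, maps these onto basic belief masses over {correct, incorrect, unknown}, and combines the modules with Dempster's rule. The mass assignment, the module outputs and the function names are illustrative assumptions, not the exact formulation used in the paper.

    ```python
    # Hedged sketch: fusing per-module road-verification outputs with Dempster's rule.
    # The mapping below (model inapplicability -> mass on "unknown") is an assumption
    # made for illustration; the paper's exact mass assignment may differ.

    def module_to_mass(p_correct, p_applicable):
        """Map a module's two probability distributions to basic belief masses
        over the focal sets {correct}, {incorrect}, {correct, incorrect} (= unknown)."""
        return {
            "correct":   p_applicable * p_correct,
            "incorrect": p_applicable * (1.0 - p_correct),
            "unknown":   1.0 - p_applicable,
        }

    def dempster_combine(m1, m2):
        """Dempster's rule of combination on the two-element frame {correct, incorrect}."""
        conflict = m1["correct"] * m2["incorrect"] + m1["incorrect"] * m2["correct"]
        norm = 1.0 - conflict
        if norm <= 0.0:
            raise ValueError("total conflict between modules")
        return {
            "correct":   (m1["correct"] * m2["correct"]
                          + m1["correct"] * m2["unknown"]
                          + m1["unknown"] * m2["correct"]) / norm,
            "incorrect": (m1["incorrect"] * m2["incorrect"]
                          + m1["incorrect"] * m2["unknown"]
                          + m1["unknown"] * m2["incorrect"]) / norm,
            "unknown":   (m1["unknown"] * m2["unknown"]) / norm,
        }

    # Three hypothetical road-verification modules voting on one database object.
    modules = [(0.9, 0.8), (0.7, 0.3), (0.2, 0.9)]   # (P(correct), P(applicable))
    masses = [module_to_mass(p, a) for p, a in modules]
    fused = masses[0]
    for m in masses[1:]:
        fused = dempster_combine(fused, m)

    decision = max(fused, key=fused.get)             # optimal state of the road object
    print(fused, "->", decision)
    ```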

  4. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  5. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impacts of the various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). To assist the application of the assessment method, and as a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use because assessment scenarios can be defined through diagrams. Software verification was performed by comparing its results to SRS-19 and CROM software calculation results. Doses estimated by SRS-19 are higher than the results of the developed software. However, these are still acceptable, since the dose estimation in SRS-19 is based on a conservative approach. Compared to the CROM software, identical results were obtained for three scenarios and a non-significant difference of 2.25% in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features and new models need to be added to improve the capability of the software. (author)
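    For orientation, a minimal sketch of the kind of screening-level calculation the SRS-19 generic model supports: a constant release rate is turned into an annual-average air concentration through a generic dilution factor and then into an inhalation dose with a dose coefficient. All numerical parameters below are placeholders, not values taken from SRS-19 or from the software described above.

    ```python
    # Hedged sketch of a generic screening-level inhalation dose estimate.
    # All numerical parameters are illustrative placeholders; a real assessment
    # would take dilution factors and dose coefficients from SRS-19 tables.

    release_rate_bq_per_s = 1.0e6        # continuous stack release (Bq/s), assumed
    dilution_factor_s_per_m3 = 1.0e-6    # generic atmospheric dilution (s/m^3), assumed
    breathing_rate_m3_per_y = 8.0e3      # adult annual air intake (m^3/y), assumed
    dose_coeff_sv_per_bq = 1.0e-9        # inhalation dose coefficient (Sv/Bq), assumed

    # Annual-average air concentration at the receptor (Bq/m^3)
    air_concentration = release_rate_bq_per_s * dilution_factor_s_per_m3

    # Annual effective dose from inhalation (Sv/y)
    annual_dose_sv = air_concentration * breathing_rate_m3_per_y * dose_coeff_sv_per_bq

    print(f"C_air = {air_concentration:.3e} Bq/m^3, E_inh = {annual_dose_sv:.3e} Sv/y")
    ```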

  6. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  7. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  8. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end, design verification, although limited to the safeguards aspects of the plant, must be a systematic activity that starts during the design phase, continues during the construction phase, and is performed in particular during the various steps of the plant's commissioning phase. The detailed procedures for design information verification at commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards-related activity and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator measurement data. C/S measures are also foreseen where necessary to supplement the accountancy data. A complete "design verification" strategy comprises: informing the Agency of any changes in the plant system that are defined as "safeguards relevant"; and re-verification by the Agency of the "design information" upon receiving notice from the operator of any changes. 13 refs

  9. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive-by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a one-year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used with transport theory to determine the expected reaction rates in the detectors when placed at varying distances from the canister. A 'typical' background signature was incorporated to determine the minimum detectable signatures versus the probability of detection, in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons-grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
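    A hedged sketch of the counting-statistics reasoning behind a drive-by protocol: the dwell time over one container follows from the collimated field of view and the vehicle speed, and a source is declared present when the expected net counts in that time gate exceed a background-based decision threshold. The rates, geometry and the Currie-style criterion used here are illustrative assumptions; the actual MPVS detection criteria are not given in the abstract.

    ```python
    # Hedged sketch: time-gated "drive-by" detection decision from counting statistics.
    # Rates, speeds and geometry below are illustrative assumptions, not MPVS values.
    import math

    speed_mph = 2.0
    speed_m_per_s = speed_mph * 0.44704
    field_of_view_m = 0.5                      # collimated view across one container (assumed)
    dwell_time_s = field_of_view_m / speed_m_per_s

    background_cps = 50.0                      # detector background rate (assumed)
    source_cps = 30.0                          # net rate from a pit at drive-by distance (assumed)

    b = background_cps * dwell_time_s          # expected background counts in the gate
    signal = source_cps * dwell_time_s         # expected net counts in the gate

    # Currie-style critical level for ~5% false-alarm rate (k = 1.645), illustrative only.
    k = 1.645
    critical_level = k * math.sqrt(2.0 * b)

    print(f"dwell = {dwell_time_s:.2f} s, expected net counts = {signal:.1f}, "
          f"decision threshold = {critical_level:.1f} counts")
    print("detected" if signal > critical_level else "not detected")
    ```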

  10. [Verification of Learning Effects by Team-based Learning].

    Science.gov (United States)

    Ono, Shin-Ichi; Ito, Yoshihisa; Ishige, Kumiko; Inokuchi, Norio; Kosuge, Yasuhiro; Asami, Satoru; Izumisawa, Megumi; Kobayashi, Hiroko; Hayashi, Hiroyuki; Suzuki, Takashi; Kishikawa, Yukinaga; Hata, Harumi; Kose, Eiji; Tabata, Kei-Ichi

    2017-11-01

    The Central Council for Education has recommended that active learning methods, such as team-based learning (TBL) and problem-based learning (PBL), be introduced into university classes. Accordingly, for the past 3 years, we have implemented TBL in a medical therapeutics course for fourth-year students. Based upon our experience, TBL is characterized as follows: TBL needs fewer teachers than PBL to conduct a module; TBL enables both students and teachers to recognize and confirm the learning results from preparation and review; and TBL fosters students' responsibility for themselves and their teams, and likely facilitates learning activities through peer assessment.

  11. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  12. SU-F-T-463: Light-Field Based Dynalog Verification

    International Nuclear Information System (INIS)

    Atwal, P; Ramaseshan, R

    2016-01-01

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity modulated radiation therapy / volumetric modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted with removal of ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed in order to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the imaged positions of central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned locations by [0.38±0.28] mm. Similarly, bank B leaves 15–45 had a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further, and allow meaningful analysis over the full range of leaves.
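    A minimal sketch of the leaf-end localization step on a single 1-D intensity profile taken along a leaf's direction of travel: the profile is smoothed at two scales, the gradient extremum is taken as the light/shadow edge, and the pixel position is converted to millimetres with a calibration factor derived from the known paper markings. The synthetic profile, the pixel pitch and the planned position are assumptions, and the multi-scale iterative algorithm of the abstract is reduced here to two smoothing scales.

    ```python
    # Hedged sketch: locate a leaf end as the steepest light-to-shadow transition
    # in a 1-D profile, then compare with the planned position from the console.
    import numpy as np

    def smooth(profile, width):
        """Simple moving-average smoothing (stand-in for the multi-scale filtering)."""
        kernel = np.ones(width) / width
        return np.convolve(profile, kernel, mode="same")

    def leaf_edge_px(profile):
        """Edge location as the absolute-gradient maximum, averaged over two scales."""
        estimates = []
        for width in (3, 9):
            grad = np.gradient(smooth(profile, width))
            estimates.append(np.argmax(np.abs(grad)))
        return float(np.mean(estimates))

    # Synthetic light-field profile: bright open field, shadowed beyond the leaf end.
    px = np.arange(400)
    true_edge_px = 237.0
    profile = 1.0 / (1.0 + np.exp((px - true_edge_px) / 2.5))       # soft penumbra
    profile += np.random.default_rng(0).normal(0.0, 0.01, px.size)   # camera noise

    mm_per_px = 0.25                 # from the known paper markings (assumed value)
    planned_mm = 59.0                # planned leaf position from the console (assumed)

    measured_mm = leaf_edge_px(profile) * mm_per_px
    print(f"measured = {measured_mm:.2f} mm, planned = {planned_mm:.2f} mm, "
          f"error = {measured_mm - planned_mm:+.2f} mm")
    ```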

  13. SU-F-T-463: Light-Field Based Dynalog Verification

    Energy Technology Data Exchange (ETDEWEB)

    Atwal, P; Ramaseshan, R [BC Cancer Agency, Abbotsford, BC (Canada)

    2016-06-15

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity modulated radiation therapy / volumetric modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted with removal of ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed in order to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the imaged positions of central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned locations by [0.38±0.28] mm. Similarly, bank B leaves 15–45 had a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further, and allow meaningful analysis over the full range of leaves.

  14. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  15. SoC Design Approach Using Convertibility Verification

    Directory of Open Access Journals (Sweden)

    Basu Samik

    2008-01-01

    Compositional design of systems on chip from preverified components helps to achieve shorter design cycles and time to market. However, the design process is affected by the issue of protocol mismatches, where two components fail to communicate with each other due to protocol differences. Convertibility verification, which involves the automatic generation of a converter to facilitate communication between two mismatched components, is a collection of techniques to address protocol mismatches. We present an approach to convertibility verification using module checking. We use Kripke structures to represent protocols and temporal logic to describe desired system behavior. A tableau-based converter generation algorithm is presented which is shown to be sound and complete. We have developed a prototype implementation of the proposed algorithm and have used it to verify that it can handle many classical protocol mismatch problems along with SoC problems. The initial idea for this temporal-logic-based convertibility verification was presented at SLA++P '07, as described in the work by Roopak Sinha et al. 2008.

  16. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Junho, choi

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on a ray-tracing algorithm, statistical analysis, tests of real-time system operation, and other technical evaluation processes...

  17. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    In this paper, an automatic verification process is presented. We focus on the verification of scheduling analysis parameters. This proposal is part of a Model Driven Engineering based process to automate the Verification and Validation of software on board satellites. The process is applied to the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, the verification evidence is extracted from instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.

  18. Token-Aware Completion Functions for Elastic Processor Verification

    Directory of Open Access Journals (Sweden)

    Sudarshan K. Srinivasan

    2009-01-01

    We develop a formal verification procedure to check that elastic pipelined processor designs correctly implement their instruction set architecture (ISA) specifications. The notion of correctness we use is based on refinement. Refinement proofs are based on refinement maps, which—in the context of this problem—are functions that map elastic processor states to states of the ISA specification model. Data flow in elastic architectures is complicated by the insertion of any number of buffers in any place in the design, making it hard to construct refinement maps for elastic systems in a systematic manner. We introduce token-aware completion functions, which incorporate a mechanism to track the flow of data in elastic pipelines, as a highly automated and systematic approach to construct refinement maps. We demonstrate the efficiency of the overall verification procedure based on token-aware completion functions using six elastic pipelined processor models based on the DLX architecture.

  19. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  20. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
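    A hedged sketch of the kind of allocation such a method performs, under simplifying assumptions that are not taken from the paper: subsystems in a series topology, a verification risk defined as the expected loss from defects that the chosen examination methods fail to detect, and a greedy allocation of examination effort to the subsystem with the largest marginal risk reduction until a stop criterion on the residual risk is met.

    ```python
    # Hedged sketch: greedy allocation of examination effort by marginal verification risk.
    # The risk model (series topology, exponential detection-vs-effort curve) and all
    # numbers are illustrative assumptions, not the paper's definitions.
    import math

    subsystems = {                     # p_fail: prob. of a latent defect; loss: consequence
        "propulsion": {"p_fail": 0.02, "loss": 100.0, "effectiveness": 0.8},
        "navigation": {"p_fail": 0.05, "loss": 40.0,  "effectiveness": 0.5},
        "ballast":    {"p_fail": 0.01, "loss": 60.0,  "effectiveness": 1.2},
    }

    def residual_risk(effort):
        """Expected loss from defects not caught by examination (series system)."""
        total = 0.0
        for name, s in subsystems.items():
            p_missed = math.exp(-s["effectiveness"] * effort.get(name, 0.0))
            total += s["p_fail"] * s["loss"] * p_missed
        return total

    effort = {name: 0.0 for name in subsystems}
    step, stop_criterion = 0.5, 0.5            # effort unit per examination; target risk
    sequence = []

    while residual_risk(effort) > stop_criterion:
        # Marginal risk reduction of one more examination step on each subsystem.
        gains = {}
        for name in subsystems:
            trial = dict(effort)
            trial[name] += step
            gains[name] = residual_risk(effort) - residual_risk(trial)
        best = max(gains, key=gains.get)
        effort[best] += step
        sequence.append(best)

    print("examination sequence:", sequence)
    print("allocated effort:", effort, "residual risk:", round(residual_risk(effort), 3))
    ```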

  1. Verification of Liveness Properties on Closed Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Andersen, Mathias; Larsen, Heine G.; Srba, Jiri

    2012-01-01

    Verification of closed timed models by explicit state-space exploration methods is an alternative to the wide-spread symbolic techniques based on difference bound matrices (DBMs). A few experiments found in the literature confirm that for the reachability analysis of timed automata explicit...... techniques can compete with DBM-based algorithms, at least for situations where the constants used in the models are relatively small. To the best of our knowledge, the explicit methods have not yet been employed in the verification of liveness properties in Petri net models extended with time. We present...

  2. Solar energy prediction and verification using operational model forecasts and ground-based solar measurements

    International Nuclear Information System (INIS)

    Kosmopoulos, P.G.; Kazadzis, S.; Lagouvardos, K.; Kotroni, V.; Bais, A.

    2015-01-01

    The present study focuses on predictions of solar energy, and their verification, using ground-based solar measurements from the Hellenic Network for Solar Energy and the National Observatory of Athens network, as well as solar radiation operational forecasts provided by the MM5 mesoscale model. The evaluation was carried out independently for the different networks, for two forecast horizons (1 and 2 days ahead), for the seasons of the year, for varying solar elevation, for the indicative energy potential of the area, and for four classes of cloud cover based on the calculated clearness index (k_t): CS (clear sky), SC (scattered clouds), BC (broken clouds) and OC (overcast). The seasonal dependence showed rRMSE (relative Root Mean Square Error) values ranging from 15% (summer) to 60% (winter), while the solar elevation dependence revealed high effectiveness and reliability near local noon (rRMSE ∼30%). An increase of the errors with cloudiness was also observed. For CS with mean GHI (global horizontal irradiance) ∼650 W/m² the errors are 8%, for SC 20%, and for BC and OC the errors were greater (>40%) but correspond to much lower radiation levels (<120 W/m²) and consequently a lower impact on the energy potential. The total energy potential for each ground station ranges from 1.5 to 1.9 MWh/m², while the mean monthly forecast error was found to be consistently below 10%. - Highlights: • Long-term measurements under different atmospheric conditions are needed for energy forecasting model evaluations. • The total energy potential at the Greek sites presented ranges from 1.5 to 1.9 MWh/m². • Mean monthly energy forecast errors are within 10% for all cases analyzed. • Cloud presence results in an additional forecast error that varies with the cloud cover.
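    A minimal sketch of the verification metric used above: forecasts and ground measurements are grouped into sky classes by the clearness index k_t, and a relative RMSE is computed per class. The k_t class boundaries and the toy data are assumptions for illustration; the study's own thresholds are not stated in the abstract.

    ```python
    # Hedged sketch: relative RMSE of GHI forecasts stratified by clearness index.
    import numpy as np

    def sky_class(kt):
        """Assumed k_t boundaries for CS / SC / BC / OC (illustrative, not the paper's)."""
        if kt >= 0.7:  return "CS"   # clear sky
        if kt >= 0.5:  return "SC"   # scattered clouds
        if kt >= 0.3:  return "BC"   # broken clouds
        return "OC"                  # overcast

    def rrmse(forecast, measured):
        """Relative RMSE in percent, normalised by the mean measured irradiance."""
        forecast, measured = np.asarray(forecast, float), np.asarray(measured, float)
        rmse = np.sqrt(np.mean((forecast - measured) ** 2))
        return 100.0 * rmse / np.mean(measured)

    # Toy verification set: (k_t, measured GHI in W/m^2, day-ahead forecast in W/m^2)
    records = [(0.75, 650, 610), (0.78, 700, 690), (0.55, 420, 470),
               (0.52, 380, 300), (0.35, 180, 260), (0.15, 90, 160)]

    by_class = {}
    for kt, meas, fc in records:
        cls = sky_class(kt)
        by_class.setdefault(cls, {"meas": [], "fc": []})
        by_class[cls]["meas"].append(meas)
        by_class[cls]["fc"].append(fc)

    for cls, d in by_class.items():
        print(f"{cls}: rRMSE = {rrmse(d['fc'], d['meas']):.1f}% over {len(d['meas'])} samples")
    ```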

  3. Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor

    Science.gov (United States)

    Gafurov, Davrondzhon; Bours, Patrick

    In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5% and a rank-1 identification rate of 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over a previous study using the same data set.
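    A hedged sketch of the two analysis steps named above, gait-cycle detection and cycle-set matching, using a deliberately simple implementation: cycles are segmented at prominent minima of the acceleration magnitude, resampled to a common length, and two cycle sets are compared by the Euclidean distance between their mean cycles. The segmentation rule, resampling length and decision threshold are assumptions, not the authors' method details.

    ```python
    # Hedged sketch: hip-acceleration gait cycle detection and matching.
    import numpy as np
    from scipy.signal import find_peaks

    FS = 100            # sampling rate in Hz (assumed)
    CYCLE_LEN = 100     # resampled cycle length (assumed)

    def detect_cycles(acc_mag):
        """Segment the acceleration-magnitude signal at prominent minima
        (heel-strike-like events) and resample each cycle to a fixed length."""
        valleys, _ = find_peaks(-acc_mag, distance=int(0.6 * FS), prominence=0.2)
        cycles = []
        for start, end in zip(valleys[:-1], valleys[1:]):
            seg = acc_mag[start:end]
            xs = np.linspace(0, len(seg) - 1, CYCLE_LEN)
            cycles.append(np.interp(xs, np.arange(len(seg)), seg))
        return np.array(cycles)

    def match_score(cycles_a, cycles_b):
        """Distance between two cycle sets: Euclidean distance of their mean cycles
        (lower = more likely the same person)."""
        return float(np.linalg.norm(cycles_a.mean(axis=0) - cycles_b.mean(axis=0)))

    # Synthetic enrolment and probe signals (same simulated walker, different noise).
    rng = np.random.default_rng(1)
    t = np.arange(0, 20, 1 / FS)

    def simulated_walk(noise):
        return 1.0 + 0.5 * np.sin(2 * np.pi * 1.0 * t) + noise * rng.standard_normal(t.size)

    enrolled = detect_cycles(simulated_walk(0.05))
    probe = detect_cycles(simulated_walk(0.05))

    score = match_score(enrolled, probe)
    threshold = 1.0      # verification threshold (assumed; tuned on a dev set in practice)
    print(f"score = {score:.3f} ->", "accept" if score < threshold else "reject")
    ```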

  4. Tools and Methods for RTCP-Nets Modeling and Verification

    Directory of Open Access Journals (Sweden)

    Szpyrka Marcin

    2016-09-01

    RTCP-nets are high-level Petri nets similar to timed colored Petri nets, but with a different time model and some structural restrictions. The paper deals with practical aspects of using RTCP-nets for modeling and verification of real-time systems. It contains a survey of software tools developed to support RTCP-nets. Verification of RTCP-nets is based on coverability graphs, which represent the set of reachable states in the form of a directed graph. Two approaches to verification of RTCP-nets are considered in the paper. The former is oriented towards states and is based on translation of a coverability graph into a nuXmv (NuSMV) finite state model. The latter approach is oriented towards transitions and uses the CADP toolkit to check whether requirements given as μ-calculus formulae hold for a given coverability graph. All presented concepts are discussed using illustrative examples.

  5. Thermal Analysis of the Driving Component Based on the Thermal Network Method in a Lunar Drilling System and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Dewei Tang

    2017-03-01

    The main task of the third Chinese lunar exploration project is to obtain soil samples that are greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for the core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component, based on the thermal network method (TNM), was established and the model was solved using the quasi-Newton method. A vacuum test platform was built and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the "safe working times" of every mode are given.
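    A hedged sketch of a thermal network of the kind described above: lumped nodes connected by conductive links, one node radiating to a cold sink, solved for steady-state temperatures with a quasi-Newton (Broyden) root finder. The node count, conductances, heat loads and sink temperature are invented for illustration and do not correspond to the actual driving component.

    ```python
    # Hedged sketch: lumped-parameter thermal network solved with a quasi-Newton method.
    import numpy as np
    from scipy.optimize import root

    SIGMA = 5.670e-8                       # Stefan-Boltzmann constant, W/(m^2 K^4)
    T_SINK = 250.0                         # radiative sink temperature in vacuum, K (assumed)

    # Node order: [motor, gearbox, housing]; all parameters are illustrative.
    Q = np.array([15.0, 5.0, 0.0])         # internal heat loads, W
    G = np.array([[0.0, 0.8, 0.3],         # conductive couplings G[i][j], W/K
                  [0.8, 0.0, 0.5],
                  [0.3, 0.5, 0.0]])
    RAD = np.array([0.0, 0.0, 0.02])       # radiating area*emissivity per node, m^2

    def energy_balance(T):
        """Residual of the steady-state heat balance at each node (should be zero)."""
        conduction = G @ T - G.sum(axis=1) * T          # sum_j G_ij (T_j - T_i)
        radiation = RAD * SIGMA * (T_SINK**4 - T**4)    # exchange with the sink
        return Q + conduction + radiation

    sol = root(energy_balance, x0=np.full(3, 300.0), method="broyden1")
    print("converged:", sol.success)
    print("node temperatures [K]:", np.round(sol.x, 1))
    ```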

  6. Property-Based Monitoring of Analog and Mixed-Signal Systems

    Science.gov (United States)

    Havlicek, John; Little, Scott; Maler, Oded; Nickovic, Dejan

    In the recent past, there has been a steady growth of the market for consumer embedded devices such as cell phones, GPS and portable multimedia systems. In embedded systems, digital, analog and software components are combined on a single chip, resulting in increasingly complex designs that introduce richer functionality on smaller devices. As a consequence, the potential for errors to be inserted into a design becomes higher, yielding an increasing need for automated analog and mixed-signal validation tools. In the purely digital setting, formal verification based on properties expressed in industrial specification languages such as PSL and SVA is nowadays successfully integrated into the design flow. On the other hand, the validation of analog and mixed-signal systems still largely depends on simulation-based, ad-hoc methods. In this tutorial, we consider some ingredients of the standard verification methodology that can be successfully exported from the digital to the analog and mixed-signal setting, in particular property-based monitoring techniques. Property-based monitoring is a lightweight alternative to formal verification, in which the system is seen as a "black box" that generates sets of traces whose correctness is checked against a property, that is, its high-level specification. Although incomplete, monitoring is effectively used to catch faults in systems, without guaranteeing their full correctness.
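    To make the idea concrete, below is a small offline monitor in the spirit described above: the "system" is a sampled analog trace, and the property checked is a bounded-settling requirement ("every excursion above a threshold must end within T seconds"). The property, thresholds and trace are invented for illustration; industrial monitors would express the property in a language such as STL or PSL rather than this ad-hoc check.

    ```python
    # Hedged sketch: offline property-based monitoring of a sampled analog trace.
    import numpy as np

    def monitor_bounded_settling(t, x, threshold, settle_time):
        """Check: every excursion of x above `threshold` ends within `settle_time` seconds.
        An excursion still open at the end of the trace is conservatively flagged.
        Returns (holds, list_of_violation_start_times)."""
        violations = []
        above = x > threshold
        i = 0
        while i < len(t):
            if above[i]:
                start = i
                while i < len(t) and above[i]:
                    i += 1
                excursion = t[min(i, len(t) - 1)] - t[start]
                if i >= len(t) or excursion > settle_time:
                    violations.append(t[start])
            else:
                i += 1
        return len(violations) == 0, violations

    # Simulated mixed-signal output: a damped oscillatory response whose second
    # excursion above the threshold lasts too long.
    t = np.linspace(0.0, 10.0, 10_001)
    x = 1.0 + 0.6 * np.exp(-0.3 * t) * np.cos(2 * np.pi * 0.5 * t)

    holds, where = monitor_bounded_settling(t, x, threshold=1.2, settle_time=0.5)
    print("property holds:", holds, "| violations start at t =",
          [round(w, 2) for w in where])
    ```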

  7. HUMTRN: documentation and verification for an ICRP-based age- and sex-specific human simulation model for radionuclide dose assessment

    International Nuclear Information System (INIS)

    Gallegos, A.F.; Wenzel, W.J.

    1984-06-01

    The dynamic human simulation model HUMTRN is designed specifically as a major module of BIOTRAN to integrate climatic, hydrologic, atmospheric, food crop, and herbivore simulation with human dietary and physiological characteristics, metabolism, and radionuclide transport to predict radiation doses to selected organs of both sexes in different age groups. The model is based on age- and weight-specific equations developed for predicting human radionuclide transport from metabolic and physical characteristics. These characteristics are modeled from studies documented by the International Commission on Radiological Protection (ICRP 23). HUMTRN allows cumulative doses from uranium or plutonium radionuclides to be predicted by modeling age-specific anatomical, physiological, and metabolic properties of individuals between 1 and 70 years of age and can track radiation exposure and radionuclide metabolism for any age group for specified daily or yearly time periods. The simulated daily dose integration of eight or more simultaneous air, water, and food intakes gives a new, comprehensive, dynamic picture of radionuclide intake, uptake, and hazard analysis for complex scenarios. A detailed example using site-specific data based on the Pantex studies is included for verification. 14 references, 24 figures, 10 tables

  8. CLSI-based transference and verification of CALIPER pediatric reference intervals for 29 Ortho VITROS 5600 chemistry assays.

    Science.gov (United States)

    Higgins, Victoria; Truong, Dorothy; Woroch, Amy; Chan, Man Khun; Tahmasebi, Houman; Adeli, Khosrow

    2018-03-01

    Evidence-based reference intervals (RIs) are essential to accurately interpret pediatric laboratory test results. To fill gaps in pediatric RIs, the Canadian Laboratory Initiative on Pediatric Reference Intervals (CALIPER) project developed an age- and sex-specific pediatric RI database based on healthy pediatric subjects. Originally established for Abbott ARCHITECT assays, CALIPER RIs were transferred to assays on Beckman, Roche, Siemens, and Ortho analytical platforms. This study provides transferred reference intervals for 29 biochemical assays for the Ortho VITROS 5600 Chemistry System (Ortho). Based on Clinical and Laboratory Standards Institute (CLSI) guidelines, a method comparison analysis was performed by measuring approximately 200 patient serum samples using Abbott and Ortho assays. The equation of the line of best fit was calculated and the appropriateness of the linear model was assessed. This equation was used to transfer RIs from Abbott to Ortho assays. Transferred RIs were verified using 84 healthy pediatric serum samples from the CALIPER cohort. RIs for most chemistry analytes were successfully transferred from Abbott to Ortho assays. Calcium and CO2 did not meet the statistical criteria for transference (r²). Of the transferred reference intervals, 29 were successfully verified, with approximately 90% of results from reference samples falling within the transferred confidence limits. Transferred RIs for total bilirubin, magnesium, and LDH did not meet verification criteria and are not reported. This study broadens the utility of the CALIPER pediatric RI database to laboratories using Ortho VITROS 5600 biochemical assays. Clinical laboratories should verify CALIPER reference intervals for their specific analytical platform and local population as recommended by CLSI. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
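    A hedged sketch of the CLSI-style transference and verification steps described above: a linear relationship between the two analyzers is estimated from the comparison samples, the original reference limits are mapped through it, and the transferred interval is verified by checking that roughly 90% or more of results from healthy reference subjects fall inside it. Ordinary least squares is used here for brevity, where a real transfer study would typically use Deming or Passing-Bablok regression; all data are synthetic.

    ```python
    # Hedged sketch: CLSI-style reference interval transference and verification.
    import numpy as np

    rng = np.random.default_rng(7)

    # Method comparison: the same ~200 patient samples measured on both analyzers.
    abbott = rng.uniform(10.0, 60.0, 200)                   # reference analyzer results
    ortho = 1.05 * abbott + 0.8 + rng.normal(0, 0.6, 200)   # new analyzer with small bias

    # 1) Estimate the relationship (OLS for brevity; Deming regression is more usual).
    slope, intercept = np.polyfit(abbott, ortho, 1)

    # 2) Transfer the established reference interval through the fitted line.
    ri_low_abbott, ri_high_abbott = 18.0, 45.0              # established pediatric RI (toy values)
    ri_low_ortho = slope * ri_low_abbott + intercept
    ri_high_ortho = slope * ri_high_abbott + intercept

    # 3) Verify with healthy reference subjects measured on the new analyzer
    #    (acceptance rule of ~90% inside is a common CLSI-style criterion, assumed here).
    healthy_ortho = 1.05 * rng.normal(31.0, 6.0, 84) + 0.8  # 84 synthetic healthy samples
    inside = np.mean((healthy_ortho >= ri_low_ortho) & (healthy_ortho <= ri_high_ortho))

    print(f"transferred RI on Ortho: [{ri_low_ortho:.1f}, {ri_high_ortho:.1f}]")
    print(f"{inside:.0%} of healthy samples inside -> "
          f"{'verified' if inside >= 0.90 else 'not verified'}")
    ```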

  9. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  10. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
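    A short worked sketch of the Weibull-statistics step mentioned above: coupon tests give a Weibull modulus m and a characteristic strength for a reference stressed volume, and the failure probability of a full-scale part is obtained by volume scaling, P_f = 1 - exp[-(V/V0)(sigma/sigma0)^m]. The material parameters and volumes below are placeholders, not values from the guideline.

    ```python
    # Hedged sketch: two-parameter Weibull scaling from coupon data to a full-scale part.
    import math

    # Coupon-level Weibull parameters (illustrative, not from the guideline).
    m = 10.0            # Weibull modulus (scatter of the brittle strength)
    sigma0 = 300.0      # characteristic strength of the coupon, MPa
    v0 = 1.0e-6         # effectively stressed volume of the coupon, m^3

    def failure_probability(sigma, volume):
        """P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m) for a uniformly stressed volume."""
        return 1.0 - math.exp(-(volume / v0) * (sigma / sigma0) ** m)

    def allowable_stress(volume, target_pf):
        """Invert the Weibull law for the stress giving the target failure probability."""
        return sigma0 * (-math.log(1.0 - target_pf) * v0 / volume) ** (1.0 / m)

    part_volume = 5.0e-4    # stressed volume of the full-scale ceramic part, m^3 (assumed)
    print(f"P_f at 150 MPa: {failure_probability(150.0, part_volume):.3e}")
    print(f"allowable stress for P_f = 1e-5: {allowable_stress(part_volume, 1e-5):.1f} MPa")
    ```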

  11. Imputation of microsatellite alleles from dense SNP genotypes for parental verification

    Directory of Open Access Journals (Sweden)

    Matthew McClure

    2012-08-01

    Microsatellite (MS) markers have recently been used for parental verification and are still the international standard despite higher cost, error rate, and turnaround time compared with Single Nucleotide Polymorphism (SNP)-based assays. Despite domestic and international interest from producers and research communities, no viable means currently exist to verify parentage for an individual unless all familial connections were analyzed using the same DNA marker type (MS or SNP). A simple and cost-effective method was devised to impute MS alleles from SNP haplotypes within breeds. For some MS, imputation results may allow inference across breeds. A total of 347 dairy cattle representing 4 dairy breeds (Brown Swiss, Guernsey, Holstein, and Jersey) were used to generate reference haplotypes. This approach has been verified (>98% accurate) for imputing the International Society of Animal Genetics (ISAG) recommended panel of 12 MS markers for cattle parentage verification across a validation set of 1,307 dairy animals. Implementation of this method will allow producers and breed associations to transition to SNP-based parentage verification utilizing MS genotypes from historical data on parents where SNP genotypes are missing. This approach may be applicable to additional cattle breeds and other species that wish to migrate from MS- to SNP-based parental verification.
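    A hedged sketch of the basic idea: within a breed, SNP haplotypes spanning a microsatellite are tabulated against the MS alleles observed in a reference panel, and an animal with SNP data only receives, for each of its two haplotypes, the MS allele most frequently associated with that haplotype. The haplotype windows, phasing and breed handling are greatly simplified relative to the paper, and all data are invented.

    ```python
    # Hedged sketch: impute a microsatellite allele from flanking SNP haplotypes
    # using a within-breed reference lookup table.
    from collections import Counter, defaultdict

    # Reference panel: (phased SNP haplotype around the MS locus, observed MS allele).
    reference_panel = [
        ("ACGTA", 185), ("ACGTA", 185), ("ACGTA", 187),
        ("ACGGA", 189), ("ACGGA", 189),
        ("TTGTA", 193), ("TTGTA", 193), ("TTGTA", 193),
    ]

    # Build haplotype -> counts of associated MS alleles.
    table = defaultdict(Counter)
    for hap, allele in reference_panel:
        table[hap][allele] += 1

    def impute_ms_genotype(haplotype_pair):
        """Return the imputed MS genotype (one allele per haplotype) with its support."""
        genotype = []
        for hap in haplotype_pair:
            if hap not in table:
                genotype.append(None)                # no reference support for this haplotype
                continue
            allele, count = table[hap].most_common(1)[0]
            support = count / sum(table[hap].values())
            genotype.append((allele, round(support, 2)))
        return genotype

    # Animal genotyped on SNP only, with phased haplotypes at this MS locus.
    print(impute_ms_genotype(("ACGTA", "TTGTA")))    # -> [(185, 0.67), (193, 1.0)]
    ```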

  12. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2000-01-01

    The methodology of system requirements verification presented in this paper is a proposal of a practical procedure for reducing some deficiencies in the specification of requirements. The main problem considered is how to create a complete description of the system requirements without such deficiencies. Verification of the initially defined requirements is based on coloured Petri nets. These nets are useful for testing properties of system requirements such as completeness, consistency and optimality. An example of a lift controller is presented.

  13. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    OpenAIRE

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting requirements set by CATS consortium based on requirements in Euro NCAP AEB protocols regarding accuracy, repeatability and reproducibility using the developed test hardware. For the cases where verification t...

  14. WE-DE-BRA-01: SCIENCE COUNCIL JUNIOR INVESTIGATOR COMPETITION WINNER: Acceleration of a Limited-Angle Intrafraction Verification (LIVE) System Using Adaptive Prior Knowledge Based Image Estimation

    International Nuclear Information System (INIS)

    Zhang, Y; Yin, F; Ren, L; Zhang, Y

    2016-01-01

    Purpose: To develop an adaptive prior-knowledge-based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam's eye view (BEV) MV cine images acquired from the treatment beam together with the orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change and phase shift. Limited-angle orthogonal kV and beam's eye view (BEV) MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. Volume-percentage-difference (VPD) and center-of-mass-shift (COMS) were calculated between the reconstructed and the ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The

  15. WE-DE-BRA-01: SCIENCE COUNCIL JUNIOR INVESTIGATOR COMPETITION WINNER: Acceleration of a Limited-Angle Intrafraction Verification (LIVE) System Using Adaptive Prior Knowledge Based Image Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Y; Yin, F; Ren, L [Duke University Medical Center, Durham, NC (United States); Zhang, Y [UT Southwestern Medical Ctr at Dallas, Dallas, TX (United States)

    2016-06-15

    Purpose: To develop an adaptive prior-knowledge-based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam's eye view (BEV) MV cine images acquired from the treatment beam together with the orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change and phase shift. Limited-angle orthogonal kV and beam's eye view (BEV) MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. Volume-percentage-difference (VPD) and center-of-mass-shift (COMS) were calculated between the reconstructed and the ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The

  16. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT

    International Nuclear Information System (INIS)

    Park, Justin C.; Li, Jonathan G.; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-01-01

    Purpose: The use of sophisticated dose calculation procedure in modern radiation therapy treatment planning is inevitable in order to account for complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving the accuracy. Methods: The computational time of finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, dose distribution from each beamlet is mathematically modeled such that the sizes of beamlets to represent an arbitrary field shape no longer need to be infinitesimal nor identical. As a result, it is possible to represent an arbitrary field shape with combinations of different sized and minimal number of beamlets. In addition, the authors included the model parameters to consider MLC for its rounded edge and transmission. Results: Root mean square error (RMSE) between treatment planning system and conventional FSPB on a 10 × 10 cm² square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 4.90%, 3.19%, and 2.87%, respectively, compared with RMSE of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm², where RMSE for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with RMSE of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for the MLC transmissions without major discrepancy. The algorithm was also graphical processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (∼12 segments) and a

  17. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT.

    Science.gov (United States)

    Park, Justin C; Li, Jonathan G; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-04-01

    The use of sophisticated dose calculation procedure in modern radiation therapy treatment planning is inevitable in order to account for complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving the accuracy. The computational time of finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, dose distribution from each beamlet is mathematically modeled such that the sizes of beamlets to represent an arbitrary field shape no longer need to be infinitesimal nor identical. As a result, it is possible to represent an arbitrary field shape with combinations of different sized and minimal number of beamlets. In addition, the authors included the model parameters to consider MLC for its rounded edge and transmission. Root mean square error (RMSE) between treatment planning system and conventional FSPB on a 10 × 10 cm(2) square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm(2) beamlet sizes were 4.90%, 3.19%, and 2.87%, respectively, compared with RMSE of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm(2), where RMSE for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm(2) beamlet sizes were 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with RMSE of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for the MLC transmissions without major discrepancy. The algorithm was also graphical processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (∼12 segments) and a volumetric modulated arc

  18. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  19. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.
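    A hedged sketch of a filter-bank representation of the kind referred to above (FingerCode-style): the image is filtered with Gabor filters at several orientations, the average absolute deviation of each cell of each filtered image forms a fixed-length feature vector, and vectors are compared by Euclidean distance. The filter parameters, the rectangular grid (instead of the usual circular tessellation), the synthetic ridge patterns and the printed comparison are all illustrative assumptions; the sketch simply shows that a genuine probe scores a much smaller distance than an impostor probe.

    ```python
    # Hedged sketch: fixed-length filter-bank (FingerCode-style) fingerprint features.
    import numpy as np
    from scipy.ndimage import convolve

    def gabor_kernel(theta, frequency=0.1, sigma=4.0, size=17):
        """Real part of a Gabor filter oriented at angle theta (radians)."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * frequency * xr)

    def filter_bank_features(image, n_orientations=4, grid=8):
        """Average absolute deviation of each grid cell of each Gabor-filtered image,
        concatenated into one fixed-length feature vector."""
        features = []
        cell = image.shape[0] // grid
        for k in range(n_orientations):
            filtered = convolve(image, gabor_kernel(np.pi * k / n_orientations))
            for i in range(grid):
                for j in range(grid):
                    block = filtered[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
                    features.append(np.mean(np.abs(block - block.mean())))
        return np.array(features)

    # Synthetic ridge-like patterns standing in for enrolled/probe fingerprints.
    rng = np.random.default_rng(3)
    y, x = np.mgrid[0:128, 0:128]

    def ridges(angle):
        return np.sin(0.4 * (x * np.cos(angle) + y * np.sin(angle)))

    enrolled = ridges(0.6) + 0.05 * rng.standard_normal((128, 128))
    probe_genuine = ridges(0.6) + 0.05 * rng.standard_normal((128, 128))
    probe_impostor = ridges(2.1) + 0.05 * rng.standard_normal((128, 128))

    f_enr = filter_bank_features(enrolled)
    d_genuine = np.linalg.norm(f_enr - filter_bank_features(probe_genuine))
    d_impostor = np.linalg.norm(f_enr - filter_bank_features(probe_impostor))
    # A decision threshold between these two scores would be tuned on development data.
    print(f"genuine distance = {d_genuine:.2f}, impostor distance = {d_impostor:.2f}")
    ```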

  20. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  1. Experience in non-proliferation verification: The Treaty of Raratonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Raratonga fall into two categories: those performed by the IAEA and those performed by other entities. A final provision of the Treaty is relevant to IAEA safeguards in that it supports the continued effectiveness of the international non-proliferation system based on the Non-proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well

  2. Dynamic knowledge representation using agent-based modeling: ontology instantiation and verification of conceptual models.

    Science.gov (United States)

    An, Gary

    2009-01-01

    The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally qualitatively expressed using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused at determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.

  3. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  4. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for sequence diagrams of power plants has been developed. The system's main function is to verify the correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (Interlock Block Diagrams), in combination with reference to design knowledge. The developed system points out the sub-circuit responsible for any existing mismatch between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)

  5. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
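
    The idea behind a cone-of-influence style reduction (the record's customized COI and rule-based reductions are not reproduced here) can be sketched as follows: keep only the variables that can influence the variables mentioned in the property, found by walking a dependency graph backwards. The variable names and dependencies below are hypothetical.

```python
# Minimal sketch of a cone-of-influence (COI) reduction over a variable
# dependency graph: keep only variables that the property depends on,
# directly or transitively. The model and property below are hypothetical.
from collections import deque

def cone_of_influence(deps, property_vars):
    """deps maps each variable to the set of variables its next-state
    value depends on; returns the set of variables to keep."""
    keep, frontier = set(property_vars), deque(property_vars)
    while frontier:
        v = frontier.popleft()
        for u in deps.get(v, ()):
            if u not in keep:
                keep.add(u)
                frontier.append(u)
    return keep

if __name__ == "__main__":
    deps = {
        "alarm":  {"sensor1", "mode"},
        "mode":   {"operator_cmd"},
        "pump":   {"mode", "sensor2"},   # not needed for the property below
        "logger": {"pump"},
    }
    # The property only talks about 'alarm', so 'pump' and 'logger' can be dropped.
    print(cone_of_influence(deps, {"alarm"}))
```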

  6. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the generation of the value of a function and its gradient at a given point, and of interval estimates of the function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
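
    One simple way to automate such checks, sketched below in Python rather than the authors' C++ library, is to evaluate each benchmark's objective and a numerical gradient at the claimed global minimizer and flag entries whose recorded minimum or stationarity condition does not hold. The benchmark entry shown is hypothetical.

```python
# Minimal sketch of automatic benchmark verification: evaluate the objective
# and a numerical gradient at the claimed minimizer and compare with the
# recorded minimum. The benchmark entry is hypothetical.
import numpy as np

def num_grad(f, x, h=1e-6):
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def check_benchmark(f, x_star, f_star, box, tol=1e-4):
    x_star = np.asarray(x_star, dtype=float)
    lo, hi = np.asarray(box[0]), np.asarray(box[1])
    in_box = np.all((x_star >= lo) & (x_star <= hi))
    value_ok = abs(f(x_star) - f_star) <= tol
    # Interior minimizers should be (numerically) stationary points.
    stationary = np.linalg.norm(num_grad(f, x_star)) <= 1e-2
    return bool(in_box and value_ok and stationary)

sphere = lambda x: float(np.sum(np.square(x)))
print(check_benchmark(sphere, x_star=[0.0, 0.0], f_star=0.0,
                      box=([-5, -5], [5, 5])))   # True
print(check_benchmark(sphere, x_star=[1.0, 0.0], f_star=0.0,
                      box=([-5, -5], [5, 5])))   # False: wrong minimizer
```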

  7. A preliminary study on cone beam CT image based treatment planning

    International Nuclear Information System (INIS)

    Padmanaban, Sriram; Jeevanandham, Prakash; Boopathy, Raghavendiran; Sukumar, Prabakar; Syam Kumar, S.A.; Kunjithapatham, Bhuvana; Nagarajan, Vivekanandan

    2008-01-01

    Kilovolt cone beam computed tomography (CBCT) based on flat-panel technology is primarily used for positioning verification. However, it is necessary to evaluate the accuracy of dose calculation based on CBCT images for the purpose of re-planning in adaptive radiation therapy (ART). In this study, 3DCRT and IMRT plans were generated using both the planning CT and CBCT images, and the corresponding variations in dose and MUs were analyzed, thereby evaluating the feasibility of using kilovolt CBCT for dose calculation and patient dose verification. (author)

  8. Streaming-based verification of XML signatures in SOAP messages

    DEFF Research Database (Denmark)

    Somorovsky, Juraj; Jensen, Meiko; Schwenk, Jörg

    2010-01-01

    approach for XML processing, the Web Services servers easily become a target of Denial-of-Service attacks. We present a solution for these problems: an external streaming-based WS-Security Gateway. Our implementation is capable of processing XML Signatures in SOAP messages using a streaming-based approach...

  9. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available The analysis of the reliability and verification problems of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the reference model for verifying any EPS simulation tool.

  10. Assessment of surface solar irradiance derived from real-time modelling techniques and verification with ground-based measurements

    Science.gov (United States)

    Kosmopoulos, Panagiotis G.; Kazadzis, Stelios; Taylor, Michael; Raptis, Panagiotis I.; Keramitsoglou, Iphigenia; Kiranoudis, Chris; Bais, Alkiviadis F.

    2018-02-01

    This study focuses on the assessment of surface solar radiation (SSR) based on operational neural network (NN) and multi-regression function (MRF) modelling techniques that produce instantaneous (in less than 1 min) outputs. Using real-time cloud and aerosol optical properties inputs from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite and the Copernicus Atmosphere Monitoring Service (CAMS), respectively, these models are capable of calculating SSR in high resolution (1 nm, 0.05°, 15 min) that can be used for spectrally integrated irradiance maps, databases and various applications related to energy exploitation. The real-time models are validated against ground-based measurements of the Baseline Surface Radiation Network (BSRN) in a temporal range varying from 15 min to monthly means, while a sensitivity analysis of the cloud and aerosol effects on SSR is performed to ensure reliability under different sky and climatological conditions. The simulated outputs, compared to their common training dataset created by the radiative transfer model (RTM) libRadtran, showed median error values in the range -15 to 15 % for the NN that produces spectral irradiances (NNS), 5-6 % underestimation for the integrated NN and close to zero errors for the MRF technique. The verification against BSRN revealed that the real-time calculation uncertainty ranges from -100 to 40 and -20 to 20 W m⁻², for the 15 min and monthly mean global horizontal irradiance (GHI) averages, respectively, while the accuracy of the input parameters, in terms of aerosol and cloud optical thickness (AOD and COT), and their impact on GHI, was of the order of 10 % as compared to the ground-based measurements. The proposed system aims to be utilized through studies and real-time applications which are related to solar energy production planning and use.

  11. Intelligent Tools for Planning Knowledge base Development and Verification

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  12. Verification of the Danish 1990, 2000 and 2010 emission inventory data

    DEFF Research Database (Denmark)

    Fauser, Patrik; Nielsen, Malene; Winther, Morten

    , agriculture, industry and waste. The data are based on the national greenhouse gas inventories for the years 1990 (base year), 2000 and 2010, as reported in 2012, and provided by the UNFCCC and EU. Inter-country comparison and time series consistency check of emissions and implied emission factors is made...... is made with data for energy consumption (Eurostat), agricultural statistics (Eurostat), industrial processes (UN) and waste disposal (OECD). Verification in this approach is a combination of qualitative and quantitative assessments and can assist to identify sectors and categories that require more...... for EU15 countries, excluding Luxemburg and including Norway and Switzerland and for some verification steps also including Australia, Canada, Japan, Russian Federation, USA and aggregated values for EU15 and EU27. National and inter-country verification and time trend consistency check of activity data...

  13. Agent Programming Languages and Logics in Agent-Based Simulation

    DEFF Research Database (Denmark)

    Larsen, John

    2018-01-01

    and social behavior, and work on verification. Agent-based simulation is an approach for simulation that also uses the notion of agents. Although agent programming languages and logics are much less used in agent-based simulation, there are successful examples with agents designed according to the BDI...

  14. Unified and Modular Modeling and Functional Verification Framework of Real-Time Image Signal Processors

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2016-01-01

    Full Text Available In the VLSI industry, image signal processing algorithms are developed and evaluated using software models before implementation of RTL and firmware. After the finalization of the algorithm, software models are used as a golden reference model for the image signal processor (ISP) RTL and firmware development. In this paper, we describe the unified and modular modeling framework of image signal processing algorithms used for different applications such as ISP algorithm development, reference for hardware (HW) implementation, reference for firmware (FW) implementation, and bit-true certification. The universal verification methodology (UVM) based functional verification framework of image signal processors using software reference models is described. Further, IP-XACT based tools for automatic generation of functional verification environment files and model map files are described. The proposed framework is developed both with host interface and with core using the virtual register interface (VRI) approach. This modeling and functional verification framework is used in real-time image signal processing applications including cellphone, smart cameras, and image compression. The main motivation behind this work is to propose an efficient, reusable, and automated framework for modeling and verification of image signal processor (ISP) designs. The proposed framework shows better results, and significant improvement is observed in product verification time, verification cost, and quality of the designs.

  15. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    Science.gov (United States)

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-12-01

    The Microgravity Active Vibration Isolation System (MAIS) is a device that reduces on-orbit vibration and provides a lower gravity level for certain scientific experiments. The MAIS system is made up of a stator and a floater: the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to reduce the vibration transmitted from the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits, and a central controller embedded with the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module is used for electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from the accelerometers, measures the displacement between stator and floater from position-sensitive detectors, and computes the Lorentz force current for each actuator so as to eliminate the vibration of the scientific payload while avoiding collisions between the stator and the floater. This is a motion control technique in 6 degrees of freedom (6-DOF) and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the opportunity to join the DLR 27th parabolic flight campaign and perform experiments to verify the 6-DOF control technique. The experimental results validate that the 6-DOF motion control technique is effective, and the vibration isolation performance matches what we expected based on theoretical analysis and simulation. MAIS has been planned for Chinese manned spacecraft for many microgravity scientific experiments, and the verification on parabolic flights is very important for its following missions. Additionally, we also tested some additional functions of the microgravity electromagnetic suspension, such as automatic catching and locking and operation in fault mode. The parabolic flights produced much useful data for these experiments.
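
    The control principle described above (sample floater acceleration, measure stator-floater displacement, compute actuator currents) can be illustrated with the simplified single-axis loop below. It is a minimal sketch under assumed PD-style gains and an assumed linear current-to-force mapping, not the MAIS flight controller.

```python
# Minimal single-axis sketch of the MAIS-style control principle:
# feed back floater acceleration and stator-floater displacement and
# compute an actuator current. Gains, limits, and the current-to-force
# map are assumptions for illustration, not flight parameters.
def control_step(accel, displacement, velocity_est, gains, i_max=2.0):
    k_acc, k_pos, k_vel = gains
    # Acceleration feedback attenuates vibration transmitted to the payload;
    # position/velocity feedback keeps the floater centered in its gap so it
    # does not contact the stator.
    current = -k_acc * accel - k_pos * displacement - k_vel * velocity_est
    return max(-i_max, min(i_max, current))  # saturate the coil current

# Example: one controller update with hypothetical sensor readings.
i_cmd = control_step(accel=0.002, displacement=0.0005, velocity_est=0.0,
                     gains=(5.0, 40.0, 8.0))
print(f"commanded coil current: {i_cmd:.4f} A")
```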

  16. A virtual-accelerator-based verification of a Monte Carlo dose calculation algorithm for electron beam treatment planning in homogeneous phantoms

    International Nuclear Information System (INIS)

    Wieslander, Elinore; Knoeoes, Tommy

    2006-01-01

    By introducing Monte Carlo (MC) techniques to the verification procedure of dose calculation algorithms in treatment planning systems (TPSs), problems associated with conventional measurements can be avoided and properties that are considered unmeasurable can be studied. The aim of the study is to implement a virtual accelerator, based on MC simulations, to evaluate the performance of a dose calculation algorithm for electron beams in a commercial TPS. The TPS algorithm is MC based and the virtual accelerator is used to study the accuracy of the algorithm in water phantoms. The basic test of the implementation of the virtual accelerator is successful for 6 and 12 MeV (γ < 1.0, 0.02 Gy/2 mm). For 18 MeV, there are problems in the profile data for some of the applicators, where the TPS underestimates the dose. For fields equipped with patient-specific inserts, the agreement is generally good. The exception is 6 MeV where there are slightly larger deviations. The concept of the virtual accelerator is shown to be feasible and has the potential to be a powerful tool for vendors and users

  17. Laser-based measuring equipment controlled by microcomputer

    International Nuclear Information System (INIS)

    Miron, N.; Sporea, D.; Velculescu, V.G.; Petre, M.

    1988-03-01

    Several items of laser-based measuring equipment controlled by microcomputer, developed for industrial and scientific purposes, are described. These instruments are intended for the verification of dial indicators, the measurement of graduated rules, and very accurate measurement of the gravitational constant. (authors)

  18. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

    Full Text Available The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the generation of the value of a function and its gradient at a given point, and of interval estimates of the function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.

  19. Experimental verification of a GPC-LPV method with RLS and P1-TS fuzzy-based estimation for limiting the transient and residual vibration of a crane system

    Science.gov (United States)

    Smoczek, Jaroslaw

    2015-10-01

    The paper deals with the problem of reducing the residual vibration and limiting the transient oscillations of a flexible and underactuated system under varying operating conditions. A comparative study of generalized predictive control (GPC) and a fuzzy scheduling scheme developed based on the P1-TS fuzzy theory, the local pole placement method, and interval analysis of closed-loop system polynomial coefficients is applied to the problem of flexible crane control. Two alternatives of a GPC-based method are proposed that enable this technique to be realized either with or without a sensor of payload deflection. The first control technique is based on the recursive least squares (RLS) method applied to on-line estimation of the parameters of a linear parameter varying (LPV) model of the crane dynamic system. The second GPC-based approach relies on a payload deflection feedback estimated using a pendulum model whose parameters are interpolated using the P1-TS fuzzy system. The feasibility and applicability of the developed methods were confirmed through experimental verification performed on a laboratory-scale overhead crane.
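
    As an illustration of the on-line estimation step, the sketch below implements a standard recursive least squares (RLS) update with a forgetting factor for an ARX-type model. The model order, forgetting factor, and data are assumptions; this is not the authors' crane-specific implementation.

```python
# Minimal recursive least squares (RLS) sketch with a forgetting factor,
# as commonly used to track the parameters of a linear (ARX-type) model
# on-line. Model structure and data are illustrative only.
import numpy as np

class RLS:
    def __init__(self, n_params, lam=0.98, delta=100.0):
        self.theta = np.zeros(n_params)          # parameter estimates
        self.P = delta * np.eye(n_params)        # covariance matrix
        self.lam = lam                           # forgetting factor

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        k = self.P @ phi / (self.lam + phi @ self.P @ phi)   # gain vector
        err = y - phi @ self.theta                           # prediction error
        self.theta = self.theta + k * err
        self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
        return self.theta

# Example: track y[t] = a*y[t-1] + b*u[t-1] from noisy measurements.
rls = RLS(n_params=2)
y_prev, a, b = 0.0, 0.9, 0.5
rng = np.random.default_rng(0)
for t in range(200):
    u = np.sin(0.1 * t)
    y = a * y_prev + b * u + 0.01 * rng.standard_normal()
    rls.update([y_prev, u], y)
    y_prev = y
print("estimated [a, b]:", np.round(rls.theta, 3))
```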

  20. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  1. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  2. Guidelines for the verification and validation of expert system software and conventional software: Evaluation of knowledge base certification methods. Volume 4

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This report presents the results of the Knowledge Base Certification activity of the expert systems verification and validation (V&V) guideline development project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was that of a behavioral experiment, the first known such evaluation of any type of V&V activity. The value of such experimentation is its capability to provide empirical evidence for -- or against -- the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG&G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V&V methods selected as being the best and most appropriate for study on the basis of prior evaluation activities. These methods either involved the analysis and tracing of requirements to elements in the knowledge base (requirements grouping and requirements tracing) or else involved direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V&V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases

  3. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  4. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  5. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  6. Fingerprint Identification Using SIFT-Based Minutia Descriptors and Improved All Descriptor-Pair Matching

    Directory of Open Access Journals (Sweden)

    Jiuqiang Han

    2013-03-01

    Full Text Available The performance of conventional minutiae-based fingerprint authentication algorithms degrades significantly when dealing with low-quality fingerprints with many cuts or scratches. A similar degradation of the minutiae-based algorithms is observed when only small overlapping areas are available because of the narrow width of the sensors. Based on the detection of minutiae, Scale Invariant Feature Transformation (SIFT) descriptors are employed to fulfill verification tasks in the above difficult scenarios. However, the original SIFT algorithm is not suitable for fingerprints because of (1) the similar patterns of parallel ridges and (2) high computational resource consumption. To enhance the efficiency and effectiveness of the algorithm for fingerprint verification, we propose a SIFT-based Minutia Descriptor (SMD) that improves the SIFT algorithm through image processing, descriptor extraction and matching. A two-step fast matcher, named improved All Descriptor-Pair Matching (iADM), is also proposed to implement the 1:N verifications in real time. Fingerprint Identification using SMD and iADM (FISiA) achieved a significant improvement in accuracy on representative databases compared with the conventional minutiae-based method. The speed of FISiA also meets real-time requirements.
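
    For context, the snippet below shows the standard SIFT detect-describe-match pipeline with a ratio test using OpenCV. It is a baseline illustration of descriptor-pair matching, not the SMD/iADM method proposed in the record, and the image file names and acceptance threshold are placeholders.

```python
# Baseline SIFT descriptor-pair matching with a ratio test (OpenCV).
# This illustrates plain descriptor matching, not the SMD/iADM variants.
import cv2

img1 = cv2.imread("probe_fingerprint.png", cv2.IMREAD_GRAYSCALE)    # placeholder path
img2 = cv2.imread("gallery_fingerprint.png", cv2.IMREAD_GRAYSCALE)  # placeholder path

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match each probe descriptor against the two nearest gallery descriptors
# and keep only matches that pass the ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]

# A simple decision rule: accept if enough descriptor pairs agree.
print(f"{len(good)} good matches")
print("verified" if len(good) >= 20 else "rejected")  # threshold is illustrative
```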

  7. Multimodal Personal Verification Using Likelihood Ratio for the Match Score Fusion

    Directory of Open Access Journals (Sweden)

    Long Binh Tran

    2017-01-01

    Full Text Available In this paper, the authors present a novel personal verification system based on the likelihood ratio test for the fusion of match scores from multiple biometric matchers (face, fingerprint, hand shape, and palm print). In the proposed system, multimodal features are extracted by Zernike Moments (ZM). After matching, the match scores from the multiple biometric matchers are fused based on the likelihood ratio test. A finite Gaussian mixture model (GMM) is used for estimating the genuine and impostor densities of the match scores for personal verification. Our approach is also compared with well-known approaches such as the support vector machine and the sum rule with min-max normalization. The experimental results confirm that the proposed system achieves excellent identification performance, with higher accuracy than these well-known approaches, and thus can be utilized in further applications related to person verification.
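
    The likelihood-ratio fusion step can be sketched as follows: fit one Gaussian mixture to genuine score vectors and one to impostor score vectors, then fuse a new multi-matcher score vector by the ratio of the two densities. The use of scikit-learn's GaussianMixture and the synthetic scores are assumptions for illustration.

```python
# Minimal sketch of likelihood-ratio fusion of multi-matcher scores using
# Gaussian mixture density estimates (scikit-learn); data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 4-matcher score vectors (face, fingerprint, hand shape, palm print).
genuine = rng.normal(loc=0.8, scale=0.1, size=(500, 4))
impostor = rng.normal(loc=0.4, scale=0.15, size=(500, 4))

gmm_gen = GaussianMixture(n_components=2, random_state=0).fit(genuine)
gmm_imp = GaussianMixture(n_components=2, random_state=0).fit(impostor)

def fused_log_lr(scores):
    """Log likelihood ratio of a score vector under genuine vs impostor models."""
    s = np.atleast_2d(scores)
    return gmm_gen.score_samples(s) - gmm_imp.score_samples(s)

probe = [0.75, 0.82, 0.70, 0.78]
log_lr = fused_log_lr(probe)[0]
print("accept" if log_lr > 0.0 else "reject", f"(log-LR = {log_lr:.2f})")
```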

  8. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  9. ATLANTIDES: An Architecture for Alert Verification in Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Crispo, Bruno; Etalle, Sandro

    2007-01-01

    We present an architecture designed for alert verification (i.e., to reduce false positives) in network intrusion-detection systems. Our technique is based on a systematic (and automatic) anomaly-based analysis of the system output, which provides useful context information regarding the network

  10. Design and implementation of embedded hardware accelerator for diagnosing HDL-CODE in assertion-based verification environment

    Directory of Open Access Journals (Sweden)

    C. U. Ngene

    2013-08-01

    Full Text Available The use of assertions for monitoring the designer's intentions in a hardware description language (HDL) model is gaining popularity, as it helps the designer to observe internal errors at the output ports of the device under verification. During verification, assertions are synthesised and the generated data are represented in tabular form. The amount of data generated can be enormous depending on the size of the code and the number of modules that constitute the code. Furthermore, manually inspecting these data and diagnosing the module with a functional violation is a time-consuming process which negatively affects the overall product development time. To locate the module with a functional violation within an acceptable diagnostic time, the data processing and analysis procedure must be accelerated. In this paper a multi-array processor (hardware accelerator) was designed and implemented in a Virtex6 field programmable gate array (FPGA), and it can be integrated into the verification environment. The design was captured in very high speed integrated circuit HDL (VHDL), synthesised with the Xilinx design suite ISE 13.1 and simulated with Xilinx ISIM. The multi-array processor (MAP) executes three logical operations (AND, OR, XOR) and a one's compaction operation on arrays of data in parallel. An improvement in processing and analysis time was recorded, as compared with the manual procedure, after the multi-array processor was integrated into the verification environment. It was also found that the multi-array processor, which was developed as an Intellectual Property (IP) core, can also be used in applications where output responses and a golden model represented in the form of matrices can be compared for searching, recognition and decision-making.

  11. EOPs at NPP Temelin: Analytical support for EOPs verification

    International Nuclear Information System (INIS)

    Bica, M.

    1999-01-01

    The process of implementation of symptom-based emergency operating procedures (EOPs) started at the NPP Temelin in 1993. The process has the following phases: development of symptom-based EOPs; EOPs verification; EOPs validation; operating personnel training; EOPs control and experience feedback. The development of Temelin-specific EOPs was based on technology and know-how transfer using the Emergency Response Guidelines Methodology developed by the Westinghouse Owner Group. In this lecture the implementation of symptom-based EOPs at the NPP Temelin is described

  12. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    Science.gov (United States)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), in use at MeteoSwiss, are confronted with the crowd-sourced data. The available data and the investigation period extend from June to August 2015. Filter criteria have been applied to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties resulting from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification to be used. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and the two radar-based hail detection algorithms has been investigated.
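
    The categorical scores mentioned above follow directly from the 2x2 contingency table of hits, misses, false alarms, and correct negatives; the sketch below computes a few standard ones (hit rate, false alarm ratio, critical success index) from binary report/detection sequences. The example data are made up.

```python
# Categorical verification scores from a 2x2 contingency table built from
# binary "hail observed" (crowd reports) vs "hail detected" (radar) sequences.
# The example data are made up.
import numpy as np

def contingency_scores(observed, detected):
    observed = np.asarray(observed, dtype=bool)
    detected = np.asarray(detected, dtype=bool)
    hits = np.sum(detected & observed)
    misses = np.sum(~detected & observed)
    false_alarms = np.sum(detected & ~observed)
    return {
        "hit_rate": hits / (hits + misses) if hits + misses else np.nan,    # POD
        "false_alarm_ratio": false_alarms / (hits + false_alarms)
                             if hits + false_alarms else np.nan,            # FAR
        "csi": hits / (hits + misses + false_alarms)
               if hits + misses + false_alarms else np.nan,                 # CSI
    }

obs = [1, 1, 0, 0, 1, 0, 1, 0]   # crowd-sourced hail reports (binary)
det = [1, 0, 0, 1, 1, 0, 1, 0]   # radar-based detections (binary)
print(contingency_scores(obs, det))
```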

  13. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

    Full Text Available The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominantly used tool to validate a design in industry. Formal verification overcomes the weakness of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses upon a technique that integrates the best characteristics of both simulation and formal verification methods to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty in this approach consists of three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioning decompose a large design into sub-components and feed the sub-components to different verification and/or simulation tools based upon the known strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of design validation and improves the coverage by further partitioning. Various simulation and verification tools comprising IDV are evaluated and an example is used to illustrate the overall validation process. The overall process successfully validates the example to a high coverage rate within a short time. The experimental results show that our approach is a very promising design validation method.

  14. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking appears to be an appropriate complementary method. However, it is not common to use model checking in industry yet, as this method needs typically formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  15. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
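
    The invariant-checking idea can be illustrated with a small inductiveness check: given a candidate loop invariant, ask a constraint solver whether one loop iteration can violate it. The sketch below uses the Z3 SMT solver on a toy summation loop; it is only an analogy to the Java PathFinder-based framework described in the record.

```python
# Minimal sketch of checking that a candidate loop invariant is inductive,
# using the Z3 SMT solver (not the Java PathFinder-based framework from the
# record). Loop: i = 0; s = 0; while i < n: s += i; i += 1
# Candidate invariant: 0 <= i <= n and s == i*(i-1)/2, encoded as 2*s == i*(i-1).
from z3 import Ints, Solver, And, Implies, Not, unsat

i, s, n, i1, s1 = Ints("i s n i1 s1")
inv = lambda i, s: And(0 <= i, i <= n, 2 * s == i * (i - 1))

solver = Solver()
# Inductive step: invariant plus loop guard plus one iteration must
# re-establish the invariant. We ask Z3 for a counterexample; 'unsat'
# means none exists, so the candidate is inductive.
solver.add(Not(Implies(And(inv(i, s), i < n, i1 == i + 1, s1 == s + i),
                       inv(i1, s1))))
print("inductive" if solver.check() == unsat else "not inductive")
```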

  16. Calibration and verification of surface contamination meters --- Procedures and techniques

    International Nuclear Information System (INIS)

    Schuler, C; Butterweck, G.; Wernli, C.; Bochud, F.; Valley, J.-F.

    2007-03-01

    A standardised measurement procedure for surface contamination meters (SCM) is presented. The procedure aims at rendering surface contamination measurements simply and safely interpretable. Essential for the approach is the introduction and common use of the radionuclide-specific quantity 'guideline value', specified in the Swiss Radiation Protection Ordinance, as the unit for the measurement of surface activity. The corresponding radionuclide-specific 'guideline value count rate' can be summarized as a verification reference value for a group of radionuclides ('basis guideline value count rate'). The concept can be generalized for SCM of the same type or for SCM of different types using the same principle of detection. An SCM multi-source calibration technique is applied for the determination of the instrument efficiency. Four different electron radiation energy regions, four different photon radiation energy regions and an alpha radiation energy region are represented by a set of calibration sources built according to ISO standard 8769-2. A guideline value count rate, representing the activity per unit area of a surface contamination of one guideline value, can be calculated for any radionuclide using the instrument efficiency, radionuclide decay data, contamination source efficiency, guideline value averaging area (100 cm²), and the radionuclide-specific guideline value. In this way, instrument responses for the evaluation of surface contaminations are obtained for radionuclides without available calibration sources as well as for short-lived radionuclides, for which the continuous replacement of certified calibration sources can lead to unreasonable costs. SCM verification is based on surface emission rates of reference sources with an active area of 100 cm². The verification for a given list of radionuclides is based on the radionuclide-specific quantity guideline value count rate. Guideline value count rates for groups of radionuclides can be represented within the maximum
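
    A plausible reading of the guideline value count rate calculation is the multiplicative relation sketched below (guideline value x averaging area x emission probability x source efficiency x instrument efficiency). The exact factors and decay-data handling in the report may differ, and the numbers used here are illustrative, not values from the Ordinance.

```python
# Hedged sketch of a guideline value count rate calculation for a surface
# contamination meter (SCM). The multiplicative relation and all numerical
# values are illustrative assumptions, not figures from the report or the
# Swiss Radiation Protection Ordinance.
def guideline_value_count_rate(guideline_value_bq_per_cm2,
                               averaging_area_cm2,
                               emission_probability,
                               source_efficiency,
                               instrument_efficiency):
    """Count rate (s^-1) expected from a contamination of one guideline value."""
    activity = guideline_value_bq_per_cm2 * averaging_area_cm2            # Bq
    emission_rate = activity * emission_probability * source_efficiency   # s^-1
    return emission_rate * instrument_efficiency                          # counts s^-1

# Illustrative example for a hypothetical beta emitter.
n_gv = guideline_value_count_rate(guideline_value_bq_per_cm2=3.0,
                                  averaging_area_cm2=100.0,
                                  emission_probability=1.0,
                                  source_efficiency=0.5,
                                  instrument_efficiency=0.3)
print(f"guideline value count rate: {n_gv:.0f} s^-1")
```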

  17. Design and verification of computer-based reactor control system modification at Bruce-A candu nuclear generating station

    International Nuclear Information System (INIS)

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  18. GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES

    Science.gov (United States)

    This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...

  19. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe

  20. Augmented reality-assisted skull base surgery.

    Science.gov (United States)

    Cabrilo, I; Sarrafzadeh, A; Bijlenga, P; Landis, B N; Schaller, K

    2014-12-01

    Neuronavigation is widely considered as a valuable tool during skull base surgery. Advances in neuronavigation technology, with the integration of augmented reality, present advantages over traditional point-based neuronavigation. However, this development has not yet made its way into routine surgical practice, possibly due to a lack of acquaintance with these systems. In this report, we illustrate the usefulness and easy application of augmented reality-based neuronavigation through a case example of a patient with a clivus chordoma. We also demonstrate how augmented reality can help throughout all phases of a skull base procedure, from the verification of neuronavigation accuracy to intraoperative image-guidance. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  1. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that, in most vision chip design cycles, extensive efforts are focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  2. KNGR core protection calculator, software, verification and validation plan

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Cheon, Se Woo

    2001-05-01

    This document describes the Software Verification and Validation Plan (SVVP) guidance to be used in reviewing the Software Program Manual (SPM) in Korean Next Generation Reactor (KNGR) projects. It is intended for a verifier or reviewer who is involved in performing software verification and validation task activities in KNGR projects. The document includes the basic philosophy, the performance of the V and V effort, software testing techniques, and criteria for review and audit of the safety software V and V activity. The major review topics on safety software address three kinds of characteristics, based on Standard Review Plan (SRP) Chapter 7, Branch Technical Position (BTP)-14, when reviewing the SVVP: management characteristics, implementation characteristics and resources characteristics. Based on the major topics of this document, we have produced a list of evaluation items, such as the checklist in Appendix A

  3. Verification and validation of software related to nuclear power plant instrumentation and control

    International Nuclear Information System (INIS)

    1999-01-01

    This report is produced in response to a recommendation of the IAEA International Working Group on Nuclear Power Plant Control and Instrumentation. Its objectives are to provide practical guidance on the methods available for verification of software and validation of computer based systems, and on how and when these methods can be effectively applied. It is meant for those who are in any way involved with the development, implementation, maintenance and use of software and computer based instrumentation and control systems in nuclear power plants. The report is intended to be used by designers, software producers, reviewers, verification and validation teams, assessors, plant operators and licensers of computer based systems

  4. Shape-based hand recognition approach using the morphological pattern spectrum

    Science.gov (United States)

    Ramirez-Cortes, Juan Manuel; Gomez-Gil, Pilar; Sanchez-Perez, Gabriel; Prieto-Castro, Cesar

    2009-01-01

    We propose the use of the morphological pattern spectrum, or pecstrum, as the basis of a biometric shape-based hand recognition system. The system receives an image of the right hand of a subject in an unconstrained pose, which is captured with a commercial flatbed scanner. Owing to the pecstrum's invariance to translation and rotation, the system does not require the use of pegs for a fixed hand position, which simplifies the image acquisition process. This novel feature-extraction method is tested using a Euclidean distance classifier for identification and verification cases, obtaining 97% correct identification and an equal error rate (EER) of 0.0285 (2.85%) for the verification mode. The obtained results indicate that the pattern spectrum represents a good feature-extraction alternative for low- and medium-level hand-shape-based biometric applications.
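
    The pattern spectrum itself can be computed as the area removed by successive morphological openings with structuring elements of increasing size; the sketch below does this with scikit-image on a binary silhouette. The disk-shaped structuring elements, the normalization, and the toy shape are illustrative assumptions.

```python
# Minimal sketch of a morphological pattern spectrum (pecstrum) for a binary
# shape: the area removed by openings with structuring elements of growing
# size. Disk-shaped elements and normalization are illustrative choices.
import numpy as np
from skimage.morphology import binary_opening, disk

def pattern_spectrum(binary_img, max_radius=10):
    binary_img = binary_img.astype(bool)
    areas = [binary_img.sum()]
    for r in range(1, max_radius + 1):
        areas.append(binary_opening(binary_img, disk(r)).sum())
    # pecstrum[r] = area lost between opening radii r-1 and r, normalized
    # by the total shape area so the components sum to at most 1.
    diffs = -np.diff(np.asarray(areas, dtype=float))
    return diffs / areas[0]

# Toy example: a filled square "shape" instead of a scanned hand silhouette.
shape = np.zeros((64, 64), dtype=bool)
shape[16:48, 16:48] = True
print(np.round(pattern_spectrum(shape, max_radius=5), 3))
```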

  5. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    Science.gov (United States)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.

  6. MRI-based treatment planning for radiotherapy: Dosimetric verification for prostate IMRT

    International Nuclear Information System (INIS)

    Chen, Lili; Price, Robert A.; Wang Lu; Li Jinsheng; Qin Lihong; McNeeley, Shawn; Ma, C.-M. Charlie; Freedman, Gary M.; Pollack, Alan

    2004-01-01

    Purpose: Magnetic resonance (MR) and computed tomography (CT) image fusion with CT-based dose calculation is the gold standard for prostate treatment planning. MR and CT fusion with CT-based dose calculation has become a routine procedure for intensity-modulated radiation therapy (IMRT) treatment planning at Fox Chase Cancer Center. The use of MRI alone for treatment planning (or MRI simulation) will remove any errors associated with image fusion. Furthermore, it will reduce treatment cost by avoiding redundant CT scans and save patient, staff, and machine time. The purpose of this study is to investigate the dosimetric accuracy of MRI-based treatment planning for prostate IMRT. Methods and materials: A total of 30 IMRT plans for 15 patients were generated using both MRI and CT data. The MRI distortion was corrected using gradient distortion correction (GDC) software provided by the vendor (Philips Medical System, Cleveland, OH). The same internal contours were used for the paired plans. The external contours were drawn separately between CT-based and MR imaging-based plans to evaluate the effect of any residual distortions on dosimetric accuracy. The same energy, beam angles, dose constraints, and optimization parameters were used for dose calculations for each paired plan using a treatment optimization system. The resulting plans were compared in terms of isodose distributions and dose-volume histograms (DVHs). Hybrid phantom plans were generated for both the CT-based plans and the MR-based plans using the same leaf sequences and associated monitor units (MU). The physical phantom was then irradiated using the same leaf sequences to verify the dosimetry accuracy of the treatment plans. Results: Our results show that dose distributions between CT-based and MRI-based plans were equally acceptable based on our clinical criteria. The absolute dose agreement for the planning target volume was within 2% between CT-based and MR-based plans and 3% between measured dose

  7. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  8. Signature-based store checking buffer

    Science.gov (United States)

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify that the content is error-free.
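
    A software analogue of the idea may help: the sketch below folds the outputs of two redundant runs of a computation into running signatures and compares only the signatures at a barrier. The use of SHA-256 and the toy computation are assumptions for illustration; the record does not specify the hardware's signature function.

        import hashlib

        def run_with_fingerprint(compute, inputs):
            """Run one redundant instance and fold every output it stores
            into a running signature (stand-in for the store fingerprint buffer)."""
            fingerprint = hashlib.sha256()
            for x in inputs:
                out = compute(x)
                fingerprint.update(repr(out).encode())   # fold the stored value in
            return fingerprint.hexdigest()

        def compute(x):                  # the duplicated computation (illustrative)
            return 3 * x + 1

        inputs = range(1000)
        sig_a = run_with_fingerprint(compute, inputs)    # instance A
        sig_b = run_with_fingerprint(compute, inputs)    # instance B

        # Barrier: instead of comparing every stored value, compare the two signatures.
        assert sig_a == sig_b, "redundant outputs disagree - possible soft error"
        print("outputs verified error-free at the barrier")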

  9. MR image-guided portal verification for brain treatment field

    International Nuclear Information System (INIS)

    Yin Fangfang; Gao Qinghuai; Xie Huchen; Nelson, Diana F.; Yu Yan; Kwok, W. Edmund; Totterman, Saara; Schell, Michael C.; Rubin, Philip

    1998-01-01

    Purpose: To investigate a method for the generation of digitally reconstructed radiographs directly from MR images (DRR-MRI) to guide a computerized portal verification procedure. Methods and Materials: Several major steps were developed to perform an MR image-guided portal verification procedure. Initially, a wavelet-based multiresolution adaptive thresholding method was used to segment the skin slice-by-slice in MR brain axial images. Some selected anatomical structures, such as target volume and critical organs, were then manually identified and were reassigned to relatively higher intensities. Interslice information was interpolated with a directional method to achieve comparable display resolution in three dimensions. Next, a ray-tracing method was used to generate a DRR-MRI image at the planned treatment position, and the ray tracing was performed simply by summation of voxels along the ray. The skin and its relative positions were also projected to the DRR-MRI and were used to guide the search of similar features in the portal image. A Canny edge detector was used to enhance the brain contour in both portal and simulation images. The skin in the brain portal image was then extracted using a knowledge-based searching technique. Finally, a Chamfer matching technique was used to correlate features between DRR-MRI and portal image. Results: The MR image-guided portal verification method was evaluated using a brain phantom case and a clinical patient case. Both DRR-CT and DRR-MRI were generated using CT and MR phantom images with the same beam orientation and then compared. The matching result indicated that the maximum deviation of internal structures was less than 1 mm. The segmented results for brain MR slice images indicated that a wavelet-based image segmentation technique provided a reasonable estimate of the brain skin. For the clinical patient case with a given portal field, the MR image-guided verification method provided an excellent match between
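
    The "summation of voxels along the ray" step can be illustrated compactly. The sketch below builds a parallel-projection DRR from a synthetic volume by summing intensities along one axis; the divergent-beam geometry, interpolation and segmentation steps of the actual procedure are deliberately omitted, and the synthetic volume is an assumption.

        import numpy as np

        # Synthetic "MR" volume: a bright ellipsoid (stand-in for segmented anatomy)
        z, y, x = np.mgrid[-32:32, -32:32, -32:32]
        volume = ((x / 30.0) ** 2 + (y / 25.0) ** 2 + (z / 20.0) ** 2 < 1.0).astype(float)

        # Reassign a selected structure (a small inner sphere) to a higher intensity,
        # mimicking the manual enhancement of target volume / critical organs.
        volume[(x ** 2 + y ** 2 + z ** 2) < 8 ** 2] = 3.0

        # Parallel-projection DRR: sum voxel intensities along the beam axis (here z).
        drr = volume.sum(axis=0)

        print(drr.shape)          # (64, 64) projection image
        print(drr.max())          # largest ray sum, through the enhanced structure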

  10. From Wireless Sensor Networks to Wireless Body Area Networks: Formal Modeling and Verification on Security Using PAT

    Directory of Open Access Journals (Sweden)

    Tieming Chen

    2016-01-01

    Model checking has successfully been applied to the verification of security protocols, but the modeling process is always tedious and proficient knowledge of formal methods is also needed, although the final verification could be automatic depending on specific tools. At the same time, due to the appearance of novel kinds of networks, such as wireless sensor networks (WSN) and wireless body area networks (WBAN), formal modeling and verification for these domain-specific systems are quite challenging. In this paper, a specific and novel formal modeling and verification method is proposed and implemented using an expandable tool called PAT to do WSN-specific security verification. At first, an abstract modeling data structure for CSP#, which is built in PAT, is developed to support the node-mobility-related specification for modeling location-based node activity. Then, the traditional Dolev-Yao model is redefined to facilitate modeling of location-specific attack behaviors on the security mechanism. A thorough formal verification application on a location-based security protocol in WSN is described in detail to show the usability and effectiveness of the proposed methodology. Furthermore, a novel location-based authentication security protocol in WBAN can also be successfully modeled and verified directly using our method, which is, to the best of our knowledge, the first effort on employing model checking for automatic analysis of authentication protocols for WBAN.

  11. Verification of structural analysis computer codes in nuclear engineering

    International Nuclear Information System (INIS)

    Zebeljan, Dj.; Cizelj, L.

    1990-01-01

    Sources of potential errors that can arise during the use of finite element method based computer programs are described in the paper. Acceptable error magnitudes were defined as acceptance criteria for those programs. Error sources are described as they are treated by the National Agency for Finite Element Methods and Standards (NAFEMS). Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). An example verification is carried out on the PAFEC-FE computer code for seismic response analysis of piping systems by the response spectrum method. (author)

  12. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Science.gov (United States)

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  13. Learning a Genetic Measure for Kinship Verification Using Facial Images

    Directory of Open Access Journals (Sweden)

    Lu Kou

    2015-01-01

    Motivated by the key observation that children generally resemble their parents more than other persons with respect to facial appearance, distance metric (similarity) learning has been the dominant choice for state-of-the-art kinship verification via facial images in the wild. Most existing learning-based approaches to kinship verification, however, are focused on learning a genetic similarity measure in a batch learning manner, leading to less scalability for practical applications with an ever-growing amount of data. To address this, we propose a new kinship verification approach by learning a sparse similarity measure in an online fashion. Experimental results on the kinship datasets show that our approach is highly competitive with the state-of-the-art alternatives in terms of verification accuracy, yet it is superior in terms of scalability for practical applications.

  14. Using Model Replication to Improve the Reliability of Agent-Based Models

    Science.gov (United States)

    Zhong, Wei; Kim, Yushim

    The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the community of artificial society and simulation due to the challenges of model verification and validation. Illustrating the replication in NetLogo, by a different author, of an ABM representing fraudulent behavior in a public service delivery system originally developed in the Java-based MASON toolkit, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, it helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.

  15. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies (high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy) are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime

  16. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions

    International Nuclear Information System (INIS)

    Woodruff, Henry C.; Fuangrod, Todsaporn; Van Uytven, Eric; McCurdy, Boyd M.C.; Beek, Timothy van; Bhatia, Shashank; Greer, Peter B.

    2015-01-01

    Purpose: Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. Methods and Materials: At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. Results: The system reported in real time a mean ± standard deviation χ pass rate of 91.1% ± 11.5% (83.6% ± 13.2%) for cumulative-frame χ analysis with 4%/4 mm (3%/3 mm) criteria, global over the integrated image. Conclusions: A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy.

  17. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, Henry C., E-mail: henry.woodruff@newcastle.edu.au [Faculty of Science and Information Technology, School of Mathematical and Physical Sciences, University of Newcastle, New South Wales (Australia); Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, University of Newcastle, New South Wales (Australia); Van Uytven, Eric; McCurdy, Boyd M.C.; Beek, Timothy van [Division of Medical Physics, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba (Canada); Bhatia, Shashank [Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales (Australia); Greer, Peter B. [Faculty of Science and Information Technology, School of Mathematical and Physical Sciences, University of Newcastle, New South Wales (Australia); Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales (Australia)

    2015-11-01

    Purpose: Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. Methods and Materials: At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. Results: The system reported in real time a mean ± standard deviation χ pass rate of 91.1% ± 11.5% (83.6% ± 13.2%) for cumulative-frame χ analysis with 4%/4 mm (3%/3 mm) criteria, global over the integrated image. Conclusions: A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy.

  18. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Background: Software reliability is of great importance for the development of embedded systems that are often used in applications that have requirements for safety. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice among existing open-source model verification engines, the MODUS toolset produces inputs to be fed into these engines for model verification. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project

  19. A verification regime for the spatial discretization of the SN transport equations

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, S.; Azmy, Y. [North Carolina State Univ., Dept. of Nuclear Engineering, 2500 Stinson Drive, Raleigh, NC 27695 (United States)

    2012-07-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure including the order-of-accuracy test on a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified SN code and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type we record the stage of the verification procedure where the error is detected and report the frequency with which the errors are correctly identified at various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine the reason why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes, but rarely (1.44% of the time) and under certain circumstances it can fail. (authors)
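
    The arithmetic at the heart of the order-of-accuracy test is short: evaluate the discretization error against the manufactured solution on two grid levels and compare the observed order with the formal order. A minimal sketch with made-up error norms (not values from the paper):

        import math

        def observed_order(err_coarse, err_fine, refinement_ratio):
            """Observed order of accuracy from error norms on two grid levels."""
            return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

        # Hypothetical discretization-error norms against a manufactured solution,
        # on grids refined by a factor of 2.
        e_h, e_h2 = 4.1e-3, 1.05e-3
        p = observed_order(e_h, e_h2, 2.0)
        print(f"observed order = {p:.2f}")   # ~1.97; compare with the formal order

        # A coding mistake typically degrades p well below the formal order,
        # which is what the verification procedure looks for.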

  20. An Efficient Topology-Based Algorithm for Transient Analysis of Power Grid

    KAUST Repository

    Yang, Lan

    2015-08-10

    In the design flow of integrated circuits, chip-level verification is an important step that sanity-checks that the performance is as expected. Power grid verification is one of the most expensive and time-consuming steps of chip-level verification, due to its extremely large size. Efficient power grid analysis technology is highly demanded as it saves computing resources and enables faster iteration. In this paper, a topology-based power grid transient analysis algorithm is proposed. Nodal analysis is adopted to analyze the topology, which is mathematically equivalent to iteratively solving a positive semi-definite linear system. The convergence of the method is proved.
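
    Because nodal analysis reduces each transient time step to a symmetric positive (semi-)definite linear system, an iterative Krylov solver is a natural kernel for such an analysis. The sketch below is only that linear-algebra kernel on a tiny stand-in conductance matrix; it is not the authors' topology-based algorithm.

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
            """Solve A x = b for a symmetric positive definite matrix A."""
            x = np.zeros_like(b)
            r = b - A @ x
            p = r.copy()
            rs = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p
                rs = rs_new
            return x

        # Tiny stand-in nodal conductance matrix (diagonally dominant, hence SPD)
        G = np.array([[ 3.0, -1.0, -1.0],
                      [-1.0,  3.0, -1.0],
                      [-1.0, -1.0,  3.0]])
        i = np.array([1.0, 0.0, -0.5])       # injected currents at the nodes
        v = conjugate_gradient(G, i)         # node voltages for this time step
        print(v, np.allclose(G @ v, i))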

  1. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for the verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models, where a combination of the models is also taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.)

  2. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  3. SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log- Based MLC Position

    Energy Technology Data Exchange (ETDEWEB)

    Neal, B; Ahmed, M; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of an MLC leaf can suddenly deviate from its programmed and logged position by >1 mm as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images and log files were analyzed for the treatment in question, the prior day’s treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial to the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file-derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions against the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.

  4. A hydrochemical data base for the Hanford Site, Washington

    International Nuclear Information System (INIS)

    Early, T.O.; Mudd, R.D.; Spice, G.D.; Starr, D.L.

    1985-02-01

    This data package contains a complete listing of the Site Hydrochemical Data Base for water samples associated with the Basalt Waste Isolation Project (BWIP). In addition to the detailed chemical analyses, the package contains a summary description of the data base format, detailed descriptions of the verification procedures used to check data entries, and detailed descriptions of the validation procedures used to evaluate data quality

  5. Verifying real-time systems against scenario-based requirements

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Li, Shuhao; Nielsen, Brian

    2009-01-01

    We propose an approach to automatic verification of real-time systems against scenario-based requirements. A real-time system is modeled as a network of Timed Automata (TA), and a scenario-based requirement is specified as a Live Sequence Chart (LSC). We define a trace-based semantics for a kernel subset of the LSC language. By equivalently translating an LSC chart into an observer TA and then non-intrusively composing this observer with the original system model, the problem of verifying a real-time system against a scenario-based requirement reduces to a classical real-time model checking...

  6. Dense time discretization technique for verification of real time systems

    International Nuclear Information System (INIS)

    Makackas, Dalius; Miseviciene, Regina

    2016-01-01

    When verifying a real-time system, there are two different models of time: discrete and dense time based models. This paper presents a novel verification technique that calculates discrete time intervals from dense time in order to create all the system states that can be reached from the initial system state. The technique is designed for real-time systems specified by a piece-linear aggregate approach. Key words: real-time system, dense time, verification, model checking, piece-linear aggregate

  7. Knowledge based management of technical specifications

    International Nuclear Information System (INIS)

    Fiedler, U.; Schalm, S.; Pranckeviciute, K.

    1992-01-01

    TechSPEX is a knowledge based advisory system for checking the status of a nuclear plant for compliance with the safety limits and the limiting conditions of operation. These prescriptions for safe reactor operation exist as textual information. For the purpose of its operational use, an explicit representation formalism is introduced. On this basis, various approaches to text retrieval are realized, and condition-based surveillance and control are supported too. Knowledge editing and verification modules ease the adaptation to changing requirements. TechSPEX has been implemented in PROLOG. (author). 6 refs, 3 figs

  8. A Secure Framework for Location Verification in Pervasive Computing

    Science.gov (United States)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The way people use computing devices has been changed in some ways by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at anytime and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  9. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object-oriented methods ... can be integrated in computer-aided software engineering (CASE) tools for adding formally supported checking, transformation and generation facilities.

  10. Verification of cloud cover forecast with INSAT observation over ...

    Indian Academy of Sciences (India)

    and recent trends in forecast quality, improving ... general framework for forecast verification based on the joint ... clouds is given by POD, and FAR offers a metric for how often the .... Kain J S and Fritsch J M 1993 Convective parameterizations.
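
    POD and FAR, mentioned in the fragment above, are the standard categorical scores computed from a 2x2 contingency table of forecast versus observation. A minimal sketch with made-up counts (the definitions are standard; the numbers are not from the record):

        def pod(hits, misses):
            """Probability of detection: fraction of observed events that were forecast."""
            return hits / (hits + misses)

        def far(hits, false_alarms):
            """False alarm ratio: fraction of forecast events that did not occur."""
            return false_alarms / (hits + false_alarms)

        # Hypothetical cloud-cover contingency counts (forecast vs satellite observation)
        hits, misses, false_alarms = 820, 140, 215
        print(f"POD = {pod(hits, misses):.2f}")            # 0.85
        print(f"FAR = {far(hits, false_alarms):.2f}")      # 0.21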

  11. A SAT-Based Algorithm for Reparameterization in Symbolic Simulation

    National Research Council Canada - National Science Library

    Chauhan, Pankaj; Kroening, Daniel; Clarke, Edmund

    2003-01-01

    Efficient SAT solvers have been applied successfully for many verification problems. This paper presents a novel SAT-based reparameterization algorithm that is largely immune to the large number of input variables that need to be quantified...

  12. The harmonics detection method based on neural network applied ...

    African Journals Online (AJOL)

    Keywords: Artificial Neural Networks (ANN), p-q theory, (SAPF), Harmonics, Total ...

  13. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    Science.gov (United States)

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed and followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems are reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and choice of tolerance limits for IMRT QA are made with focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent, and comparable IMRT QA
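
    The γ metric discussed in the report combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. The sketch below evaluates a global γ on one-dimensional profiles by brute-force search; it is a didactic illustration with made-up profiles, not one of the vendor implementations the report benchmarks.

        import numpy as np

        def gamma_1d(dose_ref, dose_eval, spacing_mm, dd_pct=3.0, dta_mm=3.0):
            """Global 1-D gamma: for each reference point, search all evaluated
            points for the minimum combined dose-difference / distance metric."""
            x = np.arange(len(dose_ref)) * spacing_mm
            dd_crit = dd_pct / 100.0 * dose_ref.max()     # global normalization
            gammas = np.empty_like(dose_ref, dtype=float)
            for i, (xi, di) in enumerate(zip(x, dose_ref)):
                dist2 = ((x - xi) / dta_mm) ** 2
                dose2 = ((dose_eval - di) / dd_crit) ** 2
                gammas[i] = np.sqrt((dist2 + dose2).min())
            return gammas

        # Hypothetical measured vs calculated profiles on a 2 mm grid (doses in cGy)
        ref = np.exp(-np.linspace(-3, 3, 61) ** 2) * 200.0
        eva = np.exp(-(np.linspace(-3, 3, 61) - 0.02) ** 2) * 204.0
        g = gamma_1d(ref, eva, spacing_mm=2.0)
        print(f"gamma pass rate (<=1): {100.0 * np.mean(g <= 1.0):.1f}%")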

  14. BProVe: Tool support for business process verification

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

    This demo introduces BProVe, a tool supporting automated verification of Business Process models. BProVe analysis is based on a formal operational semantics defined for the BPMN 2.0 modelling language, and is provided as a freely accessible service that uses open standard formats as input data...

  15. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  16. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  17. Verification and validation issues for digitally-based NPP safety systems

    International Nuclear Information System (INIS)

    Ets, A.R.

    1993-01-01

    The trend toward standardization, integration and reduced costs has led to increasing use of digital systems in reactor protection systems. While digital systems provide maintenance and performance advantages, their use also introduces new safety issues, in particular with regard to software. Current practice relies on verification and validation (V and V) to ensure the quality of safety software. However, effective V and V must be done in conjunction with a structured software development process and must consider the context of the safety system application. This paper presents some of the issues and concerns that impact on the V and V process. These include documentation of system requirements, common mode failures, hazards analysis and independence. These issues and concerns arose during evaluations of NPP safety systems for advanced reactor designs and digital I and C retrofits for existing nuclear plants in the United States. The pragmatic lessons from actual systems reviews can provide a basis for further refinement and development of guidelines for applying V and V to NPP safety systems. (author). 14 refs

  18. Verification of RRC KI code package for neutronic calculations of WWER core with Gd

    International Nuclear Information System (INIS)

    Aleshin, S.S.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Pavlov, V.I.; Pavlovitchev, A.M.; Sidorenko, V.D.; Tsvetkov, V.M.

    2001-01-01

    The report presented is concerned with verification results of the TVS-M/PERMAK-A/BIPR-7A code package for WWER neutronic calculations as applied to systems containing U-Gd pins. The verification is based on corresponding benchmark calculations, data from critical experiments, and operational data obtained from WWER units with Gd. The comparison results are discussed (Authors)

  19. Research and Implementation of Automatic Fuzzy Garage Parking System Based on FPGA

    Directory of Open Access Journals (Sweden)

    Wang Kaiyu

    2016-01-01

    Reverse parking is a common scenario in everyday driving. This paper presents a fuzzy controller that accommodates front-and-back adjustment of the vehicle's body attitude and uses a chaotic-genetic algorithm to optimize the controller's membership functions, yielding a vertical-parking fuzzy controller with good simulation results. The paper presents a hardware-software embedded design of the system based on a Field-Programmable Gate Array (FPGA) and sets up a 1:10 smart-car verification platform to verify the fuzzy garage parking system with a real car. Verification results show that the system can complete the parking task very well.

  20. Knowledge-based inspection:modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    Increased levels of complexity in almost every discipline and operation today raise the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge into information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering in order to represent concepts or reality, remains an excellent way of converging knowledge from domain experts. The nuclear verification domain is ever more a matter of great importance to world safety and security. Demand for knowledge about nuclear processes and verification activities used to offset potential misuse of nuclear technology will intensify with the growth of the subject technology. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also contributes to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through a meta-model by contributing a formalization language based on Pattern Theory. Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  1. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    Science.gov (United States)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' data collection guidelines and AAPM report recommendations, and were processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations, and were also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
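
    For a simple SAD-type field, an independent MU check of the kind described reduces to dividing the prescribed point dose by the product of the machine calibration and the relevant dosimetric factors. The sketch below shows that arithmetic only; the factor names and numbers are hypothetical placeholders, not the clinic's beam data or the exact in-house algorithm.

        def independent_mu_check(dose_cgy, dose_rate_cgy_per_mu, sc, sp, tpr, wedge=1.0):
            """MU = D / (D'_ref * Sc * Sp * TPR * wedge) for an SAD-type setup."""
            return dose_cgy / (dose_rate_cgy_per_mu * sc * sp * tpr * wedge)

        # Hypothetical single-field check
        mu = independent_mu_check(dose_cgy=200.0,            # prescribed dose to the point
                                  dose_rate_cgy_per_mu=1.0,  # calibration at reference
                                  sc=0.995, sp=1.003,        # collimator / phantom scatter
                                  tpr=0.785)                 # tissue-phantom ratio at depth
        tps_mu = 258.0                                       # value reported by the TPS
        deviation = 100.0 * (mu - tps_mu) / tps_mu
        print(f"independent MU = {mu:.1f}, deviation vs TPS = {deviation:+.1f}%")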

  2. BioTwist : overcoming severe distortions in ridge-based biometrics for succesful identification

    NARCIS (Netherlands)

    Kotzerke, J.

    2016-01-01

    This thesis focuses on ridge-based and highly distorted biometrics, the different challenges involved in a verification of identity scenario, and how to overcome them. More specifically, we work on ridge-based biometrics in two different contexts: (i) newborn and infant biometrics and (ii) quality

  3. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent-fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  4. Quantum multi-signature protocol based on teleportation

    International Nuclear Information System (INIS)

    Wen Xiao-jun; Liu Yun; Sun Yu

    2007-01-01

    In this paper, a protocol which can be used for multi-user quantum signatures is proposed. The scheme of signature and verification is based on the correlation of Greenberger-Horne-Zeilinger (GHZ) states and controlled quantum teleportation. Unlike classical digital signatures, which are based on computational complexity, the proposed protocol has perfect security in noiseless quantum channels. Compared to previous quantum signature schemes, this protocol can verify the signature independently of an arbitrator as well as realize multi-user signatures together. (orig.)

  5. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input data bases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
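
    The Richardson extrapolation step that such a verification code is built around can be written in a few lines: from solutions on three systematically refined grids it returns the observed order of accuracy and an extrapolated estimate of the exact value. A minimal sketch with made-up grid values (not VIVID itself):

        import math

        def richardson(f_fine, f_medium, f_coarse, r):
            """Observed order p and extrapolated value from three grid solutions
            with constant refinement ratio r (grid spacings h, r*h, r*r*h)."""
            p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
            f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
            return p, f_exact

        # Hypothetical grid-convergence study of some solution functional
        p, f_exact = richardson(f_fine=0.97010, f_medium=0.97550, f_coarse=0.99700, r=2.0)
        print(f"observed order = {p:.2f}, extrapolated value = {f_exact:.5f}")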

  6. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems which intermix two kinds of components: discrete components and continuous components. The continuous components are usually called plants and are subject to disturbances; their state variables change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper gives a design methodology for hybrid systems as an example of the specification and verification of hybrid systems. The design methodology is based on the cooperation between two disciplines, control engineering and computer science. The methodology brings together the design of control loops and decision loops. The external behavior of the control loops is specified in a notation which is understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of the control loops. The verification of system requirements can be done by computing scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as a formal notation for specifying the control loops and designing the decision loops

  7. Verification of the Indicating Measuring Instruments Taking into Account their Instrumental Measurement Uncertainty

    Directory of Open Access Journals (Sweden)

    Zakharov Igor

    2017-12-01

    The specific features of the verification of measuring instruments based on the results of their calibration are considered. It is noted that, in contrast to the verification procedure used in legal metrology, the verification procedure for calibrated measuring instruments has to take the uncertainty of measurement into account. As a result, a large number of measuring instruments that are considered to be in compliance after verification in legal metrology turn out not to be in compliance after calibration. In this case, it is necessary to evaluate the probability of compliance of indicating measuring instruments. A procedure for determining the compliance probability on the basis of the Monte Carlo method is considered. An example of the calibration of a Vernier caliper is given.
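
    The compliance-probability evaluation can be sketched directly: treat the calibration result as a value with an uncertainty (assumed Gaussian here), sample it, and count how often the instrument error stays within the maximum permissible error. The numbers and the tolerance below are illustrative assumptions, not the paper's example.

        import random

        def compliance_probability(measured_error, std_uncertainty, mpe, n=200_000):
            """Monte Carlo estimate of P(|true error| <= maximum permissible error)."""
            inside = 0
            for _ in range(n):
                true_error = random.gauss(measured_error, std_uncertainty)
                if abs(true_error) <= mpe:
                    inside += 1
            return inside / n

        # Hypothetical Vernier caliper check: error 0.04 mm found at calibration,
        # standard uncertainty 0.02 mm, maximum permissible error 0.05 mm.
        p = compliance_probability(measured_error=0.04, std_uncertainty=0.02, mpe=0.05)
        print(f"probability of compliance ~= {p:.2f}")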

  8. TH-CD-202-05: DECT Based Tissue Segmentation as Input to Monte Carlo Simulations for Proton Treatment Verification Using PET Imaging

    International Nuclear Information System (INIS)

    Berndt, B; Wuerl, M; Dedes, G; Landry, G; Parodi, K; Tessonnier, T; Schwarz, F; Kamp, F; Thieke, C; Belka, C; Reiser, M; Sommer, W; Bauer, J; Verhaegen, F

    2016-01-01

    Purpose: To improve agreement of predicted and measured positron emitter yields in patients, after proton irradiation for PET-based treatment verification, using a novel dual energy CT (DECT) tissue segmentation approach, overcoming known deficiencies from single energy CT (SECT). Methods: DECT head scans of 5 trauma patients were segmented and compared to existing decomposition methods with a first focus on the brain. For validation purposes, three brain equivalent solutions [water, white matter (WM) and grey matter (GM) – equivalent with respect to their reference carbon and oxygen contents and CT numbers at 90kVp and 150kVp] were prepared from water, ethanol, sucrose and salt. The activities of all brain solutions, measured during a PET scan after uniform proton irradiation, were compared to Monte Carlo simulations. Simulation inputs were various solution compositions obtained from different segmentation approaches from DECT, SECT scans, and known reference composition. Virtual GM solution salt concentration corrections were applied based on DECT measurements of solutions with varying salt concentration. Results: The novel tissue segmentation showed qualitative improvements in %C for patient brain scans (ground truth unavailable). The activity simulations based on reference solution compositions agree with the measurement within 3–5% (4–8Bq/ml). These reference simulations showed an absolute activity difference between WM (20%C) and GM (10%C) to H2O (0%C) of 43 Bq/ml and 22 Bq/ml, respectively. Activity differences between reference simulations and segmented ones varied from −6 to 1 Bq/ml for DECT and −79 to 8 Bq/ml for SECT. Conclusion: Compared to the conventionally used SECT segmentation, the DECT based segmentation indicates a qualitative and quantitative improvement. In controlled solutions, a MC input based on DECT segmentation leads to better agreement with the reference. Future work will address the anticipated improvement of quantification

  9. TH-CD-202-05: DECT Based Tissue Segmentation as Input to Monte Carlo Simulations for Proton Treatment Verification Using PET Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Berndt, B; Wuerl, M; Dedes, G; Landry, G; Parodi, K [Ludwig-Maximilians-Universitaet Muenchen, Garching, DE (Germany); Tessonnier, T [Ludwig-Maximilians-Universitaet Muenchen, Garching, DE (Germany); Universitaetsklinikum Heidelberg, Heidelberg, DE (Germany); Schwarz, F; Kamp, F; Thieke, C; Belka, C; Reiser, M; Sommer, W [LMU Munich, Munich, DE (Germany); Bauer, J [Universitaetsklinikum Heidelberg, Heidelberg, DE (Germany); Heidelberg Ion-Beam Therapy Center, Heidelberg, DE (Germany); Verhaegen, F [Maastro Clinic, Maastricht (Netherlands)

    2016-06-15

    Purpose: To improve agreement of predicted and measured positron emitter yields in patients, after proton irradiation for PET-based treatment verification, using a novel dual energy CT (DECT) tissue segmentation approach, overcoming known deficiencies from single energy CT (SECT). Methods: DECT head scans of 5 trauma patients were segmented and compared to existing decomposition methods with a first focus on the brain. For validation purposes, three brain equivalent solutions [water, white matter (WM) and grey matter (GM) – equivalent with respect to their reference carbon and oxygen contents and CT numbers at 90kVp and 150kVp] were prepared from water, ethanol, sucrose and salt. The activities of all brain solutions, measured during a PET scan after uniform proton irradiation, were compared to Monte Carlo simulations. Simulation inputs were various solution compositions obtained from different segmentation approaches from DECT, SECT scans, and known reference composition. Virtual GM solution salt concentration corrections were applied based on DECT measurements of solutions with varying salt concentration. Results: The novel tissue segmentation showed qualitative improvements in %C for patient brain scans (ground truth unavailable). The activity simulations based on reference solution compositions agree with the measurement within 3–5% (4–8Bq/ml). These reference simulations showed an absolute activity difference between WM (20%C) and GM (10%C) to H2O (0%C) of 43 Bq/ml and 22 Bq/ml, respectively. Activity differences between reference simulations and segmented ones varied from −6 to 1 Bq/ml for DECT and −79 to 8 Bq/ml for SECT. Conclusion: Compared to the conventionally used SECT segmentation, the DECT based segmentation indicates a qualitative and quantitative improvement. In controlled solutions, a MC input based on DECT segmentation leads to better agreement with the reference. Future work will address the anticipated improvement of quantification

  10. The inverse method parametric verification of real-time embedded systems

    CERN Document Server

    André, Etienne

    2013-01-01

    This book introduces state-of-the-art verification techniques for real-time embedded systems, based on the inverse method for parametric timed automata. It reviews popular formalisms for the specification and verification of timed concurrent systems and, in particular, timed automata as well as several extensions such as timed automata equipped with stopwatches, linear hybrid automata and affine hybrid automata. The inverse method is introduced, and its benefits for guaranteeing robustness in real-time systems are shown. Then, it is shown how an iteration of the inverse method can solv

  11. Verification of EPA's "Preliminary Remediation Goals for Radionuclides" (PRG) electronic calculator

    Energy Technology Data Exchange (ETDEWEB)

    Stagich, B. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-03-29

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems with obtaining solutions, as well as to ensure that the equations are programmed correctly.

  12. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.; Ryabov, Yu.

    2003-01-01

    Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components depending on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators, who are not aware of system functionality details. It is therefore necessary to help the operator to control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components and also for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed in the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. The test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about components hierarchy and dependencies, and allowing the operator to verify the fun...

  13. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  14. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  15. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
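
    The power-law smoothing step can be illustrated with a short sketch that spreads each simulated epicenter over a grid with a rate decaying as a power of epicentral distance (the kernel exponent, cutoff and grid are illustrative assumptions, not the parameters of the study):

      import numpy as np

      def powerlaw_rate_map(epicenters, grid_x, grid_y, q=1.5, r0=5.0):
          """Distribute each simulated event over the whole grid with a rate that
          decays with epicentral distance d as (r0 + d)**(-q), normalised per event."""
          gx, gy = np.meshgrid(grid_x, grid_y)
          rate = np.zeros_like(gx, dtype=float)
          for (ex, ey) in epicenters:
              d = np.hypot(gx - ex, gy - ey)   # epicentral distance (km)
              kernel = (r0 + d) ** (-q)
              rate += kernel / kernel.sum()    # each event contributes one unit of rate
          return rate

      # Toy example: two simulated epicenters on a 100 km x 100 km grid
      x = np.linspace(0.0, 100.0, 101)
      y = np.linspace(0.0, 100.0, 101)
      rmap = powerlaw_rate_map([(30.0, 40.0), (70.0, 60.0)], x, y)
      print(rmap.sum())  # ~2.0, one unit of rate per event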

  16. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    Science.gov (United States)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  17. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.

  18. A Model Based Approach to Increase the Part Accuracy in Robot Based Incremental Sheet Metal Forming

    International Nuclear Information System (INIS)

    Meier, Horst; Laurischkat, Roman; Zhu Junhong

    2011-01-01

    One main influence on the dimensional accuracy in robot-based incremental sheet metal forming results from the compliance of the involved robot structures. Compared to conventional machine tools, the low stiffness of the robot's kinematics results in a significant deviation from the planned tool path and therefore in a shape of insufficient quality. To predict and compensate for these deviations offline, a model-based approach has been developed, consisting of a finite element approach to simulate the sheet forming and a multi-body system modeling the compliant robot structure. This paper describes the implementation and experimental verification of the multi-body system model and its included compensation method.

  19. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    International Nuclear Information System (INIS)

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

    Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process using Colored Petri Nets to verify the TTDMS functional safety, as well as to evaluate the system performance. The paper makes three main contributions. Firstly, it proposes a formalized TTDMS model, whose correctness is validated using state space analysis and simulation-based verification. Secondly, corresponding checking queries are proposed for the purpose of functional safety verification. Further, the TTDMS performance is evaluated by applying parameters in the formal model. Thirdly, the reliability of a functional prototype TTDMS is estimated. The procedure can accompany system development, and both formal and simulation-based verifications are performed. Compared to executable code and purely mathematical methods, using our process to evaluate and verify a system is easier to read and more reliable. - Highlights: • A new Train to Train Distance Measurement System. • New approach verifying system functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance applying simulation-based analysis.

  20. Rule Systems for Runtime Verification: A Short Tutorial

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
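
    The flavour of rule-based trace monitoring can be conveyed by a small generic sketch (a data-parameterised response rule checked over a finite trace; the class and event names are illustrative and do not reproduce the RuleR or LogScope syntax):

      # Minimal rule-based monitor: a trigger event creates an obligation that must be
      # discharged by a matching response event before the end of the trace.
      class ResponseRule:
          def __init__(self, name, trigger, response):
              self.name, self.trigger, self.response = name, trigger, response

      def monitor(trace, rules):
          pending = []                                       # active obligations
          for event in trace:
              pending = [(r, key) for (r, key) in pending
                         if not r.response(event, key)]      # discharge satisfied obligations
              for r in rules:
                  key = r.trigger(event)
                  if key is not None:
                      pending.append((r, key))               # new obligation carrying data
          return [f"{r.name}({key}) never satisfied" for (r, key) in pending]

      # Example: every command ('dispatch', x) must eventually be followed by ('complete', x)
      rule = ResponseRule("dispatch_completes",
                          trigger=lambda e: e[1] if e[0] == "dispatch" else None,
                          response=lambda e, x: e == ("complete", x))
      print(monitor([("dispatch", 7), ("complete", 7), ("dispatch", 9)], [rule]))
      # ['dispatch_completes(9) never satisfied']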

  1. A methodology for developing high-integrity knowledge base using document analysis and ECPN matrix analysis with backward simulation

    International Nuclear Information System (INIS)

    Park, Joo Hyun

    1999-02-01

    When transitions occur in large systems such as nuclear power plants (NPPs) or industrial process plants, it is often difficult to diagnose them. Various computer-based operator-aiding systems have been developed in order to help operators diagnose plant transitions. In procedures for developing knowledge base systems such as operator-aiding systems, knowledge acquisition and knowledge base verification are the core activities. This dissertation describes a knowledge acquisition method and a knowledge base verification method for developing high-integrity knowledge base systems for NPP expert systems. Knowledge acquisition is one of the most difficult and time-consuming activities in developing knowledge base systems. There are two kinds of knowledge acquisition methods with respect to knowledge sources. One is acquisition from human experts. This method, however, is not adequate for acquiring the knowledge of NPP expert systems because the number of experts is not sufficient. In this work, we propose a novel knowledge acquisition method based on document analysis. The knowledge base can be built correctly, rapidly, and partially automatically through this method. This method is especially useful when it is difficult to find domain experts. The reliability of knowledge base systems depends on the quality of their knowledge base. Petri nets have been used to verify knowledge bases due to their formal outputs. The methods using Petri nets, however, are difficult to apply to large and complex knowledge bases because the net becomes very large and complex. Also, with Petri nets, it is difficult to find proper input patterns that make anomalies occur. In order to overcome this difficulty, in this work, anomaly candidate detection methods are developed based on Extended CPN (ECPN) matrix analysis. This work also defines the backward simulation of CPN to find compact input patterns for anomaly detection, which starts simulation from the anomaly candidates

  2. Characterization of a dose verification system dedicated to radiotherapy treatments, based on a multi-strip silicon detector

    International Nuclear Information System (INIS)

    Bocca, A.; Cortes Giraldo, M. A.; Gallardo, M. I.; Espino, J. M.; Aranas, R.; Abou Haidar, Z.; Alvarez, M. A. G.; Quesada, J. M.; Vega-Leal, A. P.; Perez Neto, F. J.

    2011-01-01

    In this paper, we present the characterization of a multi-strip silicon detector (SSSSD: Single-Sided Silicon Strip Detector), developed by the company Micron Semiconductors Ltd., for use as a verification system for radiotherapy treatments.

  3. Verification of RESRAD-RDD. (Version 2.01)

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Flood, Paul E. [Argonne National Lab. (ANL), Argonne, IL (United States); LePoire, David [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    In this report, the results generated by RESRAD-RDD version 2.01 are compared with those produced by RESRAD-RDD version 1.7 for different scenarios with different sets of input parameters. RESRAD-RDD version 1.7 is spreadsheet-driven, performing calculations with Microsoft Excel spreadsheets. RESRAD-RDD version 2.01 revamped version 1.7 by using command-driven programs designed with Visual Basic.NET to direct calculations with data saved in a Microsoft Access database, and by re-facing the graphical user interface (GUI) to provide more flexibility and choices in guideline derivation. Because version 1.7 and version 2.01 perform the same calculations, the comparison of their results serves as verification of both versions. The verification covered calculation results for 11 radionuclides included in both versions: Am-241, Cf-252, Cm-244, Co-60, Cs-137, Ir-192, Po-210, Pu-238, Pu-239, Ra-226, and Sr-90. First, all nuclide-specific data used in both versions were compared to ensure that they are identical. Then generic operational guidelines and measurement-based radiation doses or stay times associated with a specific operational guideline group were calculated with both versions using different sets of input parameters, and the results obtained with the same set of input parameters were compared. A total of 12 sets of input parameters were used for the verification, and the comparison was performed for each operational guideline group, from A to G, sequentially. The verification shows that RESRAD-RDD version 1.7 and RESRAD-RDD version 2.01 generate almost identical results; the slight differences could be attributed to differences in numerical precision with Microsoft Excel and Visual Basic.NET. RESRAD-RDD version 2.01 allows the selection of different units for use in reporting calculation results. Results in SI units were obtained and compared with the base results (in traditional units) used for comparison with version 1.7. The comparison shows that RESRAD
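
    In essence, verification by cross-comparison of two code versions reduces to checking that paired outputs agree within a numerical tolerance; a generic sketch with made-up numbers (the tolerance and data layout are assumptions, not taken from the report):

      import numpy as np

      def compare_versions(results_a, results_b, rel_tol=2e-4):
          """Compare paired guideline values from two code versions and report the
          largest relative deviation (tolerance allows for Excel vs. VB.NET precision)."""
          a, b = np.asarray(results_a, float), np.asarray(results_b, float)
          rel_dev = np.abs(a - b) / np.maximum(np.abs(a), np.finfo(float).tiny)
          worst = float(rel_dev.max())
          return worst <= rel_tol, worst

      # Toy data standing in for one operational guideline group and one input set
      v17  = [3.210e2, 5.600e-1, 7.843e3]
      v201 = [3.210e2, 5.600e-1, 7.844e3]
      ok, worst = compare_versions(v17, v201)
      print(ok, f"max relative deviation = {worst:.2e}")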

  4. Symptom-based emergency operating procedures development for Ignalina NPP

    International Nuclear Information System (INIS)

    Kruglov, Y.

    1999-01-01

    In this paper and lecture the following are presented: (1) Introduction; (2) EOP project work stages and documentation; (3) Selection and justification of accident management strategy; (4) Content of EOP package; (5) Development of EOP package; (6) EOP package verification; (7) EOP package validation; (8) EOP training; (9) EOP implementation; (10) Conditions of symptom-based emergency operating procedures package application and its interconnection with event-based emergency operating procedures; (11) Rules of EOP application; (12) EOP maintenance

  5. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new molecular antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and practice associated with it is known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  6. New Aspects of Probabilistic Forecast Verification Using Information Theory

    Science.gov (United States)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, and then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
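
    For discrete categories, the Ignorance Score of an ensemble forecast can be estimated directly from ensemble relative frequencies; a minimal sketch (the additive smoothing used to avoid zero probabilities is a pragmatic assumption of this illustration, not part of the cited work):

      import numpy as np

      def ignorance_score(ensembles, observations, n_categories, eps=0.5):
          """Mean Ignorance Score IGN = -log2 p(observed category), with category
          probabilities estimated from ensemble member counts (additive smoothing eps)."""
          total = 0.0
          for members, obs in zip(ensembles, observations):
              counts = np.bincount(members, minlength=n_categories).astype(float) + eps
              probs = counts / counts.sum()
              total += -np.log2(probs[obs])
          return total / len(observations)

      # Toy example: 3 precipitation categories, 5-member ensembles, 2 forecast cases
      ens = [np.array([0, 0, 1, 1, 1]), np.array([2, 2, 2, 1, 2])]
      obs = [1, 2]
      print(round(ignorance_score(ens, obs, n_categories=3), 3))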

  7. Verification and validation of COBRA-SFS transient analysis capability

    International Nuclear Information System (INIS)

    Rector, D.R.; Michener, T.E.; Cuta, J.M.

    1998-05-01

    This report provides documentation of the verification and validation testing of the transient capability in the COBRA-SFS code, and is organized into three main sections. The primary documentation of the code was published in September 1995, with the release of COBRA-SFS, Cycle 2. The validation and verification supporting the release and licensing of COBRA-SFS were based solely on steady-state applications, even though the appropriate transient terms have been included in the conservation equations from the first cycle. Section 2.0, COBRA-SFS Code Description, presents a capsule description of the code, and a summary of the conservation equations solved to obtain the flow and temperature fields within a cask or assembly model. This section repeats in abbreviated form the code description presented in the primary documentation (Michener et al. 1995), and is meant to serve as a quick reference, rather than independent documentation of all code features and capabilities. Section 3.0, Transient Capability Verification, presents a set of comparisons between code calculations and analytical solutions for selected heat transfer and fluid flow problems. Section 4.0, Transient Capability Validation, presents comparisons between code calculations and experimental data obtained in spent fuel storage cask tests. Based on the comparisons presented in Sections 3.0 and 4.0, conclusions and recommendations for application of COBRA-SFS to transient analysis are presented in Section 5.0

  8. Verification Based on Set-Abstraction Using the AIF Framework

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander

    The AIF framework is a novel method for analyzing advanced security protocols, web services, and APIs, based on a new abstract interpretation method. It consists of the specification language AIF and a translation/abstraction process that produces a set of first-order Horn clauses. These can...

  9. Design of graphic and animation in game interface based on cultural ...

    African Journals Online (AJOL)

    Design of graphic and animation in game interface based on cultural value: verification. No abstract available. Keywords: game interface; cultural value; Hofstede; prototype; eye tracker.

  10. VBMC: a formal verification tool for VHDL programs

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-01-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into subparts and implementing each subpart in hardware and/or software as appropriate. With increasing use of programmable devices like FPGA, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since the functional bugs in such hardware subsystems used in safety critical C and I systems have disastrous consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This paper describes an indigenously developed software tool named VBMC (VHDL Bounded Model Checker) for mathematically proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts the hardware design as a VHDL program file, the functional property in PSL, and the verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason for the violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA-based intelligent I/O boards developed at Reactor Control Division, BARC. (author)
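
    The underlying idea of bounded model checking (explore behaviours only up to a fixed number of cycles and report a violating trace if one exists) can be shown with a toy explicit-state sketch; VBMC itself works on VHDL designs and PSL properties and is not reproduced here:

      # Toy bounded check of a safety property over an explicit finite transition system:
      # search all paths of length <= bound for a state violating the property.
      def bounded_check(initial_states, successors, is_safe, bound):
          """Return None if is_safe holds on every state reachable within `bound`
          steps, otherwise a counterexample path ending in an unsafe state."""
          frontier = [[s] for s in initial_states]
          for _ in range(bound + 1):
              next_frontier = []
              for path in frontier:
                  state = path[-1]
                  if not is_safe(state):
                      return path                      # counterexample within the bound
                  next_frontier.extend(path + [s] for s in successors(state))
              frontier = next_frontier
          return None

      # Example: a modulo-8 counter must never reach the value 6
      succ = lambda v: [(v + 1) % 8]
      print(bounded_check([0], succ, lambda v: v != 6, bound=4))  # None: 6 not reachable in 4 steps
      print(bounded_check([0], succ, lambda v: v != 6, bound=8))  # [0, 1, 2, 3, 4, 5, 6]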

  11. VBMC: a formal verification tool for VHDL program

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-08-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into sub-parts and implementing each sub-part in hardware and/or software as appropriate. With increasing use of programmable devices like FPGA, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since the functional bugs in such hardware subsystems used in safety critical C and I systems have serious consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This report describes the design of a software tool named VBMC (VHDL Bounded Model Checker). The capability of this tool is in proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts the design as a VHDL program file, the functional property in PSL, and the verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason for the violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA-based intelligent I/O boards developed at Reactor Control Division, BARC. (author)

  12. Formal Verification and Implementation of Real-Time Applications

    Directory of Open Access Journals (Sweden)

    Liviu Haţegan

    2009-12-01

    Full Text Available This paper presents a method for the formal description, verification and automatic source code generation of embedded real-time multitasking applications, based on a model consisting of networks of timed automata. The model describes a real-time operating system kernel and application tasks, taking into consideration both non-preemptive and preemptive scheduling. The timing properties of the proposed model can be verified using a model-checking tool. We also provide a solution for C source code generation based on the application’s model. For this purpose, a unified resource access interface was implemented.

  13. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    Science.gov (United States)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy to fully utilize the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge collimators) to image prompt gammas (PG) emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of a multi-slit collimator and a knife-edge collimator for non-invasive proton beam range verification. PG imaging was simulated by a validated GATE/GEANT4 Monte Carlo code to model the spot-scanning proton therapy and cylindrical PMMA phantom in detail. For each spot, 10⁸ protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff regions of the PG profiles were fitted with a 3-line-segment curve function as the range estimate. Factors including the energy window setting, proton energy, phantom size, and phantom shift that may influence the accuracy of range detection were studied. Results indicated that both collimator systems achieve reasonable accuracy and good response to the phantom shift. The accuracy of the range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system can achieve higher detection efficiency, which leads to a smaller deviation in predicting the range. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of the two systems, especially the multi-slit system. Therefore, a neutron reduction technique for improving the accuracy of range verification in proton therapy is needed.
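
    A simplified sketch of turning a PG depth profile into a range estimate with a three-segment (plateau / linear falloff / tail) fit; the shape parameterisation and the use of the 50% falloff position as the range surrogate are assumptions made for illustration:

      import numpy as np
      from scipy.optimize import curve_fit

      def three_segment(z, z1, z2, high, low):
          """Plateau at `high` up to z1, linear falloff to `low` at z2, flat tail after."""
          t = np.clip((z - z1) / (z2 - z1), 0.0, 1.0)
          return high + (low - high) * t

      def estimate_falloff(z, counts):
          p0 = [z[len(z) // 2], z[-1] - 1.0, counts.max(), counts.min()]
          (z1, z2, high, low), _ = curve_fit(three_segment, z, counts, p0=p0)
          return 0.5 * (z1 + z2)          # 50% falloff position as range surrogate

      # Synthetic PG depth profile with noise (depth in mm)
      z = np.linspace(0.0, 200.0, 201)
      truth = three_segment(z, 120.0, 140.0, 1000.0, 100.0)
      noisy = truth + np.random.default_rng(1).normal(0.0, 20.0, z.size)
      print(round(estimate_falloff(z, noisy), 1))   # close to 130.0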

  14. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    Full Text Available With the development of wireless technology, much data communication and processing has been conducted on mobile devices with wireless connections. As we know, mobile devices will always be resource-poor relative to static ones, even though they improve in absolute ability; therefore, they cannot process some expensive computational tasks due to their constrained computational resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices can outsource some expensive computation to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair to be verified, which is undesirable especially when the message includes some secret information. In this paper, we mainly study server-aided verification signatures with privacy, in which the message-signature pair to be verified can be protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed, which are both proved secure.

  15. A hand held photo identity verification system for mobile applications

    International Nuclear Information System (INIS)

    Kumar, Ranajit; Upreti, Anil; Mahaptra, U.; Bhattacharya, S.; Srivastava, G.P.

    2009-01-01

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against the database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details like name, designation, division etc. All transactions are recorded in the Simputer with time and date for future reference. This system finds extensive applications in mobile identity verification in nuclear or other industries. (author)
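
    The verification step itself amounts to a lookup of the card identifier in a local database with every attempt time-stamped; a hypothetical sketch (table layout, field names and UIDs are illustrative, not those of the actual system):

      import sqlite3
      from datetime import datetime

      def verify_card(conn, card_uid):
          """Look up a card UID, return the holder's details (or None), and log the attempt."""
          row = conn.execute(
              "SELECT name, designation, division, photo_path FROM personnel WHERE card_uid = ?",
              (card_uid,)).fetchone()
          conn.execute("INSERT INTO transactions (card_uid, verified, at) VALUES (?, ?, ?)",
                       (card_uid, row is not None, datetime.now().isoformat()))
          conn.commit()
          return row

      # Minimal in-memory demonstration
      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE personnel (card_uid TEXT, name TEXT, designation TEXT, division TEXT, photo_path TEXT)")
      conn.execute("CREATE TABLE transactions (card_uid TEXT, verified INTEGER, at TEXT)")
      conn.execute("INSERT INTO personnel VALUES ('04A1B2', 'J. Doe', 'SO/E', 'EISD', '/photos/04A1B2.jpg')")
      print(verify_card(conn, '04A1B2'))   # details of the registered holder
      print(verify_card(conn, 'FFFFFF'))   # None: unknown card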

  16. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks.

    Science.gov (United States)

    Zhu, Hongfei; Tan, Yu-An; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang

    2018-05-22

    With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives through, for example, e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on a number theory research unit lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, this scheme does not depend on a complex public key infrastructure and can resist quantum computer attacks. Then we design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model, and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other traditional identity-based blind signature schemes in signing speed and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.

  17. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hongfei Zhu

    2018-05-01

    Full Text Available With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people’s lives through, for example, e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on a number theory research unit lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, this scheme does not depend on a complex public key infrastructure and can resist quantum computer attacks. Then we design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model, and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other traditional identity-based blind signature schemes in signing speed and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.

  18. Single-Phase LLCL-Filter-based Grid-Tied Inverter with Low-Pass Filter Based Capacitor Current Feedback Active damper

    DEFF Research Database (Denmark)

    Liu, Yuan; Wu, Weimin; Li, Yun

    2016-01-01

    The capacitor-current-feedback active damping method is attractive for high-order-filter-based high-power grid-tied inverters when the grid impedance varies within a wide range. In order to improve the system control bandwidth and attenuate the high-order grid background harmonics by using the quasi....... In this paper, a low-pass filter is proposed to be inserted in the capacitor current feedback loop of an LLCL-filter-based grid-tied inverter, together with a digital proportional and differential compensator. The detailed theoretical analysis is given. For verification, simulations on a 2kW/220V/10kHz LLCL...

  19. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing, and in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  20. Static and Completion Analysis for Planning Knowledge Base Development and Verification

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  1. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Full Text Available Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks.

  2. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    International Nuclear Information System (INIS)

    Qiu, J; Li, H. Harlod; Zhang, T; Yang, D; Ma, F

    2015-01-01

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization which was based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window-level adjustment process, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools

  3. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, J [Taishan Medical University, Taian, Shandong (China); Washington University in St Louis, St Louis, MO (United States); Li, H. Harlod; Zhang, T; Yang, D [Washington University in St Louis, St Louis, MO (United States); Ma, F [Taishan Medical University, Taian, Shandong (China)

    2015-06-15

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization which was based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window-level adjustment process, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
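
    A sketch of the processing chain with a simple grid search standing in for the interior-point optimisation described above (the parameter ranges, the high-pass weighting and the scikit-image/SciPy functions are assumptions of this illustration):

      import numpy as np
      from scipy.ndimage import gaussian_filter
      from skimage import exposure, measure

      def enhance(image, sigma, kernel_size, clip_limit):
          """High-pass filter (subtract a weighted Gaussian-smoothed image), then CLAHE."""
          img = (image - image.min()) / (image.max() - image.min() + 1e-12)
          highpass = np.clip(img - 0.5 * gaussian_filter(img, sigma) + 0.5, 0.0, 1.0)
          return exposure.equalize_adapthist(highpass, kernel_size=kernel_size, clip_limit=clip_limit)

      def auto_enhance(image):
          """Pick the parameter combination that maximises the entropy of the processed image."""
          best, best_entropy = None, -np.inf
          for sigma in (5, 10, 20):
              for kernel_size in (32, 64, 128):
                  for clip_limit in (0.01, 0.02, 0.05):
                      out = enhance(image, sigma, kernel_size, clip_limit)
                      h = measure.shannon_entropy(out)
                      if h > best_entropy:
                          best, best_entropy = out, h
          return best

      # Usage (assuming a 2D setup image already loaded as a NumPy array):
      # enhanced = auto_enhance(setup_image)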

  4. Development of synchronized control method for shaking table with booster device. Verification of the capabilities based on both real facility and numerical simulator

    International Nuclear Information System (INIS)

    Kajii, Shin-ichirou; Yasuda, Chiaki; Yamashita, Toshio; Abe, Hiroshi; Kanki, Hiroshi

    2004-01-01

    In the seismic design of nuclear power plants, the use of probabilistic methods in addition to deterministic methods has recently been considered. The probabilistic method is called Seismic Probabilistic Safety Assessment (Seismic PSA). Seismic PSA of some nuclear power plant components using a shaking table requires test conditions with high acceleration levels comparable to actual conditions. However, it might be difficult to achieve such test conditions with a conventional shaking table based on a hydraulic power system. Therefore, we have been planning a test method in which both a conventional shaking table and an additional shaking table, called a booster device, are applied. This paper describes the verification test of synchronized control between a conventional shaking table and a booster device. (author)

  5. Nuclear power plant C and I design verification by simulation

    International Nuclear Information System (INIS)

    Storm, Joachim; Yu, Kim; Lee, D.Y

    2003-01-01

    An important part of the Advanced Boiling Water Reactor (ABWR) in the Taiwan NPP Lungmen Units no.1 and no.2 is the Full Scope Simulator (FSS). The simulator was to be built according to design data and therefore, apart from the training aspect, a major part of the development was to apply a simulation-based test bed for the verification, validation and improvement of plant design in the control and instrumentation (C and I) areas of unit control room equipment, operator Man Machine Interface (MMI), process computer functions and plant procedures. Furthermore, the Full Scope Simulator will be used after that to allow proper training of the plant operators two years before Unit no.1 fuel load. The article describes scope, methods and results of the advanced verification and validation process and highlights the advantages of test bed simulation for real power plant design and implementation. Subsequent application of advanced simulation software tools like instrumentation and control translators, graphical model builders, process models, graphical on-line test tools and screen-based or projected soft panels allowed a team to fulfil the task of C and I verification in time before the implementation of the Distributed Control and Information System (DCIS) started. An additional area of activity was the Human Factors Engineering (HFE) for the operator MMI. Due to the fact that the ABWR design incorporates display-based operation of most of the plant components, a dedicated verification and validation process is required by NUREG-0711. In order to support this activity, an engineering test system had been installed for all the necessary HFE investigations. All detected improvements had been properly documented and used to update the plant design documentation by a defined process. The Full Scope Simulator (FSS) with hard panels and a stimulated digital control and information system is in the final acceptance test process with the end customer, Taiwan Power Company

  6. Independent calculation-based verification of IMRT plans using a 3D dose-calculation engine

    International Nuclear Information System (INIS)

    Arumugam, Sankar; Xing, Aitang; Goozee, Gary; Holloway, Lois

    2013-01-01

    Independent monitor unit verification of intensity-modulated radiation therapy (IMRT) plans requires detailed 3-dimensional (3D) dose verification. The aim of this study was to investigate using a 3D dose engine in a second commercial treatment planning system (TPS) for this task, facilitated by in-house software. Our department has XiO and Pinnacle TPSs, both with IMRT planning capability and modeled for an Elekta-Synergy 6 MV photon beam. These systems allow the transfer of computed tomography (CT) data and RT structures between them but do not allow IMRT plans to be transferred. To provide this connectivity, an in-house computer programme was developed to convert radiation therapy prescription (RTP) files as generated by many planning systems into either XiO or Pinnacle IMRT file formats. Utilization of the technique and software was assessed by transferring 14 IMRT plans from XiO and Pinnacle onto the other system and performing 3D dose verification. The accuracy of the conversion process was checked by comparing the 3D dose matrices and dose volume histograms (DVHs) of structures for the recalculated plan on the same system. The developed software successfully transferred IMRT plans generated by 1 planning system into the other. Comparison of planning target volume (TV) DVHs for the original and recalculated plans showed good agreement; a maximum difference of 2% in mean dose, − 2.5% in D95, and 2.9% in V95 was observed. Similarly, a DVH comparison of organs at risk showed a maximum difference of +7.7% between the original and recalculated plans for structures in both high- and medium-dose regions. However, for structures in low-dose regions (less than 15% of prescription dose) a difference in mean dose up to +21.1% was observed between XiO and Pinnacle calculations. A dose matrix comparison of original and recalculated plans in XiO and Pinnacle TPSs was performed using gamma analysis with 3%/3 mm criteria. The mean and standard deviation of pixels passing
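
    The 3%/3 mm comparison can be illustrated with a brute-force global gamma-index sketch for 2D dose planes (simplified: no interpolation and no low-dose threshold; the grid spacing and toy doses are assumptions):

      import numpy as np

      def gamma_pass_rate(ref, evl, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
          """Fraction of reference points with gamma <= 1 under global 3%/3 mm criteria."""
          ny, nx = ref.shape
          yy, xx = np.meshgrid(np.arange(ny) * spacing_mm, np.arange(nx) * spacing_mm, indexing="ij")
          norm_dose = dose_crit * ref.max()
          gamma = np.empty_like(ref, dtype=float)
          for iy in range(ny):
              for ix in range(nx):
                  dist2 = ((yy - yy[iy, ix]) ** 2 + (xx - xx[iy, ix]) ** 2) / dist_crit_mm ** 2
                  dose2 = (evl - ref[iy, ix]) ** 2 / norm_dose ** 2
                  gamma[iy, ix] = np.sqrt(np.min(dist2 + dose2))
          return float(np.mean(gamma <= 1.0))

      # Toy planes: the evaluated dose is the reference shifted by one pixel on a 2 mm grid
      ref = np.tile(np.linspace(0.0, 2.0, 50), (50, 1))
      evl = np.roll(ref, 1, axis=1)
      print(gamma_pass_rate(ref, evl, spacing_mm=2.0))   # high pass rate for a small shift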

  7. Development and verification of symptom based emergency procedure support system

    International Nuclear Information System (INIS)

    Saijou, Nobuyuki; Sakuma, Akira; Takizawa, Yoji; Tamagawa, Naoko; Kubota, Ryuji; Satou, Hiroyuki; Ikeda, Koji; Taminami, Tatsuya

    1998-01-01

    A Computerized Emergency Procedure Guideline (EPG) Support System has been developed for BWRs and evaluated using a training simulator. It aims to enhance the effective utilization of the EPG. The system automatically identifies suitable symptom-based operating procedures for the present plant status. It has two functions: one is the plant status identification function, and the other is the man-machine interface function. For the realization of the former function, a method which identifies and prioritizes suitable symptom-based operating procedures for the present plant status has been developed. As the man-machine interface, an operation flow chart display has been developed. It expresses the flow of the identified operating procedures graphically. For easy understanding of the display, important information such as plant status changes, the priority of operating procedures and the completion status of operations is shown on the operation flow display in different colors. As an evaluation test, the response of the system to design basis accidents was evaluated by actual plant operators, using the training simulator at the BWR Training Center. Through the analysis of interviews and questionnaires to operators, it was shown that the system is effective and can be utilized for a real plant. (author)

  8. Verification of a primary-to-secondary leaking safety procedure in a nuclear power plant using coloured Petri nets

    International Nuclear Information System (INIS)

    Nemeth, E.; Bartha, T.; Fazekas, Cs.; Hangos, K.M.

    2009-01-01

    This paper deals with formal and simulation-based verification methods for a PRImary-to-SEcondary leaking (abbreviated as PRISE) safety procedure. The PRISE safety procedure controls the draining of the contaminated water in a faulty steam generator when a non-compensable leak from the primary to the secondary circuit occurs. Because of the discrete nature of the verification, a Coloured Petri Net (CPN) representation is proposed for both the procedure and the plant model. We have proved by using a non-model-based strategy that the PRISE safety procedure is safe: there are no dead markings in the state space, and all transitions are live, being either impartial or fair. Further analysis results have been obtained using a model-based verification approach. We created a simple, low-dimensional, nonlinear dynamic model of the primary circuit in a VVER-type pressurized water nuclear power plant for the purpose of the model-based verification. This is in contrast to the widely used safety analysis that requires an accurate detailed model. Our model also describes the relevant safety procedures, as well as all of the major leaking-type faults. We propose a novel method to transform this model into a CPN form by discretization. The composed plant and PRISE safety procedure system has also been analysed by simulation using CPN analysis tools. We found by the model-based analysis, using both single and multiple faults, that the PRISE safety procedure initiates the draining when the PRISE event occurs, and no false alarm will be initiated
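
    Checking for dead markings amounts to exploring the occurrence graph and flagging markings in which no transition is enabled; a sketch for an ordinary place/transition net (the colour sets and the actual PRISE model are omitted; places and token counts here are purely illustrative):

      # Explore the reachable markings of a simple place/transition net and report
      # dead markings (markings in which no transition is enabled).
      def reachable_markings(initial, transitions):
          """transitions: list of (consume, produce) pairs of {place: token count} dicts."""
          def enabled(marking, consume):
              return all(marking.get(p, 0) >= n for p, n in consume.items())
          def fire(marking, consume, produce):
              m = dict(marking)
              for p, n in consume.items():
                  m[p] = m[p] - n
              for p, n in produce.items():
                  m[p] = m.get(p, 0) + n
              return m
          seen, stack, dead = set(), [initial], []
          while stack:
              m = stack.pop()
              key = frozenset(m.items())
              if key in seen:
                  continue
              seen.add(key)
              successors = [fire(m, c, p) for (c, p) in transitions if enabled(m, c)]
              if not successors:
                  dead.append(m)
              stack.extend(successors)
          return len(seen), dead

      # Tiny example: leak detected -> draining -> drained (the terminal marking is 'dead')
      transitions = [({"leak_detected": 1}, {"draining": 1}), ({"draining": 1}, {"drained": 1})]
      print(reachable_markings({"leak_detected": 1}, transitions))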

  9. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    Science.gov (United States)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (V&V), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer and tester, hardware models are based upon characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer, without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for V&V of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For

  10. Metric-based approach and tool for modeling the I and C system using Markov chains

    International Nuclear Information System (INIS)

    Butenko, Valentyna; Kharchenko, Vyacheslav; Odarushchenko, Elena; Butenko, Dmitriy

    2015-01-01

    Markov chains (MC) are well-known and widely applied in dependability and performability analysis of safety-critical systems, because of their flexible representation of system component dependencies and synchronization. There are a few roadblocks to greater application of MCs: accounting for additional system components increases the model state space and complicates analysis; and the non-numerically sophisticated user may find it difficult to decide between the variety of numerical methods to determine the most suitable and accurate one for their application. Thus, obtaining highly accurate and trusted modeling results becomes a nontrivial task. In this paper, we present a metric-based approach for selection of the applicable solution approach, based on the analysis of MC stiffness, decomposability, sparsity and fragmentedness. Using this selection procedure, the modeler can verify earlier obtained results. The presented approach was implemented in the utility MSMC, which supports MC construction, metric-based analysis, recommendation shaping and model solution. The model can be exported to well-known off-the-shelf mathematical packages for verification. The paper presents a case study of an industrial NPP I and C system manufactured by RPC Radiy. The paper shows an application of the metric-based approach and the MSMC tool for dependability and safety analysis of the RTS, and the procedure of results verification. (author)
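
    Two of the cited metrics can be computed directly from the generator matrix; a sketch with generic stand-ins for stiffness and sparsity (the exact metric definitions used by MSMC are not given here, so these formulas are illustrative assumptions):

      import numpy as np

      def mc_metrics(Q):
          """Stand-in metrics: stiffness as the spread of the non-zero eigenvalue magnitudes
          of the generator matrix Q, sparsity as the fraction of zero off-diagonal entries."""
          eig = np.linalg.eigvals(Q)
          nonzero = np.abs(eig.real)[np.abs(eig.real) > 1e-12]
          stiffness = float(nonzero.max() / nonzero.min())
          off_diag = Q[~np.eye(Q.shape[0], dtype=bool)]
          sparsity = float(np.mean(off_diag == 0.0))
          return stiffness, sparsity

      # Three states: two operational modes switching quickly, plus a rarely visited
      # failed state with slow repair -- a classically stiff chain (illustrative rates).
      fast, slow = 1.0e2, 1.0e-3
      Q = np.array([[-fast,           fast,   0.0],
                    [ fast, -(fast + slow),  slow],
                    [ slow,            0.0, -slow]])
      print(mc_metrics(Q))   # stiffness ratio ~1e5, sparsity 1/3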

  11. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating-point computations integrated within the core, designed to reduce cost and complexity. The designed 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single-precision floating-point multiplier, a floating-point adder/subtractor for floating-point operations and a 32 x 32 Booth's multiplier added to the integer core of the ARM7. The binary representati...

  12. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications.

    Science.gov (United States)

    Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi

    2012-04-05

    Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins by combining in silico screening and experimental verification for incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins, whose expression could be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved successfully, and we newly found 3 incednine-binding proteins. This study revealed that our proposed protocol of predicting target proteins by combining in silico screening and experimental verification is useful, and provides new insight into a strategy for identifying target proteins of small molecules.

  13. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications

    Directory of Open Access Journals (Sweden)

    Kobayashi Hiroki

    2012-04-01

    Full Text Available Abstract Background Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. Results We applied our protocol of predicting target proteins by combining in silico screening and experimental verification for incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins, whose expression could be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved successfully, and we newly found 3 incednine-binding proteins. Conclusions This study revealed that our proposed protocol of predicting target proteins by combining in silico screening and experimental verification is useful, and provides new insight into a strategy for identifying target proteins of small molecules.

  14. Overview of 3-year experience with large-scale electronic portal imaging device-based 3-dimensional transit dosimetry

    NARCIS (Netherlands)

    Mijnheer, Ben J.; González, Patrick; Olaciregui-Ruiz, Igor; Rozendaal, Roel A.; van Herk, Marcel; Mans, Anton

    2015-01-01

    To assess the usefulness of electronic portal imaging device (EPID)-based 3-dimensional (3D) transit dosimetry in a radiation therapy department by analyzing a large set of dose verification results. In our institution, routine in vivo dose verification of all treatments is performed by means of 3D

  15. Development of RPS trip logic based on PLD technology

    International Nuclear Information System (INIS)

    Choi, Jong Gyun; Lee, Dong Young

    2012-01-01

    The majority of instrumentation and control (I and C) systems in today's nuclear power plants (NPPs) are based on analog technology. Thus, most existing I and C systems now face obsolescence problems. Existing NPPs have difficulty in repairing and replacing devices and boards during maintenance because manufacturers no longer produce the analog devices and boards used in the implemented I and C systems. Therefore, existing NPPs are replacing the obsolete analog I and C systems with advanced digital systems. New NPPs are also adopting digital I and C systems because the economic efficiency and usability of these systems are higher than those of analog I and C systems. Digital I and C systems are based on two technologies: microprocessor-based systems, in which software programs manage the required functions, and programmable logic device (PLD) based systems, in which programmable logic devices, such as field programmable gate arrays, manage the required functions. PLD-based systems provide higher levels of performance compared with microprocessor-based systems because PLD systems can process data in parallel while microprocessor-based systems process data sequentially. In this research, a bistable trip logic in a reactor protection system (RPS) was developed using the very high speed integrated circuit hardware description language (VHDL), which is a hardware description language used in electronic design to describe the behavior of digital systems. Functional verifications were also performed in order to verify that the bistable trip logic was designed correctly and satisfied the required specifications. For the functional verification, a random testing technique was adopted to generate test inputs for the bistable trip logic.
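
    A behavioural sketch (Python, with illustrative setpoints, not the actual VHDL) of what a bistable trip function with hysteresis and a random-input functional test could look like:

      import random

      TRIP_SETPOINT = 110.0   # illustrative engineering units
      RESET_BAND    = 5.0     # hysteresis so the bistable does not chatter

      def bistable_trip(value, tripped):
          """Return the new trip state for one scan of the process variable."""
          if not tripped and value >= TRIP_SETPOINT:
              return True
          if tripped and value < TRIP_SETPOINT - RESET_BAND:
              return False
          return tripped

      # Random functional test: the logic must always trip at or above the setpoint.
      tripped = False
      for _ in range(10000):
          value = random.uniform(0.0, 150.0)
          tripped = bistable_trip(value, tripped)
          if value >= TRIP_SETPOINT:
              assert tripped, "must trip at or above the setpoint"
      print("random functional test passed")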

  16. Calibration and verification of thermographic cameras for geometric measurements

    Science.gov (United States)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

    Infrared thermography is a technique with an increasing degree of development and applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire temperature and geometric information, although calibration and verification procedures are usual only for thermal data. Black bodies are used for these purposes. Moreover, the geometric information is important for many fields, such as architecture, civil engineering and industry. This work presents a calibration procedure that allows photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of the companies. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five delrin spheres and seven cubes of different sizes. Metrology traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both the thermographic and visible cameras are able to detect it. Two thermographic cameras from the Flir and Nec manufacturers, and one visible camera from Jai, are calibrated, verified and compared using calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm for all cases, being better than 0.5 mm for the visible one. As expected, accuracy also appears higher in the visible camera, and the geometric comparison between thermographic cameras shows slightly better

  17. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in the process of their operation and in the way results are achieved. The article describes static and dynamic methods of software verification and pays attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The relevant issue of the pros and cons of each particular method is emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in the code under verification using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis, such as testing, monitoring and profiling, are presented and analyzed, along with some kinds of tools that can be applied to the software when using dynamic analysis methods. Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solution and
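
    To make the idea of symbolic execution concrete, the sketch below symbolically explores one branch of a toy function with the Z3 solver (an assumed dependency; a generic illustration, not a tool discussed in the article):

      # pip install z3-solver
      from z3 import Int, Solver, sat

      # Toy program under analysis:
      #     def f(x): return 10 // (x - 7) if x > 5 else x + 1
      # Symbolic execution asks: on the then-branch, can x - 7 become zero?
      x = Int("x")
      s = Solver()
      s.add(x > 5)          # path condition for the then-branch
      s.add(x - 7 == 0)     # condition for a division-by-zero fault
      if s.check() == sat:
          print("fault reachable, e.g. x =", s.model()[x])   # prints x = 7
      else:
          print("branch proven safe")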

  18. Compositional Verification of Multi-Station Interlocking Systems

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Fantechi, Alessandro; Haxthausen, Anne Elisabeth

    2016-01-01

    pose a big challenge to current verification methodologies, due to the explosion of state space size as soon as large, if not medium sized, multi-station systems have to be controlled. For these reasons, verification techniques that exploit locality principles related to the topological layout...... of the controlled system to split in different ways the state space have been investigated. In particular, compositional approaches divide the controlled track network in regions that can be verified separately, once proper assumptions are considered on the way the pieces are glued together. Building on a successful...... method to verify the size of rather large networks, we propose a compositional approach that is particularly suitable to address multi-station interlocking systems which control a whole line composed of stations linked by mainline tracks. Indeed, it turns out that for such networks, and for the adopted...

  19. Clinical Implementation of a Model-Based In Vivo Dose Verification System for Stereotactic Body Radiation Therapy–Volumetric Modulated Arc Therapy Treatments Using the Electronic Portal Imaging Device

    Energy Technology Data Exchange (ETDEWEB)

    McCowan, Peter M., E-mail: pmccowan@cancercare.mb.ca [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Asuni, Ganiyu [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Van Uytven, Eric [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); VanBeek, Timothy [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); McCurdy, Boyd M.C. [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba (Canada); Loewen, Shaun K. [Department of Oncology, University of Calgary, Calgary, Alberta (Canada); Ahmed, Naseer; Bashir, Bashir; Butler, James B.; Chowdhury, Amitava; Dubey, Arbind; Leylek, Ahmet; Nashed, Maged [CancerCare Manitoba, Winnipeg, Manitoba (Canada)

    2017-04-01

    Purpose: To report findings from an in vivo dosimetry program implemented for all stereotactic body radiation therapy patients over a 31-month period and discuss the value and challenges of utilizing in vivo electronic portal imaging device (EPID) dosimetry clinically. Methods and Materials: From December 2013 to July 2016, 117 stereotactic body radiation therapy–volumetric modulated arc therapy patients (100 lung, 15 spine, and 2 liver) underwent 602 EPID-based in vivo dose verification events. A developed model-based dose reconstruction algorithm calculates the 3-dimensional dose distribution to the patient by back-projecting the primary fluence measured by the EPID during treatment. The EPID frame-averaging was optimized in June 2015. For each treatment, a 3%/3-mm γ comparison between our EPID-derived dose and the Eclipse AcurosXB–predicted dose to the planning target volume (PTV) and the ≥20% isodose volume was performed. Alert levels were defined as γ pass rates <85% (lung and liver) and <80% (spine). Investigations were carried out for all fractions exceeding the alert level and were classified as follows: EPID-related, algorithmic, patient setup, anatomic change, or unknown/unidentified errors. Results: The percentages of fractions exceeding the alert levels were 22.6% for lung before frame-average optimization and 8.0% for lung, 20.0% for spine, and 10.0% for liver after frame-average optimization. Overall, mean (± standard deviation) planning target volume γ pass rates were 90.7% ± 9.2%, 87.0% ± 9.3%, and 91.2% ± 3.4% for the lung, spine, and liver patients, respectively. Conclusions: Results from the clinical implementation of our model-based in vivo dose verification method using on-treatment EPID images are reported. The method is demonstrated to be valuable for routine clinical use for verifying delivered dose as well as for detecting errors.
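
    For readers unfamiliar with the γ test, the sketch below computes a brute-force global 2D γ (3%/3 mm) pass rate on synthetic dose grids; it is a didactic Python example, not the authors' reconstruction algorithm.

      import numpy as np

      def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dist_mm=3.0):
          """Fraction of measured points whose global gamma is <= 1 (brute force)."""
          ny, nx = ref.shape
          yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
          norm = dose_tol * ref.max()                 # global dose normalisation
          passed = 0
          for i in range(ny):
              for j in range(nx):
                  dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
                  dose2 = (ref - meas[i, j]) ** 2
                  gamma2 = dist2 / dist_mm ** 2 + dose2 / norm ** 2
                  passed += gamma2.min() <= 1.0
          return passed / (ny * nx)

      ref = np.fromfunction(lambda i, j: 1.0 + 0.01 * i, (40, 40))
      meas = ref * 1.02                               # uniform 2% dose error
      print(f"pass rate: {100 * gamma_pass_rate(ref, meas, spacing_mm=2.0):.1f}%")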

  20. Model Checking and Model-based Testing in the Railway Domain

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    This chapter describes some approaches and emerging trends for verification and model-based testing of railway control systems. We describe state-of-the-art methods and associated tools for verifying interlocking systems and their configuration data, using bounded model checking and k...... with good test strength are explained. Interlocking systems represent just one class of many others, where concrete system instances are created from generic representations, using configuration data for determining the behaviour of the instances. We explain how the systematic transition from generic...... to concrete instances in the development path is complemented by associated transitions in the verification and testing paths....

  1. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    International Nuclear Information System (INIS)

    Brau-Avila, A; Valenzuela-Galvan, M; Herrera-Jimenez, V M; Santolaria, J; Aguilar, J J; Acero, R

    2017-01-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs. (paper)
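
    The optimization such a calibration typically reduces to can be sketched as a least-squares fit of geometric parameters so that modeled sensor readings match the observed ones; the Python below is a generic illustration with assumed names and a toy sensor model, not the authors' procedure.

      # pip install numpy scipy
      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(0)

      # Hypothetical model: each of the six sensors reads a gap that depends on an
      # offset and a gain (standing in for the "geometric features" to optimize).
      true_params = np.concatenate([[10.0, 12.0, 9.5, 11.0, 10.5, 9.8],     # offsets
                                    [1.00, 0.98, 1.02, 1.01, 0.99, 1.00]])  # gains

      def model(params, angles):
          offsets, gains = params[:6], params[6:]
          return offsets[None, :] + gains[None, :] * np.cos(angles)[:, None]

      angles = np.linspace(0.0, 2 * np.pi, 30)          # indexed platform positions
      readings = model(true_params, angles) + rng.normal(0, 0.01, (30, 6))

      fit = least_squares(lambda p: (model(p, angles) - readings).ravel(),
                          x0=np.full(12, 10.0))
      print("max parameter error:", np.abs(fit.x - true_params).max())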

  2. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    Science.gov (United States)

    Brau-Avila, A.; Santolaria, J.; Acero, R.; Valenzuela-Galvan, M.; Herrera-Jimenez, V. M.; Aguilar, J. J.

    2017-03-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs.

  3. Getting ready for final disposal in Finland - Independent verification of spent fuel

    International Nuclear Information System (INIS)

    Tarvainen, Matti; Honkamaa, Tapani; Martikka, Elina; Varjoranta, Tero; Hautamaeki, Johanna; Tiitta, Antero

    2001-01-01

    Full text: Final disposal of spent nuclear fuel has been known to be the solution for the back end of the fuel cycle in Finland for a long time. This has allowed the State system for accounting and control (SSAC) to prepare for the safeguards requirements in time. The Finnish SSAC includes the operator, the State authority STUK and the parties above them, e.g. the Ministry for Trade and Industry. Undisputed responsibility for the safe disposal of spent fuel lies with the operator. The role of the safety authority STUK is to set up detailed requirements, to inspect the operator's plans and, by using different tools of a quality audit approach, to verify that the requirements will be complied with in practice. Responsibility on the safeguards issues is similar, with the addition of the role of the regional and international verification organizations represented by Euratom and the IAEA. As the competent safeguards authority, STUK has decided to maintain its active role also in the future. This will be reflected in the increasing cooperation between the SSAC and the IAEA in the new safeguards activities related to the Additional Protocol. The role of Euratom will remain the same concerning the implementation of conventional safeguards. Based on its SSAC role, STUK has continued carrying out safeguards inspections, including independent verification measurements on spent fuel, also after joining the EU and Euratom safeguards in 1995. Verification of the operator-declared data is the key verification element of safeguards. This will remain the case also under Integrated Safeguards (IS) in the future. It is believed that the importance of high quality measurements will rather increase than decrease when the frequency of interim inspections decreases. Maintaining the continuity of knowledge makes sense only when the knowledge is reliable and independently verified. One of the corner stones of the high quality of the Finnish SSAC activities is

  4. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  5. FPGA Design and Verification Procedure for Nuclear Power Plant MMIS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dongil; Yoo, Kawnwoo; Ryoo, Kwangki [Hanbat National Univ., Daejeon (Korea, Republic of)

    2013-05-15

    In this paper, it is shown that reliability can be ensured by performing the verification steps of an FPGA development methodology, so that FPGAs can be safely applied to the NPP MMIS by following these steps. Currently, the PLC (Programmable Logic Controller) being developed is composed of an FPGA (Field Programmable Gate Array) and a CPU (Central Processing Unit). As the importance of the FPGA in the NPP (Nuclear Power Plant) MMIS (Man-Machine Interface System) has been increasing, research on the verification of FPGAs has recently received more and more attention.

  6. A virtual linear accelerator for verification of treatment planning systems

    International Nuclear Information System (INIS)

    Wieslander, Elinore

    2000-01-01

    A virtual linear accelerator is implemented into a commercial pencil-beam-based treatment planning system (TPS) with the purpose of investigating the possibility of verifying the system using a Monte Carlo method. The characterization set for the TPS includes depth doses, profiles and output factors, which is generated by Monte Carlo simulations. The advantage of this method over conventional measurements is that variations in accelerator output are eliminated and more complicated geometries can be used to study the performance of a TPS. The difference between Monte Carlo simulated and TPS calculated profiles and depth doses in the characterization geometry is less than ±2% except for the build-up region. This is of the same order as previously reported results based on measurements. In an inhomogeneous, mediastinum-like case, the deviations between TPS and simulations are small in the unit-density regions. In low-density regions, the TPS overestimates the dose, and the overestimation increases with increasing energy from 3.5% for 6 MV to 9.5% for 18 MV. This result points out the widely known fact that the pencil beam concept does not handle changes in lateral electron transport, nor changes in scatter due to lateral inhomogeneities. It is concluded that verification of a pencil-beam-based TPS with a Monte Carlo based virtual accelerator is possible, which facilitates the verification procedure. (author)

  7. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  8. The effect of individually-induced processes on image-based overlay and diffraction-based overlay

    Science.gov (United States)

    Oh, SeungHwa; Lee, Jeongjin; Lee, Seungyoon; Hwang, Chan; Choi, Gilheyun; Kang, Ho-Kyu; Jung, EunSeung

    2014-04-01

    In this paper, a set of wafers with separated processes was prepared and overlay measurement results were compared between two methods: IBO and DBO. Based on the experimental results, a theoretical approach to the relationship between overlay mark deformation and overlay variation is presented. Moreover, overlay reading simulation was used in the verification and prediction of overlay variation due to deformation of the overlay mark caused by the induced processes. Through this study, an understanding of individual process effects on overlay measurement error is given. Additionally, a guideline for selecting the proper overlay measurement scheme for a specific layer is presented.

  9. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

    Most of CERN’s industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, which is a precise, mathematically based method to check formalized requirements automatically against the system.
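
    As a minimal illustration of what model checking such a formal model involves, the Python sketch below enumerates the reachable states of a toy PLC-like controller and checks a safety property by breadth-first search (a generic example, not a UNICOS module or the proposed transformation):

      from collections import deque

      # Toy PLC-like model: state = (motor_on, interlock_open); inputs = commands.
      COMMANDS = ["start", "stop", "open_interlock", "close_interlock"]

      def step(state, cmd):
          motor_on, interlock_open = state
          if cmd == "start" and not interlock_open:
              motor_on = True
          elif cmd == "stop":
              motor_on = False
          elif cmd == "open_interlock":
              interlock_open = True
              motor_on = False        # guard: opening the interlock stops the motor
          elif cmd == "close_interlock":
              interlock_open = False
          return (motor_on, interlock_open)

      def check_safety(init):
          """AG !(motor_on & interlock_open): explore every reachable state."""
          seen, queue = {init}, deque([init])
          while queue:
              state = queue.popleft()
              if state == (True, True):
                  return False        # counterexample state reached
              for cmd in COMMANDS:
                  nxt = step(state, cmd)
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return True

      print("safety property holds:", check_safety((False, False)))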

  10. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based `Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects which occur in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation and extreme control of these at each stage of the integrated OPC/ MDP/ Mask/ silicon lithography flow. The important potential sources of variation we focus on here originate on the basis of VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early and thus reduce the likelihood for costly long-loop iterations between OPC, MDP, and wafer fabrication flows. It moreover describes how to detect regions of a layout or mask where hotspots may occur or where the robustness to intrinsic variations may be improved by modification to the OPC, choice of mask technology, or by judicious design of VSB shots and dose assignment.

  11. Correctness-by-construction and post-hoc verification : a marriage of convenience?

    NARCIS (Netherlands)

    Watson, B.W.; Kourie, D.G.; Schaefer, I.; Cleophas, L.G.W.A.; Margaria, T.; Steffen, B.

    2016-01-01

    Correctness-by-construction (CbC), traditionally based on weakest precondition semantics, and post-hoc verification (PhV) aspire to ensure functional correctness. We argue for a lightweight approach to CbC where lack of formal rigour increases productivity. In order to mitigate the risk of

  12. Verification of rapid method for estimation of added food colorant type in boiled sausages based on measurement of cross section color

    Science.gov (United States)

    Jovanović, J.; Petronijević, R. B.; Lukić, M.; Karan, D.; Parunović, N.; Branković-Lazić, I.

    2017-09-01

    During the previous development of a chemometric method for estimating the amount of added colorant in meat products, it was noticed that the natural colorant most commonly added to boiled sausages, E 120, has different CIE-LAB behavior compared to artificial colors that are used for the same purpose. This has opened the possibility of transforming the developed method into a method for identifying the addition of natural or synthetic colorants in boiled sausages based on the measurement of the color of the cross-section. After recalibration of the CIE-LAB method using linear discriminant analysis, verification was performed on 76 boiled sausages of either the frankfurter or the Parisian sausage type. The accuracy and reliability of the classification was confirmed by comparison with the standard HPLC method. Results showed that the LDA + CIE-LAB method can be applied with high accuracy, 93.42 %, to estimate food color type in boiled sausages. Natural orange colors can give false positive results. Pigments from spice mixtures had no significant effect on CIE-LAB results.
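
    A schematic of the LDA-on-CIE-LAB classification step (Python with scikit-learn, synthetic L*a*b* values and labels, not the study's data):

      # pip install numpy scikit-learn
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(1)

      # Synthetic cross-section colors: class 0 = natural (E 120-like),
      # class 1 = synthetic colorant.  Real work would use measured CIE-LAB values.
      natural   = rng.normal([55, 22, 12], [3, 2, 2], size=(40, 3))
      synthetic = rng.normal([58, 30, 10], [3, 2, 2], size=(40, 3))
      X = np.vstack([natural, synthetic])
      y = np.array([0] * 40 + [1] * 40)

      lda = LinearDiscriminantAnalysis().fit(X, y)
      print("training accuracy:", lda.score(X, y))
      print("class for L*a*b* = (56, 26, 11):", lda.predict([[56, 26, 11]])[0])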

  13. OFCC based voltage and transadmittance mode instrumentation amplifier

    Science.gov (United States)

    Nand, Deva; Pandey, Neeta; Pandey, Rajeshwari; Tripathi, Prateek; Gola, Prashant

    2017-07-01

    The operational floating current conveyor (OFCC) is a versatile active block due to the availability of both low and high input and output impedance terminals. This paper addresses the realization of OFCC based voltage and transadmittance mode instrumentation amplifiers (VMIA and TAM IA). It employs three OFCCs and seven resistors. The transadmittance mode operation can easily be obtained by simply connecting an OFCC based voltage to current converter at the output. The effect of non-idealities of OFCC, in particular finite transimpedance and tracking error, on system performance is also dealt with and corresponding mathematical expressions are derived. The functional verification is performed through SPICE simulation using CMOS based implementation of OFCC.

  14. Vision based techniques for rotorcraft low altitude flight

    Science.gov (United States)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight, are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except in regions close to the focus of expansion (FOE). Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.
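
    For pure forward translation, the range to a feature follows from the radial optic flow about the focus of expansion as Z ≈ V·r/ṙ, which also shows why estimates degrade near the FOE; the Python below is an illustrative toy model, not the NASA Ames algorithm.

      import numpy as np

      V = 10.0                                  # known forward speed (m/s)
      true_range = np.array([20.0, 40.0, 80.0]) # depths of three features (m)
      r = np.array([120.0, 60.0, 15.0])         # pixel distance from the FOE

      # For translation along the optical axis: r_dot = r * V / Z
      r_dot = r * V / true_range                # synthetic measured flow (px/s)
      print("estimated range:", V * r / r_dot)  # recovers [20, 40, 80]

      # Near the FOE both r and r_dot are small, so measurement noise dominates:
      noisy_flow = r_dot + 0.5                  # constant flow error (px/s)
      print("with noise:", V * r / noisy_flow)  # worst error for the smallest r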

  15. Removing Unnecessary Variables from Horn Clause Verification Conditions

    Directory of Open Access Journals (Sweden)

    Emanuele De Angelis

    2016-07-01

    Full Text Available Verification conditions (VCs) are logical formulas whose satisfiability guarantees program correctness. We consider VCs in the form of constrained Horn clauses (CHC), which are automatically generated from the encoding of (an interpreter of) the operational semantics of the programming language. VCs are derived through program specialization based on the unfold/fold transformation rules and, as often happens when specializing interpreters, they contain unnecessary variables, that is, variables which are not required for the correctness proofs of the programs under verification. In this paper we adapt to the CHC setting some of the techniques that were developed for removing unnecessary variables from logic programs, and we show that, in some cases, the application of these techniques increases the effectiveness of Horn clause solvers when proving program correctness.

  16. A Cache System Design for CMPs with Built-In Coherence Verification

    Directory of Open Access Journals (Sweden)

    Mamata Dalui

    2016-01-01

    Full Text Available This work reports an effective design of a cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for verification of cache coherence in CMPs realizing a directory-based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA, referred to as single length cycle 2-attractor cellular automata (TACA), has been employed to detect the inconsistencies in cache line states of processors' private caches. The TACA module captures the coherence status of the CMPs' cache system and memorizes any inconsistent recording of the cache line states during the processors' references to a memory block. Theory has been developed to empower a TACA to analyse the cache state updates and then to settle to an attractor state indicating a quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs' processor pool ensures better efficiency in determining the inconsistencies, by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic points to the fact that the overhead of the proposed coherence verification module is much less than that of conventional verification units and is insignificant with respect to the cost involved in CMPs' cache systems.

  17. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for the ITER remote handling system. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is written based on the results of a project “verification and validation (V and V) of ITER RH system using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of the ITER RH system design utilizing a system engineering (SE) framework. This paper reviews the definitions of the DMU and the virtual prototype and gives an overview of the current trends of using virtual prototyping in industry during the early design phase. Based on a survey of best industrial practices, this paper proposes ways to improve the V and V process for the ITER RH system utilizing DMUs.

  18. Smartphone User Identity Verification Using Gait Characteristics

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2016-09-01

    Full Text Available Smartphone-based biometrics offers a wide range of possible solutions, which could be used to authenticate users and thus to provide an extra level of security and theft prevention. We propose a method for positive identification of a smartphone user's identity using the user's gait characteristics captured by embedded smartphone sensors (gyroscopes, accelerometers). The method is based on the application of the Random Projections method for feature dimensionality reduction to just two dimensions. Then, a probability distribution function (PDF) of the derived features is calculated, which is compared against the known user PDF. The Jaccard distance is used to evaluate the distance between the two distributions, and the decision is taken based on thresholding. The results for subject recognition are at an acceptable level: we have achieved a grand mean Equal Error Rate (EER) for subject identification of 5.7% (using the USC-HAD dataset). Our findings represent a step towards improving the performance of gait-based user identity verification technologies.
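
    A rough sketch of this pipeline in Python: project feature vectors to two dimensions with random projections, histogram them into an empirical PDF, and compare two PDFs with a weighted Jaccard distance against a threshold (scikit-learn assumed; sizes, bins and thresholds are illustrative, not the authors' settings).

      # pip install numpy scikit-learn
      import numpy as np
      from sklearn.random_projection import GaussianRandomProjection

      rng = np.random.default_rng(42)

      def gait_pdf(features, projector, bins=10, lim=40.0):
          """Project to 2-D and return a normalised 2-D histogram (empirical PDF)."""
          p = projector.transform(features)
          hist, _, _ = np.histogram2d(p[:, 0], p[:, 1], bins=bins,
                                      range=[[-lim, lim], [-lim, lim]])
          return hist / hist.sum()

      def jaccard_distance(p, q):
          """Weighted Jaccard distance between two discrete distributions."""
          return 1.0 - np.minimum(p, q).sum() / np.maximum(p, q).sum()

      # Synthetic 60-D gait feature vectors for an enrolled user and two probes.
      enrolled    = rng.normal(0.0, 1.0, (2000, 60))
      probe_same  = rng.normal(0.0, 1.0, (1000, 60))
      probe_other = rng.normal(2.0, 2.5, (1000, 60))

      projector = GaussianRandomProjection(n_components=2, random_state=0).fit(enrolled)
      template = gait_pdf(enrolled, projector)

      THRESHOLD = 0.5   # would be tuned to the target EER in practice
      for name, probe in [("genuine", probe_same), ("impostor", probe_other)]:
          d = jaccard_distance(template, gait_pdf(probe, projector))
          print(f"{name}: distance={d:.2f} ->", "accept" if d < THRESHOLD else "reject")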

  19. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in the Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
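
    A sketch of the descriptor-and-matching idea, assuming the mahotas library for the Zernike moment computation (the radius, degree, threshold and synthetic shapes are illustrative, not values from the described system):

      # pip install numpy mahotas   (mahotas assumed for zernike_moments)
      import numpy as np
      import mahotas

      def segment_descriptor(segment, radius=64, degree=8):
          """Zernike moment magnitudes for one binary hand segment (palm/finger)."""
          return mahotas.features.zernike_moments(segment, radius, degree=degree)

      def match(desc_a, desc_b, threshold=0.05):
          """Toy decision: Euclidean distance between (fused) descriptor vectors."""
          return float(np.linalg.norm(np.asarray(desc_a) - np.asarray(desc_b))) < threshold

      # Synthetic binary "segments": a filled disc and a translated copy.
      yy, xx = np.mgrid[0:128, 0:128]
      disc    = ((yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2).astype(float)
      shifted = ((yy - 60) ** 2 + (xx - 68) ** 2 < 40 ** 2).astype(float)

      d1, d2 = segment_descriptor(disc), segment_descriptor(shifted)
      print("translation-robust match:", match(d1, d2))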

  20. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing the requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) to different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  1. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  2. Examining the examiners: an online eyebrow verification experiment inspired by FISWG

    NARCIS (Netherlands)

    Zeinstra, Christopher Gerard; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2015-01-01

    In forensic face comparison, one of the features taken into account are the eyebrows. In this paper, we investigate human performance on an eyebrow verification task. This task is executed twice by participants: a "best-effort" approach and an approach using features based on forensic knowledge. The

  3. Ongoing Work on Automated Verification of Noisy Nonlinear Systems with Ariadne

    NARCIS (Netherlands)

    Geretti, Luca; Bresolin, Davide; Collins, Pieter; Zivanovic Gonzalez, Sanja; Villa, Tiziano

    2017-01-01

    Cyber-physical systems (CPS) are hybrid systems that commonly consist of a discrete control part that operates in a continuous environment. Hybrid automata are a convenient model for CPS suitable for formal verification. The latter is based on reachability analysis of the system to trace its hybrid

  4. Verification of industrial x-ray machine: MINTs experience

    International Nuclear Information System (INIS)

    Aziz Amat; Saidi Rajab; Eesan Pasupathi; Saipo Bahari Abdul Ratan; Shaharudin Sayuti; Abd Nassir Ibrahim; Abd Razak Hamzah

    2005-01-01

    Radiation and electrical safety of industrial x-ray equipment is required to meet the Atomic Energy Licensing Board (AELB) guideline (LEM/TEK/42) at the time of installation, and periodic verification should subsequently be ensured. The purpose of the guide is to explain the requirements employed in conducting tests on industrial x-ray apparatus so that it can be certified as complying with local legislation and regulation. Verification aims to provide safety assurance information on electrical requirements and on minimizing radiation exposure to the operator. This regulation is applied to new models imported into the Malaysian market. Since June 1997, the Malaysian Institute for Nuclear Technology Research (MINT) has been approved by AELB to provide verification services to private companies, government and corporate bodies throughout Malaysia. In early January 1997, AELB made it mandatory that all x-ray equipment for industrial purposes (especially industrial radiography) must fulfil certain performance tests based on the LEM/TEK/42 guidelines. MINT, as the third-party verifier, encourages users to improve maintenance of the equipment. MINT's experience in measuring the performance of intermittent and continuous duty rating single-phase industrial x-ray machines in the year 2004 indicated that all of the irradiating apparatus tested passed the test and met the requirements of the guideline. From MINT records, from 1997 to 2005, three x-ray models did not meet the requirements and thus were not allowed to be used unless the manufacturers were willing to modify them to meet AELB requirements. This verification procedure on the electrical and radiation safety of industrial x-ray equipment has significantly improved the maintenance culture and safety awareness in the usage of x-ray apparatus in the industrial environment. (Author)

  5. Earth-Base: A Free And Open Source, RESTful Earth Sciences Platform

    Science.gov (United States)

    Kishor, P.; Heim, N. A.; Peters, S. E.; McClennen, M.

    2012-12-01

    This presentation describes the motivation, concept, and architecture behind Earth-Base, a web-based, RESTful data-management, analysis and visualization platform for earth sciences data. Traditionally, web applications have been built by directly accessing data from a database using a scripting language. While such applications are great at bringing results to a wide audience, they are limited in scope to the imagination and capabilities of the application developer. Earth-Base decouples the data store from the web application by introducing an intermediate "data application" tier. The data application's job is to query the data store using self-documented, RESTful URIs and send the results back formatted as JavaScript Object Notation (JSON). Decoupling the data store from the application allows virtually limitless flexibility in developing applications, both web-based for human consumption and programmatic for machine consumption. It also allows outside developers to use the data in their own applications, potentially creating applications that the original data creator and app developer may not have even thought of. Standardized specifications for URI-based querying and JSON-formatted results make querying and developing applications easy. URI-based querying also allows distributed datasets to be used easily. Companion mechanisms for querying data snapshots (aka time travel), usage tracking and license management, and verification of the semantic equivalence of data are also described. The latter promotes the "What You Expect Is What You Get" (WYEIWYG) principle, which can aid in data citation and verification.
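
    A minimal sketch of the "data application" tier idea: a RESTful endpoint that answers a URI query with JSON (Flask assumed; the route and fields are hypothetical, not Earth-Base's actual API).

      # pip install flask
      from flask import Flask, jsonify, request

      app = Flask(__name__)

      # Stand-in for the real data store.
      OCCURRENCES = [
          {"taxon": "Tyrannosaurus", "interval": "Maastrichtian", "lat": 47.6, "lng": -106.5},
          {"taxon": "Triceratops",   "interval": "Maastrichtian", "lat": 44.6, "lng": -104.7},
      ]

      @app.route("/api/v1/occurrences")
      def occurrences():
          """Self-documented URI query, e.g. /api/v1/occurrences?taxon=Triceratops"""
          taxon = request.args.get("taxon")
          rows = [r for r in OCCURRENCES if taxon is None or r["taxon"] == taxon]
          return jsonify({"count": len(rows), "records": rows})

      if __name__ == "__main__":
          app.run(port=5000)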

  6. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured by drawing upon the IAEA's experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear-weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations on routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  7. Using Reversed MFCC and IT-EM for Automatic Speaker Verification

    Directory of Open Access Journals (Sweden)

    Sheeraz Memon

    2012-01-01

    Full Text Available This paper proposes a text-independent automatic speaker verification system using IMFCC (Inverse/Reverse Mel Frequency Cepstral Coefficients) and IT-EM (Information Theoretic Expectation Maximization). To perform speaker verification, feature extraction using the Mel scale has been widely applied and has established better results. The IMFCC is based on an inverted Mel scale and effectively captures information available in the high-frequency formants which is ignored by the MFCC. In this paper the fusion of MFCC and IMFCC at the input level is proposed. GMMs (Gaussian Mixture Models) trained with EM (Expectation Maximization) have been widely used for classification in text-independent verification; however, EM suffers from convergence issues. In this paper we use our proposed IT-EM, which has faster convergence, to train the speaker models. IT-EM uses information theory principles such as PDE (Parzen Density Estimation) and the KL (Kullback-Leibler) divergence measure. IT-EM adapts the weights, means and covariances, like EM; however, the IT-EM process is not performed on feature vector sets but on a set of centroids obtained using an IT (Information Theoretic) metric. The IT-EM process simultaneously diminishes the divergence measure between the PDE estimates of the feature distribution within a given class and the centroid distribution within the same class. The feature-level fusion and IT-EM are tested for the task of speaker verification using NIST2001 and NIST2004. The experimental evaluation validates that MFCC/IMFCC gives better results than the conventional delta/MFCC feature set. The MFCC/IMFCC feature vector size is also much smaller than that of delta MFCC, thus reducing the computational burden as well. The IT-EM method also showed faster convergence than the conventional EM method, and thus it leads to higher speaker recognition scores.
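
    A compact sketch of the conventional MFCC + GMM verification baseline that the paper improves on (librosa and scikit-learn assumed; the inverted-Mel filterbank and IT-EM training are not reproduced here):

      # pip install numpy librosa scikit-learn
      import numpy as np
      import librosa
      from sklearn.mixture import GaussianMixture

      def mfcc_features(path, sr=16000, n_mfcc=13):
          y, sr = librosa.load(path, sr=sr)
          return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T   # frames x coeffs

      def train_speaker_model(wav_paths, n_components=16):
          feats = np.vstack([mfcc_features(p) for p in wav_paths])
          return GaussianMixture(n_components=n_components,
                                 covariance_type="diag").fit(feats)

      def verify(test_wav, claimed_model, background_model, threshold=0.0):
          """Accept if the claimed speaker's likelihood beats the background model."""
          f = mfcc_features(test_wav)
          llr = claimed_model.score(f) - background_model.score(f)
          return llr > threshold, llr

      # Usage (paths are placeholders):
      #   spk = train_speaker_model(["alice_1.wav", "alice_2.wav"])
      #   ubm = train_speaker_model(["pool_of_other_speakers.wav"], n_components=64)
      #   accepted, score = verify("claimed_alice.wav", spk, ubm)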

  8. Monte Carlo simulations to replace film dosimetry in IMRT verification

    International Nuclear Information System (INIS)

    Goetzfried, Thomas; Trautwein, Marius; Koelbi, Oliver; Bogner, Ludwig; Rickhey, Mark

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil beam based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and the MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference) with dose deviations up to 10%. Corresponding values were significantly reduced to 0.34 ± 0.09 for MC dose calculation. The total time needed for the two verification procedures is comparable; however, the MC-based verification is by far less labor intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to eclipse film dosimetry more and more in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended at least in the IMRT introduction phase. (orig.)

  9. Age verification cards fail to fully prevent minors from accessing tobacco products.

    Science.gov (United States)

    Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi

    2011-03-01

    Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of-age system utilising cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between the amount smoked by high school students and the usage of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchase cigarettes, whether or not they have used age verification cards, and if so, how they obtained the card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Somebody outside of the family was the top source for obtaining cards. Surprisingly, around 5% of males and females belonging to the group with the highest smoking levels applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban on tobacco vending machines, not an age verification system, is needed to prevent sales to minors.

  10. Verification of the safety communication protocol in train control system using colored Petri net

    International Nuclear Information System (INIS)

    Chen Lijie; Tang Tao; Zhao Xianqiong; Schnieder, Eckehard

    2012-01-01

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and modeling flexibility with respect to changes in system conditions, this paper proposes a compositional Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved to be safe by means of state space analysis: the dead markings are correct, there are no dead transitions, and the net is fair. Further analysis results have been obtained using a formal and simulation-based verification approach. The timed models for the open transmit system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also provide relevant timed and stochastic factors, as well as time delay and packet loss, which may influence the time for establishment of a safety connection in the protocol. The time for establishment of a safety connection in the normal state is verified by formal verification, and then the time for establishment of a safety connection with different probabilities of packet loss is simulated. After verification it is found that the time for establishment of a safety connection of the safety communication protocol satisfies the safety requirements.

  11. Integrated Aero–Vibroacoustics: The Design Verification Process of Vega-C Launcher

    Directory of Open Access Journals (Sweden)

    Davide Bianco

    2018-01-01

    Full Text Available The verification of a space launcher at the design level is a complex issue because of (i) the lack of a detailed modeling capability of the acoustic pressure produced by the rocket; and (ii) the difficulties in applying deterministic methods to the large-scale metallic structures. In this paper, an innovative integrated design verification process is described, based on the bridging between a new semiempirical jet noise model and a hybrid finite-element method/statistical energy analysis (FEM/SEA) approach for calculating the acceleration produced at the payload and equipment level within the structure, vibrating under the external acoustic forcing field. The result is a verification method allowing for accurate prediction of the vibroacoustics in the launcher interior, using limited computational resources and without resorting to computational fluid dynamics (CFD) data. Some examples concerning the Vega-C launcher design are shown.

  12. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    Science.gov (United States)

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and assure the correct functioning of the Ambient Assisted Living (AAL) systems. We validate this approach by its application to an Emergency Assistance System for monitoring people suffering from cardiac alteration with syncope.

  13. A web-based system for supporting global land cover data production

    Science.gov (United States)

    Han, Gang; Chen, Jun; He, Chaoying; Li, Songnian; Wu, Hao; Liao, Anping; Peng, Shu

    2015-05-01

    Global land cover (GLC) data production and verification is a complicated, time-consuming and labor-intensive process, requiring huge amounts of imagery and ancillary data and involving many people, often from different geographic locations. The efficient integration of various kinds of ancillary data and effective collaborative classification in large-area land cover mapping require advanced supporting tools. This paper presents the design and development of a web-based system for supporting 30-m resolution GLC data production by combining geo-spatial web services and Computer Supported Cooperative Work (CSCW) technology. Based on an analysis of the functional and non-functional requirements of GLC mapping, a three-tier system model is proposed with four major parts, i.e., multisource data resources, data and function services, interactive mapping and production management. The prototyping and implementation of the system have been realised by a combination of Open Source Software (OSS) and commercially available off-the-shelf systems. This web-based system not only facilitates the integration of heterogeneous data and services required by GLC data production, but also provides online access, visualization and analysis of the images, ancillary data and interim 30-m global land-cover maps. The system further supports online collaborative quality check and verification workflows. It has been successfully applied to China's 30-m resolution GLC mapping project, and has significantly improved the efficiency of GLC data production and verification. The concepts developed through this study should also benefit other GLC or regional land-cover data production efforts.

  14. A Formal Approach for the Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan; Kinder, Sebastian

    2011-01-01

    This paper describes a complete model-based development and verification approach for railway control systems. For each control system to be generated, the user makes a description of the application-specific parameters in a domain-specific language. This description is automatically transformed...

  15. Data-driven property verification of grey-box systems by Bayesian experiment design

    NARCIS (Netherlands)

    Haesaert, S.; Van den Hof, P.M.J.; Abate, A.

    2015-01-01

    A measurement-based statistical verification approach is developed for systems with partly unknown dynamics. These grey-box systems are subject to identification experiments which, new in this contribution, enable accepting or rejecting system properties expressed in a linear-time logic. We employ a

  16. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  17. ESTRO ACROP guidelines for positioning, immobilisation and position verification of head and neck patients for radiation therapists

    Directory of Open Access Journals (Sweden)

    Michelle Leech

    2017-03-01

    Full Text Available Background and purpose: Over the last decade, the management of locally advanced head and neck cancers (HNCs) has seen a substantial increase in the use of chemoradiation. These guidelines have been developed to assist Radiation TherapisTs (RTTs) in positioning, immobilisation and position verification for head and neck cancer patients. Materials and methods: A critical review of the literature was undertaken by the writing committee. Based on the literature review, a survey was developed to ascertain the current positioning, immobilisation and position verification methods for head and neck radiation therapy across Europe. The survey was translated into Italian, German, Greek, Portuguese, Russian, Croatian, French and Spanish. Guidelines were subsequently developed by the writing committee. Results: Results from the survey indicated that a wide variety of treatment practices and treatment verification protocols are currently in operation for head and neck cancer patients across Europe. The guidelines developed are based on the experience and expertise of the writing committee, remaining cognisant of the variations in imaging and immobilisation techniques currently used in Europe. Conclusions: These guidelines have been developed to provide RTTs with guidance on positioning, immobilisation and position verification of HNC patients. The guidelines will also provide RTTs with the means to critically reflect on their own daily clinical practice with this patient group. Keywords: Head and neck, Immobilisation, Positioning, Verification

  18. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology that supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs as it otherwise requires running exhaustive electromagnetic (EM) simulations. Our group recently proposed empirical closed-form models for the directional coupler and the waveguide bend in the SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models avoid the huge time cost of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the model parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The measurement results for the fabricated devices have been compared to the derived models and show very good agreement. Moreover, the matching can reach 100% by calibrating certain parameters in the model.
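
    The paper's calibrated closed-form models are not reproduced here, but the kind of layout-level expression they build on can be illustrated with the textbook formula for an ideal, lossless, synchronous directional coupler, where the cross-port power fraction depends only on the interaction length and a coupling length. The coupling-length value below is a made-up placeholder, not a calibrated parameter from the paper.

        import math

        def coupled_power_fraction(length_um, coupling_length_um):
            """Fraction of power transferred to the cross port of an ideal
            directional coupler of interaction length L (lossless, synchronous)."""
            return math.sin(math.pi * length_um / (2.0 * coupling_length_um)) ** 2

        # hypothetical calibration: full power transfer after 14 um
        Lc = 14.0
        for L in (3.5, 7.0, 14.0):
            print(f"L = {L:5.1f} um  cross-port fraction = {coupled_power_fraction(L, Lc):.3f}")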

  19. Advances in the Processing of VHR Optical Imagery in Support of Safeguards Verification

    International Nuclear Information System (INIS)

    Niemeyer, I.; Listner, C.; Canty, M.

    2015-01-01

    Under the Additional Protocol of the Non-Proliferation Treaty (NPT) complementing the safeguards agreements between States and the International Atomic Energy Agency, commercial satellite imagery, preferably acquired by very high-resolution (VHR) satellite sensors, is an important source of safeguards-relevant information. Satellite imagery can assist in the evaluation of site declarations, design information verification, the detection of undeclared nuclear facilities, and the preparation of inspections or other visits. With the IAEA's Geospatial Exploitation System (GES), satellite imagery and other geospatial information such as site plans of nuclear facilities are available for a broad range of inspectors, analysts and country officers. The demand for spatial information and new tools to analyze this data is growing, together with the rising number of nuclear facilities under safeguards worldwide. Automated computer-driven processing of satellite imagery could therefore add significant value to the safeguards verification process. These could be, for example, satellite imagery pre-processing algorithms specially developed for new sensors, tools for pixel- or object-based image analysis, or geoprocessing tools that generate additional safeguards-relevant information. In the last decade, procedures for automated (pre-) processing of satellite imagery have evolved considerably. This paper aims at testing some pixel-based and object-based procedures for automated change detection and classification in support of safeguards verification. Taking different nuclear sites as examples, these methods will be evaluated and compared with regard to their suitability to (semi-) automatically extract safeguards-relevant information. (author)
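
    As a minimal illustration of the pixel-based end of such processing, the sketch below flags changed pixels by simple band differencing between two co-registered acquisitions with a threshold of k standard deviations. The synthetic images and threshold are invented, and operational safeguards analysis would rely on the more robust change-detection and classification procedures discussed in the abstract.

        import numpy as np

        def change_mask(img_t1, img_t2, k=3.0):
            """Flag pixels whose band-wise difference magnitude exceeds k sigma."""
            diff = img_t2.astype(float) - img_t1.astype(float)
            mag = np.linalg.norm(diff, axis=-1)       # per-pixel change magnitude
            thr = mag.mean() + k * mag.std()
            return mag > thr

        rng = np.random.default_rng(0)
        t1 = rng.normal(100, 5, size=(64, 64, 4))     # 4-band image, epoch 1
        t2 = t1 + rng.normal(0, 5, size=t1.shape)     # epoch 2, mostly unchanged
        t2[20:30, 20:30, :] += 60                     # a synthetic new structure
        print("changed pixels:", change_mask(t1, t2).sum())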

  20. Assertion checking environment (ACE) for formal verification of C programs

    International Nuclear Information System (INIS)

    Sharma, Babita; Dhodapkar, S.D.; Ramesh, S.

    2003-01-01

    In this paper we describe an Assertion Checking Environment (ACE) for compositional verification of programs written in an industrially sponsored safe subset of the C programming language called MISRA C [Guidelines for the Use of the C Language in Vehicle Based Software, 1998]. The theory is based on Hoare logic [Commun. ACM 12 (1969) 576], and the C programs are verified using a static assertion checking technique. First, the functional specifications of the program, captured in the form of pre- and post-conditions for each C function, are derived from the system specifications. These pre- and post-conditions are then introduced as assertions (also called annotations or formal comments) in the program code. The assertions are then proved formally using ACE and the theorem proving tool Stanford Temporal Prover [The Stanford Temporal Prover User's Manual, 1998]. ACE has been developed by us and consists mainly of a translator c2spl, a GUI and some utility programs. The technique and tools developed are targeted towards verification of real-time embedded software.
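
    The pre-/post-condition style of specification that ACE checks statically can be mimicked dynamically in a few lines. The sketch below is written in Python rather than MISRA C, uses a made-up integer-division example, and only illustrates the Hoare-style contract idea; it says nothing about the ACE toolchain, c2spl or the Stanford Temporal Prover.

        def divmod_contract(a: int, b: int):
            # precondition (Hoare-style "requires" clause)
            assert b > 0, "pre: divisor must be positive"
            q, r = a // b, a % b
            # postcondition ("ensures" clause): quotient/remainder decomposition
            assert a == q * b + r and 0 <= r < b, "post: decomposition violated"
            return q, r

        print(divmod_contract(17, 5))   # (3, 2)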

  1. Application of model-based and knowledge-based measuring methods as analytical redundancy

    International Nuclear Information System (INIS)

    Hampel, R.; Kaestner, W.; Chaker, N.; Vandreier, B.

    1997-01-01

    The safe operation of nuclear power plants requires the application of modern and intelligent methods of signal processing for normal operation as well as for the management of accident conditions. Such modern and intelligent methods are model-based and knowledge-based ones, founded on analytical knowledge (mathematical models) as well as on experience (fuzzy information). In addition to the existing hardware redundancies, analytical redundancies will be established with the help of these modern methods. These analytical redundancies support the operating staff during decision-making. The design of a hybrid model-based and knowledge-based measuring method will be demonstrated by the example of a fuzzy-supported observer. Within the fuzzy-supported observer a classical linear observer is connected with a fuzzy-supported adaptation of the model matrices of the observer model. This application is realized for the estimation of non-measurable variables such as steam content and mixture level within pressure vessels containing a water-steam mixture during accidental depressurizations. For this example the existing non-linearities will be classified and the verification of the model will be explained. The advantages of the hybrid method in comparison to classical model-based measuring methods will be demonstrated by the estimation results. The consideration of the parameters which have an important influence on the non-linearities requires the inclusion of high-dimensional structures of fuzzy logic within the model-based measuring methods. Therefore, methods will be presented which allow the conversion of these high-dimensional structures into two-dimensional structures of fuzzy logic. As an efficient solution of this problem, a method based on cascaded fuzzy controllers will be presented. (author). 2 refs, 12 figs, 5 tabs

  2. Simulation-Based Performance Evaluation of Predictive-Hashing Based Multicast Authentication Protocol

    Directory of Open Access Journals (Sweden)

    Seonho Choi

    2012-12-01

    Full Text Available A predictive-hashing based Denial-of-Service (DoS) resistant multicast authentication protocol was proposed based upon predictive-hashing, one-way key chain, erasure codes, and distillation codes techniques [4, 5]. It was claimed that this new scheme should be more resistant to various types of DoS attacks, and its worst-case resource requirements were derived in terms of coarse-level system parameters including CPU times for signature verification and erasure/distillation decoding operations, attack levels, etc. To show the effectiveness of our approach and to analyze the exact resource requirements in various attack scenarios with different parameter settings, we designed and implemented a platform-independent attack simulator. Various attack scenarios may be created with different attack types and parameters against a receiver equipped with the predictive-hashing based protocol. The design of the simulator is explained, and the simulation results are presented with detailed resource usage statistics. In addition, the resistance level to various types of DoS attacks is formulated with a newly defined resistance metric. By comparing these results to those from another approach, PRABS [8], we show that the resistance level of our protocol is greatly enhanced even in the presence of many attack streams.
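
    One of the building blocks named above, the one-way key chain, is easy to sketch: keys are generated by repeated hashing and disclosed in reverse generation order, so each newly disclosed key can be authenticated by hashing it and comparing against the previously accepted one. This is a generic sketch of the primitive, not of the protocol in the paper; the seed and chain length are arbitrary.

        import hashlib

        def make_key_chain(seed: bytes, length: int):
            """Generate keys by repeated hashing; they are disclosed in reverse order."""
            chain = [hashlib.sha256(seed).digest()]
            for _ in range(length - 1):
                chain.append(hashlib.sha256(chain[-1]).digest())
            return list(reversed(chain))      # first element is the committed anchor

        def verify_next(previous_key: bytes, disclosed_key: bytes) -> bool:
            """A newly disclosed key is authentic if hashing it yields the previous one."""
            return hashlib.sha256(disclosed_key).digest() == previous_key

        chain = make_key_chain(b"secret seed", 5)
        anchor = chain[0]                     # assumed distributed authentically in advance
        print(all(verify_next(chain[i], chain[i + 1]) for i in range(len(chain) - 1)))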

  3. An Observed Voting System Based On Biometric Technique

    Directory of Open Access Journals (Sweden)

    B. Devikiruba

    2015-08-01

    Full Text Available ABSTRACT This article describes a computational framework, able to run on almost every computer connected to an IP-based network, for studying biometric techniques. The paper discusses a system in which the protection of confidential information places strong security demands on user identification. Biometry provides a user-friendly method for this identification and is becoming a competitor to current identification mechanisms. The experimentation section focuses on biometric verification based specifically on fingerprints. The article should also be read as a warning to those thinking of using identification methods without first examining the technical opportunities for compromising those mechanisms and the associated legal consequences. The development is based on the Java language, which makes it easy to extend the software packages used to test new control techniques.
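
    Whatever fingerprint matcher is used, the verification decision itself typically reduces to comparing a match score against a threshold chosen to balance false accepts against false rejects. The sketch below uses synthetic genuine and impostor score distributions and an arbitrary threshold purely to illustrate that decision step; it is not the framework described in the article.

        import random

        def verify(match_score: float, threshold: float) -> bool:
            """Accept the claimed identity if the matcher's score clears the threshold."""
            return match_score >= threshold

        random.seed(42)
        genuine  = [random.gauss(0.80, 0.05) for _ in range(1000)]   # same-finger scores
        impostor = [random.gauss(0.40, 0.08) for _ in range(1000)]   # different-finger scores
        thr = 0.62
        far = sum(verify(s, thr) for s in impostor) / len(impostor)      # false accept rate
        frr = sum(not verify(s, thr) for s in genuine) / len(genuine)    # false reject rate
        print(f"threshold={thr}  FAR={far:.3f}  FRR={frr:.3f}")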

  4. Efficient family-based model checking via variability abstractions

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus

    2016-01-01

    Many software systems are variational: they can be configured to meet diverse sets of requirements. They can produce a (potentially huge) number of related systems, known as products or variants, by systematically reusing common parts. For variational models (variational systems or families of related systems), specialized family-based model checking algorithms allow efficient verification of multiple variants, simultaneously, in a single run. These algorithms, implemented in a tool Snip, scale much better than "the brute force" approach, where all individual systems are verified using ... with the abstract model checking of the concrete high-level variational model. This allows the use of Spin with all its accumulated optimizations for efficient verification of variational models without any knowledge about variability. We have implemented the transformations in a prototype tool, and we illustrate ...

  5. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Therefore verification calculations were focused on the catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. Verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  6. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    Science.gov (United States)

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software programs in volumetric modulated arc therapy (VMAT), using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729 detector chamber array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with the isocentre in a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet based MUVC program and DIAMOND SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93 ± 1.59)% and (1.37 ± 2.72)% for the in-house Excel spreadsheet based MUVC program and DIAMOND SCS, respectively. For 26 clinically approved VMAT plans with the isocentre in a region below -350 HU, both the in-house spreadsheet based MUVC program and DIAMOND SCS showed higher variations. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet based MUVC program and DIAMOND SCS can be used as a simple and fast complement to measurement-based verification for plans with the isocentre in a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
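
    The figures quoted above are average percentage deviations of the secondary-check result from the TPS. A minimal sketch of that bookkeeping, with invented per-plan dose values, is given below; the same statistics (mean and standard deviation of the per-plan deviations) underlie tolerance or confidence limits of the mean ± 2SD type.

        import statistics

        def percent_deviation(tps_dose, check_dose):
            """Deviation of the independent check from the TPS, in percent of the TPS dose."""
            return 100.0 * (check_dose - tps_dose) / tps_dose

        # hypothetical per-plan isocentre doses (Gy): TPS vs. independent MU check
        tps   = [2.00, 1.98, 2.02, 2.05, 1.99, 2.01]
        check = [1.97, 1.99, 2.00, 2.08, 1.96, 2.03]
        devs  = [percent_deviation(t, c) for t, c in zip(tps, check)]
        mean, sd = statistics.mean(devs), statistics.stdev(devs)
        print(f"mean deviation = {mean:+.2f}%  (mean ± 2SD: "
              f"{mean - 2*sd:+.2f}% to {mean + 2*sd:+.2f}%)")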

  7. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The paper considers why verification of software products throughout the software life cycle is necessary. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  8. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  9. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    International Nuclear Information System (INIS)

    Yamashita, M; Kokubo, M; Takahashi, R; Takayama, K; Tanabe, H; Sueoka, M; Okuuchi, N; Ishii, M; Iwamoto, Y; Tachibana, H

    2016-01-01

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been available for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients’ treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from a general-purpose linac. Results: The CLs for conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: Independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by Japan Agency for Medical Research and

  10. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    Energy Technology Data Exchange (ETDEWEB)

    Yamashita, M; Kokubo, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Takayama, K [Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Tanabe, H; Sueoka, M; Okuuchi, N [Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Ishii, M; Iwamoto, Y [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been available for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients’ treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from a general-purpose linac. Results: The CLs for conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: Independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by Japan Agency for Medical Research and

  11. The use of measurement uncertainty in nuclear materials accuracy and verification

    International Nuclear Information System (INIS)

    Alique, O.; Vaccaro, S.; Svedkauskaite, J.

    2015-01-01

    EURATOM nuclear safeguards are based on the nuclear operators’ accounting for and declaring the amounts of nuclear materials in their possession, as well as on the European Commission verifying the correctness and completeness of such declarations by means of conformity assessment practices. Both the accountancy and the verification processes comprise measurements of the amounts and characteristics of nuclear materials. The uncertainties associated with these measurements play an important role in the reliability of the results of nuclear material accountancy and verification. The document “JCGM 100:2008 Evaluation of measurement data – Guide to the expression of uncertainty in measurement” - issued jointly by the International Bureau of Weights and Measures (BIPM) and international organisations for metrology, standardisation and accreditation in chemistry, physics and electrotechnology - describes a universal, internally consistent, transparent and applicable method for the evaluation and expression of uncertainty in measurements. This paper discusses different processes of nuclear materials accountancy and verification where measurement uncertainty plays a significant role. It also suggests how measurement uncertainty could be used to enhance the reliability of the results of the nuclear materials accountancy and verification processes.
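
    For a purely multiplicative measurement model, the GUM procedure referenced above reduces to adding the relative variances of the independent input quantities in quadrature. The sketch below applies this to a hypothetical tank volume and uranium concentration; the values and uncertainties are invented for illustration only.

        import math

        def combined_relative_uncertainty(value_u_pairs):
            """GUM-style combination for a multiplicative model of independent inputs:
            relative variances add in quadrature."""
            rel_var = sum((u / v) ** 2 for v, u in value_u_pairs)
            return math.sqrt(rel_var)

        # hypothetical inputs: tank volume (L) and U concentration (g/L), with standard uncertainties
        volume, u_volume = 12_500.0, 25.0
        conc,   u_conc   = 310.0,    1.2
        mass = volume * conc / 1000.0                       # kg of uranium
        u_rel = combined_relative_uncertainty([(volume, u_volume), (conc, u_conc)])
        print(f"mass = {mass:.1f} kg, combined standard uncertainty = {mass * u_rel:.1f} kg "
              f"({100 * u_rel:.2f} %)")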

  12. On the Verification of a WiMax Design Using Symbolic Simulation

    Directory of Open Access Journals (Sweden)

    Gabriela Nicolescu

    2013-07-01

    Full Text Available In top-down multi-level design methodologies, design descriptions at higher levels of abstraction are incrementally refined to the final realizations. Simulation-based techniques have traditionally been used to verify that such model refinements do not change the design functionality. Unfortunately, with computer simulations it is not possible to completely check that a design transformation is correct in a reasonable amount of time, as the number of test patterns required to do so increases exponentially with the number of system state variables. In this paper, we propose a methodology for verifying that models generated at higher levels of abstraction in the design process conform to the design specifications. We model the system behavior using sequences of recurrence equations. We then use symbolic simulation together with equivalence checking and property checking techniques for design verification. Using our proposed method, we have verified the equivalence of three WiMax system models at different levels of design abstraction, and the correctness of various system properties on those models. Our symbolic modeling and verification experiments show that the proposed verification methodology provides a performance advantage over its numerical counterpart.
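
    The equivalence-checking idea, showing that a refined description computes the same function as the original for all inputs rather than for sampled test patterns, can be illustrated with a symbolic check on a toy pair of expressions. The expressions below are invented and unrelated to the WiMax models; sympy merely stands in for the symbolic engine.

        import sympy as sp

        x, y = sp.symbols("x y")
        spec = (x + y) ** 2                       # "high-level" description
        impl = x**2 + 2*x*y + y**2                # "refined" implementation
        print("equivalent:", sp.simplify(spec - impl) == 0)   # True for all x, y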

  13. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF 6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  14. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF{sub 6} cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  15. Evidence-based hypnotherapy for depression.

    Science.gov (United States)

    Alladin, Assen

    2010-04-01

    Cognitive hypnotherapy (CH) is a comprehensive evidence-based hypnotherapy for clinical depression. This article describes the major components of CH, which integrate hypnosis with cognitive-behavior therapy as the latter provides an effective host theory for the assimilation of empirically supported treatment techniques derived from various theoretical models of psychotherapy and psychopathology. CH meets criteria for an assimilative model of psychotherapy, which is considered to be an efficacious model of psychotherapy integration. The major components of CH for depression are described in sufficient detail to allow replication, verification, and validation of the techniques delineated. CH for depression provides a template that clinicians and investigators can utilize to study the additive effects of hypnosis in the management of other psychological or medical disorders. Evidence-based hypnotherapy and research are encouraged; such a movement is necessary if clinical hypnosis is to integrate into mainstream psychotherapy.

  16. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release. ... Using SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.

  17. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release. ... Using SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.

  18. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  19. Lower and Upper Bounds in Zone-Based Abstractions of Timed Automata

    DEFF Research Database (Denmark)

    Behrmann, Gerd; Bouyer, Patricia; Larsen, Kim Guldstrand

    2005-01-01

    The semantics of timed automata is defined using an infinite-state transition system. For verification purposes, one usually uses zone based abstractions w.r.t. the maximal constants to which clocks of the timed automaton are compared. We show that by distinguishing maximal lower and upper bounds...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ANEST IWATA CORPORATION LPH400-LV HVLP SPRAY GUN

    Science.gov (United States)

    This Environmental Technology Verification report describes the characteristics of a paint spray gun. The research showed that the spray gun provided absolute and relative increases in transfer efficiency over the baseline and a reduction in the amount of paint used.

  1. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
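
    Code verification of the kind described above usually quantifies convergence by computing an observed order of accuracy from solutions on successively refined grids and comparing it with the theoretical order. The sketch below performs that arithmetic for three hypothetical grid-level results with a refinement ratio of 2; the values are invented.

        import math

        def observed_order(f_coarse, f_medium, f_fine, refinement_ratio=2.0):
            """Observed order of accuracy p from three grid levels (Richardson-style)."""
            return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) \
                   / math.log(refinement_ratio)

        # hypothetical integrated quantity on coarse/medium/fine grids
        f1, f2, f3 = 0.9720, 0.9930, 0.99825   # errors shrink by ~4x => ~2nd order
        print(f"observed order of accuracy: {observed_order(f1, f2, f3):.2f}")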

  2. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    International Nuclear Information System (INIS)

    Baba, H; Tachibana, H; Kamima, T; Takahashi, R; Kawai, D; Sugawara, Y; Yamamoto, T; Sato, A; Yamashita, M

    2015-01-01

    Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for prostate and head and neck (HN) sites were collected from the institutes, where the planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed in patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in the plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9 % and −5.6±3.6 % for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9 % and −3.0±3.7 % for the prostate and HN sites, respectively. There was a negative systematic difference with a similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot account for, and therefore underestimates, the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT, and the tolerance level would then be within 5%

  3. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  4. Automated Verification of Quantum Protocols using MCMAS

    Directory of Open Access Journals (Sweden)

    F. Belardinelli

    2012-07-01

    Full Text Available We present a methodology for the automated verification of quantum protocols using MCMAS, a symbolic model checker for multi-agent systems. The method is based on the logical framework developed by D'Hondt and Panangaden for investigating epistemic and temporal properties, built on the model for Distributed Measurement-based Quantum Computation (DMC), an extension of the Measurement Calculus to distributed quantum systems. We describe the translation map from DMC to interpreted systems, the typical formalism for reasoning about time and knowledge in multi-agent systems. Then, we introduce dmc2ispl, a compiler into the input language of the MCMAS model checker. We demonstrate the technique by verifying the Quantum Teleportation Protocol, and discuss the performance of the tool.

  5. Systemverilog for verification a guide to learning the testbench language features

    CERN Document Server

    Spear, Chris

    2012-01-01

    Based on the highly successful second edition, this extended edition of SystemVerilog for Verification: A Guide to Learning the Testbench Language Features teaches all verification features of the SystemVerilog language, providing hundreds of examples to clearly explain the concepts and basic fundamentals. It contains materials for both the full-time verification engineer and the student learning this valuable skill. In the third edition, authors Chris Spear and Greg Tumbush start with how to verify a design, and then use that context to demonstrate the language features,  including the advantages and disadvantages of different styles, allowing readers to choose between alternatives. This textbook contains end-of-chapter exercises designed to enhance students’ understanding of the material. Other features of this revision include: New sections on static variables, print specifiers, and DPI from the 2009 IEEE language standard Descriptions of UVM features such as factories, the test registry, and the config...

  6. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    International Nuclear Information System (INIS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of the sensitivity and uncertainty analysis code based on GPT is described, and preliminary verification calculations on the PMR200 pin cell problem were carried out. The results are in good agreement with those from TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainties of the homogenized few-group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it utilizes the generalized perturbation theory (GPT) for homogenized few-group cross sections in the first step and a stochastic sampling method for general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code. For this, the generalized adjoint solver had been integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code, based on classical perturbation theory, was extended with the capability of sensitivity and uncertainty analysis for few-group cross sections based on GPT. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin cell problem are compared with the results from TSUNAMI of SCALE 6.1
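
    In the first step, GPT-based uncertainty propagation amounts to the so-called sandwich rule: the relative variance of a response is S^T C S, with S the vector of relative sensitivities and C the relative covariance matrix of the underlying data. The sensitivity and covariance values below are invented solely to show the arithmetic.

        import numpy as np

        def response_variance(sensitivities, covariance):
            """Sandwich rule: var(R) = S^T C S for relative sensitivities and covariances."""
            s = np.asarray(sensitivities)
            c = np.asarray(covariance)
            return float(s @ c @ s)

        # hypothetical relative sensitivities of a few-group cross section to 3 data items
        S = [0.8, -0.3, 0.1]
        # hypothetical relative covariance matrix of those data (variances on the diagonal)
        C = [[4.0e-4, 1.0e-4, 0.0],
             [1.0e-4, 9.0e-4, 0.0],
             [0.0,    0.0,    1.0e-4]]
        rel_sd = np.sqrt(response_variance(S, C))
        print(f"relative standard deviation of the response: {100 * rel_sd:.2f} %")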

  7. V&V Plan for FPGA-based ESF-CCS Using System Engineering Approach.

    Science.gov (United States)

    Maerani, Restu; Mayaka, Joyce; El Akrat, Mohamed; Cheon, Jung Jae

    2018-02-01

    Instrumentation and Control (I&C) systems play an important role in maintaining the safety of Nuclear Power Plant (NPP) operation. However, most current I&C safety systems are based on Programmable Logic Controller (PLC) hardware, which is difficult to verify and validate and is susceptible to software common cause failure. Therefore, a plan for the replacement of the PLC-based safety systems, such as the Engineered Safety Feature - Component Control System (ESF-CCS), with Field Programmable Gate Arrays (FPGAs) is needed. By using a systems engineering approach, which ensures traceability in every phase of the life cycle, from system requirements and design implementation to verification and validation, the system development is guaranteed to be in line with the regulatory requirements. The verification process will ensure that the customer's and stakeholders' needs are satisfied in a high quality, trustworthy, cost efficient and schedule compliant manner throughout the system's entire life cycle. The benefit of the V&V plan is to ensure that the FPGA-based ESF-CCS is correctly built and that the measurement of performance indicators provides positive feedback on whether the right thing is being done during the re-engineering process of the FPGA-based ESF-CCS.

  8. SU-E-T-256: Development of a Monte Carlo-Based Dose-Calculation System in a Cloud Environment for IMRT and VMAT Dosimetric Verification

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, Y [Tokai University School of Medicine, Isehara, Kanagawa (Japan)

    2015-06-15

    Purpose: Intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) are techniques that are widely used for treating cancer due to better target coverage and critical structure sparing. The increasing complexity of IMRT and VMAT plans leads to decreases in dose calculation accuracy. Monte Carlo simulations are the most accurate method for the determination of dose distributions in patients. However, the simulation settings for modeling an accurate treatment head are very complex and time consuming. The purpose of this work is to report our implementation of a simple Monte Carlo simulation system in a cloud-computing environment for dosimetric verification of IMRT and VMAT plans. Methods: Monte Carlo simulations of a Varian Clinac linear accelerator were performed using the BEAMnrc code, and dose distributions were calculated using the DOSXYZnrc code. Input files for the simulations were automatically generated from DICOM RT files by the developed web application. We therefore must only upload the DICOM RT files through the web interface, and the simulations are run in the cloud. The calculated dose distributions were exported to RT Dose files that can be downloaded through the web interface. The accuracy of the calculated dose distribution was verified by dose measurements. Results: IMRT and VMAT simulations were performed and good agreement results were observed for measured and MC dose comparison. Gamma analysis with a 3% dose and 3 mm DTA criteria shows a mean gamma index value of 95% for the studied cases. Conclusion: A Monte Carlo-based dose calculation system has been successfully implemented in a cloud environment. The developed system can be used for independent dose verification of IMRT and VMAT plans in routine clinical practice. The system will also be helpful for improving accuracy in beam modeling and dose calculation in treatment planning systems. This work was supported by JSPS KAKENHI Grant Number 25861057.

  9. SU-E-T-256: Development of a Monte Carlo-Based Dose-Calculation System in a Cloud Environment for IMRT and VMAT Dosimetric Verification

    International Nuclear Information System (INIS)

    Fujita, Y

    2015-01-01

    Purpose: Intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) are techniques that are widely used for treating cancer due to better target coverage and critical structure sparing. The increasing complexity of IMRT and VMAT plans leads to decreases in dose calculation accuracy. Monte Carlo simulations are the most accurate method for the determination of dose distributions in patients. However, the simulation settings for modeling an accurate treatment head are very complex and time consuming. The purpose of this work is to report our implementation of a simple Monte Carlo simulation system in a cloud-computing environment for dosimetric verification of IMRT and VMAT plans. Methods: Monte Carlo simulations of a Varian Clinac linear accelerator were performed using the BEAMnrc code, and dose distributions were calculated using the DOSXYZnrc code. Input files for the simulations were automatically generated from DICOM RT files by the developed web application. We therefore must only upload the DICOM RT files through the web interface, and the simulations are run in the cloud. The calculated dose distributions were exported to RT Dose files that can be downloaded through the web interface. The accuracy of the calculated dose distribution was verified by dose measurements. Results: IMRT and VMAT simulations were performed and good agreement results were observed for measured and MC dose comparison. Gamma analysis with a 3% dose and 3 mm DTA criteria shows a mean gamma index value of 95% for the studied cases. Conclusion: A Monte Carlo-based dose calculation system has been successfully implemented in a cloud environment. The developed system can be used for independent dose verification of IMRT and VMAT plans in routine clinical practice. The system will also be helpful for improving accuracy in beam modeling and dose calculation in treatment planning systems. This work was supported by JSPS KAKENHI Grant Number 25861057

  10. V&V Within Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1996-01-01

    Verification and Validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission-critical software. V&V is a systems engineering discipline that evaluates the software in a systems context, and is currently applied during the development of a specific application system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  11. A gradient based algorithm to solve inverse plane bimodular problems of identification

    Science.gov (United States)

    Ran, Chunjiang; Yang, Haitian; Zhang, Guoqing

    2018-02-01

    This paper presents a gradient based algorithm to solve inverse plane bimodular problems of identifying constitutive parameters, including tensile/compressive moduli and tensile/compressive Poisson's ratios. For the forward bimodular problem, a FE tangent stiffness matrix is derived, facilitating the implementation of gradient based algorithms; for the inverse bimodular problem of identification, a two-level sensitivity-analysis based strategy is proposed. Numerical verification in terms of accuracy and efficiency is provided, and the impacts of the initial guess, the number of measurement points, regional inhomogeneity, and noisy data on the identification are taken into account.
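
    As a toy analogue of such gradient-based identification, the sketch below recovers two parameters of a simple algebraic forward model from noisy synthetic "measurements" by least-squares gradient descent. The model, step size and data are invented; the actual method additionally involves a FE tangent stiffness matrix and a two-level sensitivity analysis, which are not reproduced here.

        import numpy as np

        def identify(x, y_meas, lr=0.1, iters=5000):
            """Least-squares gradient descent for y = a*x + b*x**3 (toy forward model)."""
            a, b = 1.0, 1.0                        # initial guess
            for _ in range(iters):
                y = a * x + b * x ** 3
                r = y - y_meas                     # residual
                grad_a = 2.0 * np.mean(r * x)      # d/da of mean squared residual
                grad_b = 2.0 * np.mean(r * x ** 3)
                a -= lr * grad_a
                b -= lr * grad_b
            return a, b

        rng = np.random.default_rng(3)
        x = np.linspace(0.0, 1.0, 50)
        y = 2.5 * x + 0.8 * x ** 3 + rng.normal(0, 0.01, x.size)   # synthetic measurements
        print("identified (a, b):", identify(x, y))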

  12. Generator of text-based assignments for programming courses

    OpenAIRE

    Jager, Mojca

    2013-01-01

    Verifying and assessing knowledge are an important part of education. A teacher can verify knowledge in different ways: classically, through oral and written examinations, which are still the dominant forms, or alternatively, based on students' ongoing activities. When assembling questions for written tests, many teachers nowadays use test generators such as Hot Potatoes, Moodle, Test Pilot and others, all of which are available on the internet. At t...

  13. Clinical application of in vivo treatment delivery verification based on PET/CT imaging of positron activity induced at high energy photon therapy

    Science.gov (United States)

    Janek Strååt, Sara; Andreassen, Björn; Jonsson, Cathrine; Noz, Marilyn E.; Maguire, Gerald Q., Jr.; Näfstadius, Peder; Näslund, Ingemar; Schoenahl, Frederic; Brahme, Anders

    2013-08-01

    The purpose of this study was to investigate in vivo verification of radiation treatment with high energy photon beams using PET/CT to image the induced positron activity. Measurements of the positron activation induced in a preoperative rectal cancer patient and a prostate cancer patient following 50 MV photon treatments are presented. Total doses of 5 and 8 Gy, respectively, were delivered to the tumors. Imaging was performed with a 64-slice PET/CT scanner for 30 min, starting 7 min after the end of the treatment. The CT volume from the PET/CT and the treatment planning CT were coregistered by matching anatomical reference points in the patient. The treatment delivery was imaged in vivo based on the distribution of the induced positron emitters produced by photonuclear reactions in tissue, mapped onto the associated dose distribution of the treatment plan. The results showed that the spatial distribution of induced activity in both patients agreed well with the delivered beam portals of the treatment plans in the entrance subcutaneous fat regions, but less so in blood- and oxygen-rich soft tissues. For the preoperative rectal cancer patient, however, a 2 ± 0.5 cm misalignment was observed in the cranial-caudal direction of the patient between the induced activity distribution and the treatment plan, indicating a beam-patient setup error. No misalignment of this kind was seen in the prostate cancer patient. However, due to an error in the rapid patient setup in the PET/CT scanner, a slight mis-positioning of the patient in the PET/CT was observed in all three planes, resulting in a deformed activity distribution compared to the treatment plan. The present study indicates that the positron emitters induced by high energy photon beams can be measured quite accurately using PET imaging of subcutaneous fat to allow portal verification of the delivered treatment beams. Measurement of the induced activity in the patient 7 min after receiving 5 Gy involved count rates which were about

  14. Verification of hypergraph states

    Science.gov (United States)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.

  15. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
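    The hydrostatic relations underlying bubbler-probe measurements can be sketched as follows; the pressures and probe separation below are hypothetical, and the sketch does not reproduce the BNL electromanometer algorithms or their temperature corrections:

```python
# Illustrative hydrostatic relations behind bubbler-probe tank measurements
# (hypothetical numbers; not the BNL electromanometer algorithms).
G = 9.80665  # m/s^2

def liquid_height(delta_p_pa, density_kg_m3):
    """Height of liquid column above the bubbler probe from differential pressure."""
    return delta_p_pa / (density_kg_m3 * G)

def density_between_probes(p_low_pa, p_high_pa, probe_separation_m):
    """Solution density inferred from two probes at known vertical separation."""
    return (p_low_pa - p_high_pa) / (G * probe_separation_m)

rho = density_between_probes(p_low_pa=26_500.0, p_high_pa=14_800.0, probe_separation_m=1.0)
print(f"density ~ {rho:.1f} kg/m^3")
print(f"height above lower probe ~ {liquid_height(26_500.0, rho):.3f} m")
```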

  16. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  17. Verification of space weather forecasts at the UK Met Office

    Science.gov (United States)

    Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.

    2017-12-01

    The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users, both to understand the performance of these forecasts and to identify strengths and weaknesses that enable further development. Met Office terrestrial near-real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves, reliability diagrams, and rolling Ranked Probability Skill Scores (RPSSs), thus providing an understanding of forecast performance and skill. Results suggest that the MOSWOC-issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
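    As an illustration of one of the scores mentioned, a minimal sketch of a Ranked Probability Skill Score computed against a climatological benchmark is given below (the categories, forecast probabilities and climatological frequencies are invented for the example):

```python
# Sketch: Ranked Probability Skill Score of a probabilistic forecast against a
# climatological benchmark (hypothetical categories and probabilities).
import numpy as np

def rps(prob_forecasts, outcomes):
    """Mean Ranked Probability Score; prob_forecasts: (n, k) category probabilities,
    outcomes: (n,) indices of the observed category."""
    prob_forecasts = np.asarray(prob_forecasts, dtype=float)
    n, k = prob_forecasts.shape
    obs = np.zeros((n, k))
    obs[np.arange(n), outcomes] = 1.0
    cdf_f = np.cumsum(prob_forecasts, axis=1)
    cdf_o = np.cumsum(obs, axis=1)
    return np.mean(np.sum((cdf_f - cdf_o) ** 2, axis=1))

# Three activity categories (quiet / active / storm), five forecast days.
forecasts = [[0.7, 0.2, 0.1], [0.5, 0.3, 0.2], [0.2, 0.5, 0.3], [0.6, 0.3, 0.1], [0.1, 0.3, 0.6]]
observed = [0, 1, 1, 0, 2]
climatology = [[0.6, 0.3, 0.1]] * 5          # assumed climatological frequencies

rpss = 1.0 - rps(forecasts, observed) / rps(climatology, observed)
print(f"RPSS vs climatology: {rpss:.3f}")     # > 0 means more skilful than climatology
```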

  18. Expert system verification and validation for nuclear power industry applications

    International Nuclear Information System (INIS)

    Naser, J.A.

    1990-01-01

    The potential for the use of expert systems in the nuclear power industry is widely recognized. The benefits of such systems include consistency of reasoning during off-normal situations when humans are under great stress, the reduction of times required to perform certain functions, the prevention of equipment failures through predictive diagnostics, and the retention of human expertise in performing specialized functions. The increased use of expert systems brings with it concerns about their reliability. Difficulties arising from software problems can affect plant safety, reliability, and availability. A joint project between EPRI and the US Nuclear Regulatory Commission is being initiated to develop a methodology for verification and validation of expert systems for nuclear power applications. This methodology will be tested on existing and developing expert systems. This effort will explore the applicability of conventional verification and validation methodologies to expert systems. The major area of concern will be certification of the knowledge base. This is expected to require new types of verification and validation techniques. A methodology for developing validation scenarios will also be studied

  19. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  20. Multi-Site Calibration of Linear Reservoir Based Geomorphologic Rainfall-Runoff Models

    Directory of Open Access Journals (Sweden)

    Bahram Saeidifarzad

    2014-09-01

    Multi-site optimization of two adapted event-based geomorphologic rainfall-runoff models is presented, using the Non-dominated Sorting Genetic Algorithm (NSGA-II) method for the South Fork Eel River watershed, California. The first model was developed based on an Unequal Cascade of Reservoirs (UECR) and the second model is a modified version of the Geomorphological Unit Hydrograph based on Nash's model (GUHN). Two calibration strategies, semi-lumped and semi-distributed, were considered for imposing (or not imposing) the geomorphology relations in the models. The results of the models were compared with Nash's model. Results obtained using the observed data of two stations in the multi-site optimization framework showed reasonable efficiency values in both the calibration and the verification steps. The outcomes also showed that semi-distributed calibration of the modified GUHN model slightly outperformed the other models in both upstream and downstream stations during calibration. Both calibration strategies for the developed UECR model showed slightly better performance in the downstream station during the verification phase, but in the upstream station the modified GUHN model with the semi-lumped strategy slightly outperformed the other models. The semi-lumped calibration strategy could lead to lag time parameters logically related to the basin geomorphology and may be more suitable for data-based statistical analyses of the rainfall-runoff process.
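    The abstract reports efficiency values for calibration and verification without naming the measure; assuming a Nash-Sutcliffe efficiency is used, a minimal multi-site evaluation sketch looks like this (the observed and simulated series are synthetic, not the South Fork Eel River data):

```python
# Sketch: Nash-Sutcliffe efficiency for multi-site calibration/verification
# checks (synthetic series; not the South Fork Eel River data).
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed = np.asarray(observed, float); simulated = np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs_upstream = np.array([12.0, 30.0, 55.0, 41.0, 22.0, 15.0])
sim_upstream = np.array([10.0, 33.0, 50.0, 44.0, 20.0, 17.0])
obs_downstream = np.array([20.0, 48.0, 90.0, 70.0, 35.0, 25.0])
sim_downstream = np.array([22.0, 45.0, 84.0, 74.0, 38.0, 23.0])

for name, o, s in [("upstream", obs_upstream, sim_upstream),
                   ("downstream", obs_downstream, sim_downstream)]:
    print(name, "NSE =", round(nash_sutcliffe(o, s), 3))
```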

  1. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error

  2. Knowledge Representation and Inference for Analysis and Design of Database and Tabular Rule-Based Systems

    Directory of Open Access Journals (Sweden)

    Antoni Ligeza

    2001-01-01

    Rule-based systems constitute a powerful tool for the specification of knowledge in the design and implementation of knowledge-based systems. They also provide a universal programming paradigm for domains such as intelligent control, decision support, situation classification and operational knowledge encoding. In order to assure safe and reliable performance, such systems should satisfy certain formal requirements, including completeness and consistency. This paper addresses the issue of analysis and verification of selected properties of a class of such systems in a systematic way. A uniform, tabular scheme of single-level rule-based systems is considered. Such systems can be applied as a generalized form of databases for the specification of data patterns (unconditional knowledge), or can be used for defining attributive decision tables (conditional knowledge) in the form of rules. They can also serve as lower-level components of hierarchical multi-level control and decision support knowledge-based systems. An algebraic knowledge representation paradigm using an extended tabular representation, similar to relational database tables, is presented, and algebraic bases for system analysis, verification and design support are outlined.
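    A minimal sketch of the kind of completeness and consistency checks such tabular rule bases call for, over a small hypothetical attribute space (the attributes, rules and decisions are invented and do not follow the paper's algebraic formalism):

```python
# Sketch of completeness/consistency checks for a small tabular rule base
# (hypothetical attributes and rules; not the paper's algebraic formalism).
from itertools import product

ATTRIBUTE_DOMAINS = {"temperature": ["low", "normal", "high"],
                     "pressure": ["normal", "high"]}

# Each rule: (precondition dict, decision)
RULES = [({"temperature": "high", "pressure": "high"}, "trip"),
         ({"temperature": "high", "pressure": "normal"}, "alarm"),
         ({"temperature": "normal"}, "ok"),               # pressure unconstrained
         ({"temperature": "low"}, "ok")]

def matches(precondition, state):
    return all(state[a] == v for a, v in precondition.items())

def check(rules, domains):
    uncovered, conflicts = [], []
    for values in product(*domains.values()):
        state = dict(zip(domains, values))
        decisions = {d for pre, d in rules if matches(pre, state)}
        if not decisions:
            uncovered.append(state)                       # incompleteness
        elif len(decisions) > 1:
            conflicts.append((state, decisions))          # inconsistency
    return uncovered, conflicts

uncovered, conflicts = check(RULES, ATTRIBUTE_DOMAINS)
print("uncovered states:", uncovered)
print("conflicting states:", conflicts)
```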

  3. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, Vicky L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bacon, Diana H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fang, Yilin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  4. Evaluation of Face Detection Algorithms for the Bank Client Identity Verification

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2017-06-01

    Results of an investigation of the efficiency of face detection algorithms in a banking client visual verification system are presented. The video recordings were made under real conditions encountered in three bank operating outlets, employing a miniature industrial USB camera. The aim of the experiments was to check the practical usability of the face detection method in a biometric bank client verification system. The main assumption was to keep the user's interaction with the application as simple as possible. The applied face detection algorithms are described, and the results of face detection achieved under real bank environment conditions are presented. Practical limitations of the application, based on the problems encountered, are discussed.
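    A minimal face-detection sketch using OpenCV's bundled Haar cascade is shown below; the camera index, frame handling and detection parameters are assumptions and do not reproduce the system evaluated in the paper:

```python
# Minimal face-detection sketch with OpenCV's bundled Haar cascade; the camera
# index and frame handling are assumptions, not the paper's bank-floor setup.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(0)                 # miniature USB camera assumed at index 0
ok, frame = capture.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                     minSize=(60, 60))
    print(f"detected {len(faces)} face(s)")
    for (x, y, w, h) in faces:                # draw boxes on the captured frame
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
capture.release()
```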

  5. Phantoms for IMRT dose distribution measurement and treatment verification

    International Nuclear Information System (INIS)

    Low, Daniel A.; Gerber, Russell L.; Mutic, Sasa; Purdy, James A.

    1998-01-01

    Background: The verification of intensity-modulated radiation therapy (IMRT) patient treatment dose distributions is currently based on custom-built or modified dose measurement phantoms. The only commercially available IMRT treatment planning and delivery system (Peacock, NOMOS Corp.) is supplied with a film phantom that allows accurate spatial localization of the dose distribution using radiographic film. However, measurements using other dosimeters are necessary for the thorough verification of IMRT. Methods: We have developed a phantom to enable dose measurements using a cylindrical ionization chamber and the localization of prescription isodose curves using a matrix of thermoluminescent dosimetry (TLD) chips. The external phantom cross-section is identical to that of the commercial phantom, to allow direct comparisons of measurements. A supplementary phantom has been fabricated to verify the IMRT dose distributions for pelvis treatments. Results: To date, this phantom has been used for the verification of IMRT dose distributions for head and neck and prostate cancer treatments. Designs are also presented for a phantom insert to be used with polymerizing gels (e.g., BANG-2) to obtain volumetric dose distribution measurements. Conclusion: The phantoms have proven useful in the quantitative evaluation of IMRT treatments

  6. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Singh, G.P.; Cadena, D.; Burgess, J.

    1992-01-01

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  7. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function.

    Science.gov (United States)

    Xu, He; Ding, Jie; Li, Peng; Zhu, Feng; Wang, Ruchuan

    2018-03-02

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, while it brings convenience to daily life, security problems of RFID technology are also gradually being exposed. In particular, the appearance of a large number of fake and counterfeit goods has caused massive losses for both producers and customers, for which the cloned tag is a serious security threat. If attackers acquire the complete information of a tag, they can obtain the unique identifier of the tag by technological means. In general, because a tag has no extra identifier, it is difficult to distinguish an original tag from its clone. Once the legal tag data are obtained, attackers are able to clone the tag. Therefore, this paper presents an efficient RFID mutual verification protocol. The protocol is based on a Physical Unclonable Function (PUF) and lightweight cryptography to achieve efficient verification of a single tag. The protocol includes three processes: tag recognition, mutual verification and update. In tag recognition, the reader recognizes the tag; in mutual verification, the reader and tag verify the authenticity of each other; the update step maintains the latest secret key for the following verification. Analysis results show that this protocol achieves a good balance between performance and security.
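    A toy simulation of the PUF-style challenge-response idea is sketched below; the PUF is emulated with an HMAC over a per-tag secret, and the real protocol's message formats, update step and lightweight primitives are not reproduced:

```python
# Toy simulation of PUF-style challenge-response mutual authentication.
# The PUF is emulated with HMAC over a per-tag secret; the real protocol's
# message formats and key-update step are not reproduced here.
import hmac, hashlib, os

def puf_response(tag_secret: bytes, challenge: bytes) -> bytes:
    """Stand-in for the physically unclonable function of one tag."""
    return hmac.new(tag_secret, challenge, hashlib.sha256).digest()

tag_secret = os.urandom(32)                      # intrinsic to the tag hardware
server_crp_table = {}                            # server-side challenge-response pairs

# Enrollment: the server records a few challenge-response pairs for this tag.
for _ in range(4):
    c = os.urandom(16)
    server_crp_table[c] = puf_response(tag_secret, c)

# Verification: the reader sends a stored challenge; the tag answers with its PUF.
challenge, expected = next(iter(server_crp_table.items()))
tag_answer = puf_response(tag_secret, challenge)
print("tag authenticated:", hmac.compare_digest(tag_answer, expected))

# Tag-side check of the reader: the reader proves knowledge of the expected
# response by returning a digest bound to a tag-chosen nonce.
nonce = os.urandom(16)
reader_proof = hmac.new(expected, nonce, hashlib.sha256).digest()
tag_check = hmac.new(puf_response(tag_secret, challenge), nonce, hashlib.sha256).digest()
print("reader authenticated:", hmac.compare_digest(reader_proof, tag_check))
```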

  8. Verification of the ECMWF ensemble forecasts of wind speed against analyses and observations

    DEFF Research Database (Denmark)

    Pinson, Pierre; Hagedorn, Renate

    2012-01-01

    A framework for the verification of ensemble forecasts of near-surface wind speed is described. It is based on existing scores and diagnostic tools, though considering observations from synoptic stations as reference instead of the analysis. This approach is motivated by the idea of having a user-oriented view of verification, for instance with the wind power applications in mind. The verification framework is specifically applied to the case of ECMWF ensemble forecasts and over Europe. Dynamic climatologies are derived at the various stations, serving as a benchmark. The impact of observational uncertainty on scores and diagnostic tools is also considered. The interest of this framework is demonstrated from its application to the routine evaluation of ensemble forecasts and to the assessment of the quality improvements brought in by the recent change in horizontal resolution of the ECMWF ensemble...
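    As an illustration of station-based ensemble verification, a minimal sketch of the empirical continuous ranked probability score (CRPS) for one ensemble forecast against one observation is given below (the ensemble and observation are synthetic, not ECMWF or synoptic data):

```python
# Sketch: CRPS of an ensemble wind-speed forecast evaluated against a station
# observation (synthetic numbers; not ECMWF data).
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for one ensemble forecast and one scalar observation."""
    members = np.asarray(members, dtype=float)
    term_obs = np.mean(np.abs(members - obs))
    term_spread = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term_obs - term_spread

rng = np.random.default_rng(1)
ensemble = rng.normal(8.0, 1.5, size=50)      # 50-member forecast, m/s
observation = 9.2                             # synoptic station report, m/s
print(f"CRPS = {crps_ensemble(ensemble, observation):.3f} m/s")
```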

  9. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  10. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of FMCT verification provision. This paper will explore the general concerns on FMCT verification; and demonstrate what verification measures might be applied to those reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non- explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently being applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of the FMCT verifications will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  11. Experimental verification of internal dosimetry calculations. Annual progress report

    International Nuclear Information System (INIS)

    1980-05-01

    During the past year a dosimetry research program has been established in the School of Nuclear Engineering at the Georgia Institute of Technology. The major objective of this program has been to provide research results upon which a useful internal dosimetry system could be based. The important application of this dosimetry system will be the experimental verification of internal dosimetry calculations such as those published by the MIRD Committee

  12. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  13. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  14. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  15. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  16. Average Gait Differential Image Based Human Recognition

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    The difference between adjacent frames of human walking contains useful information for human gait identification. Based on this idea, a silhouette-difference-based human gait recognition method, named the average gait differential image (AGDI), is proposed in this paper. The AGDI is generated by accumulating the silhouette differences between adjacent frames. The advantage of this method lies in the fact that, as a feature image, it preserves both the kinetic and the static information of walking. Compared to the gait energy image (GEI), the AGDI is better suited to representing the variation of silhouettes during walking. Two-dimensional principal component analysis (2DPCA) is used to extract features from the AGDI. Experiments on the CASIA dataset show that the AGDI has better identification and verification performance than the GEI. Compared to PCA, 2DPCA is a more efficient feature extraction method with lower memory consumption in gait-based recognition.
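    A minimal sketch of the AGDI idea, accumulating absolute differences of adjacent binary silhouettes, is given below (the silhouette sequence is synthetic and the 2DPCA feature extraction step is omitted):

```python
# Sketch of the average gait differential image idea: accumulate absolute
# differences of adjacent binary silhouettes (synthetic frames for illustration).
import numpy as np

def average_gait_differential_image(silhouettes):
    """silhouettes: (n_frames, h, w) array of 0/1 silhouettes."""
    frames = np.asarray(silhouettes, dtype=float)
    diffs = np.abs(frames[1:] - frames[:-1])      # adjacent-frame differences
    return diffs.mean(axis=0)                     # accumulate and average

rng = np.random.default_rng(0)
fake_sequence = (rng.random((30, 64, 44)) > 0.5).astype(np.uint8)
agdi = average_gait_differential_image(fake_sequence)
print(agdi.shape, agdi.min(), agdi.max())
```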

  17. Chaotic maps and biometrics-based anonymous three-party authenticated key exchange protocol without using passwords

    International Nuclear Information System (INIS)

    Xie Qi; Hu Bin; Chen Ke-Fei; Liu Wen-Hao; Tan Xiao

    2015-01-01

    In a three-party password authenticated key exchange (AKE) protocol, two users use their passwords to establish a secure session key over an insecure communication channel with the help of a trusted server; such a protocol may suffer from password guessing attacks, and the server has to maintain the password table. To eliminate these shortcomings of password-based AKE protocols, very recently, based on chaotic maps, Lee et al. [2015 Nonlinear Dyn. 79 2485] proposed the first three-party authenticated key exchange scheme without using passwords, and claimed its security by providing a well-organized BAN logic test. Unfortunately, their protocol cannot resist the impersonation attack, which is demonstrated in the present paper. To overcome this security weakness, using chaotic maps, we propose a biometrics-based anonymous three-party AKE protocol with the same advantages. Further, we use the pi-calculus-based formal verification tool ProVerif to show that our AKE protocol achieves authentication, security and anonymity, and an acceptable efficiency. (paper)

  18. Independent tube verification and dynamic tracking in et inspection of nuclear steam generator

    International Nuclear Information System (INIS)

    Xiongzi, Li; Zhongxue, Gan; Lance, Fitzgibbons

    2001-01-01

    The full text follows. In the examination of pressure boundary tubes in steam generators of commercial pressurized water nuclear power plants (PWRs), it is critical to know exactly which particular tube is being accessed. There are no definitive landmarks or markings on the individual tubes. Today this is done manually; it is tedious, it interrupts the normal inspection work, and it is difficult due to the presence of water on the tube surface, plug ends instead of tube openings in the field of view, and varying lighting quality. In order to eliminate human error and increase the efficiency of operation, there is a need to identify the tube position during the inspection process, independent of robot encoder position and motion. A process based on a Cognex MVS-8200 system and its application function package has been developed to independently identify tube locations. ABB Combustion Engineering Nuclear Power's Outage Services group, USPPL, in collaboration with ABB Power Plant Laboratories' Advanced Computers and Controls department, has developed a new vision-based Independent Tube Verification system (GENESIS-ITVS-TM). The system employs a model-based tube-shape detection algorithm and a dynamic tracking methodology to detect the true tool position and its offsets from the identified tube location. GENESIS-ITVS-TM is an automatic Independent Tube Verification System (ITVS). Independent tube verification is a tube validation technique using computer vision, not using any robot position parameters. This process independently counts the tubes in the horizontal and vertical axes of the plane of the steam generator tube sheet as the work tool is moved. Thus it knows the true position in the steam generator, given a known starting point. This is analogous to the operator's method of counting tubes for verification, but it is automated. GENESIS-ITVS-TM works independent of the robot position, velocity, or acceleration. The tube position information is solely obtained from

  19. Verification of MENDL2 and IEAF-2001 Data bases at intermediate energies

    Energy Technology Data Exchange (ETDEWEB)

    Titarenko, Y. E. (Yury E.); Batyaev, V. F. (Vyacheslav F.); Karpikhin, E. I. (Evgeny I.); Zhivun, V. M. (Valery M.); Koldobsky, A. B. (Aleksander B.); Mulambetov, R. D. (Ruslan D.); Mulambetova, S. V.; Trebukhovsky, Y. V. (Yury V.); Zaitsev, S. L.; Lipatov, K. A.; Mashnik, S. G. (Stepan G.); Prael, R. E. (Richard E.)

    2004-01-01

    The work presents results on computer simulations of two experiments whose aim was to measure the threshold activation reaction rates in {sup 12}C, {sup 19}F, {sup 27}Al, {sup 59}Co, {sup 63}Cu, {sup 65}Cu, {sup 64}Zn, {sup 93}Nb, {sup 115}In, {sup 169}Tm, {sup 181}Ta, {sup 197}Au, and {sup 209}Bi thin samples placed inside and outside a 0.8-GeV proton-irradiated 4-cm thick W target and a 92-cm thick W-Na composite target, both of 15-cm diameter. In total, more than 1000 values of activation reaction rates were determined in both experiments. The measured data were compared with results by the LAHET code using several nuclear data bases for the respective excitation functions, namely, ENDF/B6 for cross sections of neutrons at energies below 20 MeV and MENDL2 together with MENDL2P for cross sections of protons and neutrons of 20 to 100 MeV energies. The recently developed IEAF-2001 data base that provides neutron cross sections up to 150 MeV was used as well. Simulation-to-experiment results obtained using MENDL2 and IEAF-2001 are presented. The agreement between simulation and experiment was found satisfactory for both data bases. Nevertheless, further studies should be conducted to improve simulations of the production of secondary protons and high-energy neutrons, as well as the high-energy neutron elastic scattering. Our results allow drawing some conclusions concerning the reliability of the transport codes and data bases used to simulate Accelerator Driven Systems (ADS), particularly with Na-cooled W targets. The high-energy threshold excitation functions to be used in activation-based unfolding of neutron spectra inside the ADS can also be inferred from our results.

  20. Fluence complexity for IMRT field and simplification of IMRT verification

    International Nuclear Information System (INIS)

    Hanushova, Tereza; Vondarchek, Vladimir

    2013-01-01

    Intensity Modulated Radiation Therapy (IMRT) requires dosimetric verification of each patient's plan, which is time-consuming. This work deals with the idea of minimizing the number of fields for control, or even replacing plan verification by machine quality assurance (QA). We propose methods for estimating the fluence complexity of an IMRT field based on dose gradients and investigate the relation between the results of gamma analysis and this quantity. If there is a relation, it might be possible to verify only the most complex field of a plan. We determine the average fluence complexity in clinical fields and design a test fluence corresponding to this amount of complexity, which might be used in daily QA and potentially replace patient-related verification. Its applicability is assessed in clinical practice. The relation between fluence complexity and the results of gamma analysis has been confirmed for plans but not for single fields. There is agreement between the suggested test fluence and clinical fields in the average gamma parameter. A critical value of average gamma has been specified for the test fluence as a criterion for distinguishing between poorly and well deliverable plans. It will not be possible to verify only the most complex field of a plan, but verification of individual plans could be replaced by a morning check of the suggested test fluence, together with a well-established set of QA tests. (Author)
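    One plausible way to quantify fluence complexity from dose gradients is sketched below; the normalised mean gradient used here is an assumption and may differ from the authors' exact metric:

```python
# Sketch of a gradient-based fluence-complexity score for an IMRT fluence map
# (random map for illustration; the paper's exact metric may differ).
import numpy as np

def fluence_complexity(fluence):
    gy, gx = np.gradient(fluence.astype(float))
    gradient_magnitude = np.hypot(gx, gy)
    return gradient_magnitude.mean() / max(fluence.mean(), 1e-12)  # normalised

rng = np.random.default_rng(0)
smooth_field = np.outer(np.hanning(40), np.hanning(40))
modulated_field = smooth_field * (1.0 + 0.5 * rng.random((40, 40)))
print("smooth   :", round(fluence_complexity(smooth_field), 3))
print("modulated:", round(fluence_complexity(modulated_field), 3))
```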

  1. Methods for Reachability-based Hybrid Controller Design

    Science.gov (United States)

    2012-05-10

    approaches for airport runways (Teo and Tomlin, 2003). The results of the reachability calculations were validated in extensive simulations as well as ... UAV flight experiments (Jang and Tomlin, 2005; Teo, 2005). While the focus of these previous applications lies largely in safety verification, the work ... B([15, 0], a0) × [−π, π]) \ V, ∀ qi ∈ Q, where a0 = 30 m is the protected radius (chosen based upon published data of the wingspan of a Boeing KC-135

  2. Preliminary Validation and Verification Plan for CAREM Reactor Protection System; Modelo de Plan Preliminar de Validacion y Verificacion para el Sistema de Proteccion del Reactor CAREM

    Energy Technology Data Exchange (ETDEWEB)

    Fittipaldi, Ana; Felix, Maciel [Comision Nacional de Energia Atomica, Centro Atomico Bariloche (Argentina)

    2000-07-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (computer-based system). These software modules can be either custom-designed systems or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off-the-shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan.

  3. Development of FPGA-based safety-related I and C systems

    Energy Technology Data Exchange (ETDEWEB)

    Goto, Y.; Oda, N.; Miyazaki, T.; Hayashi, T.; Sato, T.; Igawa, S. [08, Shinsugita-cho, Isogo-ku, Yokohama 235-8523 (Japan); 1, Toshiba-cho, Fuchu, Tokyo 183-8511 (Japan)

    2006-07-01

    Toshiba has developed a Non-rewritable (NRW) Field Programmable Gate Array (FPGA)-based safety-related Instrumentation and Control (I and C) system [1]. Considering application to safety-related systems, a nonvolatile, non-rewritable FPGA, which cannot be changed once manufactured, has been adopted in the Toshiba FPGA-based system. An FPGA is a device consisting only of a defined digital circuit (hardware) that performs defined processing. FPGA-based systems solve issues existing both in conventional systems operated by analog circuits (analog-based systems) and in systems operated by a central processing unit (CPU-based systems). The advantages of applying FPGAs are long-term product supply, improved testability (verification), and reduced drift, which may occur in analog-based systems. The system developed this time is a Power Range Monitor (PRM). Toshiba is planning to expand the application of FPGA-based technology by applying this development method to other safety-related systems from now on. (authors)

  4. Abstract Interpretation-based verification/certification in the ciaoPP system

    OpenAIRE

    Puebla Sánchez, Alvaro Germán; Albert Albiol, Elvira; Hermenegildo, Manuel V.

    2005-01-01

    CiaoPP is the abstract interpretation-based preprocessor of the Ciao multi-paradigm (Constraint) Logic Programming system. It uses modular, incremental abstract interpretation as a fundamental tool to obtain information about programs. In CiaoPP, the semantic approximations thus produced have been applied to perform high- and low-level optimizations during program compilation, including transformations such as multiple abstract specialization, parallelization, partial evaluation, resource...

  5. Automatic quality verification of the TV sets

    Science.gov (United States)

    Marijan, Dusica; Zlokolica, Vladimir; Teslic, Nikola; Pekovic, Vukota; Temerinac, Miodrag

    2010-01-01

    In this paper we propose a methodology for TV set verification, intended for detecting picture quality degradation and functional failures within a TV set. In the proposed approach we compare the TV picture captured from a TV set under investigation with the reference image for the corresponding TV set in order to assess the captured picture quality and, therefore, the acceptability of TV set quality. The methodology framework comprises a logic block for designing the verification process flow, a block for TV set quality estimation (based on image quality assessment) and a block for generating the defect tracking database. The quality assessment algorithm is a full-reference intra-frame approach which aims at detecting various TV-set-specific digital picture degradations arising from TV system hardware and software failures, and from erroneous operational modes and settings in TV sets. The proposed algorithm is a block-based scheme which incorporates the mean square error and a local variance between the reference and the tested image. The artifact detection algorithm is shown to be highly robust against brightness and contrast changes in TV sets. The algorithm is evaluated by performance comparison with other state-of-the-art image quality assessment metrics in terms of detecting TV picture degradations, such as illumination and contrast change, compression artifacts, picture misalignment, aliasing, blurring and other types of degradations that are due to defects within the TV set video chain.
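    A minimal sketch of a block-based comparison of a reference and a captured picture using per-block MSE and local variance is given below; the block size, noise model and any acceptance thresholds are assumptions, not the published algorithm:

```python
# Sketch of a block-based reference/test comparison using per-block MSE and
# local variance, in the spirit of the described metric (parameters are assumptions).
import numpy as np

def block_metrics(reference, test, block=16):
    ref = reference.astype(float); tst = test.astype(float)
    h, w = ref.shape
    mse_map, var_map = [], []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            r = ref[y:y + block, x:x + block]
            t = tst[y:y + block, x:x + block]
            mse_map.append(np.mean((r - t) ** 2))
            var_map.append(np.var(r - t))          # local variance of the difference
    return np.array(mse_map), np.array(var_map)

rng = np.random.default_rng(2)
reference = rng.integers(0, 256, (128, 128)).astype(np.uint8)
test = np.clip(reference + rng.normal(0, 4, reference.shape), 0, 255)
mse, var = block_metrics(reference, test)
print("worst block MSE:", mse.max().round(2), "mean local variance:", var.mean().round(2))
```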

  6. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With long-lifetime operation, and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field with respect to its application to the increasing software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA

  7. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    Science.gov (United States)

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subjects factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of the effects of uniqueness deprivation, levels of self-categorization, and individual differences in the need for uniqueness.

  8. Research and Implementation of Automatic Fuzzy Garage Parking System Based on FPGA

    OpenAIRE

    Wang Kaiyu; Yu Zongmin; Guan Sanghai; Yang Xing; Sheng Menglin; Tang Zhenan

    2016-01-01

    Because reverse parking is a common scenario in real life, this paper presents a fuzzy controller that accommodates front and back adjustment of the vehicle's body attitude, uses a chaotic-genetic algorithm to optimize the membership functions of this controller, and obtains a vertical-parking fuzzy controller with good simulation results. The paper presents the hardware-software embedded design of the system based on a Field-Programmable Gate Array (FPGA), and sets up a 1:10 verification platfo...

  9. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
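    The likelihood-ratio decision itself can be sketched as follows, assuming Gaussian genuine and impostor score models with invented parameters (the paper fits a considerably richer joint model over 12 scores):

```python
# Sketch of a likelihood-ratio decision over fused biometric scores, assuming
# Gaussian genuine/impostor score models (parameters are made up for illustration).
import numpy as np
from scipy.stats import norm

GENUINE = {"mean": 0.8, "std": 0.10}     # assumed score model, genuine comparisons
IMPOSTER = {"mean": 0.3, "std": 0.12}    # assumed score model, impostor comparisons

def log_likelihood_ratio(scores):
    scores = np.asarray(scores, dtype=float)
    ll_gen = norm.logpdf(scores, GENUINE["mean"], GENUINE["std"]).sum()
    ll_imp = norm.logpdf(scores, IMPOSTER["mean"], IMPOSTER["std"]).sum()
    return ll_gen - ll_imp

acquired_scores = [0.74, 0.81, 0.69]     # e.g. three re-acquired fingerprint scores
threshold = 0.0                          # tuned to meet the FAR constraint in practice
print("accept as genuine:", log_likelihood_ratio(acquired_scores) > threshold)
```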

  10. Experimental verification of preset time count rate meters based on adaptive digital signal processing algorithms

    Directory of Open Access Journals (Sweden)

    Žigić Aleksandar D.

    2005-01-01

    Experimental verification of two optimized adaptive digital signal processing algorithms implemented in two preset time count rate meters was performed according to the appropriate standards. A random pulse generator, realized using a personal computer, was used as an artificial radiation source for preliminary system tests and performance evaluations of the proposed algorithms. Then measurement results for background radiation levels were obtained. Finally, measurements with a natural radiation source, the radioisotope 90Sr-90Y, were carried out. Measurement results, obtained without and with radioisotopes for the specified errors of 10% and 5%, were shown to agree well with theoretical predictions.

  11. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  12. Coast-down model based on rated parameters of reactor coolant pump

    International Nuclear Information System (INIS)

    Jiang Maohua; Zou Zhichao; Wang Pengfei; Ruan Xiaodong

    2014-01-01

    For a sudden loss of power in a reactor coolant pump (RCP), a calculation model of the rotor speed and flow characteristics based on rated parameters was studied. The derived model was verified by comparison with the power-off experimental data of the 100D RCP. The results indicate that it can be used in preliminary design calculation and verification analysis. A design criterion for the RCP was then described based on the calculation model, and the moment of inertia of the AP1000 RCP was verified against this criterion. (authors)
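    A commonly used idealized coast-down relation, assuming the hydraulic torque is proportional to the square of the speed, gives n(t) = n0/(1 + t/t_h) with t_h = J*w0/M0; whether this matches the paper's derivation is an assumption, and the rated numbers below are placeholders rather than AP1000 data:

```python
# Idealized coast-down sketch: hydraulic torque taken proportional to speed
# squared, which gives n(t) = n0 / (1 + t/t_h) with t_h = J*w0/M0.
# The rated numbers below are placeholders, not AP1000 data.
import numpy as np

J = 5000.0                      # rotating moment of inertia [kg*m^2] (placeholder)
n0 = 1500.0                     # rated speed [rpm] (placeholder)
M0 = 25_000.0                   # rated hydraulic torque [N*m] (placeholder)

w0 = 2.0 * np.pi * n0 / 60.0    # rated angular speed [rad/s]
t_half = J * w0 / M0            # time for speed (and ideally flow) to halve

t = np.linspace(0.0, 30.0, 7)
speed_fraction = 1.0 / (1.0 + t / t_half)
print(f"t_half = {t_half:.2f} s")
print("relative speed/flow:", np.round(speed_fraction, 3))
```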

  13. Method Verification Requirements for an Advanced Imaging System for Microbial Plate Count Enumeration.

    Science.gov (United States)

    Jones, David; Cundell, Tony

    2018-01-01

    The Growth Direct™ System, which automates the incubation and reading of membrane filtration microbial counts on soybean-casein digest, Sabouraud dextrose, and R2A agar, differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. LAY ABSTRACT: The Growth Direct™ System, which automates the incubation and reading of microbial counts on membranes on solid agar, differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation time. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. © PDA, Inc. 2018.

  14. Life Cycle V and V Process for Hardware Description Language Programs of Programmable Logic Device-based Instrumentation and Control Systems

    International Nuclear Information System (INIS)

    Cha, K. H.; Lee, D. Y.

    2010-01-01

    Programmable Logic Devices (PLDs), especially Complex PLDs (CPLDs) and Field Programmable Gate Arrays (FPGAs), have been attracting growing interest for nuclear Instrumentation and Control (I and C) applications. PLDs have been applied to replace obsolete analog devices or old-fashioned microprocessors, and to develop digital controllers, subsystems or overall systems on the hardware side. This is because PLD-based I and C designs provide higher flexibility than analog-based ones, and PLD-based I and C systems show better real-time performance than processor-based I and C systems. With the development of PLD-based I and C systems, their nuclear qualification has become an issue in the nuclear industry. Verification and Validation (V and V) is one of the necessary qualification activities when a Hardware Description Language (HDL) is used to implement the functions of a PLD-based I and C system. The life cycle V and V process described in this paper has been defined to satisfy the nuclear V and V requirements, and it has been applied to verify Correctness, Completeness, and Consistency (3C) among design outputs in a safety-grade programmable logic controller and a safety-critical data communication system. In particular, software engineering techniques such as Fagan inspection, formal verification, simulated verification and automated testing have been defined for the life cycle V and V tasks of behavioral, structural, and physical design in VHDL

  15. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    It becomes more complicated when the shape and phase of the ground below the seawater are considered. Therefore, different approaches are required to precisely analyze the behavior of a tsunami. This paper introduces on-going code development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH) and its verification with some practice simulations. This paper summarizes the on-going development and verification activities on the Lagrangian mesh-free SPH code in SNU. The newly developed code covers the equations of motion and the heat conduction equation so far, and verification of each model is completed. In addition, parallel computation using GPUs is now possible, and a GUI is also prepared. If users change the input geometry or input values, they can simulate various conditions and geometries. The SPH method has large advantages and potential for modeling free surfaces, highly deformable geometries and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be much more extended, including molten fuel behavior in severe accidents.
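    A minimal SPH density-summation sketch with a one-dimensional cubic spline kernel is given below as an illustration of the mesh-free idea; it is not part of the SNU code and omits the momentum and heat conduction equations:

```python
# Minimal SPH density-summation sketch with a 1-D cubic spline kernel
# (illustrative only; the SNU code's formulations are far more complete).
import numpy as np

def cubic_spline_1d(r, h):
    """Standard 1-D cubic spline kernel with smoothing length h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)
    w = np.zeros_like(q)
    m1 = q <= 1.0
    m2 = (q > 1.0) & (q <= 2.0)
    w[m1] = 1.0 - 1.5 * q[m1] ** 2 + 0.75 * q[m1] ** 3
    w[m2] = 0.25 * (2.0 - q[m2]) ** 3
    return sigma * w

# Particles on a line, equal mass, smoothing length ~ 1.3 * spacing.
dx = 0.01
x = np.arange(0.0, 1.0, dx)
mass = 1000.0 * dx                # target density 1000 kg/m^3
h = 1.3 * dx

rho = np.array([np.sum(mass * cubic_spline_1d(x - xi, h)) for xi in x])
print("interior density ~", rho[len(rho) // 2].round(1), "kg/m^3")
```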

  16. BROSMAP: A Novel Broadcast Based Secure Mobile Agent Protocol for Distributed Service Applications

    Directory of Open Access Journals (Sweden)

    Dina Shehada

    2017-01-01

    Mobile agents are smart programs that migrate from one platform to another to perform the user's task. Mobile agents offer flexibility and performance enhancements to systems and real-time service applications. However, security in mobile agent systems is a great concern. In this paper, we propose a novel Broadcast based Secure Mobile Agent Protocol (BROSMAP) for distributed service applications that provides mutual authentication, authorization, accountability, nonrepudiation, integrity, and confidentiality. The proposed system also provides protection from man-in-the-middle, replay, repudiation, and modification attacks. We proved the efficiency of the proposed protocol through formal verification with the Scyther verification tool.

  17. Lower and Upper Bounds in Zone Based Abstractions of Timed Automata

    DEFF Research Database (Denmark)

    Behrmann, Gerd; Bouyer, Patricia; Larsen, Kim Guldstrand

    2004-01-01

    Timed automata have an infinite semantics. For verification purposes, one usually uses zone-based abstractions with respect to the maximal constants to which the clocks of the timed automaton are compared. We show that by distinguishing maximal lower and upper bounds, significantly coarser abstractions can be obtained, which dramatically increases the scalability of the real-time model checker Uppaal.
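
    As a rough sketch of the LU idea, the snippet below applies a simplified LU-style extrapolation to the entries of a difference bound matrix: a bound exceeding the relevant lower-bound constant is relaxed to infinity, and a bound tighter than the negated upper-bound constant is loosened. The clock constants and matrix are hypothetical, strictness flags are ignored, and the full operator in the paper handles additional cases.

```python
import math

INF = math.inf

def extra_lu(dbm, L, U):
    """Simplified LU extrapolation of a DBM (index 0 is the zero reference clock).
    dbm[i][j] is the numeric bound on x_i - x_j (strictness ignored for brevity)."""
    n = len(dbm)
    out = [row[:] for row in dbm]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if i != 0 and dbm[i][j] > L[i]:
                out[i][j] = INF            # bound exceeds lower-bound constant of x_i
            elif j != 0 and -dbm[i][j] > U[j]:
                out[i][j] = -U[j]          # bound tighter than -U(x_j) is loosened
    return out

# Hypothetical 2-clock example: guards only test x1 >= 2, x1 <= 5, x2 >= 3, x2 <= 4.
L = [0, 2, 3]   # lower-bound constants per clock (index 0 is the reference clock)
U = [0, 5, 4]
dbm = [[0, -6, -1],
       [9,  0,  3],
       [7,  2,  0]]
for row in extra_lu(dbm, L, U):
    print(row)
```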

  18. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes, used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems, can be readily deployed and optimized for changing application scenarios. Developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling way to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species, owing to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS instrument, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements.

  19. MR image-guided portal verification for brain treatment field

    International Nuclear Information System (INIS)

    Yin, F.-F.; Gao, Q.H.; Xie, H.; Nelson, D.F.; Yu, Y.; Kwok, W.E.; Totterman, S.; Schell, M.C.; Rubin, P.

    1996-01-01

    Purpose/Objective: Although MR images have been extensively used for the treatment planning of radiation therapy of cancers, especially brain cancers, they are not effectively used for portal verification due to the lack of bone/air information in MR images and geometric distortions. Typically, MR images are utilized through correlation with CT images, and this procedure is usually very labor- and time-consuming. For many brain cancer patients to be treated using conventional external beam radiation, MR images with proper distortion correction provide sufficient information for treatment planning and dose calculation, and a projection image may be generated for each specific treatment port to be used as a reference image for treatment verification. The question is how to transfer anatomical features in MR images to the projection image as landmarks that can be correlated automatically with those in the portal image. The goal of this study is to generate digitally reconstructed projection images from MR brain images with some important anatomical features (brain contour, skull, and gross tumor), as well as their relative locations, to be used as references for the development of a computerized portal verification scheme. Materials/Methods: Compared to conventional digitally reconstructed radiographs from CT images, the generation of digitally reconstructed projection images from MR images involves extensive pixel manipulation of the MR images to correlate information from two types of images (MR and portal x-ray images) that are produced based on totally different imaging principles. Initially, a wavelet-based multi-resolution adaptive thresholding method is used to segment the skull slice-by-slice in axial MR brain images, and identified skull pixels are re-assigned to relatively higher intensities so that the projection images will have grey-level information comparable to that in typical brain portal images. Both T1- and T2-weighted images are utilized to eliminate fat
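
    The wavelet-based multi-resolution segmentation in the study is more involved, but the sketch below mimics the described pipeline in a simplified form: threshold each axial slice to obtain a skull-like mask, re-assign those pixels to higher intensities, and sum along one in-plane direction to form a ray-sum projection; the synthetic volume and threshold are made up.

```python
import numpy as np

def simple_skull_mask(slice_2d, threshold):
    """Crude stand-in for the paper's wavelet-based adaptive thresholding:
    mark pixels above a fixed threshold as 'skull'."""
    return slice_2d > threshold

def reassign_and_project(volume, threshold, boost=4.0):
    """Boost skull-like pixels slice by slice, then form a ray-sum projection
    along one in-plane direction, mimicking a digitally reconstructed projection."""
    enhanced = volume.astype(float).copy()
    for k in range(volume.shape[0]):                       # process axial slices
        mask = simple_skull_mask(volume[k], threshold)
        enhanced[k][mask] *= boost                         # re-assign to higher intensity
    return enhanced.sum(axis=1)                            # parallel ray-sum projection

# Hypothetical synthetic MR-like volume: low-intensity brain with a brighter shell.
rng = np.random.default_rng(0)
vol = rng.normal(100.0, 10.0, size=(32, 64, 64))
vol[:, 5:8, :] += 120.0                                    # crude "skull" band
projection = reassign_and_project(vol, threshold=180.0)
print(projection.shape, projection.max())
```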

  20. A web-based data-querying tool based on ontology-driven methodology and flowchart-based model.

    Science.gov (United States)

    Ping, Xiao-Ou; Chung, Yufang; Tseng, Yi-Ju; Liang, Ja-Der; Yang, Pei-Ming; Huang, Guan-Tarn; Lai, Feipei

    2013-10-08

    ," and "treatments for liver cancer") was 100% for all four experiments (10 patients, 100 patients, 1000 patients, and 10,000 patients). Among the three measured query phases, (1) structured query language operations, (2) criteria verification, and (3) other, the first two had the longest execution time. The ontology-driven FBDQM-based approach enriched the capabilities of the data-querying system. The adoption of the GLIF3.5 increased the potential for interoperability, shareability, and reusability of the query tasks.