WorldWideScience

Sample records for system verification

  1. Verification Account Management System (VAMS)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  2. Distorted Fingerprint Verification System

    Divya KARTHIKAESHWARAN

    2011-01-01

Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into a fingerprint impression during image acquisition; this deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and a cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show a verification rate of 96%. Such a system plays an important role in forensic and civilian applications.
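The three-stage pipeline in this record (alignment, matching, then a decision layer) can be sketched in Python. This is a toy illustration, not the paper's algorithm: the centroid-based alignment, the distance tolerance, and the 0.8 acceptance threshold are all invented for the example, and real systems align on minutiae orientation as well as position.

```python
import math

def align(minutiae, template):
    """Translate minutiae so their centroid matches the template's (toy alignment)."""
    def centroid(pts):
        return (sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))
    cq, ct = centroid(minutiae), centroid(template)
    dx, dy = ct[0] - cq[0], ct[1] - cq[1]
    return [(x + dx, y + dy) for x, y in minutiae]

def match_score(minutiae, template, tol=2.0):
    """Fraction of template minutiae with an aligned query minutia within tol."""
    hits = 0
    for tx, ty in template:
        if any(math.hypot(tx - qx, ty - qy) <= tol for qx, qy in minutiae):
            hits += 1
    return hits / len(template)

def verify(query, template, threshold=0.8):
    """Accept the claimed identity if the aligned match score clears the threshold."""
    return match_score(align(query, template), template) >= threshold
```

A translated copy of the template verifies; a distorted set of minutiae does not.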

  3. Experimental inventory verification system

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. On one hand, we wish to verify the presence of correct amounts of nuclear materials that are in storage or in process; on the other hand, we wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public-policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image-processing technology to detect change in vault areas; it is currently in the test and evaluation phase at the Idaho National Engineering Laboratory, where it serves as an automated guard. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur.

  4. Standard Verification System (SVS)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  5. Formal Verification of Continuous Systems

    Sloth, Christoffer

    2012-01-01

…and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications, where a design flaw can result in large economic losses. Furthermore, a malfunction in the control system of a surgical robot may cause the death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due…

  6. Biometric Technologies and Verification Systems

    Vacca, John R

    2007-01-01

Biometric Technologies and Verification Systems is organized into nine parts comprising 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state of the art in biometric verification/authentication, identification, and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; and how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavioral…

  7. Enumeration Verification System (EVS)

Social Security Administration — EVS is a batch application that processes verification requests for federal, state, local, and foreign government agencies, private companies, and internal SSA customers and systems. Each...

  8. Vehicle usage verification system

    Scanlon, W.G.; McQuiston, Jonathan; Cotton, Simon L.

    2012-01-01

A computer-implemented system for verifying vehicle usage, comprising a server capable of communication with a plurality of clients across a communications network. Each client is provided in a respective vehicle and with a respective global positioning system (GPS) by which the client can…

  9. Central Verification System

    US Agency for International Development — CVS is a system managed by OPM that is designed to be the primary tool for verifying whether or not there is an existing investigation on a person seeking security...

  10. Validation of Embedded System Verification Models

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

The result of a model-based requirements verification shows whether or not the model of a system satisfies the formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model…

  11. Cognitive Bias in Systems Verification

    Larson, Steve

    2012-01-01

Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  12. Verification and Performance Analysis for Embedded Systems

    Larsen, Kim Guldstrand

    2009-01-01

This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  13. Safety Verification for Probabilistic Hybrid Systems

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of… on a number of case studies, tackled using a prototypical implementation.

  14. On Verification Modelling of Embedded Systems

    Brinksma, Hendrik; Mader, Angelika H.

Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the…

  15. HDM/PASCAL Verification System User's Manual

    Hare, D.

    1983-01-01

The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Support functions are also provided, such as database maintenance, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP; however, no knowledge of these operating systems or of INTERLISP is assumed. The system requires three executable files: HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be present on the system for the editing functions to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.
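The core service named here, generating verification conditions, can be illustrated with a minimal weakest-precondition calculator for straight-line assignment programs. This is a generic textbook sketch, not HDM/PASCAL's actual condition generator; the tuple-based expression encoding is invented for the example.

```python
# Expressions are tuples: ('var', name), ('const', n),
# or binary nodes ('add' | 'mul' | 'le', e1, e2).

def subst(expr, var, repl):
    """Substitute repl for every occurrence of var in expr."""
    kind = expr[0]
    if kind == 'var':
        return repl if expr[1] == var else expr
    if kind == 'const':
        return expr
    return (kind,) + tuple(subst(e, var, repl) for e in expr[1:])

def wp(stmts, post):
    """Weakest precondition of a list of (var, expr) assignments w.r.t. post:
    work backwards, substituting each assigned expression for its variable."""
    cond = post
    for var, expr in reversed(stmts):
        cond = subst(cond, var, expr)
    return cond

def evaluate(expr, env):
    """Evaluate an expression tree in a variable environment."""
    kind = expr[0]
    if kind == 'var':
        return env[expr[1]]
    if kind == 'const':
        return expr[1]
    a, b = (evaluate(e, env) for e in expr[1:])
    return {'add': a + b, 'mul': a * b, 'le': a <= b}[kind]
```

For the program `x := x + 1; y := x * 2` with postcondition `y <= 10`, the generated condition is `(x + 1) * 2 <= 10`, which holds for x = 4 but not x = 5.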

  16. Formal verification of algorithms for critical systems

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
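The clock-synchronization algorithm verified in this work is of the interactive-convergence family, in which each clock averages its peers' readings after discarding those that differ too much from its own. A minimal sketch of that averaging step, with the parameter choices illustrative rather than taken from the paper:

```python
def egocentric_mean(own, readings, delta):
    """Interactive-convergence adjustment: average the peer clock readings,
    substituting this node's own clock for any reading more than delta away
    (such readings are treated as coming from a faulty clock)."""
    vals = [r if abs(r - own) <= delta else own for r in readings]
    return sum(vals) / len(vals)
```

A wildly wrong reading from a Byzantine-faulty clock is thus bounded: it is replaced by the local value and cannot drag the average arbitrarily far.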

  17. Verification and Validation in Systems Engineering

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  18. Verification and Examination Management of Complex Systems

    Stian Ruud

    2014-10-01

As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks of the subsystems may then be evaluated, so that examination effort can be allocated to each subsystem. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logical system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
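The allocation idea, spending examination effort where the marginal verification risk is largest, can be sketched with a toy greedy loop. The risk model here (each unit of effort halves a subsystem's residual miss probability) is an invented stand-in for the paper's verification risk function, and the stop criterion is simply a total-risk threshold:

```python
def allocate_effort(subsystems, stop_risk, max_steps=100):
    """Greedy allocation: each unit of examination effort goes to the subsystem
    with the largest current risk (equivalently, the largest marginal reduction
    under the halving model). subsystems maps name -> (loss, fault probability)."""
    effort = {name: 0 for name in subsystems}

    def risk(name):
        loss, p = subsystems[name]
        return loss * p * 0.5 ** effort[name]

    for _ in range(max_steps):
        total = sum(risk(n) for n in subsystems)
        if total <= stop_risk:          # examination stop criterion reached
            break
        best = max(subsystems, key=risk)
        effort[best] += 1
    return effort, sum(risk(n) for n in subsystems)
```

With one high-loss and one low-loss subsystem, all early effort lands on the high-loss one, which is the intended behaviour of marginal-risk allocation.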

  19. Packaged low-level waste verification system

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that the LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records and to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  20. Programmable electronic system design & verification utilizing DFM

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to…

  1. Comparing formal verification approaches of interlocking systems

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe…

  2. Formal Verification of Circuits and Systems

The problem of validating and verifying the correctness of present-day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today, typically 50% to 70% of the design cycle time is spent verifying correctness. While simulation remains a predominant form …

  3. Model Checking - Automated Verification of Computational Systems

    Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.
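Model checking of safety properties reduces, in the explicit-state view, to reachability of a "bad" state, and a counterexample is simply the path that reaches one. A minimal breadth-first sketch of this idea (generic, not tied to any particular tool):

```python
from collections import deque

def check_safety(initial, successors, is_bad):
    """Explicit-state reachability check: BFS over the state graph.
    Returns a counterexample path to a bad state, or None if the
    safety property holds on all reachable states."""
    frontier = deque([(initial, (initial,))])
    seen = {initial}
    while frontier:
        state, path = frontier.popleft()
        if is_bad(state):
            return path                     # shortest counterexample
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + (nxt,)))
    return None                             # property verified
```

On a counter that cycles modulo 5, the state 3 is reachable (with its witness path), while the state 7 is not, so the property "never reach 7" is verified.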

  4. Formal Verification of Quasi-Synchronous Systems

    2015-07-01

  5. TWRS system drawings and field verification

    Shepard, D.G.

    1995-01-01

The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work began to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal year 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995, these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single-shell tanks, i.e., the electrical distribution, HVAC, leak detection, and radiation monitoring systems. The tasks required to meet these objectives include the following: identify system boundaries or scope for drawings being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by 'smart' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings.

  6. System Description: Embedding Verification into Microsoft Excel

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  7. Survey on Offline Finger Print Verification System

    Suman, R.; Kaur, R.

    2012-01-01

In fingerprint verification, "verification" implies a user matching a fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological…
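The one-to-one verification setting described here is typically evaluated by its false accept rate (FAR) and false reject rate (FRR) at a chosen score threshold. A minimal sketch of that computation over samples of genuine and impostor match scores (the scores and threshold below are illustrative):

```python
def far_frr(genuine, impostor, threshold):
    """FAR: fraction of impostor scores accepted (at/above threshold).
    FRR: fraction of genuine scores rejected (below threshold)."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr
```

Sweeping the threshold and plotting FAR against FRR gives the usual trade-off curve; the crossing point is the equal error rate.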

  8. Packaged low-level waste verification system

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]

    1995-12-31

The Department of Energy, through the National Low-Level Waste Management Program, and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator's shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  9. Formal verification of industrial control systems

    CERN. Geneva

    2015-01-01

Verification of critical software is a high priority, but it is a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically requires formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  10. Systems Approach to Arms Control Verification

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.
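Acquisition path analysis of the kind described can be sketched as path enumeration over a directed graph whose edges carry detection probabilities; the weakest-covered path is the one verification effort should target first. The graph, the probabilities, and the assumption of independently monitored steps below are all invented for illustration:

```python
def acquisition_paths(graph, detect_p, start, goal):
    """List (edge path, detection probability) for each simple path from start
    to goal. detect_p maps edge (u, v) -> probability that taking that step is
    detected; steps are assumed independently monitored. Results are sorted so
    the weakest-covered path comes first."""
    results = []

    def dfs(node, visited, edges):
        if node == goal:
            p_miss = 1.0
            for e in edges:
                p_miss *= 1.0 - detect_p[e]
            results.append((edges, 1.0 - p_miss))
            return
        for nxt in graph.get(node, ()):
            if nxt not in visited:
                dfs(nxt, visited | {nxt}, edges + [(node, nxt)])

    dfs(start, {start}, [])
    return sorted(results, key=lambda r: r[1])
```

In the test below, the poorly monitored route through 'b' surfaces first, signalling where additional verification measures would deter or detect a violation most effectively.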

  11. Automated Formal Verification for PLC Control Systems

    Fernández Adiego, Borja

    2014-01-01

Programmable Logic Controllers (PLCs) are widely used devices in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice for ensuring the correctness of safety-critical software. However, these techniques are still not widely applied in industry, due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology for performing automated model checking of complex properties expressed in temporal logics (e.g., CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the modeling languages of different verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  12. Safety Verification for Probabilistic Hybrid Systems

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Vol. 18, No. 6 (2012), pp. 572-587. ISSN 0947-3580. R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020. Institutional research plan: CEZ:AV0Z10300504. Keywords: model checking; hybrid systems; formal verification. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.250, year: 2012

  13. Burnup verification using the FORK measurement system

    Ewing, R.I.

    1994-01-01

Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used for many years to verify reactor site records for burnup and cooling time. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while they are in the storage pool. This report deals with the application of the FORK system to burnup credit operations, based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company.
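Passive neutron emission from spent fuel is dominated by Cm-244 and is often modeled as growing roughly with the fourth power of burnup, which is what makes a FORK-style measurement a useful check on declared burnup. The calibration constant, exponent, and tolerance below are illustrative assumptions, not actual FORK parameters:

```python
def inferred_burnup(neutron_rate, k, alpha=4.0):
    """Invert the assumed power law: neutron_rate = k * burnup**alpha."""
    return (neutron_rate / k) ** (1.0 / alpha)

def verify_declaration(neutron_rate, declared_burnup, k, tolerance=0.10, alpha=4.0):
    """Accept the site record if the measurement-inferred burnup is within
    a fractional tolerance of the declared value."""
    measured = inferred_burnup(neutron_rate, k, alpha)
    return abs(measured - declared_burnup) <= tolerance * declared_burnup
```

A measurement consistent with the declared burnup passes; one implying substantially higher burnup flags the record for follow-up.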

  14. Parametric Verification of Weighted Systems

    Christoffersen, Peter; Hansen, Mikkel; Mariegaard, Anders

    2015-01-01

…are themselves indexed with linear equations. The parameters change the model-checking problem into the problem of computing a linear system of inequalities that characterizes the parameters that guarantee satisfiability. To address this problem, we use parametric dependency graphs (PDGs), and we propose a global update function that yields an assignment to each node in a PDG. For an iterative application of the function, we prove that a fixed-point assignment to PDG nodes exists and that the set of assignments constitutes a well-quasi ordering, thus ensuring that the fixed-point assignment can be found after…
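The fixed-point computation described, iterating a global update function over a dependency graph until the assignment stabilizes, can be sketched generically (Kleene iteration). The min-plus update used below is a toy semantics for illustration, not the paper's linear-inequality systems:

```python
def fixed_point(nodes, edges, update, bottom):
    """Kleene iteration: start every node at bottom and repeatedly apply the
    global update function until the assignment is stable. edges maps each
    node to the nodes it depends on."""
    assign = {n: bottom for n in nodes}
    changed = True
    while changed:
        changed = False
        for n in nodes:
            new = update(n, [assign[d] for d in edges.get(n, [])])
            if new != assign[n]:
                assign[n] = new
                changed = True
    return assign
```

With node weights a=1, b=2, c=3 and the chain a -> b -> c, the update "own weight plus cheapest dependency" stabilizes at a=6, b=5, c=3.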

  15. Formal System Verification - Extension 2

    2012-08-08

…vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel, together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in…

  16. Specification and Verification of Hybrid System

    Widjaja, Belawati H.

    1997-01-01

Hybrid systems are reactive systems which intermix two kinds of components: discrete components and continuous components. The continuous components, usually called plants, are subject to disturbances which cause the state variables of the system to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors, and actuators controlled by programs. These programs are designed to select, control, and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper gives a design methodology for hybrid systems as an example of the specification and verification of hybrid systems. The design methodology is based on cooperation between two disciplines, control engineering and computer science, and divides the design into control loops and decision loops. The external behavior of control loops is specified in a notation which is understandable by both disciplines. The design of control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is also guaranteed analytically or experimentally by control engineers. The decision loops are designed in computer science based on the specifications of the control loops. The verification of system requirements can be done by computer scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as the formal notation for specifying the control loops and designing the decision loops.
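The split described, a continuous control loop designed by control engineers and supervised by a discrete decision loop, can be illustrated with a linearized inverted pendulum under a PD control law. The gains, time step, and horizon below are invented for the example, and the paper itself works in the Mean Value Calculus rather than by simulation:

```python
def simulate(theta0, steps=500, dt=0.01, g_over_l=9.8, kp=30.0, kd=10.0):
    """Symplectic-Euler simulation of a linearized inverted pendulum
    (theta'' = (g/L) * theta + u) under a PD control law; returns the angle
    trajectory. Gains are illustrative, chosen so the closed loop is stable."""
    theta, omega = theta0, 0.0
    traj = [theta]
    for _ in range(steps):
        u = -kp * theta - kd * omega      # discrete control/decision law
        alpha = g_over_l * theta + u      # linearized plant dynamics
        omega += alpha * dt
        theta += omega * dt
        traj.append(theta)
    return traj
```

A decision loop would monitor such a trajectory against a safety envelope (e.g. the angle never leaving a small band around upright) and switch modes if it were violated.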

  17. CTBT integrated verification system evaluation model supplement

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
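IVSEM's integration of the sensor technologies can be illustrated by the simplest combination rule: if subsystem detections were independent, the system-level probability of detection is one minus the product of the individual miss probabilities. This rule is a generic sketch of sensor fusion, not necessarily the model IVSEM actually implements:

```python
def combined_detection(p_subsystems):
    """System-level probability that at least one technology detects the event,
    assuming independent subsystem detections."""
    p_miss = 1.0
    for p in p_subsystems:
        p_miss *= 1.0 - p
    return 1.0 - p_miss
```

Two mediocre subsystems (each 0.5) already yield a 0.75 system-level detection probability, which is the "synergy among the technologies" effect in its simplest form.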

  18. CTBT integrated verification system evaluation model supplement

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  19. Palmprint Based Verification System Using SURF Features

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype robust biometric verification system. The system uses features extracted from images of the human hand with the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images and is found to be robust with respect to translation and rotation. It has a FAR of 0.02%, an FRR of 0.01%, and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
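
The error rates quoted above can be computed directly from raw match scores. Below is a minimal, hypothetical sketch (not the paper's code) of how FAR, FRR, and accuracy follow from genuine and impostor score lists at a fixed decision threshold; the scores and threshold are invented for illustration.

```python
# Hypothetical sketch: computing FAR, FRR, and accuracy for a verification
# system from match scores. Score lists and threshold are illustrative only.

def verification_metrics(genuine_scores, impostor_scores, threshold):
    """Accept a claim when the match score is >= threshold."""
    # False rejects: genuine attempts scoring below the threshold.
    fr = sum(1 for s in genuine_scores if s < threshold)
    # False accepts: impostor attempts scoring at or above the threshold.
    fa = sum(1 for s in impostor_scores if s >= threshold)
    frr = fr / len(genuine_scores)
    far = fa / len(impostor_scores)
    total = len(genuine_scores) + len(impostor_scores)
    accuracy = 1.0 - (fr + fa) / total
    return far, frr, accuracy

genuine = [0.91, 0.88, 0.95, 0.90, 0.87]
impostor = [0.12, 0.30, 0.25, 0.40, 0.18]
far, frr, acc = verification_metrics(genuine, impostor, threshold=0.5)
print(far, frr, acc)  # 0.0 0.0 1.0 (all genuine accepted, all impostors rejected)
```

In practice the threshold is swept over the score range to trade FAR against FRR; the paper's 0.02%/0.01% figures correspond to one such operating point.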

  20. Verification and validation of control system software

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  1. Temporal Specification and Verification of Real-Time Systems.

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  2. A Synthesized Framework for Formal Verification of Computing Systems

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified, and possible improvements proposed.

  3. CTBT Integrated Verification System Evaluation Model

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
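
The report does not publish IVSEM's internals; as a hedged illustration of how an integrated detection estimate can combine per-technology results, the sketch below assumes independent subsystems and invented detection probabilities. This is a common first-order model, not necessarily what IVSEM actually computes.

```python
# Illustrative sketch (not IVSEM's algorithm): estimating the detection
# effectiveness of an integrated monitoring system from per-technology
# detection probabilities, assuming independent subsystems.
# All probabilities below are invented for one hypothetical event scenario.

def integrated_detection_probability(subsystem_probs):
    """P(at least one subsystem detects) = 1 - prod(1 - p_i)."""
    miss = 1.0
    for p in subsystem_probs:
        miss *= (1.0 - p)
    return 1.0 - miss

probs = {"seismic": 0.80, "infrasound": 0.40,
         "radionuclide": 0.55, "hydroacoustic": 0.10}
p_int = integrated_detection_probability(probs.values())
print(round(p_int, 4))  # ≈ 0.9514: the integrated system beats each subsystem
```

The synergy the abstract mentions shows up here: even weak subsystems raise the integrated probability above the best single technology.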

  4. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate the verification and validation of software on board satellites. The process is implemented in the software control unit of the energy particle detector which is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with the evidence; if any of the constraints is not satisfied by the on-target evidence, the scheduling analysis is not valid.
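
The final step described above, feeding on-target evidence into timing constraints, can be sketched as a simple response-time check. All task names, deadlines, and measurements below are hypothetical, and real finite timed automata express much richer constraints than this.

```python
# Hedged sketch: checking deadline constraints against timing evidence
# gathered at instrumented points. Task names, deadlines (ms), and the
# observations are invented; they are not from the Solar Orbiter software.

def check_deadlines(evidence, deadlines_ms):
    """evidence: task -> list of (release_ms, finish_ms) observations.
    Returns the observations that violate their task's deadline."""
    violations = []
    for task, observations in evidence.items():
        for release, finish in observations:
            if finish - release > deadlines_ms[task]:
                violations.append((task, release, finish))
    return violations

evidence = {
    "acquire_particle_data": [(0, 7), (20, 26)],
    "telemetry_frame": [(0, 45), (100, 152)],  # second observation overruns
}
deadlines = {"acquire_particle_data": 10, "telemetry_frame": 50}
print(check_deadlines(evidence, deadlines))  # [('telemetry_frame', 100, 152)]
```

An empty result would correspond to the scheduling analysis being consistent with the on-target evidence; any returned tuple invalidates it.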

  5. Compositional verification of real-time systems using Ecdar

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  6. Entropy Measurement for Biometric Verification Systems.

    Lim, Meng-Hui; Yuen, Pong C

    2016-05-01

    Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intrauser variations in the biometric data. This is important to preserve a reasonable acceptance rate of genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases the impostor's difficulty of obtaining a system-acceptable measurement, thus resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accept multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic and a benchmark face dataset justifies the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.
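
The notion of guessing entropy that the model builds on can be stated compactly: the expected number of guesses for an adversary who tries candidates in order of decreasing probability. A minimal sketch with an invented distribution:

```python
# Minimal sketch of guessing entropy, the quantity the model above builds on.
# The distributions are illustrative, not from the paper's datasets.

def guessing_entropy(probabilities):
    """Expected guess count for an optimal (descending-probability) strategy."""
    ordered = sorted(probabilities, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

# A uniform distribution over N values costs (N + 1) / 2 expected guesses;
# a skewed distribution is cheaper for the adversary to attack.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(guessing_entropy(uniform), guessing_entropy(skewed))
```

The skewed case illustrates the paper's point: when many similar measurements are acceptable, the effective distribution over system-acceptable values becomes more concentrated, and the expected guessing effort drops.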

  7. Formal Development and Verification of a Distributed Railway Control System

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  8. Portable system for periodical verification of area monitors for neutrons

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a project for the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to verify, at the installations where the instruments are used, that their calibration is maintained, avoiding the use of equipment whose response to neutron beams is inadequate

  9. Formal Development and Verification of a Distributed Railway Control System

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  10. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (computer-based system). These software modules can be either in-house designs or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan

  11. Standard Verification System Lite (SVS Lite)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  12. Verification and Validation Issues in Systems of Systems

    Eric Honour

    2013-11-01

    The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  13. NPP Temelin instrumentation and control system upgrade and verification

    Ubra, O.; Petrlik, J.

    1998-01-01

    Two VVER 1000 units of the Czech nuclear power plant Temelin, which are under construction, are being upgraded with the latest instrumentation and control system delivered by WEC. To confirm that the functional designs of the new Reactor Control and Limitation System, Turbine Control System and Plant Control System are in compliance with the Czech customer requirements, and that these requirements are compatible with the NPP Temelin upgraded technology, verification of the control systems has been performed. The method of transient analysis has been applied. Some details of the NPP Temelin Reactor Control and Limitation System verification are presented. (author)

  14. Dense time discretization technique for verification of real time systems

    Makackas, Dalius; Miseviciene, Regina

    2016-01-01

    For verifying real-time systems there are two different models of time: discrete and dense time based models. This paper proposes a novel verification technique, which calculates discrete time intervals from dense time in order to create all the system states that can be reached from the initial system state. The technique is designed for real-time systems specified by a piece-linear aggregate approach. Key words: real-time system, dense time, verification, model checking, piece-linear aggregate

  15. IDEF method for designing seismic information system in CTBT verification

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    A seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method for constructing function models and the IDEF1x method for making information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  16. Logic verification system for power plant sequence diagrams

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for sequence diagrams of power plants has been developed. The system's main function is to verify the correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (Interlock Block Diagrams), in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatches between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)
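
The symbolic comparison described above can be illustrated, in miniature, as an exhaustive equivalence check between a reference logic and a candidate realization. The example expressions are hypothetical; the actual system compares sequence-diagram logic against IBDs symbolically and with design knowledge rather than by brute force.

```python
# Hedged illustration: checking that a candidate circuit logic matches a
# reference logic over all input combinations, reporting a mismatch witness.
# The trip-signal expressions are invented for illustration.

from itertools import product

def equivalent(f, g, n_inputs):
    """Return (True, None) if f == g on all inputs, else (False, witness)."""
    for inputs in product([False, True], repeat=n_inputs):
        if f(*inputs) != g(*inputs):
            return False, inputs  # the mismatching sub-case
    return True, None

# Reference (IBD-style) logic and two candidate realizations of a trip signal.
reference = lambda a, b, c: (a and b) or c
candidate = lambda a, b, c: (a or c) and (b or c)  # equivalent by distribution
broken    = lambda a, b, c: (a and b) and c        # not equivalent

print(equivalent(reference, candidate, 3))  # (True, None)
print(equivalent(reference, broken, 3))     # (False, <witness inputs>)
```

For realistic circuit sizes exhaustive enumeration does not scale, which is why the developed system works symbolically; the witness returned here plays the role of the mismatching sub-circuit the abstract mentions.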

  17. Formal development and verification of a distributed railway control system

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  18. A hand held photo identity verification system for mobile applications

    Kumar, Ranajit; Upreti, Anil; Mahaptra, U.; Bhattacharya, S.; Srivastava, G.P.

    2009-01-01

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against the database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details like name, designation, division, etc. All transactions are recorded in the Simputer with time and date for future reference. This system finds extensive applications in mobile identity verification in nuclear and other industries. (author)

  19. Verification and validation of computer based systems for PFBR

    Thirugnanamurthy, D.

    2017-01-01

    The Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates the V and V process followed, document submission requirements in each stage, V and V activities, the checklist used for reviews in each stage, and reports

  20. Parallel verification of dynamic systems with rich configurations

    Pessoa, Eduardo José Dias

    2016-01-01

    Master's dissertation in Informatics Engineering (specialization in Informatics). Model checking is a technique used to automatically verify a model which represents the specification of some system. To ensure the correctness of the system, the verification of both static and dynamic properties is often needed. The specification of a system is made through modeling languages, while the respective verification is made by its model-checker. Most modeling frameworks are not...

  1. Formal Verification of Real-Time System Requirements

    Marcin Szpyrka

    2000-01-01

    The methodology of system requirements verification presented in this paper is a proposition of a practical procedure for reducing some negatives of the specification of requirements. The main problem considered is how to create a complete description of the system requirements without any negatives. Verification of the initially defined requirements is based on coloured Petri nets. Those nets are useful for testing properties of system requirements such as completeness, consistency and optimality. An example of the lift controller is presented.

  2. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and assure the correct functioning of Ambient Assisted Living (AAL) systems. We validate this approach by applying it to an Emergency Assistance System for monitoring people suffering from cardiac alteration with syncope.

  3. Verification and Validation for Flight-Critical Systems (VVFCS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V & V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).

  4. Formal verification of automated teller machine systems using SPIN

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas.
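
The models themselves are written in PROMELA, but the essence of what SPIN does, exhaustively exploring the state space for property violations, can be sketched in Python. The ATM states, transitions, and safety property below are invented for illustration, and a bounded breadth-first search stands in for SPIN's full LTL machinery.

```python
# Sketch of explicit-state exploration of a hypothetical ATM model, checking
# the safety property "cash is never dispensed before the PIN is verified".
# Real models are written in PROMELA with properties as LTL formulas.

from collections import deque

TRANSITIONS = {
    "idle": ["card_inserted"],
    "card_inserted": ["pin_verified", "idle"],  # verify PIN or eject card
    "pin_verified": ["dispensing", "idle"],
    "dispensing": ["idle"],
}

def violates_safety(path):
    """True if 'dispensing' occurs without a prior 'pin_verified' on the path."""
    if "dispensing" not in path:
        return False
    return "pin_verified" not in path[:path.index("dispensing")]

def check_model(initial="idle", depth=6):
    """Breadth-first exploration of all paths up to a bounded depth."""
    queue = deque([[initial]])
    while queue:
        path = queue.popleft()
        if violates_safety(path):
            return path  # counterexample trace, as a model checker would report
        if len(path) < depth:
            for nxt in TRANSITIONS[path[-1]]:
                queue.append(path + [nxt])
    return None

print(check_model())  # None: no counterexample within the bound
```

Here "dispensing" is only reachable from "pin_verified", so the search finds no counterexample; deleting that precondition from the transition table would make `check_model` return a violating trace instead.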

  5. Software verification in on-line systems

    Ehrenberger, W.

    1980-01-01

    Operator assistance is more and more provided by computers. Computers contain programs, whose quality should be above a certain level, before they are allowed to be used in reactor control rooms. Several possibilities for gaining software reliability figures are discussed in this paper. By supervising the testing procedure of a program, one can estimate the number of remaining programming errors. Such an estimation, however, is not very accurate. With mathematical proving procedures one can gain some knowledge on program properties. Such proving procedures are important for the verification of general WHILE-loops, which tend to be error prone. The program analysis decomposes a program into its parts. First the program structure is made visible, which includes the data movements and the control flow. From this analysis test cases can be derived that lead to a complete test. Program analysis can be done by hand or automatically. A statistical program test normally requires a large number of test runs. This number is diminished if details concerning both the program to be tested or its use are known in advance. (orig.)
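
One statistical argument of the kind mentioned above can be made concrete: after N independent, operationally representative failure-free test runs, a classical zero-failure bound limits the per-run failure probability. The bound below is a standard textbook estimate, stated here as an illustration rather than a method prescribed by the paper.

```python
# Hedged sketch: upper confidence bound on per-run failure probability after
# n failure-free random tests. From (1 - p)^n <= alpha one gets the standard
# approximation p <= -ln(alpha) / n at confidence 1 - alpha.

import math

def failure_probability_bound(n_successful_runs, confidence=0.95):
    """Upper confidence bound on failure probability after failure-free runs."""
    alpha = 1.0 - confidence
    return -math.log(alpha) / n_successful_runs

# After 3000 failure-free test runs we can claim, at 95% confidence,
# a per-run failure probability of at most about 1e-3.
print(failure_probability_bound(3000))
```

This also shows why the abstract notes that a statistical test "normally requires a large number of test runs": the achievable bound shrinks only linearly in the number of failure-free runs.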

  6. Android-Based Verification System for Banknotes

    Ubaid Ur Rahman

    2017-11-01

    With the advancement of imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes losses to banks, traders, and individuals involved in financial transactions. Hence, efficient and reliable techniques for detection of counterfeit banknotes are inevitably needed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing related tasks on these phones. In addition, the number of smartphone users has grown greatly and continues to increase. This is a great motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by the X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered, since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes are very different in terms of the printing process, the ingredients used in preparation of banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.

  7. Towards Verification of Constituent Systems through Automated Proof

    Couto, Luis Diogo Monteiro Duarte; Foster, Simon; Payne, R

    2014-01-01

    This paper explores verification of constituent systems within the context of the Symphony tool platform for Systems of Systems (SoS). Our SoS modelling language, CML, supports various contractual specification elements, such as state invariants and operation preconditions, which can be used to specify contractual obligations on the constituent systems of a SoS. To support verification of these obligations we have developed a proof obligation generator and theorem prover plugin for Symphony. The latter uses the Isabelle/HOL theorem prover to automatically discharge the proof obligations arising from a CML model. Our hope is that the resulting proofs can then be used to formally verify the conformance of each constituent system, which in turn would result in a dependable SoS.

  8. Alien Registration Number Verification via the U.S. Citizenship and Immigration Service's Systematic Alien Verification for Entitlements System

    Ainslie, Frances M; Buck, Kelly R

    2008-01-01

    The purpose of this study was to evaluate the implications of conducting high-volume automated checks of the United States Citizenship and Immigration Services Systematic Alien Verification for Entitlements System (SAVE...

  9. Verification and validation guidelines for high integrity systems. Volume 1

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  10. Verification and validation guidelines for high integrity systems. Volume 1

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  11. MESA: Message-Based System Analysis Using Runtime Verification

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for run-time verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a non-intrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
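
The two error classes MESA found, duplicate and out-of-order messages, correspond to simple trace properties. A minimal stand-alone sketch of such a check (hypothetical message ids and sequence numbers, not SWIM data):

```python
# Illustrative trace check for the kinds of properties mentioned above:
# duplicate and out-of-order messages in an arriving stream.

def find_anomalies(messages):
    """messages: iterable of (msg_id, seq_no) pairs, in arrival order."""
    seen, duplicates, out_of_order = set(), [], []
    last_seq = None
    for msg_id, seq in messages:
        if msg_id in seen:
            duplicates.append(msg_id)
        seen.add(msg_id)
        if last_seq is not None and seq < last_seq:
            out_of_order.append(msg_id)
        last_seq = seq
    return duplicates, out_of_order

stream = [("a", 1), ("b", 2), ("b", 3), ("c", 2), ("d", 4)]
dups, ooo = find_anomalies(stream)
print(dups, ooo)  # ['b'] ['c']
```

In the MESA setting such checks run non-intrusively against live message queues and are written in TraceContract's DSL rather than plain Python.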

  12. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently

  13. Automatic Verification of Railway Interlocking Systems: A Case Study

    Petersen, Jakob Lyng

    1998-01-01

    This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway station Alingsås by using the decision procedure which... We express thoughts on what is needed in order to be able to successfully verify large real-life systems....

  14. Expert system verification and validation for nuclear power industry applications

    Naser, J.A.

    1990-01-01

    The potential for the use of expert systems in the nuclear power industry is widely recognized. The benefits of such systems include consistency of reasoning during off-normal situations when humans are under great stress, the reduction of times required to perform certain functions, the prevention of equipment failures through predictive diagnostics, and the retention of human expertise in performing specialized functions. The increased use of expert systems brings with it concerns about their reliability. Difficulties arising from software problems can affect plant safety, reliability, and availability. A joint project between EPRI and the US Nuclear Regulatory Commission is being initiated to develop a methodology for verification and validation of expert systems for nuclear power applications. This methodology will be tested on existing and developing expert systems. This effort will explore the applicability of conventional verification and validation methodologies to expert systems. The major area of concern will be certification of the knowledge base. This is expected to require new types of verification and validation techniques. A methodology for developing validation scenarios will also be studied

  15. The verification of neutron activation analysis support system (cooperative research)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Sawahata, Hiroyuki; Ito, Yasuo [Tokyo Univ. (Japan). Research Center for Nuclear Science and Technology; Onizawa, Kouji [Radiation Application Development Association, Tokai, Ibaraki (Japan)

    2000-12-01

    The neutron activation analysis support system is a system with which even a user who has little experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, subjects such as the functions, usability, and the precision and accuracy of analysis of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer and analyzer equipped in the JRR-3M PN-3 facility, and the analysis software KAYZERO/SOLCOI based on the k{sub 0} method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field and analysis of three kinds of environmental standard samples were carried out. The k{sub 0} method adopted in this system has recently been utilized primarily in Europe; it is an analysis method which can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values guaranteed as analytical data of the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed to an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)
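
    The k{sub 0} standardization used by such a system can be sketched numerically: the mass fraction of an analyte follows from its decay-corrected specific count rate relative to a gold comparator, scaled by the k{sub 0} factor and flux/efficiency corrections. The sketch below is illustrative only; the nuclide data, count rates and correction factors are hypothetical values, not figures from this report, and the finer points of the Q{sub 0}(α) correction are omitted.

    ```python
    import math

    def specific_count_rate(Np, t_irr, t_decay, t_meas, w, half_life):
        """Decay-corrected specific count rate Asp = Np/(S*D*C*w*t_meas),
        with the usual saturation (S), decay (D) and counting (C) factors."""
        lam = math.log(2.0) / half_life
        S = 1.0 - math.exp(-lam * t_irr)                      # saturation factor
        D = math.exp(-lam * t_decay)                          # decay factor
        C = (1.0 - math.exp(-lam * t_meas)) / (lam * t_meas)  # counting factor
        return Np / (S * D * C * w * t_meas)

    def k0_concentration(Asp_a, Asp_au, k0, f, Q0_a, Q0_au, eff_a, eff_au):
        """Simplified k0 formula: analyte mass fraction relative to the
        Au comparator (f = thermal/epithermal flux ratio, eff = detector
        efficiency at the respective gamma lines)."""
        return (Asp_a / Asp_au) / k0 * (f + Q0_au) / (f + Q0_a) * (eff_au / eff_a)

    # hypothetical peak areas, timings and nuclide data
    asp_a = specific_count_rate(5.0e4, 3600.0, 7200.0, 1800.0, 0.5, 15.0 * 3600)
    asp_au = specific_count_rate(2.0e5, 3600.0, 7200.0, 1800.0, 1.0e-3, 2.7 * 86400)
    conc = k0_concentration(asp_a, asp_au, 0.85, 40.0, 0.6, 15.7, 0.02, 0.018)
    ```

    The appeal of the method, as the abstract notes, is that no individual comparison standard per element is needed: only the comparator and tabulated k{sub 0} data.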

  16. Modelling and Verification of Relay Interlocking Systems

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour...

  17. Compositional Verification of Multi-Station Interlocking Systems

    Macedo, Hugo Daniel dos Santos; Fantechi, Alessandro; Haxthausen, Anne Elisabeth

    2016-01-01

    pose a big challenge to current verification methodologies, due to the explosion of state space size as soon as large, if not medium-sized, multi-station systems have to be controlled. For these reasons, verification techniques that exploit locality principles related to the topological layout...... of the controlled system to split the state space in different ways have been investigated. In particular, compositional approaches divide the controlled track network into regions that can be verified separately, once proper assumptions are considered on the way the pieces are glued together. Building on a successful...... method to verify networks of rather large size, we propose a compositional approach that is particularly suitable to address multi-station interlocking systems which control a whole line composed of stations linked by mainline tracks. Indeed, it turns out that for such networks, and for the adopted...

  18. Investigation of novel spent fuel verification system for safeguard application

    Lee, Haneol; Yim, Man-Sung [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of a repository by calculating the dose to the public after the repository is closed, depending on their scenario. Because most spent fuel repositories are non-retrievable, research on the security and safeguards of spent fuel repositories also has to be performed, and design-based security and safeguards have to be developed for future repository designs. This study summarizes the requirements of future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguard requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA. The requirements emphasize surveillance and verification. The surveillance and verification of spent fuel is currently accomplished by using the Cerenkov radiation detector while spent fuel is being stored in a fuel pool. This research investigated an advanced spent fuel verification system which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source.
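
    The scale of such a conversion experiment can be bounded with simple arithmetic: the gamma power emitted by an 8.51 GBq Cs-137 source sets a hard ceiling on any harvested electrical output. The sketch below uses standard Cs-137 decay data; the 1% conversion efficiency is a purely hypothetical placeholder, not the authors' measured value.

    ```python
    GAMMA_ENERGY_MEV = 0.662   # Cs-137/Ba-137m gamma line
    BRANCHING = 0.851          # gammas emitted per decay
    MEV_TO_J = 1.602e-13       # joules per MeV

    def source_gamma_power(activity_bq):
        """Total gamma power (W) emitted by a Cs-137 source of given activity."""
        return activity_bq * BRANCHING * GAMMA_ENERGY_MEV * MEV_TO_J

    p_gamma = source_gamma_power(8.51e9)   # the source strength quoted in the abstract
    p_electric = p_gamma * 0.01            # hypothetical 1% conversion efficiency
    ```

    At 8.51 GBq the emitted gamma power is well under a milliwatt, so a lab-scale test of this kind is about demonstrating the conversion principle, not producing usable power.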

  19. Investigation of novel spent fuel verification system for safeguard application

    Lee, Haneol; Yim, Man-Sung

    2016-01-01

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of a repository by calculating the dose to the public after the repository is closed, depending on their scenario. Because most spent fuel repositories are non-retrievable, research on the security and safeguards of spent fuel repositories also has to be performed, and design-based security and safeguards have to be developed for future repository designs. This study summarizes the requirements of future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguard requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA. The requirements emphasize surveillance and verification. The surveillance and verification of spent fuel is currently accomplished by using the Cerenkov radiation detector while spent fuel is being stored in a fuel pool. This research investigated an advanced spent fuel verification system which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source.

  20. Temporal logic runtime verification of dynamic systems

    Seotsanyana, M

    2010-07-01

    This paper provides a novel framework that automatically and verifiably monitors these systems at runtime. The main aim of the framework is to assist the operator through witnesses and counterexamples that are generated during the execution of the system...
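
    A runtime monitor of this kind can be sketched in a few lines: it watches the execution trace and, when a temporal property is violated, hands the operator the counterexample prefix as a witness. This is a generic illustration of runtime verification, not the paper's framework; the bounded-response property and event names are invented for the example.

    ```python
    def monitor_response(trace, trigger, response, deadline):
        """Runtime monitor for the bounded-response property
        G(trigger -> F<=deadline response): every trigger event must be
        answered by a response within `deadline` steps.
        Returns (True, None) if the trace satisfies the property, or
        (False, counterexample_prefix) at the first violation."""
        pending = []  # step indices of triggers not yet answered
        for i, state in enumerate(trace):
            if response in state:
                pending.clear()          # all outstanding obligations discharged
            if trigger in state:
                pending.append(i)        # new obligation
            if pending and i - pending[0] >= deadline:
                return False, trace[: i + 1]   # counterexample prefix for the operator
        return True, None
    ```

    A trace like `[{'req'}, set(), {'grant'}]` satisfies the property for a deadline of 3 steps, while `[{'req'}, set(), set(), set()]` violates it and yields the four-step prefix as the witness.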

  1. Formal modelling and verification of interlocking systems featuring sequential release

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  2. Systems analysis-independent analysis and verification

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. It can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  3. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive-by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a one-year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine, using transport theory, the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons-grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
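
    The time-gating criterion at a given vehicle speed reduces to counting statistics: the dwell time over the collimated field of view fixes how many background counts accumulate, and the Currie decision and detection limits then give the minimum detectable source signature. The field-of-view width and background rate below are illustrative assumptions, not values from this report.

    ```python
    import math

    MPH_TO_MPS = 0.44704

    def dwell_time(fov_m, speed_mph):
        """Time (s) a storage container stays in the collimated field of view."""
        return fov_m / (speed_mph * MPH_TO_MPS)

    def currie_limits(background_cps, t):
        """Currie critical level Lc and detection limit Ld (counts) for a
        paired-blank measurement of duration t seconds."""
        b = background_cps * t
        lc = 2.33 * math.sqrt(b)          # decision threshold (false-alarm control)
        ld = 2.71 + 4.65 * math.sqrt(b)   # minimum detectable net counts
        return lc, ld

    t = dwell_time(1.0, 2.0)              # nominal 2 mph past a hypothetical 1 m aperture
    lc, ld = currie_limits(100.0, t)      # hypothetical 100 cps gamma background
    ```

    Raising the speed shortens the dwell time and so raises the minimum source signature; this is the trade-off the time-gating criterion captures.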

  4. Verification on reliability of heat exchanger for primary cooling system

    Koike, Sumio; Gorai, Shigeru; Onoue, Ryuji; Ohtsuka, Kaoru

    2010-07-01

    Prior to the JMTR refurbishment, verification of the reliability of the heat exchangers for the primary cooling system was carried out to investigate the integrity of these continuously used components. As a result, no significant corrosion, decrease of tube thickness, or cracks were observed on the heat exchangers, and their integrity was confirmed. For long-term usage of the heat exchangers, maintenance based on periodical inspection and a long-term maintenance plan is scheduled. (author)

  5. System Identification and Verification of Rotorcraft UAVs

    Carlton, Zachary M.

    The task of a controls engineer is to design and implement control logic. To complete this task, it helps tremendously to have an accurate model of the system to be controlled. Obtaining a very accurate system model is not a trivial task, as much time and money are usually associated with the development of such a model. A typical physics-based approach can require hundreds of hours of flight time. In an iterative process the model is tuned in such a way that it accurately models the physical system's response. This process becomes even more complicated for unstable and highly non-linear systems such as the dynamics of rotorcraft. An alternate approach to solving this problem is to extract an accurate model by analyzing the frequency response of the system. This process involves recording the system's responses over a frequency range of input excitations. From this data, an accurate system model can then be deduced. Furthermore, it has been shown that with use of the software package CIFER® (Comprehensive Identification from FrEquency Responses), this process can both greatly reduce the cost of modeling a dynamic system and produce very accurate results. The topic of this thesis is to apply CIFER® to a quadcopter to extract a system model for the flight condition of hover. The quadcopter itself is comprised of off-the-shelf components with a Pixhack flight controller board running open-source Ardupilot controller logic. In this thesis, both the closed and open loop systems are identified. The model is next compared to dissimilar flight data and verified in the time domain. Additionally, the ESC (Electronic Speed Controller) motor/rotor subsystem, which comprises all the vehicle's actuators, is also identified. This process required the development of a test bench environment, which included a GUI (Graphical User Interface), data pre- and post-processing, as well as the augmentation of the flight controller source code. This augmentation of code allowed for
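
    The core of frequency-domain identification can be illustrated without CIFER®: excite the system with a broadband input, form cross- and auto-spectra, and take their ratio as the frequency-response estimate. The first-order plant, sample rate and seed below are hypothetical; real rotorcraft work would add windowed, overlapped spectral averaging and coherence weighting.

    ```python
    import numpy as np

    def frequency_response(u, y, fs):
        """H1-style frequency-response estimate H(f) = Suy(f)/Suu(f) from a
        single input/output record (a bare-bones sketch of the averaged,
        windowed estimate a tool like CIFER computes)."""
        U = np.fft.rfft(u)
        Y = np.fft.rfft(y)
        f = np.fft.rfftfreq(len(u), d=1.0 / fs)
        H = (np.conj(U) * Y) / (np.conj(U) * U + 1e-12)
        return f, H

    # identify a known first-order lag  y' = a*(u - y), with illustrative a
    fs, a = 100.0, 5.0
    t = np.arange(0.0, 50.0, 1.0 / fs)
    rng = np.random.default_rng(0)
    u = rng.standard_normal(t.size)          # broadband excitation
    y = np.empty_like(u)
    y[0] = 0.0
    for k in range(1, u.size):               # forward-Euler simulation of the plant
        y[k] = y[k - 1] + a * (u[k - 1] - y[k - 1]) / fs
    f, H = frequency_response(u, y, fs)
    ```

    The estimated magnitude is near the unit DC gain at low frequency and rolls off above the plant's ~0.8 Hz corner, which is how the identified model is read off the data.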

  6. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  7. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  8. Functional verification of dynamically reconfigurable FPGA-based systems

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. Readers are enabled with this simulation-only layer to maintain verification productivity by abstracting away the physical details of the FPGA fabric.  Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  9. Verification test of control rod system for HTR-10

    Zhou Huizhong; Diao Xingzhong; Huang Zhiyong; Cao Li; Yang Nianzu

    2002-01-01

    There are 10 sets of control rods and driving devices in the 10 MW High Temperature Gas-cooled Test Reactor (HTR-10). The control rod system is the control and shutdown system of HTR-10, designed for reactor criticality, operation, and shutdown. In order to guarantee technical feasibility, a series of verification tests was performed, including a room temperature test, a thermal test, a test after the control rod system was installed in HTR-10, and a test of the control rod system before HTR-10 first criticality. All the test data showed that the driving devices worked well, the control rods ran smoothly up and down, arbitrary positions could be set and held well, and positions were indicated exactly.

  10. Development and verification of Monte Carlo burnup calculation system

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

    A Monte Carlo burnup calculation code system has been developed to accurately evaluate various quantities required in the backend field. Using the nuclide compositions measured in the Actinide Research in a Nuclear Element (ARIANE) program for fuel rods in fuel assemblies irradiated in a commercial Netherlands BWR, analyses have been performed for verification of the code system. The code system developed in this paper has been verified through analyses of MOX and UO2 fuel rods. This system makes it possible to reduce the large margin assumed in the present criticality analysis for LWR spent fuels. (J.P.N.)
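
    The depletion step that such a code system couples to Monte Carlo transport reduces, for a single nuclide under a constant one-group flux, to an exponential burn-off. The cross section, flux and inventory below are illustrative round numbers, not ARIANE data; a real burnup system solves the full coupled Bateman chains with transport-updated cross sections.

    ```python
    import math

    def deplete_u235(n0, flux, sigma_a_barns, days):
        """Single-nuclide depletion N(t) = N0 * exp(-sigma_a * phi * t):
        the simplest piece of what a burnup solver does once the transport
        step has supplied a one-group cross section and flux."""
        sigma_cm2 = sigma_a_barns * 1e-24          # barns -> cm^2
        t_s = days * 86400.0                       # days -> seconds
        return n0 * math.exp(-sigma_cm2 * flux * t_s)

    # one year at a hypothetical 3e13 n/cm^2/s thermal flux
    n_final = deplete_u235(1.0e21, 3.0e13, 680.0, 365.0)
    ```

    With these numbers roughly half the initial U-235 inventory remains after a year, which is the kind of quantity the ARIANE-measured compositions are used to verify.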

  11. Systems analysis - independent analysis and verification

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  12. On the Symbolic Verification of Timed Systems

    Moeller, Jesper; Lichtenberg, Jacob; Andersen, Henrik Reif

    1999-01-01

    This paper describes how to analyze a timed system symbolically. That is, given a symbolic representation of a set of (timed) states (as an expression), we describe how to determine an expression that represents the set of states that can be reached either by firing a discrete transition...... or by advancing time. These operations are used to determine the set of reachable states symbolically. We also show how to symbolically determine the set of states that can reach a given set of states (i.e., a backwards step), thus making it possible to verify TCTL-formulae symbolically. The analysis is fully...... symbolic in the sense that both the discrete and the continuous part of the state space are represented symbolically. Furthermore, both the synchronous and asynchronous concurrent composition of timed systems can be performed symbolically. The symbolic representations are given as formulae expressed...
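
    Both operations the paper performs symbolically (the forward step and the backwards step) are fixpoint computations. The explicit-state sketch below shows the shape of those fixpoints on a toy transition system; sets of states stand in here for the symbolic expressions the paper manipulates, and the example transition relation is invented for illustration.

    ```python
    def reachable(init, post):
        """Forward reachability as the least fixpoint of R = init ∪ post(R)."""
        R = set(init)
        frontier = set(init)
        while frontier:
            new = set()
            for s in frontier:
                new |= post(s) - R       # states reached by one more step
            R |= new
            frontier = new
        return R

    def can_reach(bad, post, states):
        """Backwards step iterated to a fixpoint: all states from which `bad`
        is reachable (the basis for checking reachability properties)."""
        B = set(bad)
        changed = True
        while changed:
            grown = B | {s for s in states if post(s) & B}
            changed = grown != B
            B = grown
        return B
    ```

    On a chain 0 → 1 → … → 5, the forward fixpoint from {0} and the backward fixpoint from {5} both converge to the whole state space, mirroring how symbolic forward and backward analyses meet in the middle.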

  13. Tracer verification and monitoring of containment systems

    Lowry, W.; Dunn, S.D.; Williams, C.

    1996-01-01

    In-situ barrier emplacement techniques and materials for the containment of high-risk contaminants in soils are currently being developed by the Department of Energy (DOE). Because of their relatively high cost, the barriers are intended to be used in cases where the risk is too great to remove the contaminants, the contaminants are too difficult to remove with current technologies, or the potential for movement of the contaminants to the water table is so high that immediate action needs to be taken to reduce health risks. Consequently, barriers are primarily intended for use in high-risk sites where few viable alternatives exist to stop the movement of contaminants in the near term. Assessing the integrity of the barrier once it is emplaced, and during its anticipated life, is a very difficult but necessary requirement. Existing surface-based and borehole geophysical techniques do not provide the degree of resolution required to assure the formation of an integral in-situ barrier. Science and Engineering Associates, Inc. (SEA) and Sandia National Laboratories (SNL) are developing a quantitative subsurface barrier assessment system using gaseous tracers. Called SEAtrace™, this system integrates an autonomous, multipoint soil vapor sampling and analysis system with a global optimization modeling methodology to pinpoint leak sources and sizes in real time. SEAtrace™ is applicable to impermeable barrier emplacements above the water table, providing a conservative assessment of barrier integrity after emplacement, as well as a long-term integrity monitoring function. The SEAtrace™ system is being developed under funding by the DOE-EM Subsurface Contaminant Focus Area

  14. Verification station for Sandia/Rockwell Plutonium Protection system

    Nicholson, N.; Hastings, R.D.; Henry, C.N.; Millegan, D.R.

    1979-04-01

    A verification station has been designed to confirm the presence of plutonium within a container module. These container modules [about 13 cm (5 in.) in diameter and 23 cm (9 in.) high] hold sealed food-pack cans containing either plutonium oxide or metal and were designed by Sandia Laboratories to provide security and continuous surveillance and safety. After the plutonium is placed in the container module, it is closed with a solder seal. The verification station discussed here is used to confirm the presence of plutonium in the container module before it is placed in a carousel-type storage array inside the plutonium storage vault. This measurement represents the only technique that uses nuclear detectors in the plutonium protection system

  15. Verification test report on a solar heating and hot water system

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  16. LHC Beam Loss Monitoring System Verification Applications

    Dehning, B; Zamantzas, C; Jackson, S

    2011-01-01

    The LHC Beam Loss Monitoring (BLM) system is one of the most complex instrumentation systems deployed in the LHC. In addition to protecting the collider, the system also needs to provide a means of diagnosing machine faults and deliver a feedback of losses to the control room as well as to several systems for their setup and analysis. It has to transmit and process signals from almost 4'000 monitors, and has nearly 3 million configurable parameters. The system was designed with reliability and availability in mind. The specified operation and the fail-safety standards must be guaranteed for the system to perform its function in preventing superconductive magnet destruction caused by particle flux. Maintaining the expected reliability requires extensive testing and verification. In this paper we report our most recent addit...

  17. Geometrical verification system using Adobe Photoshop in radiotherapy.

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films. It is possible to scan images and then view them simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared the following two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured distances between the isocenter on simulation films and that on portal films by adjusting the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured distances between the isocenter on simulation films and those on portal films by adjusting the bony structures. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes. This verification system is available on any PC with this software and is useful for improving the accuracy of verification.

  18. Internet-based dimensional verification system for reverse engineering processes

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  19. Internet-based dimensional verification system for reverse engineering processes

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  20. Expert system verification and validation survey. Delivery 3: Recommendations

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues is being addressed and to what extent it has impacted the development of ESs.

  1. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates seque...... SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs....

  2. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates seque...... SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs....

  3. Source Code Verification for Embedded Systems using Prolog

    Frank Flederer

    2017-01-01

    System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures; on the other hand, Prolog's non-determinism and backtracking ease tests of different variations of the program flow without great effort. A rule-based approach with Prolog allows the verification goals to be characterized in a concise and declarative way. In this paper, we describe our approach to verify the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then further tested against verification goals. The different program flow branchings due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study, where we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain-specific language (DSL) in Prolog to express the verification goals.
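The semaphore case study lends itself to a compact illustration. The sketch below transposes the idea to Python's standard `ast` module (the paper itself works on C++ ASTs in Prolog); the toy source and function names are invented for the example:

```python
import ast

SOURCE = '''
def good(sem):
    sem.acquire()
    do_work()
    sem.release()

def bad(sem):
    sem.acquire()
    do_work()
'''

def method_calls(func):
    """Collect the attribute-method names called inside a function body."""
    return [node.func.attr
            for node in ast.walk(func)
            if isinstance(node, ast.Call)
            and isinstance(node.func, ast.Attribute)]

def unbalanced_semaphores(source):
    """Report functions that acquire a semaphore more often than they
    release it: a crude, Python-side analogue of the paper's check."""
    tree = ast.parse(source)
    offenders = []
    for func in (n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)):
        calls = method_calls(func)
        if calls.count("acquire") > calls.count("release"):
            offenders.append(func.name)
    return offenders

print(unbalanced_semaphores(SOURCE))  # ['bad']
```

In the paper the same facts would be Prolog terms and the check a declarative rule, with backtracking enumerating the program-flow variants instead of the explicit loop used here.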

  4. Rule Systems for Runtime Verification: A Short Tutorial

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on-line and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system with a simple and easily implemented algorithm for effective runtime verification, into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
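The obligation flavor of such rule systems can be sketched in a few lines of Python. This is a toy analogue, not RuleR's or LogScope's actual rule language, and the event names are invented:

```python
def monitor(trace, require="DISPATCH", expect="SUCCESS"):
    """Toy obligation-style trace monitor: each `require` event opens an
    obligation that a matching `expect` event closes; whatever remains
    open at the end of the trace is reported as a violation."""
    pending = set()
    for event, cmd in trace:
        if event == require:
            pending.add(cmd)
        elif event == expect:
            pending.discard(cmd)
    return sorted(pending)

trace = [("DISPATCH", "cmd1"), ("SUCCESS", "cmd1"), ("DISPATCH", "cmd2")]
print(monitor(trace))  # ['cmd2']: dispatched but never succeeded
```

A real rule system would let such obligations be stated declaratively (e.g. "every DISPATCH is eventually followed by SUCCESS") and compiled into a monitor like this one.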

  5. An evaluation of the management system verification pilot at Hanford

    Briggs, C.R.; Ramonas, L.; Westendorf, W.

    1998-01-01

    The Chemical Management System (CMS), currently under development at Hanford, was used as the "test program" for pilot testing the value-added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial-sector "best practice" for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS, including the Department of Energy Richland Operations (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCs). A semi-structured interview process was employed by the team of three "verifiers", who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An "MSV Pilot Effectiveness Survey" was also completed by each panel participant immediately following the interview.

  6. Development of requirements tracking and verification system for the software design of distributed control system

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    In this paper, a prototype of a Requirement Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required by the RTVS are managing, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents were performed to define the user of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  7. Development of requirements tracking and verification system for the software design of distributed control system

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper, a prototype of a Requirement Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required by the RTVS are managing, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents were performed to define the user of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  8. Image-based fingerprint verification system using LabVIEW

    Sunil K. Singla

    2008-09-01

    Biometric-based identification/verification systems provide a solution to the security concerns in the modern world, where the machine is replacing the human in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometrics. Fingerprint biometric systems are either minutiae-based or pattern-learning (image) based. Minutiae-based algorithms depend upon the local discontinuities in the ridge flow pattern and are used when template size is important, while image-based matching algorithms use both the micro and macro features of a fingerprint and are used if fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox, version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database: datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 × 100 and a threshold value of 700 (1000 being the perfect match).

  9. ECG based biometrics verification system using LabVIEW

    Sunil Kumar Singla

    2010-07-01

    Biometric-based authentication systems provide solutions to the problems in high security which remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good to be used as biometric parameters but do not guarantee that the person is present and alive. A voice can be copied, a fingerprint can be lifted from glass onto synthetic skin, and in face recognition systems identical twins or a father and son may have the same facial appearance due to genetic factors. ECG does not have these problems: it cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometrics verification system, developed using Laboratory Virtual Instruments Engineering Workbench (LabVIEW) version 7.1, is discussed. Experiments were conducted on a laboratory database of 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
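The FRR and FAR figures quoted above are simple ratios over genuine and impostor comparison scores. A minimal sketch with made-up scores (any resemblance to the paper's data is coincidental):

```python
def far_frr(genuine, impostor, threshold):
    """False acceptance rate: fraction of impostor scores at/above the
    threshold. False rejection rate: fraction of genuine scores below it."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

genuine  = [0.91, 0.85, 0.78, 0.52]   # invented matcher scores
impostor = [0.30, 0.55, 0.62, 0.20]
print(far_frr(genuine, impostor, 0.6))  # (0.25, 0.25)
```

Sweeping the threshold trades FAR against FRR; the point where the two rates meet is the equal error rate (EER) reported by several of the systems in these records.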

  10. FIR signature verification system characterizing dynamics of handwriting features

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on a finite impulse response (FIR) system characterizing the time-frequency characteristics of dynamic handwriting features. First, the barycenter determined from both the center point of the signature and two adjacent pen-point positions in the signing process, instead of one pen-point position, is used to reduce the fluctuation of handwriting motion. Among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. The stable dynamic handwriting features can thus be described by the relation between the time-frequency characteristics of the dynamic handwriting features. In this study, this relation is represented by an FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system is used as the individual feature for a particular signature. In short, a signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database, consisting of 5,000 signatures from 100 signers. The proposed method yielded an equal error rate (EER) of 3.21% on skilled forgeries.
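The core idea, identifying an FIR system from a pair of signals and comparing impulse responses, can be sketched with ordinary least squares. This is a stdlib-only illustration on synthetic data, not the paper's wavelet-domain formulation:

```python
def fir_fit(x, y, order=2):
    """Least-squares fit of FIR coefficients h with y[t] ~ sum_k h[k]*x[t-k],
    solved via the normal equations with naive Gaussian elimination."""
    n = order
    rows = [[x[t - k] for k in range(n)] for t in range(n - 1, len(x))]
    rhs = y[n - 1:]
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    aty = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(n)]
    for col in range(n):                      # forward elimination
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            ata[r] = [a - f * b for a, b in zip(ata[r], ata[col])]
            aty[r] -= f * aty[col]
    h = [0.0] * n
    for i in reversed(range(n)):              # back substitution
        h[i] = (aty[i] - sum(ata[i][j] * h[j] for j in range(i + 1, n))) / ata[i][i]
    return h

# Synthetic input/output pair generated by a known FIR system.
x = [1.0, 2.0, 0.5, -1.0, 3.0, 0.0]
true_h = [0.8, -0.3]
y = [sum(true_h[k] * x[t - k] for k in range(2) if t - k >= 0)
     for t in range(len(x))]
h = fir_fit(x, y)
print([round(c, 3) for c in h])  # [0.8, -0.3]
# Verification would then threshold a distance between the impulse
# responses fitted for the reference and the questioned signature.
```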

  11. Memory Efficient Data Structures for Explicit Verification of Timed Systems

    Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand

    2014-01-01

    Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. The explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of timed......-arc Petri nets, we explore new data structures for lowering the used memory: PTries for efficient storing of configurations and time darts for semi-symbolic description of the state-space. Both methods are implemented as a part of the tool TAPAAL and the experiments document at least one order of magnitude...... of memory savings while preserving comparable verification times....
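The prefix-sharing idea behind PTries can be illustrated with a plain dictionary-based trie over configurations. This is only a sketch of the storage principle, not TAPAAL's actual data structure:

```python
class Trie:
    """Minimal prefix tree for storing visited configurations (tuples of
    token counts). Shared prefixes are stored once, which is the
    memory-saving idea behind PTries (sketch only)."""
    def __init__(self):
        self.root = {}

    def insert(self, config):
        """Insert a configuration; return True if it was not seen before."""
        node, new = self.root, False
        for part in config:
            if part not in node:
                node[part] = {}
                new = True
            node = node[part]
        if "end" not in node:
            node["end"] = True
            new = True
        return new

seen = Trie()
print(seen.insert((1, 0, 2)))  # True  (new state)
print(seen.insert((1, 0, 2)))  # False (already explored)
print(seen.insert((1, 0, 3)))  # True  (shares the (1, 0) prefix)
```

In an explicit state-space search the `insert` return value doubles as the "have we explored this state?" test, so the trie replaces a hash set of full state vectors.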

  12. Specification and verification of the RTOS for plant protection systems

    Kim, Jin Hyun; Ahn, Young Ah; Lee, Su-Young; Choi, Jin Young; Lee, Na Young

    2004-01-01

    PLC is a computer system for instrumentation and control (I and C) applications such as the control of machinery on factory assembly lines and in nuclear power plants. In the nuclear power industry, systems are classified into three classes according to the integrity required for their intended use: non-safety, safety-related and safety-critical. If a PLC is used for controlling the reactor in a nuclear power plant, it should be identified as safety-critical. A PLC contains several I and C logics in software, including a real-time operating system (RTOS). Hence, the RTOS must also be proved safe and reliable by various ways and methods. In this paper, we apply formal methods to the development of an RTOS for PLCs at the safety-critical level: Statecharts for specification and model checking for verification. We give the results of applying these formal methods to the RTOS. (author)

  13. Verification of Opacity and Diagnosability for Pushdown Systems

    Koichi Kobayashi

    2013-01-01

    In control theory of discrete event systems (DESs), one of the challenging topics is the extension of the theory of finite-state DESs to that of infinite-state DESs. In this paper, we discuss verification of opacity and diagnosability for infinite-state DESs modeled by pushdown automata (called here pushdown systems). First, we discuss opacity of pushdown systems and prove that opacity of pushdown systems is in general undecidable; in addition, a decidable class is clarified. Next, for diagnosability, we prove that under a certain assumption, which differs from the assumption in the existing result, diagnosability of pushdown systems is decidable. Furthermore, a necessary condition and a sufficient condition using finite-state approximations are derived. Finally, as one of the applications, we consider data integration using XML (Extensible Markup Language). The obtained result is useful for developing control theory of infinite-state DESs.

  14. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    1996-09-01

    Computer Science. A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos. September 1996. CMU-CS-96-199. Keywords: real-time systems, formal verification, symbolic

  15. Methods and practices for verification and validation of programmable systems

    Heimbuerger, H.; Haapanen, P.; Pulkkinen, U.

    1993-01-01

    The programmable systems deviate in their properties and behaviour from conventional non-programmable systems to such an extent that their verification and validation for safety-critical applications requires new methods and practices. The safety assessment cannot be based on conventional probabilistic methods due to the difficulties in the quantification of the reliability of the software and hardware. The reliability estimate of the system must be based on qualitative arguments linked to a conservative claim limit. Due to the uncertainty of the quantitative reliability estimate, other means must be used to gain more assurance about the system safety. Methods and practices based on research done by VTT for STUK are discussed in the paper, as well as the methods applicable in the reliability analysis of software-based safety functions. The most essential concepts and models of quantitative reliability analysis are described. The application of software models in probabilistic safety analysis (PSA) is evaluated. (author). 18 refs

  16. Standard practice for verification and classification of extensometer systems

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers procedures for the verification and classification of extensometer systems, but it is not intended to be a complete purchase specification. The practice is applicable only to instruments that indicate or record values that are proportional to changes in length corresponding to either tensile or compressive strain. Extensometer systems are classified on the basis of the magnitude of their errors. 1.2 Because strain is a dimensionless quantity, this document can be used for extensometers based on either SI or US customary units of displacement. Note 1—Bonded resistance strain gauges directly bonded to a specimen cannot be calibrated or verified with the apparatus described in this practice for the verification of extensometers having definite gauge points. (See procedures as described in Test Methods E251.) 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...

  17. A virtual linear accelerator for verification of treatment planning systems

    Wieslander, Elinore

    2000-01-01

    A virtual linear accelerator is implemented into a commercial pencil-beam-based treatment planning system (TPS) with the purpose of investigating the possibility of verifying the system using a Monte Carlo method. The characterization set for the TPS includes depth doses, profiles and output factors, which is generated by Monte Carlo simulations. The advantage of this method over conventional measurements is that variations in accelerator output are eliminated and more complicated geometries can be used to study the performance of a TPS. The difference between Monte Carlo simulated and TPS calculated profiles and depth doses in the characterization geometry is less than ±2% except for the build-up region. This is of the same order as previously reported results based on measurements. In an inhomogeneous, mediastinum-like case, the deviations between TPS and simulations are small in the unit-density regions. In low-density regions, the TPS overestimates the dose, and the overestimation increases with increasing energy from 3.5% for 6 MV to 9.5% for 18 MV. This result points out the widely known fact that the pencil beam concept does not handle changes in lateral electron transport, nor changes in scatter due to lateral inhomogeneities. It is concluded that verification of a pencil-beam-based TPS with a Monte Carlo based virtual accelerator is possible, which facilitates the verification procedure. (author)

  18. Development of the clearance level verification evaluation system. 2. Construction of the clearance data management system

    Kubota, Shintaro; Usui, Hideo; Kawagoshi, Hiroshi

    2014-06-01

    Clearance is defined as the removal of radioactive materials or radioactive objects within authorized practices from any further regulatory control by the regulatory body. In Japan, a clearance level and a procedure for its verification have been introduced under the laws and regulations, and solid clearance wastes inspected by the national authority can be handled and recycled as normal wastes. The most prevalent wastes have been generated from the dismantling of nuclear facilities, so the Japan Atomic Energy Agency (JAEA) has been developing the Clearance Level Verification Evaluation System (CLEVES) as a convenient tool. The Clearance Data Management System (CDMS), which is a part of CLEVES, has been developed to support measurement, evaluation, and the making and recording of documents for clearance level verification. In addition, validation of the evaluation results of the CDMS was carried out by inputting the data of actual clearance activities in the JAEA. Clearance level verification is easily applied by using the CDMS for clearance activities. (author)

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: TRITON SYSTEMS, LLC SOLID BOWL CENTRIFUGE, MODEL TS-5000

    Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft³. ...

  20. Electroacoustic verification of frequency modulation systems in cochlear implant users.

    Fidêncio, Vanessa Luisa Destro; Jacob, Regina Tangerino de Souza; Tanamati, Liége Franzini; Bucuvic, Érika Cristina; Moret, Adriane Lima Mortari

    2017-12-26

    The frequency modulation system is a device that helps to improve speech perception in noise and is considered the most beneficial approach to improving speech recognition in noise for cochlear implant users. According to guidelines, a check needs to be performed before fitting the frequency modulation system. Although there are recommendations regarding the behavioral tests that should be performed when fitting the frequency modulation system to cochlear implant users, there are no published recommendations regarding the electroacoustic test that should be performed. This study aimed to perform and determine the validity of an electroacoustic verification test for frequency modulation systems coupled to different cochlear implant speech processors. The sample included 40 participants between 5 and 18 years of age, users of four different models of speech processors. For the electroacoustic evaluation, we used the Audioscan Verifit device with the HA-1 coupler and the listening check devices corresponding to each speech processor model. In cases where transparency was not achieved, a modification was made to the frequency modulation gain adjustment, and we used the Brazilian version of the "Phrases in Noise Test" to evaluate speech perception in competing noise. Transparency between the frequency modulation system and the cochlear implant was observed in 85% of the participants evaluated. After adjusting the gain of the frequency modulation receiver in the other participants, the devices showed transparency when the electroacoustic verification test was repeated. Patients also demonstrated better performance in speech perception in noise after the new adjustment; in these cases, electroacoustic transparency produced behavioral transparency. The suggested electroacoustic evaluation protocol was effective in the evaluation of transparency between the frequency modulation system and the cochlear implant. Performing the adjustment of

  1. European Train Control System: A Case Study in Formal Verification

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
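The kind of parameter constraint identified for ETCS can be illustrated numerically. Below is a deliberately simplified controllability check with invented parameter values; the constraints verified in the paper also account for reaction times and maximum acceleration:

```python
def may_keep_speed(v, d, b, eps):
    """Simplified ETCS-style controllability check: the train may keep
    its current speed only if it could still brake to a stop within the
    remaining movement authority d, with safety margin eps.
    v: speed (m/s), d: distance to authority limit (m),
    b: braking deceleration (m/s^2), eps: margin (m)."""
    return v * v / (2 * b) + eps < d

print(may_keep_speed(v=30.0, d=500.0, b=1.0, eps=10.0))  # True  (450 + 10 < 500)
print(may_keep_speed(v=40.0, d=500.0, b=1.0, eps=10.0))  # False (800 + 10 > 500)
```

The deductive verification in KeYmaera establishes that a constraint of this shape is not merely plausible but sharp: weakening it admits a trajectory that violates collision freedom.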

  2. Real-Time System Verification by Kappa-Induction

    Pike, Lee S.

    2005-01-01

    We report the first formal verification of a reintegration protocol for a safety-critical, fault-tolerant, real-time distributed embedded system. A reintegration protocol increases system survivability by allowing a node that has suffered a fault to regain state consistent with the operational nodes. The protocol is verified in the Symbolic Analysis Laboratory (SAL), where bounded model checking and decision procedures are used to verify infinite-state systems by k-induction. The protocol and its environment are modeled as synchronizing timeout automata. Because k-induction is exponential with respect to k, we optimize the formal model to reduce the size of k. Also, the reintegrator's event-triggered behavior is conservatively modeled as time-triggered behavior to further reduce the size of k and to make it invariant to the number of nodes modeled. A corollary is that a clique avoidance property is satisfied.
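The k-induction scheme itself is easy to demonstrate on an explicit finite-state system. The sketch below is a toy illustration, not the SAL infinite-state machinery: it checks the base case by unrolling k layers from the initial states and the inductive step by enumerating k-long runs of property-satisfying states:

```python
def k_induction(states, init, step, prop, k):
    """Explicit-state sketch of k-induction.
    Base: prop holds on the first k layers reachable from init.
    Step: every path of k consecutive prop-states leads to a prop-state.
    Returns True if the property is proved; False means 'not proved'."""
    frontier = set(init)
    for _ in range(k):
        if not all(prop(s) for s in frontier):
            return False
        frontier = {t for s in frontier for t in step(s)}
    paths = [[s] for s in states if prop(s)]
    for _ in range(k - 1):
        paths = [p + [t] for p in paths for t in step(p[-1]) if prop(t)]
    return all(prop(t) for p in paths for t in step(p[-1]))

# Toy system: 0 <-> 2 and 1 <-> 3; only 0 is initial; prop is "never 3".
step = lambda x: [{0: 2, 2: 0, 1: 3, 3: 1}[x]]
prop = lambda x: x != 3

print(k_induction([0, 1, 2, 3], [0], step, prop, k=1))  # False: 1-induction cannot prove it
print(k_induction([0, 1, 2, 3], [0], step, prop, k=2))  # True:  2-induction succeeds
```

Plain induction (k = 1) fails because the unreachable prop-state 1 steps to 3; with k = 2 no run of two prop-states ever reaches 3. This is why the paper works to keep k small: the number of candidate k-paths grows exponentially with k.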

  3. Technology verification phase. Dynamic isotope power system. Final report

    Halsey, D.G.

    1982-01-01

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance

  4. Automated data acquisition and analysis system for inventory verification

    Sorenson, R.J.; Kaye, J.H.

    1974-03-01

    A real-time system is proposed which would allow the CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate, a specific data handling system and vehicle were required: a Tracor Northern TN-11 data handling system including a PDP-11 minicomputer and a measurement vehicle similar to the Commission's Regulatory Region I van were used. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack-mount the data handling equipment offers a number of benefits, such as control of the equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable "cart" and use of a telephone link to a large computer center

  5. Technology verification phase. Dynamic isotope power system. Final report

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  6. Application of verification and validation on safety parameter display systems

    Thomas, N.C.

    1983-01-01

    Offers some explanation of how verification and validation (V&V) can support development and licensing of the Safety Parameter Display Systems (SPDS). Advocates that V&V can be more readily accepted within the nuclear industry if a better understanding exists of what the objectives of V&V are and should be. Includes a discussion regarding a reasonable balance of the costs and benefits of V&V as applied to the SPDS and to other digital systems. Represents the author's perception of the regulator's perspective based on background information and experience, and discussions with regulators about their current concerns and objectives. Suggests that the introduction of the SPDS into the Control Room is a first step towards growing dependency on the use of computers

  7. Verification and Validation of Embedded Knowledge-Based Software Systems

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provides assistance to the subject...

  8. 75 FR 4100 - Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations

    2010-01-26

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-04] Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations AGENCY: Office of the Chief Information Officer... Following Information Title of Proposal: Enterprise Income Verification (EIV) System- Debts Owed to PHAs and...

  9. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Kryszczuk Krzysztof

    2007-01-01

    We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using unimodal reliability information to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.
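
The fusion step described above can be sketched as a reliability-weighted combination of the two unimodal match scores. This is a minimal illustration under stated assumptions, not the paper's exact estimator: the function names, the [0, 1] normalization of scores and reliabilities, and the 0.5 decision threshold are all assumptions introduced here.

```python
def fuse_scores(face_score, speech_score, face_reliability, speech_reliability):
    """Reliability-weighted sum fusion of two unimodal match scores.

    Scores and reliabilities are assumed normalized to [0, 1]; the
    weighting scheme is illustrative, not the paper's estimator.
    """
    total = face_reliability + speech_reliability
    if total == 0:  # no usable modality: abstain at the midpoint
        return 0.5
    w_face = face_reliability / total
    w_speech = speech_reliability / total
    return w_face * face_score + w_speech * speech_score


def verify(face_score, speech_score, face_reliability, speech_reliability,
           threshold=0.5):
    """Accept the identity claim when the fused score clears the threshold."""
    return fuse_scores(face_score, speech_score,
                       face_reliability, speech_reliability) >= threshold
```

When one modality is judged unreliable (e.g. a noisy speech sample), its weight shrinks and the decision leans on the other modality, which is the intuition behind reliability-based fusion.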

  10. Dual-use benefits of the CTBT verification system

    Meade, C.E.F.

    1999-01-01

    Since its completion in September 1996, the CTBT has been signed by 151 countries. While awaiting the 44 ratifications required for entry into force, all of the nuclear powers have imposed unilateral moratoriums on nuclear test explosions. The end of these weapons development activities is often cited as the principal benefit of the CTBT. As the world begins to implement the Treaty, it has become clear that the development and operation of the CTBT verification system will provide a wide range of additional benefits if the data analysis products are made available for dual-purpose applications. As this paper describes, these could have economic and social implications, especially for countries with limited technical infrastructures. The applications include seismic monitoring, mineral exploration, and scientific and technical training

  11. A new approach for the verification of optical systems

    Siddique, Umair; Aravantinos, Vincent; Tahar, Sofiène

    2013-09-01

    Optical systems are increasingly used in microsystems, telecommunication, aerospace and the laser industry. Due to the complexity and sensitivity of optical systems, their verification poses many challenges to engineers. Traditionally, the analysis of such systems has been carried out by paper-and-pencil proofs and numerical computations. However, these techniques cannot provide perfectly accurate results due to the risk of human error and the inherent approximations of numerical algorithms. In order to overcome these limitations, we propose to use theorem proving (i.e., a computer-based technique that allows one to express mathematical expressions and reason about them by taking into account all the details of mathematical reasoning) as an alternative to computational and numerical approaches, to improve optical system analysis in a comprehensive framework. In particular, this paper provides a higher-order-logic (a language used to express mathematical theories) formalization of ray optics in the HOL Light theorem prover. Based on the multivariate analysis library of HOL Light, we formalize the notions of light ray and optical system (by defining medium interfaces, mirrors, lenses, etc.), i.e., we express these notions mathematically in the software. This allows us to derive general theorems about the behavior of light in such optical systems. In order to demonstrate the practical effectiveness, we present the stability analysis of a Fabry-Perot resonator.
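
The Fabry-Perot stability property that the paper proves formally can be stated numerically via the textbook g-parameter criterion for a two-mirror cavity. The HOL Light formalization reasons symbolically over ray-transfer matrices; the Python check below is only an illustrative stand-in, and the function name is an assumption.

```python
def is_stable_fabry_perot(length, r1, r2):
    """Standard g-parameter stability test for a two-mirror resonator.

    With g_i = 1 - L/R_i, the cavity supports stable ray oscillation
    exactly when 0 <= g1*g2 <= 1 (textbook criterion; the paper derives
    such properties as formally verified theorems rather than floats).
    """
    g1 = 1.0 - length / r1
    g2 = 1.0 - length / r2
    return 0.0 <= g1 * g2 <= 1.0
```

A numerical check like this can suffer from rounding near the stability boundary, which is precisely the kind of corner case a theorem-proving approach eliminates.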

  12. Development of film dosimetric measurement system for verification of RTP

    Chen Yong; Bao Shanglian; Ji Changguo; Zhang Xin; Wu Hao; Han Shukui; Xiao Guiping

    2007-01-01

    Objective: To develop a novel film dosimetry system based on a general laser scanner in order to verify patient-specific Radiotherapy Treatment Plans (RTP) in three-dimensional adaptable radiotherapy (3D ART) and intensity-modulated radiotherapy (IMRT). Methods: Several advanced methods, including film saturated development, wavelet filtering with multi-resolution thresholds and discrete Fourier reconstruction, are employed in this system to reduce the artifacts, noise and distortion induced by film digitizing with a general scanner; a set of coefficients derived from Monte Carlo (MC) simulation is adopted to correct the film over-response to low-energy scattered photons; a set of newly emerging criteria, including the γ index and the Normalized Agreement Test (NAT) method, is employed to quantitatively evaluate the agreement of 2D dose distributions between the results measured by the films and those calculated by the Treatment Planning System (TPS), so as to obtain straightforward presentations, displays and results with high accuracy and reliability. Results: Radiotherapy doses measured by the developed system agree within 2% with those measured by an ionization chamber and the VeriSoft Film Dosimetry System, and the quantitative evaluation indexes are within 3%. Conclusions: The developed system can be used to accurately measure radiotherapy dose and to make reliable quantitative evaluations for RTP dose verification. (authors)
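
The γ index mentioned above folds a dose-difference tolerance and a distance-to-agreement tolerance into a single per-point figure, with γ ≤ 1 counting as a pass. Here is a minimal 1-D sketch of that evaluation; the paper works on 2-D film planes, and the tolerances, units (mm, relative dose) and function names below are illustrative assumptions.

```python
import math


def gamma_index(measured, calculated, positions, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma evaluation in the style of Low et al.: for each measured
    point, find the minimum combined dose/distance disagreement against
    the calculated distribution.  Doses are relative (fraction of max);
    positions are in mm.  Returns the per-point gamma values."""
    gammas = []
    for pos_m, dose_m in zip(positions, measured):
        best = float("inf")
        for pos_c, dose_c in zip(positions, calculated):
            dist_term = ((pos_c - pos_m) / dist_tol) ** 2
            dose_term = ((dose_c - dose_m) / dose_tol) ** 2
            best = min(best, math.sqrt(dist_term + dose_term))
        gammas.append(best)
    return gammas


def pass_rate(gammas):
    """Fraction of points with gamma <= 1 (the usual acceptance test)."""
    return sum(1 for g in gammas if g <= 1.0) / len(gammas)
```

A real implementation would search a finely resampled calculated distribution rather than only the measured grid points, but the min-over-neighbors structure is the same.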

  13. ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM

    The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...

  14. Research on key technology of the verification system of steel rule based on vision measurement

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.

  15. Design Development and Verification of a System Integrated Modular PWR

    Kim, S.-H.; Kim, K. K.; Chang, M. H.; Kang, C. S.; Park, G.-C.

    2002-01-01

    An advanced PWR with a rated thermal power of 330 MW has been developed at the Korea Atomic Energy Research Institute (KAERI) for a dual purpose: seawater desalination and electricity generation. The conceptual design of SMART (System-Integrated Modular Advanced ReacTor) with a desalination system was completed in March of 1999. The basic design for the integrated nuclear desalination system is currently underway and will be finished by March of 2002. The SMART co-generation plant with the MED seawater desalination process is designed to supply forty thousand (40,000) tons of fresh water per day and ninety (90) MW of electricity to an area with a population of approximately one hundred thousand (100,000) or an industrialized complex. This paper describes the advanced design features adopted in the SMART design and also introduces the design and engineering verification program. In the beginning stage of SMART development, top-level requirements for safety and economics were imposed on the SMART design features. To meet the requirements, highly advanced design features enhancing safety, reliability, performance, and operability are introduced in the SMART design. SMART consists of proven KOFA (Korea Optimized Fuel Assembly), helical once-through steam generators, a self-controlled pressurizer, control element drive mechanisms, and main coolant pumps in a single pressure vessel. In order to enhance safety characteristics, the innovative design features adopted in the SMART system include low core power density, a large negative Moderator Temperature Coefficient (MTC), high natural circulation capability, an integral arrangement to eliminate large-break loss-of-coolant accidents, etc. The progression of emergency situations into accidents is prevented with a number of advanced engineered safety features such as a passive residual heat removal system, a passive emergency core cooling system, a safeguard vessel, and passive containment over-pressure protection. The preliminary

  16. A GIS support system for declaration and verification

    Poucet, A.; Contini, S.; Bellezza, F.

    2001-01-01

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declarations and verification activities. The information dealt with is that requested by both the INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. The main characteristics of SIT are: capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; selected access to open source information via the Internet; capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; capability to perform aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; capability to easily add and run external models, e.g. for material data accounting, completeness checks, air dispersion modelling, material flow graph generation, and to describe the results in graphical form; capability to use a Geo-positioning System (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, in order to allow open source information retrieval from geographical maps.
The paper will describe the main features of SIT and the advantages of

  18. System design and verification process for LHC programmable trigger electronics

    Crosetto, D

    1999-01-01

    The rapid evolution of electronics has made it essential to design systems in a technology-independent form that will permit their realization in any future technology. This article describes two practical projects that have been developed for fast, programmable, scalable, modular electronics for the first-level trigger of Large Hadron Collider (LHC) experiments at CERN, Geneva. In both projects, one for the front-end electronics and the second for executing first-level trigger algorithms, the whole system requirements were constrained to two types of replicated components. The overall problem is described, the 3D-Flow design is introduced as a novel solution, and current solutions to the problem are described and compared with the 3D-Flow solution. The design/verification methodology proposed allows the user's real-time system algorithm to be verified down to the gate-level simulation on a technology-independent platform, thus yielding the design for a system that can be implemented with any technology at ...

  19. Verification and Validation of Flight-Critical Systems

    Brat, Guillaume

    2010-01-01

    For the first time in many years, the NASA budget presented to Congress calls for a focused effort on the verification and validation (V&V) of complex systems. This is mostly motivated by the results of the VVFCS (V&V of Flight-Critical Systems) study, which should materialize as a concrete effort under the Aviation Safety program. This talk will present the results of the study, from the requirements coming out of discussions with the FAA and the Joint Planning and Development Office (JPDO), to a technical plan addressing the issue, and its proposed current and future V&V research agenda, which will be addressed by NASA Ames, Langley, and Dryden as well as external partners through NASA Research Announcements (NRA) calls. This agenda calls for pushing V&V earlier in the life cycle and taking advantage of formal methods to increase safety and reduce the cost of V&V. I will present the on-going research work (especially the four main technical areas: Safety Assurance, Distributed Systems, Authority and Autonomy, and Software-Intensive Systems), possible extensions, and how VVFCS plans on grounding the research in realistic examples, including an intended V&V test-bench based on an Integrated Modular Avionics (IMA) architecture and hosted by Dryden.

  20. Tracer verification and monitoring of containment systems (II)

    Williams, C.V.; Dunn, S.D.; Lowry, W.E.

    1997-01-01

    A tracer verification and monitoring system, SEAtrace™, has been designed and field tested which uses gas tracers to evaluate, verify, and monitor the integrity of subsurface barriers. This is accomplished using an automatic, rugged, autonomous monitoring system combined with an inverse optimization code. A gaseous tracer is injected inside the barrier and an array of wells outside the barrier is monitored. When the tracer gas is detected, a global optimization code is used to calculate the leak parameters, including the leak size, its location, and when the leak began. The multipoint monitoring system operates in real time, can be used to measure both the tracer gas and soil vapor contaminants, and is capable of unattended operation for long periods of time (months). The global optimization code searches a multi-dimensional "space" to find the best fit for all of the input parameters. These parameters include tracer gas concentration histories from multiple monitoring points, medium properties, the barrier location, and the source concentration. SEAtrace™ does not attempt to model all of the nuances associated with multi-phase, multi-component flow; rather, the inverse code uses a simplistic forward model which provides results that are reasonably accurate. The system has calculated leak locations to within 0.5 meters and leak radii to within 0.12 meters
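
The inverse-optimization idea can be illustrated with a toy forward model and a brute-force search over candidate leak positions. This is a sketch under stated assumptions: the distance-based signal model, the grid search, and all names below are stand-ins, and SEAtrace's actual transport physics, global optimizer, and time-of-leak parameter are not reproduced.

```python
def predict(leak_xy, sensors, strength=1.0):
    """Toy forward model: the tracer signal at a sensor falls off with
    squared distance from the leak (a stand-in for a transport model)."""
    sx, sy = leak_xy
    return [strength / (1.0 + (x - sx) ** 2 + (y - sy) ** 2)
            for x, y in sensors]


def locate_leak(readings, sensors, grid=21, extent=5.0):
    """Grid-search inversion: pick the candidate leak position whose
    predicted sensor readings best match the observations (least squares)."""
    best_xy, best_err = None, float("inf")
    for i in range(grid):
        for j in range(grid):
            xy = (-extent + 2 * extent * i / (grid - 1),
                  -extent + 2 * extent * j / (grid - 1))
            pred = predict(xy, sensors)
            err = sum((p - r) ** 2 for p, r in zip(pred, readings))
            if err < best_err:
                best_xy, best_err = xy, err
    return best_xy
```

A production inversion would use a proper global optimizer and fit leak size and onset time alongside position, but the misfit-minimization structure is the same.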

  1. Development of NSSS Control System Performance Verification Tool

    Sohn, Suk Whun; Song, Myung Jun

    2007-01-01

    Thanks to its many control systems and components, a nuclear power plant can be operated safely and efficiently under transient conditions as well as at steady state. If a fault or an error exists in the control systems, the plant may experience unwanted and unexpected transients. Therefore, the performance of these control systems and components should be completely verified through the power ascension tests of the startup period. However, there are many occasions when control components must be replaced, control logic modified, or setpoints changed. It is important to be able to verify the performance of a changed control system without redoing the power ascension tests. Up to now, a simulation method using the computer codes employed in the design of nuclear power plants has commonly been used to verify performance. But if the hardware characteristics of the control system are changed, or the software in the control system has an unexpected fault or error, this simulation method is not effective for verifying the performance of the changed control system. Many tests related to V and V (Verification and Validation) are performed in the factory as well as in the plant to eliminate errors which might be introduced during hardware manufacturing or software coding. Experience shows that these field tests and the simulation method are insufficient to guarantee the performance of a changed control system. Two unexpected transients that occurred during the YGN 5 and 6 startup period are good examples of this fact: one occurred at 50% reactor power and caused a reactor trip; the other occurred during the 70% loss-of-main-feedwater-pump test and caused an excess turbine runback

  2. Secure stand alone positive personnel identity verification system (SSA-PPIV)

    Merillat, P.D.

    1979-03-01

    The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility, and the verification function is performed in the absence of security personnel. Security is achieved primarily by means of data encryption on a magnetic-stripe badge. Several operational configurations are discussed, and the advantages and disadvantages of this system compared to a system driven by a central computer are detailed
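
The stand-alone idea (any reader holding the key can check a badge offline, with no central computer) can be sketched with a keyed MAC over the badge record. This is a present-day stand-in under stated assumptions: the 1979 system used symmetric encryption of the stripe data, and the record layout, field names, and HMAC-SHA-256 below are all illustrative choices, not the original design.

```python
import hashlib
import hmac


def make_badge_stripe(user_id, features, key):
    """Write an authenticated record for the badge's magnetic stripe.
    The keyed MAC lets a stand-alone reader verify the record offline."""
    record = f"{user_id}|{features}".encode()
    tag = hmac.new(key, record, hashlib.sha256).hexdigest()
    return record + b"|" + tag.encode()


def verify_badge_stripe(stripe, key):
    """Recompute the MAC at the reader and compare in constant time."""
    record, _, tag = stripe.rpartition(b"|")
    expected = hmac.new(key, record, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(tag, expected)
```

Any tampering with the stripe (a changed user ID, altered biometric features) invalidates the tag, so verification needs only the shared key, not a database lookup.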

  3. Runtime verification of embedded real-time systems.

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability and thus facilitate applications of the framework in both the prototyping and the post-deployment phases of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and in the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
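
To give a flavor of such observer algorithms, here is a software sketch of the simplest time-bounded past operator, "psi held at least once within the last b steps", evaluated on-line in O(1) space. The class name is an assumption; the paper's time-bounded Since observer and its hardware realization are considerably more involved.

```python
class BoundedOnce:
    """On-line observer for the ptMTL operator P_[0,b] psi
    ("psi held at least once within the last b time steps").

    Only the timestamp of the most recent psi is stored, so the
    observer runs in constant space, in the spirit of the paper's
    synthesizable hardware observer blocks."""

    def __init__(self, bound):
        self.bound = bound
        self.last_psi = None  # time of the most recent psi, if any
        self.now = -1

    def step(self, psi):
        """Feed one sample of psi; return the operator's verdict now."""
        self.now += 1
        if psi:
            self.last_psi = self.now
        return (self.last_psi is not None
                and self.now - self.last_psi <= self.bound)
```

Each clock tick the monitor consumes one sample and emits one verdict, so it can run in parallel to the system-under-test without interfering with it.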

  4. System verification and validation report for the TMAD code

    Finfrock, S.H.

    1995-01-01

    This document serves as the Verification and Validation Report for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to facilitate the interpretation of moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, can also be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code will then interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data. The primary purpose of this document is to provide the results of the system testing and the conclusions based thereon. The results of the testing process are documented in the body of the report. Appendix A gives the test plan, including the test procedures used in conducting the tests. Appendix B lists the input data required to conduct the tests, and Appendices C and D list the numerical results of the tests
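
The interpolate-and-match idea can be sketched in miniature: interpolate the benchmark detector response over one variable, then scan for the value that best reproduces a measured response. TMAD itself interpolates over two independent variables (anomaly size and moisture content) and matches several detectors at once; the one-variable table, step count, and function names below are illustrative assumptions.

```python
def interp_response(moisture, table):
    """Linearly interpolate the benchmark detector response at a given
    moisture content.  `table` is a sorted list of (moisture, response)
    benchmark points, standing in for the TMAD data library."""
    if moisture <= table[0][0]:
        return table[0][1]
    if moisture >= table[-1][0]:
        return table[-1][1]
    for (m0, r0), (m1, r1) in zip(table, table[1:]):
        if m0 <= moisture <= m1:
            t = (moisture - m0) / (m1 - m0)
            return r0 + t * (r1 - r0)


def best_match(measured, table, steps=101):
    """Scan candidate moisture values and return the one whose
    interpolated response is closest to the measured detector response."""
    lo, hi = table[0][0], table[-1][0]
    candidates = [lo + (hi - lo) * i / (steps - 1) for i in range(steps)]
    return min(candidates,
               key=lambda m: abs(interp_response(m, table) - measured))
```

With responses from two or more detectors, the scan would minimize the combined residual over all detectors, which is how the closest anomaly/moisture combination is selected.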

  5. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Javidi, Bahram

    1997-01-01

    Document fraud, including unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured with a CCD camera) and a new hologram synthesized using commercially available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. 
The proposed optical processing device is designed to identify both the random phase mask and the

  6. Quantitative dosimetric verification of an IMRT planning and delivery system

    Low, D.A.; Mutic, S.; Dempsey, J.F.; Gerber, R.L.; Bosch, W.R.; Perez, C.A.; Purdy, J.A.

    1998-01-01

    Background and purpose: The accuracy of dose calculation and delivery of a commercial serial tomotherapy treatment planning and delivery system (Peacock, NOMOS Corporation) was experimentally determined. Materials and methods: External beam fluence distributions were optimized and delivered to test treatment plan target volumes, including three with cylindrical targets with diameters ranging from 2.0 to 6.2 cm and lengths of 0.9 through 4.8 cm, one using three cylindrical targets and two using C-shaped targets surrounding a critical structure, each with different dose distribution optimization criteria. Computer overlays of film-measured and calculated planar dose distributions were used to assess the dose calculation and delivery spatial accuracy. A 0.125 cm³ ionization chamber was used to conduct absolute point dosimetry verification. Thermoluminescent dosimetry chips, a small-volume ionization chamber and radiochromic film were used as independent checks of the ion chamber measurements. Results: Spatial localization accuracy was found to be better than ±2.0 mm in the transverse axes (with one exception of 3.0 mm) and ±1.5 mm in the longitudinal axis. Dosimetric verification using single slice delivery versions of the plans showed that the relative dose distribution was accurate to ±2% within and outside the target volumes (in high dose and low dose gradient regions) with a mean and standard deviation for all points of -0.05% and 1.1%, respectively. The absolute dose per monitor unit was found to vary by ±3.5% of the mean value due to the lack of consideration for leakage radiation and the limited scattered radiation integration in the dose calculation algorithm. To deliver the prescribed dose, adjustment of the monitor units by the measured ratio would be required. Conclusions: The treatment planning and delivery system offered suitably accurate spatial registration and dose delivery of serial tomotherapy generated dose distributions. The quantitative dose

  7. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1999-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. In order to smooth the flow from modeling in CPN to mathematical proof in PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  9. Wu’s Characteristic Set Method for SystemVerilog Assertions Verification

    Xinyan Gao

    2013-01-01

    We propose a verification solution based on the characteristic set of Wu's method for SystemVerilog assertion checking of digital circuit systems. We define a suitable subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on algebraic representations using the characteristic set of a polynomial system. This symbolic algebraic approach is a useful supplement to existing verification methods based on simulation.
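
The polynomial modeling step can be illustrated over GF(2): AND becomes multiplication, XOR becomes addition, NOT becomes 1 + x (all mod 2), and an assertion holds exactly when its polynomial vanishes on every input point. The exhaustive evaluation below is only a stand-in for the symbolic characteristic-set reduction the paper performs, and the half-adder example and names are assumptions.

```python
from itertools import product

# Gate-level polynomial modeling over GF(2):
#   AND -> x*y,  XOR -> x + y,  NOT -> 1 + x   (all mod 2)


def half_adder_sum(a, b):
    """Circuit description: sum = a XOR b, as a GF(2) polynomial."""
    return (a + b) % 2


def half_adder_carry(a, b):
    """Circuit description: carry = a AND b, as a GF(2) polynomial."""
    return (a * b) % 2


def both_high(a, b):
    """Assertion polynomial for 'sum and carry are never both 1':
    the product sum*carry must vanish on every input."""
    return (half_adder_sum(a, b) * half_adder_carry(a, b)) % 2


def assertion_holds(n_inputs, poly):
    """True iff the assertion polynomial evaluates to 0 over GF(2)^n.
    A characteristic-set engine would establish this by symbolic
    reduction instead of enumeration."""
    return all(poly(*point) == 0 for point in product((0, 1), repeat=n_inputs))
```

Enumeration is exponential in the number of inputs, which is precisely why the paper replaces it with algebraic reduction of the polynomial system.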

  10. VERIFICATION OF THE FOOD SAFETY MANAGEMENT SYSTEM IN DEEP FROZEN FOOD PRODUCTION PLANT

    Peter Zajác

    2010-07-01

    This work presents the verification of a food safety management system for deep-frozen food production. The main emphasis is on creating a set of verification questions within the articles of standard STN EN ISO 22000:2006 and on assessing the effectiveness of the food safety management system. Information was acquired from scientific literature sources and pointed out the importance of the implementation and upkeep of an effective food safety management system. doi:10.5219/28

  11. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
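
At its core, the object being verified is a 2-of-3 majority voter replicated across the design. Below is a sketch of the voter together with an exhaustive check of its single-fault masking property; this brute-force bit-level check is an illustration only, not the paper's verification method, which targets TMR insertion in synthesized netlists.

```python
def majority_vote(a, b, c):
    """Bitwise 2-of-3 majority voter, the core of a TMR-protected signal."""
    return (a & b) | (a & c) | (b & c)


def check_tmr_masks_single_fault(width=4):
    """Verify that the voter masks any single corrupted replica: for every
    value and every possible corruption of one replica, the voted output
    still equals the original value."""
    for value in range(2 ** width):
        for bad in range(2 ** width):  # arbitrary single-replica corruption
            assert majority_vote(bad, value, value) == value
            assert majority_vote(value, bad, value) == value
            assert majority_vote(value, value, bad) == value
    return True
```

The subtlety the paper addresses is not the voter itself but confirming that synthesis and optimization have not merged the three replicas or bypassed a voter, which this functional check alone cannot reveal.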

  12. AVNG System Software-Attribute Verification System with Information Barriers for Mass Isotopic Measurements

    Elmont, T.H.; Langner, Diana C.; MacArthur, D.W.; Mayo, D.R.; Smith, M.K.; Modenov, A.

    2005-01-01

    This report describes the software development for the plutonium attribute verification system (AVNG). A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are illustrated with software structural flowcharts, a measurement-system state diagram, and a description of the software. The current status of the AVNG software development is described.

  13. Monte Carlo systems used for treatment planning and dose verification

    Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany); Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama); Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)

    2017-04-15

    General-purpose radiation transport Monte Carlo codes have been used to estimate the absorbed dose distribution in external photon and electron beam radiotherapy patients for several decades. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation time required. To take advantage of radiation transport Monte Carlo codes in routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and the simulation time required to compute a set of benchmark calculations. (orig.)
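
As a rough illustration of the stochastic approach these dose verification systems build on (a toy sketch, not any of the codes surveyed in the record above; the attenuation coefficient, geometry, and function name are all invented for illustration):

```python
import random

def mc_depth_deposition(mu=0.2, depth_bins=10, bin_width=1.0,
                        n_photons=100_000, seed=1):
    """Crude 1D Monte Carlo: each photon travels an exponentially
    distributed free path (attenuation coefficient mu per unit depth)
    and deposits its energy locally at the first interaction site."""
    rng = random.Random(seed)
    dose = [0.0] * depth_bins
    for _ in range(n_photons):
        path = rng.expovariate(mu)        # sampled distance to first interaction
        b = int(path // bin_width)
        if b < depth_bins:                # interactions beyond the phantom are ignored
            dose[b] += 1.0
    return [d / n_photons for d in dose]  # fraction of photons interacting per bin

profile = mc_depth_deposition()
# the deposition profile falls off roughly exponentially with depth
```

The statistical noise in `profile` shrinks as `n_photons` grows, which is exactly the accuracy/computation-time trade-off the review discusses.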

  14. 75 FR 38765 - Domestic Origin Verification System Questionnaire and Regulations Governing Inspection and...

    2010-07-06

    ..., facility assessment services, certifications of quantity and quality, import product inspections, and... control number. These include export certification, inspection of section 8e import products, and...] Domestic Origin Verification System Questionnaire and Regulations Governing Inspection and Certification of...

  15. In pursuit of carbon accountability: the politics of REDD+ measuring, reporting and verification systems

    Gupta, A.; Lövbrand, E.; Turnhout, E.; Vijge, M.J.

    2012-01-01

    This article reviews critical social science analyses of carbon accounting and monitoring, reporting and verification (MRV) systems associated with reducing emissions from deforestation, forest degradation and conservation, sustainable use and enhancement of forest carbon stocks (REDD+). REDD+ MRV

  16. Comparative Analysis of Speech Parameters for the Design of Speaker Verification Systems

    Souza, A

    2001-01-01

    Speaker verification systems are basically composed of three stages: feature extraction, feature processing and comparison of the modified features from speaker voice and from the voice that should be...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: NEW CONDENSATOR, INC.--THE CONDENSATOR DIESEL ENGINE RETROFIT CRANKCASE VENTILATION SYSTEM

    EPA's Environmental Technology Verification Program has tested New Condensator Inc.'s Condensator Diesel Engine Retrofit Crankcase Ventilation System. Brake specific fuel consumption (BSFC), the ratio of engine fuel consumption to the engine power output, was evaluated for engine...

  18. Results of verification of automatic exposure control in X-ray equipment with CR systems

    Ruiz Manzano, P.; Rivas Ballarin, M. A.; Ortega Pardina, P.; Villa Gazulla, D.; Calvo Carrillo, S.; Canellas Anoz, M.; Millan Cebrian, E.

    2013-01-01

    Following the 2012 entry into force of the new Spanish radiology quality control protocol, this paper lists and discusses the results obtained after verification of the automatic exposure control in computed radiography systems. (Author)

  19. RRB's SVES Input File - Post Entitlement State Verification and Exchange System (PSSVES)

    Social Security Administration — Several PSSVES request files are transmitted to SSA each year for processing in the State Verification and Exchange System (SVES). This is a first step in obtaining...

  20. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geolocated Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning; instrument and product calibration; data quality support and monitoring; and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs), and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  1. Television system for verification and documentation of treatment fields during intraoperative radiation therapy

    Fraass, B.A.; Harrington, F.S.; Kinsella, T.J.; Sindelar, W.F.

    1983-01-01

    Intraoperative radiation therapy (IORT) involves direct treatment of tumors or tumor beds with large single doses of radiation. The verification of the area to be treated before irradiation and the documentation of the treated area are critical for IORT, just as for other types of radiation therapy. A television system which allows the target area to be directly imaged immediately before irradiation has been developed. Verification and documentation of treatment fields has made the IORT television system indispensable

  2. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Kuo-Kun Tseng

    2016-01-01

    Full Text Available With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy while consuming only a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application and overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and on our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle heart rate changes during sports. It should be the first work to address the issue of sports in ECG verification.
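
The mean-interval idea and the multiple-state handling of heart-rate changes can be sketched roughly as follows (a simplification of the record above, not the paper's actual algorithm; the tolerance and function names are assumptions):

```python
def mean_rr_interval(r_peaks_ms):
    """Average spacing between successive R peaks, in milliseconds."""
    gaps = [b - a for a, b in zip(r_peaks_ms, r_peaks_ms[1:])]
    return sum(gaps) / len(gaps)

def verify_user(enrolled_means_ms, candidate_peaks_ms, tolerance_ms=60):
    """Accept the candidate if their mean R-R interval is within
    tolerance of ANY enrolled state (e.g. resting vs. exercising),
    mirroring the multiple-state idea for heart-rate changes."""
    candidate = mean_rr_interval(candidate_peaks_ms)
    return any(abs(candidate - m) <= tolerance_ms for m in enrolled_means_ms)

# Enrolled templates: resting (~800 ms) and exercising (~500 ms) states.
print(verify_user([800, 500], [0, 790, 1585, 2380]))  # mean gap ~793 ms -> True
```

A single stored mean per state is what keeps the flash-memory footprint small on a microcontroller.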

  3. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Mingyu Park

    2012-12-01

    Full Text Available Testing is the de-facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. Model checking, on the other hand, is a powerful technique that supports comprehensiveness and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.
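
The testing-versus-model-checking trade-off the record above exploits can be illustrated on a toy system (a generic sketch, not Trampoline code): depth-bounded exhaustive exploration, in the style of a model checker, either finds an assertion violation or proves its absence up to that depth, whereas random testing gives no such guarantee:

```python
import itertools

def step(counter, event):
    """Toy transition function: a bounded event counter."""
    return counter + 1 if event == "inc" else max(0, counter - 1)

def find_violation(depth):
    """Enumerate every event sequence up to `depth` (model-checking
    style exhaustiveness) and report the first sequence that drives
    the system into the illegal state 3 (an injected 'bug')."""
    for seq in itertools.product(["inc", "dec"], repeat=depth):
        c = 0
        for ev in seq:
            c = step(c, ev)
            if c == 3:
                return seq      # counterexample trace
    return None                 # property holds up to this depth

print(find_violation(2))  # None: the bug is unreachable in 2 steps
print(find_violation(3))  # ('inc', 'inc', 'inc')
```

Environment generation in the paper plays the role of `itertools.product` here: it constrains which event sequences the checker must consider.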

  4. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
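
For context, the basic construct whose correct insertion must be verified is a majority voter over three redundant copies of a signal. A behavioral sketch in Python (illustrating the standard 2-of-3 voter, not the authors' verification tool):

```python
def majority(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority voter: the output follows any two
    agreeing replicas, masking a fault in the third copy."""
    return (a & b) | (b & c) | (a & c)

# A single-event upset flips a bit in one replica; the voter masks it:
golden = 0b1011
assert majority(golden, golden, golden ^ 0b0100) == golden
```

TMR insertion verification must confirm that every protected signal is driven through such a voter and that the three replicas are genuinely independent; a synthesis tool that optimizes two replicas into one silently defeats the scheme.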

  5. Verification and synthesis of optimal decision strategies for complex systems

    Summers, S. J.

    2013-07-01

    Complex systems make a habit of disagreeing with the mathematical models strategically designed to capture their behavior. A recursive process ensues where data is used to gain insight into the disagreement. A simple model may give way to a model with hybrid dynamics. A deterministic model may give way to a model with stochastic dynamics. In many cases, the modeling framework that sufficiently characterises the system is both hybrid and stochastic; these systems are referred to as stochastic hybrid systems. This dissertation considers the stochastic hybrid system framework for modeling complex systems and provides mathematical methods for analysing, and synthesizing decision laws for, such systems. We first propose a stochastic reach-avoid problem for discrete time stochastic hybrid systems. In particular, we present a dynamic programming based solution to a probabilistic reach-avoid problem for a controlled discrete time stochastic hybrid system. We address two distinct interpretations of the reach-avoid problem via stochastic optimal control. In the first case, a sum-multiplicative cost function is introduced along with a corresponding dynamic recursion that quantifies the probability of hitting a target set at some point during a finite time horizon, while avoiding an unsafe set at all preceding time steps. In the second case, we introduce a multiplicative cost function and a dynamic recursion that quantifies the probability of hitting a target set at the terminal time, while avoiding an unsafe set at all preceding time steps. In each case, optimal reach-avoid control policies are derived as the solution to an optimal control problem via dynamic programming. We next introduce an extension of the reach-avoid problem where we consider the verification of discrete time stochastic hybrid systems when there exists uncertainty in the reachability specifications themselves. A sum multiplicative cost function is introduced along with a corresponding dynamic recursion
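
The sum-multiplicative recursion described in the record above can be sketched for a finite-state Markov chain (a simplified, finite-state rendering of the dissertation's dynamic program; the state space and transition matrix below are invented for illustration):

```python
def reach_avoid_probability(P, target, unsafe, horizon, start):
    """Probability of hitting `target` within `horizon` steps while
    avoiding `unsafe` at all earlier steps, via backward recursion:
      V_N(x) = 1[x in target]
      V_k(x) = 1[x in target] + 1[x safe, not target] * sum_y P[x][y] V_{k+1}(y)
    """
    n = len(P)
    V = [1.0 if x in target else 0.0 for x in range(n)]
    for _ in range(horizon):
        V = [1.0 if x in target
             else 0.0 if x in unsafe
             else sum(P[x][y] * V[y] for y in range(n))
             for x in range(n)]
    return V[start]

# 4-state chain: 0 is an unsafe trap, 3 is the target, 1 and 2 are transient.
P = [[1.0, 0.0, 0.0, 0.0],
     [0.2, 0.5, 0.3, 0.0],
     [0.1, 0.0, 0.4, 0.5],
     [0.0, 0.0, 0.0, 1.0]]
print(round(reach_avoid_probability(P, {3}, {0}, horizon=10, start=1), 4))
```

In the controlled setting of the dissertation, the sum over `y` is additionally maximized over the available control inputs at each step, yielding the optimal reach-avoid policy.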

  7. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems

  9. Compositional Verification of Interlocking Systems for Large Stations

    Fantechi, Alessandro; Haxthausen, Anne Elisabeth; Macedo, Hugo Daniel dos Santos

    2017-01-01

    Verification of safety properties of interlocking systems is challenging for networks of large size due to the exponential computation time and resources needed. Some recent attempts to address this challenge adopt a compositional approach, targeted to track layouts that are easily decomposable into sub-networks such that a route is almost fully contained in a sub-network: in this way, granting access to a route is essentially a decision local to the sub-network, and the interfaces with the rest of the network easily abstract away less interesting details related to the external world. Following up on previous work, where we defined a compositional verification method for sub-networks that are independent to some degree, we study how the division of a complex network into sub-networks, using stub elements to abstract all the routes that are common between sub-networks, may still guarantee compositionality of verification of safety properties.

  10. A knowledge-base verification of NPP expert systems using extended Petri nets

    Kwon, Il Won; Seong, Poong Hyun

    1995-01-01

    The verification phase of a knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assumed error incidence to be limited to rule pairs only. In addition, we consider certainty factors in checking, because most knowledge bases include certainty factors.
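
Independently of the Petri-net machinery, the chained-error idea that COKEP targets can be illustrated with a minimal rule-base checker (a generic sketch, not the COKEP algorithm; the rule encoding with a leading `!` for negation is an assumption made for illustration):

```python
from collections import defaultdict

def find_conflicts(rules):
    """Detect chained inconsistency: starting from each premise, follow
    rules transitively and report any fact that is derived both
    positively and negatively. Rules are (premise, conclusion) pairs;
    a leading '!' marks a negated fact."""
    graph = defaultdict(set)
    for premise, conclusion in rules:
        graph[premise].add(conclusion)
    conflicts = []
    for start in list(graph):
        derived, frontier = set(), [start]
        while frontier:                      # transitive closure from `start`
            fact = frontier.pop()
            for nxt in graph.get(fact, ()):
                if nxt not in derived:
                    derived.add(nxt)
                    frontier.append(nxt)
        for fact in derived:
            if fact.startswith("!") and fact[1:] in derived:
                conflicts.append((start, fact[1:]))
    return conflicts

rules = [("high_temp", "alarm"), ("alarm", "shutdown"),
         ("high_temp", "sensor_fault"), ("sensor_fault", "!shutdown")]
print(find_conflicts(rules))  # high_temp derives both shutdown and !shutdown
```

A pairwise check of the rules above finds nothing wrong; only following the chains exposes the contradiction, which is exactly why restricting verification to rule pairs is insufficient.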

  11. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power-system with an organic working fluid. The technology verification phase's purposes are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported

  12. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Saad M. Darwish

    2016-12-01

    Full Text Available The signature of a person is one of the most popular and legally accepted behavioral biometrics, providing a secure means of verification and personal identification in many applications such as financial, commercial and legal transactions. The objective of a signature verification system is to discriminate between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features: it contains diacritics, ligatures, and overlapping. Because no dynamic information from the signature's writing process is available offline, it is more difficult to obtain high verification accuracy. This paper addresses this difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple-feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve the verification accuracy. State-of-the-art techniques adopt fuzzy sets to describe the properties of the extracted features to handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features, to deal with the ambiguity of the questioned document examiner's judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).
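
The multiple-feature fusion step can be sketched as a weighted combination of per-feature similarity scores with a fuzzy-style acceptance band (illustrative weights, thresholds, and feature names only; this is not the paper's trained model):

```python
def fused_score(scores, weights):
    """Combine per-feature similarities (each in [0, 1]) into one score."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def verify_signature(scores, weights, accept=0.75, reject=0.45):
    """Three-way decision instead of a single hard cut, mimicking the
    examiner's graded judgment of similarity."""
    s = fused_score(scores, weights)
    if s >= accept:
        return "genuine"
    if s <= reject:
        return "forged"
    return "refer to examiner"   # ambiguous band between the thresholds

# Hypothetical shape, diacritics, and overlap similarities for a questioned signature:
print(verify_signature([0.9, 0.8, 0.7], weights=[3, 2, 1]))  # -> genuine
```

Weighting the fused score lets language-specific cues such as diacritics and ligatures contribute without dominating the overall decision.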

  13. Development of a tool for knowledge base verification of expert system based on Design/CPN

    Kim, Jong Hyun

    1998-02-01

    Verification is necessary work in developing a reliable expert system: a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge base of a large and complex system, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. By applying the tool to the knowledge base of a nuclear power plant, it is shown that it can successfully check most of the anomalies that can occur in a knowledge base

  14. Data-driven property verification of grey-box systems by Bayesian experiment design

    Haesaert, S.; Van den Hof, P.M.J.; Abate, A.

    2015-01-01

    A measurement-based statistical verification approach is developed for systems with partly unknown dynamics. These grey-box systems are subject to identification experiments which, new in this contribution, enable accepting or rejecting system properties expressed in a linear-time logic. We employ a

  15. REQUIREMENT VERIFICATION AND SYSTEMS ENGINEERING TECHNICAL REVIEW (SETR) ON A COMMERCIAL DERIVATIVE AIRCRAFT (CDA) PROGRAM

    2017-09-01

    REQUIREMENT VERIFICATION AND SYSTEMS ENGINEERING TECHNICAL REVIEW (SETR) ON A COMMERCIAL DERIVATIVE AIRCRAFT (CDA) PROGRAM, by Theresa L. Thomas, September 2017. Abstract: The Naval Air Systems Command (NAVAIR) systems engineering technical review (SETR) process does not

  16. Ongoing Work on Automated Verification of Noisy Nonlinear Systems with Ariadne

    Geretti, Luca; Bresolin, Davide; Collins, Pieter; Zivanovic Gonzalez, Sanja; Villa, Tiziano

    2017-01-01

    Cyber-physical systems (CPS) are hybrid systems that commonly consist of a discrete control part that operates in a continuous environment. Hybrid automata are a convenient model for CPS suitable for formal verification. The latter is based on reachability analysis of the system to trace its hybrid

  17. Digital system verification a combined formal methods and simulation framework

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5

  18. Cooling Tower (Evaporative Cooling System) Measurement and Verification Protocol

    Kurnik, Charles W. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Boyd, Brian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lewis, Taylor [Colorado Energy Office, Denver, CO (United States)

    2017-12-05

    This measurement and verification (M and V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with cooling tower efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M and V plan, and details the procedures to use to determine water savings.

  19. Formal Development and Verification of Railway Control Systems

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    The formal development and verification of railway control systems has traditionally been done applying conventional methods, where requirements and designs are described using natural language, diagrams and pseudo code, and the verification of requirements has been done by code inspection and non-exhaustive testing. These techniques are not sufficient, leading to errors and an ineffective ... for Strategic Research. The work is affiliated with a number of partners: DTU Compute, DTU Transport, DTU Management, DTU Fotonik, Bremen University, Banedanmark, Trafikstyrelsen, DSB, and DSB S-tog. More information about the RobustRails project is available at http://www.dtu.dk/subsites/robustrails/English.aspx

  20. Automated Image Acquisition System for the Verification of Copper-Brass Seal Images

    Stringa, E.; Bergonzi, C.; Littmann, F.; ); Marszalek, Y.; Tempesta, S.; )

    2015-01-01

    This paper describes a system for the verification of copper-brass seals realized by JRC according to DG ENER requirements. DG ENER processes about 20,000 metal seals per year. The verification of a metal seal consists in visually checking the identity of a removed seal. The identity of a copper-brass seal is defined by a random stain pattern realized by the seal producer, together with random scratches engraved when the seals are initialized ('seal production'). In order to verify that a seal returned from the field is the expected one, its pattern is compared with an image taken during seal production. Formerly, seal initialization and verification were very laborious tasks, as seal pictures were acquired with a camera one by one in both the initialization and verification stages. During initialization, the Nuclear Safeguards technicians had to place new seals one by one under a camera and acquire the related reference images. During verification, the technician had to take used seals and place them one by one under a camera to take new pictures. The new images were presented to the technicians without any preprocessing, and the technicians had to recognize the seal. The new station described in this paper has an automated image acquisition system allowing seals to be processed easily in batches of 100. To simplify verification, software automatically centres and rotates the newly acquired seal image so that it perfectly overlaps with the reference image acquired during the production phase. The new system significantly speeds up seal production and helps particularly with the demanding task of seal verification. As a large part of the seals is dealt with by a joint Euratom-IAEA team, the IAEA directly profits from this development. The new tool has been in routine use since mid-2013. (author)

  1. The inverse method parametric verification of real-time embedded systems

    André , Etienne

    2013-01-01

    This book introduces state-of-the-art verification techniques for real-time embedded systems, based on the inverse method for parametric timed automata. It reviews popular formalisms for the specification and verification of timed concurrent systems and, in particular, timed automata as well as several extensions such as timed automata equipped with stopwatches, linear hybrid automata and affine hybrid automata.The inverse method is introduced, and its benefits for guaranteeing robustness in real-time systems are shown. Then, it is shown how an iteration of the inverse method can solv

  2. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers make use of verification techniques by writing test benches, which involves various stages of verification activities at register-transfer level (RTL), gate level, and place-and-route. Writing the test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results. Furthermore, performing verification at each stage is a major bottleneck that demands many activities and much time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to an FPGA-based I and C system in an NPP that simultaneously verifies all design modules using MATLAB/Simulink HDL co-simulation models. We introduced and discussed how applying this integrated verification technique to the verification and testing of an FPGA-based I and C system design in an NPP can facilitate the verification process and verify the entire set of design modules simultaneously. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  4. TLM.open: a SystemC/TLM Frontend for the CADP Verification Toolbox

    Claude Helmstetter

    2014-04-01

    Full Text Available SystemC/TLM models, which are C++ programs, allow the simulation of embedded software before hardware low-level descriptions are available and are used as golden models for hardware verification. The verification of SystemC/TLM models is an important issue, since an error in the model can mislead the system designers or reveal an error in the specifications. An open-source simulator for SystemC/TLM is provided, but there are no tools for formal verification. In order to apply model checking to a SystemC/TLM model, a semantics for standard C++ code and for specific SystemC/TLM features must be provided. The usual approach relies on the translation of the SystemC/TLM code into a formal language for which a model checker is available. We propose another approach that suppresses this error-prone translation effort. Given a SystemC/TLM program, the transitions are obtained by executing the original code using g++ and an extended SystemC library, and we ask the user to provide additional functions to store the current model state. These additional functions generally represent less than 20% of the size of the original model, and allow the application of all CADP verification tools to the SystemC/TLM model itself.
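    The approach above hinges on a small user-supplied function that snapshots the model state, so the verifier can recognize already-visited states while obtaining transitions by executing the original code. A Python sketch of that mechanism (the toy Model, its get_state/set_state helpers and the explore driver are illustrative, not the TLM.open API):

```python
class Model:
    """Toy executable model: two counters with guarded increments."""
    def __init__(self):
        self.x, self.y = 0, 0

    def enabled(self):
        """Transitions enabled in the current state."""
        moves = []
        if self.x < 2:
            moves.append('incx')
        if self.y < self.x:
            moves.append('incy')
        return moves

    def fire(self, move):
        """Execute one transition by running the model's own code."""
        if move == 'incx':
            self.x += 1
        else:
            self.y += 1

    def get_state(self):
        """User-supplied snapshot of everything that defines the state
        (the small extra code this style of verification asks for)."""
        return (self.x, self.y)

    def set_state(self, s):
        self.x, self.y = s

def explore():
    """Enumerate reachable states by executing the model itself."""
    m = Model()
    start = m.get_state()
    seen, stack = {start}, [start]
    while stack:
        s = stack.pop()
        m.set_state(s)
        for move in m.enabled():
            m.set_state(s)          # rewind before trying each alternative
            m.fire(move)
            t = m.get_state()
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

print(len(explore()))  # prints 6 (the reachable states)
```

    The generated transition graph is what a toolbox such as CADP then checks; no translation of the model into another language is needed.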

  5. A Cache System Design for CMPs with Built-In Coherence Verification

    Mamata Dalui

    2016-01-01

    Full Text Available This work reports an effective design of a cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for verification of cache coherence in CMPs realizing a directory-based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA, referred to as single-length-cycle 2-attractor cellular automata (TACA), has been deployed to detect inconsistencies in the cache line states of processors' private caches. The TACA module captures the coherence status of the CMPs' cache system and memorizes any inconsistent recording of cache line states during the processors' references to a memory block. Theory has been developed to empower a TACA to analyse the cache state updates and then settle to an attractor state, giving a quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs' processor pool ensures better efficiency in determining the inconsistencies by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic points to the fact that the overhead of the proposed coherence verification module is much lower than that of conventional verification units and is insignificant with respect to the cost involved in the CMPs' cache system.
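    The coherence invariant that such verification logic must memorize can be illustrated with a much simplified sketch (not the authors' TACA design; the settle function is a hypothetical stand-in for the CA reaching one of its two attractors): for a directory-based protocol, a recording of cache line states is faulty if a line is Modified in more than one private cache, or Modified in one cache while Shared in another.

```python
# Cache line states per core for one memory block:
# 'M' = Modified, 'S' = Shared, 'I' = Invalid.

def settle(states):
    """Classify a recording of cache line states as the 'consistent' or
    'faulty' attractor (hypothetical stand-in for the TACA fixed point)."""
    modified = states.count('M')
    shared = states.count('S')
    if modified > 1 or (modified == 1 and shared > 0):
        return 'faulty'
    return 'consistent'

print(settle(['M', 'I', 'I', 'I']))  # prints "consistent"
print(settle(['M', 'S', 'I', 'I']))  # prints "faulty"
```

    The hardware version reaches the same verdict in a fixed number of CA steps rather than by counting, which is what keeps its overhead low.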

  6. A Formal Approach for the Construction and Verification of Railway Control Systems

    Haxthausen, Anne Elisabeth; Peleska, Jan; Kinder, Sebastian

    2011-01-01

    This paper describes a complete model-based development and verification approach for railway control systems. For each control system to be generated, the user makes a description of the application-specific parameters in a domain-specific language. This description is automatically transformed...

  7. An Improved Constraint-Based System for the Verification of Security Protocols

    Corin, R.J.; Etalle, Sandro

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov [30]. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs

  8. The dynamic flowgraph methodology as a safety analysis tool : programmable electronic system design and verification

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2002-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM, and

  9. Proceedings of the 7th International Workshop on Verification of Infinite-State Systems (INFINITY'05)

    2005-01-01

    The aim of the workshop is, to provide a forum for researchers interested in the development of mathematical techniques for the analysis and verification of systems with infinitely many states. Topics: Techniques for modeling and analysis of infinite-state systems; Equivalence-checking and model-...

  10. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  11. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  12. ATLANTIDES: An Architecture for Alert Verification in Network Intrusion Detection Systems

    Bolzoni, D.; Crispo, Bruno; Etalle, Sandro

    2007-01-01

    We present an architecture designed for alert verification (i.e., to reduce false positives) in network intrusion-detection systems. Our technique is based on a systematic (and automatic) anomaly-based analysis of the system output, which provides useful context information regarding the network

  13. An Improved Constraint-based system for the verification of security protocols

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs

  14. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    2017-12-30

    AFRL-RV-PS-TR-2018-0008. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Contract Number: FA9453-15-1-0315; Program Element Number: 62601F; Author: Norman Fitz-Coy; Project Number: 4846; Task Number: PPM00015968; Work Unit Number: EF125135.

  15. A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip

    Tomás Grimm

    2018-05-01

    Full Text Available The high degree of miniaturization in the electronics industry has been, for several years, a driver pushing embedded systems into different fields and applications. One example is safety-critical systems, where compactness in the form factor helps to reduce costs and allows for the implementation of new techniques. The automotive industry is a good example of a safety-critical area with a great rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard, with the goal of guaranteeing a high level of dependability in the designs. Other areas in the safety-critical applications domain have similar standards. However, these standards are mostly guidelines to make sure that designs reach the desired dependability level, without explicit instructions. In the end, the success of the design in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as standards conformity, but as these are complex hardware designs, complete functional verification is a difficult task. From the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and in industry. We can divide them into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. Therefore, we conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

  16. Formal verification and validation of the safety-critical software in a digital reactor protection system

    Kwon, K. C.; Park, G. Y.

    2006-01-01

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are the preparation of software planning documentation, verification of the software according to the software life cycle, software safety analysis and software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of the nuclear safety-critical software in a DRPS. (authors)

  17. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. 
A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  18. Progress of the AVNG System - Attribute Verification System with Information Barriers for Mass Isotopics Measurements

    Budnikov, D.; Bulatov, M.; Jarikhine, I.; Lebedev, B.; Livke, A.; Modenov, A.; Morkin, A.; Razinkov, S.; Tsaregorodtsev, D.; Vlokh, A.; Yakovleva, S.; Elmont, T.H.; Langner, D.C.; MacArthur, D.W.; Mayo, D.R.; Smith, M.K.; Luke, S.J.

    2005-01-01

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with the support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no release of classified information. The system is comprised of a neutron multiplicity counter and a gamma-spectrometry system based on a high-purity germanium gamma detector (nominal relative efficiency of 50% at 1332 keV) and the digital gamma-ray spectrometer DSPEC PLUS. The neutron multiplicity counter is a three-ring counter with 164 3He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.

  19. Verification of FPGA-based NPP I and C systems. General approach and techniques

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for the design and verification of Field Programmable Gate Array (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPPs). Appropriate regulatory documents used for I and C system design, development, verification and validation (V and V) are discussed, considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic designs for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms, used in FPGA-based reactor trip systems, is proposed. The results of the application of test-based techniques to the assessment of FPGA electronic design algorithms for a reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  1. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  2. Selection and verification of safety parameters in safety parameter display system for nuclear power plants

    Zhang Yuangfang

    1992-02-01

    The method and results for safety parameter selection and its verification in the safety parameter display system of nuclear power plants are introduced. According to safety analysis, the overall safety is divided into six critical safety functions, and a set of safety parameters which can represent the degree of integrity of each function and the causes of change is strictly selected. The verification of the safety parameter selection is carried out from the viewpoint of applying the plant emergency procedures and of accident manoeuvres on a full-scale nuclear power plant simulator

  3. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Gusev Alexander

    2015-01-01

    Full Text Available The analysis of the existing problems of reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.

  4. Towards the Formal Verification of a Distributed Real-Time Automotive System

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  5. 75 FR 4101 - Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior...

    2010-01-26

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-05] Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY... Access, Authorization Form and Rules Of Behavior and User Agreement. OMB Approval Number: 2577-New. Form...

  6. Reliability program plan for the Kilowatt Isotope Power System (KIPS) technology verification phase

    1978-01-01

    This document is an integral part of the Kilowatt Isotope Power System (KIPS) Program Plan. It defines the KIPS Reliability Program Plan for the Technology Verification Phase and delineates the reliability assurance tasks that are to be accomplished by Sundstrand and its suppliers during the design, fabrication and testing of the KIPS

  7. Verification tests for remote controlled inspection system in nuclear power plants

    Kohno, Tadaaki

    1986-01-01

    Following the increase in the number of nuclear power plants, the total radiation exposure dose accompanying inspection and maintenance work has tended to increase. The Japan Power Engineering and Inspection Corp. carried out verification tests of practical power reactor automatic inspection systems from November 1981 to March 1986, and this report describes how these verification tests were carried out. The objects of the verification tests were equipment that is urgently required for reducing radiation exposure dose, has a high possibility of realization, and is important for ensuring the safety and reliability of plants: an automatic ultrasonic flaw detector for the welded parts of bend pipes, an automatic disassembling and inspection system for the control rod driving mechanism, an automatic fuel inspection system, and automatic decontaminating equipment for steam generator water chambers, primary-system crud and radioactive gas in coolant. The results of the verification tests of this equipment were judged satisfactory; therefore, application to actual plants is possible. (Kako, I.)

  8. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Vasiliy Yu. Meltsov

    2012-05-01

    Full Text Available This paper presents the results of the development of one of the modules of a system for the verification of parallel algorithms, which is used to verify the inference engine. This module is designed to build the specification of requirements whose satisfaction by the algorithm must be proved (tested).

  9. Preface of Special issue on Automated Verification of Critical Systems (AVoCS'14)

    Huisman, Marieke; van de Pol, Jaco

    2016-01-01

    AVoCS 2014, the 14th International Conference on Automated Verification of Critical Systems has been hosted by the University of Twente, and has taken place in Enschede, Netherlands, on 24–26 September, 2014. The aim of the AVoCS series is to contribute to the interaction and exchange of ideas among

  10. A method of knowledge base verification for nuclear power plant expert systems using extended Petri Nets

    Kwon, I. W.; Seong, P. H.

    1996-01-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. The verification of the knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is extended to chained errors, unlike previous studies that assume error incidence to be limited to rule pairs only. In addition, we consider certainty factors in the checking, because most knowledge bases have certainty factors. 8 refs., 2 figs., 4 tabs. (author)
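    The chained-error idea above — flaws that only surface after several rule firings, not within a single rule pair — can be sketched without any Petri-net machinery (illustrative only; COKEP's actual formalization uses extended Petri nets and handles certainty factors, which this sketch omits):

```python
# Each rule: (frozenset of antecedent literals, consequent literal).
# A literal is a string; '~p' denotes the negation of 'p'.

def neg(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def chained_contradictions(rules, facts):
    """Forward-chain the rule base to a fixed point and report every
    literal that is derived together with its own negation."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= known and consequent not in known:
                known.add(consequent)
                changed = True
    return {lit for lit in known if neg(lit) in known}

rules = [
    (frozenset({'a'}), 'b'),
    (frozenset({'b'}), 'c'),
    (frozenset({'a', 'c'}), '~b'),  # clashes with rule 1, but only via the chain
]
print(chained_contradictions(rules, {'a'}))  # contains both 'b' and '~b'
```

    A pairwise check would pass this rule base, since no two rules contradict each other directly; the inconsistency appears only after the chain a -> b -> c fires.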

  11. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  12. A new verification film system for routine quality control of radiation fields: Kodak EC-L.

    Hermann, A; Bratengeier, K; Priske, A; Flentje, M

    2000-06-01

    The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. In this investigation 68% of 175 Kodak EC-L ap/pa films were judged "good", 18% were classified "moderate" and 14% "poor", but only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be "good". The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved compared with standard portal films. The films could be read more accurately and the detection of set-up deviations was facilitated.

  13. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V and V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report the results of the software qualification

  14. Safety verification of non-linear hybrid systems is quasi-decidable

    Ratschan, Stefan

    2014-01-01

    Vol. 44, No. 1 (2014), p. 71-90. ISSN 0925-9856. R&D Projects: GA ČR GCP202/12/J060. Institutional support: RVO:67985807. Keywords: hybrid systems; safety verification; decidability; robustness. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.875, year: 2014

  15. Formal modeling and verification of systems with self-x properties

    Reif, Wolfgang

    2006-01-01

    Formal modeling and verification of systems with self-x properties / Matthias Güdemann, Frank Ortmeier and Wolfgang Reif. - In: Autonomic and trusted computing : third international conference, ATC 2006, Wuhan, China, September 3-6, 2006 ; proceedings / Laurence T. Yang ... (eds.). - Berlin [u.a.] : Springer, 2006. - S. 38-47. - (Lecture notes in computer science ; 4158)

  16. A new verification film system for routine quality control of radiation fields: Kodak EC-L

    Hermann, A.; Bratengeier, K.; Priske, A.; Flentje, M.

    2000-01-01

    Background: The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. Material and Methods: For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. Results: In this investigation 68% of 175 Kodak EC-L ap/pa films were judged 'good', 18% were classified 'moderate' and 14% 'poor', but only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be 'good'. Conclusions: The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved compared with standard portal films. The films could be read more accurately and the detection of set-up deviations was facilitated. (orig.)

  17. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun

    1999-01-01

    In this work, an automatic software verification method for nuclear power plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for system modeling and the Prototype Verification System (PVS) for mathematical verification. In order to support the flow from modeling in CPN to mathematical proof in PVS, an information extractor for CPN models has been developed in this work. A translator that converts the extracted information into the PVS specification language has also been developed. The information extractor and the translator are programmed in ML, a higher-order functional language. This combined method has been applied to a protection system function of the Wolsung NPP SDS2 (Steam Generator Low Level Trip). As a result of this application, we could logically prove the completeness and consistency of the requirement. In short, through this work an axiom- or lemma-based analysis method for CPN models is newly suggested in order to complement CPN analysis methods, and a guideline for the use of formal methods is proposed in order to apply them to NPP software verification and validation. (author). 9 refs., 15 figs

  18. Application of plutonium inventory measurement system (PIMS) and temporary canister verification system (TCVS) at RRP

    Noguchi, Yoshihiko; Nakamura, Hironobu; Adachi, Hideto; Iwamoto, Tomonori

    2004-01-01

    In the U-Pu co-denitration area at the Rokkasho Reprocessing Plant (RRP), a Plutonium Inventory Measurement System (PIMS) and a Temporary Canister Verification System (TCVS) are installed to provide efficient and effective safeguards. PIMS measures the Pu quantity inside pipes and vessels installed in glove boxes by the total neutron counting method. PIMS consists of a total of 142 neutron detectors attached to the walls and tops of the glove boxes, and the neutron count rates of the detectors are related to each other to calculate the Pu quantity of each process area. At this time, inactive calibration using a Cf source has been completed. On the other hand, TCVS measures the Pu quantity of canisters inside the temporary storage by the coincidence counting method, and it will be installed before the active test. These systems have a monitoring function as an additional measure. This paper describes the specification, performance and measurement principles of PIMS and TCVS. (author)

  19. Verification of the safety communication protocol in train control system using colored Petri net

    Chen Lijie; Tang Tao; Zhao Xianqiong; Schnieder, Eckehard

    2012-01-01

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and modeling flexibility upon changes in the system conditions, this paper proposes a compositional Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved to be safe by means of state space analysis: the dead markings are correct, there are no dead transitions, and fairness holds. Further analysis results have been obtained using formal and simulation-based verification approaches. The timed models for the open transmit system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also provide relevant timed and stochastic factors, such as time delay and packet loss, which may influence the time for establishment of a safety connection. The time for establishment of a safety connection in the normal state is verified formally, and then the time for establishment of a safety connection under different probabilities of packet loss is simulated. After verification it is found that the time for establishment of a safety connection of the safety communication protocol satisfies the safety requirements.
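    The state-space checks mentioned above (dead markings, dead transitions) can be illustrated on a minimal place/transition net in Python (a toy connection-establishment net, not the paper's CPN model of the ETCS protocol):

```python
# A marking maps place names to token counts; each transition consumes
# tokens from its 'pre' places and produces tokens into its 'post' places.
TRANSITIONS = {
    'send_req': ({'idle': 1},    {'waiting': 1}),
    'recv_ack': ({'waiting': 1}, {'connected': 1}),
    'timeout':  ({'waiting': 1}, {'idle': 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def analyse(initial):
    """Explore the full state space; collect dead markings (no transition
    enabled) and any transitions that never fire anywhere."""
    frozen = lambda m: frozenset((p, n) for p, n in m.items() if n)
    seen, stack = {frozen(initial)}, [initial]
    dead_markings, fired = [], set()
    while stack:
        m = stack.pop()
        moves = [(t, pre, post) for t, (pre, post) in TRANSITIONS.items()
                 if enabled(m, pre)]
        if not moves:
            dead_markings.append(m)
        for t, pre, post in moves:
            fired.add(t)
            m2 = fire(m, pre, post)
            if frozen(m2) not in seen:
                seen.add(frozen(m2))
                stack.append(m2)
    return dead_markings, set(TRANSITIONS) - fired

dead, never_fired = analyse({'idle': 1})
print(len(dead), sorted(never_fired))  # prints "1 []"
```

    Here the single dead marking is the intended terminal 'connected' state and no transition is dead, which is the shape of result such a state space analysis reports.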

  20. Verification and disarmament

    Blix, H. [IAEA, Vienna (Austria)]

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states party to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  2. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan; FINAL

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to describe the ORP's planned and completed actions to implement ISMS and to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. The Plan collates the implied or explicit ORP actions identified in several key ISMS documents and aligns the actions and responsibilities necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions before the ISMS Phase II Verification commences. Improvement products and tasks not slated for completion before the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, committed to be complete by the end of FY-2000.

  4. Simulation-based design process for the verification of ITER remote handling systems

    Sibois, Romain; Määttä, Timo; Siuko, Mikko; Mattila, Jouni

    2014-01-01

    Highlights: •Verification and validation process for ITER remote handling systems. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around a simulation lifecycle management system. •Verification and validation roadmap for the digital modelling phase. -- Abstract: The work behind this paper takes place in EFDA's European Goal Oriented Training programme on Remote Handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One project of this programme focuses on the verification and validation (V and V) of ITER RH system requirements using digital mock-ups (DMUs). The purpose of this project is to study and develop an efficient approach to using DMUs in the V and V process of ITER RH system design within a Systems Engineering (SE) framework. For complex engineering systems such as the ITER facilities, manufacturing a full-scale prototype is very costly. In the V and V process for ITER RH equipment, physical tests are required to ensure that the system complies with the required operation; it is therefore essential to verify the developed system virtually before starting the prototype manufacturing phase. This paper gives an overview of current trends in the use of digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system, and describes possible improvements in the formalization of the ITER RH design process and V and V processes, in order to increase their cost efficiency and reliability.

  5. Scenario-based verification of real-time systems using UPPAAL

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA-modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one...
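
    The observer idea described above can be sketched in miniature. The scenario, events and deadline below are hypothetical, and a real observer would be a timed automaton composed with the system model rather than a function over finished traces:

```python
# Sketch under stated assumptions: a monitored scenario (events in a fixed
# order, completed within a deadline) is turned into a small observer that is
# "composed" with a system trace; acceptance means the requirement holds.
def make_observer(expected_events, deadline):
    """Observer for: 'the events occur in this order within the deadline'."""
    def observe(trace):  # trace: list of (time, event) pairs
        idx, start = 0, None
        for t, ev in trace:
            if idx < len(expected_events) and ev == expected_events[idx]:
                start = t if start is None else start
                idx += 1
                if idx == len(expected_events):
                    return t - start <= deadline  # completed in time?
        return False                              # scenario never completed
    return observe

# Hypothetical safety-connection scenario: request then ack within 5 time units.
observer = make_observer(["conn_request", "conn_ack"], deadline=5)
ok_trace  = [(0, "conn_request"), (3, "conn_ack")]
bad_trace = [(0, "conn_request"), (9, "conn_ack")]
print(observer(ok_trace), observer(bad_trace))   # True False
```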

  6. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  7. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain-gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine, or of dynamic force readings from the test machine at the time of the test, after any user-applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy, which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...
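
    The kind of comparison this practice is built on can be sketched with hypothetical numbers (this is an illustration of the idea, not the standard's procedure): the force amplitude indicated by the machine is compared with the amplitude seen by the strain-gaged elastic element, giving a dynamic error and, if needed, a correction factor.

```python
# Illustrative sketch (hypothetical data): compare the machine-indicated force
# amplitude with the amplitude measured by a strain-gaged elastic element.
import math

def amplitude(samples):
    """Half the peak-to-peak value of a sampled force cycle."""
    return (max(samples) - min(samples)) / 2.0

def dynamic_error(machine_samples, gauge_samples):
    """Return (error in %, correction factor gauge/machine)."""
    machine_amp = amplitude(machine_samples)
    gauge_amp = amplitude(gauge_samples)
    return 100.0 * (machine_amp - gauge_amp) / gauge_amp, gauge_amp / machine_amp

# One cycle sampled at 32 points: machine indicates 10 kN amplitude, while the
# strain-gaged element actually sees 9.6 kN (hypothetical dynamic roll-off).
n = 32
machine = [10.0 * math.sin(2 * math.pi * k / n) for k in range(n)]
gauge   = [9.6  * math.sin(2 * math.pi * k / n) for k in range(n)]
err, corr = dynamic_error(machine, gauge)
print(round(err, 2), round(corr, 3))   # 4.17 0.96
```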

  8. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., where each individual system is constructed by instantiating common generic applications with concrete configuration data … the standardized railway control systems ERTMS/ETCS Level 2. Experiments showed that the method can be used for the specification, verification and validation of systems of industrial size.

  9. Modeling and Verification of Dependable Electronic Power System Architecture

    Yuan, Ling; Fan, Ping; Zhang, Xiao-fang

    The electronic power system can be viewed as a set of concurrently interacting subsystems that generate, transmit, and distribute electric power. The complex interaction among the subsystems makes the design of an electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault-tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements, which complicates the design further. We propose a dependable electronic power system architecture that provides a generic framework to guide the development of electronic power systems and ease development complexity. In order to provide common idioms and patterns to system designers, we formally model the architecture in the PVS formal language. Based on the PVS model of the architecture, we formally verify its fault-tolerance properties using the PVS theorem prover, which guarantees that the architecture can satisfy the high reliability requirements.

  10. Renewable Energy Certificate (REC) Tracking Systems: Costs & Verification Issues (Presentation)

    Heeter, J.

    2013-10-01

    This document provides information on REC tracking systems: how they are used in the voluntary REC market, a comparison of REC system fees, and how the systems treat environmental attributes.

  11. Formal Abstractions for Automated Verification and Synthesis of Stochastic Systems

    Esmaeil Zadeh Soudjani, S.

    2014-01-01

    Stochastic hybrid systems involve the coupling of discrete, continuous, and probabilistic phenomena, in which the composition of continuous and discrete variables captures the behavior of physical systems interacting with digital, computational devices. Because of their versatility and generality, ...

  12. Abstractions for Fault-Tolerant Distributed System Verification

    Pike, Lee S.; Maddalon, Jeffrey M.; Miner, Paul S.; Geser, Alfons

    2004-01-01

    Four kinds of abstraction for the design and analysis of fault tolerant distributed systems are discussed. These abstractions concern system messages, faults, fault masking voting, and communication. The abstractions are formalized in higher order logic, and are intended to facilitate specifying and verifying such systems in higher order theorem provers.
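
    The fault-masking voting abstraction mentioned above can be sketched concretely (a plain majority voter, not the authors' higher-order-logic formalization): with n replicas, a majority voter masks up to floor((n-1)/2) arbitrary value faults, since correct replicas still outnumber faulty ones.

```python
# Sketch of fault-masking voting: return the strict-majority value, or None
# when no value has a strict majority (the fault is not masked).
from collections import Counter

def majority_vote(values):
    value, count = Counter(values).most_common(1)[0]
    return value if count > len(values) // 2 else None

# Three replicas, one faulty: the fault is masked.
print(majority_vote([42, 42, 7]))    # 42
# Two-vs-two split: no strict majority, the fault is not masked.
print(majority_vote([1, 1, 2, 2]))   # None
```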

  13. Rigorous Verification for the Solution of Nonlinear Interval System ...

    We survey a general method for solving nonlinear interval systems of equations. In particular, we pay special attention to the computational aspects of linear interval systems, since the bulk of the computation is done at the stage of computing an outer estimation of the enclosing linear interval systems. The height of our ...

  14. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for the Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the damage detection capabilities of the Flat Surface Damage Detection System (FSDDS) sensory panels and includes an autonomous inspection capability that uses cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept, ground-based damage detection and inspection system.

  15. Advanced control and instrumentation systems in nuclear power plants. Design, verification and validation

    Haapanen, P.

    1995-01-01

    The Technical Committee Meeting on the design, verification and validation of advanced control and instrumentation systems in nuclear power plants was held in Espoo, Finland, on 20-23 June 1994. The meeting was organized by the International Atomic Energy Agency's (IAEA) International Working Groups (IWG) on Nuclear Power Plant Control and Instrumentation (NPPCI) and on Advanced Technologies for Water Cooled Reactors (ATWR). VTT Automation, together with Imatran Voima Oy and Teollisuuden Voima Oy, was responsible for the practical arrangements of the meeting. In total, 96 participants from 21 countries and the Agency took part, and 34 full papers and 8 posters were presented. The papers covered the following topics: (1) experience with advanced and digital systems, (2) safety and reliability analysis, (3) advanced digital systems under development and implementation, (4) verification and validation methods and practices, (5) future development trends. (orig.)

  16. Sensitivity Verification of PWR Monitoring System Using Neuro-Expert For LOCA Detection

    Muhammad Subekti

    2009-01-01

    Sensitivity Verification of PWR Monitoring System Using Neuro-Expert For LOCA Detection. This research verifies a previously developed method for Loss of Coolant Accident (LOCA) detection and performs simulations to determine the sensitivity of a PWR monitoring system applying the neuro-expert method. The preceding research, continued here, developed and tested the neuro-expert method for several anomaly detections in a Pressurized Water Reactor (PWR) type Nuclear Power Plant (NPP). The neuro-expert system can detect a LOCA anomaly at a primary coolant leakage of 7 gallons/min, whereas the conventional method could not detect a leakage of 30 gallons/min. The neuro-expert method also detects LOCA anomalies significantly faster than the conventional system in the Surry-1 NPP, so that the impact risk is reduced. (author)

  17. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  20. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016, on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real time, by decision makers?

  1. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and the implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) development and integration, 2) calibration and optimization of the Wi-Fi tracking system, and 3) clinical implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  2. Environmental Technology Verification: Biological Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Systems--American Ultraviolet Corporation, DC24-6-120 [EPA600etv08005

    The Air Pollution Control Technology Verification Center (APCT Center) is operated by RTI International (RTI), in cooperation with EPA's National Risk Management Research Laboratory. The APCT Center conducts verifications of technologies that clean air in ventilation systems, inc...

  3. Active Learning of Markov Decision Processes for System Verification

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

    Manually constructing a system model is a demanding process, and this shortcoming has motivated the development of algorithms for automatically learning system models from observed system behaviors. Recently, algorithms have been proposed for learning Markov decision process representations of reactive systems based on alternating sequences of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently, we seek to minimize the amount of data required. In this paper we propose an algorithm for learning deterministic Markov decision processes from data by actively guiding the selection of input actions. The algorithm is empirically analyzed by learning system models of slot machines, and it is demonstrated that the proposed active learning procedure can significantly reduce the amount of data required.
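
    The active-learning idea can be sketched in a simplified, deterministic setting (the hidden system below is a hypothetical toy, not the paper's slot-machine models, and the probabilistic aspects of MDP learning are omitted): instead of passively sampling random traces, the learner prefers an input action whose outcome in the current state is still unknown, so fewer queries are wasted.

```python
# Sketch: actively learning the transition function of a hidden deterministic
# reactive system by preferring actions with unknown outcomes.

# Hypothetical hidden system: deterministic transitions over three states.
HIDDEN = {("s0", "spin"): "s1", ("s0", "stop"): "s0",
          ("s1", "spin"): "s2", ("s1", "stop"): "s0",
          ("s2", "spin"): "s0", ("s2", "stop"): "s1"}
ACTIONS = ["spin", "stop"]

def learn_actively(start="s0", max_queries=100):
    model, state, queries = {}, start, 0
    while queries < max_queries:
        unknown = [a for a in ACTIONS if (state, a) not in model]
        # Prefer an unexplored action; otherwise fall back to a fixed action
        # (a real learner would plan a path to a state with unknown outcomes).
        action = unknown[0] if unknown else "spin"
        nxt = HIDDEN[(state, action)]     # one interaction with the system
        model[(state, action)] = nxt
        state, queries = nxt, queries + 1
        if len(model) == len(HIDDEN):
            break
    return model, queries

model, queries = learn_actively()
print(len(model) == len(HIDDEN), queries)   # True 9
```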

  4. International exchange on nuclear safety related expert systems: The role of software verification and validation

    Sun, B.K.H.

    1996-01-01

    An important lesson learned from the Three Mile Island accident is that human errors can be significant contributors to risk. Recent advances in computer hardware and software technology have helped make expert system techniques potentially viable tools for improving nuclear power plant safety and reliability. As part of the general man-machine interface technology, expert systems have recently become increasingly prominent as a potential solution to a number of previously intractable problems in many phases of human activity, including operation, maintenance, and engineering functions. Traditional methods for testing and analyzing analog systems are no longer adequate to handle the increased complexity of software systems. The role of Verification and Validation (V and V) is to add rigor to the software development and maintenance cycle, to guarantee the high level of confidence needed for applications. Verification comprises the processes and techniques for confirming that all the software requirements in one stage of development are met before proceeding to the next stage. Validation involves testing the integrated software and hardware system to ensure that it reliably fulfills its intended functions. Only through a comprehensive V and V program can a high level of confidence be achieved. Many different standards and techniques for software verification and validation exist, yet they lack uniform approaches that provide adequate practical guidance for nuclear power plant applications. There is a need to unify the different approaches to software verification and validation and to develop practical and cost-effective guidelines for user and regulatory acceptance. (author). 8 refs

  5. MOVES - A tool for Modeling and Verification of Embedded Systems

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks onto processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithms).

  6. Seismic monitoring: a unified system for research and verifications

    Thigpen, L.

    1979-01-01

    A system for characterizing either a seismic source or geologic media from observational data was developed. This resulted from an examination of the forward and inverse problems of seismology. The system integrates many seismic monitoring research efforts into a single computational capability. Its main advantage is that it unifies computational and research efforts in seismic monitoring. 173 references, 9 figures, 3 tables

  7. Capturing Assumptions while Designing a Verification Model for Embedded Systems

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    A formal proof of system correctness typically holds under a number of assumptions. Leaving them implicit raises the chance of using the system in a context that violates some assumption, which in turn may invalidate the correctness proof. The goal of this paper is to show how combining ...

  8. Models and formal verification of multiprocessor system-on-chips

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    present experimental results on rather small systems with high complexity, primarily due to differences between best-case and worst-case execution times. Considering worst-case execution times only, the system becomes deterministic, and using a special version of Uppaal in which no history is saved, we ...

  9. Verification and validation of the safety parameter display system for nuclear power plant

    Zhang Yuanfang

    1993-05-01

    During the design and development of the safety parameter display system for a nuclear power plant, a verification and validation (V and V) plan was implemented to improve the quality of the system design. The V and V activities, executed in the four stages of feasibility research, system design, code development, and system integration and regulation, are briefly introduced. The evaluation plan, the process of its implementation, and the conclusions of the final technical validation of the system are also presented in detail.

  10. Spaceport Command and Control System Automated Verification Software Development

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system, out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and in doing so helped verify that the software systems are performing as expected.

  11. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. This report presents the verification and uncertainty evaluation performed for the application of the code system to nuclear reactor core analysis and design. The verification is performed via benchmark comparisons for static and transient core conditions, and via core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation focuses on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code, that the system can be applied to the SMART core.
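
    The kind of benchmark-based uncertainty evaluation described above can be sketched with hypothetical numbers (these are illustration values, not CASMO-3/MASTER results): predicted versus measured values of a safety parameter across several benchmark cases yield a mean bias and a standard deviation, and a bias factor is unnecessary when the mean bias is negligible.

```python
# Sketch (hypothetical data): bias and scatter of code predictions against
# measurements for one safety-related parameter across benchmark cases.
import statistics

measured  = [1.002, 0.998, 1.001, 0.999, 1.000]   # e.g. measured values
predicted = [1.001, 0.999, 1.002, 0.998, 1.001]   # code predictions

diffs = [p - m for p, m in zip(predicted, measured)]
bias = statistics.mean(diffs)      # mean prediction - measurement difference
sigma = statistics.stdev(diffs)    # scatter of the differences
print(f"bias = {bias:+.5f} +/- {sigma:.5f}")
```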

  13. A Verification and Validation Tool for Diagnostic Systems, Phase I

    National Aeronautics and Space Administration — Advanced diagnostic systems have the potential to improve safety, increase availability, and reduce maintenance costs in aerospace vehicle and a variety of other...

  14. Acquisition System Verification for Energy Efficiency Analysis of Building Materials

    Natalia Cid

    2017-08-01

    Climate change and fossil fuel depletion foster interest in improving the energy efficiency of buildings. Among the different methods of achieving improved efficiency is the use of additives such as phase change materials (PCMs). To prove the effectiveness of this method, a building's behaviour should be monitored and analysed. This paper describes an acquisition system developed for monitoring buildings, based on Supervisory Control and Data Acquisition (SCADA) with a 1-Wire bus network as the communication system. The system is empirically tested to prove that it works properly. For this purpose, two experimental cubicles were built from self-compacting concrete panels, one of which includes a PCM additive to improve its energy storage properties. Both cubicles have the same dimensions and orientation, and they are separated by six feet to avoid shadows. The behaviour of the PCM was observed with the acquisition system, and the results show differences between the cubicles directly related to the PCM's characteristics. The data collection devices included temperature sensors, some embedded in the walls, as well as humidity sensors, heat flux density sensors, a weather station and energy counters. The analysis of the results agrees with previous studies of PCM addition; the acquisition system is therefore suitable for this application.
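
    The comparison such a monitoring system enables can be sketched on synthetic data (the temperature series below are invented, not measurements from the cubicles): a PCM cubicle should show a smaller daily indoor temperature swing than the reference cubicle, quantified here as the peak-to-peak amplitude of each series.

```python
# Sketch (synthetic data): daily indoor temperature swing of a reference
# cubicle versus a PCM cubicle, as a monitoring system would record it.
import math

hours = range(24)
outdoor_swing = 10.0   # deg C, hypothetical outdoor daily amplitude
reference   = [25 + outdoor_swing * 0.8 * math.sin(2 * math.pi * h / 24)
               for h in hours]   # weak damping without PCM
pcm_cubicle = [25 + outdoor_swing * 0.5 * math.sin(2 * math.pi * h / 24)
               for h in hours]   # stronger damping with PCM (assumed)

def swing(series):
    """Peak-to-peak amplitude of a temperature series."""
    return max(series) - min(series)

print(round(swing(reference), 1), round(swing(pcm_cubicle), 1))   # 16.0 10.0
```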

  15. The application of coloured Petri nets to verification of distributed systems specified by Message Sequence Charts

    CHERNENOK S.A.; NEPOMNIASCHY V.A.

    2015-01-01

    The language of message sequence charts (MSC) is a popular scenario-based specification language used to describe the interaction of components in distributed systems. However, the methods for validation of MSC diagrams are underdeveloped. This paper describes a method for translation of MSC diagrams into coloured Petri nets (CPN). The method is applied to the property verification of these diagrams. The considered set of diagram elements is extended by the elements of UML sequence diagrams a...

  16. Quantum Mechanics and locality in the K0 K-bar0 system: experimental verification possibilities

    Muller, A.

    1994-11-01

    It is shown that elementary Quantum Mechanics, applied to the K0 K-bar0 system, predicts peculiar long-range EPR correlations. Possible experimental verifications are discussed, and a concrete experiment with antiproton annihilations at rest is proposed. A pedestrian approach to local models shows that K0 K-bar0 experimentation could provide arguments in the local realism versus quantum theory controversy. (author). 17 refs., 23 figs

  17. Verification of Security Policy Enforcement in Enterprise Systems

    Gupta, Puneet; Stoller, Scott D.

    Many security requirements for enterprise systems can be expressed in a natural way as high-level access control policies. A high-level policy may refer to abstract information resources, independent of where the information is stored; it controls both direct and indirect accesses to the information; it may refer to the context of a request, i.e., the request’s path through the system; and its enforcement point and enforcement mechanism may be unspecified. Enforcement of a high-level policy may depend on the system architecture and the configurations of a variety of security mechanisms, such as firewalls, host login permissions, file permissions, DBMS access control, and application-specific security mechanisms. This paper presents a framework in which all of these can be conveniently and formally expressed, a method to verify that a high-level policy is enforced, and an algorithm to determine a trusted computing base for each resource.

  18. Verification of Continuous Dynamical Systems by Timed Automata

    Sloth, Christoffer; Wisniewski, Rafael

    2011-01-01

    This paper presents a method for abstracting continuous dynamical systems by timed automata. The abstraction is based on partitioning the state space of a dynamical system using positive invariant sets, which form cells that represent locations of a timed automaton. The abstraction is intended......, which is generated utilizing sub-level sets of Lyapunov functions, as they are positive invariant sets. It is shown that this partition generates sound and complete abstractions. Furthermore, the complete abstractions can be composed of multiple timed automata, allowing parallelization...

  19. Measurability and Safety Verification for Stochastic Hybrid Systems

    Fränzle, Martin; Hahn, Ernst Moritz; Hermanns, Holger

    2011-01-01

    method that establishes safe upper bounds on reachability probabilities. To arrive there requires us to solve semantic intricacies as well as practical problems. In particular, we show that measurability of a complete system follows from the measurability of its constituent parts. On the practical side......-time behaviour is given by differential equations, as for usual hybrid systems, but the targets of discrete jumps are chosen by probability distributions. These distributions may be general measures on state sets. Also non-determinism is supported, and the latter is exploited in an abstraction and evaluation...

  20. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, semi-active RFID watt-hour meters are applied to automatic verification lines and intelligent warehouse management. Spanning the transmission system, the test and auxiliary systems, and the monitoring system, the solution implements scheduling, binding, control, and data exchange for watt-hour meters, achieving more accurate positioning, more efficient management, rapid data updates, and all information at a glance. It effectively improves the quality, efficiency, and automation of verification, and realizes more efficient data and warehouse management.

  1. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    In this article, the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the labor intensity of this process by developing a hardware-software system that provides automated verification and calibration. The hardware part of the complex switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the specified algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. The system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic mode (without it). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of secondary-equipment controllers of metering units. Automated verification with this hardware-software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
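
The "calculates errors" step above is typically a comparison of the controller's reading against the calibrator's setpoint, expressed as a percentage of the channel span and judged against an accuracy class. A minimal sketch, with illustrative function names and numbers (not the authors' software):

```python
def reduced_error_percent(setpoint, measured, span):
    """Error of a measuring channel as a percentage of the channel span."""
    return 100.0 * (measured - setpoint) / span

def channel_passes(setpoint, measured, span, accuracy_class=0.25):
    # A channel is accepted if its reduced error is within the accuracy class.
    return abs(reduced_error_percent(setpoint, measured, span)) <= accuracy_class

# e.g. a 4-20 mA channel: calibrator sets 12.000 mA, controller reads 12.020 mA
# over a 16 mA span -> a 0.125% reduced error
```

Automating this check per channel, per setpoint, is what lets the verification protocol be compiled without manual arithmetic.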

  2. Specification styles in distributed systems design and verification

    Vissers, C.A.; Scollo, Giuseppe; van Sinderen, Marten J.; Brinksma, Hendrik

    1991-01-01

    Substantial experience with the use of formal specification languages in the design of distributed systems has shown that finding appropriate structures for formal specifications presents a serious, and often underestimated problem. Its solutions are of great importance for ensuring the quality of

  3. Verification of control system using inverter and canned motor pump

    Sawada, Yoshiaki; Misato, Hisashi

    2002-01-01

    Flow control in auxiliary systems at power stations is generally carried out with control valves (CVs), which are used in large numbers and many varieties. CVs require periodic replacement of packing and similar parts, so their maintenance labor is considerable. To reduce CV maintenance, a system that operates pumps under inverter control was investigated. When flow is controlled by an inverter, the valve at the pump outlet is kept fully open, and because the rotational speed is controlled to deliver only the required flow, no excess energy is consumed. Moreover, as a characteristic of pumps, shaft power falls with the cube of the flow rate when flow is reduced, so a large energy-saving effect can be achieved. Canned motor pumps were selected for their seal-less construction, which improves reliability against leakage, and for a bearing-wear monitor that allows the periodic inspection interval to be extended. The investigations showed that a control system combining an inverter with a canned motor pump performs comparably to a control system using CVs. Test results obtained by adding dead time and a first-order lag element to the control characteristic provide a basis for predicting its application to an actual plant. (G.K.)
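
The cube-law saving mentioned above follows from the pump affinity laws (flow scales with speed, head with speed squared, shaft power with speed cubed). A minimal sketch with illustrative numbers:

```python
def pump_power_ratio(speed_ratio):
    """Affinity laws: Q ~ N, H ~ N^2, shaft power ~ N^3."""
    return speed_ratio ** 3

# Meeting a 70% flow demand by slowing the pump instead of throttling a valve:
saving = 1.0 - pump_power_ratio(0.7)  # roughly two thirds less shaft power
```

This is why variable-speed pumping beats valve throttling at partial load: the throttled pump still runs near full speed, while the inverter-driven pump's power drops cubically.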

  4. Model Verification and Validation Using Graphical Information Systems Tools

    2013-07-31

    ...Geomorphic Measurements...to a model. Ocean flows, which are organized current systems, transport heat and salinity and cause water to pile up as a water surface...

  5. Development and verification of symptom based emergency procedure support system

    Saijou, Nobuyuki; Sakuma, Akira; Takizawa, Yoji; Tamagawa, Naoko; Kubota, Ryuji; Satou, Hiroyuki; Ikeda, Koji; Taminami, Tatsuya

    1998-01-01

    A computerized Emergency Procedure Guideline (EPG) support system has been developed for BWRs and evaluated using a training simulator. It aims to enhance the effective utilization of the EPG. The system automatically identifies the symptom-based operating procedures suitable for the present plant status. It has two functions: a plant status identification function and a man-machine interface function. For the former, a method was developed that identifies and prioritizes the symptom-based operating procedures applicable to the present plant status. As the man-machine interface, an operation flow chart display was developed that graphically expresses the flow of the identified operating procedures. For easy understanding, important information such as plant status changes, the priority of operating procedures, and the completion status of each operation is shown on the flow display in different colors. As an evaluation test, the system's response to design-basis accidents was assessed by actual plant operators using the training simulator at the BWR Training Center. Analysis of the interviews and questionnaires given to the operators showed that the system is effective and can be utilized in a real plant. (author)

  6. Formal Verification of the Danish Railway Interlocking Systems

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    in the new Danish interlocking systems. Instantiating the generic model with interlocking configuration data results in a concrete model and high-level safety properties. Using bounded model checking and inductive reasoning, we are able to verify safety properties for model instances corresponding to railway...

  7. Characterization of a dose verification system for radiotherapy treatments based on a multi-strip silicon detector

    Bocca, A.; Cortes Giraldo, M. A.; Gallardo, M. I.; Espino, J. M.; Aranas, R.; Abou Haidar, Z.; Alvarez, M. A. G.; Quesada, J. M.; Vega-Leal, A. P.; Perez Neto, F. J.

    2011-01-01

    In this paper, we present the characterization of a multi-strip silicon detector (SSSSD: Single-Sided Silicon Strip Detector), developed by the company Micron Semiconductor Ltd., for use as a verification system for radiotherapy treatments.

  8. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sector blockings in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP, Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used to scan the exposed films in transparency mode, and the scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates (>90%) for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions on the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.
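
The gamma analysis reported above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion (here 1%/1 mm). A simplified 1D sketch of the global gamma index (for illustration only; this is not the authors' MATLAB code, and clinical tools use 2D/3D search with interpolation):

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dd=0.01, dta=1.0):
    """Global 1D gamma index: for each reference point, the minimum combined
    dose-difference / distance-to-agreement metric over the evaluated profile.
    dd is a fraction of the maximum reference dose; dta is in mm."""
    ref = np.asarray(ref_dose, dtype=float)
    ev = np.asarray(eval_dose, dtype=float)
    x = np.asarray(positions, dtype=float)
    dose_norm = dd * ref.max()
    gamma = np.empty_like(ref)
    for i in range(ref.size):
        metric = ((x - x[i]) / dta) ** 2 + ((ev - ref[i]) / dose_norm) ** 2
        gamma[i] = np.sqrt(metric.min())
    return gamma

def pass_rate(gamma):
    # Fraction of points with gamma <= 1, expressed as a percentage.
    return 100.0 * float(np.mean(gamma <= 1.0))
```

A pass rate above 90% at 1%/1 mm would correspond to the acceptance level reported in the abstract.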

  10. SU-F-T-440: The Feasibility Research of Checking Cervical Cancer IMRT Pre-Treatment Dose Verification by Automated Treatment Planning Verification System

    Liu, X; Yin, Y; Lin, X [Shandong Cancer Hospital and Institute, China, Jinan, Shandong (China)

    2016-06-15

    Purpose: To assess the preliminary feasibility of an automated treatment-planning verification system for pre-treatment dose verification of cervical cancer IMRT. Methods: Clinical IMRT treatment planning data for twenty patients with cervical cancer were selected at random; all IMRT plans used 7 fields to meet the dosimetric goals in a commercial treatment planning system (Pinnacle Version 9.2 and Eclipse Version 13.5). The plans were exported to the Mobius 3D (M3D) server, and percentage differences in the volume of each region of interest (ROI) and in the calculated dose to the target and organs at risk were evaluated, in order to validate the accuracy of the automated treatment-planning verification system. Results: The volume differences for Pinnacle vs M3D were smaller than those for Eclipse vs M3D in the ROIs; the largest differences were 0.22 ± 0.69% and 3.5 ± 1.89% for Pinnacle and Eclipse, respectively. M3D showed slightly better agreement in target and organ-at-risk dose compared with the TPS. After the plans were recalculated by M3D, the dose difference for Pinnacle was on average smaller than that for Eclipse; results were within 3%. Conclusion: Using the automated treatment-planning verification system to validate the accuracy of plans is convenient, but the acceptable range of differences still needs more clinical cases to determine. At present, it should be used as a secondary check tool to improve safety in clinical treatment planning.
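
The secondary check described above reduces to percentage differences between the TPS value and the independent recalculation, judged against a tolerance (3% in the abstract). A minimal sketch with hypothetical numbers:

```python
def percent_diff(tps_value, check_value):
    """Percentage difference of the independent recalculation vs the TPS value."""
    return 100.0 * (check_value - tps_value) / tps_value

def secondary_check_passes(tps_value, check_value, tol_percent=3.0):
    # The plan is flagged if the independent dose disagrees beyond tolerance.
    return abs(percent_diff(tps_value, check_value)) <= tol_percent

# e.g. target mean dose: TPS 50.0 Gy vs independent check 48.9 Gy -> -2.2%, passes
```

The same comparison is applied per ROI for volume and per structure for dose, which is what makes the check automatable.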

  11. Formal Verification Method for Configuration of Integrated Modular Avionics System Using MARTE

    Lisong Wang

    2018-01-01

    The configuration information of an Integrated Modular Avionics (IMA) system includes almost all details of the whole system architecture; it is used to configure the hardware interfaces, the operating system, and the interactions among applications so that an IMA system works correctly and reliably. It is therefore very important to ensure the correctness and integrity of the configuration in the IMA system design phase. In this paper, we focus on modelling and verification of the configuration information of an IMA/ARINC653 system based on MARTE (Modelling and Analysis for Real-time and Embedded Systems). Firstly, we define a semantic mapping from key concepts of the configuration (such as modules, partitions, memory, processes, and communications) to MARTE model elements, and propose a method for model transformation between XML-formatted configuration information and MARTE models. Then we present a formal verification framework for ARINC653 system configuration based on theorem-proving techniques, including the construction of corresponding REAL theorems according to the semantics of those key components of the configuration information, and the formal verification of theorems for properties of the IMA system such as time constraints, spatial isolation, and health monitoring. After that, the special issue of schedulability analysis of an ARINC653 system is studied: we design a hierarchical scheduling strategy that takes the characteristics of the ARINC653 system into account, and the scheduling analyzer MAST-2 is used to implement the hierarchical schedule analysis. Lastly, we design a prototype tool, called Configuration Checker for ARINC653 (CC653); two case studies show that the methods proposed in this paper are feasible and efficient.

  12. A method of knowledge base verification and validation for nuclear power plants expert systems

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated, and a large number of expert systems have been developed as a result. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. It is therefore widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation (V and V) is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system V and V that follow traditional software and system approaches. However, many of these approaches and methods are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, that provides a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct.

  13. Nondestructive verification and assay systems for spent fuels

    Cobb, D.D.; Phillips, J.R.; Bosler, G.E.; Eccleston, G.W.; Halbig, J.K.; Hatcher, C.R.; Hsue, S.T.

    1982-04-01

    This is an interim report of a study concerning the potential application of nondestructive measurements on irradiated light-water-reactor (LWR) fuels at spent-fuel storage facilities. It describes nondestructive measurement techniques and instruments that can provide useful data for more effective in-plant nuclear materials management, better safeguards and criticality safety, and more efficient storage of spent LWR fuel. In particular, several nondestructive measurement devices are already available so that utilities can implement new fuel-management and storage technologies for better use of existing spent-fuel storage capacity. The design of an engineered prototype in-plant spent-fuel measurement system is approx. 80% complete. This system would support improved spent-fuel storage and also efficient fissile recovery if spent-fuel reprocessing becomes a reality

  14. Simulated coal gas MCFC power plant system verification. Final report

    NONE

    1998-07-30

    The objective of the main project is to identify the current developmental status of MCFC systems and address those technical issues that need to be resolved to move the technology from its current status to the demonstration stage in the shortest possible time. The specific objectives are separated into five major tasks as follows: stack research; power plant development; test facilities development; manufacturing facilities development; and commercialization. This Final Report discusses the M-C Power Corporation effort, which is part of a general program for the development of commercial MCFC systems, and covers the entire subject of the Unocal 250-cell stack. Certain project activities have been funded by organizations other than DOE and are included in this report to provide a comprehensive overview of the work accomplished.

  15. Parameterized Verification of Graph Transformation Systems with Whole Neighbourhood Operations

    Delzanno, Giorgio; Stückrath, Jan

    2014-01-01

    We introduce a new class of graph transformation systems in which rewrite rules can be guarded by universally quantified conditions on the neighbourhood of nodes. These conditions are defined via special graph patterns which may be transformed by the rule as well. For the new class of graph rewrite rules, we provide a symbolic procedure working on minimal representations of upward-closed sets of configurations. We prove correctness and effectiveness of the procedure by a categorical presenta...

  16. Fault diagnosis for discrete event systems: Modelling and verification

    Simeu-Abazi, Zineb; Di Mascolo, Maria; Knotek, Michal

    2010-01-01

    This paper proposes an effective way to diagnose discrete-event systems using a timed automaton. It is based on the model-checking technique, exploiting time analysis of the timed model. The paper proposes a method to construct all the timed models and details the different steps used to obtain the diagnosis path. A dynamic model with temporal transitions is proposed in order to model the system; by 'dynamic model' we mean an extension of timed automata in which the faulty states are identified. The model of the studied system contains the fault-free functioning states and all the faulty states. Our method is based on backward exploitation of the dynamic model, in which all possible reverse paths are searched; a reverse path connects a faulty state to the initial state. The diagnosis method is based on the coherence between the fault occurrence time and the reverse path length. A real-world batch process is used to demonstrate the modelling steps and the proposed backward time analysis method to reach the diagnosis results.

  17. Behavioural Verification: Preventing Report Fraud in Decentralized Advert Distribution Systems

    Stylianos S. Mamais

    2017-11-01

    Service commissions claimed by Ad-Networks and Publishers are susceptible to forgery, as non-human operators can artificially create fictitious traffic on digital platforms for the purpose of committing financial fraud. This places a significant strain on Advertisers, who have no effective means of differentiating fabricated Ad-Reports from those which correspond to real consumer activity. To address this problem, we contribute an advert reporting system which utilizes opportunistic networking and a blockchain-inspired construction in order to identify authentic Ad-Reports by determining whether they were composed by honest or dishonest users. What constitutes a user’s honesty for our system is the manner in which they access adverts on their mobile device. Dishonest users submit multiple reports over a short period of time, while honest users behave as consumers who view adverts at a balanced pace while engaging in typical social activities such as purchasing goods online, moving through space, and interacting with other users. We argue that it is hard for dishonest users to fake honest behaviour, and we exploit the behavioural patterns of users in order to classify Ad-Reports as real or fabricated. By determining the honesty of the user who submitted a particular report, our system offers a more secure reward-claiming model which protects against fraud while still preserving the user’s anonymity.

  18. The SAMS: Smartphone Addiction Management System and verification.

    Lee, Heyoung; Ahn, Heejune; Choi, Samwook; Choi, Wanbok

    2014-01-01

    While the popularity of smartphones has brought enormous convenience to our lives, their pathological use has created a new mental health concern in the community, and intensive research is being conducted on the etiology and treatment of the condition. However, the traditional clinical approach, based on surveys and interviews, has serious limitations: health professionals cannot perform continual assessment and intervention for the affected group, and the subjectivity of the assessment is questionable. To cope with these limitations, a comprehensive ICT (Information and Communications Technology) system called SAMS (Smartphone Addiction Management System) was developed for objective assessment and intervention. The SAMS system consists of an Android smartphone application and a web application server. The SAMS client monitors the user's application usage together with GPS location and Internet access location, and transmits the data to the SAMS server. The SAMS server stores the usage data and performs key statistical data analysis and usage intervention according to the clinicians' decisions. To verify the reliability and efficacy of the developed system, a comparison study with survey-based screening using the K-SAS (Korean Smartphone Addiction Scale), as well as self-field trials, was performed. The comparison study used usage data from 14 users, adults aged 19 to 50, who left at least 1 week of usage logs and completed the survey questionnaires. The field trial fully verified the accuracy of the time, location, and Internet access information in the usage measurement and the reliability of the system operation over more than 2 weeks. The comparison study showed that daily use count has a strong correlation with K-SAS scores, whereas daily use times do not strongly correlate for potentially addicted users. The correlation coefficients of count and times with total K-SAS score are CC = 0.62 and CC = 0.07, respectively, and the t-test analysis for the...
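
The correlation coefficients reported above are plain Pearson coefficients between a usage statistic and the questionnaire score. A self-contained sketch (the data below are hypothetical, not the study's):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# daily app-launch counts vs total K-SAS score for five hypothetical users
counts = [40, 85, 120, 60, 150]
scores = [30, 44, 52, 35, 58]
r = pearson_r(counts, scores)
```

A value near 1 indicates a strong positive linear relationship, as reported for use counts (CC = 0.62), while a value near 0 indicates almost none, as for use times (CC = 0.07).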

  19. Verification of failover effects from distributed control system communication networks in digitalized nuclear power plants

    Min, Moon Gi; Lee, Jae Ki; Lee, Kwang Hyun; Lee, Dong Il; Lim, Hee Taek [Korea Hydro and Nuclear Power Co., Ltd, Daejeon (Korea, Republic of)

    2017-08-15

    Distributed Control System (DCS) communication networks, which use Fast Ethernet with redundant networks for the transmission of information, have been installed in digitalized nuclear power plants. Normally, failover tests are performed to verify the reliability of the redundant networks during the design and manufacturing phases; however, systematic integrity tests of DCS networks cannot be fully performed in these phases because not all of the relevant equipment has been installed yet. In addition, practical verification tests are insufficient, and there is a need to test the actual failover function of DCS redundant networks in the target environment. The purpose of this study is to verify that the failover function works correctly under certain abnormal conditions during the installation and commissioning phase, and to identify the influence of network failover on the entire DCS. To quantify the effects of network failover in the DCS, the packets (Protocol Data Units) must be collected, and the resource usage of the system has to be monitored and analyzed. This study introduces a new methodology for the verification of DCS network failover during the installation and commissioning phases. It is expected to provide insight into the verification methodology and the failover effects of DCS redundant networks. It also provides test results of network performance under DCS network failover in digitalized domestic nuclear power plants (NPPs)

  20. Towards the Verification of Safety-critical Autonomous Systems in Dynamic Environments

    Adina Aniculaesei

    2016-12-01

    There is an increasing necessity to deploy autonomous systems in highly heterogeneous, dynamic environments, e.g. service robots in hospitals or autonomous cars on highways. Due to the uncertainty in these environments, the verification results obtained with respect to the system and environment models at design time might not be transferable to the system behavior at run time. For autonomous systems operating in dynamic environments, safety of motion and collision avoidance are critical requirements. With regard to these requirements, Macek et al. [6] define the passive safety property, which requires that no collision can occur while the autonomous system is moving. To verify this property, we adopt a two-phase process which combines static verification methods, used at design time, with dynamic ones, used at run time. In the design phase, we exploit UPPAAL to formalize the autonomous system and its environment as timed automata, and the safety property as a TCTL formula, and to verify the correctness of these models with respect to this property. For the run-time phase, we build a monitor to check whether the assumptions made at design time also hold at run time. If the current system observations of the environment do not correspond to the initial system assumptions, the monitor sends feedback to the system and the system enters a passive safe state.
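
The run-time monitoring idea can be sketched as a predicate check of a design-time assumption. The class name, threshold, and condition below are illustrative (not the authors' implementation): motion is only allowed while every observed obstacle is beyond a safe distance, and a stopped system is passively safe by definition.

```python
class PassiveSafetyMonitor:
    """Run-time check of the passive safety assumption: no collision can
    occur while the system is moving."""

    def __init__(self, safe_distance_m=0.5):
        self.safe_distance_m = safe_distance_m  # design-time assumption

    def assumptions_hold(self, obstacle_distances_m, speed_mps):
        if speed_mps == 0.0:
            return True  # a stopped system is passively safe
        return min(obstacle_distances_m) > self.safe_distance_m

# Control loop: if assumptions_hold(...) returns False, the monitor's
# feedback drives the system into its passive safe state (e.g. a stop).
```

This mirrors the two-phase scheme above: the threshold is justified by the design-time UPPAAL verification, and the monitor only checks that the run-time observations stay inside the verified envelope.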

  1. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314, Tank Farm Restoration and Safe Operations

    MCGREW, D.L.

    1999-01-01

    This Requirements Verification Report (RVR) for the Project W-314 "AN Farm to 200E Waste Transfer System" package provides documented verification of design compliance with all of the applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate

  2. Burnup verification tests with the FORK measurement system-implementation for burnup credit

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. It was designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program and is well suited to verify burnup and cooling time records at commercial Pressurized Water Reactor (PWR) sites. This report deals with the application of the FORK system to burnup credit operations
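
The verification principle behind such a system can be sketched in a few lines: passive neutron emission from spent PWR fuel rises steeply with burnup, so a power-law calibration fitted to assemblies with trusted records can flag a declared burnup that is inconsistent with the measured count rate. The constants, the exponent, and the 10% tolerance below are invented for the sketch, not FORK calibration data.

```python
A, K = 2.0e-3, 4.0   # hypothetical calibration: counts = A * burnup**K

def burnup_from_counts(counts):
    """Invert the calibration to infer burnup (GWd/tU) from a count rate."""
    return (counts / A) ** (1.0 / K)

def verify_declaration(declared_burnup, measured_counts, tol=0.10):
    """Accept the operator's declared burnup if it matches the measurement."""
    inferred = burnup_from_counts(measured_counts)
    return abs(inferred - declared_burnup) / declared_burnup <= tol

counts = A * 35.0 ** K                    # simulate an assembly at 35 GWd/tU
print(verify_declaration(35.0, counts))   # consistent record
print(verify_declaration(45.0, counts))   # overstated burnup is rejected
```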

  3. Robust control design verification using the modular modeling system

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant, i.e., that they are robust. The two areas in which the MMS is being used for this purpose are (1) the design of a reactor power controller with improved reactor temperature response, and (2) the design of a multiple-input, multiple-output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem

  4. ALGORITHM VERIFICATION FOR A TLD PERSONAL DOSIMETRY SYSTEM

    SHAHEIN, A.; SOLIMAN, H.A.; MAGHRABY, A.

    2008-01-01

Dose algorithms are used in thermoluminescence personnel dosimetry to interpret the dosimeter response in terms of equivalent dose. In the present study, an automated Harshaw 6600 reader was rigorously tested prior to its use with the dose calculation algorithm, according to the standard established by the US Department of Energy Laboratory Accreditation Program (DOELAP). A manual Harshaw 4500 reader was also used, along with the ICRU slab phantom and the RANDO phantom, to experimentally determine photon personal doses in terms of deep dose, Hp(10); shallow dose, Hp(0.07); and eye-lens dose, Hp(3). In addition, a free Monte Carlo simulation code (VMC-dc) was used to simulate the RANDO phantom irradiation process. The accuracy of the automated system lies well within DOELAP tolerance limits in all test categories

  5. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) that enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
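
A minimal sketch of the cumulative signal check mentioned in the abstract (not the authors' implementation): predicted and measured EPID frame totals are accumulated during delivery and the running relative difference is tested against a tolerance, so a gross MU error trips the check mid-delivery. The frame values and the 5% tolerance are invented.

```python
def cumulative_check(predicted_frames, measured_frames, tol=0.05):
    """Yield (frame_index, relative_difference, ok) for each frame pair."""
    pred_total = meas_total = 0.0
    for i, (p, m) in enumerate(zip(predicted_frames, measured_frames)):
        pred_total += sum(p)   # accumulate predicted signal so far
        meas_total += sum(m)   # accumulate measured signal so far
        rel = abs(meas_total - pred_total) / pred_total
        yield i, rel, rel <= tol

pred = [[100.0, 100.0]] * 5
meas = [[100.0, 100.0]] * 3 + [[120.0, 120.0]] * 2   # ~20% overdose in late frames
for i, rel, ok in cumulative_check(pred, meas):
    print(i, round(rel, 3), ok)
```

Because the comparison is cumulative, the error only surfaces once enough excess signal has accumulated, which mirrors the reported detection delay of roughly 10 s for a 10% MU error.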

  6. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) that enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  7. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Fuangrod, Todsaporn; Woodruff, Henry C.; O’Connor, Daryl J.; Uytven, Eric van; McCurdy, Boyd M. C.; Kuncic, Zdenka; Greer, Peter B.

    2013-01-01

Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) that enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy

  8. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process using Colored Petri Nets to verify the TTDMS's functional safety and to evaluate its performance. The paper makes three main contributions. Firstly, it proposes a formalized TTDMS model, whose correctness is validated using state space analysis and simulation-based verification. Secondly, corresponding checking queries are proposed for the purpose of functional safety verification, and the TTDMS performance is evaluated by applying parameters in the formal model. Thirdly, the reliability of a functional prototype TTDMS is estimated. It is found that the procedure can accompany the system development, and both formal and simulation-based verifications are performed. Compared to executable code and purely mathematical methods, our process makes evaluating and verifying a system easier to follow and more reliable. - Highlights: • A new Train to Train Distance Measurement System. • A new approach verifying system functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance applying simulation-based analysis.

  9. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M.

    1995-03-01

This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project, which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage, after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak, sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task

  10. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  11. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. This practice is not intended to be complete purchase specifications for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...

  12. Fuzzy Controllers for a Gantry Crane System with Experimental Verifications

    Naif B. Almutairi

    2016-01-01

The control problem of gantry cranes has attracted the attention of many researchers because of the various applications of these cranes in industry. In this paper we propose two fuzzy controllers to control the position of the cart of a gantry crane while suppressing the swing angle of the payload. Firstly, we propose a dual PD fuzzy controller in which the parameters of each PD controller change as the cart moves toward its desired position, while maintaining a small swing angle of the payload. This controller uses two fuzzy subsystems. Then, we propose a fuzzy controller based on heuristics, whose rules are obtained from the knowledge of an experienced crane operator. This controller is unique in that it uses only one fuzzy system to achieve the control objective. The validity of the designed controllers is tested through extensive MATLAB simulations as well as experimental results on a laboratory gantry crane apparatus. The simulation results as well as the experimental results indicate that the proposed fuzzy controllers work well. Moreover, the simulation and the experimental results demonstrate the robustness of the proposed control schemes against output disturbances as well as against uncertainty in some of the parameters of the crane.
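
The single-fuzzy-system idea can be illustrated with a minimal sketch (not the paper's controller): triangular memberships on position error and swing angle feed a handful of heuristic rules whose weighted outputs set the cart force. The membership ranges, rule weights, and force values are all invented for the sketch.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_force(pos_error, swing):
    """Heuristic fuzzy controller: one fuzzy system for cart position and swing."""
    err_pos = tri(pos_error, 0.0, 1.0, 2.0)      # cart short of target
    err_neg = tri(pos_error, -2.0, -1.0, 0.0)    # cart past target
    swing_big = tri(abs(swing), 0.1, 0.3, 0.5)   # payload swinging hard
    rules = [
        (err_pos, +10.0),   # push toward the target
        (err_neg, -10.0),   # push back toward the target
        (swing_big, 0.0),   # big swing -> favor zero force to damp it
    ]
    # weighted-average (centroid-style) defuzzification
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(round(fuzzy_force(1.0, 0.0), 2))   # full push toward the target
print(round(fuzzy_force(1.0, 0.3), 2))   # swing rule halves the command
```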

  13. Design, analysis, and test verification of advanced encapsulation systems

    Garcia, A.; Minning, C.

    1981-01-01

Thermal, optical, structural, and electrical isolation analyses are described, and major factors in the design of terrestrial photovoltaic modules are discussed. It was found that mechanical defects in the different layers of an encapsulation system strongly influence the minimum pottant thickness required for electrical isolation. A literature survey indicated that structural, optical, and electrical properties are heavily influenced by the presence of moisture. These items, identified as technology voids, are discussed. Analyses were based upon a 1.2 meter square module using 10.2 cm (4-inch) square cells placed 1.3 mm apart as shown in Figure 2-2. Sizing of the structural support member of a module was determined for a uniform, normal pressure load of 50 psf, corresponding to the pressure difference generated between the front and back surfaces of a module by a 100 mph wind. Thermal and optical calculations were performed for a wind velocity of 1 meter/sec parallel to the ground and for a module tilt (relative to the local horizontal) of 37 deg. Placement of a module in a typical array field is illustrated.

  14. Design, analysis, and test verification of advanced encapsulation systems

    Garcia, A.; Minning, C.

    1981-11-01

Thermal, optical, structural, and electrical isolation analyses are described, and major factors in the design of terrestrial photovoltaic modules are discussed. It was found that mechanical defects in the different layers of an encapsulation system strongly influence the minimum pottant thickness required for electrical isolation. A literature survey indicated that structural, optical, and electrical properties are heavily influenced by the presence of moisture. These items, identified as technology voids, are discussed. Analyses were based upon a 1.2 meter square module using 10.2 cm (4-inch) square cells placed 1.3 mm apart as shown in Figure 2-2. Sizing of the structural support member of a module was determined for a uniform, normal pressure load of 50 psf, corresponding to the pressure difference generated between the front and back surfaces of a module by a 100 mph wind. Thermal and optical calculations were performed for a wind velocity of 1 meter/sec parallel to the ground and for a module tilt (relative to the local horizontal) of 37 deg. Placement of a module in a typical array field is illustrated.

  15. Nondestructive verification and assay systems for spent fuels. Technical appendixes

    Cobb, D.D.; Phillips, J.R.; Baker, M.P.

    1982-04-01

    Six technical appendixes are presented that provide important supporting technical information for the study of the application of nondestructive measurements to spent-fuel storage. Each appendix addresses a particular technical subject in a reasonably self-contained fashion. Appendix A is a comparison of spent-fuel data predicted by reactor operators with measured data from reprocessors. This comparison indicates a rather high level of uncertainty in previous burnup calculations. Appendix B describes a series of nondestructive measurements at the GE-Morris Operation Spent-Fuel Storage Facility. This series of experiments successfully demonstrated a technique for reproducible positioning of fuel assemblies for nondestructive measurement. The experimental results indicate the importance of measuring the axial and angular burnup profiles of irradiated fuel assemblies for quantitative determination of spent-fuel parameters. Appendix C is a reasonably comprehensive bibliography of reports and symposia papers on spent-fuel nondestructive measurements to April 1981. Appendix D is a compendium of spent-fuel calculations that includes isotope production and depletion calculations using the EPRI-CINDER code, calculations of neutron and gamma-ray source terms, and correlations of these sources with burnup and plutonium content. Appendix E describes the pulsed-neutron technique and its potential application to spent-fuel measurements. Although not yet developed, the technique holds the promise of providing separate measurements of the uranium and plutonium fissile isotopes. Appendix F describes the experimental program and facilities at Los Alamos for the development of spent-fuel nondestructive measurement systems. Measurements are reported showing that the active neutron method is sensitive to the replacement of a single fuel rod with a dummy rod in an unirradiated uranium fuel assembly

  16. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Leonard Petnga

    2016-06-01

Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from systems thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  17. A system for deduction-based formal verification of workflow-oriented software models

    Klimek Radosław

    2014-12-01

The work concerns formal verification of workflow-oriented software models using the deductive approach; the formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages over traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives that enable the transformation of models into the temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
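
The pattern-based generation idea can be sketched as follows. This is a hypothetical illustration: each workflow pattern is treated as a logical primitive with a temporal-logic template, and a model is translated into a specification by instantiating the template for every pattern occurrence. The pattern names, LTL templates, and activity names are illustrative, not the paper's exact ones.

```python
LTL_TEMPLATES = {
    # sequence: whenever activity a completes, b eventually follows
    "Sequence": "G({a} -> F({b}))",
    # exclusive choice: after the gateway, exactly one branch is taken
    "ExclusiveChoice": "G({a} -> F(({b} & !{c}) | (!{b} & {c})))",
}

def generate_spec(patterns):
    """patterns: list of (pattern_name, {placeholder: activity}) pairs.
    Returns one temporal-logic formula per pattern occurrence."""
    return [LTL_TEMPLATES[name].format(**acts) for name, acts in patterns]

model = [
    ("Sequence", {"a": "ReceiveOrder", "b": "CheckStock"}),
    ("ExclusiveChoice", {"a": "CheckStock", "b": "Ship", "c": "Reject"}),
]
for formula in generate_spec(model):
    print(formula)
```

The resulting set of formulas is the logical specification that the semantic tableaux prover would then reason over.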

  18. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    Saotome, Naoya, E-mail: naosao@nirs.go.jp; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji [Department of Research Center for Charged Particle Therapy, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan)

    2016-04-15

Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors' facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system is connected to and communicates with the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each beam energy was determined by a difference-of-Gaussians (DOG) method and the 80% distal-dose point of the depth-dose distribution measured by a large parallel-plate ionization chamber. The authors compared a threshold method and the DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility of the same energy measurement is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors' range check system is capable of quick and easy range verification with sufficient accuracy.
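
The difference-of-Gaussians edge pick can be illustrated on a 1-D depth-light profile: the profile is smoothed with two Gaussian kernels of different width, the narrow-minus-wide difference is formed, and the distal edge is taken at its most negative value. The synthetic profile and sigma values below are invented for the sketch, not the authors' parameters.

```python
import math

def gaussian_smooth(signal, sigma):
    """Smooth a 1-D signal with a normalized Gaussian kernel (edges clamped)."""
    half = int(3 * sigma)
    kernel = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-half, half + 1)]
    norm = sum(kernel)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - half, 0), len(signal) - 1)  # clamp at borders
            acc += w * signal[j]
        out.append(acc / norm)
    return out

def dog_edge(signal, sigma_narrow=1.0, sigma_wide=3.0):
    """Index of the distal edge: extremum of the narrow-minus-wide difference."""
    dog = [a - b for a, b in zip(gaussian_smooth(signal, sigma_narrow),
                                 gaussian_smooth(signal, sigma_wide))]
    # the distal fall-off shows up as the most negative DOG value
    return min(range(len(dog)), key=dog.__getitem__)

# synthetic light profile: plateau, Bragg-peak-like rise, then a sharp fall-off
profile = [1.0] * 40 + [1.5, 2.5, 4.0, 2.0, 0.3] + [0.05] * 15
print(dog_edge(profile))   # index near the fall-off to background
```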

  19. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    Saotome, Naoya; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji

    2016-01-01

Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors' facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system is connected to and communicates with the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each beam energy was determined by a difference-of-Gaussians (DOG) method and the 80% distal-dose point of the depth-dose distribution measured by a large parallel-plate ionization chamber. The authors compared a threshold method and the DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility of the same energy measurement is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors' range check system is capable of quick and easy range verification with sufficient accuracy.

  20. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost of electronic systems. Many studies have shown that up to 70% of design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences, both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion-Based Verification Methodology for HDL designs, taking as case studies the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)
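
The assertion-based verification idea can be sketched in Python rather than an HDL: a temporal property ("every request is acknowledged within 3 cycles") is checked against a simulated signal trace, analogous to how an SVA or PSL assertion monitors an HDL simulation. The signal names, latency bound, and traces are invented for the sketch.

```python
def check_req_ack(trace, max_latency=3):
    """trace: list of dicts with boolean 'req' and 'ack' per clock cycle.
    Returns the cycle indices where the assertion fails."""
    violations = []
    for i, cycle in enumerate(trace):
        if cycle["req"]:
            # assertion: an 'ack' must appear within max_latency later cycles
            window = trace[i + 1 : i + 1 + max_latency]
            if not any(c["ack"] for c in window):
                violations.append(i)
    return violations

good = [{"req": True, "ack": False}, {"req": False, "ack": True}]
bad = [{"req": True, "ack": False}] + [{"req": False, "ack": False}] * 3
print(check_req_ack(good))  # no violations
print(check_req_ack(bad))   # request at cycle 0 is never acknowledged
```

In an HDL flow the same property would be written once as an assertion and checked both in simulation and, where tools allow, by formal property checking.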

  1. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  2. VerifEYE: a real-time meat inspection system for the beef processing industry

    Kocak, Donna M.; Caimi, Frank M.; Flick, Rick L.; Elharti, Abdelmoula

    2003-02-01

    Described is a real-time meat inspection system developed for the beef processing industry by eMerge Interactive. Designed to detect and localize trace amounts of contamination on cattle carcasses in the packing process, the system affords the beef industry an accurate, high-speed, passive optical method of inspection. Using a method patented by the United States Department of Agriculture and Iowa State University, the system takes advantage of fluorescing chlorophyll found in the animal's diet and therefore the digestive tract to allow detection and imaging of contaminated areas that may harbor potentially dangerous microbial pathogens. Featuring real-time image processing and documentation of performance, the system can be easily integrated into a processing facility's Hazard Analysis and Critical Control Point quality assurance program. This paper describes the VerifEYE carcass inspection and removal verification system. Results indicating the feasibility of the method, as well as field data collected using a prototype system during four university trials conducted in 2001, are presented. Two successful demonstrations using the prototype system were held at a major U.S. meat processing facility in early 2002.

  3. Preparation of a program for the independent verification of the brachytherapy planning systems calculations

    Carmona, V.; Perez-Calatayud, J.; Lliso, F.; Richart Sancho, J.; Ballester, F.; Pujades-Claumarchirant, M.C.; Munoz, M.

    2010-01-01

    In this work a program is presented that independently checks, for each patient, the treatment planning system calculations in low dose rate, high dose rate and pulsed dose rate brachytherapy. The treatment planning system output text files are loaded automatically into this program in order to obtain the source coordinates, the desired calculation point coordinates and, where applicable, the dwell times. The source strength and the reference dates are entered by the user. The program allows the recommendations on independent verification of clinical brachytherapy dosimetry to be implemented in a simple and accurate way, in a few minutes. (Author).

  4. Verification and validation as an integral part of the development of digital systems for nuclear applications

    Straker, E.A.; Thomas, N.C.

    1983-01-01

    The nuclear industry's current attitude toward verification and validation (V and V) has been shaped by the experience gained to date. On the basis of this experience, V and V can be applied effectively as an integral part of digital system development for nuclear electric power applications. An overview of a typical approach for integrating V and V with system development is presented. This approach represents a balance between V and V as applied in the aerospace industry and the standard practice commonly applied within the nuclear industry today

  5. Research on database realization technology of seismic information system in CTBT verification

    Zheng Xuefeng; Shen Junyi; Zhang Huimin; Jing Ping; Sun Peng; Zheng Jiangling

    2005-01-01

    Developing CTBT verification technology has become the most important means of ensuring that the CTBT is faithfully implemented. Seismic analysis based on a seismic information system (SIS) plays an important role in this field. Built on GIS, the SIS is powerful in spatial analysis, topological analysis and visualization. However, the performance of the whole system ultimately depends on the performance of the SIS database. Based on the ArcSDE Geodatabase data model, seamless integrated management of spatial and attribute data has been realized in the ORACLE RDBMS, while most native ORACLE functions are retained. (authors)

  6. Groebner Bases Based Verification Solution for SystemVerilog Concurrent Assertions

    Ning Zhou

    2014-01-01

    This work applies polynomial ring algebra to perform SystemVerilog assertion verification over digital circuit systems. The method is based on Groebner bases theory and sequential property checking. We define a constrained subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on the algebraic representations using Groebner bases for concurrent SVA checking. Case studies show that computer algebra can provide canonical symbolic representations for both assertions and circuit designs and can act as a novel solver engine from the viewpoint of symbolic computation.
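
    The underlying idea can be sketched without a computer-algebra engine. Circuits and assertions are modelled as polynomials over GF(2); an assertion holds exactly when its polynomial vanishes on every common zero of the circuit polynomials. The paper decides this by Groebner-basis reduction; the toy below (gates and assertion invented for illustration) substitutes a brute-force enumeration of GF(2) assignments, which gives the same answer for a small Boolean system:

```python
# GF(2) polynomial model of a toy circuit, checked by exhaustive evaluation
# (a stand-in for the Groebner-basis ideal-membership test in the paper).
from itertools import product

# circuit: y = a XOR b, z = y AND a  ->  constraint polynomials == 0 (mod 2)
circuit = [
    lambda v: (v["y"] + v["a"] + v["b"]) % 2,   # y + a + b = 0 (XOR gate)
    lambda v: (v["z"] + v["y"] * v["a"]) % 2,   # z + y*a = 0 (AND gate)
]
# assertion: z = a AND (NOT b), i.e. z + a*(1+b) == 0 (mod 2)
assertion = lambda v: (v["z"] + v["a"] * (1 + v["b"])) % 2

names = ["a", "b", "y", "z"]
holds = all(
    assertion(v) == 0
    for bits in product((0, 1), repeat=len(names))
    for v in [dict(zip(names, bits))]
    if all(c(v) == 0 for c in circuit)        # points on the circuit's variety
)
print(holds)  # → True
```

    A Groebner-basis solver reaches the same verdict symbolically, by reducing the assertion polynomial modulo a basis of the circuit ideal, which scales far beyond what enumeration allows.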

  7. Inspector measurement verification activities

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluating the measurement system and determining systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
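
    The reference-laboratory remeasurement step above amounts to estimating the bias of the facility system from paired measurements. A minimal sketch, with illustrative numbers only (none come from the record):

```python
# Estimate systematic error (bias) of a facility measurement system from
# paired facility vs. reference-laboratory remeasurements of the same items.
import statistics

facility  = [10.12, 9.98, 10.05, 10.21, 9.95]   # kg, facility system
reference = [10.00, 9.95, 10.00, 10.10, 9.93]   # kg, reference laboratory

diffs = [f - r for f, r in zip(facility, reference)]
bias = statistics.mean(diffs)                    # estimated systematic error
sem = statistics.stdev(diffs) / len(diffs) ** 0.5  # standard error of the bias
print(f"bias = {bias:+.3f} kg, std. error = {sem:.3f} kg")
```

    A bias large compared with its standard error would point to a systematic error in the facility system and weigh against accepting the inventory on its measurements alone.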

  8. Method Verification Requirements for an Advanced Imaging System for Microbial Plate Count Enumeration.

    Jones, David; Cundell, Tony

    2018-01-01

    The Growth Direct™ System, which automates the incubation and reading of membrane-filtration microbial counts on soybean-casein digest, Sabouraud dextrose, and R2A agar, differs from the traditional method only in that micro-colonies on the membrane are counted with an advanced imaging system up to 50% earlier in the incubation. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification rather than a full method validation. LAY ABSTRACT: The Growth Direct™ System, which automates the incubation and reading of microbial counts on membranes on solid agar, differs from the traditional method only in that micro-colonies on the membrane are counted with an advanced imaging system up to 50% earlier in the incubation time. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification rather than a full method validation. © PDA, Inc. 2018.

  9. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    Singh, G.P.; Cadena, D.; Burgess, J.

    1992-01-01

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  10. Design Verification Enhancement of FPGA-based Plant Protection System Trip Logics for Nuclear Power Plant

    Ahmed, Ibrahim; Jung, Jae Cheon; Heo, Gyun Young

    2016-01-01

    As part of strengthening the application of FPGA technology and finding solutions to its challenges in NPPs, the International Atomic Energy Agency (IAEA) has joined the sponsorship of the Topical Group on FPGA Applications in NPPs (TG-FAN), which has met seven times to date in the form of an annual international workshop on the application of FPGAs in NPPs, held since 2008. The workshops attract significant interest and a broad representation of stakeholders such as regulators, utilities, research organizations, system designers, and vendors from various countries, who converge to discuss current issues regarding instrumentation and control (I and C) systems as well as FPGA applications. Two of the technical issues identified by the group are the lifecycle of FPGA-based platforms, systems, and applications, and methods and tools for V and V. Therefore, in this work, several design steps involving a model-based systems engineering process and a MATLAB/SIMULINK model are employed, leading to enhanced design verification. The verified and validated design output works correctly and effectively. In conclusion, the model-based systems engineering approach and the structured step-by-step design modeling techniques, including the SIMULINK model, utilized in this work have shown how FPGA PPS trip logic design verification can be enhanced. If these design approaches are employed in the design of FPGA-based I and C systems, the design can be easily verified and validated
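
    The kind of trip-logic verification the SIMULINK models support can be sketched in a few lines. The two-out-of-four voting arrangement below is an assumption (a common PPS coincidence scheme, not stated in the record), and the relay-style realization is invented; the point is that a small trip logic can be checked exhaustively against its specification:

```python
# Exhaustive verification of a hypothetical 2-out-of-4 PPS coincidence trip logic.
from itertools import product

def trip_2oo4(ch):
    """Relay-style realization: trip when any pair of channels demands it."""
    c1, c2, c3, c4 = ch
    return bool((c1 and c2) or (c1 and c3) or (c1 and c4) or
                (c2 and c3) or (c2 and c4) or (c3 and c4))

# specification: trip iff at least 2 of the 4 bistable channels demand a trip
for ch in product((0, 1), repeat=4):
    assert trip_2oo4(ch) == (sum(ch) >= 2)

# single-failure behaviour: one stuck-high channel alone must not cause a
# spurious trip, and one stuck-low channel must not block a genuine trip
assert not trip_2oo4((1, 0, 0, 0))
assert trip_2oo4((0, 1, 1, 0))
print("2oo4 trip logic verified")
```

    For an FPGA implementation the same exhaustive check would be run against the synthesized netlist or an HDL simulation rather than a Python model, but the verification obligation is identical.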

  11. A standardized approach to verification and validation to assist in expert system development

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. To aid in the development of this new system, a standardized Verification and Validation (V and V) approach is being implemented. The primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V and V phases from concept to operation and maintenance. Each phase has specific V and V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V and V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle

  12. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristic coefficients is difficult to guarantee because of the errors of the measurement system itself. A novel dynamic calibration method for the measurement system of dynamic characteristic coefficients is proposed in this paper to eliminate the errors of the measurement system itself. Unlike calibration against a suspended mass, in this method the verification device is a spring-mass system, which can simulate the dynamic characteristics of a sliding bearing. The verification device was built, and the calibration experiment was carried out over a wide frequency range, with the bearing stiffness simulated by disc springs. The experimental results show that the amplitude errors of this measurement system are small in the frequency range of 10 Hz–100 Hz, and the phase errors increase with frequency. A simulated dynamic-characteristic identification experiment in the frequency range of 10 Hz–30 Hz preliminarily verified that the calibration data in this range can support dynamic characteristic tests of sliding bearings. Bearing experiments over greater frequency ranges require higher manufacturing and installation precision of the calibration device, and the calibration experiment procedures should be improved. PMID:27483283
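
    What makes a spring-mass verification device useful is that its theoretical response is known in closed form, so measured amplitude and phase can be compared against it. A hedged sketch (the mass and stiffness values are invented, not taken from the paper) of the undamped receptance H(f) = 1/(k - m(2πf)²) such a device provides as a reference:

```python
# Theoretical receptance of a hypothetical spring-mass verification device.
import math

m = 2.0       # kg, moving mass (illustrative value)
k = 1.0e6     # N/m, disc-spring stiffness simulating bearing stiffness

def receptance(f_hz):
    """Displacement per unit force, m/N, for an undamped spring-mass system."""
    w = 2 * math.pi * f_hz
    return 1.0 / (k - m * w * w)

for f in (10, 30, 100):
    print(f"{f:>3} Hz: H = {receptance(f):.3e} m/N")
```

    The measurement chain's amplitude error at each frequency is then the ratio of the measured response to this reference; with these values the natural frequency sits near 113 Hz, which is one way a device can be sized so that the 10–100 Hz calibration band stays below resonance.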

  13. Verification and testing of the RTOS for safety-critical embedded systems

    Lee, Na Young [Seoul National University, Seoul (Korea, Republic of); Kim, Jin Hyun; Choi, Jin Young [Korea University, Seoul (Korea, Republic of); Sung, Ah Young; Choi, Byung Ju [Ewha Womans University, Seoul (Korea, Republic of); Lee, Jang Soo [KAERI, Taejon (Korea, Republic of)

    2003-07-01

    Developments in Instrumentation and Control (I and C) technology provide more convenience and better performance and are thus adopted in many fields. To adopt newly developed technology, the nuclear industry requires a rigorous V and V procedure and tests to assure reliable operation. Adoption of a digital system requires verification and testing of the OS for licensing. A commercial real-time operating system (RTOS) is targeted at various, unpredictable needs, which makes it difficult to verify. For this reason, a simple, application-oriented real-time OS was developed for nuclear application. In this work, we show how to verify the developed RTOS at each stage of the development lifecycle. A commercial formal tool is used in the specification and verification of the system. Based on the developed model, software in the C language is automatically generated. Tests are performed for two purposes: one is to identify consistency between the verified model and the generated code, the other is to find errors in the generated code. The former assumes that the verified model is correct, the latter that it is incorrect. Test data are generated separately to satisfy each purpose. After testing the RTOS software, we implemented a test board embedded with the developed RTOS and the application software, which simulates the safety-critical plant protection function. Testing to identify whether the reliability criteria are satisfied is also designed in this work. The results show that the developed RTOS software works well when embedded in the system.

  15. Orion GN&C Fault Management System Verification: Scope And Methodology

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.
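
    The Monte Carlo side of the methodology can be illustrated with a toy fault model. Everything below is invented (the detection/recovery time distributions and the deadline are placeholders, and plain sampling stands in for the team's rare-event sequential Monte Carlo), but it shows the shape of a verification-by-sampling estimate of failure probability:

```python
# Toy Monte Carlo estimate of the probability that a fault response misses
# its deadline; distributions, parameters and deadline are hypothetical.
import random

random.seed(1)

def mission_fails():
    detect = random.lognormvariate(0.0, 0.5)   # s, fault detection time
    recover = random.lognormvariate(1.0, 0.4)  # s, recovery action time
    return detect + recover > 12.0             # hypothetical FDIR deadline

N = 100_000
p_fail = sum(mission_fails() for _ in range(N)) / N
print(f"estimated failure probability ≈ {p_fail:.4f}")
```

    Plain sampling wastes most draws on uneventful cases, which is why the team uses rare-event techniques: they steer samples toward the failure region so that very small probabilities can be bounded with far fewer runs.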

  16. Experimental study on design verification of new concept for integral reactor safety system

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized light-water-cooled, medium-power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. The SMART design verification phase comprised various separate-effect tests and comprehensive integral-effect tests. The high-temperature/high-pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents) has been constructed by KAERI to simulate SMART-P (the one-fifth-scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the steam generator (SG) primary side, the secondary system, and the PRHRS. The heat transfer through the PRHRS heat exchanger and ECT is sufficient to enable natural circulation of the coolant

  17. Flexible prototype of modular multilevel converters for experimental verification of DC transmission and multiterminal systems

    Konstantinou, Georgios; Ceballos, Salvador; Gabiola, Igor

    2017-01-01

    Testing and verification of high-level and low-level control, modulation, fault handling and converter co-ordination for modular multilevel converters (MMCs) requires the development of experimental prototype converters. In this paper, we provide a complete overview of the MMC-based experimental prototype at UNSW Sydney (The University of New South Wales), including the structure of the sub-modules, communication, control and protection functions, as well as the possible configurations of the system. The prototype is rated at a dc voltage of up to 800 V and a power of 20 kVA and can be used to study

  18. Verification of Treatment Planning System (TPS) on Beam Axis of Co-60 Teletherapy

    Nunung-Nuraeni; Budhy-Kurniawan; Purwanto; Sugiyantari; Heru-Prasetio; Nasukha

    2001-01-01

    Cancer can now be treated using surgery, chemotherapy and radiotherapy. A high level of precision and accuracy in the radiation dose is therefore very important, and one necessary task is verification of the Treatment Planning System (TPS) used in patient treatment. This research verified the TPS on the beam axis of a Co-60 teletherapy unit. The differences found between TPS calculations and measurements are about -2.682% to 1.918% for simple geometry and homogeneous material, 5.278% to 4.990% for complex geometry, and -3.202% to -2.090% for more complex geometry. (author)

  19. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for ITER remote handling system. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for digital modelling phase. • Importance of the product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is based on the results of the project “Verification and validation (V and V) of ITER RH systems using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of ITER RH system designs within a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype and surveys current trends in the use of virtual prototyping in industry during the early design phase. Based on a survey of best industrial practices, this paper proposes ways to improve the V and V process for ITER RH systems utilizing DMUs.

  20. Considerations for control system software verification and validation specific to implementations using distributed processor architectures

    Munro, J.K. Jr.

    1993-01-01

    Until recently, digital control systems have been implemented on centralized processing systems to function in one of several ways: (1) as a single processor control system; (2) as a supervisor at the top of a hierarchical network of multiple processors; or (3) in a client-server mode. Each of these architectures uses a very different set of communication protocols. The latter two architectures also belong to the category of distributed control systems. Distributed control systems can have a central focus, as in the cases just cited, or be quite decentralized in a loosely coupled, shared responsibility arrangement. This last architecture is analogous to autonomous hosts on a local area network. Each of the architectures identified above will have a different set of architecture-associated issues to be addressed in the verification and validation activities during software development. This paper summarizes results of efforts to identify, describe, contrast, and compare these issues

  1. Hybrid Decompositional Verification for Discovering Failures in Adaptive Flight Control Systems

    Thompson, Sarah; Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    Adaptive flight control systems hold tremendous promise for maintaining the safety of a damaged aircraft and its passengers. However, most currently proposed adaptive control methodologies rely on online learning neural networks (OLNNs), which necessarily have the property that the controller is changing during the flight. These changes tend to be highly nonlinear, and difficult or impossible to analyze using standard techniques. In this paper, we approach the problem with a variant of compositional verification. The overall system is broken into components. Undesirable behavior is fed backwards through the system. Components which can be solved using formal methods techniques explicitly for the ranges of safe and unsafe input bounds are treated as white box components. The remaining black box components are analyzed with heuristic techniques that try to predict a range of component inputs that may lead to unsafe behavior. The composition of these component inputs throughout the system leads to overall system test vectors that may elucidate the undesirable behavior
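
    The white-box step, feeding an undesirable output backwards through a component whose behaviour is known exactly, can be sketched for the simplest case. The component, its coefficients and the unsafe bounds below are invented; a real controller component would be nonlinear and handled by formal tools, but the idea of computing a pre-image of an unsafe output interval is the same:

```python
# Minimal backward-propagation sketch: invert an unsafe output interval
# through a known linear component y = a*x + b (hypothetical values).
def backward_linear(a, b, y_lo, y_hi):
    """Pre-image of the output interval [y_lo, y_hi] under y = a*x + b, a != 0."""
    x1, x2 = (y_lo - b) / a, (y_hi - b) / a
    return (min(x1, x2), max(x1, x2))

# suppose commands in [0.8, 1.5] (scaled units) are unsafe downstream,
# and this component computes y = 2x - 1
lo, hi = backward_linear(2.0, -1.0, 0.8, 1.5)
print(lo, hi)  # → 0.9 1.25
```

    The resulting input interval becomes the "undesirable behavior" handed to the next component upstream; where that component is a black box (e.g. an online-learning network), heuristic search takes over, which is exactly the hybrid split the paper describes.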

  2. Multi-Mission System Architecture Platform: Design and Verification of the Remote Engineering Unit

    Sartori, John

    2005-01-01

    The Multi-Mission System Architecture Platform (MSAP) represents an effort to bolster efficiency in the spacecraft design process. By incorporating essential spacecraft functionality into a modular, expandable system, the MSAP provides a foundation on which future spacecraft missions can be developed. Once completed, the MSAP will provide support for missions with varying objectives, while maintaining a level of standardization that will minimize redesign of general system components. One subsystem of the MSAP, the Remote Engineering Unit (REU), functions by gathering engineering telemetry from strategic points on the spacecraft and providing these measurements to the spacecraft's Command and Data Handling (C&DH) subsystem. Before the MSAP Project reaches completion, all hardware, including the REU, must be verified. However, the speed and complexity of the REU circuitry rules out the possibility of physical prototyping. Instead, the MSAP hardware is designed and verified using the Verilog Hardware Description Language (HDL). An increasingly popular means of digital design, HDL programming provides a level of abstraction, which allows the designer to focus on functionality while logic synthesis tools take care of gate-level design and optimization. As verification of the REU proceeds, errors are quickly remedied, preventing costly changes during hardware validation. After undergoing the careful, iterative processes of verification and validation, the REU and MSAP will prove their readiness for use in a multitude of spacecraft missions.

  3. Formal specification and verification of interactive systems with plasticity: Applications to nuclear-plant supervision

    Oliveira, Raquel Araujo de

    2015-01-01

    The advent of ubiquitous computing and the increasing variety of platforms and devices change user expectations of user interfaces. Systems should be able to adapt themselves to their context of use, i.e., the platform (e.g. a PC or a tablet), the users who interact with the system (e.g. administrators or regular users), and the environment in which the system executes (e.g. a dark room or outdoors). The capacity of a UI to withstand variations in its context of use while preserving usability is called plasticity. Plasticity provides users with different versions of a UI. Although it enhances UI capabilities, plasticity adds complexity to the development of user interfaces: the consistency between multiple versions of a given UI should be ensured. Given the large number of possible versions of a UI, it is time-consuming and error-prone to check these requirements by hand; some automation must be provided to verify plasticity. This complexity is further increased when it comes to UIs of safety-critical systems. Safety-critical systems are systems in which a failure has severe consequences. The complexity of such systems is reflected in their UIs, which are now expected not only to provide correct, intuitive, non-ambiguous and adaptable means for users to accomplish a goal, but also to cope with safety requirements aiming to make sure that systems are reasonably safe before they enter the market. Several techniques to ensure the quality of systems in general exist and can also be applied to safety-critical systems; formal verification provides a rigorous way to perform verification that is suitable for safety-critical systems. Our contribution is an approach to verifying safety-critical interactive systems provided with plastic UIs using formal methods. With powerful tool support, our approach permits, using model checking, the verification of sets of properties over a model of the system and the verification of properties over the system formal

  4. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad core 3.2 GHz CPUs at a 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
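
    The decision rule THRIVE evaluates is simple to state in the clear; the protocol's contribution is computing it over homomorphically randomized ciphertexts so that neither template is revealed. Here is the plaintext rule on a 16-bit toy template (the paper's templates are 256-bit; all values below are illustrative):

```python
# Plaintext version of the Hamming-distance threshold decision rule
# (THRIVE evaluates this on encrypted, randomized templates).
def hamming(a: int, b: int) -> int:
    """Number of bit positions in which a and b differ."""
    return bin(a ^ b).count("1")

def verify(enrolled: int, query: int, threshold: int) -> bool:
    return hamming(enrolled, query) < threshold

enrolled = 0b1011_0110_1100_1010     # 16-bit toy template
genuine  = 0b1011_0111_1100_1010     # 1 bit flipped by sensor noise
impostor = 0b0100_1001_0011_0101     # bitwise complement of enrolled

print(verify(enrolled, genuine, threshold=4))   # → True
print(verify(enrolled, impostor, threshold=4))  # → False
```

    The threshold trades false accepts against false rejects; in the encrypted protocol the comparison against it is the only bit the verifier learns.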

  5. A microcomputer system for prescription, calculation, verification and recording of radiotherapy treatments

    Morrey, D.; Smith, C.W.; Belcher, R.A.; Harding, T.; Sutherland, W.H.

    1982-01-01

    The design of a microcomputer system for the reduction of mistakes in radiotherapy is described. The system covers prescription entry, prescription and treatment calculations, and verification and recording of the treatment set-up. A telecobalt unit was interfaced to the system, and in the first 12 months 400 patients were prescribed and 5000 treatment fields verified. The prescription is entered by the medical officer using an interactive program, and this prescription provides the reference for verifying the treatment set-up. The program allows amendments to the prescription to be made easily during the treatment course. The treatment parameters verified are field size, wedge and treatment time. The system uses bar-codes for patient and field identification. A reduction in the number of mistakes has been achieved, and future developments are discussed. (author)

  6. SELF: expert system for supporting verification of network operating constraints in power transmission planning

    Cicoria, R; Migliardi, P [Ente Nazionale per l'Energia Elettrica, Milan (Italy); Pogliano, P [Centro Informazioni Studi Esperienze (CISE), Milan (Italy)

    1995-06-01

    Performing planning studies on very large HV transmission systems is a very complex task which requires the use of simulation models and the application of the heuristics acquired by expert planners during previous studies. The ENEL Electric Research Center and the CISE Artificial Intelligence Section have developed a knowledge-based system, named SELF, which is aimed at supporting the transmission system planner. SELF is capable of assisting the engineer both in finding the convergence of the load flow calculation and in determining solutions that respect active power, voltage and VAR operating constraints. This paper describes the overall architecture of the system and shows its integration in a larger planning environment called SPIRA, currently utilized at ENEL. More details are given on the latest completed modules, the redispatching and network reinforcement subsystems, which deal with active power constraint verification.

  7. Independent verification of monitor unit calculation for radiation treatment planning system.

    Chen, Li; Chen, Li-Xin; Huang, Shao-Min; Sun, Wen-Zhao; Sun, Hong-Qiang; Deng, Xiao-Wu

    2010-02-01

    Ensuring the accuracy of dose calculation for radiation treatment plans is an important part of quality assurance (QA) procedures for radiotherapy. This study evaluated the Monitor Unit (MU) calculation accuracy of a third-party QA software and a 3-dimensional treatment planning system (3D TPS), to investigate the feasibility and reliability of independent verification for radiation treatment planning. Test plans in a homogeneous phantom were designed with the 3D TPS, according to the International Atomic Energy Agency (IAEA) Technical Report No. 430, including open, blocked, wedge, and multileaf collimator (MLC) fields. Test plans were delivered and measured in the phantom. The delivered doses were input to the QA software and the independently calculated MUs were compared with the delivered values. All test plans were verified with independent calculation and phantom measurements separately, and the differences between the two kinds of verification were then compared. The deviation of the independent calculation from the measurements was (0.1 +/- 0.9)%; the biggest difference fell on the plans that used block and wedge fields (2.0%). The mean MU difference between the TPS and the QA software was (0.6 +/- 1.0)%, ranging from -0.8% to 2.8%. The deviation in dose of the TPS calculation compared to the measurements was (-0.2 +/- 1.7)%, ranging from -3.9% to 2.9%. The MU accuracy of the third-party QA software is clinically acceptable. Similar results were achieved with the independent calculations and the phantom measurements for all test plans. The tested independent calculation software can be used as an efficient tool for TPS plan verification.
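The deviation statistics quoted above (mean +/- standard deviation of percentage differences between calculation and measurement) can be reproduced in a few lines; the dose values below are made up for illustration:

```python
import statistics

def percent_deviation(calculated, measured):
    # Signed deviation of each calculated value relative to measurement, in %.
    return [100.0 * (c - m) / m for c, m in zip(calculated, measured)]

measured   = [100.0, 98.5, 101.2, 99.0]   # illustrative measured doses
calculated = [100.1, 98.3, 101.5, 99.2]   # illustrative calculated doses
dev = percent_deviation(calculated, measured)
print(f"deviation: {statistics.mean(dev):+.2f} +/- {statistics.pstdev(dev):.2f} %")
```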

  8. Verification of criticality safety in on-site spent fuel storage systems

    Rasmussen, R.W.

    1989-01-01

    On February 15, 1984, Duke Power Company received approval for a two-region, burnup credit, spent fuel storage rack design at both Units 1 and 2 of the McGuire Nuclear Station. Duke also hopes to obtain approval by January of 1990 for a dry spent fuel storage system at the Oconee Nuclear Station, which will incorporate the use of burnup credit in the criticality analysis governing the design of the individual storage units. While experience with burnup verification for criticality safety in the dry storage system at Oconee lies in the future, the methods proposed for burnup verification will be similar to those currently used at the McGuire Nuclear Station in the two-region storage racks installed in both pools. In conclusion, the primary benefit of the McGuire rerack effort has obviously been the amount of storage expansion it provided. A total increase of about 2,000 storage cells was realized, 1,000 of which were the result of pursuing the two-region rather than the conventional poison rack design. Less impacting, but equally important, has been the experience gained during the planning, installation, and operation of these storage racks. This experience should prove useful for future rerack efforts likely to occur at Duke's Catawba Nuclear Station as well as for the current dry storage effort underway for the Oconee Nuclear Station

  9. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
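Classical Guyan Reduction (CGR), which MGR and HR refine, statically condenses the "slave" degrees of freedom onto a retained set of "master" DOFs via the transformation built from the stiffness partition. A minimal numpy sketch of CGR; the 3-DOF spring-chain example is illustrative, not an SLS model:

```python
import numpy as np

def guyan_reduce(K, M, masters):
    """Classical Guyan reduction: statically condense the slave DOFs.

    K, M    : full stiffness and mass matrices (n x n)
    masters : indices of retained (master) DOFs
    Returns reduced (Kr, Mr), each len(masters) x len(masters).
    """
    n = K.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    Kss = K[np.ix_(slaves, slaves)]
    Ksm = K[np.ix_(slaves, masters)]
    # Transformation T maps master displacements to all DOFs:
    # masters move directly; slaves follow the static solution -Kss^-1 Ksm.
    T = np.zeros((n, len(masters)))
    T[masters, :] = np.eye(len(masters))
    T[np.ix_(slaves, range(len(masters)))] = -np.linalg.solve(Kss, Ksm)
    return T.T @ K @ T, T.T @ M @ T

# 3-DOF spring chain: retain the end DOFs, condense the middle one.
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])
M = np.eye(3)
Kr, Mr = guyan_reduce(K, M, masters=[0, 2])
```

The reduced stiffness is exact for static loads applied at the masters; the reduced mass is only approximate, which is precisely the shortcoming MGR and HR address for dynamics.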

  10. Verification of IMRT dose distributions using a water beam imaging system

    Li, J.S.; Boyer, Arthur L.; Ma, C.-M.

    2001-01-01

    A water beam imaging system (WBIS) has been developed and used to verify dose distributions for intensity modulated radiotherapy using a dynamic multileaf collimator. This system consisted of a water container, a scintillator screen, a charge-coupled device camera, and a portable personal computer. The scintillation image was captured by the camera. The pixel value in this image indicated the dose value in the scintillation screen. Images of radiation fields of known spatial distributions were used to calibrate the device. The verification was performed by comparing the image acquired from the measurement with a dose distribution from the IMRT plan. Because of light scattering in the scintillator screen, the image was blurred. A correction for this was developed by recognizing that the blur function could be fitted to a multiple Gaussian. The blur function was computed using the measured image of a 10 cm x 10 cm x-ray beam and the result of the dose distribution calculated using the Monte Carlo method. Based on the blur function derived using this method, an iterative reconstruction algorithm was applied to recover the dose distribution for an IMRT plan from the measured WBIS image. The reconstructed dose distribution was compared with the Monte Carlo simulation result. Reasonable agreement was obtained from the comparison. The proposed approach makes it possible to carry out a real-time comparison of the dose distribution in a transverse plane between the measurement and the reference during IMRT dose verification
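The record recovers the dose distribution by iterative reconstruction from the blurred WBIS image; the paper's exact algorithm and multiple-Gaussian blur fit are not reproduced here. As one simple illustration of inverting a known blur function, a Van Cittert iteration on a synthetic 1-D profile (kernel width and field profile are made up):

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    # Discrete, normalized Gaussian used as the blur (point-spread) function.
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def deblur(measured, kernel, iterations=50):
    # Van Cittert iteration: nudge the estimate until re-blurring it
    # reproduces the measured profile.
    est = measured.copy()
    for _ in range(iterations):
        est += measured - np.convolve(est, kernel, mode="same")
    return est

# Synthetic 1-D "dose profile": a flat field with sharp edges, then blurred.
dose = np.where(np.abs(np.arange(100) - 50) < 25, 1.0, 0.0)
blurred = np.convolve(dose, gaussian_kernel(2.0, 8), mode="same")
recovered = deblur(blurred, gaussian_kernel(2.0, 8))
```

After the iterations, the recovered profile is markedly closer to the true step edges than the blurred measurement; with noisy real images a regularized scheme would be preferred.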

  11. Integrated verification and testing system (IVTS) for HAL/S programs

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  12. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway

  13. Issues of verification and validation of application-specific integrated circuits in reactor trip systems

    Battle, R.E.; Alley, G.T.

    1993-01-01

    Concepts of using application-specific integrated circuits (ASICs) in nuclear reactor safety systems are evaluated. The motivation for this evaluation stems from the difficulty of proving that software-based protection systems are adequately reliable. Important issues concerning the reliability of computers and software are identified and used to evaluate features of ASICs. These concepts indicate that ASICs have several advantages over software for simple systems. The primary advantage of ASICs over software is that verification and validation (V&V) of ASICs can be done with much higher confidence than with software. A method of performing this V&V on ASICs is being developed at Oak Ridge National Laboratory. The purpose of the method being developed is to help eliminate design and fabrication errors. It will not solve problems with incorrect requirements or specifications

  14. Research on MRV system of iron and steel industry and verification mechanism establishment in China

    Guo, Huiting; Chen, Liang; Chen, Jianhua

    2017-12-01

    The national carbon emissions trading market will be launched in 2017 in China. The iron and steel industry will be covered as one of the first industries. Establishing its MRV system is critical to promote the development of the iron and steel industry in the carbon trading market. This paper studies the requirements and procedures for accounting, monitoring, reporting and verification in the seven carbon trading pilots that cover the iron and steel industry. The construction and operating mechanisms of the MRV systems are also analyzed. Combining these with the emission features of the iron and steel industry, we study a suitable national MRV system for the whole iron and steel industry to complete the future national carbon trading framework for the industry.

  15. An automated portal verification system for the tangential breast portal field

    Yin, F.-F.; Lai, W.; Chen, C. W.; Nelson, D. F.

    1995-01-01

    Purpose/Objective: In order to ensure that the treatment is delivered as planned, a portal image is acquired in the accelerator and is compared to the reference image. At present, this comparison is performed by radiation oncologists based on manually identified features, which is both time-consuming and potentially error-prone. With the introduction of various electronic portal imaging devices, real-time patient positioning correction is becoming clinically feasible, replacing time-delayed analysis using films. However, this procedure requires the presence of radiation oncologists during patient treatment, which is not cost-effective and practically not realistic. Therefore, the efficiency and quality of radiation therapy could be substantially improved if this procedure were automated. The purpose of this study is to develop a fully computerized verification system for the radiation therapy of breast cancer, for which a similar treatment setup is generally employed. Materials/Methods: The automated verification system involves image acquisition, image feature extraction, feature correlation between reference and portal images, and quantitative evaluation of patient setup. In this study, a matrix liquid ion-chamber EPID directly attached to a Varian CL2100C accelerator was used to acquire digital portal images. For effective use of computation memory, the 12-bit gray levels in the original portal images were quantized to a range of 8-bit gray levels. A typical breast portal image includes three important components: breast and lung tissues in the treatment field, air space within the treatment field, and the non-irradiated region. A hierarchical region processing technique was developed to separate these regions sequentially. The inherent hierarchical features were formulated based on the different radiation attenuation of the different regions as: treatment field edge -- breast skin line -- chest wall. Initially, a combination of a Canny edge detector and a constrained

  16. Development of decommissioning management system. 9. Remodeling to PC system and system verification by evaluation of real work

    Kondo, Hitoshi; Fukuda, Seiji; Okubo, Toshiyuki

    2004-03-01

    When a decommissioning plan for facilities such as nuclear fuel cycle facilities and small-scale research reactors is prepared, it is necessary to select the technology and the work procedure, and to optimize the indices concerning dismantling the facility (such as the radiation dose, the cost, the amount of waste, the number of workers, and the term of works). In our waste management section, development of the decommissioning management system DECMAN, which supports the making of decommissioning plans, is underway. DECMAN automatically calculates the indices by using facility data and the dismantling method. This paper describes the porting of the program to a personal computer and system verification by evaluation of real work (dismantling of the liquor dissolver in the old JOYO Waste Treatment Facility (the old JWTF), the glove boxes in the Deuterium Critical Assembly (DCA), and the incinerator in the Waste Dismantling Facility (WDF)). The outline of the remodeling and verification is as follows. (1) Additional functions: 1) equipment arrangement mapping, 2) evaluation of the radiation dose by using the air dose rate, 3) I/O of data using EXCEL (software). (2) Comparison of work amount between calculated and actual values: the calculated value was 222.67 man-hours against the actual value of 249.40 man-hours in the old JWTF evaluation. (3) The amount of accompanying work can be forecast by multiplying the calculated value by a certain coefficient. (4) A new method of estimating the amount of work was constructed by using the calculated values of DECMAN. (author)

  17. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen's University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  18. Verification test for radiation reduction effect and material integrity on PWR primary system by zinc injection

    Kawakami, H.; Nagata, T.; Yamada, M. [Nuclear Power Engineering Corp. (Japan); Kasahara, K.; Tsuruta, T.; Nishimura, T. [Mitsubishi Heavy Industries, Ltd. (Japan); Ishigure, K. [Saitama Inst. of Tech. (Japan)

    2002-07-01

    Zinc injection is known to be an effective method for the reduction of the radiation source in the primary water system of a PWR. There is a need to verify the effect of Zn injection operation on radiation source reduction and on the materials integrity of the PWR primary circuit. In order to confirm the effectiveness of Zn injection, a verification test as a national program sponsored by the Ministry of Economy, Trade and Industry (METI) was started in 1995 as a 7-year program, and will be finished by the end of March 2002. This program consists of an irradiation test and a material integrity test. The irradiation test, an In-Pile Test managed by AEAT Plc (UK), was performed using the LVR-15 reactor of NRI Rez in the Czech Republic. Furthermore, an Out-of-Pile Test using a film adding unit was also performed to obtain supplemental data for the In-Pile Test at the Takasago Engineering Laboratory of NUPEC. The material integrity test was planned to perform a constant load test, a constant strain test and a corrosion test at the same time using a large scale loop and slow strain rate testing (SSRT) at the Takasago Engineering Laboratory of NUPEC. In this paper, the results of the verification test for the zinc program to date are discussed. (authors)

  19. Verification of absorbed dose calculation with XIO Radiotherapy Treatment Planning System

    Bokulic, T.; Budanec, M.; Frobe, A.; Gregov, M.; Kusic, Z.; Mlinaric, M.; Mrcela, I.

    2013-01-01

    Modern radiotherapy relies on computerized treatment planning systems (TPS) for absorbed dose calculation. Most TPS require a detailed model of a given machine and its therapy beams. The International Atomic Energy Agency (IAEA) recommends acceptance testing for the TPS (IAEA-TECDOC-1540). In this study we present the customization of those tests for measurements with the purpose of verifying beam models intended for clinical use in our department. Elekta Synergy S linear accelerator installation and data acquisition for the Elekta CMS XiO 4.62 TPS were finished in 2011. After the completion of beam modelling in the TPS, tests were conducted in accordance with the IAEA protocol for TPS dose calculation verification. The deviations between the measured and calculated dose were recorded for 854 points and 11 groups of tests in a homogeneous phantom. Most of the deviations were within tolerance. Similar to previously published results, results for an irregular L-shaped field and asymmetric wedged fields were out of tolerance for certain groups of points. (author)

  20. Verification and Validation Challenges for Adaptive Flight Control of Complex Autonomous Systems

    Nguyen, Nhan T.

    2018-01-01

    Autonomy of aerospace systems requires the ability for flight control systems to be able to adapt to complex uncertain dynamic environment. In spite of the five decades of research in adaptive control, the fact still remains that currently no adaptive control system has ever been deployed on any safety-critical or human-rated production systems such as passenger transport aircraft. The problem lies in the difficulty with the certification of adaptive control systems since existing certification methods cannot readily be used for nonlinear adaptive control systems. Research to address the notion of metrics for adaptive control began to appear in the recent years. These metrics, if accepted, could pave a path towards certification that would potentially lead to the adoption of adaptive control as a future control technology for safety-critical and human-rated production systems. Development of certifiable adaptive control systems represents a major challenge to overcome. Adaptive control systems with learning algorithms will never become part of the future unless it can be proven that they are highly safe and reliable. Rigorous methods for adaptive control software verification and validation must therefore be developed to ensure that adaptive control system software failures will not occur, to verify that the adaptive control system functions as required, to eliminate unintended functionality, and to demonstrate that certification requirements imposed by regulatory bodies such as the Federal Aviation Administration (FAA) can be satisfied. This presentation will discuss some of the technical issues with adaptive flight control and related V&V challenges.

  1. Model-based verification method for solving the parameter uncertainty in the train control system

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in distributed control systems. For improving the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, in order to solve the state space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains online, given the real-time property of the train control system. Furthermore, we construct the LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters to avoid collision between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both the normal condition and the abnormal condition influenced by the packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time feature of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic timed Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor the high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.

  2. A standardized approach to verification and validation to assist in expert system development

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. Recently Ohio State University received a grant from the Department of Energy's Special Research Grant Program to utilize the methodologies developed for the Operator Advisor for Heavy Water Reactor (HWR) malfunction root cause diagnosis. To aid in the development of this new system, a standardized Verification and Validation (V&V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V&V phases from concept to operation and maintenance. Each phase has specific V&V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V&V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle. 10 refs., 1 fig

  3. Embedded software verification and debugging

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  4. Design and verification of computer-based reactor control system modification at Bruce-A candu nuclear generating station

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  5. Remaining Sites Verification Package for the 1607-F3 Sanitary Sewer System, Waste Site Reclassification Form 2006-047

    L. M. Dittmer

    2007-04-26

    The 1607-F3 waste site is the former location of the sanitary sewer system that supported the 182-F Pump Station, the 183-F Water Treatment Plant, and the 151-F Substation. The sanitary sewer system included a septic tank, drain field, and associated pipeline, all in use between 1944 and 1965. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.

  6. Joint verification project on environmentally friendly coal utilization systems. Joint verification project on the water-saving coal preparation system; Kankyo chowagata sekitan riyo system kyodo jissho jigyo. Shosuigata sentan system kyodo jissho jigyo

    NONE

    1995-09-01

    In this verification project, clean technology that should be disseminated in China was verified and the base structure for its dissemination was prepared, for the purpose of controlling emissions of environmental pollutants associated with coal utilization in China and of contributing to the secure energy acquisition of Japan. As joint verification projects, a general rehabilitation type coal preparation system was installed in the Wangfenggang coal preparation plant, and a central control coal preparation system was installed in the Qingtan coal preparation plant. In the former, a system is verified in which optimum operation, water saving, high quality, and heightened efficiency can be obtained by introducing two computing systems for operation control and quality control, various measuring instruments, and analyzers to coal preparation plants which rely on analog operation, were built with help from Russia and Poland, and have problems with quality control. In the latter, a central control system achieving water saving is verified by introducing rapid ash meters, scales, densitometers and computers to coal preparation plants having jigs or heavy-media cyclones and connecting various kinds of information through a network. For fiscal 1994, investigation and study were conducted. 51 figs., 9 tabs.

  7. FUSION DECISION FOR A BIMODAL BIOMETRIC VERIFICATION SYSTEM USING SUPPORT VECTOR MACHINE AND ITS VARIATIONS

    A. Teoh

    2017-12-01

    Full Text Available This paper presents fusion decision technique comparisons based on the support vector machine and its variations for a bimodal biometric verification system that makes use of face images and speech utterances. The system is essentially constructed from a face expert, a speech expert and a fusion decision module. Each individual expert has been optimized to operate in automatic mode and designed for security access applications. The fusion decision schemes considered are linear Support Vector Machine (SVM), weighted SVM, and linear SVM with quadratic transformation. The conditions tested include balanced and unbalanced conditions between the two experts, in order to obtain the optimum fusion module from these techniques best suited to the target application.
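
Score-level fusion of this kind can be sketched in a few lines. The following is an illustrative example, not the paper's implementation: a minimal linear SVM, trained by sub-gradient descent on the hinge loss, learns a decision boundary in the two-dimensional space of face and speech match scores. All scores and parameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated match scores from a face expert and a speech expert:
# genuine pairs score high on both axes, impostor pairs score low.
genuine = rng.normal([0.7, 0.8], 0.1, size=(200, 2))
impostor = rng.normal([0.3, 0.2], 0.1, size=(200, 2))
X = np.vstack([genuine, impostor])
y = np.array([1.0] * 200 + [-1.0] * 200)

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Hand-rolled linear SVM: sub-gradient descent on the hinge loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            if y[i] * (X[i] @ w + b) < 1:      # margin violated
                w += lr * (y[i] * X[i] - lam * w)
                b += lr * y[i]
            else:                               # only regularize
                w -= lr * lam * w
    return w, b

w, b = train_linear_svm(X, y)

def accept(face_score, speech_score):
    """Fused accept/reject decision from the two expert scores."""
    return np.array([face_score, speech_score]) @ w + b > 0
```

The quadratic-transformation variant mentioned above would simply augment each score vector with its second-order terms (s1², s2², s1·s2) before training the same linear machine.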

  8. Method and practice on safety software verification and validation for digital reactor protection system

    Li Duo; Zhang Liangju; Feng Junting

    2010-01-01

    The key issue arising from the digitalization of a reactor protection system for a Nuclear Power Plant (NPP) is, in essence, how to carry out Verification and Validation (V and V) to demonstrate and confirm that the software is reliable enough to perform reactor safety functions. Among these activities, the most important part of the software V and V process is unit testing. This paper discusses the basic concepts of safety software V and V and the appropriate techniques for software unit testing, focusing on such aspects as how to ensure test completeness, how to establish a test platform, how to develop test cases and how to carry out unit testing. The technique discussed herein was successfully used in the unit testing of safety software for a digital reactor protection system. (author)
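
The flavor of such unit testing can be sketched briefly. The function, setpoints, and test names below are hypothetical illustrations, not the paper's code: a trip-decision function is exercised below, at, and above its setpoints so that every branch and boundary is covered.

```python
import unittest

# Hypothetical trip setpoints, for illustration only.
TRIP_TEMP_C = 350.0
TRIP_PRESSURE_MPA = 15.5

def trip_required(temp_c, pressure_mpa):
    """Return True if either parameter exceeds its trip setpoint."""
    return temp_c > TRIP_TEMP_C or pressure_mpa > TRIP_PRESSURE_MPA

class TripLogicTest(unittest.TestCase):
    def test_normal_operation(self):
        self.assertFalse(trip_required(300.0, 15.0))

    def test_overtemperature_branch(self):
        self.assertTrue(trip_required(351.0, 15.0))

    def test_overpressure_branch(self):
        self.assertTrue(trip_required(300.0, 15.6))

    def test_setpoint_boundary(self):
        # Completeness demands the setpoint itself be exercised explicitly.
        self.assertFalse(trip_required(TRIP_TEMP_C, TRIP_PRESSURE_MPA))
```

Run with `python -m unittest` against the file; a structured V and V process would additionally trace each test case back to a requirement.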

  9. Verification of HELIOS-MASTER system through benchmark of Halden boiling water reactor (HBWR)

    Kim, Ha Yong; Song, Jae Seung; Cho, Jin Young; Kim, Kang Seok; Lee, Chung Chan; Zee, Sung Quun

    2004-01-01

    To verify the HELIOS-MASTER computer code system for nuclear design, we have performed benchmark calculations for various reactor cores. The Halden reactor is a boiling, heavy-water-moderated reactor. At a full power of 18-20 MWt, the moderator temperature is 240 °C and the pressure is 33 bar. This study describes the verification of the HELIOS-MASTER computer code system for nuclear design and the analysis of a hexagonal, D2O-moderated core through a benchmark of the Halden reactor core. HELIOS, developed by Scandpower A/S, is a two-dimensional transport program for the generation of group cross-sections, and MASTER, developed by KAERI, is a three-dimensional nuclear design and analysis code based on two-group diffusion theory. It solves the neutronics model with the TPEN (Triangle-based Polynomial Expansion Nodal) method for a hexagonal geometry

  10. A digital fluoroscopic imaging system for verification during external beam radiotherapy

    Takai, Michikatsu

    1990-01-01

    A digital fluoroscopic (DF) imaging system has been constructed to obtain portal images for verification during external beam radiotherapy. The imaging device consists of a fluorescent screen viewed by a highly sensitive video camera through a mirror. The video signal is digitized and processed by an image processor linked on-line with a host microcomputer. The image quality of the DF system was compared with that of film for portal images of the Burger phantom and the Alderson anthropomorphic phantom using 10 MV X-rays. The contrast resolution of the DF image integrated for 8.5 s was superior to that of film, while the spatial resolution was slightly inferior. The DF image of the Alderson phantom processed by adaptive histogram equalization showed anatomical landmarks better than the film portal image. The DF image integrated for 1 s, which is used for movie mode, can show patient movement during treatment. (author)
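
Histogram equalization, the basis of the adaptive processing mentioned above, can be illustrated with a short numpy sketch. This is a simplified stand-in, not the authors' processing chain: global equalization plus a crude tiled "adaptive" variant; production implementations such as CLAHE additionally clip contrast and interpolate between tiles.

```python
import numpy as np

def equalize(img):
    """Global histogram equalization of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)         # intensity mapping
    return lut[img]

def adaptive_equalize(img, tiles=4):
    """Crude adaptive variant: equalize each tile independently."""
    out = img.copy()
    th, tw = img.shape[0] // tiles, img.shape[1] // tiles
    for i in range(tiles):
        for j in range(tiles):
            sl = (slice(i * th, (i + 1) * th), slice(j * tw, (j + 1) * tw))
            out[sl] = equalize(img[sl])
    return out

# A low-contrast ramp packed into a narrow grey range, like a raw portal image.
img = np.tile(np.linspace(100, 140, 64, dtype=np.uint8), (64, 1))
stretched = equalize(img)            # contrast spread over the full range
local = adaptive_equalize(img)       # per-tile version
```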

  11. Specification and Verification of Distributed Embedded Systems: A Traffic Intersection Product Family

    José Meseguer

    2010-09-01

    Full Text Available Distributed embedded systems (DESs are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model checking features.

  12. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominant analytical technique. The aim of this study was to develop a rapid and portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS), conducted with a portable Raman spectrometer and a commercially available SERS substrate. Samples of standard solutions and swab extracts were deposited onto the SERS-active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable, making it possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS-active surface; understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  13. Clinical evaluation of a mobile digital specimen radiography system for intraoperative specimen verification.

    Wang, Yingbing; Ebuoma, Lilian; Saksena, Mansi; Liu, Bob; Specht, Michelle; Rafferty, Elizabeth

    2014-08-01

    Use of mobile digital specimen radiography systems expedites intraoperative verification of excised breast specimens. The purpose of this study was to evaluate the performance of such a system for verifying targets. A retrospective review included 100 consecutive pairs of breast specimen radiographs. Specimens were imaged in the operating room with a mobile digital specimen radiography system and then with a conventional digital mammography system in the radiology department. Two expert reviewers independently scored each image for image quality on a 3-point scale and for confidence in target visualization on a 5-point scale. A target was considered confidently verified only if both reviewers declared the target to be confidently detected. The 100 specimens contained a total of 174 targets, including 85 clips (49%), 53 calcifications (30%), 35 masses (20%), and one architectural distortion (1%). Although a significantly higher percentage of mobile digital specimen radiographs were considered poor quality by at least one reviewer (25%) compared with conventional digital mammograms (1%), 169 targets (97%) were confidently verified with mobile specimen radiography and 172 targets (98%) with conventional digital mammography. Three faint masses were not confidently verified with mobile specimen radiography, and conventional digital mammography was needed for confirmation. One faint mass and one architectural distortion were not confidently verified with either method. Mobile digital specimen radiography allows high diagnostic confidence for verification of target excision in breast specimens across target types, despite lower image quality. Substituting this modality for conventional digital mammography can eliminate delays associated with specimen transport, potentially decreasing surgical duration and increasing operating room throughput.

  14. Practical experience with a local verification system for containment and surveillance sensors

    Lauppe, W.D.; Richter, B.; Stein, G.

    1984-01-01

    With the growing number of nuclear facilities and a number of large commercial bulk-handling facilities steadily coming into operation, the International Atomic Energy Agency faces increasing pressure to reduce its inspection effort. One means of meeting this requirement will be to deploy facility-based remote interrogation methods for its containment and surveillance instrumentation. Such a technical concept of remote interrogation was realized through the so-called LOVER system development, a local verification system for electronic safeguards seal systems. In the present investigations the application was extended to radiation monitoring by introducing an electronic interface between the electronic safeguards seal and the neutron detector electronics of a waste monitoring system. The paper discusses the safeguards motivation and background, the experimental setup of the safeguards system and the performance characteristics of this LOVER system. First conclusions can be drawn from the performance results with respect to applicability in international safeguards. This comprises in particular the definition of design specifications for an integrated remote interrogation system for various types of containment and surveillance instruments and the specification of safeguards applications employing such a system

  15. Verification and validation issues for digitally-based NPP safety systems

    Ets, A.R.

    1993-01-01

    The trend toward standardization, integration and reduced costs has led to increasing use of digital systems in reactor protection systems. While digital systems provide maintenance and performance advantages, their use also introduces new safety issues, in particular with regard to software. Current practice relies on verification and validation (V and V) to ensure the quality of safety software. However, effective V and V must be done in conjunction with a structured software development process and must consider the context of the safety system application. This paper presents some of the issues and concerns that impact the V and V process. These include documentation of system requirements, common mode failures, hazards analysis and independence. These issues and concerns arose during evaluations of NPP safety systems for advanced reactor designs and digital I and C retrofits for existing nuclear plants in the United States. The pragmatic lessons from actual system reviews can provide a basis for further refinement and development of guidelines for applying V and V to NPP safety systems. (author). 14 refs

  16. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    Implementing functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel-processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  17. Standard artifact for the geometric verification of terrestrial laser scanning systems

    González-Jorge, H.; Riveiro, B.; Armesto, J.; Arias, P.

    2011-10-01

    Terrestrial laser scanners are geodetic instruments with applications in areas such as architecture, civil engineering and the environment. Although it is common to receive the technical specifications of these systems from their manufacturers, no data-verification solutions are available to users on the market. This work proposes a standard artifact and a methodology to perform, in a simple way, the metrological verification of laser scanners. The artifact is manufactured from aluminium and Delrin, materials that make it robust and portable. The system consists of a set of five spheres situated at equal distances from one another, and a set of seven cubes of different sizes. A coordinate measuring machine with sub-millimetre precision is used for calibration purposes under controlled environmental conditions. After its calibration, the artifact can be used to verify the metrological specifications given by manufacturers of laser scanners. The elements of the artifact are designed to test different metrological characteristics, such as accuracy, precision and resolution. The distance between the centres of the spheres is used to obtain the accuracy data, the standard deviation of the top face of the largest cube is used to establish the precision (repeatability), and the error in the measurement of the cubes provides the resolution value along the X, Y and Z axes. The evaluation methodology is mainly supported by least-squares fitting algorithms developed in Matlab. The artifact and methodology proposed were tested using a Riegl LMSZ-390i terrestrial laser scanner at three different ranges (10, 30 and 50 m) and four stepwidths (0.002°, 0.005°, 0.010° and 0.020°), for both horizontal and vertical displacements. The results obtained are in agreement with the accuracy and precision data given by the manufacturer, 6 and 4 mm, respectively. On the other hand, important influences between resolution and range and between resolution and
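
The least-squares fitting step can be sketched as follows. This is an equivalent numpy sketch under stated assumptions (the paper's algorithms were written in Matlab, and the scan data here are synthetic): an algebraic least-squares sphere fit recovers centre and radius from scanner points, and the fitted centre-to-centre distances would then be compared with the calibrated values to assess accuracy.

```python
import numpy as np

def fit_sphere(pts):
    """Algebraic least-squares sphere fit: returns (center, radius).

    Linearizes |p - c|^2 = r^2 into  2*p.c + (r^2 - |c|^2) = |p|^2,
    which is solved as an ordinary linear least-squares problem.
    """
    A = np.column_stack([2 * pts, np.ones(len(pts))])
    f = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, f, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic scan: noisy points on a 0.1 m sphere centred at (1, 2, 0.5).
rng = np.random.default_rng(1)
n = 500
phi = rng.uniform(0, 2 * np.pi, n)
theta = np.arccos(rng.uniform(-1, 1, n))
true_c, true_r = np.array([1.0, 2.0, 0.5]), 0.1
pts = true_c + true_r * np.column_stack([
    np.sin(theta) * np.cos(phi),
    np.sin(theta) * np.sin(phi),
    np.cos(theta),
]) + rng.normal(0, 0.0005, (n, 3))   # ~0.5 mm scanner noise

center, radius = fit_sphere(pts)
```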

  18. Verification of Strength of the Welded Joints by using of the Aramis Video System

    Pała Tadeusz

    2017-03-01

    Full Text Available The paper presents the results of a strength analysis for two types of welded joints of high-strength steel S960QC, made using conventional and laser technologies. The hardness distributions, tensile properties and fracture toughness were determined for the weld material and the heat-affected zone material of both types of welded joints. The test results showed the advantage of the laser-welded joints over the conventional ones: the tensile properties and fracture toughness in all areas of the laser joints are at a higher level than in the conventional joint. The heat-affected zone of the conventional welded joint is a weak area, where the tensile properties are lower than in the base material. Verification of the tensile tests, carried out using the Aramis video system, confirmed this assumption. The highest level of strain was observed in the HAZ material, and the destruction process also occurred in the HAZ of the conventional welded joint.

  19. Verification and implications of the multiple pin treatment in the SASSYS-1 LMR systems analysis code

    Dunn, F.E.

    1994-01-01

    As part of a program to obtain realistic, as opposed to excessively conservative, analysis of reactor transients, a multiple-pin treatment for the analysis of intra-subassembly thermal hydraulics has been included in the SASSYS-1 liquid metal reactor systems analysis code. This new treatment has made possible a whole new level of verification for the code. The code can now predict the steady-state and transient responses of individual thermocouples within instrumented subassemblies in a reactor, rather than just predicting average temperatures for a subassembly. Very good agreement has been achieved between code predictions and the experimental measurements of steady-state and transient temperatures and flow rates in the Shutdown Heat Removal Tests in the EBR-II reactor. Detailed multiple-pin calculations for blanket subassemblies in EBR-II demonstrate that the actual steady-state and transient peak temperatures in these subassemblies are significantly lower than those calculated by simpler models

  20. Beam intensity scanner system for three dimensional dose verification of IMRT

    Vahc, Young W.; Kwon, Ohyun; Park, Kwangyl; Park, Kyung R.; Yi, Byung Y.; Kim, Keun M.

    2003-01-01

    Patient dose verification is clinically one of the most important parts of treatment delivery in radiation therapy. The three-dimensional (3D) reconstruction of the dose distribution delivered to the target volume helps to verify the patient dose and to determine the physical characteristics of the beams used in IMRT. Here we present the beam intensity scanner (BInS) system for pre-treatment dosimetric verification of two-dimensional photon intensity. The BInS is a radiation detector with custom-made software for dose conversion of the fluorescence signals from a scintillator. The scintillator is used to produce fluorescence from the irradiation of 6 MV photons on a Varian Clinac 21EX. The digitized fluoroscopic signals obtained by the digital video camera-based scintillator (DVCS) are processed by our custom-made software to reproduce the 3D relative dose distribution. For the intensity modulated beam (IMB), the BInS calculates the absorbed dose in absolute beam fluence, which is used for the patient dose distribution. Using the BInS, we performed various measurements related to IMRT and found the following: (1) The 3D dose profiles of the IMBs measured by the BInS demonstrate good agreement with radiographic film, a pin-type ionization chamber and Monte Carlo simulation. (2) The delivered beam intensity is altered by the mechanical and dosimetric properties of the collimation of the dynamic and/or step MLC system. This is mostly due to leaf transmission, the leaf penumbra, photons scattered from the rounded leaf edges, and the leaf geometry. (3) The delivered dose depends on the operational details of how the multileaf openings are made. These phenomena result in a fluence distribution that can be substantially different from the initial and calculated intensity modulation and should therefore be taken into account by the treatment planning system for accurate calculation of the dose delivered to the target volume in IMRT. (author)

  1. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for the modeling and verification of time properties. Combining the advantages of MARTE, this framework uses a class diagram to create the static model of the software system and a state chart to create the dynamic model. According to the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also propose a generating algorithm for temporal logic formulas, which automatically extracts real-time properties from a time-sensitive live sequence chart (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.
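
The generation of a bounded-response formula can be illustrated schematically. This sketch is hypothetical (the paper derives formulas from TLSCs, whereas here the trigger/response/deadline triples are written out by hand, with invented requirement names): each triple becomes a bounded "eventually" formula in MTL-like syntax.

```python
# Sketch: turn (trigger, response, deadline) requirements into
# bounded-response temporal logic formulas. Names are illustrative.
def bounded_response(trigger, response, deadline_ms):
    """G (trigger -> F[0,d] response): every trigger is answered in time."""
    return f"G ({trigger} -> F[0,{deadline_ms}] {response})"

requirements = [
    ("attitude_sample_ready", "control_output_updated", 20),
    ("gcs_command_received", "command_acknowledged", 100),
]
formulas = [bounded_response(*r) for r in requirements]
```

The resulting strings would then be handed to a model checker that accepts metric temporal logic.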

  2. Development of the automatic control rod operation system for JOYO. Verification of automatic control rod operation guide system

    Terakado, Tsuguo; Suzuki, Shinya; Kawai, Masashi; Aoki, Hiroshi; Ohkubo, Toshiyuki

    1999-10-01

    The automatic control rod operation system was developed to control the JOYO reactor power automatically in all operation modes (critical approach, cooling system heat-up, power ascent and power descent); development began in 1989. Prior to applying the system, verification tests of the automatic control rod operation guide system were conducted during the 32nd duty cycle of JOYO, from Dec. 1997 to Feb. 1998. The automatic control rod operation guide system consists of the control rod operation guide function and the plant operation guide function. The control rod operation guide function provides information on control rod movement and position, while the plant operation guide function provides guidance for plant operations corresponding to reactor power changes (power ascent or power descent). Control rod insertion and withdrawal are predicted by fuzzy algorithms. (J.P.N.)
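
A fuzzy prediction of this kind can be sketched in miniature. This is an illustrative stand-in, not the JOYO logic: the membership shapes, rule set, and step sizes below are invented, showing only how triangular memberships and a weighted-mean defuzzification turn a power deviation into a rod-movement recommendation.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def rod_step(power_error_pct):
    """Defuzzified rod step in mm (positive = withdraw), weighted mean.

    power_error_pct = demanded power minus measured power, in percent.
    """
    rules = [  # (membership of the error, recommended step in mm)
        (tri(power_error_pct, -10.0, -5.0, 0.0), -2.0),  # power high: insert
        (tri(power_error_pct, -2.0, 0.0, 2.0), 0.0),     # on target: hold
        (tri(power_error_pct, 0.0, 5.0, 10.0), 2.0),     # power low: withdraw
    ]
    num = sum(mu * step for mu, step in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0
```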

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SWINE WASTE ELECTRIC POWER AND HEAT PRODUCTION--CAPSTONE 30KW MICROTURBINE SYSTEM

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system was evaluated based on the Capstone 30kW Microturbine developed by Cain Ind...

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: RESIDENTIAL ELECTRIC POWER GENERATION USING THE PLUG POWER SU1 FUEL CELL SYSTEM

    The Environmental Technology Verification report discusses the technology and performance of the Plug Power SU1 Fuel Cell System manufactured by Plug Power. The SU1 is a proton exchange membrane fuel cell that requires hydrogen (H2) as fuel. H2 is generally not available, so the ...

  5. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENTSTRIP™ TEST KIT

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test System, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  6. Density scaling of phantom materials for a 3D dose verification system.

    Tani, Kensuke; Fujita, Yukio; Wakita, Akihisa; Miyasaka, Ryohei; Uehara, Ryuzo; Kodama, Takumi; Suzuki, Yuya; Aikawa, Ako; Mizuno, Norifumi; Kawamori, Jiro; Saitoh, Hidetoshi

    2018-05-21

    In this study, the optimum density scaling factors of phantom materials for a commercially available three-dimensional (3D) dose verification system (Delta4) were investigated in order to improve the accuracy of the calculated dose distributions in the phantom materials. At field sizes of 10 × 10 and 5 × 5 cm² with the same geometry, tissue-phantom ratios (TPRs) in water, polymethyl methacrylate (PMMA), and Plastic Water Diagnostic Therapy (PWDT) were measured, and TPRs at various density scaling factors of water were calculated by Monte Carlo simulation, Adaptive Convolve (AdC, Pinnacle3), Collapsed Cone Convolution (CCC, RayStation), and AcurosXB (AXB, Eclipse). Effective linear attenuation coefficients (μ_eff) were obtained from the TPRs. The ratios of μ_eff in phantom and water ((μ_eff)_pl,water) were compared between the measurements and the calculations. For each phantom material, the density scaling factor proposed in this study (DSF) was set to the value providing a match between the calculated and measured (μ_eff)_pl,water. The optimum density scaling factor was verified through comparison of the dose distributions measured by Delta4 and calculated with three different density scaling factors: the nominal physical density (PD), the nominal relative electron density (ED), and the DSF. Three plans were used for the verification: a static 10 × 10 cm² field and two intensity modulated radiation therapy (IMRT) treatment plans. The DSF was determined to be 1.13 for PMMA and 0.98 for PWDT. The DSF for PMMA showed good agreement for AdC and CCC with 6 MV x-rays, and for AdC with 10 MV x-rays. The DSF for PWDT showed good agreement regardless of the dose calculation algorithm and x-ray energy. The DSF can be considered one of the references for the density scaling factor of Delta4 phantom materials and may help improve the accuracy of IMRT dose verification using Delta4. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley
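
The μ_eff extraction step can be sketched numerically. This is an illustrative sketch with synthetic TPR values, not the study's measured data: beyond the build-up region the tissue-phantom ratio falls roughly exponentially with depth, so a straight-line fit to ln(TPR) versus depth yields the effective linear attenuation coefficient, and the phantom-to-water ratio of two such coefficients gives (μ_eff)_pl,water.

```python
import numpy as np

def mu_eff(depths_cm, tpr):
    """Effective linear attenuation coefficient (cm^-1) from TPR-vs-depth."""
    slope, _ = np.polyfit(depths_cm, np.log(tpr), 1)
    return -slope

depths = np.array([5.0, 10.0, 15.0, 20.0])   # depths past build-up, cm
mu_true = 0.045                               # synthetic coefficient, cm^-1
tpr_water = np.exp(-mu_true * (depths - depths[0]))

mu_w = mu_eff(depths, tpr_water)
# With a second data set measured in a phantom material, the ratio
# mu_eff(phantom) / mu_eff(water) is the quantity the DSF is tuned to match.
```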

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, JCH FUEL SOLUTIONS, INC., JCH ENVIRO AUTOMATED FUEL CLEANING AND MAINTENANCE SYSTEM

    The verification testing was conducted at the Cl facility in North Las Vegas, NV, on July 17 and 18, 2001. During this period, engine emissions, fuel consumption, and fuel quality were evaluated with contaminated and cleaned fuel. To facilitate this verification, JCH repre...

  8. Verification and validation guidelines for high integrity systems: Appendices A--D, Volume 2

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    The following material is furnished as an experimental guide for the use of risk-based classification for nuclear plant protection systems. As shown in Sections 2 and 3 of this report, safety classifications in the nuclear field are application based (using the function served as the primary criterion), whereas those in use by the process industry and the military are risk based. There are obvious obstacles to the use of risk-based classifications (and the associated integrity levels) for nuclear power plants, yet there are also many potential benefits: it considers all capabilities provided for dealing with a specific hazard, thus assigning a lower risk where multiple protection is provided (either at the same or at lower layers), which permits plant management to perform trade-offs between systems that meet the highest qualification levels and multiple diverse systems at lower qualification levels; it motivates the use (and therefore also the development) of protection systems with demonstrated low failure probability; and it may permit lower-cost process industry equipment of an established integrity level to be used in nuclear applications (subject to verification of the integrity level and regulatory approval). The totality of these benefits may reduce the cost of digital protection systems significantly and motivate utilities to upgrade their capabilities much more rapidly than is currently the case. Therefore the outline of a risk-based classification is presented here, to serve as a starting point for further investigation and possible trial application

  9. National Accounts Energy Alliance : Field test and verification of CHP components and systems

    Sweetser, R. [Exergy Partners Corporation, Herndon, VA (United States)

    2003-07-01

    Exergy is a consulting firm which specializes in capitalizing on opportunities that result from the nexus of utility deregulation and global climate change in both the construction and energy industries. The firm offers assistance in technical, business and market planning, product development, and high-impact marketing and technology transfer programs. The author discussed the National Accounts Energy Alliance (NAEA) program on distributed energy resources (DER) and identified some advantageous areas such as homeland security (fewer potential terrorist targets to be protected), food safety (protection of the food supply and delivery system), reliability, power quality, energy density, grid congestion and energy price. In the future, an essential role in moderating energy prices for commercial buildings will probably be played by distributed generation (DG) and combined heat and power (CHP). The technical merits of these technologies are being investigated by national accounts and utilities partnering with non-profit organizations, the United States Department of Energy (US DOE), state governments and industry. In that light, an Alliance program was developed in 2001 which allows investors to broaden their knowledge from the application and verification of advanced energy technologies. This program was the result of a synergy between the American Gas Foundation and the Gas Technology Institute (GTI), and it assists investors with their strategic planning. It was proven that a customer-led Energy Technology Test and Verification Program (TA and VP) could be cost-effective and successful. The NAEA activities at five locations were reviewed and discussed: (1) Russell Development, Portland, Oregon; (2) A and P-Waldbaums, Hauppage, New York; (3) HEB, Southern Texas; (4) Cinemark, Plano, Texas; and (5) McDonald's, Tampa, Florida. 4 tabs., figs.

  10. Portal verification using the KODAK ACR 2000 RT storage phosphor plate system and EC films. A semiquantitative comparison.

    Geyer, Peter; Blank, Hilbert; Alheit, Horst

    2006-03-01

    The suitability of the storage phosphor plate system ACR 2000 RT (Eastman Kodak Corp., Rochester, MN, USA), which is intended for portal verification as well as for portal simulation imaging in radiotherapy, had to be proven by comparison with a highly sensitive verification film. The comparison included portal verification images of different regions (head and neck, thorax, abdomen, and pelvis) irradiated with 6- and 15-MV photons and with electrons. Each portal verification image was acquired with both the storage screen and the EC film, using EC-L cassettes (both: Eastman Kodak Corp., Rochester, MN, USA) for the two systems. The soft-tissue and bony contrast and the brightness were evaluated and compared in a ranking of the two images. Different phantoms were irradiated to investigate the high- and low-contrast resolution. With quality assurance applications in mind, the short-time exposure of the unpacked, irradiated storage screen to green and red room lasers was also investigated. In general, the quality of the processed ACR images was slightly higher than that of the films, mostly due to cases of insufficient exposure of the film. The storage screen was able to verify electron portals even for low electron energies with only minor photon contamination. The laser lines were sharply and clearly visible on the ACR images. The ACR system may replace film without any noticeable decrease in image quality, thereby reducing processing time, saving film costs and avoiding incorrect exposures.

  11. Design verification and acceptance tests of the ASST-A helium refrigeration system

    Ganni, V.; Apparao, T.V.V.R.

    1993-07-01

    Three similar helium refrigerator systems have been installed at the Superconducting Super Collider Laboratory (SSCL) N15 site: the ASST-A system, which will be used for the accelerator system's full cell string test; the N15-B system, which will be used for string testing in the tunnel; and a third plant dedicated to magnet testing at the Magnet Testing Laboratory. The ASST-A and N15-B systems will ultimately be part of the collider's N15 sector station equipment. Each of these three systems has many subsystems, but the design basis for the main refrigerator is the same. Each system has a guaranteed capacity of 2000 W of refrigeration and 20 g/s liquefaction at 4.5 K. The testing and design verification of the ASST-A refrigeration system consisted of parametric tests on the compressors and on the total system. A summary of the initial performance test data is given in this paper. The tests were conducted for two cases: in the first, all four compressors were operating; in the second, only one compressor in each stage was operating. In each case, tests were conducted in three modes of operation described later. The process design basis supplied by the manufacturers and used in the design of the main components (the compressor, and the expanders and heat exchangers for the coldbox) was used to reduce the actual test data using process simulation methodology. In addition, the test results and the process design submitted by the manufacturer were analyzed using exergy analysis. This paper presents both the process and the exergy analyses of the manufacturer's design and of the actual test data for Case 1. The process analyses are presented in the form of T-S diagrams. The results of the exergy analyses, comparing the exergy losses of each component and of the total system for the manufacturer's design and the test data, are presented in the tables
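
The exergy bookkeeping behind such an analysis can be sketched in a few lines. The stream values below are invented for illustration, not ASST-A design data: each stream's specific flow exergy is e = (h - h0) - T0·(s - s0) relative to the ambient dead state, and a component's exergy loss is the imbalance between its inlet and outlet exergy flows plus any shaft work supplied.

```python
T0 = 300.0  # ambient (dead-state) reference temperature, K

def flow_exergy(h, s, h0, s0):
    """Specific flow exergy in J/g: (h - h0) - T0*(s - s0)."""
    return (h - h0) - T0 * (s - s0)

def exergy_loss(mdot, e_in, e_out, work_in=0.0):
    """Exergy destroyed in a component, W (mdot in g/s, e in J/g).

    Inlet exergy flow plus shaft work in, minus outlet exergy flow.
    """
    return mdot * (e_in - e_out) + work_in

# Illustrative numbers: a stream losing 15 J/g of exergy through a
# component that also absorbs 100 W of shaft work destroys 250 W.
loss = exergy_loss(10.0, 20.0, 5.0, work_in=100.0)
```

Summing such losses component by component, and comparing the design case against the measured data, is exactly the tabulated comparison the paper describes.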

  12. Burnup verification measurements at a US nuclear utility using the FORK measurement system

    Ewing, R.I.; Bosler, G.E.; Walden, G.

    1993-01-01

    The FORK measurement system, designed at Los Alamos National Laboratory (LANL) for the International Atomic Energy Agency (IAEA) safeguards program, has been used to examine spent reactor fuel assemblies at Duke Power Company's Oconee Nuclear Station. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. These measurements can be correlated with burnup and cooling time, and can be used to verify the reactor site records. Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. By taking into account the reduced reactivity of spent fuel due to its burnup in the reactor, burnup credit results in more efficient and economic transport and storage. The objectives of these tests are to demonstrate the applicability of the FORK system to verify reactor records and to develop optimal procedures compatible with utility operations. The test program is a cooperative effort supported by Sandia National Laboratories, the Electric Power Research Institute (EPRI), Los Alamos National Laboratory, and the Duke Power Company

  13. Inverse dynamics of underactuated mechanical systems: A simple case study and experimental verification

    Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.

    2011-05-01

Underactuated systems are characterized by fewer control inputs, m, than degrees of freedom, n. Determining a control strategy that forces such a system to complete a set of m specified motion tasks is challenging, and the existence of an explicit solution is conditioned on differential flatness of the problem. A flatness-based solution means that all 2n states and m control inputs can be algebraically expressed in terms of the m specified outputs and their time derivatives up to a certain order, which is in practice attainable only for simple systems. In this contribution the problem is posed in a more practical way as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc (n = 2 and m = 1). Experimental verification of the inverse simulation control methodology is reported.

  14. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    Bojechko, Casey; Phillps, Mark; Kalet, Alan; Ford, Eric C., E-mail: eford@uw.edu [Department of Radiation Oncology, University of Washington, 1959 N. E. Pacific Street, Seattle, Washington 98195 (United States)

    2015-09-15

Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4-point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurements during the first fraction and is complemented by rules-based and Bayesian network plan checking.
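The detectability figure used in the abstract is a simple ratio, and the combined "defense in depth" coverage of several layers can be illustrated under an independence assumption (the study itself reports measured union coverage, not this idealized model):

```python
def detectability(n_detectable, n_total):
    """Fraction of incidents a verification layer could have caught."""
    return n_detectable / n_total

def combined_coverage(rates):
    """Coverage of several independent layers: 1 minus the product of miss
    rates. An idealization only; real layers overlap, as the study's
    directly counted union coverage shows."""
    miss = 1.0
    for r in rates:
        miss *= 1.0 - r
    return 1.0 - miss

# From the abstract: 6% of EBRT incidents caught by pre-treatment EPID
# dosimetry, 74% by first-fraction in vivo dosimetry.
assert detectability(74, 100) == 0.74
assert combined_coverage([0.5, 0.5]) == 0.75
```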

  15. Software Verification and Validation Report for the 244-AR Vault Interim Stabilization Ventilation System

    YEH, T.

    2002-01-01

This document reports on the analysis, testing, and conclusions of the software verification and validation for the 244-AR Vault Interim Stabilization ventilation system. The automation control system will use Allen-Bradley software tools for programming and programmable logic controller (PLC) configuration. The 244-AR Interim Stabilization Ventilation System will be used to control the release of radioactive particles to the environment in the containment tent, located inside the canyon of the 244-AR facility, and to assist the waste stabilization efforts. The HVAC equipment, ducts, instruments, PLC hardware, the ladder logic executable software (documented code), and the message display terminal are considered part of the temporary ventilation system. The system consists of a supply air skid, temporary ductwork (to distribute airflow), and two skid-mounted, 500-cfm exhausters connected to the east filter building and the vessel vent system. The Interim Stabilization Ventilation System is a temporary, portable ventilation system consisting of a supply side and an exhaust side. Air is supplied to the containment tent from an air supply skid. This skid contains a constant-speed fan, a pre-filter, an electric heating coil, a cooling coil, and a constant flow device (CFD). The CFD uses a passive component that allows a constant flow of air to pass through the device. Air is drawn out of the containment tent, cells, and tanks by two 500-cfm exhauster skids running in parallel. These skids are equipped with fans, filters, a stack, stack monitoring instrumentation, and a PLC for control. The 500-cfm exhaust skids were fabricated and tested previously for saltwell pumping activities. The objective of the temporary ventilation system is to maintain a higher pressure in the containment tent, relative to the canyon and cell areas, to prevent contaminants from reaching the containment tent

  16. A formal design verification and validation on the human factors of a computerized information system in nuclear power plants

    Lee, Yong Hee; Park, Jae Chang; Cheon, Se Woo; Jung, Kwang Tae; Baek, Seung Min; Han, Seung; Park, Hee Suk; Son, Ki Chang; Kim, Jung Man; Jung Yung Woo

    1999-11-01

This report describes a technical transfer under the title ''A formal design verification and validation on the human factors of a computerized information system in nuclear power plants''. Human factors requirements for information system designs are extracted from various regulatory and industrial standards and guidelines, and interpreted into more specific procedures and checklists for verifying the satisfaction of those requirements. A formalized implementation plan is established for human factors verification and validation of a computerized information system in nuclear power plants. Additionally, a computer support system, named DIMS-Web (Design Issue Management System), has been developed in a web internet environment so as to enhance the implementation of the human factors activities. DIMS-Web has three main functions: supporting requirements review, tracking design issues, and managing issue screening evaluation. DIMS-Web has shown its benefits in practice through a trial application to the design review of the CFMS for YGN nuclear units 5 and 6. (author)

  17. Portable system for periodical verification of area monitors for neutrons; Sistema portatil para verificacao periodica de monitores de area para neutrons

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu, E-mail: rluciane@ird.gov.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Energia Nuclear; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W., E-mail: karla@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI). Lab. de Neutrons

    2009-07-01

The Neutrons Laboratory is developing a project for the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check, at the installations where the instruments are used, that their calibration is maintained, avoiding the use of equipment whose response to neutron beams is inadequate

  18. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    Harvego, Edwin A.; Schultz, Richard R.; Crane, Ryan L.

    2011-01-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  19. Image acquisition optimization of a limited-angle intrafraction verification (LIVE) system for lung radiotherapy.

    Zhang, Yawei; Deng, Xinchen; Yin, Fang-Fang; Ren, Lei

    2018-01-01

Limited-angle intrafraction verification (LIVE) has been previously developed for four-dimensional (4D) intrafraction target verification either during arc delivery or between three-dimensional (3D)/IMRT beams. Preliminary studies showed that LIVE can accurately estimate the target volume using kV/MV projections acquired over orthogonal-view 30° scan angles. Currently, the LIVE imaging acquisition requires slow gantry rotation and is not clinically optimized. The goal of this study is to optimize the image acquisition parameters of LIVE for different patient respiratory periods and gantry rotation speeds for the effective clinical implementation of the system. Limited-angle intrafraction verification imaging acquisition was optimized using a digital anthropomorphic phantom (XCAT) with simulated respiratory periods varying from 3 s to 6 s and gantry rotation speeds varying from 1°/s to 6°/s. LIVE scanning time was optimized by minimizing the number of respiratory cycles needed for the four-dimensional scan, and imaging dose was optimized by minimizing the number of kV and MV projections needed for four-dimensional estimation. The estimation accuracy was evaluated by calculating both the center-of-mass-shift (COMS) and the three-dimensional volume-percentage-difference (VPD) between the tumor in estimated images and the ground truth images. The robustness of LIVE was evaluated with varied respiratory patterns, tumor sizes, and tumor locations in XCAT simulation. A dynamic thoracic phantom (CIRS) was used to further validate the optimized imaging schemes from the XCAT study with changes of respiratory patterns, tumor sizes, and imaging scanning directions. Respiratory periods, gantry rotation speeds, number of respiratory cycles scanned, and number of kV/MV projections acquired were all positively correlated with the estimation accuracy of LIVE. Faster gantry rotation speed or longer respiratory period allowed fewer respiratory cycles to be scanned and fewer kV/MV projections
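The two accuracy figures used in the abstract, center-of-mass shift (COMS) and volume-percentage-difference (VPD), can be sketched for binary tumor masks as follows (one common definition of VPD is assumed here; the paper's exact formula is not reproduced):

```python
import math

def center_of_mass(voxels):
    """Centroid of a binary tumor mask given as (x, y, z) voxel coordinates."""
    n = len(voxels)
    return tuple(sum(v[i] for v in voxels) / n for i in range(3))

def coms(estimated, ground_truth):
    """Center-of-mass shift: Euclidean distance between the two centroids."""
    return math.dist(center_of_mass(estimated), center_of_mass(ground_truth))

def vpd(estimated, ground_truth):
    """Volume-percentage-difference: non-overlapping volume (symmetric
    difference) relative to the ground-truth volume, in percent."""
    e, g = set(estimated), set(ground_truth)
    return 100.0 * len(e ^ g) / len(g)

# A one-voxel tumor estimated two voxels away from truth:
assert coms([(0, 0, 0)], [(2, 0, 0)]) == 2.0
assert vpd([(0, 0, 0)], [(2, 0, 0)]) == 200.0
```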

  20. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  1. ESTERR-PRO: A Setup Verification Software System Using Electronic Portal Imaging

    Pantelis A. Asvestas

    2007-01-01

The purpose of the paper is to present and evaluate the performance of a new software-based registration system for patient setup verification, during radiotherapy, using electronic portal images. The estimation of setup errors, using the proposed system, can be accomplished by means of two alternate registration methods. (a) The portal image of the current fraction of the treatment is registered directly with the reference image (digitally reconstructed radiograph (DRR) or simulator image) using a modified manual technique. (b) The portal image of the current fraction of the treatment is registered with the portal image of the first fraction of the treatment (reference portal image) by applying a nearly automated technique based on self-organizing maps, whereas the reference portal has already been registered with a DRR or a simulator image. The proposed system was tested on phantom data and on data from six patients. The root mean square error (RMSE) of the setup estimates was 0.8±0.3 (mean value ± standard deviation) for the phantom data and 0.3±0.3 for the patient data, respectively, by applying the two methodologies. Furthermore, statistical analysis by means of the Wilcoxon nonparametric signed test showed that the results obtained by the two methods did not differ significantly (P value > 0.05).
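The RMSE quoted above is the usual root-mean-square of the setup-error residuals; a small illustrative sketch (not the paper's code):

```python
import math

def rmse(estimated, reference):
    """Root-mean-square error between estimated and reference setup shifts."""
    if len(estimated) != len(reference):
        raise ValueError("estimate/reference length mismatch")
    return math.sqrt(
        sum((e - r) ** 2 for e, r in zip(estimated, reference)) / len(estimated)
    )

# Perfect registration gives zero error; residuals of 3 and 4 mm average
# to sqrt((9 + 16) / 2) mm.
assert rmse([1.0, 2.0], [1.0, 2.0]) == 0.0
assert abs(rmse([0.0, 0.0], [3.0, 4.0]) - math.sqrt(12.5)) < 1e-12
```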

  2. Commissioning of a MOSFET in-vivo patient dose verification system

    Jenetsky, G.O.; Brown, R.L.

    2004-01-01

TLD dosimetry has long been used for in-vivo measurements in estimating absorbed dose to critical structures on patients. Preparing TLDs for measurement and then obtaining the results is a time-consuming process taking many hours. The Thomson-Neilson 'MOSFET 20' (Metal Oxide Semiconducting Field Effect Transistor) dose assessment system allows for in-vivo measurements (preparation and results) within minutes. Before being used clinically for dose verification, the MOSFETs were tested against the manufacturer's technical specifications, and compared with results from TLDs measured under controlled experiments and patient measurements. Standard-sensitivity MOSFETs (TN-502RD) were used with the bias supply set to the high-sensitivity range. MOSFETs were tested for linearity (5-100 cGy) and their calibration factors obtained for all energies (6MV, 18MV, 6MeV, 12MeV, 16MeV, 20MeV) using the method described by Ramani. MOSFETs and TLDs were exposed to a 6MV beam for 50 MU at various depths (RW3 solid water phantom) and field sizes, and compared to results taken with an ion chamber. Measurements using both systems were also taken at the beam edge and 5 mm and 10 mm out of the field. Eleven patients who had lens dose assessment requests were measured with both TLDs and MOSFETs, and a paired t-test was performed on the results. On two patients, multiple (nine and four) MOSFET measurements were taken and the range of results compared to the range obtained from the TLDs. MOSFET linearity obtained coefficients of R² ≥ 0.996 for all energies; this compares to R² ≥ 0.996 recorded by both Ramani and Chaung. The y-intercept values varied from 0 to -2.0 mV. The greatest variation between calibration factors, measured for each energy, was 7.5%, which is substantially greater than the 3.8% quoted by the manufacturer. For the measurements taken at varying depths and field sizes, both TLDs and MOSFETs agreed with the ion chamber results within ±1 cGy. Measurements taken at the beam edge varied ±6c
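The linearity and calibration-factor figures above come from fitting the MOSFET voltage shift against delivered dose. A least-squares sketch (the dose/voltage readings below are hypothetical, not the commissioning data):

```python
def linear_fit(x, y):
    """Least-squares line y = a*x + b and the coefficient of determination R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical readings: dose (cGy) vs. MOSFET threshold-voltage shift (mV).
dose = [5.0, 10.0, 25.0, 50.0, 75.0, 100.0]
shift = [14.0, 27.0, 69.0, 138.0, 208.0, 275.0]
slope, intercept, r2 = linear_fit(dose, shift)
cal_factor = 1.0 / slope  # cGy per mV: the per-energy calibration factor
```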

  3. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    Yoshida, Toshio

In the software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns at an early software development stage. However, it is very difficult to verify task cooperation patterns at an early development stage, where task program codes are not yet complete. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that has a function for recording all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.

  4. Preliminary Validation and Verification Plan for CAREM Reactor Protection System; Modelo de Plan Preliminar de Validacion y Verificacion para el Sistema de Proteccion del Reactor CAREM

    Fittipaldi, Ana; Felix, Maciel [Comision Nacional de Energia Atomica, Centro Atomico Bariloche (Argentina)

    2000-07-01

The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer-based system). These software modules can be either in-house designs or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off-the-shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, along with the advantages of the preliminary validation and verification plan.

  5. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.
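The report's metric formulas are not given in this summary, so the rank-ordering step can only be sketched with a hypothetical scoring scheme (the factor ratings, the cost formula, and the method names below are invented for illustration):

```python
# Each method: four ease-of-use ratings and four defect-detection ("power")
# ratings, here on a 1-5 scale, mirroring the report's eight rating factors.
def cost_benefit(ease, power):
    """Toy cost-benefit score: mean power over mean cost (cost = 6 - ease)."""
    benefit = sum(power) / len(power)
    cost = 6.0 - sum(ease) / len(ease)  # easier-to-use methods cost less
    return benefit / cost

methods = {
    "static code inspection": ([4, 4, 3, 4], [3, 3, 4, 3]),
    "dynamic path testing":   ([2, 3, 2, 2], [5, 4, 5, 4]),
}
ranked = sorted(methods, key=lambda m: cost_benefit(*methods[m]), reverse=True)
```

With these invented ratings, the cheap-but-moderately-powerful inspection outranks the powerful-but-costly dynamic testing, which is the kind of trade-off the metrics are meant to expose.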

  6. Verification of the burn-up of spent fuel assemblies by means of the Consulha containment/surveillance system

    Daniel, G.; Gourlez, P.

    1991-01-01

    CONSULHA is a containment/surveillance system which has been developed as part of the French Support Programme for the IAEA Safeguards in cooperation with EURATOM and was designed to meet the IAEA EURATOM requirements for the verification of nuclear materials. This system will make it possible to count movements and verify irradiation of spent fuel assemblies in industrial facilities such as reprocessing plants and nuclear reactors

  7. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  8. VALGALI: VERIFICATION AND VALIDATION TOOL FOR THE LIBRARIES PRODUCED BY THE GALILEE SYSTEM

    Mengelle, S.

    2011-01-01

In this paper we present VALGALI, the verification and validation tool for the libraries produced by the nuclear data processing system GALILEE. The aim of this system is to provide libraries with consistent physical data for various application codes (the deterministic transport code APOLLO2, the Monte Carlo transport code TRIPOLI-4, the depletion code DARWIN, ...). For each library: are the data stored in the right place, with the right format, and so on? Are the libraries used by the various codes consistent? What is the physical quality of the cross sections and data present in the libraries? These three types of tests correspond to the classic stages of V&V. The great strength of VALGALI is that it is generic and not dedicated to one application code. Consequently, it is based on a common physical validation database whose coverage is regularly increased. For all these test cases, the input data are declined for each relevant application code. Moreover, specific test cases can exist for each application code. At present, VALGALI can check and validate the libraries of APOLLO2 and TRIPOLI-4, but in the near future VALGALI will also treat the libraries of DARWIN.

  9. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    Chung, Bub Dong

    2008-03-15

In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous work at KAERI on a two-phase, three-field pilot code. Input and output design, the TH solver, component models, special TH models, the heat structure solver, general tables, trip and control, and on-line graphics have all been designed and implemented in the final product, the SYSTF code. The C computer language was used for implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is easier and lighter than C++. The code has simple and essential features of models and correlations, special components, special TH models, and a heat structure model. However, the input features are able to simulate various scenarios, such as steady state, non-LOCA transients, and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to attain the physical validity of SYSTF code simulations.

  10. IMPACTS OF TIMBER LEGALITY VERIFICATION SYSTEM IMPLEMENTATION ON THE SUSTAINABILITY OF TIMBER INDUSTRY AND PRIVATE FOREST

    Elvida Yosefi Suryandari

    2017-04-01

International markets require producers to prove the legality of their wood products to address the issues of illegal logging and illegal trade. The Timber Legality Verification System (TLVS) has been prepared by the Government of Indonesia, covering the upstream and downstream wood industries. This paper aims to evaluate gaps in the implementation of the TLVS policy and its impact on the sustainability of the timber industry. This study used gap, descriptive, and cost-structure analyses. The study was conducted in three provinces, namely: DKI Jakarta, West Java, and D.I. Yogyakarta. Research found that the effectiveness of the TLVS implementation was low due to relatively rapid policy changes. This situation became a disincentive for investments in the timber business. The private sector perceived that TLVS policy should be applied in the upstream of the timber business; hence, the industry and market downstream have not fully supported this system. Furthermore, TLVS policy implementation was considered ineffective by the timber industry as well as by private forest managers, especially by micro industry and smallholder private forests. This situation threatens the sustainability of the timber industry and private forests. Therefore, institutions should be strengthened in order to improve the quality of human resources and the competitiveness of products.

  12. Clinical evaluation and verification of the hyperthermia treatment planning system hyperplan

    Gellermann, Johanna; Wust, Peter; Stalling, Dether; Seebass, Martin; Nadobny, Jacek; Beck, Rudolf; Hege, Hans-Christian; Deuflhard, Peter; Felix, Roland

    2000-01-01

Purpose: A prototype of the hyperthermia treatment planning system (HTPS) HyperPlan for the SIGMA-60 applicator (BSD Medical Corp., Salt Lake City, Utah, USA) has been evaluated with respect to clinical practicability and correctness. Materials and Methods: HyperPlan modules extract tissue boundaries from computed tomography (CT) images to generate regular and tetrahedral grids as patient models, to calculate electric field (E-field) distributions, and to visualize three-dimensional data sets. The finite difference time-domain (FDTD) method is applied to calculate the specific absorption rate (SAR) inside the patient. Temperature distributions are calculated by a finite-element code and can be optimized. HyperPlan was tested on 6 patients with pelvic tumors. For verification, measured SAR values were compared with calculated SAR values. Furthermore, intracorporeal E-field scans were performed and compared with calculated profiles. Results: The HTPS can be applied under clinical conditions. Measured absolute SAR (in W/kg), as well as relative E-field scans, correlated well with calculated values (±20%) using the contour-based FDTD method. Values calculated by applying the FDTD method directly on the voxel (CT) grid were less well correlated with measured data. Conclusion: The HyperPlan system proved to be clinically feasible, and the results were quantitatively and qualitatively verified for the contour-based FDTD method

  13. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    V. A. Karpuhin

    2014-01-01

Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that the calculation results in COMSOL Multiphysics obtained using the "Finer" grid (ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm⁻¹) corresponded best, both qualitatively and quantitatively, to the practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of geometrical parameters (load thickness) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that an increase in load thickness within the range from 0 to 95 mm led to a decrease in the calculated values of the natural resonance frequency of longitudinal oscillations from 26.58 to 26.35 kHz and of the electrical admittance from 0.86 to 0.44 mS.
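The 4.0% figure above is an average relative deviation of the model from measurement; a sketch of that comparison (the paired admittance values are illustrative, not the study's data):

```python
def mean_relative_error(simulated, measured):
    """Average relative deviation of model predictions from measurements, in %."""
    errors = [abs(s - m) / abs(m) for s, m in zip(simulated, measured)]
    return 100.0 * sum(errors) / len(errors)

# Illustrative admittance readings (mS) at increasing load thickness.
model = [0.86, 0.74, 0.60, 0.44]
bench = [0.88, 0.72, 0.62, 0.45]
err = mean_relative_error(model, bench)  # stays under a few percent here
```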

  14. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    The system safety analysis codes, such as RELAP5, TRAC, CATHARE, etc., have been developed based on the Fortran language during the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance; the TRACE, RELAP5-3D and MARS codes are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has changed to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP. Although recent Fortran also supports OOP, it is considered to have limited functionality compared to modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of design feasibility
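    The OOP techniques named in this abstract (encapsulation, inheritance, polymorphism) can be illustrated with a minimal sketch of how system components might be modeled — written here in Python for brevity rather than the C-family language of the paper, with all class names, fields, and the placeholder physics invented for illustration:

    ```python
    from abc import ABC, abstractmethod

    class HydraulicComponent(ABC):
        """Base class: encapsulates component state behind a common interface."""
        def __init__(self, name, volume_m3):
            self.name = name
            self.volume_m3 = volume_m3      # encapsulated geometry
            self.pressure_pa = 101_325.0    # encapsulated state

        @abstractmethod
        def advance(self, dt):
            """Advance the component state by one time step."""

    class Pipe(HydraulicComponent):
        def __init__(self, name, volume_m3, flow_kg_s):
            super().__init__(name, volume_m3)
            self.flow_kg_s = flow_kg_s

        def advance(self, dt):
            # placeholder physics: pressure drifts with flow
            self.pressure_pa -= 0.1 * self.flow_kg_s * dt

    class Pump(HydraulicComponent):
        def __init__(self, name, volume_m3, head_pa):
            super().__init__(name, volume_m3)
            self.head_pa = head_pa

        def advance(self, dt):
            self.pressure_pa += self.head_pa * dt

    # polymorphism: the driver loop treats every component uniformly
    system = [Pipe("cold-leg", 0.5, 20.0), Pump("main-pump", 0.1, 50.0)]
    for step in range(10):
        for comp in system:
            comp.advance(0.01)

    print([round(c.pressure_pa, 2) for c in system])
    ```

    The point of the sketch is the structure, not the physics: each component hides its own state and the driver loop needs only the shared `advance` interface, which is the modularity that is awkward to express in older Fortran dialects.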

  15. Design of an Active Multispectral SWIR Camera System for Skin Detection and Face Verification

    Holger Steiner

    2016-01-01

    Full Text Available Biometric face recognition is becoming more frequently used in different application scenarios. However, spoofing attacks with facial disguises are still a serious problem for state-of-the-art face recognition algorithms. This work proposes an approach to face verification based on spectral signatures of material surfaces in the short-wave infrared (SWIR) range. They allow distinguishing authentic human skin reliably from other materials, independent of the skin type. We present the design of an active SWIR imaging system that acquires four-band multispectral image stacks in real time. The system uses pulsed small-band illumination, which allows for fast image acquisition and high spectral resolution and renders it largely independent of ambient light. After extracting the spectral signatures from the acquired images, detected faces can be verified or rejected by classifying the material as “skin” or “no-skin.” The approach is extensively evaluated with respect to both acquisition and classification performance. In addition, we present a database containing RGB and multispectral SWIR face images, as well as spectrometer measurements of a variety of subjects, which is used to evaluate our approach and will be made available to the research community by the time this work is published.
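    The "skin/no-skin" decision from a small number of SWIR bands can be sketched as a normalized-difference rule. This is only a simplified illustration of the idea, not the authors' classifier; the band wavelengths and thresholds below are hypothetical (skin reflectance does drop sharply beyond roughly 1400 nm due to water absorption, which is the contrast the rule exploits):

    ```python
    def normalized_difference(a, b):
        """Contrast between two band intensities, in [-1, 1]."""
        return (a - b) / (a + b) if (a + b) else 0.0

    def classify_skin(bands):
        """bands: dict mapping a (hypothetical) SWIR wavelength in nm to reflectance.
        A strong positive contrast between the shortest band and the bands
        beyond the ~1400 nm water-absorption edge is taken as a skin signature."""
        d1 = normalized_difference(bands[1060], bands[1550])
        d2 = normalized_difference(bands[1060], bands[1650])
        return "skin" if (d1 > 0.3 and d2 > 0.3) else "no-skin"

    # toy measurements: skin is bright at 1060 nm, dark beyond 1400 nm;
    # a flat spectrum (e.g. a silicone mask) shows no such contrast
    print(classify_skin({1060: 0.55, 1300: 0.40, 1550: 0.10, 1650: 0.08}))  # skin
    print(classify_skin({1060: 0.50, 1300: 0.48, 1550: 0.45, 1650: 0.44}))  # no-skin
    ```

    A real system would classify per pixel and per detected face region, but the band-contrast idea is the same.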

  16. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  17. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Jin-Won Park; Sung Bum Pan; Yongwha Chung; Daesung Moon

    2009-01-01

    As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information such as medical and financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution.

  18. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle: requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease of use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness metric. The Effectiveness metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole

  19. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information such as medical and financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
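    The trade-off the authors evaluate — which verification steps run on the slow card processor versus the faster reader — can be sketched as a back-of-the-envelope timing model. All instruction counts and clock rates below are made-up placeholders for illustration, not the paper's measurements:

    ```python
    # hypothetical instruction counts per verification step (millions of instructions)
    STEPS = {"image_preprocess": 120.0, "minutiae_extract": 250.0, "matching": 15.0}
    CARD_MIPS = 5.0      # assumed 32-bit smart-card throughput
    READER_MIPS = 200.0  # assumed card-reader throughput

    def scenario_time(on_card):
        """Seconds to finish all steps, given the set of step names run on the card."""
        return sum(mi / (CARD_MIPS if step in on_card else READER_MIPS)
                   for step, mi in STEPS.items())

    # match-on-card: only the final matching runs inside the card,
    # so the fingerprint template never has to leave it
    print(round(scenario_time({"matching"}), 2))
    # all-on-card: every step inside the card (most private, slowest)
    print(round(scenario_time(set(STEPS)), 2))
    ```

    Even with invented numbers, the model shows why the module split matters: pushing the heavy image-processing steps to the reader changes the total by more than an order of magnitude, at the cost of transmitting (and therefore encrypting) more fingerprint data.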

  20. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information such as medical and financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  1. An automatic dose verification system for adaptive radiotherapy for helical tomotherapy

    Mo, Xiaohu; Chen, Mingli; Parnell, Donald; Olivera, Gustavo; Galmarini, Daniel; Lu, Weiguo

    2014-01-01

    dose verification system that quantifies treatment doses, and provides necessary information for adaptive planning without impeding clinical workflows.

  2. Guidelines for the verification and validation of expert system software and conventional software: Volume 5, Rationale and description of verification and validation guideline packages and procedures. Final report

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, jointly funded by the US NRC and EPRI, toward formulating guidelines for V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The guidelines themselves (and the accompanying 11 step-by-step procedures) are presented in Volume 7, User's Manual. Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge-base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, to form three classes). A V&V guideline package is provided for each combination of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems

  3. Development of prompt gamma measurement system for in vivo proton beam range verification

    Min, Chul Hee

    2011-02-01

    In radiation therapy, most research has focused on reducing unnecessary radiation dose to normal tissues and critical organs around the target tumor volume. Proton therapy is considered to be one of the most promising radiation therapy methods because of the physical characteristics of its dose distribution, delivering most of the dose just before protons come to rest at the so-called Bragg peak; that is, proton therapy allows for a very high radiation dose to the tumor volume, effectively sparing adjacent critical organs. However, the uncertainty in the location of the Bragg peak, arising not only from the uncertainty in the beam delivery system and the treatment planning method but also from anatomical changes and organ motions of a patient, can be a critical problem in proton therapy. In spite of the importance of in vivo dose verification to prevent misapplication of the Bragg peak and to guarantee both successful treatment and patient safety, there is no practical methodology to monitor the in vivo dose distribution; only a few attempts have been made so far. The present dissertation proposes the prompt gamma measurement method for monitoring of the in vivo proton dose distribution during treatment. As a key part of establishing the utility of this method, the verification of the clear relationship between the prompt gamma distribution and the proton dose distribution was accomplished by means of Monte Carlo simulations and experimental measurements. First, the physical properties of prompt gammas were investigated on the basis of cross-section data and Monte Carlo simulations. Prompt gammas are generated mainly from proton-induced nuclear interactions, and then emitted isotropically in less than 10⁻⁹ s at energies up to 10 MeV. Simulation results for the prompt gamma yield of the major elements of a human body show that within the optimal energy range of 4-10 MeV the highest number of prompt gammas is generated from oxygen, whereas over the

  4. The Integrated Safety Management System Verification Enhancement Review of the Plutonium Finishing Plant (PFP)

    BRIGGS, C.R.

    2000-01-01

    The primary purpose of the verification enhancement review was for the DOE Richland Operations Office (RL) to verify contractor readiness for the independent DOE Integrated Safety Management System Verification (ISMSV) on the Plutonium Finishing Plant (PFP). Secondary objectives included: (1) to reinforce the engagement of management and to gauge management commitment and accountability; (2) to evaluate the "value added" benefit of direct public involvement; (3) to evaluate the "value added" benefit of direct worker involvement; (4) to evaluate the "value added" benefit of the panel-to-panel review approach; and (5) to evaluate the utility of the review's methodology and its adaptability to periodic assessments of ISM status. The review was conducted on December 6-8, 1999, and involved two-hour interviews with five separate panels of individuals with various management and operations responsibilities related to PFP. A semi-structured interview process was employed by a team of five "reviewers" who directed open-ended questions to the panels, focusing on: (1) evidence of management commitment, accountability, and involvement; and (2) consideration and demonstration of stakeholder (including worker) information and involvement opportunities. The purpose of the panel-to-panel dialogue approach was to better spotlight: (1) areas of mutual reinforcement and alignment that could serve as good examples of the management commitment and accountability aspects of ISMS implementation; and (2) areas of potential discrepancy that could provide opportunities for improvement. In summary, the Review Team found major strengths to include: (1) the use of multi-disciplinary project work teams to plan and do work; (2) the availability and broad usage of multiple tools to help with planning and integrating work; (3) senior management presence and accessibility; (4) the institutionalization of worker involvement; (5) encouragement of self-reporting and self

  5. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-12-01

    The Microgravity active vibration isolation system (MAIS) is a device to reduce on-orbit vibration and to provide a lower gravity level for certain scientific experiments. The MAIS system is made up of a stator and a floater; the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to reduce the vibration from the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits, and a central controller embedded with the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module for electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from accelerometers, measures the displacement between stator and floater from position-sensitive detectors, and computes the Lorentz force current for each actuator so as to eliminate the vibration of the scientific payload while avoiding collision between the stator and the floater. This is a motion-control technique in 6 degrees of freedom (6-DOF), and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the chance to join the DLR 27th parabolic flight campaign and conduct experiments to verify the 6-DOF control technique. The experimental results validate that the 6-DOF motion-control technique is effective, and the vibration isolation performance matches what we expected based on theoretical analysis and simulation. MAIS has been planned for Chinese manned spacecraft for many microgravity scientific experiments, and the verification on parabolic flights is very important for its following missions. Additionally, we also tested some additional functions enabled by microgravity electromagnetic suspension, such as automatic catching and locking and working in fault mode. The parabolic flights produced much useful data for these experiments.
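    The control principle the abstract describes — measure floater displacement and acceleration, then command actuator currents that null the error without letting the floater hit the stator — can be sketched for a single axis as a simple PD loop. Gains, masses, and the 1 N/A force constant below are invented for illustration; the real MAIS controller is 6-DOF and certainly more sophisticated:

    ```python
    def pd_current(pos_error_m, vel_m_s, kp=400.0, kd=40.0):
        """Lorentz-actuator current command from position error and velocity
        (single axis; gains are illustrative, not flight values)."""
        return kp * pos_error_m + kd * vel_m_s

    # simulate the floater as a 1 kg free mass; force = current * 1 N/A (assumed)
    mass, dt = 1.0, 0.001
    pos, vel = 0.005, 0.0   # floater starts 5 mm off-center
    for _ in range(5000):   # 5 s of simulated time
        current = pd_current(-pos, -vel)   # drive the error toward zero
        accel = current / mass             # a = F/m
        vel += accel * dt                  # semi-implicit Euler integration
        pos += vel * dt

    print(abs(pos) < 1e-4)  # settled to within 0.1 mm of center
    ```

    With kp = 400 and kd = 40 on a 1 kg mass the loop is critically damped (natural frequency 20 rad/s), so the floater settles smoothly instead of oscillating into the stator — the "avoid collision" constraint in one dimension.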

  6. Integration of SPICE with TEK LV500 ASIC Design Verification System

    A. Srivastava

    1996-01-01

    Full Text Available The present work involves integration of the simulation stage of design of a VLSI circuit and its testing stage. The SPICE simulator, the TEK LV500 ASIC Design Verification System, and TekWaves, a test program generator for the LV500, were integrated. A software interface in 'C' language in the UNIX 'Solaris 1.x' environment has been developed between SPICE and the testing tools (TekWaves and LV500). The function of the software interface developed is multifold. It takes input from either SPICE 2G.6 or SPICE 3e.1. The output generated by the interface software can be given as an input to either TekWaves or LV500. A graphical user interface has also been developed with OpenWindows using the XView toolkit on a Sun workstation. As an example, a two-phase clock generator circuit has been considered and the usefulness of the software demonstrated. The interface software could easily be linked with VLSI design tools such as the MAGIC layout editor.

  7. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS, Public Services and Procurement Canada (PSPC, using a Leica C10 and P40 (exterior and large interior spaces and a Faro Focus (small to mid-sized interior spaces. Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  8. Enabling the usage of UML in the verification of railway systems: The DAM-rail approach

    Bernardi, S.; Flammini, F.; Marrone, S.; Mazzocca, N.; Merseguer, J.; Nardone, R.; Vittorini, V.

    2013-01-01

    The need to integrate model-based verification into industrial processes has produced several attempts to define Model-Driven solutions implementing a unifying approach to system development. A recent trend is to implement tool chains supporting the developer both in the design phase and in V&V activities. In this Model-Driven context, specific domains require proper modelling approaches, especially with regard to RAM (Reliability, Availability, Maintainability) analysis and fulfillment of international standards. This paper specifically addresses the definition of a Model-Driven approach for the evaluation of RAM attributes in railway applications to automatically generate formal models. To this aim, we extend the MARTE-DAM UML profile with concepts related to maintenance aspects and service degradation, and show that the MARTE-DAM framework can be successfully specialized for the railway domain. Model transformations are then defined to generate Repairable Fault Tree and Bayesian Network models from MARTE-DAM specifications. The whole process is applied to the railway domain in two different availability studies

  9. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and human resources and to prevent wasting time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel)
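    The core of such an automated non-regression test is language-agnostic: run the updated code on each stored input deck and diff key output values against archived baselines within a tolerance. A minimal sketch (in Python rather than the authors' VBA/Excel setup; case names, variable names, and file layout are hypothetical):

    ```python
    def non_regression_test(run_code, cases, baselines, rel_tol=1e-6):
        """Compare fresh results against stored baselines, case by case.
        run_code: function mapping an input deck to {variable: value}.
        Returns the list of (case, variable) pairs that regressed."""
        failures = []
        for case, deck in cases.items():
            results = run_code(deck)
            for var, expected in baselines[case].items():
                got = results[var]
                if abs(got - expected) > rel_tol * max(abs(expected), 1e-30):
                    failures.append((case, var))
        return failures

    # toy stand-in for the system code: one output value per input deck
    fake_code = lambda deck: {"peak_clad_temp_K": 600.0 + deck["power_MW"]}
    cases = {"LOCA-small": {"power_MW": 10.0}, "LOCA-large": {"power_MW": 50.0}}
    baselines = {"LOCA-small": {"peak_clad_temp_K": 610.0},
                 "LOCA-large": {"peak_clad_temp_K": 651.0}}  # stale baseline

    print(non_regression_test(fake_code, cases, baselines))
    # -> [('LOCA-large', 'peak_clad_temp_K')]
    ```

    An empty failure list means the update did not disturb previously working behavior; any non-empty list flags exactly which case and output variable regressed, which is the information a developer needs before committing a change.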

  10. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.
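    The verification system described here — communicating to model viewers which dataset each building element was modeled from — amounts to attaching a provenance tag to every element and reporting on it. A minimal sketch of that bookkeeping, with all element IDs, categories, and source names hypothetical:

    ```python
    from dataclasses import dataclass

    # hypothetical provenance categories, ordered roughly from most to least reliable
    SOURCES = ["laser_scan", "photogrammetry", "archival_drawing", "assumed"]

    @dataclass
    class ModelElement:
        element_id: str
        category: str
        source: str          # which dataset the geometry was modeled from

    def source_report(elements):
        """Count model elements per data source, so viewers can judge reliability."""
        counts = {s: 0 for s in SOURCES}
        for e in elements:
            counts[e.source] += 1
        return counts

    model = [
        ModelElement("W-101", "wall", "laser_scan"),
        ModelElement("W-102", "wall", "laser_scan"),
        ModelElement("C-201", "cornice", "archival_drawing"),
        ModelElement("R-301", "roof_truss", "assumed"),
    ]
    print(source_report(model))
    ```

    In a BIM authoring tool the same idea would typically be carried as a shared parameter on each element, so that schedules and color-coded views can surface which parts of the model rest on point-cloud evidence versus secondary sources.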

  11. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    Joseph, Shijo; Sunderlin, William D; Verchot, Louis V; Herold, Martin

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed. (letter)
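    A capacity assessment like the one described — many indicators grouped into a few categories, rolled up to an overall readiness rating — can be sketched as a simple aggregation. The category names, indicator scores, and threshold below are illustrative placeholders, not the study's actual criteria or data:

    ```python
    # hypothetical indicator scores (0-4) grouped into three categories
    project_scores = {
        "technical_capacity":     [4, 3, 4, 2],
        "reporting_procedures":   [3, 3, 2],
        "verification_readiness": [4, 4],
    }

    def category_means(scores):
        """Mean indicator score per category."""
        return {cat: sum(v) / len(v) for cat, v in scores.items()}

    def overall_rating(scores, high=3.0):
        """Rate readiness 'high' only if every category mean clears the threshold."""
        means = category_means(scores)
        return "high" if all(m >= high for m in means.values()) else "needs work"

    means = category_means(project_scores)
    print({k: round(v, 2) for k, v in means.items()})
    print(overall_rating(project_scores))
    ```

    Requiring every category to clear the bar, rather than averaging across all indicators, reflects the study's observation that a project can have strong technical expertise yet still lag on reporting because of external factors.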

  12. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform non-regression test, which is needed repeatedly during a developmental process of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of the non-regression testing is to verify whether, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources will be needed during the development period of a code. It may cause development period delay. To reduce the cost and the human resources and to prevent wasting time, non-regression tests need to be automatized. As a tool to develop an automatic verification program, we have used Visual Basic for Application (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (In this case, Excel)

  13. Verification of the computational dosimetry system in JAERI (JCDS) for boron neutron capture therapy

    Kumada, H; Yamamoto, K; Matsumura, A; Yamamoto, T; Nakagawa, Y; Nakai, K; Kageji, T

    2004-01-01

    Clinical trials for boron neutron capture therapy (BNCT) using the medical irradiation facility installed in Japan Research Reactor No. 4 (JRR-4) at the Japan Atomic Energy Research Institute (JAERI) have been performed since 1999. To carry out the BNCT procedure based on proper treatment planning and its precise implementation, the JAERI computational dosimetry system (JCDS), which is applicable to dose planning, has been developed at JAERI. The aim of this study was to verify the performance of JCDS. Experimental data obtained with a cylindrical water phantom were compared with calculation results from JCDS. Measurement data from IOBNCT cases at JRR-4 were also compared with retrospective evaluation data from JCDS. In the phantom comparison, the calculations and the measurements of thermal neutron flux and gamma-ray dose were in good agreement, except at the surface of the phantom. Against the measurements of clinical cases, the discrepancy of the JCDS calculations was approximately 10%. These basic and clinical verifications demonstrated that JCDS has sufficient performance for BNCT dosimetry. Further investigations are recommended toward more precise dose distributions and a faster calculation environment

  14. Fault-specific verification (FSV) - An alternative VV&T strategy for high reliability nuclear software systems

    Miller, L.A.

    1994-01-01

    The author argues that digital instrumentation and control systems can be safely applied in the nuclear industry, but that this will require changes to the way software for such systems is developed and tested. He argues for a fault-specific verification procedure to be applied to software development. This plan includes enumerating and classifying all software faults at all levels of the product, over the whole development process. While collecting these data, different methods for software verification, validation, and testing are developed, validated, and applied against all the detected faults, driving the development toward an automated product for this testing. These testing methods should continue to be developed, expanded, tested, and shared across a wide array of software products
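    The bookkeeping at the heart of fault-specific verification — record every detected fault, its life-cycle phase, and which V&V methods caught it — can be sketched as a small data structure plus two queries. All fault IDs, phases, and method names below are hypothetical:

    ```python
    # fault-specific verification records: which V&V methods caught which faults
    faults = {
        "F-001": {"phase": "requirements",   "caught_by": {"inspection"}},
        "F-002": {"phase": "design",         "caught_by": {"inspection", "model_check"}},
        "F-003": {"phase": "implementation", "caught_by": {"unit_test"}},
        "F-004": {"phase": "implementation", "caught_by": set()},   # escaped all V&V
    }

    def detection_rate(faults, method):
        """Fraction of recorded faults a given V&V method detected."""
        return sum(method in f["caught_by"] for f in faults.values()) / len(faults)

    def escapes(faults):
        """Faults no method caught -- the gaps the strategy aims to close."""
        return [fid for fid, f in faults.items() if not f["caught_by"]]

    print(detection_rate(faults, "inspection"))  # 0.5
    print(escapes(faults))                       # ['F-004']
    ```

    Accumulated over many projects, per-method detection rates broken down by fault class and phase are exactly the evidence the author wants for deciding which V&V methods to automate and standardize.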

  15. Advanced verification topics

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  16. Quantum Mechanics and locality in the K⁰ K̄⁰ system: experimental verification possibilities

    Muller, A.

    1994-11-01

    It is shown that elementary Quantum Mechanics, applied to the K⁰ K̄⁰ system, predicts peculiar long-range EPR correlations. Possible experimental verifications are discussed, and a concrete experiment with antiproton annihilations at rest is proposed. A pedestrian approach to local models shows that K⁰ K̄⁰ experimentation could provide arguments to the local realism versus quantum theory controversy. (author). 17 refs., 23 figs.

  17. Accelerating SystemVerilog UVM Based VIP to Improve Methodology for Verification of Image Signal Processing Designs Using HW Emulator

    Jain, Abhishek; Gupta, Piyush Kumar; Gupta, Dr. Hima; Dhar, Sachish

    2014-01-01

    In this paper we present the development of Acceleratable UVCs from standard UVCs in SystemVerilog and their usage in a UVM-based verification environment for Image Signal Processing designs to increase run-time performance. This paper covers the development of Acceleratable UVCs from standard UVCs for the internal control and data buses of the ST imaging group by partitioning transaction-level components and cycle-accurate signal-level components between the software simulator and hardware accelerator r...

  18. System for verification in situ of current transformers in high voltage substations; Sistema para verificacao in situ de transformadores de corrente em substacoes de alta tensao

    Mendonca, Pedro Henrique; Costa, Marcelo M. da; Dahlke, Diogo B.; Ikeda, Minoru [LACTEC - Instituto de Tecnologia para o Desenvolvimento, Curitiba, PR (Brazil)], Emails: pedro.henrique@lactec.org.br, arinos@lactec.org.br, diogo@lactec.org.br, minoru@lactec.org.br, Celso.melo@copel.com; Carvalho, Joao Claudio D. de [ELETRONORTE, Belem, PR (Brazil)], E-mail: marcelo.melo@eln.gov.br; Teixeira Junior, Jose Arinos [ELETROSUL, Florianopolis, SC (Brazil)], E-mail: jclaudio@eletrosul.gov.br; Melo, Celso F. [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)], E-mail: Celso.melo@copel.com

    2009-07-01

    This work presents an alternative proposal for performing the calibration of conventional current transformers in the field, using a verification system composed of an optical current transformer as a reference standard, suitable for installation on extra-high-voltage busbars.

  19. Clinical commissioning of an in vivo range verification system for prostate cancer treatment with anterior and anterior oblique proton beams

    Hoesl, M.; Deepak, S.; Moteabbed, M.; Jassens, G.; Orban, J.; Park, Y. K.; Parodi, K.; Bentefour, E. H.; Lu, H. M.

    2016-04-01

    The purpose of this work is the clinical commissioning of a recently developed in vivo range verification system (IRVS) for treatment of prostate cancer by anterior and anterior oblique proton beams. The IRVS is designed to perform a complete workflow for pre-treatment range verification and adjustment. It contains specifically designed dosimetry and electronic hardware and specific software for workflow control, with database connections to the treatment and imaging systems. An essential part of the IRVS is an array of Si-diode detectors designed to be mounted on the endorectal water balloon routinely used for prostate immobilization. The diodes can measure dose rate as a function of time, from which the water-equivalent path length (WEPL) and the dose received are extracted. The former is used for pre-treatment beam range verification and correction, if necessary, while the latter monitors the dose delivered to the patient rectum during treatment and serves as an additional verification. The entire IRVS workflow was tested for anterior and 30-degree inclined proton beams in both solid water and anthropomorphic pelvic phantoms, with the measured WEPL and rectal doses compared to the treatment plan. Gafchromic films were also used to measure the rectal dose and were compared to IRVS results. The WEPL measurement accuracy was on the order of 1 mm, and after beam range correction the dose received by the rectal wall was within 1.6% and 0.4% of the treatment plan, respectively, for the anterior and anterior oblique fields. We believe the implementation of IRVS would make the treatment of prostate cancer with anterior proton beams more accurate and reliable.
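
    The abstract does not spell out how the WEPL is extracted from a diode's dose-rate-versus-time trace. One simple stand-in approach, shown purely as an illustration (this is not the IRVS algorithm), is to match the measured trace against reference traces recorded at known WEPLs:

```python
# Illustrative sketch only: estimate water-equivalent path length (WEPL) by
# comparing a diode's measured dose-rate-vs-time trace against reference
# traces at known WEPLs, picking the closest in a least-squares sense.
import numpy as np

def estimate_wepl(measured, references):
    """references: dict mapping WEPL (mm) -> reference trace (np.ndarray)."""
    def mismatch(trace):
        return float(np.sum((measured - trace) ** 2))
    return min(references, key=lambda w: mismatch(references[w]))

t = np.linspace(0.0, 1.0, 50)
# Toy reference library: trace shape shifts with WEPL (purely illustrative).
refs = {w: np.clip(w / 200.0 - t, 0.0, 1.0) for w in (100, 150, 200)}
measured = refs[150] + 0.01  # a trace closest to the 150 mm reference
print(estimate_wepl(measured, refs))  # 150
```

    A real system would use calibrated reference data and a more robust fit, but the matching idea is the same.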

  20. Portal verification using the KODAK ACR 2000 RT storage phosphor plate system and EC® films. A semiquantitative comparison

    Geyer, P.; Blank, H.; Alheit, H.

    2006-01-01

    Background and Purpose: the suitability of the storage phosphor plate system ACR 2000 RT (Eastman Kodak Corp., Rochester, NY, USA), which is intended for portal verification as well as for portal simulation imaging in radiotherapy, was to be proven by comparison with a highly sensitive verification film. Material and Methods: the comparison included portal verification images of different regions (head and neck, thorax, abdomen, and pelvis) irradiated with 6- and 15-MV photons and with electrons. Each portal verification image was acquired with both the storage screen and the EC® film, using the EC-L® cassettes (both: Eastman Kodak Corp., Rochester, NY, USA) for both systems. The soft-tissue and bony contrast and the brightness were evaluated and compared in a ranking of the two images. Different phantoms were irradiated to investigate the high- and low-contrast resolution. To account for quality assurance applications, the short-time exposure of the unpacked, irradiated storage screen to green and red room lasers was also investigated. Results: in general, the quality of the processed ACR images was slightly higher than that of the films, mostly due to cases of insufficient film exposure. The storage screen was able to verify electron portals, even for low electron energies with only minor photon contamination. The laser lines were sharply and clearly visible on the ACR images. Conclusion: the ACR system may replace the film without any noticeable decrease in image quality, thereby reducing processing time, saving the cost of films, and avoiding incorrect exposures. (orig.)

  1. Verification and validation of the THYTAN code for the graphite oxidation analysis in the HTGR systems

    Shimazaki, Yosuke; Isaka, Kazuyoshi; Nomoto, Yasunobu; Seki, Tomokazu; Ohashi, Hirofumi

    2014-12-01

    Analytical models for the evaluation of graphite oxidation were implemented into the THYTAN code, which employs a mass balance and a node-link computational scheme to evaluate tritium behavior in High Temperature Gas-cooled Reactor (HTGR) systems for hydrogen production, in order to analyze graphite oxidation during air or water ingress accidents in HTGR systems. This report describes the analytical models of the THYTAN code for graphite oxidation analysis and its verification and validation (V and V) results. Mass transfer from the gas mixture in the coolant channel to the graphite surface, diffusion in the graphite, graphite oxidation by air or water, chemical reactions, and release from the primary circuit to the containment vessel through a safety valve were modeled to calculate the mass balance in the graphite and in the gas mixture in the coolant channel. The computed solutions from the THYTAN code for simple problems were compared to analytical results from hand calculations to verify the algorithms of each implemented model. A graphite oxidation experiment was then analyzed using the THYTAN code, and the results were compared to the experimental data and to computed solutions from the GRACE code, which was used for the safety analysis of the High Temperature Engineering Test Reactor (HTTR), with regard to graphite corrosion depth and oxygen concentration at the outlet of the test section, to validate the analytical models of the THYTAN code. The comparison of the THYTAN code results with the analytical solutions, the experimental data, and the GRACE code results showed good agreement. (author)
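
    The verification-by-hand-calculation step described above can be illustrated with a toy single-node mass balance. This is an illustration only (the THYTAN models are not given in the abstract); a first-order loss term stands in for graphite oxidation, and the numerical result is checked against the analytical steady state, just as the report checks THYTAN against hand calculations:

```python
# Illustrative single-node mass balance for oxygen in a coolant channel,
# with a first-order loss term standing in for graphite oxidation,
# integrated by explicit Euler steps as a node-link scheme would do per node.
def oxygen_node(c_in, k_ox, flow, volume, dt, steps):
    """Return the oxygen concentration history in one well-mixed node.

    c_in: inlet concentration, k_ox: oxidation rate constant (1/s),
    flow: volumetric flow (m^3/s), volume: node volume (m^3).
    """
    c = 0.0
    history = [c]
    for _ in range(steps):
        # dC/dt = (flow/volume) * (c_in - c) - k_ox * c
        c += dt * ((flow / volume) * (c_in - c) - k_ox * c)
        history.append(c)
    return history

hist = oxygen_node(c_in=1.0, k_ox=0.5, flow=0.1, volume=1.0, dt=0.1, steps=200)
# Analytical steady state of dC/dt = 0 is c_in*(flow/volume)/(flow/volume + k_ox)
print(round(hist[-1], 3))  # 0.167, i.e. 0.1/0.6
```

    Agreement between the numerical endpoint and the closed-form steady state is exactly the kind of algorithm check the V and V report performs on each model.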

  2. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example of a life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence in and ensure the reliability of FPGA-based nuclear safety systems, disciplined life cycle processes for specification, design implementation, and verification and validation (V&V) are needed. A specific example of how to conduct the life cycle development process and V&V on an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for a PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios and to provide input data to the under-test FPGA-based CHR protection system and to a verified C-code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.
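
    The cross-validation idea described above — driving the device under test and a verified reference module with the same scenarios and flagging disagreements — can be sketched as follows. All names and the trip condition are invented for the example; this is not the CPR1000 design:

```python
# Hedged sketch of scenario-based cross-validation: run the same operation
# scenarios through the device under test (a stand-in function for the
# FPGA-based CHR logic) and a verified reference model, and flag mismatches.
def chr_trip_reference(flow_fraction, power_fraction):
    """Reference model: trip when core flow is low relative to power."""
    return flow_fraction < 0.9 * power_fraction

def chr_trip_dut(flow_fraction, power_fraction):
    """Stand-in for the FPGA implementation under test."""
    return flow_fraction < 0.9 * power_fraction

# (flow fraction, power fraction) pairs, as an OPS model might supply.
scenarios = [(1.0, 1.0), (0.85, 1.0), (0.95, 1.0), (0.4, 0.5)]
mismatches = [s for s in scenarios
              if chr_trip_dut(*s) != chr_trip_reference(*s)]
print(len(mismatches))  # 0
```

    In the paper's setup the scenario inputs come from the OPS model and the reference is the verified C-code CHR module; the comparison logic is the same.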

  3. FMCT verification: Case studies

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about disclosure of sensitive information will be a key issue during the negotiation of FMCT verification provisions. This paper explores the general concerns about FMCT verification and demonstrates what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states, and the three states that operate unsafeguarded nuclear facilities, become parties. One focus in negotiating the FMCT will be verification, and appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper focuses on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities, and some safeguards measures, such as environmental sampling, might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  4. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J; Watkins, W

    2016-01-01

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point, using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM files and predicted images are utilized by our C++ treatment verification software, which compares them against 1024×768 resolution EPID frames acquired at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and using images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm^3 CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm^2 and the out-of-field excess EPID fluence was 8 mm^2, both below error tolerances. Conclusion: A real-time verification software, with
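
    Checks (1) and (2) above amount to comparing a measured fluence image against a predicted aperture mask. A minimal illustration, with invented thresholds and array contents (not the authors' implementation):

```python
# Illustrative sketch of the out-of-field / missing in-field fluence checks:
# given a predicted binary aperture mask and a measured fluence image, report
# fluence present outside the expected field and fluence missing inside it.
import numpy as np

def field_discrepancies(measured, aperture_mask, threshold=0.5):
    """Return (out_of_field_pixels, missing_in_field_pixels)."""
    hot = measured > threshold
    out_of_field = int(np.count_nonzero(hot & ~aperture_mask))
    missing_in_field = int(np.count_nonzero(~hot & aperture_mask))
    return out_of_field, missing_in_field

mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True               # expected open field
measured = np.where(mask, 1.0, 0.0)
measured[0, 0] = 1.0                # a leaked pixel outside the field
measured[3, 3] = 0.0                # a cold pixel inside the field
print(field_discrepancies(measured, mask))  # (1, 1)
```

    In the real system the mask is dilated by 1 mm at isocenter before counting, and the pixel counts are converted to areas before comparison with the error tolerances.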

  5. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States); Watkins, W

    2016-06-15

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point, using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM files and predicted images are utilized by our C++ treatment verification software, which compares them against 1024×768 resolution EPID frames acquired at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and using images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm^3 CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm^2 and the out-of-field excess EPID fluence was 8 mm^2, both below error tolerances. Conclusion: A real-time verification software, with

  6. Treatment verification system in radiotherapy using a digital portal imaging device. Comparison with screen/film systems

    Nakata, Manabu; Komai, Yoshinori; Okada, Takashi; Fukumoto, Satoshi; Chadani, Kazuma; Nohara, Hiroki; Kazusa, Chudou.

    1994-01-01

    A digital portal imaging (DPI) system for megavoltage photon beams was recently installed in our department. The purpose of this study is to evaluate the image quality of this system. We analyzed the following properties of the system: the relationship between measured dose rate and DPI pixel values, spatial resolution, detectability of low-contrast objects, and detectability of setup errors. The results were compared with those of conventional screen-film systems. The relationship between the measured dose rate and the pixel value of the DPI was found to be linear in the dose-rate range between 100 and 400 cGy/min. Spatial resolution was 1.25 and 0.5 mm for the DPI and the screen-film systems, respectively. The slope of the contrast-detail curves differed between the DPI and the screen-film systems; the contrast thresholds were 0.6% and 0.3% for the DPI and the screen-film systems, respectively. The detectability of setup errors of 1 mm and 2 mm with the DPI was lower than with the screen-film systems, although the difference was not very significant. In conclusion, the image quality of the DPI at the present time is slightly inferior to that of conventional screen-film systems. However, notable advantages of the DPI system are that any positional changes in patients during irradiation can be detected very quickly, and that quantitative analysis of setup variation can be obtained. The image quality of the DPI will improve as the underlying technology advances. Therefore, this verification system using the DPI device is expected to be used in clinical radiation therapy in the future. (author)
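
    The linearity check reported above can be illustrated with a straight-line fit of pixel value against dose rate over the 100–400 cGy/min range. The data points below are invented for the example:

```python
# Hedged illustration of a linearity check: fit pixel value versus dose rate
# and examine the residuals. The readings are made up for the example.
import numpy as np

dose_rate = np.array([100.0, 200.0, 300.0, 400.0])    # cGy/min
pixel_value = np.array([210.0, 405.0, 602.0, 798.0])  # invented DPI readings

slope, intercept = np.polyfit(dose_rate, pixel_value, 1)
fit = slope * dose_rate + intercept
max_residual = float(np.max(np.abs(pixel_value - fit)))
print(round(slope, 2), max_residual < 5.0)  # 1.96 True
```

    Small residuals relative to the fitted line are what justify calling the dose-rate response "linear" over the tested range.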

  7. Comparison of monitor units calculated by radiotherapy treatment planning system and an independent monitor unit verification software.

    Sellakumar, P; Arun, C; Sanjay, S S; Ramesh, S B

    2011-01-01

    In radiation therapy, the monitor units (MU) needed to deliver a treatment plan are calculated by treatment planning systems (TPS). An essential part of quality assurance is to verify the MU with an independent monitor unit calculation to correct any potential errors prior to the start of treatment. In this study, we compared the MU calculated by the TPS with that of independent MU verification software. The MU verification software was commissioned and tested for data integrity to ensure that the correct beam data were considered for MU calculations. The accuracy of the calculations was tested by creating a series of test plans and comparing them with ion chamber measurements; the results show good agreement between the two. The MU difference (MUdiff) between the monitor unit calculations of the TPS and the independent MU verification system was calculated for 623 fields from 245 patients and was analyzed by treatment site: head & neck, thorax, breast, abdomen, and pelvis. A mean MUdiff of -0.838% with a standard deviation of 3.04% was observed over all 623 fields. The site-specific standard deviations of MUdiff were as follows: abdomen and pelvis (<1.75%), head & neck (2.5%), thorax (2.32%), and breast (6.01%). The disparities were analyzed, and different correction methods were used to reduce them. © 2010 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
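
    The abstract does not give the exact definition of MUdiff, so a common convention is assumed in this sketch, MUdiff (%) = 100 × (MU_TPS − MU_independent) / MU_TPS, and the per-field values are invented:

```python
# Sketch of the MUdiff statistic under an assumed (conventional) definition.
from statistics import mean, stdev

def mu_diff_percent(mu_tps, mu_indep):
    """Percentage difference of the independent check relative to the TPS."""
    return 100.0 * (mu_tps - mu_indep) / mu_tps

# Invented example fields: (TPS MU, independent-check MU)
fields = [(100.0, 101.0), (150.0, 149.0), (200.0, 203.0), (120.0, 119.5)]
diffs = [mu_diff_percent(t, i) for t, i in fields]
print(round(mean(diffs), 3), round(stdev(diffs), 3))  # -0.354 1.059
```

    The study's reported figures (mean −0.838%, SD 3.04% over 623 fields, then broken down per site) are exactly this kind of summary computed over the full field set.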

  8. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase I and Phase II review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase I review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase II review, validating that the description approved by the Approval Authority, following or concurrent with the Phase I review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high-risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  9. In situ object counting system (ISOCS™) technique: A cost-effective tool for NDA verification in IAEA Safeguards

    Nizhnik, V.; Belian, A.; Shephard, A.; Lebrun, A.

    2011-01-01

    Nuclear material measurements using the ISOCS technique are playing an increasing role in IAEA verification activities. The ISOCS capabilities include a high sensitivity to the presence of U and Pu, the ability to detect very small amounts of material, and the ability to measure items of different shapes and sizes. In addition, the numerical absolute efficiency calibration of the germanium detector used in the technique does not require any calibration standards or reference materials. The ISOCS modelling software performs an absolute efficiency calibration for items with various container shapes, container wall materials, material compositions, material fill heights, U/Pu weight fractions, and even heterogeneously distributed emitting materials. In a number of cases, some key parameters, such as the matrix density and the U/Pu weight fraction, can be determined in addition to the emitting material mass and isotopic composition. These capabilities provide a verification solution suitable for the majority of cases where quantitative and isotopic analysis must be performed. Taking these advantages into account, the technique becomes a cost-effective solution for nuclear material non-destructive assay (NDA) verification. At present, the IAEA uses the ISOCS for a wide range of applications, including the quantitative analysis of U scrap materials, U/Pu-contaminated solid wastes, U fuel elements, and U hold-up materials. Additionally, the ISOCS is applied to some specific verification cases, such as the measurement of PuBe neutron sources and the quantification of fission products in solid wastes. In reprocessing facilities with U/Pu waste compaction, or facilities with item re-batching, continuity-of-knowledge can be assured by applying either video surveillance systems together with seals (requiring attaching/detaching and verification activities for each seal) or verification of operator declarations using quantitative measurements of items selected on a random basis.

  10. Runtime Instrumentation of SystemC/TLM2 Interfaces for Fault Tolerance Requirements Verification in Software Cosimulation

    Antonio da Silva

    2014-01-01

    Full Text Available This paper presents the design of a SystemC transaction level modelling wrapping library that can be used for the assertion of system properties, protocol compliance, or fault injection. The library uses C++ virtual table hooks as a dynamic binary instrumentation technique to inline wrappers in the TLM2 transaction path. This technique can be applied after the elaboration phase and needs neither source code modifications nor recompilation of the top level SystemC modules. The proposed technique has been successfully applied to the robustness verification of the on-board boot software of the Instrument Control Unit of the Solar Orbiter’s Energetic Particle Detector.
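
    The paper's technique is C++-specific (inlining wrappers via virtual table hooks in the TLM2 transaction path). As a conceptual analogue only, the same interposition idea can be shown with a dynamic proxy that wraps a transport-like interface to assert a property and optionally inject a fault. All class and method names here are invented, not the paper's library:

```python
# Conceptual analogue of transaction-path interposition: a proxy wraps the
# original interface, checks a protocol property before forwarding, logs the
# exchange, and can inject a fault for robustness testing.
class Bus:
    def transport(self, payload):
        return {"status": "OK", "data": payload.upper()}

class InstrumentedBus:
    def __init__(self, target, inject_fault=False):
        self._target = target
        self._inject_fault = inject_fault
        self.log = []

    def transport(self, payload):
        # Assertion of a simple protocol property before forwarding.
        assert isinstance(payload, str), "payload must be a string"
        if self._inject_fault:
            response = {"status": "ERROR", "data": None}  # injected fault
        else:
            response = self._target.transport(payload)
        self.log.append((payload, response["status"]))
        return response

bus = InstrumentedBus(Bus())
print(bus.transport("ping")["data"], bus.log)  # PING [('ping', 'OK')]
```

    The appeal of the paper's vtable-hook variant is that, unlike this wrapper, it can be applied after elaboration without modifying or recompiling the top-level SystemC modules.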

  11. TU-FG-BRB-05: A 3 Dimensional Prompt Gamma Imaging System for Range Verification in Proton Radiotherapy

    Draeger, E; Chen, H; Polf, J [University of Maryland School of Medicine, Baltimore, MD (United States); Mackin, D; Beddar, S [MD Anderson Cancer Center, Houston, TX (United States); Avery, S [University of Cape Town, Rondebosch (South Africa); Peterson, S

    2016-06-15

    Purpose: To report on the initial development of a clinical 3-dimensional (3D) prompt gamma (PG) imaging system for proton radiotherapy range verification. Methods: The imaging system under development consists of a prototype Compton camera (CC) to measure PG emission during proton beam irradiation and software to reconstruct, display, and analyze 3D images of the PG emission. For an initial test of the system, PGs were measured with the prototype CC during a 200 cGy dose delivery with clinical proton pencil beams (ranging from 100 MeV to 200 MeV) to a water phantom. Measurements were also carried out with the CC placed 15 cm from the phantom for a full-range 150 MeV pencil beam and with its range shifted by 2 mm. Reconstructed images of the PG emission were displayed by the clinical PG imaging software and compared to the dose distributions of the proton beams calculated by a commercial treatment planning system. Results: Measurements made with the new PG imaging system showed that a 3D image could be reconstructed from PGs measured during the delivery of 200 cGy of dose, and that shifts in the Bragg peak range of as little as 2 mm could be detected. Conclusion: Initial tests of the new PG imaging system show its potential to provide 3D imaging and range verification for proton radiotherapy. Based on these results, we have begun work to improve the system, with the goal that images can be produced from delivery of as little as 20 cGy, so that the system could be used for in vivo proton beam range verification on a daily basis.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF MICROBIOLOGICAL AND PARTICULATE CONTAMINANTS IN DRINKING WATER : SEPARMATIC™ FLUID SYSTEMS DIATOMACEOUS EARTH PRESSURE TYPE FILTER SYSTEM MODEL 12P-2

    The verification test of the SeparmaticTM DE Pressure Type Filter System Model 12P-2 was conducted at the UNH Water Treatment Technology Assistance Center (WTTAC) in Durham, New Hampshire. The source water was finished water from the Arthur Rollins Treatment Plant that was pretr...

  13. Dosimetric verification of a dedicated 3D treatment planning system for episcleral plaque therapy

    Knutsen, Stig; Hafslund, Rune; Monge, Odd R.; Valen, Harald; Muren, Ludvig Paul; Rekstad, Bernt Louni; Krohn, Joergen; Dahl, Olav

    2001-01-01

    Purpose: Episcleral plaque therapy (EPT) is applied in the management of some malignant ocular tumors. A customized configuration of typically 4 to 20 radioactive seeds is fixed in a gold plaque, and the plaque is sutured to the scleral surface corresponding to the base of the intraocular tumor, allowing for localized radiation dose delivery to the tumor. Minimum target doses as high as 100 Gy are directed at malignant tumor sites close to critical normal tissues (e.g., optic disc and macula). Precise dosimetry is therefore fundamental for judging the risk of normal tissue toxicity and for tumor dose prescription. This paper describes the dosimetric verification of a commercially available dedicated treatment planning system (TPS) for EPT when realistic multiple-seed configurations are applied. Materials and Methods: The TPS Bebig Plaque Simulator is used to plan EPT at our institution. Relative dose distributions in a water phantom, including central-axis depth doses and off-axis dose profiles for three different plaques, the University of Southern California (USC) No. 9 and the Collaborative Ocular Melanoma Study (COMS) 12-mm and 20-mm plaques, were measured with a diode detector. Each plaque was arranged with realistic multiple {sup 125}I seed configurations. The measured dose distributions were compared to the corresponding dose profiles calculated with the TPS. All measurements were corrected for the angular sensitivity variation of the diode. Results: Single-seed dose distributions measured with our dosimetry setup agreed with previously published data to within 3%. For the three multiple-seed plaque configurations, the measured and calculated dose distributions were in good agreement. For the central-axis depth doses, the agreement was within 4%, whereas deviations up to 11% were observed at single points far off-axis. Conclusions: The Bebig Plaque Simulator is a reliable TPS for calculating relative dose distributions around realistic multiple {sup 125}I seed
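
    The multi-seed superposition that such a TPS performs can be illustrated with a deliberately simplified model: an isotropic point source with inverse-square falloff and exponential attenuation. This is NOT the TG-43 formalism a clinical system implements, and all numbers are illustrative:

```python
# Simplified illustration of multi-seed dose superposition (point-source
# approximation with an assumed effective attenuation coefficient in water).
import math

MU_EFF = 0.11  # assumed effective attenuation coefficient, 1/mm (illustrative)

def point_dose(strength, source, point):
    """Relative dose at `point` from one seed at `source` (positions in mm)."""
    r = math.dist(source, point)
    return strength * math.exp(-MU_EFF * r) / r**2

def plaque_dose(seeds, point):
    """Sum contributions from a configuration [(strength, (x, y, z)), ...]."""
    return sum(point_dose(s, pos, point) for s, pos in seeds)

# Four equal seeds in a 6 mm square, dose evaluated along the central axis.
seeds = [(1.0, (x, y, 0.0)) for x in (-3.0, 3.0) for y in (-3.0, 3.0)]
d5 = plaque_dose(seeds, (0.0, 0.0, 5.0))
d10 = plaque_dose(seeds, (0.0, 0.0, 10.0))
print(d5 > d10)  # dose falls off with depth along the central axis: True
```

    A clinical TPS replaces the point-source kernel with the anisotropy and radial dose functions of the specific seed model, which is precisely why measurements like those above are needed to verify it.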

  14. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) the system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.

  15. USING PERFLUOROCARBON TRACERS FOR VERIFICATION OF CAP AND COVER SYSTEMS PERFORMANCE

    HEISER, J.; SULLIVAN, T.

    2001-01-01

    The Department of Energy (DOE) Environmental Management (EM) office has committed itself to an accelerated cleanup of its national facilities. The goal is to have many of the DOE legacy waste sites remediated by 2006. This includes closure of several sites (e.g., Rocky Flats and Fernald). With the increased focus on accelerated cleanup, there has been considerable concern about long-term stewardship issues in general, and verification and long-term monitoring (LTM) of caps and covers in particular. Cap and cover systems (covers) are vital remedial options that will be extensively used in meeting these 2006 cleanup goals. Every buried waste site within the DOE complex will require some form of cover system. These covers are expected to last from 100 to 1000 years or more. The stakeholders can be expected to focus on system durability and sustained performance. DOE EM has set up a national committee of experts to develop a long-term capping (LTC) guidance document. Covers are subject to subsidence, erosion, desiccation, animal intrusion, plant root infiltration, etc., all of which will affect the overall performance of the cover. Very little is available in terms of long-term monitoring other than downstream groundwater or surface water monitoring. By its very nature, this can only indicate that failure of the cover system has already occurred and contaminants have been transported away from the site. This is unacceptable. Methods that indicate early cover failure (prior to contaminant release) or predict approaching cover failure are needed. The LTC committee has identified predictive monitoring technologies as a high-priority need for DOE, for both new and existing covers. The same committee identified a Brookhaven National Laboratory (BNL) technology as one approach that may be capable of meeting the requirements for LTM. The Environmental Research and Technology Division (ERTD) at BNL developed a novel methodology for verifying and monitoring

  16. Defining an Intelligent Information System for Monitoring and Verification of Energy Management in Cities

    Tomsic, Z.; Gasic, I.; Lugaric, L.; Cacic, G.

    2011-01-01

    Improving the efficiency of energy consumption (EC) is a central theme of any energy policy. Improved energy efficiency (EE) meets three energy policy goals: security of supply, competitiveness and protection of the environment. Systematic energy management is a body of knowledge and skills based on an organizational structure that links people with assigned responsibilities, efficiency monitoring procedures and continuous measurement and improvement of energy efficiency. This body of knowledge must be supported by appropriate ICT for gathering, processing and disseminating data on EC, EE targets and information. The Energy Management Information System (EMIS) is a web application for monitoring and analysis of energy and water consumption in public buildings and is an indispensable tool for systematic energy management. The EMIS software tool connects the processes of gathering data on buildings and their energy consumption, monitoring consumption indicators, setting energy efficiency targets and reporting energy and water consumption savings. The project Intelligent Information System for Monitoring and Verification of Energy Management in Cities (ISEMIC) will distribute the EMIS software tool in the region (BiH, Slovenia and Serbia). This project also has the goal of improving a software system for utilizing EC measurements, both from smart meters and traditional measurement devices, with subsequent data processing and analysis to facilitate, upgrade and eventually replace the currently used energy management system for public buildings in Croatia. ISEMIC will enable the use of smart meters within energy management for the first time in BiH, Slovenia and Serbia, along with an analytical part which enables intelligent estimation of energy consumption based on multiple criteria. EMIS/ISEMIC will enable: Continuous updating and maintenance of a database of information on buildings; Continuous entry and monitoring of consumption data for all energy sources and water in buildings; Calculation of

  17. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base ''semantics'' and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested

  18. Architecture Support for Runtime Integration and Verification of Component-based Systems of Systems

    Gonzalez, A.; Piel, E.; Gross, H.G.

    2008-01-01

    Preprint of paper published in: ASE 2008 - 23rd IEEE/ACM International Conference on Automated Software Engineering, 15-19 September 2008; doi:10.1109/ASEW.2008.4686292 Systems-of-Systems (SoS) represent a novel kind of system, for which runtime evolution is a key requirement, as components join and

  19. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...

  20. On the Use of Static Checking in the Verification of Interlocking Systems

    Haxthausen, Anne Elisabeth; Østergaard, Peter H.

    2016-01-01

    In the formal methods community, the correctness of interlocking tables is typically verified by model checking. This paper suggests to use a static checker for this purpose and it demonstrates for the RobustRailS verification tool set that the execution time and memory usage of its static checker...

  1. 78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems

    2013-01-25

    ... of data shared. Finally, with respect to POE re-inspections, NACMPI recommended the targeting of high-risk product and high-risk imports for sampling and other verification activities during reinspection... authority; the availability of contingency plans in the country for containing and mitigating the effects of...

  2. A Particle System for Safety Verification of Free Flight in Air Traffic

    Blom, H.A.P.; Krystul, J.; Bakker, G.J.

    2006-01-01

    Under free flight, an aircrew has both the freedom to select their trajectory and the responsibility of resolving conflicts with other aircraft. The general belief is that free flight can be made safe under low traffic conditions. Increasing traffic, however, raises safety verification issues. This

  3. Introduction to the Special Issue on Specification Analysis and Verification of Reactive Systems

    Delzanno, Giorgio; Etalle, Sandro; Gabbrielli, Maurizio

    2006-01-01

    This special issue is inspired by the homonymous ICLP workshops that took place during ICLP 2001 and ICLP 2002. Extending and shifting slightly from the scope of their predecessors (on verification and logic languages) held in the context of previous editions of ICLP, the aim of the SAVE workshops

  4. Procedure generation and verification

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (''AI'') concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so the operator can then take corrective actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the chance of human error in complex, high-stress situations

  5. Guidelines for the verification and validation of expert system software and conventional software. Volume 1: Project summary. Final report

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base ''semantics'' and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally

  6. Remaining Sites Verification Package for the 1607-D4 Septic System. Attachment to Waste Site Reclassification Form 2005-036

    Carlson, R.A.

    2006-01-01

    The 1607-D4 Septic System was a septic tank and tile field that received sanitary sewage from the 115-D/DR Gas Recirculation Facility. This septic system operated from 1944 to 1968. Decommissioning took place in 1985 and 1986 when all above-grade features were demolished and the tank backfilled. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River

  7. Camera selection for real-time in vivo radiation treatment verification systems using Cherenkov imaging.

    Andreozzi, Jacqueline M; Zhang, Rongxiao; Glaser, Adam K; Jarvis, Lesley A; Pogue, Brian W; Gladstone, David J

    2015-02-01

    To identify achievable camera performance and hardware needs in a clinical Cherenkov imaging system for real-time, in vivo monitoring of the surface beam profile on patients, as novel visual information, documentation, and possible treatment verification for clinicians. Complementary metal-oxide-semiconductor (CMOS), charge-coupled device (CCD), intensified charge-coupled device (ICCD), and electron multiplying-intensified charge coupled device (EM-ICCD) cameras were investigated to determine Cherenkov imaging performance in a clinical radiotherapy setting, with one emphasis on the maximum supportable frame rate. Where possible, the image intensifier was synchronized using a pulse signal from the Linac in order to image with room lighting conditions comparable to patient treatment scenarios. A solid water phantom irradiated with a 6 MV photon beam was imaged by the cameras to evaluate the maximum frame rate for adequate Cherenkov detection. Adequate detection was defined as an average electron count in the background-subtracted Cherenkov image region of interest in excess of 0.5% (327 counts) of the 16-bit maximum electron count value. Additionally, an ICCD and an EM-ICCD were each used clinically to image two patients undergoing whole-breast radiotherapy to compare clinical advantages and limitations of each system. Intensifier-coupled cameras were required for imaging Cherenkov emission on the phantom surface with ambient room lighting; standalone CMOS and CCD cameras were not viable. The EM-ICCD was able to collect images from a single Linac pulse delivering less than 0.05 cGy of dose at 30 frames/s (fps) and pixel resolution of 512 × 512, compared to an ICCD which was limited to 4.7 fps at 1024 × 1024 resolution. An intensifier with higher quantum efficiency at the entrance photocathode in the red wavelengths [30% quantum efficiency (QE) vs previous 19%] promises at least 8.6 fps at a resolution of 1024 × 1024 and lower monetary cost than the EM-ICCD. The
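
    The adequacy criterion quoted in the abstract above (mean background-subtracted count in the region of interest exceeding 0.5% of the 16-bit maximum, i.e. 327 counts) reduces to a few lines of arithmetic. The helper names and the sample numbers below are illustrative; the threshold definition is the one stated in the abstract.

    ```python
    # Adequate-detection check from the abstract: the average electron count in
    # the background-subtracted Cherenkov ROI must exceed 0.5% of the 16-bit
    # maximum electron count value.
    ADC_MAX = 2**16 - 1                 # 65535 for a 16-bit sensor
    THRESHOLD = int(0.005 * ADC_MAX)    # 327 counts, as quoted in the abstract

    def detection_adequate(roi_counts, background_counts):
        """Mean of (ROI - background) per pixel, compared against the threshold.

        The pixel lists here are hypothetical sample data, not measurements.
        """
        diffs = [r - b for r, b in zip(roi_counts, background_counts)]
        mean = sum(diffs) / len(diffs)
        return mean > THRESHOLD

    print(THRESHOLD)                                             # 327
    print(detection_adequate([900, 850, 800], [400, 420, 380]))  # True
    ```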

  8. A formal design verification and validation on the human factors of a computerized information system in nuclear power plants

    Lee, Yong Hee; Park, Jae Chang; Cheon, Se Woo; Jung, Kwang Tae; Baek, Seung Min; Han, Seung; Park, Hee Suk; Son, Ki Chang; Kim, Jung Man; Jung Yung Woo

    1999-11-01

    This report describes a technical transfer under the title of ''A formal design verification and validation on the human factors of a computerized information system in nuclear power plants''. Human factors requirements for information system designs are extracted from various regulatory and industrial standards and guidelines, and interpreted into more specific procedures and checklists for verifying that those requirements are satisfied. A formalized implementation plan is established for human factors verification and validation of a computerized information system in nuclear power plants. Additionally, a computer support system, named DIMS-Web (Design Issue Management System), was developed in a web environment to enhance the implementation of the human factors activities. DIMS-Web has three main functions: supporting requirements review, tracking design issues, and managing issue screening evaluation. DIMS-Web has shown its benefits in practice through a trial application to the design review of the CFMS for YGN nuclear units 5 and 6. (author)

  9. Current Status of Aerosol Generation and Measurement Facilities for the Verification Test of Containment Filtered Venting System in KAERI

    Kim, Sung Il; An, Sang Mo; Ha, Kwang Soon; Kim, Hwan Yeol [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this study, the design of the aerosol generation and measurement systems is explained and their present status is described; the aerosol test plan is also presented. The Containment Filtered Venting System (CFVS) is one of the safety features intended to reduce the amount of fission products released into the environment by depressurizing the containment. Since the Chernobyl accident, regulatory agencies in several European countries, such as France, Germany, and Sweden, have demanded the installation of a CFVS, and a feasibility study on the CFVS was also performed in the U.S. After the Fukushima accident, there is a need in Korea to improve containment venting or to install a depressurization facility. As part of a Ministry of Trade, Industry and Energy (MOTIE) project, KAERI has been conducting an integrated performance verification test of the CFVS. As part of this test, aerosol generation and measurement systems were designed to simulate fission product behavior and were manufactured. Component operating conditions were determined in consideration of severe accident conditions. The test will be performed first under normal conditions and then under severe conditions of high pressure and high temperature. Difficulties that could disturb the test are expected, such as thermophoresis in the piping and vapor condensation on aerosols.

  10. Translating Activity Diagram from Duration Calculus for Modeling of Real-Time Systems and its Formal Verification using UPPAAL and DiVinE

    Muhammad Abdul Basit Ur Rehman

    2016-01-01

    Full Text Available The RTS (Real-Time Systems) are widely used in industry, home appliances, life saving systems, aircraft, and automatic weapons. These systems need high accuracy, safety, and reliability. Accurate graphical modeling and verification of such systems is challenging. Formal methods make it possible to model such systems with greater accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of this methodology for modeling and verification of large-scale real-time systems with reduced verification cost.

  11. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    Rahim, M.A.B.U.; Arif, F.

    2016-01-01

    The RTS (Real-Time Systems) are widely used in industry, home appliances, life saving systems, aircraft, and automatic weapons. These systems need high accuracy, safety, and reliability. Accurate graphical modeling and verification of such systems is challenging. Formal methods make it possible to model such systems with greater accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of this methodology for modeling and verification of large-scale real-time systems with reduced verification cost. (author)

  12. Chemical/Biological Agent Resistance Test (CBART) Test Fixture System Verification and Analytical Monitioring System Development

    2011-03-15

    ... progress was made towards the proportional integral derivative (PID) tuning. The CBART NRT analytical system was developed, moved, replumbed, and ... efficacy, or applicability of the contents hereof. The use of trade names in this report does not constitute endorsement of any commercial product ... Office; MFC: mass flow controller; MS: mass spectrometer; MSD: mass selective detector; NRT: near real-time; PID: proportional integral derivative

  13. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    Samuel, D [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Testa, M; Park, Y [Massachusetts General Hospital, Boston, MA (United States); Schneider, R; Moteabbed, M [General Hospital, Boston, MA (United States); Janssens, G; Prieels, D [Ion Beam Applications, Louvain-la-neuve, Brabant Wallon (Belgium); Orban de Xivry, J [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Lu, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Bentefour, E

    2014-06-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvements. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of opposed lateral fields as in current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols, to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification for proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, but separable into three groups for flexibility in application. A special amplifier was developed to capture extremely small signals from very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLD and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm is achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.

  14. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    Samuel, D; Testa, M; Park, Y; Schneider, R; Moteabbed, M; Janssens, G; Prieels, D; Orban de Xivry, J; Lu, H; Bentefour, E

    2014-01-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvements. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of opposed lateral fields as in current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols, to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification for proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, but separable into three groups for flexibility in application. A special amplifier was developed to capture extremely small signals from very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLD and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm is achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients

  15. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  16. Test/QA Plan for Verification of Cavity Ringdown Spectroscopy Systems for Ammonia Monitoring in Stack Gas

    The purpose of the cavity ringdown spectroscopy (CRDS) technology test and quality assurance plan is to specify procedures for a verification test applicable to commercial cavity ringdown spectroscopy technologies. The purpose of the verification test is to evaluate the performa...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES HIGH EFFICIENCY MINI PLEAT

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  18. A Scalable Approach for Hardware Semiformal Verification

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  19. Development of an expert system for success path generation and operator's action guides in NPP: Verification and validation of COSMOS

    Yang, Jun Un; Jung, Kwang Sup; Park, Chang Gyu

    1992-08-01

    For the support of emergency operation, an expert system named COSMOS (COmputerized Success-path MOnitoring System) is being developed at the Korea Atomic Energy Research Institute (KAERI). COSMOS identifies the status of the critical safety functions (CSFs) and suggests an overall response strategy with a set of success paths which restore the challenged CSFs. The status of a CSF is identified by rule-based reasoning. The overall response strategy is inferred according to the identified CSF status. The success paths are generated from the given structure descriptions of systems and a general generation algorithm. For an efficient man-machine interface, a color graphic display is utilized. COSMOS is being built on a workstation. The major tasks in building an expert system such as COSMOS are the construction of the knowledge base and the inference engine. In COSMOS, the knowledge is derived from the Emergency Operating Procedures (EOPs), and forward chaining is adopted as the inference strategy. While the knowledge base and inference engine are the most common and essential elements of an expert system, they are not the only ones. The evaluation of expert systems can not only lessen the risk of using faulty software, but also enhance the acceptability of the expert systems by both users and regulators. The evaluation of expert systems consists of system verification, validation, and user acceptance testing. Among these, in this report, we have focused our attention on verification and validation (V&V) of expert systems. We have assessed the general V&V procedures and tried to develop a specific V&V procedure for COSMOS. (Author)

  20. Computerized information system for inventory-taking and verification at a nuclear fuel fabrication plant with closed production lines

    Bahm, W.; Brueckner, C.; Hartmann, G.

    1976-01-01

    By means of a model the use of electronic data processing is studied for preparing inventory listings and for inventory verification in a fabrication plant for Pu-U mixed-oxide fuel pins. It is postulated that interruptions in operation should be avoided as much as possible. Closed-Line production is assumed so that access to nuclear material calls for special withdrawal via locks. The production line is subdivided into sections with measuring points placed in between to record the nuclear material flow. The measured results are fed to a central data acquisition and reporting system capable of calculating on-line from these results the book inventories present in the individual sections. Inventory-taking and verification are carried out simultaneously in the sections of the production line using the EDP system. The production is not interrupted for this purpose. The production stream is tagged prior to reaching a section to be measured and is subsequently measured when entering the respective section until the tag has reached the end of the section. The measurement can be verified by inspectors. Movements of nuclear materials in and from other plant areas such as the storage area are likewise fed into the central data processing system so that inventory lists can be recalled at any moment. By this means the inventory can be taken quickly and at any time. The inventory is verified in the conventional way. (author)
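
    The on-line bookkeeping described above, where a central system computes each production-line section's book inventory from the flow recorded at the measuring points between sections, can be sketched minimally as a running sum of measured inflows and outflows. The section names, event format, and units below are hypothetical, not from the paper.

    ```python
    # Sketch of per-section book-inventory tracking driven by flow measurements
    # at the points between production-line sections. Section names and the
    # transfer-event format are hypothetical illustrations.
    from collections import defaultdict

    book = defaultdict(float)  # section -> book inventory (kg nuclear material)

    def record_transfer(src: str, dst: str, measured_kg: float) -> None:
        """A measuring point reports material leaving `src` and entering `dst`."""
        book[src] -= measured_kg
        book[dst] += measured_kg

    # Receipt from the storage area, then flow through two line sections:
    record_transfer("storage", "section_1", 12.5)
    record_transfer("section_1", "section_2", 12.5)
    print(book["section_1"], book["section_2"])  # 0.0 12.5
    ```

    Because every movement, including those to and from the storage area, passes through a measured transfer, an inventory listing for any section can be produced at any moment without halting production, which is the point the abstract emphasizes.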

  1. Reinforcing of QA/QC programs in radiotherapy departments in Croatia: Results of treatment planning system verification

    Jurković, Slaven; Švabić, Manda; Diklić, Ana; Smilović Radojčić, Đeni; Dundara, Dea [Clinic for Radiotherapy and Oncology, Physics Division, University Hospital Rijeka, Rijeka (Croatia); Kasabašić, Mladen; Ivković, Ana [Department for Radiotherapy and Oncology, University Hospital Osijek, Osijek (Croatia); Faj, Dario, E-mail: dariofaj@mefos.hr [Department of Physics, School of Medicine, University of Osijek, Osijek (Croatia)

    2013-04-01

    Implementation of advanced techniques in clinical practice can greatly improve the outcome of radiation therapy, but it also makes the process much more complex with a lot of room for errors. An important part of the quality assurance program is verification of treatment planning system (TPS). Dosimetric verifications in anthropomorphic phantom were performed in 4 centers where new systems were installed. A total of 14 tests for 2 photon energies and multigrid superposition algorithms were conducted using the CMS XiO TPS. Evaluation criteria as specified in the International Atomic Energy Agency Technical Reports Series (IAEA TRS) 430 were employed. Results of measurements are grouped according to the placement of the measuring point and the beam energy. The majority of differences between calculated and measured doses in the water-equivalent part of the phantom were in tolerance. Significantly more out-of-tolerance values were observed in “nonwater-equivalent” parts of the phantom, especially for higher-energy photon beams. This survey was done as a part of continuous effort to build up awareness of quality assurance/quality control (QA/QC) importance in the Croatian radiotherapy community. Understanding the limitations of different parts of the various systems used in radiation therapy can systematically improve quality as well.

  2. Verification of the active deformation compensation system of the LMT/GTM by end-to-end simulations

    Eisentraeger, Peter; Suess, Martin

    2000-07-01

    The 50 m LMT/GTM is exposed to the climatic conditions at 4,600 m height on Cerro La Negra, Mexico. To operate the telescope within the challenging requirements of its millimeter-wavelength objective, an active approach for monitoring and compensating the structural deformations (Flexible Body Compensation, FBC) is necessary. This system includes temperature sensors and strain gages for identifying large-scale deformations of the reflector backup structure, a laser system for measuring the subreflector position, and an inclinometer system for measuring the deformations of the alidade. For compensating the monitored deformations, the telescope is equipped with additional actuators for active control of the main reflector surface and the subreflector position. The paper describes the verification of the active deformation system by finite element calculations and MATLAB simulations of the surface accuracy and the pointing, including the servo, under operational wind and thermal conditions.

  3. A Domain-specific Framework for Automated Construction and Verification of Railway Control Systems

    Haxthausen, Anne Elisabeth

    2009-01-01

    in a demand for a higher degree of automation for the development, verification, validation and test phases of projects, without impairing the thoroughness of safety-related quality measures and certification activities. Motivated by these considerations, this presentation describes an approach for automated...... elaborate safety mechanisms in order to keep the risk at the same low level that has been established for European railways until today. The challenge is further increased by the demand for shorter time-to-market periods and higher competition among suppliers of the railway domain; both factors resulting...

  4. Aspects of the design and verification of software for a computerized reactor protection system

    Voges, U.

    1976-01-01

    In contrast to hardware, software lasts forever. If software is considered correct, it remains correct for all time (unless changes are made to it). Therefore failure rates, MTBF, MTTR, etc. cannot be used for software. The main effort has to be put on: 1) how to make reliable software, and 2) how to prove software correct. The first part deals with the developmental stage, i.e. the specification, design and implementation of the software; the second part with the 'produced' software, its test and verification. (orig./RW) [de

  5. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of the FPGA-based I&C system of an NPP. • The RPS bistable fixed-setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach verified all design modules simultaneously. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, remain important issues to be resolved, and have become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can enhance the design verification and validation processes. The stages of the proposed methodology are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verification. The design verification stage includes unit testing (Very high speed integrated circuit Hardware Description Language (VHDL) tests and modified condition/decision coverage (MC/DC) tests), module testing (MATLAB/Simulink co-simulation tests), and integration testing (FPGA hardware test beds). To prove the adequacy of the proposed

  6. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Ahmed, Ibrahim; Jung, Jaecheon; Heo, Gyunyoung

    2017-01-01

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of the FPGA-based I&C system of an NPP. • The RPS bistable fixed-setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach verified all design modules simultaneously. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, remain important issues to be resolved, and have become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can enhance the design verification and validation processes. The stages of the proposed methodology are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verification. The design verification stage includes unit testing (Very high speed integrated circuit Hardware Description Language (VHDL) tests and modified condition/decision coverage (MC/DC) tests), module testing (MATLAB/Simulink co-simulation tests), and integration testing (FPGA hardware test beds). To prove the adequacy of the proposed

  7. Analysis and Verification of Message Sequence Charts of Distributed Systems with the Help of Coloured Petri Nets

    S. A. Chernenok

    2014-01-01

    Full Text Available The standard language of message sequence charts (MSC) is intended to describe scenarios of object interaction. Due to their expressiveness and simplicity, MSC diagrams are widely used in practice at all stages of system design and development. In particular, the MSC language is used for describing communication behavior in distributed systems and communication protocols. In this paper a method for the analysis and verification of MSC and HMSC diagrams is considered. The method is based on the translation of (H)MSC into coloured Petri nets. The translation algorithms cover most standard elements of MSC, including data concepts. Size estimates are given for the CPN resulting from the translation. Properties of the resulting CPN are analyzed and verified using the known system CPN Tools and the CPN verifier based on the known tool SPIN. The translation method is demonstrated with an example.

  8. Integrated Safety Management System Phase 1 and 2 Verification for the Environmental Restoration Contractor Volumes 1 and 2

    CARTER, R.P.

    2000-04-04

    DOE Policy 450.4 mandates that safety be integrated into all aspects of the management and operations of its facilities. The goal of an institutionalized Integrated Safety Management System (ISMS) is to have a single integrated system that includes Environment, Safety, and Health requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the life cycle of the Environmental Restoration (ER) Project. The purpose of this Environmental Restoration Contractor (ERC) ISMS Phase I/II Verification was to determine whether ISMS programs and processes were institutionalized within the ER Project, whether these programs and processes were implemented, and whether the system had promoted the development of a safety-conscious work culture.

  9. Software verification and validation methodology for advanced digital reactor protection system using diverse dual processors to prevent common mode failure

    Son, Ki Chang; Shin, Hyun Kook; Lee, Nam Hoon; Baek, Seung Min; Kim, Hang Bae

    2001-01-01

    The Advanced Digital Reactor Protection System (ADRPS) with diverse dual processors is being developed by the National Research Lab of KOPEC for ADRPS development. One of the ADRPS goals is to develop a digital Plant Protection System (PPS) free of Common Mode Failure (CMF). To prevent CMF, the principle of diversity is applied to both the hardware design and the software design. For hardware diversity, two different types of CPUs are used for the Bistable Processor and the Local Coincidence Logic Processor. VME-based Single Board Computers (SBCs) are used for the CPU hardware platforms. The QNX Operating System (OS) and the VxWorks OS are used for software diversity. Rigorous Software Verification and Validation (V and V) is also required to prevent CMF. In this paper, a software V and V methodology for the ADRPS is described to enhance the ADRPS software reliability and to assure high quality of the ADRPS software

  10. Development of an automated testing system for verification and validation of nuclear data

    Triplett, B. S.; Anghaie, S.; White, M. C.

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems, and performing this task by hand is tedious. The nuclear data team at Los Alamos National Laboratory (LANL), in collaboration with the University of Florida, is developing a methodology to automate the process of nuclear data verification and validation. The International Criticality Safety Benchmark Experiment Project (ICSBEP) provides a set of criticality problems that may be used to evaluate nuclear data. This process tests a number of data libraries using cases from the ICSBEP benchmark set to demonstrate how automation of these tasks may reduce errors and increase efficiency. The process is driven by an integrated set of Python scripts. Material and geometry data may be read from an existing code input file to generate a standardized template, or the template may be generated directly by the user. The user specifies the desired precision and other vital problem parameters. The Python scripts generate input decks for multiple transport codes from these templates, run and monitor individual jobs, and parse the relevant output. This output can then be used to generate reports directly or can be stored in a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. (authors)
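The template-then-parse workflow described in this abstract can be sketched briefly. The input-deck format, field names, and output line below are invented for illustration; they do not reproduce the actual LANL scripts or any real transport-code format.

```python
import re
from string import Template

# Hypothetical standardized input template for a benchmark case.
CASE_TEMPLATE = Template("title $case\nmaterial $material\nprecision $precision\n")

def make_input(case, material, precision=1e-4):
    """Generate an input deck for one benchmark case from the standardized template."""
    return CASE_TEMPLATE.substitute(case=case, material=material, precision=precision)

def parse_keff(output_text):
    """Pull the final k-eff estimate out of a (hypothetical) transport-code output."""
    m = re.search(r"final k-eff\s*=\s*([\d.]+)", output_text)
    return float(m.group(1)) if m else None

deck = make_input("heu-met-fast-001", "HEU")
keff = parse_keff("... cycle 500 done ... final k-eff = 0.99872 ...")
```

In a full pipeline the generated decks would be submitted with `subprocess`, the jobs monitored, and the parsed results written to a report or database, as the abstract describes.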

  11. Data Collection Guidelines for Consistent Evaluation of Data from Verification and Monitoring Safeguard Systems

    Castleberry, K.; Lenarduzzi, R.; Whitaker, M.

    1999-01-01

    One of the several activities that International Atomic Energy Agency (IAEA) inspectors perform in the verification of Safeguards operations is the review and correlation of data from different sources. This process is often complex because of the different forms in which the data are presented. This paper describes some of the elements necessary to create a ''standardized'' structure for the verification of data. When properly collected and formatted, data can be analyzed with off-the-shelf software applications using customized macros to automate the commands for the desired analysis. The standardized data collection methodology is based on instrumentation guidelines as well as data structure elements such as verifiable timing of data entry, automated data logging, identification codes, and others. The identification codes are used to associate data items with their sources and to correlate them with items from other data logging activities. The addition of predefined parameter ranges allows automated evaluation, with the capability to provide a data summary: a cross-index of all data related to a specific event. Instances of actual databases are used as examples. The data collection guidelines described in this paper facilitate the use of data from a variety of instrumentation platforms and also allow the instrumentation itself to be more easily applied in subsequent monitoring applications
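The automated evaluation against predefined parameter ranges described above can be sketched as a simple range check over standardized records. The record fields, station codes, and limits below are illustrative assumptions, not the actual IAEA data format.

```python
# Hypothetical predefined parameter ranges for a monitoring station.
RANGES = {"seal_pressure_kpa": (95.0, 110.0), "door_state": (0, 1)}

def evaluate(record):
    """Flag any monitored value that is missing or outside its predefined range."""
    out_of_range = {}
    for key, (lo, hi) in RANGES.items():
        value = record.get(key)
        if value is None or not (lo <= value <= hi):
            out_of_range[key] = value
    return out_of_range

# Illustrative standardized record: identification code, logged timestamp, values.
record = {"station_id": "VLT-07", "timestamp": "1999-03-01T12:00:00Z",
          "seal_pressure_kpa": 91.2, "door_state": 0}
flags = evaluate(record)   # seal pressure is below its lower bound
```

Because each record carries an identification code and timestamp, flagged entries can be cross-indexed against records from other logging activities, as the abstract suggests.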

  12. Nuclear disarmament verification

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  13. Central axis dose verification in patients treated with total body irradiation of photons using a Computed Radiography system

    Rubio Rivero, A.; Caballero Pinelo, R.; Gonzalez Perez, Y.

    2015-01-01

    To propose and evaluate a method for central-axis dose verification in patients treated with total body irradiation (TBI) with photons, using images obtained through a Computed Radiography (CR) system. Readings from the Computed Radiography (Fuji) portal imaging cassettes were correlated with the absorbed dose in water measured with an ionization chamber in 10 x 10 irradiation fields on the 60 Co unit. The analytical and graphic expression was obtained with the software 'Origin8'; the TBI patient portal verification images were processed with the software ImageJ to obtain the patient dose. To validate the results, the absorbed dose was measured with an ionization chamber in RW3 phantoms of different thicknesses, simulating real TBI conditions. Finally, a retrospective study over the last 4 years was performed, obtaining the patients' absorbed dose from the image readings and comparing it with the planned dose. The analytical equation obtained permits estimating the absorbed dose from the image pixel value and the dose measured with the ionization chamber, correlated with the patients' clinical records. These results were compared with reported evidence, obtaining a difference of less than 2%; the 3 methods were compared and the results agree within 10%. (Author)
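The core of the method above is a calibration curve relating image pixel value to ionization-chamber dose, which is then applied to patient portal images. The sketch below assumes, purely for illustration, a linear pixel-to-dose response and made-up calibration pairs; the actual fit in the paper was obtained with Origin.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical calibration data: mean CR pixel value vs chamber-measured dose (cGy).
pixels = [500.0, 1000.0, 1500.0, 2000.0]
doses = [2.0, 4.0, 6.0, 8.0]
a, b = fit_linear(pixels, doses)

def pixel_to_dose(pixel_value):
    """Estimate absorbed dose from a portal-image pixel value via the fitted curve."""
    return a * pixel_value + b

dose = pixel_to_dose(1250.0)
```

In practice the mean pixel value over a region of interest (e.g. extracted with ImageJ) would be fed into `pixel_to_dose`, and the estimate compared against the planned dose.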

  14. SU-E-T-442: Geometric Calibration and Verification of a GammaPod Breast SBRT System

    Yu, C [Univ Maryland School of Medicine, Baltimore, MD (United States); Xcision Medical Systems, Columbia, MD (United States); Niu, Y; Maton, P; Hoban, P [Xcision Medical Systems, Columbia, MD (United States); Mutaf, Y [Univ Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: The first GammaPod™ unit for prone stereotactic treatment of early stage breast cancer has recently been installed and calibrated. Thirty-six rotating circular Co-60 beams focus dose at an isocenter that traverses throughout a breast target via continuous motion of the treatment table. The breast is immobilized and localized using a vacuum-assisted stereotactic cup system that is fixed to the table during treatment. Here we report on system calibration and on verification of geometric and dosimetric accuracy. Methods: Spatial calibration involves setting the origin of each table translational axis within the treatment control system such that the relationship between beam isocenter and table geometry is consistent with that assumed by the treatment planning system. A polyethylene QA breast phantom inserted into an aperture in the patient couch is used for calibration and verification. The comparison is performed via fiducial-based registration of measured single-isocenter dose profiles (radiochromic film) with kernel dose profiles. With the table calibrations applied, measured relative dose distributions were compared with TPS calculations for single-isocenter and dynamic (many-isocenter) treatment plans. Further, table motion accuracy and linearity was tested via comparison of planned control points with independent encoder readouts. Results: After table calibration, comparison of measured and calculated single-isocenter dose profiles show agreement to within 0.5 mm for each axis. Gamma analysis of measured vs calculated profiles with 3%/2mm criteria yields a passing rate of >99% and >98% for single-isocenter and dynamic plans respectively. This also validates the relative dose distributions produced by the TPS. Measured table motion accuracy was within 0.05 mm for all translational axes. 
Conclusion: GammaPod table coordinate calibration is a straightforward process that yields very good agreement between planned and measured relative dose distributions

  15. Verification of operation of the actuator control system using the integration of the B&R Automation Studio software with a virtual model of the actuator system

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses the sequential control system of a machine for separating and grouping workpieces for processing. The problem considered concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller, where the behaviour of the actuators is verified against the logical relationships assumed in the control system. The actuators of the considered control system were three linear-motion drives (pneumatic cylinders), and the logical structure of operation of the control system is based on a signal flow graph. The tested logical structure of operation of the electro-pneumatic control system was implemented in the Automation Studio software of the B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using an OPC server as the tool for data exchange.

  16. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents a V and V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example: the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, including the concepts of the existing industrial standards IEEE Std 1012 and IEEE Std 1059. The report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities, with task entrance and exit criteria; review and audit; testing and QA records of V and V material and configuration management; production of the software verification and validation plan; and safety-critical software V and V methodology. (author). 11 refs.

  17. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop.

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents a V and V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example: the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, including the concepts of the existing industrial standards IEEE Std 1012 and IEEE Std 1059. The report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities, with task entrance and exit criteria; review and audit; testing and QA records of V and V material and configuration management; production of the software verification and validation plan; and safety-critical software V and V methodology. (author). 11 refs

  18. Main control system verification and validation of NPP digital I and C system based on engineering simulator

    Lin Meng; Hou Dong; Liu Pengfei; Yang Zongwei; Yang Yanhua

    2010-01-01

    Full-scope digital instrumentation and control (I and C) technology is being introduced in newly constructed Nuclear Power Plants (NPPs) in China; it mainly includes three parts: the control system, the reactor protection system and the engineered safety feature actuation system. For example, the SIEMENS TELEPERM XP and XS distributed control systems (DCS) have been used in the Ling Ao Phase II NPP, located in Guangdong province, China. This is the first NPP project in China in which Chinese engineers are fully responsible for the entire configuration of the actual analog and logic diagrams, although experience with NPP full-scope digital I and C is very limited. For safety, it must be ensured that the configuration is correct and that the control functions can be accomplished before the phase of real plant testing on the reactor. Therefore, preliminary verification and validation (V and V) of the I and C needs to be carried out. Besides the common and basic way, i.e. checking the diagram configuration one by one against the original design, an NPP engineering simulator is applied as another effective approach to V and V. For this purpose, a virtual NPP thermal-hydraulic model was established on the basis of the Ling Ao Phase II NPP design; the NPP simulation tools can provide plant operation parameters to the DCS, accept control signals from the I and C and give responses. During the test, one set of data acquisition equipment is used to build a connection between the engineering simulator (software) and the SIEMENS DCS I/O cabinet (hardware). In this emulation, the original diagram configuration in the DCS and the field hardware structures are kept unchanged. In this way, it is first judged whether there are any problems by observing the input and output of the DCS, without knowing the internal configuration. Then, problems can be found and corrected by understanding and checking the exact and complex configuration in detail. At last, the correctness and functionality of the control system are verified.
This method is

  19. A fast online hit verification method for the single ion hit system at GSI

    Du, G.; Fischer, B.; Barberet, P.; Heiss, M.

    2006-01-01

    For a single ion hit facility built to irradiate specific targets inside biological cells, it is necessary to prove that the ions hit the selected targets reliably, because the ion hits usually cannot be seen. This ability is traditionally tested either indirectly, by aiming at pre-etched tracks in a nuclear track detector, or directly, by making the ion tracks inside cells visible using a stain coupled to special proteins produced in response to ion hits. However, both methods are time consuming, and hits can be verified only after the experiment; targeting errors therefore cannot be corrected during the experiment. We have consequently developed a fast online hit verification method that measures the targeting accuracy electronically, with a spatial resolution of ±1 μm, before cell irradiation takes place. (authors)

  20. Verification of Ceramic Structures

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for the verification of ceramic spacecraft and instrument structures. It has been written to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, cases of use and implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
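The transfer of Weibull statistics from elementary (coupon) data to a full-scale structure mentioned above is commonly done through size scaling of the failure probability. The sketch below shows one standard form, the two-parameter Weibull law with area scaling; the strength parameters, modulus, and areas are illustrative values, not data from the guideline.

```python
import math

def failure_probability(sigma, sigma0, m, area, area0):
    """Two-parameter Weibull failure probability with area scaling:
    P_f = 1 - exp(-(A/A0) * (sigma/sigma0)^m),
    where sigma0 is the characteristic strength of a coupon of area A0
    and m is the Weibull modulus."""
    return 1.0 - math.exp(-(area / area0) * (sigma / sigma0) ** m)

# Illustrative numbers: same applied stress, coupon-sized vs full-scale stressed area.
p_coupon = failure_probability(sigma=80.0, sigma0=100.0, m=10.0, area=1.0, area0=1.0)
p_full = failure_probability(sigma=80.0, sigma0=100.0, m=10.0, area=100.0, area0=1.0)
# A larger stressed area raises the failure probability at the same stress,
# which is why coupon data cannot be applied to a full-scale part unscaled.
```

Volume scaling follows the same form with volumes in place of areas; which one applies depends on whether surface or volume flaws dominate.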

  1. HDL to verification logic translator

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly high number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, while those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  2. Methodologies for verification and validation of expert systems as a function of component, criticality and life-cycle phase

    Miller, L.

    1992-01-01

    The review of verification and validation (V and V) methods presented here is based on the results of the initial two tasks of a contract with the US Nuclear Regulatory Commission and the Electric Power Research Institute to develop and document guidelines for verifying and validating expert systems. The first task was to review the applicability of conventional software techniques to expert systems; the second was to directly survey V and V practices associated with the development of expert systems. Subsequent tasks will focus on selecting, synthesizing or developing V and V methods appropriate for the overall system, for specific expert system components, and for different phases of the life-cycle. In addition, final guidelines will most likely be developed for each of three levels of expert systems: safety-related (systems whose functions directly relate to system safety, so-called safety-critical systems), important-to-safety (systems which support the critical safety functions), and non-safety (systems which are unrelated to safety functions). For the present purposes of categorizing and discussing various types of V and V methods, the authors simplify the life-cycle and consider only two aspects: a combined development phase and a system validation phase. The authors identified a number of techniques for the first, combined, phase and two general classes of V and V techniques for the latter phase: static testing techniques, which do not involve execution of the system code, and dynamic testing techniques, which do. In the next two sections the authors review, first, the applicability to expert systems of conventional V and V techniques and, second, the techniques expert system developers actually use. In the last section the authors make some general observations

  3. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
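
The log-ratio technique mentioned in the abstract derives beam position from the ratio of signals on opposing pickup electrodes. A minimal sketch; the sensitivity and offset constants here are illustrative, not the LANSCE calibration values:

```python
import math

def logratio_position(v_a, v_b, sensitivity_mm=1.0, offset_mm=0.0):
    """Beam displacement from opposing pickup signals A and B.

    sensitivity_mm and offset_mm are calibration constants determined
    during the automated calibration pass (values here are assumed).
    """
    return sensitivity_mm * math.log10(v_a / v_b) + offset_mm

# A centered beam gives equal signals and zero displacement.
print(logratio_position(1.0, 1.0))   # 0.0
# A beam displaced toward electrode A gives a positive reading.
print(logratio_position(1.2, 0.8))
```

Because the ratio cancels common-mode gain, the periodic calibration described above mainly has to track the residual channel-to-channel drifts.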

  4. A usability review of a model checker VIS for the verification of NPP I and C system safety software

    Son, H. S.; Kwon, K. C.

    2002-01-01

    This paper discusses the usability of the model checker VIS in the verification of safety software for NPP I and C systems. The software development environment exemplified in this paper is for the PLC and the ESF-CCS, which are being developed in the KNICS project. In this environment, STATEMATE is used in the requirement analysis and design phases. The PLC is expected to be implemented in C and an assembly language because it has many interfaces with hardware such as the CPU, I/O devices, and communication devices. The ESF-CCS is to be developed using the PLC programming languages defined in the IEC 61131-3 standard. The review showed VIS to be very useful in this setting. We can expect even greater usability of VIS if we further develop techniques for code abstraction and for automatic translation from code to Verilog, the input language of VIS.
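
A model checker such as VIS decides safety properties by exploring the design's reachable state space. A toy explicit-state sketch of the underlying reachability check; the state names and transition relation are invented for illustration, not taken from the PLC or ESF-CCS designs:

```python
from collections import deque

# Transition relation of a toy controller, given as successor lists.
# Safety property to check: the state 'error' must be unreachable.
TRANSITIONS = {
    'idle':     ['running'],
    'running':  ['idle', 'stopping'],
    'stopping': ['idle'],
    'error':    [],   # would only be entered on a design fault
}

def check_safety(init, bad):
    """Breadth-first reachability: True iff no bad state is reachable."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if s == bad:
            return False
        for t in TRANSITIONS[s]:
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True

print(check_safety('idle', 'error'))  # True
```

Tools like VIS perform this search symbolically over Verilog models, which is why the code-to-Verilog translation mentioned above matters.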

  5. The verification methodologies for a software modeling of Engineered Safety Features- Component Control System (ESF-CCS)

    Lee, Young-Jun; Cheon, Se-Woo; Cha, Kyung-Ho; Park, Gee-Yong; Kwon, Kee-Choon

    2007-01-01

    The safety of software is not guaranteed by simple testing, which reviews only the static functions of the software; the dynamic behavior of the software is not covered. The Ariane 5 rocket accident and the failure of the Virtual Case File project were caused by software faults: although the software was tested thoroughly, potential errors remained. There are several methods to address these problems. One of them is formal methodology, which describes the software requirements as a formal specification during the software life cycle and verifies the specified design. This paper suggests methods that verify a design described as a formal specification. We apply these methods to the software of an ESF-CCS (Engineered Safety Features-Component Control System) and use the SCADE (Safety Critical Application Development Environment) tool to carry out the suggested verification methods.

  6. Guidelines for the verification and validation of expert system software and conventional software: User's manual. Volume 7

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V&V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V&V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V&V methods is most appropriate for those conditions. The V&V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V&V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately.
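
The selection step in Section 1 of the guide, where stringency, development stage, and component jointly determine a guideline package, is essentially a table lookup. A sketch with invented package numbers and category names (the real mapping is defined in the report itself):

```python
# Hypothetical lookup keyed on (stringency, stage, component); these
# entries are illustrative, not the guide's actual package assignments.
PACKAGES = {
    ('high',   'requirements',   'knowledge_base'):    'Package 1',
    ('high',   'implementation', 'inference_engine'):  'Package 4',
    ('medium', 'implementation', 'conventional_code'): 'Package 9',
    ('low',    'testing',        'user_interface'):    'Package 14',
}

def select_package(stringency, stage, component):
    """Map the three Section-1 answers to a V&V guideline package."""
    try:
        return PACKAGES[(stringency, stage, component)]
    except KeyError:
        raise ValueError('combination not covered in this sketch')

print(select_package('medium', 'implementation', 'conventional_code'))
```

Encoding the mapping as data rather than logic mirrors the guide's structure: the packages in Section 2 can be revised without changing the selection procedure.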

  7. Guidelines for the verification and validation of expert system software and conventional software: User's manual. Volume 7

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V&V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V&V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V&V methods is most appropriate for those conditions. The V&V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V&V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately.

  8. Development and Verification of a Mobile Shelter Assessment System "Rapid Assessment System of Evacuation Center Condition Featuring Gonryo and Miyagi (RASECC-GM)" for Major Disasters.

    Ishii, Tadashi; Nakayama, Masaharu; Abe, Michiaki; Takayama, Shin; Kamei, Takashi; Abe, Yoshiko; Yamadera, Jun; Amito, Koichiro; Morino, Kazuma

    2016-10-01

    Introduction There were 5,385 deceased and 710 missing in the Ishinomaki medical zone following the Great East Japan Earthquake that occurred in Japan on March 11, 2011. The Ishinomaki Zone Joint Relief Team (IZJRT) was formed to unify the relief teams of all organizations joining in support of the Ishinomaki area. The IZJRT expanded relief activity as they continued to manually collect and analyze assessments of essential information for maintaining health in all 328 shelters using a paper-type survey. However, the IZJRT spent an enormous amount of time and effort entering and analyzing these data because the work was vastly complex. Therefore, an assessment system must be developed that can tabulate shelter assessment data correctly and efficiently. The objective of this report was to describe the development and verification of a system to rapidly assess evacuation centers in preparation for the next major disaster. Report Based on experiences with the complex work during the disaster, software called the "Rapid Assessment System of Evacuation Center Condition featuring Gonryo and Miyagi" (RASECC-GM) was developed to enter, tabulate, and manage the shelter assessment data. Further, a verification test was conducted during a large-scale Self-Defense Force (SDF) training exercise to confirm its feasibility, usability, and accuracy. The RASECC-GM comprises three screens: (1) the "Data Entry screen," allowing for quick entry on tablet devices of 19 assessment items, including shelter administrator, living and sanitary conditions, and a tally of the injured and sick; (2) the "Relief Team/Shelter Management screen," for registering information on relief teams and shelters; and (3) the "Data Tabulation screen," which allows tabulation of the data entered for each shelter, as well as viewing and sorting from a disaster headquarters' computer. During the verification test, data of mock shelters entered online were tabulated quickly and accurately on a mock disaster
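
The entry-then-tabulation flow described for RASECC-GM can be sketched minimally: reports arrive per shelter, the latest report wins, and problem shelters are flagged for the headquarters view. The field names below are illustrative stand-ins, not the system's actual 19 assessment items:

```python
# Minimal stand-in for an entry/tabulation flow of shelter assessments.
entries = [
    {'shelter': 'A', 'evacuees': 120, 'sick': 3, 'water_ok': True},
    {'shelter': 'A', 'evacuees': 118, 'sick': 5, 'water_ok': True},
    {'shelter': 'B', 'evacuees': 40,  'sick': 0, 'water_ok': False},
]

def tabulate(records):
    """Keep the latest report per shelter and flag sanitation problems."""
    latest = {}
    for r in records:  # records arrive in submission order
        latest[r['shelter']] = r
    flagged = [s for s, r in latest.items() if not r['water_ok']]
    return latest, flagged

latest, flagged = tabulate(entries)
print(flagged)  # ['B']
```

Replacing the paper survey with structured records like these is what makes the headquarters-side sorting and viewing described above fast and accurate.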

  9. Guidelines for the verification and validation of expert system software and conventional software. Volume 7, User's manual: Final report

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    Reliable software is required for nuclear power industry applications. Verification and validation techniques applied during the software development process can help eliminate errors that could inhibit the proper operation of digital systems and cause availability and safety problems. Most of the techniques described in this report are valid for conventional software systems as well as for expert systems. The project resulted in a set of 16 V&V guideline packages and 11 sets of procedures based on the class, development phase, and system component being tested. These guideline packages and procedures help a utility define the level of V&V, which involves evaluating the complexity and type of software component along with the consequences of failure. In all, the project identified 153 V&V techniques for conventional software systems and demonstrated their application to all aspects of expert systems except for the knowledge base, which requires specially developed tools. Each of these conventional techniques covers from 2 to 52 types of conventional software defects, and each defect is covered by 21 to 50 V&V techniques. The project also identified automated tools to support V&V activities.

  10. A study on a systematic approach of verification and validation of a computerized procedure system: ImPRO

    Qin, Wei; Seong, Poong Hyun

    2003-01-01

    Paper-Based Procedures (PBPs) and Computerized Procedure Systems (CPSs) are studied to demonstrate the need to develop CPSs for NPP I and C systems. A computerized procedure system is essentially a software system, and all the desired and undesired properties of a software system can be described and evaluated as software qualities. Generally, software qualities are categorized into product quality and process quality; to achieve product quality, the process quality of a software system must also be considered and achieved. The characteristics of a CPS are described in order to analyse the product and process of an example CPS, ImPRO. At the same time, several main product and process issues are analysed from a Verification and Validation (V and V) point of view. It is concluded that V and V activities can themselves be regarded as a software development process; this point of view is then applied to the V and V activities of ImPRO as a systematic approach to its V and V. To support and realize this approach, suitable testing technologies and testing strategies are suggested.

  11. Methodology and tools for independent verification and validation of computerized I and C systems important to safety

    Lindner, A.; Miedl, H.

    1998-01-01

    Modular software-based I and C systems are state of the art in industrial automation, and software-based systems are increasingly applied to I and C functions important to safety in nuclear power plants. According to existing national and international guidelines and standards, the assessment of these systems calls for appropriate test methods and tools. The use of tools should improve the quality of the assessment process while limiting its cost. The paper outlines the structure of the independent verification and validation (V and V) process of the Teleperm XS system and the lessons learnt from this process. Furthermore, the tools used for V and V of the Teleperm XS software are discussed. The recently developed tool VALIDATOR, dedicated to V and V of the plant-specific I and C functions, is described in more detail. We consider V and V of the basic software components and the system software to be required only once, but the C source codes of the plant-specific functional diagrams have to be checked for each application separately. The VALIDATOR is designed to perform this task. It gives evidence of compliance of the automatically generated C source code with the graphical design of the functional diagrams in reasonable time and at acceptable cost. The working method, performance and results of the VALIDATOR are shown by means of an actual example. (author)

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: WATTS PREMIER M-SERIES M-15,000 REVERSE OSMOSIS TREATMENT SYSTEM

    Verification testing of the Watts Premier M-Series M-15,000 RO Treatment System was conducted over a 31-day period from April 26, 2004, through May 26, 2004. This test was conducted at the Coachella Valley Water District (CVWD) Well 7802 in Thermal, California. The source water...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF MOBILE SOURCE EMISSIONS CONTROL DEVICES/CLEAN DIESEL TECHNOLOGIES FUEL BORNE CATALYST WITH CLEANAIR SYSTEM'S DIESEL OXIDATION CATALYST

    The Environmental Technology Verification report discusses the technology and performance of the Fuel-Borne Catalyst with CleanAir System's Diesel Oxidation Catalyst manufactured by Clean Diesel Technologies, Inc. The technology is a fuel-borne catalyst used in ultra low sulfur d...

  14. Specification and Verification of Medical Monitoring System Using Petri-nets.

    Majma, Negar; Babamir, Seyed Morteza

    2014-07-01

    To monitor patient behavior, data are collected from the patient's body by a medical monitoring device, and embedded software calculates the output. Incorrect calculations may endanger the patient's life if the software fails to meet the patient's requirements. Accordingly, the veracity of the software's behavior is a matter of concern in medicine; moreover, the data collected from the patient's body are fuzzy. Some methods have already dealt with monitoring medical monitoring devices; however, model-based monitoring of the fuzzy computations of such devices has been addressed less. The present paper synthesizes a fuzzy Petri-net (FPN) model to verify the behavior of a sample medical monitoring device, a continuous infusion insulin (INS) device, because the Petri-net (PN) is a formal and visual method for verifying software behavior. The device is worn by diabetic patients, and the software calculates the INS dose and makes the injection decision. The inputs and outputs of the infusion INS software are not crisp in the real world; therefore, we represent them as fuzzy variables, and we use an FPN instead of a classical PN to model them. The paper follows three steps to synthesize an FPN for verification of the infusion INS device: (1) definition of fuzzy variables, (2) definition of fuzzy rules, and (3) design of the FPN model to verify the software behavior.
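
The fuzzy-variable and fuzzy-rule steps can be illustrated with a plain fuzzy-rule evaluation: glucose membership, rule firing, then a weighted-average defuzzification to a dose. The membership breakpoints and doses below are invented for illustration and have no clinical meaning:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def insulin_dose(glucose_mg_dl):
    # Rule firing strengths for 'normal', 'high', 'very high' glucose.
    mu = {
        'normal':    tri(glucose_mg_dl, 60, 100, 140),
        'high':      tri(glucose_mg_dl, 120, 180, 240),
        'very_high': tri(glucose_mg_dl, 200, 300, 400),
    }
    dose = {'normal': 0.0, 'high': 2.0, 'very_high': 5.0}  # illustrative units
    total = sum(mu.values())
    if total == 0.0:
        return 0.0
    # Weighted-average (centroid-style) defuzzification.
    return sum(mu[k] * dose[k] for k in mu) / total

print(insulin_dose(100))  # 0.0 (fully 'normal')
print(insulin_dose(180))  # 2.0 (fully 'high')
```

In the FPN of the paper, such rules become transitions whose firing strengths propagate fuzzy tokens, so the same computation can be verified graphically.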

  15. Performance verification and system parameter identification of spacecraft tape recorder control servo

    Mukhopadhyay, A. K.

    1979-01-01

    Design adequacy of the lead-lag compensator of the frequency loop, accuracy checking of the analytical expression for the electrical motor transfer function, and performance evaluation of the speed control servo of the digital tape recorder used on-board the 1976 Viking Mars Orbiters and Voyager 1977 Jupiter-Saturn flyby spacecraft are analyzed. The transfer functions of the most important parts of a simplified frequency loop used for test simulation are described and ten simulation cases are reported. The first four of these cases illustrate the method of selecting the most suitable transfer function for the hysteresis synchronous motor, while the rest verify and determine the servo performance parameters and alternative servo compensation schemes. It is concluded that the linear methods provide a starting point for the final verification/refinement of servo design by nonlinear time response simulation and that the variation of the parameters of the static/dynamic Coulomb friction is as expected in a long-life space mission environment.

  16. OBSERVATION AND CONFIRMATION OF SIX STRONG-LENSING SYSTEMS IN THE DARK ENERGY SURVEY SCIENCE VERIFICATION DATA

    Nord, B.; Buckley-Geer, E.; Lin, H.; Diehl, H. T.; Kuropatkin, N.; Allam, S.; Finley, D. A.; Flaugher, B.; Gaitsch, H.; Merritt, K. W.; Helsby, J.; Amara, A.; Collett, T.; Caminha, G. B.; De Bom, C.; Da Pereira, M. Elidaiana S.; Desai, S.; Dúmet-Montoya, H.; Furlanetto, C.; Gill, M.

    2016-01-01

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ∼ 0.80–3.2 and in i-band surface brightness i_SB ∼ 23–25 mag arcsec⁻² (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ∼ 5″–9″ and M_enc ∼ 8 × 10¹² to 6 × 10¹³ M_⊙, respectively.
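
The enclosed mass quoted above follows from the Einstein radius via M_enc = (c²/4G) θ_E² D_l D_s / D_ls for a circularly symmetric lens, where D_l, D_s, and D_ls are angular-diameter distances to the lens, to the source, and between them. A sketch with assumed distances (not the paper's fitted values):

```python
import math

C = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
MPC = 3.086e22     # megaparsec, m

def einstein_mass(theta_e_arcsec, d_l_mpc, d_s_mpc, d_ls_mpc):
    """Mass enclosed by the Einstein radius, in solar masses:
    M = (c^2 / 4G) * theta_E^2 * D_l * D_s / D_ls."""
    theta = theta_e_arcsec * math.pi / (180 * 3600)  # arcsec -> radians
    m_kg = (C**2 / (4 * G)) * theta**2 * (d_l_mpc * d_s_mpc / d_ls_mpc) * MPC
    return m_kg / M_SUN

# Illustrative angular-diameter distances in Mpc.
print(f"{einstein_mass(7.0, 1000, 1700, 1200):.2e}")  # ~8.5e+12 M_sun
```

A 7″ Einstein radius at these assumed distances lands around 10¹³ M_⊙, consistent with the quoted range for cluster-scale lenses.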

  17. OBSERVATION AND CONFIRMATION OF SIX STRONG-LENSING SYSTEMS IN THE DARK ENERGY SURVEY SCIENCE VERIFICATION DATA

    Nord, B.; Buckley-Geer, E.; Lin, H.; Diehl, H. T.; Kuropatkin, N.; Allam, S.; Finley, D. A.; Flaugher, B.; Gaitsch, H.; Merritt, K. W. [Fermi National Accelerator Laboratory, P.O. Box 500, Batavia, IL 60510 (United States); Helsby, J. [Kavli Institute for Cosmological Physics, University of Chicago, Chicago, IL 60637 (United States); Amara, A. [Department of Physics, ETH Zurich, Wolfgang-Pauli-Strasse 16, CH-8093 Zurich (Switzerland); Collett, T. [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, PO1 3FX (United Kingdom); Caminha, G. B.; De Bom, C.; Da Pereira, M. Elidaiana S. [ICRA, Centro Brasileiro de Pesquisas Físicas, Rua Dr. Xavier Sigaud 150, CEP 22290-180, Rio de Janeiro, RJ (Brazil); Desai, S. [Excellence Cluster Universe, Boltzmannstrasse 2, D-85748 Garching (Germany); Dúmet-Montoya, H. [Universidade Federal do Rio de Janeiro—Campus Macaé, Rua Aloísio Gomes da Silva, 50—Granja dos Cavaleiros, Cep: 27930-560, Macaé, RJ (Brazil); Furlanetto, C. [University of Nottingham, School of Physics and Astronomy, Nottingham NG7 2RD (United Kingdom); Gill, M., E-mail: nord@fnal.gov [SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); Collaboration: DES Collaboration; and others

    2016-08-10

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ∼ 0.80–3.2 and in i-band surface brightness i_SB ∼ 23–25 mag arcsec⁻² (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ∼ 5″–9″ and M_enc ∼ 8 × 10¹² to 6 × 10¹³ M_⊙, respectively.

  18. Implementation of an RBF neural network on embedded systems: real-time face tracking and identity verification.

    Yang, Fan; Paindavoine, M

    2003-01-01

    This paper describes a real-time vision system that localizes faces in video sequences and verifies their identity. These processes use image processing techniques based on the radial basis function (RBF) neural network approach. The robustness of this system has been evaluated quantitatively on eight video sequences. We have adapted our model for a face recognition application using the Olivetti Research Laboratory (ORL), Cambridge, UK, database so as to compare its performance against other systems. We also describe three hardware implementations of our model on embedded systems based on the field programmable gate array (FPGA), zero instruction set computer (ZISC) chips, and the digital signal processor (DSP) TMS320C62, respectively. We analyze the algorithm complexity and present results of the hardware implementations in terms of the resources used and processing speed. The success rates of face tracking and identity verification are 92% (FPGA), 85% (ZISC), and 98.2% (DSP), respectively. For the three embedded systems, the processing speeds for an image size of 288 × 352 are 14 images/s, 25 images/s, and 4.8 images/s, respectively.
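
An RBF verifier scores an input by Gaussian activations around stored prototypes with a linear readout. A toy two-dimensional sketch of that scoring (real face descriptors are high-dimensional, and the prototypes, weights, and threshold here are invented):

```python
import math

def rbf_score(x, prototypes, weights, sigma=1.0):
    """Weighted sum of Gaussian activations over prototype centers."""
    score = 0.0
    for c, w in zip(prototypes, weights):
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        score += w * math.exp(-d2 / (2 * sigma ** 2))
    return score

# One prototype for the claimed identity (positive weight) and one
# for an impostor class (negative weight).
protos = [(0.0, 0.0), (5.0, 5.0)]
weights = [1.0, -1.0]

def accept(x, threshold=0.5):
    return rbf_score(x, protos, weights) > threshold

print(accept((0.1, -0.1)))  # True  (near the enrolled prototype)
print(accept((4.9, 5.2)))   # False (near the impostor prototype)
```

The computation is dominated by distance evaluations against the prototype set, which is why it maps naturally onto the FPGA, ZISC, and DSP targets compared in the paper.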

  19. Material integrity verification radar

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ
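
The depth estimates for reinforcing bars mentioned above come from converting an echo's two-way travel time using the wave speed in concrete, v = c/√εr. A sketch with an assumed relative permittivity (concrete typically falls roughly in the 4–10 range; 6.0 here is illustrative, not a measured value):

```python
C = 2.998e8  # free-space speed of light, m/s

def reflector_depth(two_way_ns, eps_r=6.0):
    """Depth of a reflector (e.g. a rebar) from two-way travel time.

    two_way_ns: round-trip echo time in nanoseconds.
    eps_r: assumed relative permittivity of the concrete.
    """
    v = C / eps_r ** 0.5                 # wave speed in concrete
    return v * (two_way_ns * 1e-9) / 2   # one-way distance, metres

print(f"{reflector_depth(2.0):.3f} m")  # ~0.122 m for a 2 ns echo
```

In a stepped-frequency system the travel time itself is obtained by transforming the measured frequency response into the time domain before applying a conversion like this one.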

  20. Verification of Abbott 25-OH-vitamin D assay on the architect system

    Katrina Hutchinson

    2017-04-01

    Full Text Available Objectives: Analytical and clinical verification of both the old and new generations of the Abbott total 25-hydroxyvitamin D (25OHD) assay, and an examination of reference intervals. Methods: Determination of between-run precision, and Deming comparison of patient sample results for 25OHD on the Abbott Architect, DiaSorin Liaison and AB SCIEX API 4000 (LC-MS/MS). Establishment of the uncertainty of measurement for the 25OHD Architect methods using the old and new generations of the reagents, and estimation of the reference interval in a healthy Irish population. Results: For between-run precision the manufacturer claims coefficients of variation (CVs) of 2.8% and 4.6% for the high and low controls, respectively. Our instrument showed CVs between 4% and 6.2% for all control levels on both generations of the Abbott reagents. The between-run uncertainties were 0.28 and 0.36, with expanded uncertainties of 0.87 and 0.98 for the old and the new generations of reagent, respectively. The differences between all methods used for patients' samples were within total allowable error, and the instruments produced clinically equivalent results. The results covered the medical decision points of 30, 40, 50 and 125 nmol/L. The reference interval for total 25OHD in our healthy Irish subjects was lower than recommended levels (24–111 nmol/L). Conclusion: In a clinical laboratory, the Abbott 25OHD immunoassays are a useful, rapid and accurate method for measuring total 25OHD. The new generation of the assay was confirmed to be reliable, accurate, and a good indicator for 25OHD measurement. More study is needed to establish reference intervals that correctly represent the healthy population in Ireland.
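
The between-run CV and expanded uncertainty quoted above are standard computations. A sketch with invented QC results; k = 2 is the conventional coverage factor for roughly 95% coverage and need not match the factor used in the study:

```python
import statistics

def cv_percent(values):
    """Between-run coefficient of variation, in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def expanded_uncertainty(u_combined, k=2):
    """Expanded uncertainty U = k * u_c."""
    return k * u_combined

# Illustrative QC results for one control level (nmol/L), not the
# study's raw data.
runs = [48.1, 50.3, 49.2, 51.0, 47.6]
print(f"CV = {cv_percent(runs):.1f}%")
print(f"U = {expanded_uncertainty(0.36):.2f}")
```

Comparing a CV computed this way against the manufacturer's claim is exactly the verification step the Results section describes.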