WorldWideScience

Sample records for attribute verification system

  1. AVNG System Software-Attribute Verification System with Information Barriers for Mass Isotopic Measurements

    International Nuclear Information System (INIS)

    Elmont, T.H.; Langner, Diana C.; MacArthur, D.W.; Mayo, D.R.; Smith, M.K.; Modenov, A.

    2005-01-01

    This report describes the software development for the plutonium attribute verification system AVNG. A brief synopsis of the technical solution for the measurement system is presented, and the main tasks for the software development now underway are formulated. The development tasks are illustrated with software structural flowcharts, a measurement-system state diagram, and a description of the software. The current status of the AVNG software development is summarized.

  2. Progress of the AVNG System - Attribute Verification System with Information Barriers for Mass Isotopics Measurements

    International Nuclear Information System (INIS)

    Budnikov, D.; Bulatov, M.; Jarikhine, I.; Lebedev, B.; Livke, A.; Modenov, A.; Morkin, A.; Razinkov, S.; Tsaregorodtsev, D.; Vlokh, A.; Yakovleva, S.; Elmont, T.H.; Langner, D.C.; MacArthur, D.W.; Mayo, D.R.; Smith, M.K.; Luke, S.J.

    2005-01-01

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support from Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material without releasing any classified information. The system comprises a neutron multiplicity counter and a gamma-spectrometry system based on a high-purity germanium detector (nominal relative efficiency of 50% at 1332 keV) and the DSPEC Plus digital gamma-ray spectrometer. The neutron multiplicity counter is a three-ring counter with 164 3He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.
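
    A minimal sketch of the idea behind such an information barrier is shown below: the (notionally classified) measured quantities are reduced to unclassified yes/no attribute results, and only those booleans would ever be shown to an inspector. The attribute names, thresholds, and input values are hypothetical illustrations, not the AVNG's actual criteria.

        # Illustrative only: hypothetical attributes and thresholds, not the real AVNG logic.
        def evaluate_attributes(pu_mass_g, pu240_to_pu239_ratio,
                                mass_threshold_g=500.0, weapons_grade_ratio_max=0.10):
            """Reduce measured values to unclassified yes/no attribute results."""
            return {
                "plutonium_present": pu_mass_g > 0.0,
                "mass_above_threshold": pu_mass_g >= mass_threshold_g,
                "weapons_grade_isotopics": pu240_to_pu239_ratio <= weapons_grade_ratio_max,
            }

        if __name__ == "__main__":
            # Behind the barrier: only the yes/no results cross to the inspector's side.
            for name, result in evaluate_attributes(812.0, 0.06).items():
                print(f"{name}: {'YES' if result else 'NO'}")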

  3. Nonintrusive verification attributes for excess fissile materials

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Eccleston, G.W.; Fearey, B.L.

    1997-10-01

    Under US initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The US has expressed the intent to place these materials under international inspections as soon as practicable. To support these commitments, members of the US technical community are examining a variety of nonintrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of the work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.), especially those related to classified materials and items. The authors are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status

  4. Verification of classified fissile material using unclassified attributes

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-01-01

    This paper reports on the most recent efforts of US technical experts to explore verification by the IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. The authors believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated

  5. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  6. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and a cost-sensitive classifier was found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. Such a system plays an important role in forensic and civilian applications.

  7. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. The authors wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, they wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, this system guards personnel. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur
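
    The record describes a camera-driven system that uses image processing to detect change in vault areas. A minimal sketch of that idea, using simple frame differencing with a pixel-count threshold (the EIVSystem's actual image-processing algorithms are not described in the record), might look like this:

        import numpy as np

        def change_detected(reference, current, pixel_delta=30, changed_fraction=0.01):
            """Flag a change when enough pixels differ noticeably from the reference frame."""
            diff = np.abs(current.astype(np.int16) - reference.astype(np.int16))
            return np.count_nonzero(diff > pixel_delta) / diff.size > changed_fraction

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            reference = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
            moved = reference.copy()
            moved[100:200, 100:200] = 255                 # simulate an object being moved
            print(change_detected(reference, reference))  # False: nothing changed
            print(change_detected(reference, moved))      # True: a region of the image changed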

  8. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  9. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  10. Sasset: Attribute verification of accountancy records in nuclear safeguards Pt.1: Theoretical basis

    International Nuclear Information System (INIS)

    Mueller, K.H.

    1980-01-01

    This study demonstrates the construction of a sampling plan to be followed by the inspector during the statistical verification (in attribute mode) of the accountancy records of a nuclear fuel fabrication plant. It presents the instructions defining the inspector's tasks and guiding his actions, and it permits validation of his decisions. Part I deals with the statistical foundations of the verification technique in attribute mode. Part II is a manual for the corresponding computer code SASSET developed at JRC Ispra
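
    For orientation, the kind of relation on which such attribute-mode sampling plans rest (a standard safeguards result, not necessarily the exact formulation used in SASSET) is: if d of N records are falsified and n records are verified without replacement, the non-detection probability and the sample size needed to reach a detection goal of 1 - β are approximately

        $$\beta \;=\; \frac{\binom{N-d}{n}}{\binom{N}{n}}, \qquad n \;\approx\; N\left(1-\beta^{1/d}\right).$$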

  11. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  12. Enumeration Verification System (EVS)

    Data.gov (United States)

    Social Security Administration — EVS is a batch application that processes for federal, state, local and foreign government agencies, private companies and internal SSA customers and systems. Each...

  13. Vehicle usage verification system

    NARCIS (Netherlands)

    Scanlon, W.G.; McQuiston, Jonathan; Cotton, Simon L.

    2012-01-01

    EN)A computer-implemented system for verifying vehicle usage comprising a server capable of communication with a plurality of clients across a communications network. Each client is provided in a respective vehicle and with a respective global positioning system (GPS) by which the client can

  14. Central Verification System

    Data.gov (United States)

    US Agency for International Development — CVS is a system managed by OPM that is designed to be the primary tool for verifying whether or not there is an existing investigation on a person seeking security...

  15. Attribute measurement systems prototypes and equipment in the United States

    International Nuclear Information System (INIS)

    Langner, D.C.; Landry, R.P.; Hsue, S.-T.; MacArthur, D.W.; Mayo, D.R.; Smith, M.K.; Nicholas, N.J.; Whiteson, R.

    2001-01-01

    Since the fall of 1997, the United States has been developing prototypical attribute verification technology for potential use by the International Atomic Energy Agency (IAEA) under the Trilateral Initiative. The first attribute measurement equipment demonstration took place in December 1997 at the Lawrence Livermore National Laboratory. This demonstration led to a series of joint Russian Federation/US/IAEA technical discussions that focused on attribute measurement technology that could be applied to plutonium-bearing items having classified characteristics. A first prototype attribute verification system with an information barrier was demonstrated at a Trilateral Technical Workshop in June 1999 at Los Alamos. This prototype nourished further fruitful discussions between the three parties that have in turn led to the documents discussed in a previous paper. Prototype development has continued in the US, under other initiatives, using an integrated approach that includes the Trilateral Initiative. Specifically for the Trilateral Initiative, US development has turned to some peripheral equipment that would support verifications by the IAEA. This equipment includes an authentication tool for measurement systems with information barriers and in situ probes that would facilitate inspections by reducing the need to move material out of storage locations for reverification. In this paper, we will first summarize the development of attribute verification measurement system technology in the US and then report on the status of the development of other equipment to support the Trilateral Initiative.

  16. Attribute measurement equipment for the verification of plutonium in classified forms for the Trilateral Initiative

    International Nuclear Information System (INIS)

    Langner, D.G.; Hsue, S.-T.; Macarthur, D.W.

    2001-01-01

    Full text: A team of technical experts from the Russian Federation, the International Atomic Energy Agency (IAEA) and the United States have been working for almost five years on the development of a tool kit of instruments that could be used to verify plutonium-bearing items that have classified characteristics in nuclear weapons states. This suite of instruments is similar in many ways to standard safeguards equipment and includes high-resolution gamma-ray spectrometers, neutron multiplicity counters, gross neutron counters and gross gamma-ray detectors. In safeguards applications, this equipment is known to be robust, and authentication methods are well understood. This equipment is very intrusive, however, and a traditional safeguards application of such equipment for verification of materials with classified characteristics would reveal classified information to the inspector. Several enabling technologies have been or are being developed to facilitate the use of these trusted, but intrusive, technologies. In this paper, these technologies will be described. One of the new technologies is called an 'Attribute Verification System with an Information Barrier Utilizing Neutron Multiplicity Counting and High-Resolution Gamma-Ray Spectrometry', or AVNG. The radiation measurement equipment, comprising a neutron multiplicity counter and a high-resolution gamma-ray spectrometer, is standard safeguards-type equipment with information security features added. The information barrier is a combination of technical and procedural methods that protect classified information while allowing the inspector to have confidence that the measurement equipment is providing authentic results. The approach is to reduce the radiation data collected by the measurement equipment to a simple 'yes/no' result regarding attributes of the plutonium-bearing item. The 'yes/no' result is unclassified by design so that it can be shared with an inspector. The attributes that the Trilateral Initiative

  17. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  18. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  19. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  20. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  1. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  2. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.
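
    As a small illustration of what generating a verification condition means (a generic weakest-precondition example, not one taken from the HDM/Pascal manual): for the Hoare triple

        $$\{x \ge 0\}\;\; x := x + 1 \;\;\{x > 0\}$$

    the generated verification condition is the implication

        $$x \ge 0 \;\Rightarrow\; x + 1 > 0,$$

    which a back-end theorem prover such as STP would then be asked to discharge.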

  3. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
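
    The flavor of fault-tolerant averaging behind such clock synchronization algorithms can be sketched as follows (a simplified, hypothetical illustration of egocentric-mean style convergence, not the verified algorithm or its EHDM formalization): each clock reads every clock, replaces readings that differ from its own by more than a bound delta with its own value, and averages.

        def egocentric_mean(own, readings, delta):
            """Average clock readings, substituting own value for readings that look faulty."""
            adjusted = [r if abs(r - own) <= delta else own for r in readings]
            return sum(adjusted) / len(adjusted)

        if __name__ == "__main__":
            clocks = [100.0, 101.5, 99.0, 250.0]   # the last clock is faulty
            corrections = [egocentric_mean(c, clocks, delta=5.0) for c in clocks]
            print(corrections)  # non-faulty clocks move toward each other despite the fault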

  4. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  5. Attributes of system testing which promote cost-effectiveness

    International Nuclear Information System (INIS)

    Martin, L.C.

    1975-01-01

    A brief overview of conventional EMP testing activity examines attributes of overall systems tests which promote cost-effectiveness. The general framework represents an EMP-oriented systems test as a portion of a planned program to design, produce, and field system elements. As such, all so-called system tests should play appropriate cost-effective roles in this program, and the objective here is to disclose such roles. The intrinsic worth of such tests depends not only upon placing proper values on the outcomes, but also upon the possible eventual consequences of not doing tests. A relative worth measure is required. Attributes of EMP system testing over the range of potential activity which encompasses research and development, production, field handling, verification, evaluation, and others are reviewed and examined. Thus, the relative worth, in a cost-effective sense, is provided by relating such attributes to the overall program objectives so that values can be placed on the outcomes for tradeoff purposes

  6. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
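
    A rough sketch of allocating examination effort by marginal risk reduction is given below. The per-subsystem risk model and numbers are invented for illustration; the paper's verification risk function is defined over the logic system topology, requirements, and verification method performance, which this toy greedy allocation does not capture.

        def allocate_effort(subsystems, budget):
            """Greedily give each unit of examination effort to the subsystem whose next
            examination reduces the (hypothetical) residual verification risk the most."""
            effort = {name: 0 for name in subsystems}
            for _ in range(budget):
                def marginal_gain(name):
                    p_fail, detect = subsystems[name]
                    before = p_fail * (1 - detect) ** effort[name]
                    after = p_fail * (1 - detect) ** (effort[name] + 1)
                    return before - after
                best = max(subsystems, key=marginal_gain)
                effort[best] += 1
            return effort

        if __name__ == "__main__":
            # (failure probability, per-examination detection probability) -- illustrative
            subsystems = {"ballast": (0.10, 0.5), "power": (0.05, 0.7), "dp_control": (0.20, 0.4)}
            print(allocate_effort(subsystems, budget=6))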

  7. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  8. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to

  9. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  10. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    The problem of validation and verification of correctness of present day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today typically 50% to 70% of the design cycle time is spent in verifying correctness. While simulation remains a predominant form ...

  11. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  12. Formal Verification of Quasi-Synchronous Systems

    Science.gov (United States)

    2015-07-01


  13. Attribute-Based Digital Signature System

    NARCIS (Netherlands)

    Ibraimi, L.; Asim, Muhammad; Petkovic, M.

    2011-01-01

    An attribute-based digital signature system comprises a signature generation unit (1) for signing a message (m) by generating a signature (s) based on a user secret key (SK) associated with a set of user attributes, wherein the signature generation unit (1) is arranged for combining the user secret

  14. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on the system, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify the accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single-shell tanks, i.e., the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for the drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by ''smart'' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  15. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  16. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a user's fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  17. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  18. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  19. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  20. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models, which represent the system, and the formalization of requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
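
    To make the model-checking step concrete, the sketch below runs an explicit-state reachability check of a safety property on a tiny hand-built transition system. It only illustrates what a verification back-end does with a state-space model; it is not the PLCverif tool chain or the IEC 61131-3 translation itself, and the valve example is invented.

        from collections import deque

        def find_counterexample(initial, transitions, is_bad):
            """Breadth-first search over the state space: return a path to a bad state,
            or None if no reachable state violates the safety property."""
            queue, visited = deque([(initial, [initial])]), {initial}
            while queue:
                state, path = queue.popleft()
                if is_bad(state):
                    return path
                for nxt in transitions(state):
                    if nxt not in visited:
                        visited.add(nxt)
                        queue.append((nxt, path + [nxt]))
            return None

        if __name__ == "__main__":
            # State = (valve_a_open, valve_b_open); the requirement: never both open.
            toggles = lambda s: {(not s[0], s[1]), (s[0], not s[1])}
            print(find_counterexample((False, False), toggles, lambda s: s[0] and s[1]))
            # Prints a path reaching (True, True): the unconstrained model violates the property.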

  1. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Roč. 18, č. 6 (2012), s. 572-587 ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  2. Burnup verification using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used to verify reactor site records for burnup and cooling time for many years. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. This report deals with the application of the FORK system to burnup credit operations based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company

  3. Parametric Verification of Weighted Systems

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Hansen, Mikkel; Mariegaard, Anders

    2015-01-01

    are themselves indexed with linear equations. The parameters change the model-checking problem into a problem of computing a linear system of inequalities that characterizes the parameters that guarantee the satisfiability. To address this problem, we use parametric dependency graphs (PDGs) and we propose a global update function that yields an assignment to each node in a PDG. For an iterative application of the function, we prove that a fixed point assignment to PDG nodes exists and the set of assignments constitutes a well-quasi ordering, thus ensuring that the fixed point assignment can be found after...

  4. Formal System Verification - Extension 2

    Science.gov (United States)

    2012-08-08

    vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in

  5. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems which intermix two kinds of components, discrete components and continuous components. The continuous components, usually called plants, are subject to disturbances which cause the state variables of the systems to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper gives a design methodology for hybrid systems as an example of the specification and verification of hybrid systems. The design methodology is based on the cooperation between two disciplines, control engineering and computer science. The methodology divides the design into control loops and decision loops. The external behavior of the control loops is specified in a notation which is understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is also guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of the control loops. The verification of system requirements can be done by computing scientists using a formal reasoning mechanism. For illustrating the proposed design, the problem of balancing an inverted pendulum, which is a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as the formal notation for specifying the control loops and designing the decision loops
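
    For reference, the continuous plant in such a balancing example is commonly modelled by a pendulum equation of the following form (a textbook simplification with the control torque applied at the pivot; the paper's exact model in the Mean Value Calculus may differ):

        $$m l^{2}\,\ddot{\theta} \;=\; m g l \sin\theta \;+\; u(t),$$

    where theta is the angle from the upright position, m and l are the pendulum mass and length, and u(t) is the control input computed by the control loop, while the decision loop supervises when and which control law is applied.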

  6. Experimental functional realization of attribute grammar system

    Directory of Open Access Journals (Sweden)

    I. Attali

    2002-07-01

    In this paper we present an experimental functional realization of an attribute grammar (AG) system for personal computers. For the AG system to function, only the Turbo Prolog compiler is required. The system's operation is based on a specially elaborated metalanguage for AG description and universal syntactic and semantic constructors. The AG system provides automatic generation of a target compiler (syntax-oriented software) using Turbo Prolog as the object language.

  7. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
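
    One simple way to combine the individual technologies' detection probabilities into a system-level estimate, assuming independent sensing channels, is shown below; this is only an illustration of the kind of integration IVSEM performs, not its actual treatment of correlations, medium interfaces, or evasion scenarios:

        $$P_{\mathrm{det}} \;=\; 1-\prod_{i \in \{\text{seismic, infrasound, radionuclide, hydroacoustic}\}} \bigl(1-p_{i}\bigr).$$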

  8. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0

  9. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype of a robust biometric verification system. The system uses features of the human hand extracted with the Speeded Up Robust Features (SURF) operator. The hand image for feature extraction is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images. The system is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
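
    A minimal sketch of the feature-matching stage of such a verification system follows. The paper uses SURF (available in OpenCV's contrib package as cv2.xfeatures2d.SURF_create); the sketch substitutes ORB from the main OpenCV build so that it runs without the non-free module, and the image paths and acceptance threshold are placeholders.

        import cv2

        def match_score(enrolled_path, query_path, ratio=0.75):
            """Count distinctive local-feature matches between enrolled and query palmprints."""
            enrolled = cv2.imread(enrolled_path, cv2.IMREAD_GRAYSCALE)
            query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
            detector = cv2.ORB_create(nfeatures=1000)        # stand-in for SURF
            _, desc1 = detector.detectAndCompute(enrolled, None)
            _, desc2 = detector.detectAndCompute(query, None)
            if desc1 is None or desc2 is None:
                return 0
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
            good = [p[0] for p in matcher.knnMatch(desc1, desc2, k=2)
                    if len(p) == 2 and p[0].distance < ratio * p[1].distance]  # ratio test
            return len(good)

        if __name__ == "__main__":
            score = match_score("enrolled_palm.png", "query_palm.png")  # placeholder files
            print("accept" if score > 30 else "reject")                 # illustrative threshold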

  10. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V ampersand V) of nuclear power plant control system software: (a) use risk management to decide what and how much V ampersand V is needed; (b) classify each software application using a scheme that reflects what type and how much V ampersand V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  11. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.
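
    A typical property expressible in such timed extensions of temporal logic (an illustrative bounded-response formula, not one taken from the report) is

        $$\Box\bigl(\mathit{request} \;\rightarrow\; \Diamond_{\le 5}\,\mathit{response}\bigr),$$

    read as: whenever a request occurs, a response follows within 5 time units; model-checking algorithms decide such formulas over finite-state real-time system models.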

  12. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  13. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  14. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process. We focus on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate the verification and validation of the software on board satellites. The process is implemented in the software control unit of the energetic particle detector, which is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, the verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any of the constraints is not satisfied by the on-target evidence, the scheduling analysis is not valid.
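
    A much-simplified sketch of checking timing evidence against constraints is given below; the event names, trace format, and deadline bounds are invented for illustration, whereas the actual process derives its constraints from the scheduling analysis model as finite timed automata.

        def check_deadlines(trace, deadlines_ms):
            """trace: (task, release_ms, completion_ms) tuples gathered at instrumented
            points; report every job whose response time exceeds its deadline."""
            violations = []
            for task, release, completion in trace:
                response = completion - release
                if response > deadlines_ms.get(task, float("inf")):
                    violations.append((task, response))
            return violations

        if __name__ == "__main__":
            trace = [("acquire", 0.0, 3.1), ("process", 5.0, 14.8), ("telemetry", 20.0, 22.4)]
            deadlines = {"acquire": 4.0, "process": 8.0, "telemetry": 5.0}
            print(check_deadlines(trace, deadlines))  # the 'process' job misses its deadline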

  15. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study of a leader election protocol, modelling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  16. Entropy Measurement for Biometric Verification Systems.

    Science.gov (United States)

    Lim, Meng-Hui; Yuen, Pong C

    2016-05-01

    Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intrauser variations in the biometric data. This is important to preserve reasonable acceptance rate of genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases the imposter's difficulty of obtaining a system-acceptable measurement, thus resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accepts multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic and a benchmark face dataset justify the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.
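
    The notion of guessing entropy the model builds on (Massey's definition, stated here for orientation) is the expected number of attempts an optimal adversary needs when trying candidates in order of decreasing probability,

        $$G(X) \;=\; \sum_{i=1}^{M} i\,p_{i}, \qquad p_{1} \ge p_{2} \ge \cdots \ge p_{M},$$

    which the paper adapts to the setting where any one of several similar measurements is accepted for a given user.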

  17. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  18. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a project for the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check that the calibration of their instruments is maintained at the installations where they are used, avoiding the use of equipment whose response to the neutron field is inadequate

  19. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  20. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (computer-based system). These software modules can be either own-design systems or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off-the-shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan

  1. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  2. Verification and Validation Issues in Systems of Systems

    Directory of Open Access Journals (Sweden)

    Eric Honour

    2013-11-01

    Full Text Available The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  3. NPP Temelin instrumentation and control system upgrade and verification

    International Nuclear Information System (INIS)

    Ubra, O.; Petrlik, J.

    1998-01-01

    Two units of the VVER-1000 type at the Czech nuclear power plant Temelin, which are under construction, are being upgraded with the latest instrumentation and control system delivered by WEC. To confirm that the functional designs of the new Reactor Control and Limitation System, Turbine Control System and Plant Control System comply with the Czech customer requirements and that these requirements are compatible with the upgraded NPP Temelin technology, verification of the control systems has been performed. The method of transient analysis has been applied. Some details of the NPP Temelin Reactor Control and Limitation System verification are presented. (author)

  4. Multiple Authorities Attribute-Based Verification Mechanism for Blockchain Mircogrid Transactions

    Directory of Open Access Journals (Sweden)

    Sarmadullah Khan

    2018-05-01

    Full Text Available Recently, advancements in energy distribution models have fulfilled the needs of microgrids in finding a suitable energy distribution model between producer and consumer without the need for a central controlling authority. Most energy distribution models deal with energy transactions and losses without considering security aspects such as information tampering. The transaction data could be accessible online to keep track of the energy distribution between the consumer and producer (e.g., online payment records and supplier profiles). However, this data is prone to modification and misuse if a consumer moves from one producer to another. Blockchain is considered to be one solution that allows users to exchange energy-related data and keep track of it without exposing it to modification. In this paper, electrical transactions embedded in the blockchain are validated using the signatures of multiple producers based on their assigned attributes. These signatures are verified and endorsed by the consumers satisfying those attributes without revealing any information. The public and private keys for these consumers are generated by the producers, and the endorsement procedure using these keys ensures that these consumers are authorized. This approach does not need any central authority. To resist collusion attacks, producers are given a secret pseudorandom function seed. The comparative analysis shows the efficiency of the proposed approach over existing ones.
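
    The cryptographic details of the scheme are not given in the abstract; the sketch below is only a simplified stand-in showing the general flow (attribute-derived keys from a shared pseudorandom seed, producer-side signing, consumer-side endorsement). The HMAC construction, attribute names and seed handling are assumptions for illustration, not the authors' protocol.

    ```python
    """Simplified stand-in (not the paper's scheme): producers tag a transaction
    with keys derived per attribute from a secret pseudorandom seed; a consumer
    holding the matching attribute keys endorses the transaction by recomputing
    the tags. All names and the key derivation here are illustrative."""
    import hmac, hashlib

    SEED = b"shared-producer-seed"              # secret PRF seed held by producers

    def attribute_key(attribute: str) -> bytes:
        # Derive a per-attribute key from the seed with a PRF (HMAC-SHA256).
        return hmac.new(SEED, attribute.encode(), hashlib.sha256).digest()

    def sign(transaction: bytes, attributes: list[str]) -> dict[str, str]:
        return {a: hmac.new(attribute_key(a), transaction, hashlib.sha256).hexdigest()
                for a in attributes}

    def endorse(transaction: bytes, signatures: dict[str, str],
                consumer_attrs: dict[str, bytes]) -> bool:
        """A consumer endorses only if every tag it can check verifies."""
        for attr, key in consumer_attrs.items():
            expected = hmac.new(key, transaction, hashlib.sha256).hexdigest()
            if not hmac.compare_digest(expected, signatures.get(attr, "")):
                return False
        return True

    tx = b"producer-7 sells 3.2 kWh to consumer-12"
    sigs = sign(tx, ["region:north", "source:solar"])
    consumer_keys = {"region:north": attribute_key("region:north")}
    print(endorse(tx, sigs, consumer_keys))         # True
    print(endorse(tx + b"!", sigs, consumer_keys))  # False (tampered transaction)
    ```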

  5. Dense time discretization technique for verification of real time systems

    International Nuclear Information System (INIS)

    Makackas, Dalius; Miseviciene, Regina

    2016-01-01

    When verifying a real-time system, there are two different models for handling time: discrete and dense time based models. This paper proposes a novel verification technique, which calculates discrete time intervals from dense time in order to create all the system states that can be reached from the initial system state. The technique is designed for real-time systems specified by a piece-linear aggregate approach. Key words: real-time system, dense time, verification, model checking, piece-linear aggregate

  6. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    A seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method for constructing function models and the IDEF1x method for making information models systematically, as well as how they are used in designing the seismic information system for CTBT verification. (authors)

  7. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for sequence diagrams of power plants has been developed. The system's main function is to verify the correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (Interlock Block Diagrams), in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatches between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)
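
    A minimal analogue of such a comparison, assuming nothing about the authors' tool, is to enumerate the input assignments of the two logic functions and report any assignment where they disagree; the disagreeing assignment points at the responsible sub-circuit. The signal names and the deliberately missing trip interlock below are illustrative.

    ```python
    """Minimal analogue (assumed, not the authors' tool) of checking that the logic
    realized by a sequence diagram matches the corresponding IBD logic: enumerate
    input assignments and report any mismatch."""
    from itertools import product

    def ibd_logic(start_cmd, permissive, trip):
        # Reference logic from the Interlock Block Diagram.
        return start_cmd and permissive and not trip

    def sequence_diagram_logic(start_cmd, permissive, trip):
        # Logic as (hypothetically) realized by the sequence diagram; the missing
        # trip interlock is the kind of discrepancy the checker should flag.
        return start_cmd and permissive

    def compare(f, g, names):
        mismatches = []
        for values in product([False, True], repeat=len(names)):
            if f(*values) != g(*values):
                mismatches.append(dict(zip(names, values)))
        return mismatches

    for case in compare(ibd_logic, sequence_diagram_logic,
                        ["start_cmd", "permissive", "trip"]):
        print("mismatch at", case)   # e.g. start_cmd=True, permissive=True, trip=True
    ```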

  8. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  9. A hand held photo identity verification system for mobile applications

    International Nuclear Information System (INIS)

    Kumar, Ranajit; Upreti, Anil; Mahaptra, U.; Bhattacharya, S.; Srivastava, G.P.

    2009-01-01

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against the database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details like name, designation, division, etc. All transactions are recorded in the Simputer with time and date for future record. This system finds extensive applications in mobile identity verification in nuclear or other industries. (author)

  10. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

    The Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates the V and V process followed, document submission requirements in each stage, V and V activities, the checklist used for reviews in each stage, and reports

  11. Parallel verification of dynamic systems with rich configurations

    OpenAIRE

    Pessoa, Eduardo José Dias

    2016-01-01

    Master's dissertation in Informatics Engineering (specialization in Informatics) Model checking is a technique used to automatically verify a model which represents the specification of some system. To ensure the correctness of the system, the verification of both static and dynamic properties is often needed. The specification of a system is made through modeling languages, while the respective verification is made by its model-checker. Most modeling frameworks are not...

  12. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2000-01-01

    Full Text Available The methodology of system requirements verification presented in this paper is a practical procedure for reducing some of the shortcomings of requirements specifications. The main problem considered is how to create a complete description of the system requirements without such shortcomings. Verification of the initially defined requirements is based on coloured Petri nets. Those nets are useful for testing properties of system requirements such as completeness, consistency and optimality. An example of the lift controller is presented.

  13. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    Science.gov (United States)

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and to assure the correct functioning of Ambient Assisted Living (AAL) systems. We validate this approach by applying it to an Emergency Assistance System for monitoring people suffering from cardiac alterations with syncope.

  14. Verification and Validation for Flight-Critical Systems (VVFCS)

    Science.gov (United States)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V & V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%) and government agencies (27%).

  15. Formal verification of automated teller machine systems using SPIN

    Science.gov (United States)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formula. This model checker accepts models written in Process Meta Language (PROMELA), and its specifications are specified in LTL formulas.
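
    For illustration only, the sketch below performs the same kind of reachability check in Python rather than PROMELA: a small, assumed ATM state machine is explored breadth-first and a safety property ("cash is never dispensed before the PIN is verified") is checked in every reachable state.

    ```python
    """Illustrative stand-in for the SPIN/LTL check (Python instead of PROMELA):
    breadth-first exploration of a tiny, assumed ATM state machine with a safety
    property evaluated in each reachable state."""
    from collections import deque

    # state: (mode, pin_verified)
    INITIAL = ("idle", False)

    def successors(state):
        mode, verified = state
        if mode == "idle":
            yield ("card_inserted", False)
        elif mode == "card_inserted":
            yield ("pin_ok", True)       # correct PIN entered
            yield ("idle", False)        # wrong PIN / abort
        elif mode == "pin_ok":
            yield ("dispensing", verified)
        elif mode == "dispensing":
            yield ("idle", False)

    def safety_violated(state):
        mode, verified = state
        return mode == "dispensing" and not verified

    seen, queue = {INITIAL}, deque([INITIAL])
    while queue:
        state = queue.popleft()
        if safety_violated(state):
            print("counterexample state:", state)
            break
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    else:
        print("property holds in all reachable states")
    ```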

  16. Irreducible descriptive sets of attributes for information systems

    KAUST Repository

    Moshkov, Mikhail

    2010-01-01

    The maximal consistent extension Ext(S) of a given information system S consists of all objects corresponding to attribute values from S which are consistent with all true and realizable rules extracted from the original information system S. An irreducible descriptive set for the considered information system S is a minimal (relative to the inclusion) set B of attributes which defines exactly the set Ext(S) by means of true and realizable rules constructed over attributes from the considered set B. We show that there exists only one irreducible descriptive set of attributes. We present a polynomial algorithm for this set construction. We also study relationships between the cardinality of irreducible descriptive set of attributes and the number of attributes in S. The obtained results will be useful for the design of concurrent data models from experimental data. © 2010 Springer-Verlag.

  17. Software verification in on-line systems

    International Nuclear Information System (INIS)

    Ehrenberger, W.

    1980-01-01

    Operator assistance is increasingly provided by computers. Computers contain programs whose quality should be above a certain level before they are allowed to be used in reactor control rooms. Several possibilities for obtaining software reliability figures are discussed in this paper. By supervising the testing procedure of a program, one can estimate the number of remaining programming errors; such an estimate, however, is not very accurate. With mathematical proving procedures one can gain some knowledge of program properties. Such proving procedures are important for the verification of general WHILE-loops, which tend to be error prone. Program analysis decomposes a program into its parts. First the program structure is made visible, which includes the data movements and the control flow. From this analysis, test cases can be derived that lead to a complete test. Program analysis can be done by hand or automatically. A statistical program test normally requires a large number of test runs. This number is diminished if details concerning the program to be tested or its use are known in advance. (orig.)

  18. Research on database realization technology of seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Zhang Huimin; Jing Ping; Sun Peng; Zheng Jiangling

    2005-01-01

    Developing CTBT verification technology has become the most important means of ensuring that the CTBT is faithfully implemented. Seismic analysis based on a seismic information system (SIS) plays an important role in this field. Based on GIS, the SIS is powerful in spatial analysis, topological analysis and visualization. However, realization of the whole system's functions depends critically on the performance of the SIS database. Based on the ArcSDE Geodatabase data model, seamless integrated management of spatial data and attribute data has been realized with the ORACLE RDBMS, while most functions of ORACLE have been retained. (authors)

  19. Android-Based Verification System for Banknotes

    Directory of Open Access Journals (Sweden)

    Ubaid Ur Rahman

    2017-11-01

    Full Text Available With the advancement in imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes losses to banks, traders, and individuals involved in financial transactions. Hence, efficient and reliable techniques for the detection of counterfeit banknotes need to be developed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing related tasks on these phones. In addition, the number of smartphone users has grown greatly and continues to increase. This is a great motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by the X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes are very different in terms of the printing process, the ingredients used in preparation of banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.
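
    A rough sketch of such a pipeline is shown below; the feature set, region size and training data are placeholders (randomly generated here), not the paper's actual features or dataset, and only illustrate the statistical-features-plus-SVM idea.

    ```python
    """Rough sketch (assumed feature set, synthetic data): extract simple
    statistical and roughness features from the serial-number region of a
    banknote image and classify it with a support vector machine."""
    import numpy as np
    from sklearn.svm import SVC

    def features(region: np.ndarray) -> np.ndarray:
        """region: 2-D grayscale array cropped around the serial number."""
        g = region.astype(float)
        gy, gx = np.gradient(g)
        roughness = np.mean(np.hypot(gx, gy))        # crude surface-texture proxy
        return np.array([g.mean(), g.std(),
                         np.percentile(g, 10), np.percentile(g, 90), roughness])

    # Synthetic stand-in for training data: 1 = genuine, 0 = counterfeit.
    rng = np.random.default_rng(0)
    genuine = [features(rng.normal(120, 10, (64, 256))) for _ in range(40)]
    fake    = [features(rng.normal(140, 25, (64, 256))) for _ in range(40)]
    X_train = np.vstack(genuine + fake)
    y_train = np.array([1] * 40 + [0] * 40)

    clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
    query = features(rng.normal(120, 10, (64, 256)))
    print("genuine" if clf.predict([query])[0] == 1 else "counterfeit")
    ```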

  20. Towards Verification of Constituent Systems through Automated Proof

    DEFF Research Database (Denmark)

    Couto, Luis Diogo Monteiro Duarte; Foster, Simon; Payne, R

    2014-01-01

    This paper explores verification of constituent systems within the context of the Symphony tool platform for Systems of Systems (SoS). Our SoS modelling language, CML, supports various contractual specification elements, such as state invariants and operation preconditions, which can be used to specify contractual obligations on the constituent systems of a SoS. To support verification of these obligations we have developed a proof obligation generator and theorem prover plugin for Symphony. The latter uses the Isabelle/HOL theorem prover to automatically discharge the proof obligations arising from a CML model. Our hope is that the resulting proofs can then be used to formally verify the conformance of each constituent system, which in turn would result in a dependable SoS.

  1. Irreducible descriptive sets of attributes for information systems

    KAUST Repository

    Moshkov, Mikhail; Skowron, Andrzej; Suraj, Zbigniew

    2010-01-01

    An irreducible descriptive set for the considered information system S is a minimal (relative to the inclusion) set B of attributes which defines exactly the set Ext(S) by means of true and realizable rules constructed over attributes from the considered set B

  2. Alien Registration Number Verification via the U.S. Citizenship and Immigration Service's Systematic Alien Verification for Entitlements System

    National Research Council Canada - National Science Library

    Ainslie, Frances M; Buck, Kelly R

    2008-01-01

    The purpose of this study was to evaluate the implications of conducting high-volume automated checks of the United States Citizenship and Immigration Services Systematic Alien Verification for Entitlements System (SAVE...

  3. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  4. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  5. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for run-time verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
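
    The sketch below is not MESA or TraceContract; it only illustrates, with assumed message fields (id, topic, seq), how a nonintrusive runtime monitor can flag the two error classes mentioned in the abstract, duplicates and out-of-order messages, while consuming a stream.

    ```python
    """Minimal sketch of a message-stream runtime monitor (field names assumed):
    flags duplicate message ids and sequence numbers that go backwards per topic."""

    def monitor(messages):
        seen_ids = set()
        last_seq = {}
        for msg in messages:
            key, seq = msg["id"], msg["seq"]
            if key in seen_ids:
                yield ("duplicate", msg)
            seen_ids.add(key)
            prev = last_seq.get(msg["topic"], -1)
            if seq < prev:
                yield ("out-of-order", msg)
            last_seq[msg["topic"]] = max(prev, seq)

    stream = [
        {"id": "a1", "topic": "flight", "seq": 1},
        {"id": "a2", "topic": "flight", "seq": 3},
        {"id": "a2", "topic": "flight", "seq": 3},   # duplicate
        {"id": "a3", "topic": "flight", "seq": 2},   # out of order
    ]
    for verdict in monitor(stream):
        print(verdict)
    ```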

  6. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates a document analysis method and an ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently

  7. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

    This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway station Alingsås by using the decision procedure which... express thoughts on what is needed in order to be able to successfully verify large real-life systems...

  8. Expert system verification and validation for nuclear power industry applications

    International Nuclear Information System (INIS)

    Naser, J.A.

    1990-01-01

    The potential for the use of expert systems in the nuclear power industry is widely recognized. The benefits of such systems include consistency of reasoning during off-normal situations when humans are under great stress, the reduction of times required to perform certain functions, the prevention of equipment failures through predictive diagnostics, and the retention of human expertise in performing specialized functions. The increased use of expert systems brings with it concerns about their reliability. Difficulties arising from software problems can affect plant safety, reliability, and availability. A joint project between EPRI and the US Nuclear Regulatory Commission is being initiated to develop a methodology for verification and validation of expert systems for nuclear power applications. This methodology will be tested on existing and developing expert systems. This effort will explore the applicability of conventional verification and validation methodologies to expert systems. The major area of concern will be certification of the knowledge base. This is expected to require new types of verification and validation techniques. A methodology for developing validation scenarios will also be studied

  9. The verification of neutron activation analysis support system (cooperative research)

    Energy Technology Data Exchange (ETDEWEB)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Sawahata, Hiroyuki; Ito, Yasuo [Tokyo Univ. (Japan). Research Center for Nuclear Science and Technology; Onizawa, Kouji [Radiation Application Development Association, Tokai, Ibaraki (Japan)

    2000-12-01

    The neutron activation analysis support system is a system with which even users who have little experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, the functions, usability, precision and accuracy of the analysis, etc., of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer and analyzer installed in the JRR-3M PN-3 facility, and the analysis software KAYZERO/SOLCOI based on the k0 method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field, and analysis of three kinds of environmental standard samples were carried out. The k0 method adopted in this system has recently been used primarily in Europe; it is an analysis method which can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values certified as analytical data of the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed with an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)
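
    As a simplified illustration of the kind of calculation such a support system automates, the sketch below applies the plain comparator (relative) method with decay correction; the full k0 formalism (Q0, f, detector efficiency corrections) is deliberately omitted, and all masses, counts and half-lives are made-up numbers.

    ```python
    """Simplified comparator-method sketch (illustrative numbers only): the
    concentration of an element follows from the ratio of decay-corrected,
    per-gram count rates of the sample and a co-irradiated standard."""
    import math

    def decay_corrected_rate(counts, live_time_s, decay_time_s, half_life_s):
        lam = math.log(2) / half_life_s
        # Back-correct the measured rate to the end of irradiation.
        return counts / live_time_s * math.exp(lam * decay_time_s)

    def concentration(sample, standard, standard_ppm):
        r_sample = decay_corrected_rate(**sample["counting"]) / sample["mass_g"]
        r_std = decay_corrected_rate(**standard["counting"]) / standard["mass_g"]
        return standard_ppm * r_sample / r_std

    sample = {"mass_g": 0.150,
              "counting": dict(counts=12500, live_time_s=3600,
                               decay_time_s=86400, half_life_s=2.2e5)}
    standard = {"mass_g": 0.100,
                "counting": dict(counts=43000, live_time_s=3600,
                                 decay_time_s=86400, half_life_s=2.2e5)}
    print(f"{concentration(sample, standard, standard_ppm=50.0):.1f} ppm")
    ```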

  10. Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour...

  11. Compositional Verification of Multi-Station Interlocking Systems

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Fantechi, Alessandro; Haxthausen, Anne Elisabeth

    2016-01-01

    Multi-station interlocking systems pose a big challenge to current verification methodologies, due to the explosion of state space size as soon as large, if not medium-sized, multi-station systems have to be controlled. For these reasons, verification techniques that exploit locality principles related to the topological layout of the controlled system to split the state space in different ways have been investigated. In particular, compositional approaches divide the controlled track network into regions that can be verified separately, once proper assumptions are considered on the way the pieces are glued together. Building on a successful method to verify networks of rather large size, we propose a compositional approach that is particularly suitable to address multi-station interlocking systems which control a whole line composed of stations linked by mainline tracks. Indeed, it turns out that for such networks, and for the adopted...

  12. Investigation of novel spent fuel verification system for safeguard application

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Haneol; Yim, Man-Sung [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of a repository by calculating the dose to the public after the repository is closed, depending on their scenario. Because most spent fuel repositories are non-retrievable, research on the security or safeguards of spent fuel repositories has to be performed, and design-based security or safeguards have to be developed for future repository designs. This study summarizes the requirements of future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguard requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA. The requirements emphasize surveillance and verification. The surveillance and verification of spent fuel is currently accomplished by using a Cerenkov radiation detector while spent fuel is stored in a fuel pool. This research investigated an advanced spent fuel verification system based on a device which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source.
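
    A quick order-of-magnitude check of the lab-scale source is possible from the stated activity alone; the sketch below assumes the 662 keV Cs-137 gamma with its roughly 85% emission probability and complete absorption, so it is an upper bound on the radiant power available to the conversion device.

    ```python
    """Order-of-magnitude sketch (assumed 100% absorption, no conversion losses)
    of the power carried by the 8.51 GBq Cs-137 source used in the lab-scale test."""
    activity_bq = 8.51e9                      # decays per second
    gamma_energy_j = 0.662e6 * 1.602e-19      # 662 keV Cs-137 gamma, in joules
    gamma_yield = 0.85                        # gammas emitted per decay (approx.)

    radiant_power_w = activity_bq * gamma_energy_j * gamma_yield
    print(f"upper-bound radiant power: {radiant_power_w * 1e3:.2f} mW")
    # Any realistic conversion efficiency reduces this to the microwatt scale or below.
    ```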

  13. Investigation of novel spent fuel verification system for safeguard application

    International Nuclear Information System (INIS)

    Lee, Haneol; Yim, Man-Sung

    2016-01-01

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of a repository by calculating the dose to the public after the repository is closed, depending on their scenario. Because most spent fuel repositories are non-retrievable, research on the security or safeguards of spent fuel repositories has to be performed, and design-based security or safeguards have to be developed for future repository designs. This study summarizes the requirements of future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguard requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA. The requirements emphasize surveillance and verification. The surveillance and verification of spent fuel is currently accomplished by using a Cerenkov radiation detector while spent fuel is stored in a fuel pool. This research investigated an advanced spent fuel verification system based on a device which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source

  14. Renewable Energy Certificate (REC) Tracking Systems: Costs & Verification Issues (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, J.

    2013-10-01

    This document provides information on REC tracking systems: how they are used in the voluntary REC market, a comparison of REC system fees, and how the systems treat environmental attributes.

  15. Temporal logic runtime verification of dynamic systems

    CSIR Research Space (South Africa)

    Seotsanyana, M

    2010-07-01

    Full Text Available ... this paper provides a novel framework that automatically and verifiably monitors these systems at runtime. The main aim of the framework is to assist the operator through witnesses and counterexamples that are generated during the execution of the system...

  16. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  17. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination of key players; it allows them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  18. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used, together with transport theory, to determine the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection and to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
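
    The time-gating trade-off mentioned above can be sketched with a simple counting-statistics estimate; the count rates, gate length and the 3-sigma decision rule below are assumptions for illustration, not the MPVS design values.

    ```python
    """Back-of-the-envelope sketch (assumed numbers): at a given drive-by speed,
    does the gross count collected while passing a container exceed a simple
    background-based decision threshold?"""
    import math

    def passes_detection(source_rate_cps, background_rate_cps,
                         speed_mph, gate_length_m, k=3.0):
        speed_mps = speed_mph * 0.44704
        dwell_s = gate_length_m / speed_mps           # time the detector sees the can
        signal = source_rate_cps * dwell_s
        background = background_rate_cps * dwell_s
        # Detect if net counts exceed k standard deviations of the background.
        return signal > k * math.sqrt(background)

    # Nominal 2 mph drive-by past a 1 m gate, illustrative count rates.
    print(passes_detection(source_rate_cps=40, background_rate_cps=100,
                           speed_mph=2.0, gate_length_m=1.0))
    ```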

  19. Verification on reliability of heat exchanger for primary cooling system

    International Nuclear Information System (INIS)

    Koike, Sumio; Gorai, Shigeru; Onoue, Ryuji; Ohtsuka, Kaoru

    2010-07-01

    Prior to the JMTR refurbishment, the reliability of the heat exchangers of the primary cooling system was verified to assess the integrity of components intended for continued use. No significant corrosion, reduction of tube thickness, or cracks were observed on the heat exchangers, and their integrity was confirmed. For the long-term use of the heat exchangers, maintenance based on periodic inspections and a long-term maintenance plan is scheduled. (author)

  20. System Identification and Verification of Rotorcraft UAVs

    Science.gov (United States)

    Carlton, Zachary M.

    The task of a controls engineer is to design and implement control logic. To complete this task, it helps tremendously to have an accurate model of the system to be controlled. Obtaining a very accurate system model is not a trivial task, as much time and money are usually associated with the development of such a model. A typical physics-based approach can require hundreds of hours of flight time. In an iterative process the model is tuned in such a way that it accurately models the physical system's response. This process becomes even more complicated for unstable and highly non-linear systems such as the dynamics of rotorcraft. An alternative approach to solving this problem is to extract an accurate model by analyzing the frequency response of the system. This process involves recording the system's responses for a frequency range of input excitations. From this data, an accurate system model can then be deduced. Furthermore, it has been shown that with the use of the software package CIFER® (Comprehensive Identification from FrEquency Responses), this process can both greatly reduce the cost of modeling a dynamic system and produce very accurate results. The topic of this thesis is to apply CIFER® to a quadcopter to extract a system model for the flight condition of hover. The quadcopter itself is comprised of off-the-shelf components with a Pixhack flight controller board running open source Ardupilot controller logic. In this thesis, both the closed and open loop systems are identified. The model is next compared to dissimilar flight data and verified in the time domain. Additionally, the ESC (Electronic Speed Controller) motor/rotor subsystem, which is comprised of all the vehicle's actuators, is also identified. This process required the development of a test bench environment, which included a GUI (Graphical User Interface), data pre- and post-processing, as well as the augmentation of the flight controller source code. This augmentation of code allowed for
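
    A minimal sketch of the frequency-response identification step (not CIFER® itself) is shown below: given a logged frequency-sweep input and the measured response, Welch auto- and cross-spectra yield a frequency-response estimate, and the coherence indicates where the estimate is trustworthy. The sampling rate and the simulated signals are placeholders for real flight data.

    ```python
    """Minimal frequency-response identification sketch (placeholder signals):
    estimate H(f) = Puy / Puu from a swept input u and measured response y."""
    import numpy as np
    from scipy import signal

    fs = 200.0                        # logging rate, Hz (assumed)
    t = np.arange(0, 60, 1 / fs)
    # Placeholder data: swept stick input and a simulated noisy response.
    u = signal.chirp(t, f0=0.1, f1=5.0, t1=60, method="logarithmic")
    y = np.convolve(u, np.exp(-np.arange(0, 2, 1 / fs)) / fs, mode="same")
    y += 0.05 * np.random.default_rng(0).standard_normal(t.size)

    f, Puu = signal.welch(u, fs=fs, nperseg=1024)
    _, Puy = signal.csd(u, y, fs=fs, nperseg=1024)
    H = Puy / Puu                                   # frequency-response estimate
    _, coh = signal.coherence(u, y, fs=fs, nperseg=1024)

    for fi, Hi, ci in zip(f, H, coh):
        if 0.1 <= fi <= 5.0 and ci > 0.6:           # keep well-excited frequencies
            print(f"{fi:5.2f} Hz  {20 * np.log10(abs(Hi)):6.1f} dB  "
                  f"{np.degrees(np.angle(Hi)):7.1f} deg")
    ```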

  1. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  2. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  3. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  4. Verification test of control rod system for HTR-10

    International Nuclear Information System (INIS)

    Zhou Huizhong; Diao Xingzhong; Huang Zhiyong; Cao Li; Yang Nianzu

    2002-01-01

    There are 10 sets of control rods and driving devices in the 10 MW High Temperature Gas-cooled Test Reactor (HTR-10). The control rod system is the control and shutdown system of HTR-10, which is designed for reactor criticality, operation, and shutdown. In order to guarantee technical feasibility, a series of verification tests were performed, including a room temperature test, a thermal test, a test after the control rod system was installed in HTR-10, and a test of the control rod system before HTR-10 first criticality. All the test data showed that the driving devices worked well and that the control rods ran smoothly up and down, settled at arbitrary positions, and indicated their positions exactly

  5. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

    A Monte Carlo burnup calculation code system has been developed to accurately evaluate various quantities required in the back-end field. For verification of the code system, analyses have been performed using the nuclide compositions of fuel rods measured in the Actinide Research in a Nuclear Element (ARIANE) program for fuel assemblies irradiated in a commercial Netherlands BWR. The code system developed in this paper has been verified through analyses of MOX and UO2 fuel rods. This system enables the large margin assumed in the present criticality analysis for LWR spent fuels to be reduced. (J.P.N.)

  6. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  7. On the Symbolic Verification of Timed Systems

    DEFF Research Database (Denmark)

    Moeller, Jesper; Lichtenberg, Jacob; Andersen, Henrik Reif

    1999-01-01

    This paper describes how to analyze a timed system symbolically. That is, given a symbolic representation of a set of (timed) states (as an expression), we describe how to determine an expression that represents the set of states that can be reached either by firing a discrete transition or by advancing time. These operations are used to determine the set of reachable states symbolically. We also show how to symbolically determine the set of states that can reach a given set of states (i.e., a backwards step), thus making it possible to verify TCTL-formulae symbolically. The analysis is fully symbolic in the sense that both the discrete and the continuous part of the state space are represented symbolically. Furthermore, both the synchronous and asynchronous concurrent composition of timed systems can be performed symbolically. The symbolic representations are given as formulae expressed...

  8. Tracer verification and monitoring of containment systems

    International Nuclear Information System (INIS)

    Lowry, W.; Dunn, S.D.; Williams, C.

    1996-01-01

    In-situ barrier emplacement techniques and materials for the containment of high-risk contaminants in soils are currently being developed by the Department of Energy (DOE). Because of their relatively high cost, the barriers are intended to be used in cases where the risk is too great to remove the contaminants, the contaminants are too difficult to remove with current technologies, or the potential for movement of the contaminants to the water table is so high that immediate action needs to be taken to reduce health risks. Consequently, barriers are primarily intended for use in high-risk sites where few viable alternatives exist to stop the movement of contaminants in the near term. Assessing the integrity of the barrier once it is emplaced, and during its anticipated life, is a very difficult but necessary requirement. Existing surface-based and borehole geophysical techniques do not provide the degree of resolution required to assure the formation of an integral in-situ barrier. Science and Engineering Associates, Inc. (SEA) and Sandia National Laboratories (SNL) are developing a quantitative subsurface barrier assessment system using gaseous tracers. Called SEAtrace™, this system integrates an autonomous, multipoint soil vapor sampling and analysis system with a global optimization modeling methodology to pinpoint leak sources and sizes in real time. SEAtrace™ is applicable to impermeable barrier emplacements above the water table, providing a conservative assessment of barrier integrity after emplacement, as well as a long term integrity monitoring function. The SEAtrace™ system is being developed under funding by the DOE-EM Subsurface Contaminant Focus Area

  9. Verification station for Sandia/Rockwell Plutonium Protection system

    International Nuclear Information System (INIS)

    Nicholson, N.; Hastings, R.D.; Henry, C.N.; Millegan, D.R.

    1979-04-01

    A verification station has been designed to confirm the presence of plutonium within a container module. These container modules [about 13 cm (5 in.) in diameter and 23 cm (9 in.) high] hold sealed food-pack cans containing either plutonium oxide or metal and were designed by Sandia Laboratories to provide security and continuous surveillance and safety. After the plutonium is placed in the container module, it is closed with a solder seal. The verification station discussed here is used to confirm the presence of plutonium in the container module before it is placed in a carousel-type storage array inside the plutonium storage vault. This measurement represents the only technique that uses nuclear detectors in the plutonium protection system

  10. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiency and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  11. LHC Beam Loss Monitoring System Verification Applications

    CERN Document Server

    Dehning, B; Zamantzas, C; Jackson, S

    2011-01-01

    The LHC Beam Loss Monitoring (BLM) system is one of the most complex instrumentation systems deployed in the LHC. In addition to protecting the collider, the system also needs to provide a means of diagnosing machine faults and deliver a feedback of losses to the control room as well as to several systems for their setup and analysis. It has to transmit and process signals from almost 4'000 monitors, and has nearly 3 million configurable parameters. The system was designed with reliability and availability in mind. The specified operation and the fail-safety standards must be guaranteed for the system to perform its function in preventing superconductive magnet destruction caused by particle flux. Maintaining the expected reliability requires extensive testing and verification. In this paper we report our most recent addit...

  12. Geometrical verification system using Adobe Photoshop in radiotherapy.

    Science.gov (United States)

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films. It is possible to scan images and then view them simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared the following two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured the distances between the isocenter on simulation films and that on portal films by aligning the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured the distances between the isocenter on simulation films and those on portal films by aligning the bony structures. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes. This verification approach is available on any PC with Adobe Photoshop installed and is useful for improving the accuracy of verification.

  13. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying the accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies

  14. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying the accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies

  15. Expert system verification and validation survey. Delivery 3: Recommendations

    Science.gov (United States)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues is being addressed and to what extent they have impacted the development of ESs.

  16. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release [...]. Using SMT-based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.

  17. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release [...]. Using SMT-based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.

  18. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer

    2017-01-01

    Full Text Available System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures; on the other hand, Prolog's non-determinism and backtracking make it easy to test different variations of the program flow without great effort. A rule-based approach with Prolog makes it possible to characterize the verification goals in a concise and declarative way. In this paper, we describe our approach to verify the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then further tested against verification goals. The different program-flow branches due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study, where we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain specific language (DSL) in Prolog to express the verification goals.
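
    The record above describes a Prolog-based analysis; as a language-neutral illustration only (not the authors' Prolog code), the sketch below shows the underlying idea of checking call sequences derived from an AST against a simple semaphore discipline. The path contents and call names are invented.

      # Each path is one execution sequence of calls derived from a branch of the
      # control structures (the "subtrees" mentioned in the abstract).
      paths = [
          ["init", "sem.lock", "write_block", "sem.unlock", "sync"],
          ["init", "sem.lock", "write_block", "sync"],      # faulty: lock never released
      ]

      def semaphore_ok(path):
          """Return True if every lock on the path is matched by a later unlock."""
          depth = 0
          for call in path:
              if call == "sem.lock":
                  depth += 1
              elif call == "sem.unlock":
                  if depth == 0:
                      return False        # unlock without a preceding lock
                  depth -= 1
          return depth == 0               # all locks released by the end of the path

      for i, p in enumerate(paths):
          print(f"path {i}:", "ok" if semaphore_ok(p) else "violates semaphore discipline")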

  19. Rule Systems for Runtime Verification: A Short Tutorial

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
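
    As a minimal illustration of the rule-based trace analysis described above (this is not RuleR or LogScope syntax; the event names and the single rule are invented), the following Python sketch checks that every dispatched command in a log is eventually followed by a matching success event:

      log = [
          ("COMMAND", "open_valve"),
          ("SUCCESS", "open_valve"),
          ("COMMAND", "start_pump"),    # never succeeds, reported at end of trace
      ]

      def monitor(events):
          pending = set()                # commands still awaiting a SUCCESS event
          for kind, name in events:
              if kind == "COMMAND":
                  pending.add(name)
              elif kind == "SUCCESS":
                  pending.discard(name)
          return pending                 # non-empty set means violations

      violations = monitor(log)
      print("violations:", violations or "none")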

  20. An evaluation of the management system verification pilot at Hanford

    International Nuclear Information System (INIS)

    Briggs, C.R.; Ramonas, L.; Westendorf, W.

    1998-01-01

    The Chemical Management System (CMS), currently under development at Hanford, was used as the "test program" for pilot testing the value-added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial-sector "best practice" for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved the conduct of two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS, including the Department of Energy Richland Operations (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCs). A semi-structured interview process was employed by the team of three "verifiers" who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An "MSV Pilot Effectiveness Survey" also was completed by each panel participant immediately following the interview

  1. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    In this paper a prototype of the Requirement Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required by the RTVS are managing, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents was performed to define the user of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  2. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of the Requirement Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required by the RTVS are managing, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents was performed to define the user of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  3. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Full Text Available Biometric-based identification/verification systems provide a solution to the security concerns in the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometrics. Fingerprint biometric systems are either minutiae-based or pattern-learning (image) based. The minutiae-based algorithm depends upon the local discontinuities in the ridge flow pattern and is used when template size is important, while the image-based matching algorithm uses both the micro and macro features of a fingerprint and is used if fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for usage as a database. Datalog files can access and manipulate data and complex data structures quickly and easily, which makes writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy with a learning image size of 100 × 100 and a threshold value of 700 (1000 being the perfect match) has been achieved.
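
    As a rough sketch of the idea only (not the LabVIEW implementation), the following Python fragment scores a candidate fingerprint image against an enrolled template by comparing grey levels at pseudo-randomly sub-sampled positions on a 0-1000 scale, mirroring the 700 threshold quoted above; the images and sample count are synthetic assumptions.

      import numpy as np

      rng = np.random.default_rng(seed=42)            # fixed seed -> repeatable sub-sampling
      template  = rng.integers(0, 256, (100, 100))    # stand-ins for real grey-level images
      candidate = template.copy()
      candidate[40:60, 40:60] += 10                   # mild local distortion

      def match_score(a, b, n_samples=2000):
          rows = rng.integers(0, a.shape[0], n_samples)
          cols = rng.integers(0, a.shape[1], n_samples)
          diff = np.abs(a[rows, cols].astype(int) - b[rows, cols].astype(int))
          return int(1000 * (1.0 - diff.mean() / 255.0))   # 1000 = perfect match

      score = match_score(template, candidate)
      print("score:", score, "->", "accept" if score >= 700 else "reject")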

  4. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Biometric-based authentication systems provide solutions to the problems in high security which remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good candidates as biometric parameters but do not provide the guarantee that the person is present and alive. A voice can be copied, a fingerprint can be picked up from glass onto synthetic skin, and in a face recognition system, due to genetic factors, identical twins or a father and son may have the same facial appearance. ECG does not have these problems. It cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometrics verification system, which was developed using Laboratory Virtual Instruments Engineering Workbench (LabVIEW) version 7.1, is discussed. Experiments were conducted on the database stored in the laboratory, consisting of 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
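
    For readers unfamiliar with the reported figures, a minimal sketch of how FRR and FAR are computed from verification scores follows; the scores and threshold below are invented for illustration and are not the paper's data.

      genuine_scores  = [0.91, 0.85, 0.88, 0.62, 0.95]   # same-person comparisons
      impostor_scores = [0.30, 0.72, 0.41, 0.55, 0.28]   # different-person comparisons
      threshold = 0.70                                    # accept if score >= threshold

      frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
      far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
      print(f"FRR = {frr:.1%}, FAR = {far:.1%}")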

  5. FIR signature verification system characterizing dynamics of handwriting features

    Science.gov (United States)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on the finite impulse response (FIR) system characterizing time-frequency characteristics of dynamic handwriting features. First, the barycenter determined from both the center point of signature and two adjacent pen-point positions in the signing process, instead of one pen-point position, is used to reduce the fluctuation of handwriting motion. In this paper, among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. Thus, the stable dynamic handwriting features can be described by the relation of the time-frequency characteristics of the dynamic handwriting features. In this study, the aforesaid relation can be represented by the FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system is used as the individual feature for a particular signature. In short, the signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database consisting of 5,000 signatures from 100 signers. The proposed method yielded equal error rate (EER) of 3.21% on skilled forgeries.
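
    The core step, relating one feature sequence to another through an FIR system whose impulse response serves as the signature feature, can be sketched as a least-squares identification; the synthetic signals and filter order below are assumptions, not the MCYT data or the paper's wavelet front end.

      import numpy as np

      rng = np.random.default_rng(0)
      true_h = np.array([0.5, -0.2, 0.1])             # "individual feature" to recover
      x = rng.standard_normal(200)                    # input sequence (e.g. wavelet coefficients)
      y = np.convolve(x, true_h)[:len(x)]             # output sequence of the FIR system

      order = 3
      # Build the convolution matrix X so that y ~= X @ h, then solve for h.
      X = np.column_stack([np.concatenate([np.zeros(k), x[:len(x) - k]]) for k in range(order)])
      h_est, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("estimated impulse response:", np.round(h_est, 3))

      # Verification would compare h_est against the reference signature's response,
      # e.g. via a distance measure, and accept if the distance is below a threshold.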

  6. Transacting generation attributes across market boundaries: Compatible information systems and the treatment of imports and exports

    Energy Technology Data Exchange (ETDEWEB)

    Grace, Robert; Wiser, Ryan

    2002-11-01

    Voluntary markets for "green" power, and mandatory policies such as fuel source disclosure requirements and renewables portfolio standards, each rely on the ability to differentiate electricity by the "attributes" of the generation. Throughout North America, electricity markets are devising accounting and verification systems for generation "attributes": those characteristics of a power plant's production, such as fuel source and emissions, that differentiate it from undifferentiated (or "commodity") electricity. These accounting and verification systems are intended to verify compliance with market mandates, create accurate disclosure labels, substantiate green power claims, and support emissions markets. Simultaneously, interest is growing in transacting (importing or exporting) generation attributes across electricity market borders, with or without associated electricity. Cross-border renewable attribute transactions have advantages and disadvantages. Broad access to markets may encourage more renewable generation at lower cost, but this result may conflict with desires to assure that at least some renewable resources are built locally to achieve either local policy goals or purchaser objectives. This report is intended to serve as a resource document for those interested in and struggling with cross-border renewable attribute transactions. The report assesses the circumstances under which renewable generation attributes from a "source" region might be recognized in a "sink" region. The report identifies several distinct approaches that might be used to account for and verify attribute import and export transactions, and assesses the suitability of these alternative approaches. Because policymakers have often made systems "compatibility" between market areas a pre-requisite to allowing cross

  7. Memory Efficient Data Structures for Explicit Verification of Timed Systems

    DEFF Research Database (Denmark)

    Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand

    2014-01-01

    Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. The explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of timed-arc Petri nets, we explore new data structures for lowering the used memory: PTries for efficient storing of configurations and time darts for semi-symbolic description of the state-space. Both methods are implemented as a part of the tool TAPAAL and the experiments document at least one order of magnitude of memory savings while preserving comparable verification times.

  8. Secure distributed key generation in attribute based encryption systems

    NARCIS (Netherlands)

    Pletea, D.; Sedghi, S.; Veeningen, M.; Petkovic, M.

    2016-01-01

    Nowadays usage of cloud computing is increasing in popularity and this raises new data protection challenges. In such distributed systems it is unrealistic to assume that the servers are fully trusted in enforcing the access policies. Attribute Based Encryption (ABE) is one of the solutions proposed

  9. Specification and verification of the RTOS for plant protection systems

    International Nuclear Information System (INIS)

    Kim, Jin Hyun; Ahn, Young Ah; Lee, Su-Young; Choi, Jin Young; Lee, Na Young

    2004-01-01

    A PLC is a computer system for instrumentation and control (I and C), such as the control of machinery on factory assembly lines and in nuclear power plants. In the nuclear power industry, systems are classified into three classes - non-safety, safety-related and safety-critical - according to the integrity required for the system's intended use. If a PLC is used for controlling the reactor in a nuclear power plant, it should be identified as safety-critical. A PLC has several I and C logics in software, including a real-time operating system (RTOS). Hence, the RTOS must also be proved safe and reliable by various ways and methods. In this paper, we apply formal methods to the development of an RTOS for a PLC at the safety-critical level: Statecharts for specification and model checking for verification. We give the results of applying these formal methods to the RTOS. (author)

  10. Verification of Opacity and Diagnosability for Pushdown Systems

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    Full Text Available In control theory of discrete event systems (DESs, one of the challenging topics is the extension of theory of finite-state DESs to that of infinite-state DESs. In this paper, we discuss verification of opacity and diagnosability for infinite-state DESs modeled by pushdown automata (called here pushdown systems. First, we discuss opacity of pushdown systems and prove that opacity of pushdown systems is in general undecidable. In addition, a decidable class is clarified. Next, in diagnosability, we prove that under a certain assumption, which is different from the assumption in the existing result, diagnosability of pushdown systems is decidable. Furthermore, a necessary condition and a sufficient condition using finite-state approximations are derived. Finally, as one of the applications, we consider data integration using XML (Extensible Markup Language. The obtained result is useful for developing control theory of infinite-state DESs.

  11. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    Science.gov (United States)

    1996-09-01

    A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199, School of Computer Science. Keywords: real-time systems, formal verification, symbolic...

  12. Methods and practices for verification and validation of programmable systems

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Haapanen, P.; Pulkkinen, U.

    1993-01-01

    Programmable systems deviate in their properties and behaviour from conventional non-programmable systems to such an extent that their verification and validation for safety-critical applications require new methods and practices. The safety assessment cannot be based on conventional probabilistic methods due to the difficulties in quantifying the reliability of the software and hardware. The reliability estimate of the system must be based on qualitative arguments linked to a conservative claim limit. Due to the uncertainty of the quantitative reliability estimate, other means must be used to gain more assurance about the system's safety. Methods and practices based on research done by VTT for STUK are discussed in the paper, as well as the methods applicable to the reliability analysis of software-based safety functions. The most essential concepts and models of quantitative reliability analysis are described. The application of software models in probabilistic safety analysis (PSA) is evaluated. (author). 18 refs

  13. Evaluation of an attributive measurement system in the automotive industry

    Science.gov (United States)

    Simion, C.

    2016-08-01

    Measurement System Analysis (MSA) is a critical component of any quality improvement process. MSA is defined as an experimental and mathematical method of determining how much the variation within the measurement process contributes to overall process variability, and it falls into two categories: attribute and variable. Most problematic measurement system issues come from measuring attribute data, which are usually the result of human judgment (visual inspection). Because attributive measurement systems are often used in manufacturing processes, their assessment is important to obtain confidence in the inspection process, to see where the problems are in order to eliminate them, and to guide process improvement. It was the aim of this paper to address such an issue through a case study carried out at a local company in the Sibiu region that supplies products to the automotive industry, specifically the bag (a technical textile component, i.e. the fabric) for the airbag module. Because defects are inherent in every manufacturing process, and because in the field of airbag systems a minor defect can influence performance and lives depend on the safety feature, stringent visual inspection of defects in the bag material is required. The purpose of this attribute MSA was: to determine if all inspectors use the same criteria to distinguish "pass" from "fail" product (i.e. the fabric); to assess company inspection standards against the customer's requirements; to determine how well inspectors conform to themselves; to identify how inspectors conform to a "known master," which includes how often operators ship defective product and how often operators dispose of acceptable product; and to discover areas where training is required, procedures must be developed and standards are not available. The results were analyzed using MINITAB software with its Attribute Agreement Analysis module. The conclusion was that the inspection process must
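
    The agreement statistics behind such an attribute MSA can be illustrated with a short computation; this is a simplified sketch, not Minitab's Attribute Agreement Analysis itself, and the ratings below are invented.

      standard = ["pass", "fail", "pass", "pass", "fail", "pass"]   # known master
      ratings = {   # each inspector rated every part twice
          "inspector_A": [["pass", "fail", "pass", "pass", "fail", "pass"],
                          ["pass", "fail", "pass", "fail", "fail", "pass"]],
          "inspector_B": [["pass", "fail", "fail", "pass", "fail", "pass"],
                          ["pass", "fail", "fail", "pass", "fail", "pass"]],
      }

      for name, (trial1, trial2) in ratings.items():
          n = len(standard)
          within = sum(a == b for a, b in zip(trial1, trial2)) / n              # repeatability
          vs_std = sum(a == b == s for a, b, s in zip(trial1, trial2, standard)) / n
          print(f"{name}: within-appraiser {within:.0%}, agreement with standard {vs_std:.0%}")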

  14. Incorporation of an item/material attribute system into PAMTRAK

    International Nuclear Information System (INIS)

    Anspach, D.A.; Waddoups, I.G.; Fox, E.T.

    1994-01-01

    The Department of Energy (DOE) mission is changing as a result of nuclear weapon reductions by the United States and the former Soviet Union, and long-term storage requirements for DOE sites are increasing. New technology to ensure the integrity of special nuclear material (SNM) in storage is available to sites to supplement manual physical inventories. This allows them to decrease operating costs while keeping radiation exposure at minimal levels. We have developed a generic, real-time personnel tracking and material monitoring system named PAMTRAK. Such a system can significantly reduce the number of required manual physical inventories at DOE sites while increasing assurance that an insider has not diverted or stolen material. Until recently PAMTRAK used only material monitoring devices that provided location/containment attributes. However, Westinghouse Electric Corp. and Metrox, Inc. have recently developed hard-wired item/material attribute systems that monitor both temperature and weight. We have incorporated both of these systems into PAMTRAK. If a site employed one of these item/material attribute systems, it could decrease its manual inventory frequency to three years. This paper describes how a site might implement such a system to meet the DOE's requirements

  15. Standard practice for verification and classification of extensometer systems

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers procedures for the verification and classification of extensometer systems, but it is not intended to be a complete purchase specification. The practice is applicable only to instruments that indicate or record values that are proportional to changes in length corresponding to either tensile or compressive strain. Extensometer systems are classified on the basis of the magnitude of their errors. 1.2 Because strain is a dimensionless quantity, this document can be used for extensometers based on either SI or US customary units of displacement. Note 1—Bonded resistance strain gauges directly bonded to a specimen cannot be calibrated or verified with the apparatus described in this practice for the verification of extensometers having definite gauge points. (See procedures as described in Test Methods E251.) 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...

  16. A virtual linear accelerator for verification of treatment planning systems

    International Nuclear Information System (INIS)

    Wieslander, Elinore

    2000-01-01

    A virtual linear accelerator is implemented into a commercial pencil-beam-based treatment planning system (TPS) with the purpose of investigating the possibility of verifying the system using a Monte Carlo method. The characterization set for the TPS includes depth doses, profiles and output factors, which is generated by Monte Carlo simulations. The advantage of this method over conventional measurements is that variations in accelerator output are eliminated and more complicated geometries can be used to study the performance of a TPS. The difference between Monte Carlo simulated and TPS calculated profiles and depth doses in the characterization geometry is less than ±2% except for the build-up region. This is of the same order as previously reported results based on measurements. In an inhomogeneous, mediastinum-like case, the deviations between TPS and simulations are small in the unit-density regions. In low-density regions, the TPS overestimates the dose, and the overestimation increases with increasing energy from 3.5% for 6 MV to 9.5% for 18 MV. This result points out the widely known fact that the pencil beam concept does not handle changes in lateral electron transport, nor changes in scatter due to lateral inhomogeneities. It is concluded that verification of a pencil-beam-based TPS with a Monte Carlo based virtual accelerator is possible, which facilitates the verification procedure. (author)

  17. Development of the clearance level verification evaluation system. 2. Construction of the clearance data management system

    International Nuclear Information System (INIS)

    Kubota, Shintaro; Usui, Hideo; Kawagoshi, Hiroshi

    2014-06-01

    Clearance is defined as the removal of radioactive materials or radioactive objects within authorized practices from any further regulatory control by the regulatory body. In Japan, a clearance level and a procedure for its verification have been introduced under the laws and regulations, and solid clearance wastes inspected by the national authority can be handled and recycled as normal wastes. The most prevalent wastes of this type have been generated from the dismantling of nuclear facilities, so the Japan Atomic Energy Agency (JAEA) has been developing the Clearance Level Verification Evaluation System (CLEVES) as a convenient tool. The Clearance Data Management System (CDMS), which is a part of CLEVES, has been developed to support measurement, evaluation, and the making and recording of documents associated with clearance level verification. In addition, validation of the evaluation results of the CDMS was carried out by inputting data from actual clearance activities in the JAEA. Clearance level verification is easily performed by using the CDMS for clearance activities. (author)

  18. Use of Commericially Available Software in an Attribute Measurement System

    International Nuclear Information System (INIS)

    MacArthur, Duncan W.; Bracken, David S.; Carrillo, Louis A.; Elmont, Timothy H.; Frame, Katherine C.; Hirsch, Karen L.

    2005-01-01

    A major issue in international safeguards of nuclear materials is the ability to verify that processes and materials in nuclear facilities are consistent with declaration without revealing sensitive information. An attribute measurement system (AMS) is a non-destructive assay (NDA) system that utilizes an information barrier to protect potentially sensitive information about the measurement item. A key component is the software utilized for operator interface, data collection, analysis, and attribute determination, as well as the operating system under which they are implemented. Historically, custom software has been used almost exclusively in transparency applications, and it is unavoidable that some amount of custom software is needed. The focus of this paper is to explore the extent to which commercially available software may be used and the relative merits.

  19. An Attribute Based Access Control Framework for Healthcare System

    Science.gov (United States)

    Afshar, Majid; Samet, Saeed; Hu, Ting

    2018-01-01

    Nowadays, access control is an indispensable part of the Personal Health Record and provides for its confidentiality by enforcing policies and rules to ensure that only authorized users gain access to requested resources in the system. In other words, access control means protecting patient privacy in healthcare systems. Attribute-Based Access Control (ABAC) is a new access control model that can be used instead of other traditional types of access control such as Discretionary Access Control, Mandatory Access Control, and Role-Based Access Control. During the last five years ABAC has seen applications in both academic research and industry. ABAC makes a decision on an access request according to the attributes of the user and the resource. In this paper, we propose an ABAC framework for a healthcare system. We use the ABAC engine for rendering and enforcing healthcare policies. Moreover, we handle emergency situations in this framework.
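
    A minimal sketch of the ABAC decision step follows; the attribute names, the two rules and the healthcare scenario are hypothetical and only illustrate how attributes of the user and the resource drive the decision (the paper's policy engine and emergency handling are not reproduced here).

      policy = [
          {"role": "physician", "resource_type": "health_record",
           "condition": lambda u, r: u["department"] == r["department"]},
          {"role": "patient", "resource_type": "health_record",
           "condition": lambda u, r: u["id"] == r["owner_id"]},
      ]

      def is_permitted(user, resource):
          """Permit if any rule's attribute conditions are all satisfied."""
          for rule in policy:
              if (user["role"] == rule["role"]
                      and resource["type"] == rule["resource_type"]
                      and rule["condition"](user, resource)):
                  return True
          return False

      user = {"id": "u17", "role": "physician", "department": "cardiology"}
      record = {"type": "health_record", "owner_id": "u42", "department": "cardiology"}
      print("access granted" if is_permitted(user, record) else "access denied")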

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: TRITON SYSTEMS, LLC SOLID BOWL CENTRIFUGE, MODEL TS-5000

    Science.gov (United States)

    Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft3. ...

  1. Electroacoustic verification of frequency modulation systems in cochlear implant users.

    Science.gov (United States)

    Fidêncio, Vanessa Luisa Destro; Jacob, Regina Tangerino de Souza; Tanamati, Liége Franzini; Bucuvic, Érika Cristina; Moret, Adriane Lima Mortari

    2017-12-26

    The frequency modulation system is a device that helps to improve speech perception in noise and is considered the most beneficial approach for improving speech recognition in noise in cochlear implant users. According to guidelines, a check needs to be performed before fitting the frequency modulation system. Although there are recommendations regarding the behavioral tests that should be performed when fitting the frequency modulation system to cochlear implant users, there are no published recommendations regarding the electroacoustic test that should be performed. The aim was to perform, and determine the validity of, an electroacoustic verification test for frequency modulation systems coupled to different cochlear implant speech processors. The sample included 40 participants between 5 and 18 years of age, users of four different models of speech processors. For the electroacoustic evaluation, we used the Audioscan Verifit device with the HA-1 coupler and the listening check devices corresponding to each speech processor model. In cases where transparency was not achieved, a modification was made to the frequency modulation gain adjustment, and we used the Brazilian version of the "Phrases in Noise Test" to evaluate speech perception in competitive noise. Transparency between the frequency modulation system and the cochlear implant was observed in 85% of the participants evaluated. After adjusting the gain of the frequency modulation receiver in the other participants, the devices showed transparency when the electroacoustic verification test was repeated. It was also observed that patients demonstrated better performance in speech perception in noise after the new adjustment; that is, in these cases the electroacoustic transparency produced behavioral transparency. The suggested electroacoustic evaluation protocol was effective in evaluating transparency between the frequency modulation system and the cochlear implant. Performing the adjustment of

  2. European Train Control System: A Case Study in Formal Verification

    Science.gov (United States)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
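
    The flavor of the parameter constraints mentioned above can be conveyed with a toy check; the constant-deceleration model and the numbers are assumptions for illustration and are much simpler than the verified ETCS braking model.

      def may_keep_accelerating(v, d, b):
          """True if a train at speed v (m/s), at distance d (m) from the end of its
          movement authority, could still brake to a stop with deceleration b (m/s^2)."""
          return v * v / (2.0 * b) < d

      print(may_keep_accelerating(v=50.0, d=1500.0, b=1.0))   # True: stopping distance 1250 m
      print(may_keep_accelerating(v=60.0, d=1500.0, b=1.0))   # False: stopping distance 1800 m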

  3. Real-Time System Verification by Kappa-Induction

    Science.gov (United States)

    Pike, Lee S.

    2005-01-01

    We report the first formal verification of a reintegration protocol for a safety-critical, fault-tolerant, real-time distributed embedded system. A reintegration protocol increases system survivability by allowing a node that has suffered a fault to regain state consistent with the operational nodes. The protocol is verified in the Symbolic Analysis Laboratory (SAL), where bounded model checking and decision procedures are used to verify infinite-state systems by k-induction. The protocol and its environment are modeled as synchronizing timeout automata. Because k-induction is exponential with respect to k, we optimize the formal model to reduce the size of k. Also, the reintegrator's event-triggered behavior is conservatively modeled as time-triggered behavior to further reduce the size of k and to make it invariant to the number of nodes modeled. A corollary is that a clique avoidance property is satisfied.
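
    The k-induction proof rule used above can be sketched on a toy explicit-state system (the actual protocol is modeled in SAL with timeout automata; the transition system and property below are invented purely to show the base case and the induction step).

      states = range(5)
      init = {0}
      def step(s):                        # transition relation: counter saturating at 3
          return {min(s + 1, 3)}
      def P(s):                           # property to verify: the value 4 is never reached
          return s != 4

      def k_induction(k):
          # Base case: P holds in the first k states of every run from an initial state.
          frontier = set(init)
          for _ in range(k):
              if not all(P(s) for s in frontier):
                  return False
              frontier = {t for s in frontier for t in step(s)}
          # Induction step: any k consecutive P-states can only be followed by a P-state.
          paths = [(s,) for s in states if P(s)]
          for _ in range(k):
              paths = [p + (t,) for p in paths if all(P(s) for s in p) for t in step(p[-1])]
          return all(P(p[-1]) for p in paths)

      print(k_induction(k=1))             # True: the property is already 1-inductive here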

  4. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    Halsey, D.G.

    1982-01-01

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance

  5. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    Sorenson, R.J.; Kaye, J.H.

    1974-03-01

    A real-time system is proposed which would allow CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate a specific data handling system and vehicle were required. A Tracor Northern TN-11 data handling system including a PDP-11 minicomputer and a measurement vehicle similar to the Commission's Regulatory Region I van were used. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the systems' capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack mount the data handling equipment offers a number of benefits such as control of equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable "cart" and use of a telephone link to a large computer center

  6. Technology verification phase. Dynamic isotope power system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  7. Application of verification and validation on safety parameter display systems

    International Nuclear Information System (INIS)

    Thomas, N.C.

    1983-01-01

    Offers some explanation of how verification and validation (V and V) can support development and licensing of the Safety Parameter Display Systems (SPDS). Advocates that V and V can be more readily accepted within the nuclear industry if a better understanding exists of what the objectives of V and V are and should be. Includes a discussion regarding a reasonable balance of costs and benefits of V and V as applied to the SPDS and to other digital systems. Represents the author's perception of the regulator's perspective based on background information and experience, and discussions with regulators about their current concerns and objectives. Suggests that the introduction of the SPDS into the Control Room is a first step towards growing dependency on use of computers

  8. Space Transportation System Availability Requirements and Its Influencing Attributes Relationships

    Science.gov (United States)

    Rhodes, Russell E.; Adams, Timothy C.; McCleskey, Carey M.

    2008-01-01

    It is important that engineering and management accept the need for an availability requirement that is derived together with its influencing attributes. The intent of this paper is to provide visibility into the relationships of these major attribute drivers (variables) to each other and to the resultant system inherent availability. It is also important to provide bounds on the variables, giving engineering the insight required to control the system's engineering solution; these influencing attributes become design requirements as well. These variables drive the need to integrate similar discipline functions or to select technology so as to control the total parts count. Selecting a reliability requirement places a constraint on parts count in order to achieve a given availability requirement; if the parts count is allowed to increase, the system reliability requirement is driven higher. These relationships also provide an understanding of how mean repair time (or mean down time) relates to maintainability, e.g., accessibility for repair, and of how both relate to the mean time between failures, e.g., hardware reliability, and to availability. The importance of achieving a strong availability requirement is driven by the need for affordability, the choice of using a two-launch solution for a single space application, and the need to control the spare parts count needed to support a long stay in orbit or on the surface of the moon. Understanding the requirements before starting the architectural design concept avoids the considerable time and money required to iterate the design through the redesign and assessment process needed to achieve the results required by the customer. In fact, the impact on the schedule for delivering a system that meets the customer's needs, goals, and objectives may cause the customer to compromise the desired operational goals and objectives, resulting in considerable
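
    The central quantitative relationship being discussed, inherent availability as a function of reliability and maintainability, can be written out briefly; the MTBF/MTTR numbers below are placeholders, not program values.

      def inherent_availability(mtbf_hours, mttr_hours):
          """Ai = MTBF / (MTBF + MTTR)."""
          return mtbf_hours / (mtbf_hours + mttr_hours)

      # Raising reliability (higher MTBF) or improving maintainability (lower MTTR)
      # both push availability toward 1; parts-count growth works against MTBF.
      print(inherent_availability(mtbf_hours=1000.0, mttr_hours=10.0))   # ~0.990
      print(inherent_availability(mtbf_hours=1000.0, mttr_hours=50.0))   # ~0.952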

  9. Space Transportation System Availability Requirement and Its Influencing Attributes Relationships

    Science.gov (United States)

    Rhodes, Russel E.; Adams, Timothy C.; McCleskey, Carey M.

    2008-01-01

    It is important that engineering and management accept the need for an availability requirement that is derived together with its influencing attributes. The intent of this paper is to provide visibility into the relationships of these major attribute drivers (variables) to each other and to the resultant system inherent availability. It is also important to provide bounds on the variables, giving engineering the insight required to control the system's engineering solution; these influencing attributes become design requirements as well. These variables drive the need to integrate similar discipline functions or to select technology so as to control the total parts count. Selecting a reliability requirement places a constraint on parts count in order to achieve a given availability requirement; if the parts count is allowed to increase, the system reliability requirement is driven higher. These relationships also provide an understanding of how mean repair time (or mean down time) relates to maintainability, e.g., accessibility for repair, and of how both relate to the mean time between failures, e.g., hardware reliability, and to availability. The importance of achieving a strong availability requirement is driven by the need for affordability, the choice of using a two-launch solution for a single space application, and the need to control the spare parts count needed to support a long stay in orbit or on the surface of the moon. Understanding the requirements before starting the architectural design concept avoids the considerable time and money required to iterate the design through the redesign and assessment process needed to achieve the results required by the customer. In fact, the impact on the schedule for delivering a system that meets the customer's needs, goals, and objectives may cause the customer to compromise the desired operational goals and objectives, resulting in considerable

  10. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provide assistance to the subject...

  11. Dependent Space and Attribute Reduction on Fuzzy Information System

    Directory of Open Access Journals (Sweden)

    Shu Chang

    2017-01-01

    Full Text Available From an equivalence relation RBδ on the discourse domain U, we can derive an equivalence relation Rδ on the attribute set A. From the equivalence relation Rδ on A, we can derive a congruence relation on the attribute power set P(A) and establish an object-dependent space. We then discuss the reduction method for a fuzzy information system on the object-dependent space. Finally, the example in this paper demonstrates the feasibility and effectiveness of the reduction method based on the congruence relation Tδ, providing insight into the link between the equivalence relation and the congruence relation of dependent spaces in rough set theory. In this way, the paper provides theoretical support for the combined use of the reduction methods, so it is of practical value.

  12. 75 FR 4100 - Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations

    Science.gov (United States)

    2010-01-26

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-04] Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations AGENCY: Office of the Chief Information Officer... Following Information Title of Proposal: Enterprise Income Verification (EIV) System- Debts Owed to PHAs and...

  13. Light Steering Projection Systems and Attributes for HDR Displays

    KAUST Repository

    Damberg, Gerwin

    2017-06-02

    New light steering projectors in cinema form images by moving light away from dark regions into bright areas of an image. In these systems, the peak luminance of small features can far exceed full screen white luminance. In traditional projectors where light is filtered or blocked in order to give shades of gray (or colors), the peak luminance is fixed. The luminance of chromatic features benefit in the same way as white features, and chromatic image details can be reproduced at high brightness leading to a much wider overall color gamut coverage than previously possible. Projectors of this capability are desired by the creative community to aid in and enhance storytelling. Furthermore, reduced light source power requirements of light steering projectors provide additional economic and environmental benefits. While the dependency of peak luminance level on (bright) image feature size is new in the digital cinema space, display technologies with identical characteristics such as OLED, LED LCD and Plasma TVs are well established in the home. Similarly, direct view LED walls are popular in events, advertising and architectural markets. To enable consistent color reproduction across devices in today’s content production pipelines, models that describe modern projectors and display attributes need to evolve together with HDR standards and available metadata. This paper is a first step towards rethinking legacy display descriptors such as contrast, peak luminance and color primaries in light of new display technology. We first summarize recent progress in the field of light steering projectors in cinema and then, based on new projector and existing display characteristics propose the inclusion of two simple display attributes: Maximum Average Luminance and Peak (Color) Primary Luminance. We show that the proposed attributes allow a better prediction of content reproducibility on HDR displays. To validate this assertion, we test professional content on a commercial HDR

  14. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    Full Text Available We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.
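
    A much-simplified sketch of reliability-weighted fusion follows (the paper's reliability estimator is considerably more elaborate); the scores, reliabilities and threshold below are invented for illustration.

      def fuse(scores, reliabilities, threshold=0.5):
          """Combine per-modality scores, weighting each by its estimated reliability."""
          fused = sum(s * r for s, r in zip(scores, reliabilities)) / sum(reliabilities)
          return fused, fused >= threshold

      # The face score is low but so is its estimated reliability (e.g. poor lighting),
      # so the more reliable speech modality dominates the fused decision.
      fused, accept = fuse(scores=[0.82, 0.35], reliabilities=[0.9, 0.2])
      print(f"fused score {fused:.2f} -> {'accept' if accept else 'reject'}")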

  15. Dual-use benefits of the CTBT verification system

    International Nuclear Information System (INIS)

    Meade, C.E.F.

    1999-01-01

    Since its completion in September 1996, the CTBT has been signed by 151 countries. While the 44 required ratifications and entry into force are awaited, all of the nuclear powers have imposed unilateral moratoriums on nuclear test explosions. The end of these weapons development activities is often cited as the principal benefit of the CTBT. As the world begins to implement the Treaty, it has become clear that the development and operation of the CTBT verification system will provide a wide range of additional benefits if the data analysis products are made available for dual-purpose applications. As this paper describes, these could have economic and social implications, especially for countries with limited technical infrastructures. They involve seismic monitoring, mineral exploration, and scientific and technical training

  16. A new approach for the verification of optical systems

    Science.gov (United States)

    Siddique, Umair; Aravantinos, Vincent; Tahar, Sofiène

    2013-09-01

    Optical systems are increasingly used in microsystems, telecommunication, aerospace and laser industry. Due to the complexity and sensitivity of optical systems, their verification poses many challenges to engineers. Traditionally, the analysis of such systems has been carried out by paper-and-pencil based proofs and numerical computations. However, these techniques cannot provide perfectly accurate results due to the risk of human error and inherent approximations of numerical algorithms. In order to overcome these limitations, we propose to use theorem proving (i.e., a computer-based technique that allows one to express mathematical expressions and reason about them by taking into account all the details of mathematical reasoning) as an alternative to computational and numerical approaches to improve optical system analysis in a comprehensive framework. In particular, this paper provides a higher-order logic (a language used to express mathematical theories) formalization of ray optics in the HOL Light theorem prover. Based on the multivariate analysis library of HOL Light, we formalize the notion of light ray and optical system (by defining medium interfaces, mirrors, lenses, etc.), i.e., we express these notions mathematically in the software. This allows us to derive general theorems about the behavior of light in such optical systems. In order to demonstrate the practical effectiveness, we present the stability analysis of a Fabry-Perot resonator.
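
    For readers unfamiliar with the target theorem, the informal numerical counterpart of the resonator stability result is sketched below using the standard two-mirror stability criterion; the mirror curvatures and cavity length are illustrative values, not parameters from the paper.

```python
# Informal counterpart of the Fabry-Perot stability analysis formalized in
# HOL Light: a two-mirror cavity is stable when 0 <= g1*g2 <= 1 with
# g_i = 1 - L/R_i. All numbers below are illustrative.

def is_stable(length_m: float, r1_m: float, r2_m: float) -> bool:
    g1 = 1.0 - length_m / r1_m
    g2 = 1.0 - length_m / r2_m
    return 0.0 <= g1 * g2 <= 1.0

print(is_stable(length_m=0.5, r1_m=1.0, r2_m=1.0))   # True  (stable cavity)
print(is_stable(length_m=2.5, r1_m=1.0, r2_m=1.0))   # False (unstable cavity)
```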

  17. Development of film dosimetric measurement system for verification of RTP

    International Nuclear Information System (INIS)

    Chen Yong; Bao Shanglian; Ji Changguo; Zhang Xin; Wu Hao; Han Shukui; Xiao Guiping

    2007-01-01

    Objective: To develop a novel film dosimetry system based on a general laser scanner in order to verify patient-specific Radiotherapy Treatment Plans (RTP) in three-Dimensional Adaptable Radiotherapy (3D ART) and Intensity Modulated Radiotherapy (IMRT). Methods: Some advanced methods, including film saturated development, wavelet filtering with multi-resolution thresholds and discrete Fourier reconstruction, are employed in this system to reduce artifacts, noise and distortion induced by film digitizing with a general scanner; a set of coefficients derived from Monte Carlo (MC) simulation are adopted to correct the film over-response to low energy scattering photons; a set of newly emerging criteria, including the γ index and the Normalized Agreement Test (NAT) method, are employed to quantitatively evaluate agreement of 2D dose distributions between the results measured by the films and calculated by the Treatment Planning System (TPS), so as to obtain straightforward presentations, displays and results with high accuracy and reliability. Results: Radiotherapy doses measured by the developed system agree within 2% with those measured by ionization chamber and the VeriSoft Film Dosimetry System, and quantitative evaluation indexes are within 3%. Conclusions: The developed system can be used to accurately measure the radiotherapy dose and reliably make quantitative evaluation for RTP dose verification. (authors)
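
    The γ-index comparison mentioned above can be sketched as a brute-force computation over two dose grids; the 3 mm / 3% global criteria and the shared-grid assumption below are illustrative choices, and production tools restrict the search neighbourhood for speed.

```python
import numpy as np

def gamma_index_2d(dose_ref, dose_eval, spacing_mm, dta_mm=3.0, dose_tol=0.03):
    """Brute-force global 2D gamma index (Low et al. style criterion).

    dose_ref, dose_eval : 2D dose arrays on the same grid (e.g. film vs. TPS).
    spacing_mm          : pixel spacing of the grid in mm.
    dose_tol            : dose-difference criterion as a fraction of the
                          reference maximum (global normalization).
    Returns the gamma map; points with gamma <= 1 pass the test."""
    ny, nx = dose_ref.shape
    dd = dose_tol * dose_ref.max()                     # absolute dose criterion
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    gamma = np.full_like(dose_ref, np.inf, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            ddiff2 = (dose_eval - dose_ref[iy, ix]) ** 2
            gamma[iy, ix] = np.sqrt(np.min(dist2 / dta_mm**2 + ddiff2 / dd**2))
    return gamma
```

    The passing rate is then simply `(gamma <= 1).mean()` over the evaluated points.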

  18. ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM

    Science.gov (United States)

    The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...

  19. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method of the steel rule, based on manual operation and reading, brings about low precision and low efficiency. A machine vision based verification system of the steel rule is designed referring to JJG1-1999 Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method of pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision of the verification regulation, but also improve the reliability and efficiency of the verification system.
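
    The pixel-equivalent calibration referred to above amounts to establishing a millimetre-per-pixel scale; the following sketch shows a generic version of that conversion (not the paper's specific method) applied to measured graduation positions.

```python
# Generic sketch of a pixel-equivalent calibration (mm per pixel): a reference
# gauge of known length is imaged, and the resulting scale is used to convert
# measured graduation positions. Values are placeholders.

def pixel_equivalent(known_length_mm: float, measured_pixels: float) -> float:
    return known_length_mm / measured_pixels

def graduation_error_mm(pix_positions, nominal_spacing_mm, mm_per_pixel):
    """Deviation of each measured graduation interval from its nominal value."""
    return [(b - a) * mm_per_pixel - nominal_spacing_mm
            for a, b in zip(pix_positions, pix_positions[1:])]

k = pixel_equivalent(known_length_mm=50.0, measured_pixels=2500.0)   # 0.02 mm/px
print(graduation_error_mm([0.0, 50.2, 100.1, 149.8], 1.0, k))
```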

  20. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    Kim, S.-H.; Kim, K. K.; Chang, M. H.; Kang, C. S.; Park, G.-C.

    2002-01-01

    An advanced PWR with a rated thermal power of 330 MW has been developed at the Korea Atomic Energy Research Institute (KAERI) for a dual purpose: seawater desalination and electricity generation. The conceptual design of SMART (System-Integrated Modular Advanced ReacTor) with a desalination system was already completed in March of 1999. The basic design for the integrated nuclear desalination system is currently underway and will be finished by March of 2002. The SMART co-generation plant with the MED seawater desalination process is designed to supply forty thousand (40,000) tons of fresh water per day and ninety (90) MW of electricity to an area with a population of approximately one hundred thousand (100,000) or an industrialized complex. This paper describes advanced design features adopted in the SMART design and also introduces the design and engineering verification program. In the beginning stage of the SMART development, top-level requirements for safety and economics were imposed for the SMART design features. To meet the requirements, highly advanced design features enhancing the safety, reliability, performance, and operability are introduced in the SMART design. The SMART consists of proven KOFA (Korea Optimized Fuel Assembly), helical once-through steam generators, a self-controlled pressurizer, control element drive mechanisms, and main coolant pumps in a single pressure vessel. In order to enhance safety characteristics, innovative design features adopted in the SMART system are low core power density, large negative Moderator Temperature Coefficient (MTC), high natural circulation capability and integral arrangement to eliminate large break loss of coolant accident, etc. The progression of emergency situations into accidents is prevented with a number of advanced engineered safety features such as passive residual heat removal system, passive emergency core cooling system, safeguard vessel, and passive containment over-pressure protection. The preliminary

  1. A GIS support system for declaration and verification

    International Nuclear Information System (INIS)

    Poucet, A.; Contini, S.; Bellezza, F.

    2001-01-01

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declarations and verification activities. Information dealt with is that requested by both INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored into local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. Main characteristics of SIT are: Capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; Easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; Selected access to open source information via Internet; Capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; Capability of performing aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; Capability to easily add and run external models for e.g. material data accounting, completeness check, air dispersion models, material flow graph generation and to describe results in graphical form; Capability to use a Geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will be very soon interfaced with VERITY, a powerful Web search engine in order to allow open source information retrieval from geographical maps. The paper will describe the main features of SIT and the advantages of

  2. A GIS support system for declaration and verification

    Energy Technology Data Exchange (ETDEWEB)

    Poucet, A; Contini, S; Bellezza, F [European Commission, Joint Research Centre, Institute for Systems Informatics and Safety (ISIS), Ispra (Italy)

    2001-07-01

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declarations and verification activities. Information dealt with is that requested by both INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored into local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. Main characteristics of SIT are: Capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; Easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; Selected access to open source information via Internet; Capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; Capability of performing aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; Capability to easily add and run external models for e.g. material data accounting, completeness check, air dispersion models, material flow graph generation and to describe results in graphical form; Capability to use a Geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will be very soon interfaced with VERITY, a powerful Web search engine in order to allow open source information retrieval from geographical maps. The paper will describe the main features of SIT and the advantages of

  3. System design and verification process for LHC programmable trigger electronics

    CERN Document Server

    Crosetto, D

    1999-01-01

    The rapid evolution of electronics has made it essential to design systems in a technology-independent form that will permit their realization in any future technology. This article describes two practical projects that have been developed for fast, programmable, scalable, modular electronics for the first-level trigger of Large Hadron Collider (LHC) experiments at CERN, Geneva. In both projects, one for the front-end electronics and the second for executing first-level trigger algorithms, the whole system requirements were constrained to two types of replicated components. The overall problem is described, the 3D-Flow design is introduced as a novel solution, and current solutions to the problem are described and compared with the 3D-Flow solution. The design/verification methodology proposed allows the user's real-time system algorithm to be verified down to the gate-level simulation on a technology-independent platform, thus yielding the design for a system that can be implemented with any technology at ...

  4. Verification and Validation of Flight-Critical Systems

    Science.gov (United States)

    Brat, Guillaume

    2010-01-01

    For the first time in many years, the NASA budget presented to Congress calls for a focused effort on the verification and validation (V&V) of complex systems. This is mostly motivated by the results of the VVFCS (V&V of Flight-Critical Systems) study, which should materialize as a concrete effort under the Aviation Safety program. This talk will present the results of the study, from requirements coming out of discussions with the FAA and the Joint Planning and Development Office (JPDO) to a technical plan addressing the issue, and its proposed current and future V&V research agenda, which will be addressed by NASA Ames, Langley, and Dryden as well as external partners through NASA Research Announcements (NRA) calls. This agenda calls for pushing V&V earlier in the life cycle and for taking advantage of formal methods to increase safety and reduce the cost of V&V. I will present the on-going research work (especially the four main technical areas: Safety Assurance, Distributed Systems, Authority and Autonomy, and Software-Intensive Systems), possible extensions, and how VVFCS plans on grounding the research in realistic examples, including an intended V&V test-bench based on an Integrated Modular Avionics (IMA) architecture and hosted by Dryden.

  5. Tracer verification and monitoring of containment systems (II)

    International Nuclear Information System (INIS)

    Williams, C.V.; Dunn, S.D.; Lowry, W.E.

    1997-01-01

    A tracer verification and monitoring system, SEAtrace™, has been designed and field tested which uses gas tracers to evaluate, verify, and monitor the integrity of subsurface barriers. This is accomplished using an automatic, rugged, autonomous monitoring system combined with an inverse optimization code. A gaseous tracer is injected inside the barrier and an array of wells outside the barrier is monitored. When the tracer gas is detected, a global optimization code is used to calculate the leak parameters, including leak size, location, and when the leak began. The multipoint monitoring system operates in real-time, can be used to measure both the tracer gas and soil vapor contaminants, and is capable of unattended operation for long periods of time (months). The global optimization code searches multi-dimensional "space" to find the best fit for all of the input parameters. These parameters include tracer gas concentration histories from multiple monitoring points, medium properties, barrier location, and the source concentration. SEAtrace™ does not attempt to model all of the nuances associated with multi-phase, multi-component flow; rather, the inverse code uses a simplistic forward model which can provide results which are reasonably accurate. The system has calculated leak locations to within 0.5 meters and leak radii to within 0.12 meters

  6. Multiscale observations of CO2, 13CO2, and pollutants at Four Corners for emission verification and attribution

    Science.gov (United States)

    Lindenmaier, Rodica; Dubey, Manvendra K.; Henderson, Bradley G.; Butterfield, Zachary T.; Herman, Jay R.; Rahn, Thom; Lee, Sang-Hyun

    2014-01-01

    There is a pressing need to verify air pollutant and greenhouse gas emissions from anthropogenic fossil energy sources to enforce current and future regulations. We demonstrate the feasibility of using simultaneous remote sensing observations of column abundances of CO2, CO, and NO2 to inform and verify emission inventories. We report, to our knowledge, the first ever simultaneous column enhancements in CO2 (3–10 ppm) and NO2 (1–3 Dobson Units), and evidence of δ13CO2 depletion in an urban region with two large coal-fired power plants with distinct scrubbing technologies that have resulted in ∆NOx/∆CO2 emission ratios that differ by a factor of two. Ground-based total atmospheric column trace gas abundances change synchronously and correlate well with simultaneous in situ point measurements during plume interceptions. Emission ratios of ∆NOx/∆CO2 and ∆SO2/∆CO2 derived from in situ atmospheric observations agree with those reported by in-stack monitors. Forward simulations using in-stack emissions agree with remote column CO2 and NO2 plume observations after fine scale adjustments. Both observed and simulated column ∆NO2/∆CO2 ratios indicate that a large fraction (70–75%) of the region is polluted. We demonstrate that the column emission ratios of ∆NO2/∆CO2 can resolve changes from day-to-day variation in sources with distinct emission factors (clean and dirty power plants, urban, and fires). We apportion these sources by using NO2, SO2, and CO as signatures. Our high-frequency remote sensing observations of CO2 and coemitted pollutants offer promise for the verification of power plant emission factors and abatement technologies from ground and space. PMID:24843169

  7. Development of NSSS Control System Performance Verification Tool

    International Nuclear Information System (INIS)

    Sohn, Suk Whun; Song, Myung Jun

    2007-01-01

    Thanks to many control systems and control components, the nuclear power plant can be operated safely and efficiently under transient conditions as well as steady-state conditions. If a fault or an error exists in a control system, the nuclear power plant may experience unwanted and unexpected transients. Therefore, the performance of these control systems and control components should be completely verified through power ascension tests during the startup period. However, there are many needs to replace control components, to modify control logic, or to change setpoints. In order to make such changes, it is important to be able to verify the performance of the changed control system without redoing the power ascension tests. Up to now, a simulation method with the computer codes used for the design of nuclear power plants was commonly applied to verify its performance. But if the hardware characteristics of the control system are changed, or the software in the control system has an unexpected fault or error, this simulation method is not effective for verifying the performance of the changed control system. Many tests related to V and V (Verification and Validation) are performed in the factory as well as in the plant to eliminate errors that might be introduced during hardware manufacturing or software coding. It turns out that these field tests and the simulation method are insufficient to guarantee the performance of a changed control system. Two unexpected transients that occurred during the YGN 5 and 6 startup period are good examples of this fact. One occurred at 50% reactor power and caused a reactor trip. The other occurred during the 70% loss of main feedwater pump test and caused an excess turbine runback

  8. Secure stand alone positive personnel identity verification system (SSA-PPIV)

    International Nuclear Information System (INIS)

    Merillat, P.D.

    1979-03-01

    The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility and the verification function is performed in the absence of security personnel. Security is primarily achieved by means of data encryption on a magnetic stripe badge. Several operational configurations are discussed. Advantages and disadvantages of this system compared to a central computer driven system are detailed

  9. Runtime verification of embedded real-time systems.

    Science.gov (United States)

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability, thus facilitating applications of the framework in both a prototyping and a post-deployment phase of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
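
    A direct, unoptimized rendering of the time-bounded Since semantics is sketched below for a discrete-time trace; the paper's contribution is far more efficient hardware observers, so this is only a reference implementation of the operator's meaning.

```python
# Minimal online monitor for the past-time bounded Since operator,
# phi1 S[a,b] phi2, in a discrete-time setting (reference semantics only).

from collections import deque

class BoundedSince:
    def __init__(self, a: int, b: int):
        self.a, self.b = a, b
        self.history = deque(maxlen=b + 1)   # most recent (phi1, phi2) pairs

    def step(self, phi1: bool, phi2: bool) -> bool:
        """Feed phi1 and phi2 at the current step; return whether phi1 S[a,b] phi2
        holds now, i.e. phi2 held at some step m between a and b steps ago and
        phi1 has held at every step after m up to and including now."""
        self.history.append((phi1, phi2))
        n = len(self.history) - 1            # index of "now" inside the window
        for lag in range(self.a, min(self.b, n) + 1):
            m = n - lag
            if self.history[m][1] and all(self.history[k][0] for k in range(m + 1, n + 1)):
                return True
        return False

mon = BoundedSince(a=0, b=3)
trace = [(True, False), (True, True), (True, False), (False, False), (True, False)]
print([mon.step(p1, p2) for p1, p2 in trace])   # [False, True, True, False, False]
```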

  10. System verification and validation report for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the Verification and Validation Report for the TMAD code system, which includes the TMAD code and the LIBMAKR Code. The TMAD code was commissioned to facilitate the interpretation of moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, can also be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code will then interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data. The primary purpose of this document is to provide the results of the system testing and the conclusions based thereon. The results of the testing process are documented in the body of the report. Appendix A gives the test plan, including test procedures, used in conducting the tests. Appendix B lists the input data required to conduct the tests, and Appendices C and D list the numerical results of the tests

  11. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Science.gov (United States)

    Javidi, Bahram

    1997-01-01

    Document fraud including unauthorized duplication of identification cards and credit cards is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially-available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured by a CCD camera) and a new hologram synthesized using commercially-available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the

  12. Quantitative dosimetric verification of an IMRT planning and delivery system

    International Nuclear Information System (INIS)

    Low, D.A.; Mutic, S.; Dempsey, J.F.; Gerber, R.L.; Bosch, W.R.; Perez, C.A.; Purdy, J.A.

    1998-01-01

    Background and purpose: The accuracy of dose calculation and delivery of a commercial serial tomotherapy treatment planning and delivery system (Peacock, NOMOS Corporation) was experimentally determined. Materials and methods: External beam fluence distributions were optimized and delivered to test treatment plan target volumes, including three with cylindrical targets with diameters ranging from 2.0 to 6.2 cm and lengths of 0.9 through 4.8 cm, one using three cylindrical targets and two using C-shaped targets surrounding a critical structure, each with different dose distribution optimization criteria. Computer overlays of film-measured and calculated planar dose distributions were used to assess the dose calculation and delivery spatial accuracy. A 0.125 cm³ ionization chamber was used to conduct absolute point dosimetry verification. Thermoluminescent dosimetry chips, a small-volume ionization chamber and radiochromic film were used as independent checks of the ion chamber measurements. Results: Spatial localization accuracy was found to be better than ±2.0 mm in the transverse axes (with one exception of 3.0 mm) and ±1.5 mm in the longitudinal axis. Dosimetric verification using single slice delivery versions of the plans showed that the relative dose distribution was accurate to ±2% within and outside the target volumes (in high dose and low dose gradient regions) with a mean and standard deviation for all points of -0.05% and 1.1%, respectively. The absolute dose per monitor unit was found to vary by ±3.5% of the mean value due to the lack of consideration for leakage radiation and the limited scattered radiation integration in the dose calculation algorithm. To deliver the prescribed dose, adjustment of the monitor units by the measured ratio would be required. Conclusions: The treatment planning and delivery system offered suitably accurate spatial registration and dose delivery of serial tomotherapy generated dose distributions. The quantitative dose

  13. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1999-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection system is developed. This method utilizes Colored Petri Net (CPN) for modeling and Prototype Verification System (PVS) for mathematical verification. In order to help flow-through from modeling by CPN to mathematical proof by PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2(Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  14. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection system is developed. This method utilizes Colored Petri Net (CPN) for modeling and Prototype Verification System (PVS) for mathematical verification. In order to help flow-through from modeling by CPN to mathematical proof by PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2(Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  15. Wu’s Characteristic Set Method for SystemVerilog Assertions Verification

    Directory of Open Access Journals (Sweden)

    Xinyan Gao

    2013-01-01

    Full Text Available We propose a verification solution based on the characteristic set of Wu’s method for SystemVerilog assertion checking of digital circuit systems. We define a suitable subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on the algebraic representations using the characteristic set of a polynomial system. This symbolic algebraic approach is a useful supplement to the existing verification methods based on simulation.
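
    The polynomial modeling idea can be illustrated on a toy circuit over GF(2); for brevity the sketch below checks the assertion by exhaustive evaluation rather than by the characteristic-set reduction of Wu's method, and the circuit and assertion are invented examples.

```python
# Illustration of polynomial modeling over GF(2): gates and an assertion are
# written as polynomials that must vanish on every consistent assignment.
# The check here is exhaustive evaluation, not Wu's characteristic-set method.

from itertools import product

# Circuit: z = x AND y, w = x XOR y. Over GF(2): AND -> x*y, XOR -> x + y.
circuit = [
    lambda v: (v["z"] + v["x"] * v["y"]) % 2,   # z + x*y     must be 0
    lambda v: (v["w"] + v["x"] + v["y"]) % 2,   # w + x + y   must be 0
]
# Assertion to check: "z implies x", encoded as the polynomial z*(x + 1),
# which must evaluate to 0 (over GF(2)) on every consistent assignment.
assertion = lambda v: (v["z"] * (v["x"] + 1)) % 2

def assertion_holds():
    for x, y, z, w in product((0, 1), repeat=4):
        v = {"x": x, "y": y, "z": z, "w": w}
        if all(p(v) == 0 for p in circuit) and assertion(v) != 0:
            return False
    return True

print(assertion_holds())   # True: whenever z = x AND y equals 1, x is 1
```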

  16. VERIFICATION OF THE FOOD SAFETY MANAGEMENT SYSTEM IN DEEP FROZEN FOOD PRODUCTION PLANT

    Directory of Open Access Journals (Sweden)

    Peter Zajác

    2010-07-01

    Full Text Available This work presents verification of the food safety management system of a deep-frozen food production plant. The main emphasis is on creating a set of verification questions for the articles of standard STN EN ISO 22000:2006 and on assessing the effectiveness of the food safety management system. Information was acquired from scientific literature sources, which pointed out the importance of implementing and maintaining an effective food safety management system. doi:10.5219/28

  17. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.

  18. Monte Carlo systems used for treatment planning and dose verification

    Energy Technology Data Exchange (ETDEWEB)

    Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany); Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama); Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)

    2017-04-15

    General-purpose radiation transport Monte Carlo codes have been used for estimation of the absorbed dose distribution in external photon and electron beam radiotherapy patients since several decades. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation time required. To take advantage of radiation transport Monte Carlo codes applied to routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and the simulation time required to compute a set of benchmark calculations. (orig.)

  19. Space Transportation System Availability Requirements and Its Influencing Attributes Relationships

    Science.gov (United States)

    Rhodes, Russel E.; Adams, Timothy C.

    2008-01-01

    It is essential that management and engineering understand the need for an availability requirement for the customer's space transportation system, as it enables meeting the customer's needs, goals, and objectives. There are three types of availability: operational availability, achieved availability, and inherent availability. The basic definition of availability is the mean uptime divided by the sum of the mean uptime plus the mean downtime. The major difference between the types is the inclusiveness of the functions within the mean downtime and the mean uptime. This paper addresses the inherent availability, for which the mean downtime covers only the mean time to repair, i.e., the time to determine the failed article, remove it, install a replacement article, and verify the functionality of the repaired system. The definitions of operational availability include the replacement hardware supply or maintenance delays and other non-design factors in the mean downtime. Also, with inherent availability the mean uptime considers only the mean time between failures that require repair for the system to remain functional (other availability definitions use the mean time between maintenance, both preventive and corrective). It is also essential that management and engineering understand how all influencing attributes relate to each other and to the resultant inherent availability requirement. This visibility will provide the decision makers with the understanding necessary to place constraints on the design definition for the major drivers that will determine the inherent availability, safety, reliability, maintainability, and the life cycle cost of the fielded system provided to the customer. This inherent availability requirement may be driven by the need to use a multiple launch approach to placing humans on the moon or the desire to control the number of spare parts required to support long stays in either orbit or on the surface of the moon or mars. It is
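
    The availability definition given above reduces to a one-line computation; the sketch below uses hypothetical MTBF and MTTR values purely for illustration.

```python
# Minimal sketch: inherent availability from the definitions in the abstract
# above. The numeric values are hypothetical, for illustration only.

def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Inherent availability = mean uptime / (mean uptime + mean downtime),
    where downtime covers only repair (fault isolation, removal, replacement,
    re-verification) and uptime is the mean time between failures."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

if __name__ == "__main__":
    mtbf = 500.0   # hypothetical mean time between failures, hours
    mttr = 8.0     # hypothetical mean time to repair, hours
    print(f"Inherent availability: {inherent_availability(mtbf, mttr):.4f}")  # 0.9843
```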

  20. 75 FR 38765 - Domestic Origin Verification System Questionnaire and Regulations Governing Inspection and...

    Science.gov (United States)

    2010-07-06

    ..., facility assessment services, certifications of quantity and quality, import product inspections, and... control number. These include export certification, inspection of section 8e import products, and...] Domestic Origin Verification System Questionnaire and Regulations Governing Inspection and Certification of...

  1. In pursuit of carbon accountability: the politics of REDD+ measuring, reporting and verification systems

    NARCIS (Netherlands)

    Gupta, A.; Lövbrand, E.; Turnhout, E.; Vijge, M.J.

    2012-01-01

    This article reviews critical social science analyses of carbon accounting and monitoring, reporting and verification (MRV) systems associated with reducing emissions from deforestation, forest degradation and conservation, sustainable use and enhancement of forest carbon stocks (REDD+). REDD+ MRV

  2. Comparative Analysis of Speech Parameters for the Design of Speaker Verification Systems

    National Research Council Canada - National Science Library

    Souza, A

    2001-01-01

    Speaker verification systems are basically composed of three stages: feature extraction, feature processing and comparison of the modified features from speaker voice and from the voice that should be...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: NEW CONDENSATOR, INC.--THE CONDENSATOR DIESEL ENGINE RETROFIT CRANKCASE VENTILATION SYSTEM

    Science.gov (United States)

    EPA's Environmental Technology Verification Program has tested New Condensator Inc.'s Condensator Diesel Engine Retrofit Crankcase Ventilation System. Brake specific fuel consumption (BSFC), the ratio of engine fuel consumption to the engine power output, was evaluated for engine...

  4. Results of verification of the automatic exposure control in RX equipment with CR systems

    International Nuclear Information System (INIS)

    Ruiz Manzano, P.; Rivas Ballarin, M. A.; Ortega Pardina, P.; Villa Gazulla, D.; Calvo Carrillo, S.; Canellas Anoz, M.; Millan Cebrian, E.

    2013-01-01

    Following the entry into force in 2012 of the new Spanish Radiology quality control protocol, this work lists and discusses the results obtained after verification of the automatic exposure control in computed radiography systems. (Author)

  5. RRB's SVES Input File - Post Entitlement State Verification and Exchange System (PSSVES)

    Data.gov (United States)

    Social Security Administration — Several PSSVES request files are transmitted to SSA each year for processing in the State Verification and Exchange System (SVES). This is a first step in obtaining...

  6. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDR's) generated by sensors on the S-NPP into calibrated geo-located Sensor Data Records (SDR's) and generates Mission Unique Products (MUPS). It also facilitates algorithm investigation, integration, checkouts and tuning, instrument and product calibration and data quality support, monitoring and data/products distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCT's) and Look-Up-Tables (LUT's) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGE's) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  7. Television system for verification and documentation of treatment fields during intraoperative radiation therapy

    International Nuclear Information System (INIS)

    Fraass, B.A.; Harrington, F.S.; Kinsella, T.J.; Sindelar, W.F.

    1983-01-01

    Intraoperative radiation therapy (IORT) involves direct treatment of tumors or tumor beds with large single doses of radiation. The verification of the area to be treated before irradiation and the documentation of the treated area are critical for IORT, just as for other types of radiation therapy. A television system which allows the target area to be directly imaged immediately before irradiation has been developed. Verification and documentation of treatment fields has made the IORT television system indispensable

  8. Soil attributes of a silvopastoral system in Pernambuco Forest Zone

    Directory of Open Access Journals (Sweden)

    Hugo N.B. Lima

    2018-01-01

    Full Text Available This research evaluated soil properties in a silvopastoral system using double rows of tree legumes. Treatments were signalgrass (Brachiaria decumbens) in monoculture or in consortium with sabiá (Mimosa caesalpiniifolia) or gliricidia (Gliricidia sepium). Treatments were arranged in a complete randomized block design, with 4 replications. Response variables included chemical characteristics and physical attributes of the soil. Silvopastoral systems had greater (P<0.001) soil exchangeable Ca (gliricidia = 3.2 and sabiá = 3.0 mmolc/dm3) than signalgrass monoculture (2.0 mmolc/dm3). Water infiltration rate was greater within the tree legume double rows (366 mm/h) than in signalgrass (162 mm/h) (P = 0.02). However, soil moisture was greater in signalgrass pastures (15.9%) (P = 0.0020) than in silvopastures (14.9 and 14.8%), where soil moisture levels increased as distance from the tree rows increased. Conversely, the light fraction of soil organic matter was greater within the tree legume double rows than in the grassed area (P = 0.0019). Long-term studies are needed to determine if these benefits accumulate further and the productivity benefits which result.

  9. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    Full Text Available With the development of biometric verification, we proposed a new algorithm and personal mobile sensor card system for ECG verification. The proposed new mean-interval approach can identify the user quickly with high accuracy and consumes a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application that overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also proposed a multiple-state solution to handle the heart rate changes caused by sport. It may be the first to address the issue of sport in ECG verification.
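
    The paper's mean-interval features are not detailed here, so the sketch below only illustrates the general idea: matching a mean beat-to-beat interval against enrolled templates, with one template per heart-rate state in the spirit of the multiple-state solution for sport. All values and tolerances are hypothetical.

```python
# Hedged sketch only: not the paper's exact mean-interval algorithm.
# It compares a measured mean R-R interval against per-state enrolled values.

def mean_rr_interval(r_peak_times_s):
    intervals = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    return sum(intervals) / len(intervals)

def verify(r_peak_times_s, templates, tolerance_s=0.05):
    """templates: dict state -> enrolled mean R-R interval in seconds.
    Accept if the measured mean interval matches any enrolled state."""
    measured = mean_rr_interval(r_peak_times_s)
    return any(abs(measured - t) <= tolerance_s for t in templates.values())

enrolled = {"rest": 0.82, "exercise": 0.48}              # hypothetical user template
print(verify([0.0, 0.80, 1.62, 2.45, 3.27], enrolled))   # True (mean ~0.82 s)
```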

  10. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Full Text Available Testing is a de-facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness, and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault-detection.

  11. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.

  12. Verification and synthesis of optimal decision strategies for complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Summers, S. J.

    2013-07-01

    Complex systems make a habit of disagreeing with the mathematical models strategically designed to capture their behavior. A recursive process ensues where data is used to gain insight into the disagreement. A simple model may give way to a model with hybrid dynamics. A deterministic model may give way to a model with stochastic dynamics. In many cases, the modeling framework that sufficiently characterises the system is both hybrid and stochastic; these systems are referred to as stochastic hybrid systems. This dissertation considers the stochastic hybrid system framework for modeling complex systems and provides mathematical methods for analysing, and synthesizing decision laws for, such systems. We first propose a stochastic reach-avoid problem for discrete time stochastic hybrid systems. In particular, we present a dynamic programming based solution to a probabilistic reach-avoid problem for a controlled discrete time stochastic hybrid system. We address two distinct interpretations of the reach-avoid problem via stochastic optimal control. In the first case, a sum-multiplicative cost function is introduced along with a corresponding dynamic recursion that quantifies the probability of hitting a target set at some point during a finite time horizon, while avoiding an unsafe set at all preceding time steps. In the second case, we introduce a multiplicative cost function and a dynamic recursion that quantifies the probability of hitting a target set at the terminal time, while avoiding an unsafe set at all preceding time steps. In each case, optimal reach-avoid control policies are derived as the solution to an optimal control problem via dynamic programming. We next introduce an extension of the reach-avoid problem where we consider the verification of discrete time stochastic hybrid systems when there exists uncertainty in the reachability specifications themselves. A sum multiplicative cost function is introduced along with a corresponding dynamic recursion
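
    Specialized to a small finite-state MDP (rather than a general stochastic hybrid system), the sum-multiplicative reach-avoid recursion described above can be sketched as follows; the states, transition kernel and target/avoid sets are invented for illustration.

```python
import numpy as np

# Hedged sketch of the finite-horizon reach-avoid dynamic program on a finite
# MDP. V_N(x) = 1_T(x); V_k(x) = 1_T(x) + 1_{safe}(x) * max_a E[V_{k+1}(x')].

def reach_avoid_value(P, target, avoid, horizon):
    """P[a, x, y] = transition probability from x to y under action a.
    Returns V[x] = max over policies of the probability of hitting `target`
    within `horizon` steps while never entering `avoid` beforehand."""
    n_actions, n_states, _ = P.shape
    in_t = np.isin(np.arange(n_states), target).astype(float)
    safe = (1.0 - in_t) * (1.0 - np.isin(np.arange(n_states), avoid))
    V = in_t.copy()                                   # V_N(x) = 1_T(x)
    for _ in range(horizon):
        Q = np.einsum("axy,y->ax", P, V)              # expected next-step value
        V = in_t + safe * Q.max(axis=0)               # sum-multiplicative recursion
    return V

# Tiny 3-state example: state 2 is the target, state 1 is unsafe.
P = np.array([
    [[0.8, 0.1, 0.1], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]],   # action 0
    [[0.5, 0.4, 0.1], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]],   # action 1
])
print(reach_avoid_value(P, target=[2], avoid=[1], horizon=5))
```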

  13. Verification and synthesis of optimal decision strategies for complex systems

    International Nuclear Information System (INIS)

    Summers, S. J.

    2013-01-01

    Complex systems make a habit of disagreeing with the mathematical models strategically designed to capture their behavior. A recursive process ensues where data is used to gain insight into the disagreement. A simple model may give way to a model with hybrid dynamics. A deterministic model may give way to a model with stochastic dynamics. In many cases, the modeling framework that sufficiently characterises the system is both hybrid and stochastic; these systems are referred to as stochastic hybrid systems. This dissertation considers the stochastic hybrid system framework for modeling complex systems and provides mathematical methods for analysing, and synthesizing decision laws for, such systems. We first propose a stochastic reach-avoid problem for discrete time stochastic hybrid systems. In particular, we present a dynamic programming based solution to a probabilistic reach-avoid problem for a controlled discrete time stochastic hybrid system. We address two distinct interpretations of the reach-avoid problem via stochastic optimal control. In the first case, a sum-multiplicative cost function is introduced along with a corresponding dynamic recursion that quantifies the probability of hitting a target set at some point during a finite time horizon, while avoiding an unsafe set at all preceding time steps. In the second case, we introduce a multiplicative cost function and a dynamic recursion that quantifies the probability of hitting a target set at the terminal time, while avoiding an unsafe set at all preceding time steps. In each case, optimal reach-avoid control policies are derived as the solution to an optimal control problem via dynamic programming. We next introduce an extension of the reach-avoid problem where we consider the verification of discrete time stochastic hybrid systems when there exists uncertainty in the reachability specifications themselves. A sum multiplicative cost function is introduced along with a corresponding dynamic recursion

  14. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems

  15. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Energy Technology Data Exchange (ETDEWEB)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.

  16. Compositional Verification of Interlocking Systems for Large Stations

    DEFF Research Database (Denmark)

    Fantechi, Alessandro; Haxthausen, Anne Elisabeth; Macedo, Hugo Daniel dos Santos

    2017-01-01

    ... for networks of large size due to the exponential computation time and resources needed. Some recent attempts to address this challenge adopt a compositional approach, targeted to track layouts that are easily decomposable into sub-networks such that a route is almost fully contained in a sub-network: in this way granting the access to a route is essentially a decision local to the sub-network, and the interfaces with the rest of the network easily abstract away less interesting details related to the external world. Following up on previous work, where we defined a compositional verification method ... sub-networks that are independent at some degree. At this regard, we study how the division of a complex network into sub-networks, using stub elements to abstract all the routes that are common between sub-networks, may still guarantee compositionality of verification of safety properties.

  17. The attribute measurement technique

    International Nuclear Information System (INIS)

    MacArthur, Duncan W.; Langner, Diana; Smith, Morag; Thron, Jonathan; Razinkov, Sergey; Livke, Alexander

    2010-01-01

    Any verification measurement performed on potentially classified nuclear material must satisfy two seemingly contradictory constraints. First and foremost, no classified information can be released. At the same time, the monitoring party must have confidence in the veracity of the measurement. An information barrier (IB) is included in the measurement system to protect the potentially classified information while allowing sufficient information transfer to occur for the monitoring party to gain confidence that the material being measured is consistent with the host's declarations concerning that material. The attribute measurement technique incorporates an IB and addresses both concerns by measuring several attributes of the nuclear material and displaying unclassified results through green (indicating that the material does possess the specified attribute) and red (indicating that the material does not possess the specified attribute) lights. The attribute measurement technique has been implemented in the AVNG, an attribute measuring system described in other presentations at this conference. In this presentation, we will discuss four techniques used in the AVNG: (1) the IB, (2) the attribute measurement technique, (3) the use of open and secure modes to increase confidence in the displayed results, and (4) the joint design as a method for addressing both host and monitor needs.
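
    The red/green logic behind the information barrier can be illustrated with a toy threshold check in which only boolean results leave the barrier; the attribute names and thresholds below are generic examples, not the actual AVNG attribute set.

```python
# Illustrative sketch of red/green attribute logic behind an information
# barrier: only boolean pass/fail flags leave the barrier, never the measured
# values. Attribute names and thresholds are generic examples.

def evaluate_attributes(measured):
    """`measured` holds quantities computed inside the barrier.
    Returns only unclassified pass/fail flags ("green"/"red")."""
    checks = {
        "plutonium_present":    measured["pu_mass_g"] > 0.0,
        "mass_above_threshold": measured["pu_mass_g"] >= 500.0,      # example threshold
        "weapons_grade_ratio":  measured["pu240_to_pu239"] <= 0.1,   # example threshold
    }
    return {name: ("green" if ok else "red") for name, ok in checks.items()}

# Inside the barrier (these values are never displayed):
print(evaluate_attributes({"pu_mass_g": 812.0, "pu240_to_pu239": 0.06}))
```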

  18. A knowledge-base verification of NPP expert systems using extended Petri nets

    International Nuclear Information System (INIS)

    Kwon, Il Won; Seong, Poong Hyun

    1995-01-01

    The verification phase of a knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assumed error incidence to be limited to rule pairs only. In addition, we consider certainty factors in checking, because most knowledge bases have certainty factors

  19. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power-system with an organic working fluid. The technology verification phase's purposes are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported

  20. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    Full Text Available The signature of a person is one of the most popular and legally accepted behavioral biometrics that provides a secure means for verification and personal identification in many applications such as financial, commercial and legal transactions. The objective of the signature verification system is to classify between genuine and forged signatures that are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features; it contains diacritics, ligatures, and overlapping. Because no dynamic information is available from the Arabic signature’s writing process, it is more difficult to obtain high verification accuracy. This paper addresses the above difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve the verification accuracy. State-of-the-art techniques adopt the fuzzy set to describe the properties of the extracted features to handle a signature’s uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature’s features to deal with the ambiguity of questioned document examiner judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).
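
    The two error rates quoted above are straightforward to compute from labelled verifier scores, as in the sketch below; the scores and threshold are placeholders.

```python
# Minimal sketch of FAR/FRR computed from verifier scores on labelled genuine
# and forged signatures (higher score = more likely genuine). Placeholder data.

def far_frr(genuine_scores, forged_scores, threshold):
    """FAR: fraction of forgeries accepted; FRR: fraction of genuine rejected."""
    far = sum(s >= threshold for s in forged_scores) / len(forged_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

genuine = [0.91, 0.84, 0.76, 0.88, 0.69]
forged  = [0.35, 0.62, 0.48, 0.71, 0.22]
print(far_frr(genuine, forged, threshold=0.7))   # (0.2, 0.2)
```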

  1. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is a necessary activity in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach, recently studied for knowledge base verification, is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, which is a tool for editing, modeling, and simulating Colored Petri nets. This tool uses the Enhanced Colored Petri net as a modeling method. By applying this tool to the knowledge base of a nuclear power plant, it is noticed that it can successfully check most of the anomalies that can occur in a knowledge base

  2. Enabling the usage of UML in the verification of railway systems: The DAM-rail approach

    International Nuclear Information System (INIS)

    Bernardi, S.; Flammini, F.; Marrone, S.; Mazzocca, N.; Merseguer, J.; Nardone, R.; Vittorini, V.

    2013-01-01

    The need for integration of model-based verification into industrial processes has produced several attempts to define Model-Driven solutions implementing a unifying approach to system development. A recent trend is to implement tool chains supporting the developer both in the design phase and V and V activities. In this Model-Driven context, specific domains require proper modelling approaches, especially for what concerns RAM (Reliability, Availability, Maintainability) analysis and fulfillment of international standards. This paper specifically addresses the definition of a Model-Driven approach for the evaluation of RAM attributes in railway applications to automatically generate formal models. For this aim we extend the MARTE-DAM UML profile with concepts related to maintenance aspects and service degradation, and show that the MARTE-DAM framework can be successfully specialized for the railway domain. Model transformations are then defined to generate Repairable Fault Tree and Bayesian Network models from MARTE-DAM specifications. The whole process is applied to the railway domain in two different availability studies

  3. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    International Nuclear Information System (INIS)

    Joseph, Shijo; Sunderlin, William D; Verchot, Louis V; Herold, Martin

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed. (letter)

  4. The importance of instrumental, symbolic, and environmental attributes for the adoption of smart energy systems

    International Nuclear Information System (INIS)

    Noppers, Ernst H.; Keizer, Kees; Milovanovic, Marko; Steg, Linda

    2016-01-01

    The conceptual model on motivations to adopt sustainable innovations (Noppers et al., 2014) proved to be successful in explaining proxies of the adoption of sustainable innovations: positive evaluations of the utility (instrumental attributes), environmental impact (environmental attributes), and specifically the extent to which the innovation says something about a person (symbolic attributes) increased interest in and intention to adopt sustainable innovations. In this paper, we examined to what extent the evaluations of these three attributes can also explain the actual adoption of smart energy systems that facilitate sustainable energy use. Results showed that adopters of smart energy systems (who agreed to participate in a project in which these systems were tested) evaluated the symbolic attributes of these systems more positively than non-adopters (who did not participate in this project), while both groups did not differ in their evaluation of the instrumental and environmental attributes of smart energy systems. A logistic regression analysis indicated that only evaluations of the symbolic attributes explained actual adoption of smart energy systems. Policy could stress and enhance the symbolic attributes of sustainable innovations to encourage adoption. - Highlights: • What drives consumer adoption of a sustainable innovation? • Evaluation of its symbolic attributes explained adoption of smart energy systems. • Evaluations of its instrumental and environmental attributes did not explain adoption. • Policy could stress and enhance symbolic attributes of smart energy systems.
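    As a hedged illustration of the analysis described (a logistic regression of actual adoption on evaluations of instrumental, environmental and symbolic attributes), the sketch below fits such a model on synthetic data; the variable names, data and effect sizes are assumptions, not the study's dataset or results.

```python
# Logistic regression of adoption (0/1) on three attribute evaluations.
# Data are synthetic; only the modelling step mirrors the study design.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
instrumental = rng.normal(0, 1, n)
environmental = rng.normal(0, 1, n)
symbolic = rng.normal(0, 1, n)

# Simulate adoption driven mainly by symbolic evaluations (assumed effect).
p = 1 / (1 + np.exp(-(1.5 * symbolic)))
adopted = rng.binomial(1, p)

X = np.column_stack([instrumental, environmental, symbolic])
model = LogisticRegression().fit(X, adopted)
print(dict(zip(["instrumental", "environmental", "symbolic"], model.coef_[0])))
```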

  5. Data-driven property verification of grey-box systems by Bayesian experiment design

    NARCIS (Netherlands)

    Haesaert, S.; Van den Hof, P.M.J.; Abate, A.

    2015-01-01

    A measurement-based statistical verification approach is developed for systems with partly unknown dynamics. These grey-box systems are subject to identification experiments which, new in this contribution, enable accepting or rejecting system properties expressed in a linear-time logic. We employ a

  6. REQUIREMENT VERIFICATION AND SYSTEMS ENGINEERING TECHNICAL REVIEW (SETR) ON A COMMERCIAL DERIVATIVE AIRCRAFT (CDA) PROGRAM

    Science.gov (United States)

    2017-09-01

    Requirement Verification and Systems Engineering Technical Review (SETR) on a Commercial Derivative Aircraft (CDA) Program, by Theresa L. Thomas, September 2017. From the abstract: The Naval Air Systems Command (NAVAIR) systems engineering technical review (SETR) process does not

  7. Ongoing Work on Automated Verification of Noisy Nonlinear Systems with Ariadne

    NARCIS (Netherlands)

    Geretti, Luca; Bresolin, Davide; Collins, Pieter; Zivanovic Gonzalez, Sanja; Villa, Tiziano

    2017-01-01

    Cyber-physical systems (CPS) are hybrid systems that commonly consist of a discrete control part that operates in a continuous environment. Hybrid automata are a convenient model for CPS suitable for formal verification. The latter is based on reachability analysis of the system to trace its hybrid

  8. Digital system verification a combined formal methods and simulation framework

    CERN Document Server

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5

  9. Cooling Tower (Evaporative Cooling System) Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Boyd, Brian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lewis, Taylor [Colorado Energy Office, Denver, CO (United States)

    2017-12-05

    This measurement and verification (M and V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with cooling tower efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M and V plan, and details the procedures to use to determine water savings.
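    At its core, the savings determination the protocol formalises is a comparison of baseline water use with post-implementation water use over a comparable period. The sketch below shows that arithmetic with placeholder numbers; the protocol itself specifies how the baseline is established, normalised and documented.

```python
# Simplified water-savings calculation for a cooling-tower WCM.
# Values are placeholders; the M and V protocol defines baseline normalisation.

def water_savings(baseline_use_kgal, reporting_use_kgal):
    """Savings = baseline-period use minus reporting-period use."""
    return baseline_use_kgal - reporting_use_kgal

baseline = 1200.0    # kgal over the baseline period (placeholder)
reporting = 950.0    # kgal over the equivalent reporting period (placeholder)
print(f"Water savings: {water_savings(baseline, reporting):.0f} kgal")
```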

  10. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    done applying conventional methods where requirements and designs are described using natural language, diagrams and pseudo code, and the verification of requirements has been done by code inspection and non-exhaustive testing. These techniques are not sufficient, leading to errors and an ineffective...... for Strategic Research. The work is affiliated with a number of partners: DTU Compute, DTU Transport, DTU Management, DTU Fotonik, Bremen University, Banedanmark, Trafikstyrelsen, DSB, and DSB S-tog. More information about the RobustRails project is available at http://www.dtu.dk/subsites/robustrails/English.aspx...

  11. Automated Image Acquisition System for the Verification of Copper-Brass Seal Images

    International Nuclear Information System (INIS)

    Stringa, E.; Bergonzi, C.; Littmann, F.; ); Marszalek, Y.; Tempesta, S.; )

    2015-01-01

    This paper describes a system for the verification of copper-brass seals realized by JRC according to DG ENER requirements. DG ENER processes about 20,000 metal seals per year. The verification of metal seals consists in visually checking the identity of a removed seal. The identity of a copper-brass seal is defined by a random stain pattern realized by the seal producer together with random scratches engraved when the seals are initialized ('seal production'). In order to verify that a seal returned from the field is the expected one, its pattern is compared with an image taken during seal production. Formerly, seal initialization and verification were very laborious tasks, as seal pictures were acquired with a camera one by one in both the initialization and verification stages. During initialization, the Nuclear Safeguards technicians had to place new seals one by one under a camera and acquire the related reference images. During verification, the technician had to take used seals and place them one by one under a camera to take new pictures. The new images were presented to the technicians without any preprocessing, and the technicians had to recognize the seal. The new station described in this paper has an automated image acquisition system that allows seals to be processed easily in batches of 100. To simplify verification, software automatically centres and rotates the newly acquired seal image so that it overlaps exactly with the reference image acquired during the production phase. The new system significantly speeds up seal production and helps particularly with the demanding task of seal verification. As a large part of the seals is dealt with by a joint Euratom-IAEA team, the IAEA directly profits from this development. The new tool has been in routine use since mid 2013. (author)
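    The automatic centring and rotation of a newly acquired seal image onto its reference image is essentially an image-registration step. The sketch below shows one common way such an alignment can be done with OpenCV (ORB features plus a partial affine fit); the file names are placeholders and this is not the JRC implementation.

```python
# Align a newly acquired seal image to its reference image (rotation +
# translation + scale) using ORB features; file names are placeholders.
import cv2
import numpy as np

ref = cv2.imread("seal_reference.png", cv2.IMREAD_GRAYSCALE)
new = cv2.imread("seal_verification.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(1000)
kp_ref, des_ref = orb.detectAndCompute(ref, None)
kp_new, des_new = orb.detectAndCompute(new, None)

# Match descriptors of the new image against the reference image.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_new, des_ref), key=lambda m: m.distance)[:100]

src = np.float32([kp_new[m.queryIdx].pt for m in matches])
dst = np.float32([kp_ref[m.trainIdx].pt for m in matches])

# Estimate rotation/translation/scale and warp the new image onto the reference.
M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
aligned = cv2.warpAffine(new, M, (ref.shape[1], ref.shape[0]))
cv2.imwrite("seal_aligned.png", aligned)
```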

  12. The inverse method parametric verification of real-time embedded systems

    CERN Document Server

    André , Etienne

    2013-01-01

    This book introduces state-of-the-art verification techniques for real-time embedded systems, based on the inverse method for parametric timed automata. It reviews popular formalisms for the specification and verification of timed concurrent systems and, in particular, timed automata as well as several extensions such as timed automata equipped with stopwatches, linear hybrid automata and affine hybrid automata. The inverse method is introduced, and its benefits for guaranteeing robustness in real-time systems are shown. Then, it is shown how an iteration of the inverse method can solv

  13. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers write test benches covering the various verification stages of register-transfer level (RTL), gate level, and place and route. Writing the test benches is considerably time consuming and requires a great deal of effort to achieve satisfactory results. Furthermore, performing verification at each stage is a major bottleneck and demands many activities and much time. In addition, verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to the verification and testing of an FPGA-based I and C system in an NPP that simultaneously verifies all of the design modules using MATLAB/Simulink HDL co-simulation models, and discusses how this approach can facilitate the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks.

  14. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers write test benches covering the various verification stages of register-transfer level (RTL), gate level, and place and route. Writing the test benches is considerably time consuming and requires a great deal of effort to achieve satisfactory results. Furthermore, performing verification at each stage is a major bottleneck and demands many activities and much time. In addition, verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to the verification and testing of an FPGA-based I and C system in an NPP that simultaneously verifies all of the design modules using MATLAB/Simulink HDL co-simulation models, and discusses how this approach can facilitate the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks

  15. TLM.open: a SystemC/TLM Frontend for the CADP Verification Toolbox

    Directory of Open Access Journals (Sweden)

    Claude Helmstetter

    2014-04-01

    Full Text Available SystemC/TLM models, which are C++ programs, allow the simulation of embedded software before hardware low-level descriptions are available and are used as golden models for hardware verification. The verification of SystemC/TLM models is an important issue, since an error in the model can mislead the system designers or reveal an error in the specifications. An open-source simulator for SystemC/TLM is provided, but there are no tools for formal verification. In order to apply model checking to a SystemC/TLM model, a semantics for standard C++ code and for specific SystemC/TLM features must be provided. The usual approach relies on the translation of the SystemC/TLM code into a formal language for which a model checker is available. We propose another approach that suppresses the error-prone translation effort. Given a SystemC/TLM program, the transitions are obtained by executing the original code using g++ and an extended SystemC library, and we ask the user to provide additional functions to store the current model state. These additional functions generally represent less than 20% of the size of the original model, and they allow all CADP verification tools to be applied to the SystemC/TLM model itself.

  16. A Cache System Design for CMPs with Built-In Coherence Verification

    Directory of Open Access Journals (Sweden)

    Mamata Dalui

    2016-01-01

    Full Text Available This work reports an effective design of a cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for the verification of cache coherence in CMPs realizing a directory-based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA, referred to as single length cycle 2-attractor cellular automata (TACA), has been planted to detect inconsistencies in the cache line states of the processors' private caches. The TACA module captures the coherence status of the CMPs' cache system and memorizes any inconsistent recording of the cache line states during the processors' references to a memory block. Theory has been developed to empower a TACA to analyse the cache state updates and then settle to an attractor state, indicating a quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs' processor pool ensures better efficiency in determining the inconsistencies by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic shows that the overhead of the proposed coherence verification module is much smaller than that of conventional verification units and is insignificant with respect to the cost of the CMPs' cache system.

  17. Commissioning of a MOSFET in-vivo patient dose verification system

    International Nuclear Information System (INIS)

    Jenetsky, G.O.; Brown, R.L.

    2004-01-01

    Gy. 5 mm and 10 mm out of the field, MOSFETs and TLDs recorded the same dose. The paired t-test performed on the data taken from 11 separate lens measurements did not show any significant difference between the means at the 95% level of significance (left lens p=0.13, right lens 0.67). The negative y-intercept suggests that it should be added to the raw readings before they are converted to dose; this is important when measuring in low dose regions. The problem of a larger than expected variation between energies can easily be overcome by using individual calibration factors for each energy. For central beam and scatter doses, results between TLDs and MOSFETs varied by 1 cGy. The greatest variation of 6 cGy occurred at the beam edge, where the dose gradient is greatest. This can be attributed to TLD chips measuring over a 9 mm² area while a MOSFET measures over 0.04 mm². Multiple readings per patient are now being performed, relying on slight positional changes to provide a larger area for the MOSFETs to record 'point' measurements. After a careful commissioning procedure, the MOSFET 20 dose verification system provides reliable, consistent and accurate measurements equal to those of a TLD system. However, the MOSFETs can be prepared for use within seconds, give results of total dose within minutes, and can be re-used instantaneously, making them a far more efficient method of patient dose verification. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
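    Two practical points from the commissioning above, adding the (negative) y-intercept back to the raw reading and using an individual calibration factor per beam energy, can be expressed as a small conversion sketch; the sensitivity and offset values below are invented placeholders, not the commissioned calibration data.

```python
# Convert a raw MOSFET reading (mV) to dose (cGy) using a per-energy
# calibration factor and an offset correction; all numbers are placeholders.

CALIBRATION = {
    # beam energy: (sensitivity in mV per cGy, offset in mV added to the raw reading)
    "6MV":  (2.6, 1.5),
    "18MV": (2.4, 1.5),
}

def reading_to_dose(raw_mv: float, energy: str) -> float:
    sensitivity, offset = CALIBRATION[energy]
    return (raw_mv + offset) / sensitivity

print(f"{reading_to_dose(183.0, '6MV'):.1f} cGy")
```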

  18. A Formal Approach for the Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan; Kinder, Sebastian

    2011-01-01

    This paper describes a complete model-based development and verification approach for railway control systems. For each control system to be generated, the user makes a description of the application-specific parameters in a domain-specific language. This description is automatically transformed...

  19. An Improved Constraint-Based System for the Verification of Security Protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov [30]. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also allows to detect flaws associated to partial

  20. The dynamic flowgraph methodology as a safety analysis tool : programmable electronic system design and verification

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2002-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM, and

  1. Proceedings of the 7th International Workshop on Verification of Infinite-State Systems (INFINITY'05)

    DEFF Research Database (Denmark)

    2005-01-01

    The aim of the workshop is, to provide a forum for researchers interested in the development of mathematical techniques for the analysis and verification of systems with infinitely many states. Topics: Techniques for modeling and analysis of infinite-state systems; Equivalence-checking and model-...

  2. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  3. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  4. ATLANTIDES: An Architecture for Alert Verification in Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Crispo, Bruno; Etalle, Sandro

    2007-01-01

    We present an architecture designed for alert verification (i.e., to reduce false positives) in network intrusion-detection systems. Our technique is based on a systematic (and automatic) anomaly-based analysis of the system output, which provides useful context information regarding the network

  5. An Improved Constraint-based system for the verification of security protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also allows to detect flaws associated to partial runs

  6. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    AFRL-RV-PS-TR-2018-0008: Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Author: Norman Fitz-Coy; contract FA9453-15-1-0315, program element 62601F, project 4846, task PPM00015968, work unit EF125135.

  7. A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip

    Directory of Open Access Journals (Sweden)

    Tomás Grimm

    2018-05-01

    Full Text Available The high degree of miniaturization in the electronics industry has been, for several years, a driver pushing embedded systems into different fields and applications. One example is safety-critical systems, where the compactness of the form factor helps to reduce costs and allows for the implementation of new techniques. The automotive industry is a prime example of a safety-critical area with a great rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard, with the goal of guaranteeing a high level of dependability in the designs. Other areas in the safety-critical applications domain have similar standards. However, these standards are mostly guidelines, intended to make sure that designs reach the desired dependability level without giving explicit instructions. In the end, the success of a design in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as conformity to the standards, but as these are complex hardware designs, complete functional verification is a difficult task. Of the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and industry. We can divide them into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. Therefore, we conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

  8. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    Kwon, K. C.; Park, G. Y.

    2006-01-01

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are the preparation of the software planning documentation, verification of the software according to the software life cycle, software safety analysis, and software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of nuclear safety-critical software in a DRPS. (authors)

  9. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  10. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for design and verification of Field Programmable Gates Arrays (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C systems design, development, verification and validation (V and V) are discussed considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic design for FPGA-based NPP I and C systems are presented. Some safety-related features of implementation process are discussed. Corresponding development artifacts, related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms, used in FPGA-based reactor trip systems is proposed. The results of application of test-based techniques for assessment of FPGA electronic design algorithms for reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    Science.gov (United States)

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  12. Space Object and Light Attribute Rendering (SOLAR) Projection System

    Science.gov (United States)

    2017-05-08

    The report depicts the proposed Space Object and Light Attribute Rendering (SOLAR) projection system; the installation process is shown in Fig. 3 of the report. The SOLAR system comprises a dome that houses Digitairum's ... A fiberglass dome was erected to make the SOLAR system a self-contained facility; a calibration process was carried out to register ...; and separate software solutions were implemented to model the light transport processes involved in the imaging process.

  13. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  14. Selection and verification of safety parameters in safety parameter display system for nuclear power plants

    International Nuclear Information System (INIS)

    Zhang Yuangfang

    1992-02-01

    The method and results for safety parameter selection and its verification in the safety parameter display system of nuclear power plants are introduced. Based on safety analysis, overall safety is divided into six critical safety functions, and a set of safety parameters that can represent the degree of integrity of each function and the causes of change is strictly selected. The verification of the safety parameter selection is carried out from the viewpoint of applying the plant emergency procedures and in accident manoeuvres on a full-scale nuclear power plant simulator

  15. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available The analysis of the reliability and verification problems of existing, widely used electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools and the structure of its realization are described; they are based on a Simulator with the properties of continuous, non-decomposed, three-phase EPS simulation in real time and over an unlimited range with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.

  16. Towards the Formal Verification of a Distributed Real-Time Automotive System

    Science.gov (United States)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  17. Sociotechnical attributes of safe and unsafe work systems.

    Science.gov (United States)

    Kleiner, Brian M; Hettinger, Lawrence J; DeJoy, David M; Huang, Yuang-Hsiang; Love, Peter E D

    2015-01-01

    Theoretical and practical approaches to safety based on sociotechnical systems principles place heavy emphasis on the intersections between social-organisational and technical-work process factors. Within this perspective, work system design emphasises factors such as the joint optimisation of social and technical processes, a focus on reliable human-system performance and safety metrics as design and analysis criteria, the maintenance of a realistic and consistent set of safety objectives and policies, and regular access to the expertise and input of workers. We discuss three current approaches to the analysis and design of complex sociotechnical systems: human-systems integration, macroergonomics and safety climate. Each approach emphasises key sociotechnical systems themes, and each prescribes a more holistic perspective on work systems than do traditional theories and methods. We contrast these perspectives with historical precedents such as system safety and traditional human factors and ergonomics, and describe potential future directions for their application in research and practice. The identification of factors that can reliably distinguish between safe and unsafe work systems is an important concern for ergonomists and other safety professionals. This paper presents a variety of sociotechnical systems perspectives on intersections between social--organisational and technology--work process factors as they impact work system analysis, design and operation.

  18. Sociotechnical attributes of safe and unsafe work systems

    Science.gov (United States)

    Kleiner, Brian M.; Hettinger, Lawrence J.; DeJoy, David M.; Huang, Yuang-Hsiang; Love, Peter E.D.

    2015-01-01

    Theoretical and practical approaches to safety based on sociotechnical systems principles place heavy emphasis on the intersections between social–organisational and technical–work process factors. Within this perspective, work system design emphasises factors such as the joint optimisation of social and technical processes, a focus on reliable human–system performance and safety metrics as design and analysis criteria, the maintenance of a realistic and consistent set of safety objectives and policies, and regular access to the expertise and input of workers. We discuss three current approaches to the analysis and design of complex sociotechnical systems: human–systems integration, macroergonomics and safety climate. Each approach emphasises key sociotechnical systems themes, and each prescribes a more holistic perspective on work systems than do traditional theories and methods. We contrast these perspectives with historical precedents such as system safety and traditional human factors and ergonomics, and describe potential future directions for their application in research and practice. Practitioner Summary: The identification of factors that can reliably distinguish between safe and unsafe work systems is an important concern for ergonomists and other safety professionals. This paper presents a variety of sociotechnical systems perspectives on intersections between social–organisational and technology–work process factors as they impact work system analysis, design and operation. PMID:25909756

  19. 75 FR 4101 - Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior...

    Science.gov (United States)

    2010-01-26

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-05] Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY... Access, Authorization Form and Rules Of Behavior and User Agreement. OMB Approval Number: 2577-New. Form...

  20. Reliability program plan for the Kilowatt Isotope Power System (KIPS) technology verification phase

    International Nuclear Information System (INIS)

    1978-01-01

    This document is an integral part of the Kilowatt Isotope Power System (KIPS) Program Plan. It defines the KIPS Reliability Program Plan for the Technology Verification Phase and delineates the reliability assurance tasks that are to be accomplished by Sundstrand and its suppliers during the design, fabrication and testing of the KIPS

  1. Verification tests for remote controlled inspection system in nuclear power plants

    International Nuclear Information System (INIS)

    Kohno, Tadaaki

    1986-01-01

    With the increase in the number of nuclear power plants, the total radiation exposure dose accompanying inspection and maintenance work has tended to increase. The Japan Power Engineering and Inspection Corp. carried out verification tests of a practical power reactor automatic inspection system from November 1981 to March 1986, and this report describes how these verification tests were carried out. The tests covered equipment that is urgently required for reducing radiation exposure dose, has a high possibility of realization, and is important for ensuring the safety and reliability of plants, namely: an automatic ultrasonic flaw detector for the welded parts of bend pipes, an automatic disassembling and inspection system for the control rod driving mechanism, an automatic fuel inspection system, and automatic decontamination equipment for steam generator water chambers, primary system crud and radioactive gas in coolant. The results of the verification tests of this equipment were judged satisfactory; therefore, application to actual plants is possible. (Kako, I.)

  2. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliy Yu. Meltsov

    2012-05-01

    Full Text Available This paper presents the results of the development of one of the modules of a system for the verification of parallel algorithms, which is used to verify the inference engine. This module is designed to build the specification of requirements whose feasibility on the algorithm must be proved (tested).

  3. Preface of Special issue on Automated Verification of Critical Systems (AVoCS'14)

    NARCIS (Netherlands)

    Huisman, Marieke; van de Pol, Jaco

    2016-01-01

    AVoCS 2014, the 14th International Conference on Automated Verification of Critical Systems has been hosted by the University of Twente, and has taken place in Enschede, Netherlands, on 24–26 September, 2014. The aim of the AVoCS series is to contribute to the interaction and exchange of ideas among

  4. A method of knowledge base verification for nuclear power plant expert systems using extended Petri Nets

    International Nuclear Information System (INIS)

    Kwon, I. W.; Seong, P. H.

    1996-01-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. The verification phase of a knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assume error incidence to be limited to rule pairs only. In addition, we consider certainty factors in checking, because most knowledge bases have certainty factors. 8 refs., 2 figs., 4 tabs. (author)

  5. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  6. A new verification film system for routine quality control of radiation fields: Kodak EC-L.

    Science.gov (United States)

    Hermann, A; Bratengeier, K; Priske, A; Flentje, M

    2000-06-01

    The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. In this investigation 68% of 175 Kodak EC-L ap/pa films were judged "good", 18% were classified "moderate" and 14% "poor", whereas only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be "good". The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved when compared with standard portal films. The films could be read more accurately and the detection of set-up deviations was facilitated.

  7. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V and V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report on the results of the software qualification

  8. Safety verification of non-linear hybrid systems is quasi-decidable

    Czech Academy of Sciences Publication Activity Database

    Ratschan, Stefan

    2014-01-01

    Roč. 44, č. 1 (2014), s. 71-90 ISSN 0925-9856 R&D Projects: GA ČR GCP202/12/J060 Institutional support: RVO:67985807 Keywords: hybrid systems * safety verification * decidability * robustness Subject RIV: IN - Informatics, Computer Science Impact factor: 0.875, year: 2014

  9. Formal modeling and verification of systems with self-x properties

    OpenAIRE

    Reif, Wolfgang

    2006-01-01

    Formal modeling and verification of systems with self-x properties / Matthias Güdemann, Frank Ortmeier and Wolfgang Reif. - In: Autonomic and trusted computing : third international conference, ATC 2006, Wuhan, China, September 3-6, 2006 ; proceedings / Laurence T. Yang ... (eds.). - Berlin [u.a.] : Springer, 2006. - S. 38-47. - (Lecture notes in computer science ; 4158)

  10. A new verification film system for routine quality control of radiation fields: Kodak EC-L

    International Nuclear Information System (INIS)

    Hermann, A.; Bratengeier, K.; Priske, A.; Flentje, M.

    2000-01-01

    Background: The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. Material and Methods: For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. Results: In this investigation 68% of 175 Kodak EC-L ap/pa films were judged 'good', 18% were classified 'moderate' and 14% 'poor', but only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be 'good'. Conclusions: The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved when compared with standard portal films. The films could be read more accurately and the detection of set-up deviations was facilitated. (orig.)

  11. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun

    1999-01-01

    In this work, an automatic software verification method for a Nuclear Power Plant (NPP) protection system is developed. This method utilizes Colored Petri Nets (CPN) for system modeling and the Prototype Verification System (PVS) for mathematical verification. In order to support the flow from modeling in CPN to mathematical proof in PVS, an information extractor for CPN models has been developed in this work. In order to convert the extracted information to the PVS specification language, a translator has also been developed. The information extractor and the translator are programmed in ML, a higher-order functional language. This combined method has been applied to a protection system function of the Wolsung NPP SDS2 (Steam Generator Low Level Trip). As a result of this application, we could logically prove the completeness and consistency of the requirement. In short, through this work an axiom- or lemma-based analysis method for CPN models is newly suggested in order to complement CPN analysis methods, and a guideline for the use of formal methods is proposed in order to apply them to NPP software verification and validation. (author). 9 refs., 15 figs

  12. Application of plutonium inventory measurement system (PIMS) and temporary canister verification system (TCVS) at RRP

    International Nuclear Information System (INIS)

    Noguchi, Yoshihiko; Nakamura, Hironobu; Adachi, Hideto; Iwamoto, Tomonori

    2004-01-01

    In the U-Pu co-denitration area at the Rokkasho Reprocessing Plant (RRP), the Plutonium Inventory Measurement System (PIMS) and the Temporary Canister Verification System (TCVS) are installed to provide efficient and effective safeguards. PIMS measures the Pu quantity inside pipes and vessels installed in glove boxes by the total neutron counting method. PIMS consists of a total of 142 neutron detectors attached to the walls and tops of the glove boxes, and the count rates of the individual detectors are related to each other to calculate the Pu quantity in each process area. At present, inactive calibration using a Cf source has been completed. TCVS, on the other hand, measures the Pu quantity of canisters in the temporary storage by the coincidence counting method; it will be installed before the active test. These systems also provide a monitoring function as an additional measure. This paper describes the specification, performance and measurement principles of PIMS and TCVS. (author)
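    The abstract states that PIMS infers Pu quantity from total neutron count rates via a calibration established with a Cf source. As a hedged illustration only, the sketch below fits a simple linear calibration between summed count rates and known masses and applies it to a new measurement; the numbers and the linear form are assumptions, not the actual PIMS algorithm.

```python
# Illustrative linear calibration: summed neutron count rate -> Pu mass.
# Calibration points and the linear model are assumptions for illustration.
import numpy as np

count_rates = np.array([120.0, 260.0, 410.0, 560.0])   # counts/s (assumed calibration points)
pu_masses   = np.array([100.0, 200.0, 300.0, 400.0])   # g Pu (assumed known standards)

# Fit a first-order polynomial (straight line) to the calibration points.
slope, intercept = np.polyfit(count_rates, pu_masses, 1)

def estimate_mass(count_rate):
    return slope * count_rate + intercept

print(f"Estimated Pu mass: {estimate_mass(335.0):.0f} g")
```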

  13. Verification of the safety communication protocol in train control system using colored Petri net

    International Nuclear Information System (INIS)

    Chen Lijie; Tang Tao; Zhao Xianqiong; Schnieder, Eckehard

    2012-01-01

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and its modeling flexibility with respect to changes in system conditions, this paper proposes a compositional Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved to be safe by means of state space analysis: the dead markings are correct, there are no dead transitions, and the net is fair. Further analysis results have been obtained using a combined formal and simulation-based verification approach. Timed models for the open transmit system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also include the relevant timed and stochastic factors, such as time delay and packet loss, which may influence the time needed to establish the safety connection. The time to establish the safety connection in the normal state is checked by formal verification, and the establishment time under different packet-loss probabilities is then simulated. After verification, it is found that the time to establish the safety connection of the safety communication protocol satisfies the safety requirements.
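    The timed analysis described above (establishment time of the safety connection under different packet-loss probabilities) can be illustrated with a small Monte Carlo sketch in which each handshake message is retransmitted after a timeout when lost and the total establishment time is averaged over many runs. The number of messages, the delays and the timeout are illustrative assumptions, not ETCS parameters.

```python
# Monte Carlo sketch: safety-connection establishment time vs. packet loss.
# The 3-message handshake, delays and timeout are illustrative assumptions.
import random

def establishment_time(p_loss, n_messages=3, delay=0.1, timeout=1.0):
    """Total time to deliver each handshake message, retransmitting on loss."""
    total = 0.0
    for _ in range(n_messages):
        while random.random() < p_loss:   # message lost -> wait for timeout, retransmit
            total += timeout
        total += delay                    # successful transmission delay
    return total

for p in (0.0, 0.05, 0.1, 0.2):
    runs = [establishment_time(p) for _ in range(10000)]
    print(f"p_loss={p:.2f}  mean establishment time {sum(runs)/len(runs):.3f} s")
```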

  14. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  15. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  16. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  17. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan; FINAL

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS as well as prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISM Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation as committed to be complete by the end of FY-2000.

  18. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS as well as prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISM Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation as committed to be complete by the end of FY-2000.

  19. Tillage, fertilization systems and chemical attributes of a Paleudult

    Directory of Open Access Journals (Sweden)

    Evelyn Penedo Dorneles

    2015-02-01

    Tillage and fertilization methods may affect soil fertility. With the aim of assessing changes in soil chemical properties over a period of ten years, soil samples of a Paleudult were collected over nine seasons at three depths (0-5, 5-10 and 10-20 cm) and were chemically analyzed. Grain yield and nutrient export of two summer crops, soybean (Glycine max) and corn (Zea mays), were determined in a field experiment set in Eldorado do Sul, in the state of Rio Grande do Sul, Brazil. Three soil tillage systems were evaluated, conventional (CT), reduced (RT) and no-tillage (NT), combined with mineral (lime and fertilizers) and organic (poultry litter) fertilization. The no-tillage system stood out as compared to the others, especially in the surface layer, in terms of values of organic matter, soil pH, available phosphorus, cation exchange capacity and base saturation. Phosphorus content was higher under organic than mineral fertilization due to the criteria used for the establishment of fertilizer doses. Under organic fertilization, soil pH values were similar to those obtained in limed soil samples because of the cumulative effect of the organic fertilizer. Soybean yield was lower under NT in comparison to the RT and CT systems. Consequently, soybean grain exported a lower content of nutrients than maize grain. Maize yield was not affected by either tillage or fertilization systems.

  20. Simulation-based design process for the verification of ITER remote handling systems

    International Nuclear Information System (INIS)

    Sibois, Romain; Määttä, Timo; Siuko, Mikko; Mattila, Jouni

    2014-01-01

    Highlights: •Verification and validation process for ITER remote handling system. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around simulation lifecycle management system. •Verification and validation roadmap for digital modelling phase. -- Abstract: The work behind this paper takes place in the EFDA's European Goal Oriented Training programme on Remote Handling (RH) “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V and V) of ITER RH system requirements using digital mock-ups (DMU). The purpose of this project is to study and develop an efficient approach to using DMUs in the V and V process of ITER RH system design, utilizing a System Engineering (SE) framework. Complex engineering systems such as the ITER facilities lead to a substantial rise in cost when manufacturing full-scale prototypes. In the V and V process for ITER RH equipment, physical tests are a requirement to ensure the compliance of the system with the required operation. Therefore it is essential to verify the developed system virtually before starting the prototype manufacturing phase. This paper gives an overview of the current trends in using digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and the V and V processes, in order to increase their cost efficiency and reliability.

  1. Group attributional training as an effective approach to human resource development under team work systems.

    Science.gov (United States)

    Wang, Z M

    1994-07-01

    An experimental programme of group attributional training under a team work system was conducted as part of human resource development in Chinese industrial enterprises. One hundred and ten shopfloor employees participated in the study. Among them, 58 employees took part in a factorial-designed experiment to find out the effects of attributions on performance, and 52 employees in ten work groups participated in the group attributional training programme twice a week for two months. The results showed that the group attributional training was effective in modifying employees' attributional patterns and enhancing group performance and satisfaction. On the basis of the results, an attributional model of work motivation is proposed, and its theoretical and practical implications for human resource management are discussed.

  2. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA-modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one...

  3. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    Science.gov (United States)

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  4. Crop Evaluation System Optimization: Attribute Weights Determination Based on Rough Sets Theory

    Directory of Open Access Journals (Sweden)

    Ruihong Wang

    2017-01-01

    The present study is mainly a continuation of our previous study, which developed a crop evaluation system based on grey relational analysis. In that system, the determination of attribute weights directly affects the evaluation result. Attribute weights are usually ascertained from the decision-maker's experience. In this paper, we utilize rough set theory to calculate attribute significance and then combine it with the weight given by the decision-maker. This method comprehensively considers both subjective experience and the objective situation, and thus yields more reasonable results. Finally, based on this method, we improve the system, which is built on ASP.NET technology.
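
    The rough-set part of such a scheme can be sketched compactly: the significance of a condition attribute is the drop in the dependency degree of the decision attribute when that attribute is removed. The decision table, the expert weights and the simple product-then-normalize combination rule below are illustrative assumptions, not the paper's exact data or formula.

```python
# Rough-set attribute significance: sig(a) = gamma_C(D) - gamma_{C\{a}}(D),
# where gamma is the dependency degree computed from the positive region.
from collections import defaultdict

def partition(rows, attrs):
    """Group row indices by their values on the given attributes."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def dependency(rows, cond_attrs, dec_attr):
    """Fraction of objects lying in the positive region of D with respect to C."""
    pos = 0
    for block in partition(rows, cond_attrs):
        if len({rows[i][dec_attr] for i in block}) == 1:   # consistent block
            pos += len(block)
    return pos / len(rows)

def significance(rows, cond_attrs, dec_attr):
    full = dependency(rows, cond_attrs, dec_attr)
    return {a: full - dependency(rows, [b for b in cond_attrs if b != a], dec_attr)
            for a in cond_attrs}

# Toy crop-evaluation decision table (assumed values for illustration).
rows = [
    {"yield": "high", "quality": "good", "resist": "strong", "grade": "A"},
    {"yield": "high", "quality": "good", "resist": "weak",   "grade": "A"},
    {"yield": "low",  "quality": "good", "resist": "strong", "grade": "B"},
    {"yield": "low",  "quality": "poor", "resist": "weak",   "grade": "C"},
    {"yield": "high", "quality": "poor", "resist": "weak",   "grade": "B"},
]
cond = ["yield", "quality", "resist"]
sig = significance(rows, cond, "grade")

# Combine objective significance with subjective expert weights (assumed rule).
expert = {"yield": 0.5, "quality": 0.3, "resist": 0.2}
combined = {a: sig[a] * expert[a] for a in cond}
total = sum(combined.values()) or 1.0
final_weights = {a: v / total for a, v in combined.items()}
print(sig, final_weights)
```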

  5. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...
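
    The arithmetic at the core of such a dynamic verification is a comparison of the cyclic force amplitude indicated by the test machine with the amplitude derived from the strain-gauged elastic element in the load train. The signal shapes, numbers and tolerance value below are invented for the example and are not taken from the practice itself.

```python
# Sketch of a dynamic force amplitude error calculation (illustrative values only).
import numpy as np

def amplitude(signal):
    """Half of the peak-to-valley range of a steady cyclic force signal."""
    return (np.max(signal) - np.min(signal)) / 2.0

t = np.linspace(0.0, 1.0, 2000)
machine_force = 10.0 + 5.00 * np.sin(2 * np.pi * 20 * t)    # kN, machine readout
reference_force = 10.0 + 4.85 * np.sin(2 * np.pi * 20 * t)  # kN, strain-gauged element

err_percent = 100.0 * (amplitude(machine_force) - amplitude(reference_force)) \
              / amplitude(reference_force)
print(f"dynamic amplitude error = {err_percent:+.2f} %")
print("within tolerance" if abs(err_percent) <= 1.0 else "out of tolerance")
```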

  6. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data ... standardized railway control systems ERTMS/ETCS Level 2. Experiments showed that the method can be used for specification, verification and validation of systems of industrial size.

  7. The Consistency of Performance Management System Based on Attributes of the Performance Indicator: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Jan Zavadsky

    2014-07-01

    Purpose: The performance management system (PMS) is a metasystem over all business processes at the strategic and operational level. The effectiveness of the various management systems depends on many factors. One of them is the consistent definition of each system's elements. The main purpose of this study is to explore whether the performance management systems of the sample companies are consistent and how companies can create such a system. Consistency in this case is based on a homogeneous definition of the attributes relating to the performance indicator as a basic element of the PMS. Methodology: At the beginning, we used an affinity diagram that helped us to clarify and group various attributes of performance indicators. The main research results were achieved through an empirical study carried out in a sample of Slovak companies. The criterion for selection was the existence of certified management systems according to ISO 9001. Representativeness of the sample companies was confirmed by application of Pearson's chi-squared test (χ² test) with respect to the above standards. Findings: Drawing on a review of the literature, we defined four groups of attributes relating to the performance indicator: formal attributes, attributes of the target value, informational attributes and attributes of evaluation. The whole set contains 21 attributes. The consistency of the PMS is based not on a maximum or minimum number of attributes, but on the same type of attributes for each performance indicator used in the PMS at both the operational and strategic level. The main findings are: companies use various financial and non-financial indicators at the strategic or operational level; companies determine various attributes of the performance indicators, but most of the performance indicators are defined differently; we identified the common attributes for the whole sample of companies. Practical implications: The research results have implications for
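
    The representativeness check mentioned above can be sketched as a chi-squared goodness-of-fit test. The size-class breakdown, counts and population shares below are assumed for demonstration and are not the study's data.

```python
# Pearson's chi-squared goodness-of-fit test for sample representativeness (assumed data).
from scipy.stats import chisquare

observed = [18, 27, 15]                    # sample counts per company size class
population_share = [0.35, 0.45, 0.20]      # assumed shares in the certified population
expected = [sum(observed) * p for p in population_share]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {stat:.2f}, p = {p_value:.3f}")
print("sample may be considered representative" if p_value > 0.05
      else "representativeness is questionable")
```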

  8. Modeling and Verification of Dependable Electronic Power System Architecture

    Science.gov (United States)

    Yuan, Ling; Fan, Ping; Zhang, Xiao-fang

    The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems that generate, transmit, and distribute electric power. The complex interaction among subsystems makes the design of an electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, this incorporation makes the design of such systems even more complicated. We propose a dependable electronic power system architecture, which provides a generic framework to guide the development of electronic power systems and ease development complexity. In order to provide common idioms and patterns to system designers, we formally model the electronic power system architecture using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault tolerant properties of the architecture using the PVS theorem prover, which guarantees that the system architecture satisfies the high reliability requirements.

  9. Exploration of Scholars’ Attributions of Instructional Barriers in the Context of Pedagogical-Epistemological Belief Systems

    Directory of Open Access Journals (Sweden)

    Yılmaz SOYSAL

    2017-08-01

    In this study, the teaching barriers reported by two academics employed at a private university, and the attributional reasoning accompanying them, were examined in the context of the academics' belief systems with regard to learning, teaching and knowledge. The study was conducted from a basic qualitative perspective. By means of qualitative data gathering and analysis, the aim was to estimate how the relation between teaching barriers and attributional reasoning is influenced by the scholars' pedagogical-epistemological belief systems. Qualitative data were collected through two different semi-structured interview protocols, and the gathered data were analyzed in an inductive and interpretivist manner. The divergence of the scholars' belief systems (teacher-centered vs. learner-centered) made it possible to explore the presumable barrier-attribution relation in the context of pedagogical-epistemological belief systems. It was found that the scholar who had adopted more learner-centered modes of teaching favors more internally oriented, controllable and non-stable attributional styles in explaining her barrier-attribution relation. In contrast, the scholar who held more teacher-centered teaching beliefs tended to make attributions to overly external, non-controllable and stable factors. The major outcomes of the study are evaluated through psychological (i.e., attribution theory) and instructional (pedagogical-epistemological beliefs) lenses, and suggestions are offered in the context of higher education.

  10. Formal Abstractions for Automated Verification and Synthesis of Stochastic Systems

    NARCIS (Netherlands)

    Esmaeil Zadeh Soudjani, S.

    2014-01-01

    Stochastic hybrid systems involve the coupling of discrete, continuous, and probabilistic phenomena, in which the composition of continuous and discrete variables captures the behavior of physical systems interacting with digital, computational devices. Because of their versatility and generality,

  11. Abstractions for Fault-Tolerant Distributed System Verification

    Science.gov (United States)

    Pike, Lee S.; Maddalon, Jeffrey M.; Miner, Paul S.; Geser, Alfons

    2004-01-01

    Four kinds of abstraction for the design and analysis of fault tolerant distributed systems are discussed. These abstractions concern system messages, faults, fault masking voting, and communication. The abstractions are formalized in higher order logic, and are intended to facilitate specifying and verifying such systems in higher order theorem provers.

  12. Rigorous Verification for the Solution of Nonlinear Interval System ...

    African Journals Online (AJOL)

    We survey a general method for solving nonlinear interval systems of equations. In particular, we paid special attention to the computational aspects of linear interval systems since the bulk of computations are done during the stage of computing outer estimation of the including linear interval systems. The height of our ...

  13. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    Science.gov (United States)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for the Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the damage detection capabilities of the Flat Surface Damage Detection System (FSDDS) sensory panels and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.

  14. Advanced control and instrumentation systems in nuclear power plants. Design, verification and validation

    International Nuclear Information System (INIS)

    Haapanen, P.

    1995-01-01

    The Technical Committee Meeting on design, verification and validation of advanced control and instrumentation systems in nuclear power plants was held in Espoo, Finland, on 20-23 June 1994. The meeting was organized by the International Atomic Energy Agency's (IAEA) International Working Groups (IWG) on Nuclear Power Plant Control and Instrumentation (NPPCI) and on Advanced Technologies for Water Cooled Reactors (ATWR). VTT Automation, together with Imatran Voima Oy and Teollisuuden Voima Oy, was responsible for the practical arrangements of the meeting. In total 96 participants from 21 countries and the Agency took part in the meeting, and 34 full papers and 8 posters were presented. The following topics were covered in the papers: (1) experience with advanced and digital systems, (2) safety and reliability analysis, (3) advanced digital systems under development and implementation, (4) verification and validation methods and practices, (5) future development trends. (orig.)

  15. Sensitivity Verification of PWR Monitoring System Using Neuro-Expert For LOCA Detection

    International Nuclear Information System (INIS)

    Muhammad Subekti

    2009-01-01

    The present research verifies a previously developed method for Loss of Coolant Accident (LOCA) detection and performs simulations to determine the sensitivity of a PWR monitoring system that applies the neuro-expert method. Continuing earlier work, the neuro-expert method has been developed and tested for several anomaly detections in Nuclear Power Plants (NPP) of the Pressurized Water Reactor (PWR) type. The neuro-expert system can detect a LOCA anomaly down to a primary coolant leakage of 7 gallon/min, whereas the conventional method could not detect a primary coolant leakage of 30 gallon/min. The neuro-expert method also detects a LOCA anomaly significantly faster than the conventional system in the Surry-1 NPP, so that the impact risk is reduced. (author)
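
    The detection principle can be illustrated with a schematic residual check between a predicted and a measured primary make-up flow; this is only an illustration of the idea of a sensitivity threshold, not the neuro-expert system described in the record, and the predicted value, measured value and threshold are assumed numbers.

```python
# Schematic residual-based leak check (illustrative, not the paper's method).
GAL_PER_MIN_THRESHOLD = 7.0   # sensitivity reported for the neuro-expert method

def leak_alarm(measured_makeup_gpm, predicted_makeup_gpm,
               threshold_gpm=GAL_PER_MIN_THRESHOLD):
    """Raise a LOCA alarm when the unexplained make-up flow exceeds the threshold."""
    residual = measured_makeup_gpm - predicted_makeup_gpm
    return residual > threshold_gpm, residual

alarm, res = leak_alarm(measured_makeup_gpm=32.0, predicted_makeup_gpm=24.5)
print(f"residual = {res:.1f} gal/min, alarm = {alarm}")
```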

  16. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  17. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    CERN Document Server

    Parsons, J E

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  18. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    International Nuclear Information System (INIS)

    PARSONS, J.E.

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented

  19. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  20. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    Science.gov (United States)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demands in digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of the Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  1. Final Technical Report on Quantifying Dependability Attributes of Software Based Safety Critical Instrumentation and Control Systems in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Smidts, Carol; Huang, Fuqun; Li, Boyuan; Li, Xiang

    2016-01-01

    from expert opinion elicitation for each of the software dependability attributes. The measures extracted are presented in this chapter. Stage 4 (Chapter 5): Assessment of the coverage of the causal maps via measures. Coverage was assessed to determine whether the measures obtained were sufficient to quantify software dependability, and what measures are further required. Stage 5 (Chapter 6): Identification of 'missing' measures and measurement approaches for concepts not covered. New measures, for concepts that had not been covered sufficiently as determined in Stage 4, were identified using supplementary expert opinion elicitation as well as literature reviews. Stage 6 (Chapter 7): Building of a detailed quantification model based on the causal maps and measurements obtained. Ability to derive such a quantification model shows that the causal models and measurements derived from the previous stages (Stage 1 to Stage 5) can form the technical basis for developing dependability quantification models. Scope restrictions have led us to prioritize this demonstration effort. The demonstration was focused on a critical system, i.e. the reactor protection system. For this system, a ranking of the software dependability attributes by nuclear stakeholders was developed. As expected for this application, the stakeholder ranking identified safety as the most critical attribute to be quantified. A safety quantification model limited to the requirements phase of development was built. Two case studies were conducted for verification. A preliminary control gate for software safety for the requirements stage was proposed and applied to the first case study. The control gate allows a cost effective selection of the duration of the requirements phase.

  2. Environmental Technology Verification: Biological Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Systems--American Ultraviolet Corporation, DC24-6-120 [EPA600etv08005

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center (APCT Center) is operated by RTI International (RTI), in cooperation with EPA's National Risk Management Research Laboratory. The APCT Center conducts verifications of technologies that clean air in ventilation systems, inc...

  3. Final Technical Report on Quantifying Dependability Attributes of Software Based Safety Critical Instrumentation and Control Systems in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Smidts, Carol [The Ohio State Univ., Columbus, OH (United States); Huang, Funqun [The Ohio State Univ., Columbus, OH (United States); Li, Boyuan [The Ohio State Univ., Columbus, OH (United States); Li, Xiang [The Ohio State Univ., Columbus, OH (United States)

    2016-03-25

    elicitation for each of the software dependability attributes. The measures extracted are presented in this chapter. Stage 4 (Chapter 5): Assessment of the coverage of the causal maps via measures. Coverage was assessed to determine whether the measures obtained were sufficient to quantify software dependability, and what measures are further required. Stage 5 (Chapter 6): Identification of “missing” measures and measurement approaches for concepts not covered. New measures, for concepts that had not been covered sufficiently as determined in Stage 4, were identified using supplementary expert opinion elicitation as well as literature reviews. Stage 6 (Chapter 7): Building of a detailed quantification model based on the causal maps and measurements obtained. Ability to derive such a quantification model shows that the causal models and measurements derived from the previous stages (Stage 1 to Stage 5) can form the technical basis for developing dependability quantification models. Scope restrictions have led us to prioritize this demonstration effort. The demonstration was focused on a critical system, i.e. the reactor protection system. For this system, a ranking of the software dependability attributes by nuclear stakeholders was developed. As expected for this application, the stakeholder ranking identified safety as the most critical attribute to be quantified. A safety quantification model limited to the requirements phase of development was built. Two case studies were conducted for verification. A preliminary control gate for software safety for the requirements stage was proposed and applied to the first case study. The control gate allows a cost effective selection of the duration of the requirements phase.

  4. Active Learning of Markov Decision Processes for System Verification

    DEFF Research Database (Denmark)

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

    Manually constructing an accurate system model can be a resource-demanding process, and this shortcoming has motivated the development of algorithms for automatically learning system models from observed system behaviors. Recently, algorithms have been proposed for learning Markov decision process representations of reactive systems based on alternating sequences of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning deterministic Markov decision processes from data by actively guiding the selection of input actions. The algorithm is empirically analyzed by learning system models of slot machines, and it is demonstrated that the proposed active learning procedure can significantly reduce the amount of data required.
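
    The core idea of actively guiding input selection can be sketched as follows: in each visited state the learner exercises the input action it has tried least often and records the observed successor as the (deterministic) transition. The toy environment and action set below are stand-ins invented for illustration; the paper's actual algorithm and slot-machine case study are more elaborate.

```python
# Sketch of actively learning a deterministic Markov decision process.
from collections import defaultdict

class ToyEnv:
    """Hidden deterministic MDP that the learner interacts with (assumed example)."""
    def __init__(self):
        self.trans = {("s0", "a"): "s1", ("s0", "b"): "s0",
                      ("s1", "a"): "s0", ("s1", "b"): "s2",
                      ("s2", "a"): "s2", ("s2", "b"): "s0"}
        self.state = "s0"
    def step(self, action):
        self.state = self.trans[(self.state, action)]
        return self.state

def learn(env, actions=("a", "b"), steps=50):
    model = {}                              # learned (state, action) -> successor state
    tries = defaultdict(int)                # how often each pair has been exercised
    state = env.state
    for _ in range(steps):
        # Active choice: prefer the action tried least often in the current state.
        action = min(actions, key=lambda a: tries[(state, a)])
        nxt = env.step(action)
        tries[(state, action)] += 1
        model[(state, action)] = nxt
        state = nxt
    return model

print(learn(ToyEnv()))
```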

  5. International exchange on nuclear safety related expert systems: The role of software verification and validation

    International Nuclear Information System (INIS)

    Sun, B.K.H.

    1996-01-01

    An important lesson learned from the Three Mile Island accident is that human errors can be significant contributors to risk. Recent advancements in computer hardware and software technology have helped make expert system techniques potentially viable tools for improving nuclear power plant safety and reliability. As part of the general man-machine interface technology, expert systems have recently become increasingly prominent as a potential solution to a number of previously intractable problems in many phases of human activity, including operation, maintenance, and engineering functions. Traditional methods for testing and analyzing analog systems are no longer adequate to handle the increased complexity of software systems. The role of Verification and Validation (V and V) is to add rigor to the software development and maintenance cycle to guarantee the high level of confidence needed for applications. Verification includes the process and techniques for confirming that all the software requirements in one stage of the development are met before proceeding on to the next stage. Validation involves testing the integrated software and hardware system to ensure that it reliably fulfills its intended functions. Only through a comprehensive V and V program can a high level of confidence be achieved. There exist many different standards and techniques for software verification and validation, yet they lack uniform approaches that provide adequate levels of practical guidance for users in nuclear power plant applications. There is a need to unify the different approaches to software verification and validation and to develop practical and cost effective guidelines for user and regulatory acceptance. (author). 8 refs

  6. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithm).

  7. Seismic monitoring: a unified system for research and verifications

    International Nuclear Information System (INIS)

    Thigpen, L.

    1979-01-01

    A system for characterizing either a seismic source or geologic media from observational data was developed. This resulted from an examination of the forward and inverse problems of seismology. The system integrates many seismic monitoring research efforts into a single computational capability. Its main advantage is that it unifies computational and research efforts in seismic monitoring. 173 references, 9 figures, 3 tables

  8. Capturing Assumptions while Designing a Verification Model for Embedded Systems

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    A formal proof of a system correctness typically holds under a number of assumptions. Leaving them implicit raises the chance of using the system in a context that violates some assumptions, which in return may invalidate the correctness proof. The goal of this paper is to show how combining

  9. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    present experimental results on rather small systems with high complexity, primarily due to differences between best-case and worst-case execution times. Considering worst-case execution times only, the system becomes deterministic, and using a special version of Uppaal, where no history is saved, we ...

  10. Verification and validation of the safety parameter display system for nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Yuanfang

    1993-05-01

    During the design and development phase of the safety parameter display system for a nuclear power plant, a verification and validation (V and V) plan was implemented to improve the quality of the system design. The V and V activities, executed in the four stages of feasibility research, system design, code development, and system integration and regulation, are briefly introduced. The evaluation plan, the process of its implementation, and the conclusions of the final technical validation of this system are also presented in detail.

  11. Spaceport Command and Control System Automated Verification Software Development

    Science.gov (United States)

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.

  12. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, which is a probabilistic nuclear calculation code.

  13. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    International Nuclear Information System (INIS)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, which is a probabilistic nuclear calculation code.

  14. A Verification and Validation Tool for Diagnostic Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced diagnostic systems have the potential to improve safety, increase availability, and reduce maintenance costs in aerospace vehicle and a variety of other...

  15. Acquisition System Verification for Energy Efficiency Analysis of Building Materials

    Directory of Open Access Journals (Sweden)

    Natalia Cid

    2017-08-01

    Climate change and fossil fuel depletion foster interest in improving energy efficiency in buildings. There are different methods to achieve improved efficiency; one of them is the use of additives, such as phase change materials (PCMs). To prove this method's effectiveness, a building's behaviour should be monitored and analysed. This paper describes an acquisition system developed for monitoring buildings, based on Supervisory Control and Data Acquisition (SCADA) and with a 1-wire bus network as the communication system. The system is empirically tested to prove that it works properly. For this purpose, two experimental cubicles were made of self-compacting concrete panels, one of which has a PCM as an additive to improve its energy storage properties. Both cubicles have the same dimensions and orientation, and they are separated by six feet to avoid shadows. The behaviour of the PCM was observed with the acquisition system, achieving results that illustrate the differences between the cubicles directly related to the PCM's characteristics. Data collection devices included in the system were temperature sensors, some of which were embedded in the walls, as well as humidity sensors, heat flux density sensors, a weather station and energy counters. The analysis of the results shows agreement with previous studies of PCM addition; therefore, the acquisition system is suitable for this application.
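
    The kind of comparison such an acquisition system enables can be sketched as the daily indoor temperature swing in the PCM cubicle versus the reference cubicle. The read_temperature() function below is a hypothetical stand-in for the 1-wire/SCADA acquisition layer, and the synthetic daily cycle and damping values are assumptions, not measured data.

```python
# Sketch of comparing daily temperature swings between the two cubicles.
import math
import random

def read_temperature(cubicle, hour):
    """Hypothetical sensor read, replaced here by a simple synthetic daily cycle."""
    amplitude = 3.0 if cubicle == "pcm" else 6.0   # PCM damps the swing (assumed)
    return 24.0 + amplitude * math.sin(2 * math.pi * hour / 24.0) + random.gauss(0, 0.1)

def daily_swing(cubicle):
    temps = [read_temperature(cubicle, h) for h in range(24)]
    return max(temps) - min(temps)

print(f"reference cubicle swing: {daily_swing('reference'):.1f} °C")
print(f"PCM cubicle swing:       {daily_swing('pcm'):.1f} °C")
```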

  16. The application of coloured Petri nets to verification of distributed systems specified by message Sequence Charts

    OpenAIRE

    CHERNENOK S.A.; NEPOMNIASCHY V.A.

    2015-01-01

    The language of message sequence charts (MSC) is a popular scenario-based specification language used to describe the interaction of components in distributed systems. However, the methods for validation of MSC diagrams are underdeveloped. This paper describes a method for translation of MSC diagrams into coloured Petri nets (CPN). The method is applied to the property verification of these diagrams. The considered set of diagram elements is extended by the elements of UML sequence diagrams a...

  17. Quantum Mechanics and locality in the K0 K-bar0 system experimental verification possibilities

    International Nuclear Information System (INIS)

    Muller, A.

    1994-11-01

    It is shown that elementary Quantum Mechanics, applied to the K0 K-bar0 system, predicts peculiar long-range EPR correlations. Possible experimental verifications are discussed, and a concrete experiment with antiproton annihilations at rest is proposed. A pedestrian approach to local models shows that K0 K-bar0 experimentation could provide arguments to the local realism versus quantum theory controversy. (author). 17 refs., 23 figs.

  18. Verification of Security Policy Enforcement in Enterprise Systems

    Science.gov (United States)

    Gupta, Puneet; Stoller, Scott D.

    Many security requirements for enterprise systems can be expressed in a natural way as high-level access control policies. A high-level policy may refer to abstract information resources, independent of where the information is stored; it controls both direct and indirect accesses to the information; it may refer to the context of a request, i.e., the request’s path through the system; and its enforcement point and enforcement mechanism may be unspecified. Enforcement of a high-level policy may depend on the system architecture and the configurations of a variety of security mechanisms, such as firewalls, host login permissions, file permissions, DBMS access control, and application-specific security mechanisms. This paper presents a framework in which all of these can be conveniently and formally expressed, a method to verify that a high-level policy is enforced, and an algorithm to determine a trusted computing base for each resource.
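
    The enforcement check described above reduces, at its simplest, to verifying that the accesses actually permitted by the low-level configuration (per resource, along any request path) are a subset of what the high-level policy allows. The resources, principals and dictionaries below are toy examples, not the paper's formal framework.

```python
# Toy subset check: effective access derived from low-level mechanisms must not
# exceed what the high-level policy allows for each information resource.
high_level_policy = {
    "payroll-data": {"hr-staff"},                 # who may access the information
    "audit-log":    {"auditor", "admin"},
}

# Effective permissions derived from firewalls, file permissions, DBMS grants, ...
effective_access = {
    "payroll-data": {"hr-staff", "admin"},        # admin reaches it via a DB account
    "audit-log":    {"auditor", "admin"},
}

def verify_enforcement(policy, effective):
    violations = []
    for resource, allowed in policy.items():
        extra = effective.get(resource, set()) - allowed
        if extra:
            violations.append((resource, sorted(extra)))
    return violations

for resource, who in verify_enforcement(high_level_policy, effective_access):
    print(f"policy violated for {resource}: unauthorized principals {who}")
```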

  19. Verification of Continuous Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Wisniewski, Rafael

    2011-01-01

    This paper presents a method for abstracting continuous dynamical systems by timed automata. The abstraction is based on partitioning the state space of a dynamical system using positive invariant sets, which form cells that represent locations of a timed automaton. The abstraction is intended ... The partition is generated utilizing sub-level sets of Lyapunov functions, as they are positive invariant sets. It is shown that this partition generates sound and complete abstractions. Furthermore, the complete abstractions can be composed of multiple timed automata, allowing parallelization
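
    As a reminder of why sub-level sets of a Lyapunov function are suitable building blocks for such a partition, the following generic statement of standard Lyapunov reasoning can be noted; the symbols V, f and the levels c_i are generic notation, not taken verbatim from the paper.

```latex
% Positive invariance of Lyapunov sub-level sets and the induced cells
% (generic notation; not the paper's exact symbols).
\[
  \dot V(x) \;=\; \nabla V(x)\cdot f(x) \;\le\; 0 \ \ \forall x
  \quad\Longrightarrow\quad
  S_c = \{\, x \mid V(x) \le c \,\}\ \text{is positively invariant for } \dot x = f(x).
\]
\[
  \text{For levels } c_0 < c_1 < \dots < c_n, \text{ the cells }
  C_i = S_{c_i} \setminus S_{c_{i-1}}
  \text{ partition } S_{c_n} \setminus S_{c_0}.
\]
```

    Each cell can then serve as a location of the abstracting timed automaton, with clock bounds derived from the minimum and maximum time a trajectory can dwell in the cell.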

  20. Measurability and Safety Verification for Stochastic Hybrid Systems

    DEFF Research Database (Denmark)

    Fränzle, Martin; Hahn, Ernst Moritz; Hermanns, Holger

    2011-01-01

    The continuous-time behaviour is given by differential equations, as for usual hybrid systems, but the targets of discrete jumps are chosen by probability distributions. These distributions may be general measures on state sets. Also non-determinism is supported, and the latter is exploited in an abstraction and evaluation method that establishes safe upper bounds on reachability probabilities. To arrive there requires us to solve semantic intricacies as well as practical problems. In particular, we show that measurability of a complete system follows from the measurability of its constituent parts. On the practical side, ...

  1. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    Science.gov (United States)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, semi-active RFID watt-hour meters are applied to automatic verification pipelines and intelligent warehouse management. Covering the transmission system, the test and auxiliary systems, and the monitoring system, the approach realizes scheduling, binding, control and data exchange of watt-hour meters, among other functions, and achieves more accurate positioning, more efficient management and rapid data updates, with all information available at a glance. It effectively improves the quality, efficiency and automation of verification, and realizes more efficient data management and warehouse management.

  2. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    Science.gov (United States)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    In the article the process of verification (calibration) of oil metering units secondary equipment is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a software and hardware system that provides automated verification and calibration. The hardware part of this complex carries out the commutation of the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the introduced algorithm. The developed software allows controlling the commutation of channels, setting values on the calibrator, reading the measured data from the controller, calculating errors and compiling protocols. This system can be used for checking the controllers of the secondary equipment of the oil metering units in the automatic verification mode (with the open communication protocol) or in the semi-automatic verification mode (without it). The peculiar feature of the approach used is the development of a universal signal switch operating under software control, which can be configured for various verification methods (calibration), which allows to cover the entire range of controllers of metering units secondary equipment. The use of automatic verification with the help of a hardware and software system allows to shorten the verification time by 5-10 times and to increase the reliability of measurements, excluding the influence of the human factor.
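
    The automated verification loop described above can be sketched as follows: for each measuring channel the calibrator is set to a reference value, the controller reading is taken through the signal switch, and the error is logged into a protocol record. The driver functions set_calibrator(), select_channel() and read_controller() are hypothetical placeholders for the hardware interfaces, and the tolerance and stubbed readings are assumptions.

```python
# Hedged sketch of an automated verification (calibration) loop with protocol output.
_last_reference = 0.0

def set_calibrator(value_ma):            # hypothetical calibrator driver
    global _last_reference
    _last_reference = value_ma

def select_channel(channel):             # hypothetical universal signal switch driver
    pass

def read_controller(channel):            # hypothetical read from the verified controller
    return _last_reference + 0.003       # stub: reads back with a small fixed offset

def verify_channels(channels, reference_points_ma, tolerance_percent=0.1):
    protocol = []
    for ch in channels:
        select_channel(ch)
        for ref in reference_points_ma:
            set_calibrator(ref)
            measured = read_controller(ch)
            error_percent = 100.0 * (measured - ref) / ref
            protocol.append({"channel": ch, "reference_mA": ref,
                             "measured_mA": round(measured, 4),
                             "error_percent": round(error_percent, 3),
                             "pass": abs(error_percent) <= tolerance_percent})
    return protocol

for row in verify_channels(channels=[1, 2], reference_points_ma=[4.0, 12.0, 20.0]):
    print(row)
```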

  3. Specification styles in distributed systems design and verification

    NARCIS (Netherlands)

    Vissers, C.A.; Scollo, Giuseppe; van Sinderen, Marten J.; Brinksma, Hendrik

    1991-01-01

    Substantial experience with the use of formal specification languages in the design of distributed systems has shown that finding appropriate structures for formal specifications presents a serious, and often underestimated problem. Its solutions are of great importance for ensuring the quality of

  4. Verification of control system using inverter and canned motor pump

    International Nuclear Information System (INIS)

    Sawada, Yoshiaki; Misato, Hisashi

    2002-01-01

    Flow control of auxiliary systems at power stations is generally carried out using control valves (CVs), which are used in large numbers and in many varieties. CVs require periodic replacement of packing and similar maintenance, which demands considerable labor. Therefore, to reduce the maintenance of CVs, a system operating pumps under inverter control was investigated. When flow is controlled by an inverter, the valve at the pump outlet is kept fully open, but because the rotation speed is controlled to deliver only the required flow, no excess energy is consumed. Furthermore, since the energy consumed by a pump decreases with roughly the third power of the flow rate, a large energy-saving effect can be established. The selected canned motor pumps are seal-less, which improves reliability against leakage, and a monitor for bearing wear allows the periodic inspection interval to be extended. As a result of these investigations, it was concluded that a control system combining an inverter with a canned motor pump has characteristics equivalent to those of a control system using CVs. Furthermore, from test results in which dead time and a first-order lag element were added to the control characteristics, a forecast of its application to an actual machine could be obtained. (G.K.)

  5. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    Ocean flows, which are organized current systems, transport heat and salinity and cause water to pile up as a water surface ...

  6. Development and verification of symptom based emergency procedure support system

    International Nuclear Information System (INIS)

    Saijou, Nobuyuki; Sakuma, Akira; Takizawa, Yoji; Tamagawa, Naoko; Kubota, Ryuji; Satou, Hiroyuki; Ikeda, Koji; Taminami, Tatsuya

    1998-01-01

    A computerized Emergency Procedure Guideline (EPG) support system has been developed for BWRs and evaluated using a training simulator. It aims to enhance the effective utilization of the EPG. The system automatically identifies suitable symptom-based operating procedures for the present plant status. It has two functions: a plant status identification function and a man-machine interface function. For the former, a method that identifies and prioritizes suitable symptom-based operating procedures against the present plant status has been developed. As the man-machine interface, an operation flow chart display has been developed, which expresses the flow of the identified operating procedures graphically. For easy understanding of the display, important information such as plant status changes, the priority of operating procedures and the completion status of operations is displayed on the operation flow display in different colors. As an evaluation test, the response of the system to design basis accidents was evaluated by actual plant operators, using the training simulator at the BWR Training Center. Through the analysis of interviews and questionnaires given to the operators, it was shown that the system is effective and can be utilized for a real plant. (author)
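
    The identification-and-prioritization step can be sketched as matching the current plant symptoms against the entry conditions of each procedure and sorting the hits by priority. The symptom names, procedure identifiers and priorities below are invented for illustration and are not the EPG content used in the paper.

```python
# Toy sketch of symptom-based procedure identification and prioritization.
procedures = [
    {"id": "RPV-control",                   "priority": 1,
     "entry": {"reactor_level_low", "scram_initiated"}},
    {"id": "Containment-control",           "priority": 2,
     "entry": {"drywell_pressure_high"}},
    {"id": "Radioactivity-release-control", "priority": 3,
     "entry": {"stack_radiation_high"}},
]

def applicable_procedures(plant_symptoms):
    """Return procedures whose entry conditions are all satisfied, ordered by priority."""
    hits = [p for p in procedures if p["entry"] <= plant_symptoms]
    return sorted(hits, key=lambda p: p["priority"])

status = {"reactor_level_low", "scram_initiated", "drywell_pressure_high"}
for proc in applicable_procedures(status):
    print(proc["priority"], proc["id"])
```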

  7. Formal Verification of the Danish Railway Interlocking Systems

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    in the new Danish interlocking systems. Instantiating the generic model with interlocking configuration data results in a concrete model and high-level safety properties. Using bounded model checking and inductive reasoning, we are able to verify safety properties for model instances corresponding to railway...

  8. Characterization of a dose verification system dedicated to radiotherapy treatments based on a silicon detector multi-strips

    International Nuclear Information System (INIS)

    Bocca, A.; Cortes Giraldo, M. A.; Gallardo, M. I.; Espino, J. M.; Aranas, R.; Abou Haidar, Z.; Alvarez, M. A. G.; Quesada, J. M.; Vega-Leal, A. P.; Perez Neto, F. J.

    2011-01-01

    In this paper, we present the characterization of a multi-strip silicon detector (SSSSD: Single Sided Silicon Strip Detector), developed by the company Micron Semiconductors Ltd., for use as a verification system for radiotherapy treatments.

  9. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend Frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage—110 kV, tube current—280 mA, pixel size—0.5 × 0.5, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocking in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurements in comparison with the TPS-calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions on the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film.
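
    The gamma index quoted in the results combines a dose-difference criterion with a distance-to-agreement criterion. A minimal one-dimensional sketch of a global gamma pass-rate calculation is given below; the profiles, grid spacing, and the 1%/1 mm criteria are illustrative assumptions, not the film data of the study:

        import numpy as np

        def gamma_pass_rate(measured, calculated, spacing_mm, dose_crit=0.01, dist_crit_mm=1.0):
            """Global 1D gamma analysis: percentage of measured points with gamma <= 1."""
            measured = np.asarray(measured, float)
            calculated = np.asarray(calculated, float)
            dose_tol = dose_crit * calculated.max()          # global dose criterion
            positions = np.arange(len(calculated)) * spacing_mm
            gammas = []
            for x_m, d_m in zip(positions, measured):
                dose_term = (d_m - calculated) / dose_tol
                dist_term = (x_m - positions) / dist_crit_mm
                gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
            return 100.0 * np.mean(np.array(gammas) <= 1.0)

        # Illustrative profiles on a 0.25 mm grid (not the paper's film data).
        x = np.arange(0, 40, 0.25)
        calc = np.exp(-((x - 20) / 6.0) ** 2)
        meas = np.exp(-((x - 20.3) / 6.0) ** 2) * 1.005
        print(gamma_pass_rate(meas, calc, 0.25))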

  10. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    International Nuclear Information System (INIS)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor

    2013-01-01

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend Frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage—110 kV, tube current—280 mA, pixel size—0.5 × 0.5, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocking in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurements in comparison with the TPS-calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions on the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film

  11. SU-F-T-440: The Feasibility Research of Checking Cervical Cancer IMRT Pre-Treatment Dose Verification by Automated Treatment Planning Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X; Yin, Y; Lin, X [Shandong Cancer Hospital and Institute, China, Jinan, Shandong (China)

    2016-06-15

    Purpose: To assess the preliminary feasibility of an automated treatment planning verification system for cervical cancer IMRT pre-treatment dose verification. Methods: Clinical IMRT treatment planning data were selected at random for twenty patients with cervical cancer. All IMRT plans used 7 fields to meet the dosimetric goals and were created with a commercial treatment planning system (Pinnacle Version 9.2 and Eclipse Version 13.5). The plans were exported to the Mobius 3D (M3D) server, and percentage differences in region-of-interest (ROI) volume and in the calculated dose to the target and organs at risk were evaluated in order to validate the accuracy of the automated treatment planning verification system. Results: The ROI volume differences between Pinnacle and M3D were smaller than those between Eclipse and M3D; the largest differences were 0.22 ± 0.69% and 3.5 ± 1.89% for Pinnacle and Eclipse, respectively. M3D showed slightly better agreement with the TPS in the dose to the target and organs at risk. After recalculating the plans with M3D, the dose differences for Pinnacle were on average smaller than those for Eclipse, with results within 3%. Conclusion: Using the automated treatment planning verification system to validate plan accuracy is convenient, but more clinical cases are needed to establish the expected range of differences. At present, it should be used as a secondary check tool to improve safety in clinical treatment planning.

  12. Formal Verification Method for Configuration of Integrated Modular Avionics System Using MARTE

    Directory of Open Access Journals (Sweden)

    Lisong Wang

    2018-01-01

    Full Text Available The configuration information of an Integrated Modular Avionics (IMA) system includes almost all details of the whole system architecture, which is used to configure the hardware interfaces, operating system, and interactions among applications so that an IMA system works correctly and reliably. It is very important to ensure the correctness and integrity of the configuration in the IMA system design phase. In this paper, we focus on modelling and verification of the configuration information of an IMA/ARINC653 system based on MARTE (Modelling and Analysis of Real-Time and Embedded Systems). Firstly, we define a semantic mapping from key concepts of the configuration (such as modules, partitions, memory, processes, and communications) to MARTE model elements and propose a method for model transformation between XML-formatted configuration information and MARTE models. Then we present a formal verification framework for ARINC653 system configuration based on theorem-proving techniques, including the construction of corresponding REAL theorems according to the semantics of the key components of the configuration information and the formal verification of theorems for properties of the IMA system, such as time constraints, spatial isolation, and health monitoring. After that, the specific issue of schedulability analysis of an ARINC653 system is studied. We design a hierarchical scheduling strategy that takes the characteristics of the ARINC653 system into account, and the scheduling analyzer MAST-2 is used to perform the hierarchical schedule analysis. Lastly, we design a prototype tool, called Configuration Checker for ARINC653 (CC653), and two case studies show that the methods proposed in this paper are feasible and efficient.
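
    As a rough illustration of the kind of configuration property the paper formalizes with MARTE models and REAL theorems, the sketch below checks two simple requirements, non-overlapping partition scheduling windows (temporal isolation) and non-overlapping partition memory regions (spatial isolation), for a hypothetical XML configuration layout; the element and attribute names are assumptions, not the ARINC653 schema:

        import xml.etree.ElementTree as ET

        def check_partition_config(xml_text):
            """Toy checks for temporal and spatial isolation of ARINC653-style partitions."""
            root = ET.fromstring(xml_text)
            windows = [(float(w.get("start")), float(w.get("duration")), w.get("partition"))
                       for w in root.iter("Window")]
            regions = [(int(m.get("base"), 16), int(m.get("size")), m.get("partition"))
                       for m in root.iter("Memory")]
            errors = []
            # Temporal isolation: scheduling windows must not overlap.
            windows.sort()
            for (s1, d1, p1), (s2, d2, p2) in zip(windows, windows[1:]):
                if s1 + d1 > s2:
                    errors.append(f"window overlap: {p1} and {p2}")
            # Spatial isolation: memory regions must not overlap.
            regions.sort()
            for (b1, sz1, p1), (b2, sz2, p2) in zip(regions, regions[1:]):
                if b1 + sz1 > b2:
                    errors.append(f"memory overlap: {p1} and {p2}")
            return errors

        cfg = """<Module>
                   <Window partition="P1" start="0" duration="25"/>
                   <Window partition="P2" start="20" duration="25"/>
                   <Memory partition="P1" base="0x1000" size="4096"/>
                   <Memory partition="P2" base="0x1800" size="4096"/>
                 </Module>"""
        print(check_partition_config(cfg))   # reports both a window and a memory overlap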

  13. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems are being developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that suggest traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, that provides a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct
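
    COKEP itself works on an extended Petri net model; as a much simpler stand-in for the kinds of anomalies it targets, the sketch below scans a propositional rule base for redundant and directly contradictory rules. The rule representation is an assumption made purely for illustration:

        # Each rule: (frozenset of premise literals, conclusion literal); "~p" negates p.
        def check_rule_base(rules):
            """Flag redundant rule pairs and rule pairs whose conclusions contradict."""
            issues = []
            for i, (prem_i, concl_i) in enumerate(rules):
                for j, (prem_j, concl_j) in enumerate(rules):
                    if i >= j:
                        continue
                    if prem_i == prem_j and concl_i == concl_j:
                        issues.append(("redundant", i, j))
                    neg = concl_j[1:] if concl_j.startswith("~") else "~" + concl_j
                    if prem_i == prem_j and concl_i == neg:
                        issues.append(("inconsistent", i, j))
            return issues

        rules = [({"high_pressure"}, "open_valve"),
                 ({"high_pressure"}, "~open_valve"),   # contradicts the first rule
                 ({"high_pressure"}, "open_valve")]    # redundant duplicate
        rules = [(frozenset(p), c) for p, c in rules]
        print(check_rule_base(rules))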

  14. Nondestructive verification and assay systems for spent fuels

    International Nuclear Information System (INIS)

    Cobb, D.D.; Phillips, J.R.; Bosler, G.E.; Eccleston, G.W.; Halbig, J.K.; Hatcher, C.R.; Hsue, S.T.

    1982-04-01

    This is an interim report of a study concerning the potential application of nondestructive measurements on irradiated light-water-reactor (LWR) fuels at spent-fuel storage facilities. It describes nondestructive measurement techniques and instruments that can provide useful data for more effective in-plant nuclear materials management, better safeguards and criticality safety, and more efficient storage of spent LWR fuel. In particular, several nondestructive measurement devices are already available so that utilities can implement new fuel-management and storage technologies for better use of existing spent-fuel storage capacity. The design of an engineered prototype in-plant spent-fuel measurement system is approx. 80% complete. This system would support improved spent-fuel storage and also efficient fissile recovery if spent-fuel reprocessing becomes a reality

  15. Simulated coal gas MCFC power plant system verification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-30

    The objective of the main project is to identify the current developmental status of MCFC systems and address those technical issues that need to be resolved to move the technology from its current status to the demonstration stage in the shortest possible time. The specific objectives are separated into five major tasks as follows: Stack research; Power plant development; Test facilities development; Manufacturing facilities development; and Commercialization. This Final Report discusses the M-C Power Corporation effort, which is part of a general program for the development of commercial MCFC systems. This final report covers the entire subject of the Unocal 250-cell stack. Certain project activities have been funded by organizations other than DOE and are included in this report to provide a comprehensive overview of the work accomplished.

  16. Parameterized Verification of Graph Transformation Systems with Whole Neighbourhood Operations

    OpenAIRE

    Delzanno, Giorgio; Stückrath, Jan

    2014-01-01

    We introduce a new class of graph transformation systems in which rewrite rules can be guarded by universally quantified conditions on the neighbourhood of nodes. These conditions are defined via special graph patterns which may be transformed by the rule as well. For the new class for graph rewrite rules, we provide a symbolic procedure working on minimal representations of upward closed sets of configurations. We prove correctness and effectiveness of the procedure by a categorical presenta...

  17. Fault diagnosis for discrete event systems: Modelling and verification

    International Nuclear Information System (INIS)

    Simeu-Abazi, Zineb; Di Mascolo, Maria; Knotek, Michal

    2010-01-01

    This paper proposes an effective way for diagnosis of discrete-event systems using a timed-automaton. It is based on the model-checking technique, thanks to time analysis of the timed model. The paper proposes a method to construct all the timed models and details the different steps used to obtain the diagnosis path. A dynamic model with temporal transitions is proposed in order to model the system. By 'dynamical model', we mean an extension of timed automata for which the faulty states are identified. The model of the studied system contains the faultless functioning states and all the faulty states. Our method is based on the backward exploitation of the dynamic model, where all possible reverse paths are searched. The reverse path is the connection of the faulty state to the initial state. The diagnosis method is based on the coherence between the faulty occurrence time and the reverse path length. A real-world batch process is used to demonstrate the modelling steps and the proposed backward time analysis method to reach the diagnosis results.
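
    The backward exploitation described above can be pictured as a search of the reversed transition relation from an identified faulty state back to the initial state, collecting candidate reverse paths; a minimal untimed sketch of that search follows, in which the model encoding is assumed and the coherence check against the fault occurrence time is omitted:

        from collections import deque

        def reverse_paths(transitions, initial, faulty):
            """Enumerate paths from a faulty state back to the initial state
            by exploring the reversed transition relation breadth-first."""
            reverse = {}
            for src, dst in transitions:
                reverse.setdefault(dst, []).append(src)
            paths, queue = [], deque([[faulty]])
            while queue:
                path = queue.popleft()
                state = path[-1]
                if state == initial:
                    paths.append(list(reversed(path)))
                    continue
                for pred in reverse.get(state, []):
                    if pred not in path:          # avoid cycles
                        queue.append(path + [pred])
            return paths

        # Toy model of a batch process with two ways to reach an overheating fault.
        transitions = [("init", "heating"), ("heating", "mixing"),
                       ("mixing", "fault_overheat"), ("heating", "fault_overheat")]
        print(reverse_paths(transitions, "init", "fault_overheat"))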

  18. Behavioural Verification: Preventing Report Fraud in Decentralized Advert Distribution Systems

    Directory of Open Access Journals (Sweden)

    Stylianos S. Mamais

    2017-11-01

    Full Text Available Service commissions, which are claimed by Ad-Networks and Publishers, are susceptible to forgery as non-human operators are able to artificially create fictitious traffic on digital platforms for the purpose of committing financial fraud. This places a significant strain on Advertisers who have no effective means of differentiating fabricated Ad-Reports from those which correspond to real consumer activity. To address this problem, we contribute an advert reporting system which utilizes opportunistic networking and a blockchain-inspired construction in order to identify authentic Ad-Reports by determining whether they were composed by honest or dishonest users. What constitutes a user’s honesty for our system is the manner in which they access adverts on their mobile device. Dishonest users submit multiple reports over a short period of time while honest users behave as consumers who view adverts at a balanced pace while engaging in typical social activities such as purchasing goods online, moving through space and interacting with other users. We argue that it is hard for dishonest users to fake honest behaviour and we exploit the behavioural patterns of users in order to classify Ad-Reports as real or fabricated. By determining the honesty of the user who submitted a particular report, our system offers a more secure reward-claiming model which protects against fraud while still preserving the user’s anonymity.

  19. The SAMS: Smartphone Addiction Management System and verification.

    Science.gov (United States)

    Lee, Heyoung; Ahn, Heejune; Choi, Samwook; Choi, Wanbok

    2014-01-01

    While the popularity of smartphones has given enormous convenience to our lives, their pathological use has created a new mental health concern among the community. Hence, intensive research is being conducted on the etiology and treatment of the condition. However, the traditional clinical approach based on surveys and interviews has serious limitations: health professionals cannot perform continual assessment and intervention for the affected group, and the subjectivity of assessment is questionable. To cope with these limitations, a comprehensive ICT (Information and Communications Technology) system called SAMS (Smartphone Addiction Management System) was developed for objective assessment and intervention. The SAMS system consists of an Android smartphone application and a web application server. The SAMS client monitors the user's application usage together with GPS location and Internet access location, and transmits the data to the SAMS server. The SAMS server stores the usage data and performs key statistical data analysis and usage intervention according to the clinicians' decision. To verify the reliability and efficacy of the developed system, a comparison study with survey-based screening using the K-SAS (Korean Smartphone Addiction Scale), as well as self-field trials, was performed. The comparison study used usage data from 14 users, adults aged 19 to 50 years, who left at least 1 week of usage logs and completed the survey questionnaires. The field trial fully verified the accuracy of the time, location, and Internet access information in the usage measurement and the reliability of the system operation over more than 2 weeks. The comparison study showed that daily use count has a strong correlation with K-SAS scores, whereas daily use times do not strongly correlate for potentially addicted users. The correlation coefficients of count and times with total K-SAS score are CC = 0.62 and CC = 0.07, respectively, and the t-test analysis for the
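
    The correlation figures quoted above are ordinary Pearson coefficients between a logged usage statistic and the questionnaire score; for any logged dataset they can be reproduced along the following lines (the arrays below are invented placeholders, not the study's data):

        import numpy as np

        def pearson_cc(x, y):
            """Pearson correlation coefficient between two equal-length samples."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            return float(np.corrcoef(x, y)[0, 1])

        # Invented example: daily app-launch counts and K-SAS totals for a few users.
        daily_use_count = [41, 87, 60, 120, 35, 95]
        ksas_total      = [28, 45, 33,  52, 25, 47]
        print(round(pearson_cc(daily_use_count, ksas_total), 2))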

  20. Verification of failover effects from distributed control system communication networks in digitalized nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Min, Moon Gi; Lee, Jae Ki; Lee, Kwang Hyun; Lee, Dong Il; Lim, Hee Taek [Korea Hydro and Nuclear Power Co., Ltd, Daejeon (Korea, Republic of)

    2017-08-15

    Distributed Control System (DCS) communication networks, which use Fast Ethernet with redundant networks for the transmission of information, have been installed in digitalized nuclear power plants. Normally, failover tests are performed to verify the reliability of redundant networks during the design and manufacturing phases; however, systematic integrity tests of DCS networks cannot be fully performed during these phases because not all relevant equipment is completely installed. In addition, practical verification tests are insufficient, and there is a need to test the actual failover function of DCS redundant networks in the target environment. The purpose of this study is to verify that the failover function works correctly in certain abnormal conditions during the installation and commissioning phase and to identify the influence of network failover on the entire DCS. To quantify the effects of network failover in the DCS, the packets (Protocol Data Units) must be collected and the resource usage of the system has to be monitored and analyzed. This study introduces a new methodology for verification of DCS network failover during the installation and commissioning phases. This study is expected to provide insight into the verification methodology and the failover effects of DCS redundant networks. It also provides test results of network performance from DCS network failover in digitalized domestic nuclear power plants (NPPs)

  1. Towards the Verification of Safety-critical Autonomous Systems in Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Adina Aniculaesei

    2016-12-01

    Full Text Available There is an increasing necessity to deploy autonomous systems in highly heterogeneous, dynamic environments, e.g. service robots in hospitals or autonomous cars on highways. Due to the uncertainty in these environments, the verification results obtained with respect to the system and environment models at design-time might not be transferable to the system behavior at run time. For autonomous systems operating in dynamic environments, safety of motion and collision avoidance are critical requirements. With regard to these requirements, Macek et al. [6] define the passive safety property, which requires that no collision can occur while the autonomous system is moving. To verify this property, we adopt a two phase process which combines static verification methods, used at design time, with dynamic ones, used at run time. In the design phase, we exploit UPPAAL to formalize the autonomous system and its environment as timed automata and the safety property as TCTL formula and to verify the correctness of these models with respect to this property. For the runtime phase, we build a monitor to check whether the assumptions made at design time are also correct at run time. If the current system observations of the environment do not correspond to the initial system assumptions, the monitor sends feedback to the system and the system enters a passive safe state.

  2. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314, Tank Farm Restoration and Safe Operations

    International Nuclear Information System (INIS)

    MCGREW, D.L.

    1999-01-01

    This Requirements Verification Report (RVR) for Project W-314 ''AN Farm to 200E Waste Transfer System'' package provides documented verification of design compliance to all the applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate

  3. Burnup verification tests with the FORK measurement system-implementation for burnup credit

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. It was designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program and is well suited to verify burnup and cooling time records at commercial Pressurized Water Reactor (PWR) sites. This report deals with the application of the FORK system to burnup credit operations

  4. Robust control design verification using the modular modeling system

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant, i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple input multiple output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem

  5. ALGORITHM VERIFICATION FOR A TLD PERSONAL DOSIMETRY SYSTEM

    International Nuclear Information System (INIS)

    SHAHEIN, A.; SOLIMAN, H.A.; MAGHRABY, A.

    2008-01-01

    Dose algorithms are used in thermoluminescence personnel dosimetry for the interpretation of the dosimeter response in terms of equivalent dose. In the present study, an automated Harshaw 6600 reader was rigorously tested prior to its use with the dose calculation algorithm, according to the standard established by the US Department of Energy Laboratory Accreditation Program (DOELAP). A manual Harshaw 4500 reader was also used, along with the ICRU slab phantom and the RANDO phantom, in experimentally determining the photon personal doses in terms of deep dose, Hp(10); shallow dose, Hp(0.07); and eye-lens dose, Hp(3). In addition, the free Monte Carlo simulation code VMC-dc was used to simulate the RANDO phantom irradiation process. The accuracy of the automated system lies well within the DOELAP tolerance limits in all test categories

  6. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    Science.gov (United States)

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
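
    The cumulative signal check mentioned above can be pictured as a running comparison of the totals of predicted and measured EPID frames, with an alarm raised once the relative discrepancy exceeds a tolerance. A minimal sketch of that idea; the 10% tolerance and the frame arrays are assumptions for illustration, not the authors' implementation:

        import numpy as np

        def cumulative_delivery_check(predicted_frames, measured_frames, tolerance=0.10):
            """Return the first frame index at which the cumulative measured signal
            deviates from the cumulative prediction by more than the tolerance."""
            cum_pred, cum_meas = 0.0, 0.0
            for k, (pred, meas) in enumerate(zip(predicted_frames, measured_frames)):
                cum_pred += float(np.sum(pred))
                cum_meas += float(np.sum(meas))
                if cum_pred > 0 and abs(cum_meas - cum_pred) / cum_pred > tolerance:
                    return k          # gross delivery error suspected at this frame
            return None               # delivery consistent with prediction

        # Invented example: constant predicted frames, with the measured signal
        # dropping to half after frame 5 (e.g. a stuck leaf bank or wrong MU).
        pred = [np.full((4, 4), 10.0)] * 10
        meas = [np.full((4, 4), 10.0)] * 5 + [np.full((4, 4), 5.0)] * 5
        print(cumulative_delivery_check(pred, meas))   # flags an early frame index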

  7. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Energy Technology Data Exchange (ETDEWEB)

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  8. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    International Nuclear Information System (INIS)

    Fuangrod, Todsaporn; Woodruff, Henry C.; O’Connor, Daryl J.; Uytven, Eric van; McCurdy, Boyd M. C.; Kuncic, Zdenka; Greer, Peter B.

    2013-01-01

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy

  9. Chemical and physical soil attributes in integrated crop-livestock system under no-tillage

    OpenAIRE

    Silva,Hernani Alves da; Moraes,Anibal de; Carvalho,Paulo César de Faccio; Fonseca,Adriel Ferreira da; Caires,Eduardo Fávero; Dias,Carlos Tadeu dos Santos

    2014-01-01

    Although integrated crop-livestock system (ICLS) under no-tillage (NT) is an attractive practice for intensify agricultural production, little regional information is available on the effects of animal grazing and trampling, particularly dairy heifers, on the soil chemical and physical attributes. The objective of this study was to evaluate the effects of animal grazing on the chemical and physical attributes of the soil after 21 months of ICLS under NT in a succession of annual winter pastur...

  10. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    International Nuclear Information System (INIS)

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

    Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process that uses Colored Petri Nets to verify the functional safety of the TTDMS and to evaluate its performance. The paper makes three main contributions: firstly, it proposes a formalized TTDMS model whose correctness is validated using state space analysis and simulation-based verification; secondly, corresponding checking queries are proposed for the purpose of functional safety verification, and the TTDMS performance is further evaluated by applying parameters in the formal model; thirdly, the reliability of a functional prototype TTDMS is estimated. The procedure is found to fit naturally alongside system development, and both formal and simulation-based verifications are performed. Compared with executable code and purely mathematical methods, using our process to evaluate and verify a system is easier to read and more reliable. - Highlights: • A new Train to Train Distance Measurement System. • A new approach to verifying functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance applying simulation-based analysis.

  11. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    International Nuclear Information System (INIS)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M.

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task

  12. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  13. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. These practices are not intended to be complete purchase specifications for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...

  14. Fuzzy Controllers for a Gantry Crane System with Experimental Verifications

    Directory of Open Access Journals (Sweden)

    Naif B. Almutairi

    2016-01-01

    Full Text Available The control problem of gantry cranes has attracted the attention of many researchers because of the various applications of these cranes in the industry. In this paper we propose two fuzzy controllers to control the position of the cart of a gantry crane while suppressing the swing angle of the payload. Firstly, we propose a dual PD fuzzy controller where the parameters of each PD controller change as the cart moves toward its desired position, while maintaining a small swing angle of the payload. This controller uses two fuzzy subsystems. Then, we propose a fuzzy controller which is based on heuristics. The rules of this controller are obtained taking into account the knowledge of an experienced crane operator. This controller is unique in that it uses only one fuzzy system to achieve the control objective. The validity of the designed controllers is tested through extensive MATLAB simulations as well as experimental results on a laboratory gantry crane apparatus. The simulation results as well as the experimental results indicate that the proposed fuzzy controllers work well. Moreover, the simulation and the experimental results demonstrate the robustness of the proposed control schemes against output disturbances as well as against uncertainty in some of the parameters of the crane.
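
    As a flavour of the heuristic, single-fuzzy-system approach described above, the sketch below evaluates a tiny two-input rule base (cart position error and payload swing angle) with triangular membership functions and weighted-average defuzzification. The membership breakpoints, rule consequents, and example inputs are invented for illustration and are not the paper's tuned controller:

        def tri(x, a, b, c):
            """Triangular membership function with feet a, c and peak b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def crane_force(pos_error_m, swing_rad):
            """Toy fuzzy controller: drive the cart toward the target while damping swing."""
            err_neg = tri(pos_error_m, -2, -1, 0)
            err_zero = tri(pos_error_m, -1, 0, 1)
            err_pos = tri(pos_error_m, 0, 1, 2)
            swing_big = tri(abs(swing_rad), 0.05, 0.2, 0.35)
            damp = -10.0 if swing_rad > 0 else 10.0      # oppose the swing direction
            # Rule consequents are crisp force values (N); weights are firing strengths.
            rules = [(err_pos, 30.0), (err_neg, -30.0), (err_zero, 0.0), (swing_big, damp)]
            total = sum(w for w, _ in rules)
            return sum(w * f for w, f in rules) / total if total else 0.0

        # Cart 1.2 m short of the target with a small positive swing angle.
        print(round(crane_force(1.2, 0.1), 1))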

  15. Design, analysis, and test verification of advanced encapsulation systems

    Science.gov (United States)

    Garcia, A.; Minning, C.

    1981-01-01

    Thermal, optical, structural, and electrical isolation analyses are described. Major factors in the design of terrestrial photovoltaic modules are discussed. Mechanical defects in the different layers of an encapsulation system, it was found, would strongly influence the minimum pottant thickness required for electrical isolation. Structural, optical, and electrical properties, a literature survey indicated, are heavily influenced by the presence of moisture. These items, identified as technology voids, are discussed. Analyses were based upon a 1.2 meter square module using 10.2 cm (4-inch) square cells placed 1.3 mm apart as shown in Figure 2-2. Sizing of the structural support member of a module was determined for a uniform, normal pressure load of 50 psf, corresponding to the pressure difference generated between the front and back surfaces of a module by a 100 mph wind. Thermal and optical calculations were performed for a wind velocity of 1 meter/sec parallel to the ground and for a module tilt (relative to the local horizontal) of 37 deg. Placement of a module in a typical array field is illustrated.

  16. Design, analysis, and test verification of advanced encapsulation systems

    Science.gov (United States)

    Garcia, A.; Minning, C.

    1981-11-01

    Thermal, optical, structural, and electrical isolation analyses are described. Major factors in the design of terrestrial photovoltaic modules are discussed. Mechanical defects in the different layers of an encapsulation system, it was found, would strongly influence the minimum pottant thickness required for electrical isolation. Structural, optical, and electrical properties, a literature survey indicated, are heavily influenced by the presence of moisture. These items, identified as technology voids, are discussed. Analyses were based upon a 1.2 meter square module using 10.2 cm (4-inch) square cells placed 1.3 mm apart as shown in Figure 2-2. Sizing of the structural support member of a module was determined for a uniform, normal pressure load of 50 psf, corresponding to the pressure difference generated between the front and back surfaces of a module by a 100 mph wind. Thermal and optical calculations were performed for a wind velocity of 1 meter/sec parallel to the ground and for a module tilt (relative to the local horizontal) of 37 deg. Placement of a module in a typical array field is illustrated.

  17. Nondestructive verification and assay systems for spent fuels. Technical appendixes

    International Nuclear Information System (INIS)

    Cobb, D.D.; Phillips, J.R.; Baker, M.P.

    1982-04-01

    Six technical appendixes are presented that provide important supporting technical information for the study of the application of nondestructive measurements to spent-fuel storage. Each appendix addresses a particular technical subject in a reasonably self-contained fashion. Appendix A is a comparison of spent-fuel data predicted by reactor operators with measured data from reprocessors. This comparison indicates a rather high level of uncertainty in previous burnup calculations. Appendix B describes a series of nondestructive measurements at the GE-Morris Operation Spent-Fuel Storage Facility. This series of experiments successfully demonstrated a technique for reproducible positioning of fuel assemblies for nondestructive measurement. The experimental results indicate the importance of measuring the axial and angular burnup profiles of irradiated fuel assemblies for quantitative determination of spent-fuel parameters. Appendix C is a reasonably comprehensive bibliography of reports and symposia papers on spent-fuel nondestructive measurements to April 1981. Appendix D is a compendium of spent-fuel calculations that includes isotope production and depletion calculations using the EPRI-CINDER code, calculations of neutron and gamma-ray source terms, and correlations of these sources with burnup and plutonium content. Appendix E describes the pulsed-neutron technique and its potential application to spent-fuel measurements. Although not yet developed, the technique holds the promise of providing separate measurements of the uranium and plutonium fissile isotopes. Appendix F describes the experimental program and facilities at Los Alamos for the development of spent-fuel nondestructive measurement systems. Measurements are reported showing that the active neutron method is sensitive to the replacement of a single fuel rod with a dummy rod in an unirradiated uranium fuel assembly

  18. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  19. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
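
    To make the pattern-based generation of logical specifications concrete, the sketch below maps two common workflow patterns (sequence and exclusive choice) to temporal-logic formula strings. The pattern names and formula templates are illustrative assumptions, not the paper's actual pattern library:

        # Map predefined workflow patterns to temporal-logic formulas (LTL-style strings).
        PATTERN_TEMPLATES = {
            "sequence":  "G( {a} -> F {b} )",                              # a is eventually followed by b
            "exclusive": "G( {a} -> (F {b} | F {c}) ) & G(!({b} & {c}))",  # one branch fires, never both
        }

        def generate_specification(workflow):
            """Translate a list of (pattern, activities) pairs into a logical specification."""
            formulas = []
            for pattern, acts in workflow:
                template = PATTERN_TEMPLATES[pattern]
                keys = ["a", "b", "c"][:len(acts)]
                formulas.append(template.format(**dict(zip(keys, acts))))
            return formulas

        spec = generate_specification([("sequence", ["receive_order", "check_stock"]),
                                       ("exclusive", ["check_stock", "ship", "reject"])])
        print("\n".join(spec))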

  20. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    Energy Technology Data Exchange (ETDEWEB)

    Saotome, Naoya, E-mail: naosao@nirs.go.jp; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji [Department of Research Center for Charged Particle Therapy, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan)

    2016-04-15

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors’ facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for the commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to and communicates with the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each energy beam was determined by a difference of Gaussians (DOG) method and the 80% distal dose point of the depth-dose distribution measured by a large parallel-plate ionization chamber. The authors compared a threshold method and a DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for the range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility of the same energy measurement is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors’ range check system is capable of quick and easy range verification with sufficient accuracy.
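
    A difference-of-Gaussians (DOG) edge detector of the kind referred to above can be sketched in a few lines: the measured depth-light profile is smoothed with two Gaussian widths, the smoothed profiles are subtracted, and the distal edge is taken at the extremum of the difference. The profile, pixel pitch, and filter widths below are assumptions for illustration only:

        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        def dog_range_mm(depth_profile, pixel_mm=0.2, sigma_narrow=2.0, sigma_wide=6.0):
            """Estimate the beam range from a scintillator depth-light profile
            using a difference-of-Gaussians edge detector."""
            narrow = gaussian_filter1d(depth_profile, sigma_narrow)
            wide = gaussian_filter1d(depth_profile, sigma_wide)
            dog = narrow - wide
            edge_index = int(np.argmin(dog))   # steepest distal falloff
            return edge_index * pixel_mm

        # Synthetic profile: flat signal with a distal falloff around 150 mm depth.
        depth = np.arange(0, 200, 0.2)
        profile = 1.0 / (1.0 + np.exp((depth - 150.0) / 1.5))
        print(round(dog_range_mm(profile), 1))   # close to 150.0 mm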

  1. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    International Nuclear Information System (INIS)

    Saotome, Naoya; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji

    2016-01-01

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors’ facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for the commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to and communicates with the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each energy beam was determined by a difference of Gaussians (DOG) method and the 80% distal dose point of the depth-dose distribution measured by a large parallel-plate ionization chamber. The authors compared a threshold method and a DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for the range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility of the same energy measurement is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors’ range check system is capable of quick and easy range verification with sufficient accuracy.

  2. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences - both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  3. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  4. VerifEYE: a real-time meat inspection system for the beef processing industry

    Science.gov (United States)

    Kocak, Donna M.; Caimi, Frank M.; Flick, Rick L.; Elharti, Abdelmoula

    2003-02-01

    Described is a real-time meat inspection system developed for the beef processing industry by eMerge Interactive. Designed to detect and localize trace amounts of contamination on cattle carcasses in the packing process, the system affords the beef industry an accurate, high-speed, passive optical method of inspection. Using a method patented by the United States Department of Agriculture and Iowa State University, the system takes advantage of fluorescing chlorophyll found in the animal's diet, and therefore in the digestive tract, to allow detection and imaging of contaminated areas that may harbor potentially dangerous microbial pathogens. Featuring real-time image processing and documentation of performance, the system can be easily integrated into a processing facility's Hazard Analysis and Critical Control Point quality assurance program. This paper describes the VerifEYE carcass inspection and removal verification system. Results indicating the feasibility of the method, as well as field data collected using a prototype system during four university trials conducted in 2001, are presented. Two successful demonstrations using the prototype system were held at a major U.S. meat processing facility in early 2002.

  5. Measuring diversity in medical reports based on categorized attributes and international classification systems

    Directory of Open Access Journals (Sweden)

    Přečková Petra

    2012-04-01

    Full Text Available Abstract Background: Narrative medical reports do not use standardized terminology and often bring insufficient information for statistical processing and medical decision making. The objectives of the paper are to propose a method for measuring diversity in medical reports written in any language, to compare diversities in narrative and structured medical reports, and to map attributes and terms to selected classification systems. Methods: A new method based on a general concept of f-diversity is proposed for measuring diversity of medical reports in any language. The method is based on categorized attributes recorded in narrative or structured medical reports and on international classification systems. Values of categories are expressed by terms. Using SNOMED CT and ICD 10 we map attributes and terms to predefined codes. We use f-diversities of Gini-Simpson and Number of Categories types to compare diversities of narrative and structured medical reports. The comparison is based on attributes selected from the Minimal Data Model for Cardiology (MDMC). Results: We compared diversities of 110 Czech narrative medical reports and 1119 Czech structured medical reports. Selected categorized attributes of MDMC had mostly different numbers of categories and used different terms in narrative and structured reports. We found more than 60% of MDMC attributes in SNOMED CT. We showed that attributes in narrative medical reports had greater diversity than the same attributes in structured medical reports. Further, we replaced each value of category (term) used for attributes in narrative medical reports by the closest term and the category used in MDMC for structured medical reports. We found that relative Gini-Simpson diversities in structured medical reports were significantly smaller than those in narrative medical reports except the "Allergy" attribute. Conclusions: Terminology in narrative medical reports is not standardized. Therefore it is nearly
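
    The Gini-Simpson type of f-diversity used in the comparison reduces, for a single attribute, to one minus the sum of squared category proportions. A small sketch follows; the category counts are invented for illustration, not taken from the study:

        from collections import Counter

        def gini_simpson(values):
            """Gini-Simpson diversity: 1 - sum(p_i^2) over category proportions p_i."""
            counts = Counter(values)
            n = sum(counts.values())
            return 1.0 - sum((c / n) ** 2 for c in counts.values())

        # Terms recorded for one attribute in narrative vs. structured reports (invented data).
        narrative  = ["dyspnoea", "shortness of breath", "breathlessness", "dyspnea", "dyspnoea"]
        structured = ["dyspnoea", "dyspnoea", "dyspnoea", "dyspnoea", "dyspnoea"]
        print(gini_simpson(narrative), gini_simpson(structured))   # higher diversity in narrative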

  6. Preparation of a program for the independent verification of the brachytherapy planning systems calculations

    International Nuclear Information System (INIS)

    Carmona, V.; Perez-Calatayud, J.; Lliso, F.; Richart Sancho, J.; Ballester, F.; Pujades-Claumarchirant, M.C.; Munoz, M.

    2010-01-01

    In this work a program is presented that independently checks, for each patient, the treatment planning system calculations in low dose rate, high dose rate and pulsed dose rate brachytherapy. The treatment planning system output text files are automatically loaded into this program in order to obtain the source coordinates, the desired calculation point coordinates and, where applicable, the dwell times. The source strength and the reference dates are entered by the user. The program allows the recommendations on independent verification of clinical brachytherapy dosimetry to be implemented in a simple and accurate way, in a few minutes. (Author).
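
    The abstract does not give the dose formalism used by the program; a minimal sketch of such an independent check, assuming a TG-43-style point-source approximation and entirely hypothetical input values, might look like the following.

```python
import math

def point_source_dose(sk, dose_rate_constant, dwell_points, calc_point,
                      g=lambda r: 1.0, phi_an=lambda r: 1.0):
    """Independent dose estimate (cGy) at calc_point from a list of
    (x, y, z, dwell_time_s) source positions, using a point-source
    approximation: dose rate = Sk * Lambda * (r0/r)^2 * g(r) * phi_an(r).
    g and phi_an default to 1.0; real data would come from tabulated values."""
    r0 = 1.0  # reference distance in cm
    total = 0.0
    for x, y, z, t in dwell_points:
        r = math.dist((x, y, z), calc_point)
        dose_rate = sk * dose_rate_constant * (r0 / r) ** 2 * g(r) * phi_an(r)
        total += dose_rate * t / 3600.0  # cGy/h times seconds -> cGy
    return total

# Hypothetical Ir-192 HDR example: two dwell positions, point 2 cm away.
sk = 40000.0       # air-kerma strength in U (assumed)
Lambda = 1.109     # dose-rate constant in cGy/(h*U), typical Ir-192 value
dwells = [(0.0, 0.0, 0.0, 12.0), (0.5, 0.0, 0.0, 8.0)]
print(point_source_dose(sk, Lambda, dwells, (0.0, 2.0, 0.0)))
```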

  7. Verification and validation as an integral part of the development of digital systems for nuclear applications

    International Nuclear Information System (INIS)

    Straker, E.A.; Thomas, N.C.

    1983-01-01

    The nuclear industry's current attitude toward verification and validation (V and V) has been shaped by the experience gained to date. On the basis of this experience, V and V can effectively be applied as an integral part of digital system development for nuclear electric power applications. An overview of a typical approach for integrating V and V with system development is presented. This approach represents a balance between V and V as applied in the aerospace industry and the standard practice commonly applied within the nuclear industry today

  8. Groebner Bases Based Verification Solution for SystemVerilog Concurrent Assertions

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2014-01-01

    of polynomial ring algebra to perform SystemVerilog assertion verification over digital circuit systems. This method is based on Groebner bases theory and sequential properties checking. We define a constrained subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on the algebraic representations using Groebner bases for concurrent SVAs checking. Case studies show that computer algebra can provide canonical symbolic representations for both assertions and circuit designs and can act as a novel solver engine from the viewpoint of symbolic computation.
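
    As a rough illustration of the idea (not the authors' actual tool), the sketch below models a small combinational circuit and an assertion as polynomials and uses a Groebner basis to decide whether the assertion polynomial lies in the ideal generated by the circuit and Boolean-field polynomials. The gate encodings and the SymPy usage are assumptions made here for illustration.

```python
from sympy import symbols, groebner

a, b, c, y = symbols('a b c y')

# Boolean gates encoded as polynomial relations (standard encodings):
#   AND: c = a*b          ->  c - a*b
#   OR:  y = c + b - c*b  ->  y - (c + b - c*b)
circuit = [c - a*b, y - (c + b - c*b)]

# Restrict all variables to {0, 1} via field polynomials x^2 - x.
field = [v**2 - v for v in (a, b, c, y)]

G = groebner(circuit + field, a, b, c, y, order='lex')

# Assertion "y == 1 whenever b == 1": the polynomial b*(y - 1) must vanish
# on every satisfying assignment, i.e. lie in the ideal.
assertion = b*(y - 1)
print(G.contains(assertion))  # True -> the assertion holds for this circuit
```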

  9. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  10. Method Verification Requirements for an Advanced Imaging System for Microbial Plate Count Enumeration.

    Science.gov (United States)

    Jones, David; Cundell, Tony

    2018-01-01

    The Growth Direct™ System, which automates the incubation and reading of membrane filtration microbial counts on soybean-casein digest, Sabouraud dextrose, and R2A agar, differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. LAY ABSTRACT: The Growth Direct™ System, which automates the incubation and reading of microbial counts on membranes on solid agar, differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation time. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. © PDA, Inc. 2018.

  11. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Singh, G.P.; Cadena, D.; Burgess, J.

    1992-01-01

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  12. Design Verification Enhancement of FPGA-based Plant Protection System Trip Logics for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jae Cheon; Heo, Gyun Young

    2016-01-01

    To strengthen the application of FPGA technology and address its challenges in NPPs, the International Atomic Energy Agency (IAEA) has joined the sponsorship of the Topical Group on FPGA Applications in NPPs (TG-FAN), which has met seven times to date in the form of an annual workshop (International Workshop on the Application of FPGAs in NPPs) held since 2008. The workshops attracted significant interest and broad representation of stakeholders such as regulators, utilities, research organizations, system designers, and vendors from various countries, who converge to discuss current issues regarding instrumentation and control (I and C) systems as well as FPGA applications. Two of the many technical issues identified by the group are the lifecycle of FPGA-based platforms, systems, and applications, and methods and tools for V and V. Therefore, in this work, several design steps involving a model-based systems engineering process and a MATLAB/SIMULINK model are employed, leading to enhanced design verification. The verified and validated design output works correctly and effectively. In conclusion, the model-based systems engineering approach and the structured step-by-step design modeling techniques, including the SIMULINK model utilized in this work, show how verification of FPGA PPS trip logic designs can be enhanced. If these design approaches are employed in the design of FPGA-based I and C systems, the design can be easily verified and validated

  13. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. To aid in the development of this new system, a standardized Verification and Validation (V and V) approach is being implemented. The primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V and V phases from concept to operation and maintenance. Each phase has specific V and V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V and V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle

  14. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    Science.gov (United States)

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristics coefficients is difficult to guarantee because of errors in the measurement system itself. A novel dynamic calibration method for the measurement system of dynamic characteristics coefficients is proposed in this paper to eliminate these errors. Unlike calibration against a suspended mass, this method uses a verification device that is a spring-mass system, which can simulate the dynamic characteristics of a sliding bearing. The verification device was built, and the calibration experiment was carried out over a wide frequency range, with the bearing stiffness simulated by disc springs. The experimental results show that the amplitude errors of this measurement system are small in the frequency range of 10 Hz–100 Hz, while the phase errors increase with frequency. A simulated identification experiment for the dynamic characteristics coefficients in the range of 10 Hz–30 Hz preliminarily verifies that the calibration data in this range can adequately support dynamic characteristics testing of sliding bearings. Bearing experiments over wider frequency ranges require higher manufacturing and installation precision of the calibration device, and the calibration experiment procedures should also be improved. PMID:27483283
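
    The abstract does not state how the amplitude and phase errors are defined; one plausible formulation, assuming the disc-spring verification device behaves as an undamped spring-mass system of stiffness k and moving mass m, is:

```latex
K_{\mathrm{th}}(\omega) = k - m\,\omega^{2}, \qquad
\varepsilon_{\mathrm{amp}}(\omega) = \frac{\bigl|K_{\mathrm{meas}}(\omega)\bigr|}{\bigl|K_{\mathrm{th}}(\omega)\bigr|} - 1, \qquad
\varepsilon_{\mathrm{ph}}(\omega) = \arg K_{\mathrm{meas}}(\omega) - \arg K_{\mathrm{th}}(\omega),
```

    where K_meas is the complex dynamic stiffness identified from the measured force and displacement signals; comparing it with the known K_th over the excitation frequency range yields the amplitude and phase errors reported above.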

  15. Verification and testing of the RTOS for safety-critical embedded systems

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Na Young [Seoul National University, Seoul (Korea, Republic of); Kim, Jin Hyun; Choi, Jin Young [Korea University, Seoul (Korea, Republic of); Sung, Ah Young; Choi, Byung Ju [Ewha Womans University, Seoul (Korea, Republic of); Lee, Jang Soo [KAERI, Taejon (Korea, Republic of)

    2003-07-01

    Developments in Instrumentation and Control (I and C) technology provide more convenience and better performance and are thus adopted in many fields. To adopt newly developed technology, the nuclear industry requires rigorous V and V procedures and tests to assure reliable operation. Adoption of a digital system requires verification and testing of the OS for licensing. A commercial real-time operating system (RTOS) is targeted at various, unpredictable needs, which makes it difficult to verify. For this reason, a simple, application-oriented real-time OS was developed for nuclear application. In this work, we show how to verify the developed RTOS at each phase of the development lifecycle. A commercial formal tool is used in the specification and verification of the system. Based on the developed model, software in the C language is automatically generated. Tests are performed for two purposes: one is to check consistency between the verified model and the generated code, the other is to find errors in the generated code. The former assumes that the verified model is correct, the latter that it is incorrect. Test data are generated separately to satisfy each purpose. After testing the RTOS software, we implement a test board embedded with the developed RTOS and the application software, which simulates the safety-critical plant protection function. Testing to determine whether the reliability criteria are satisfied is also designed in this work. The results show that the developed RTOS software works well when embedded in the system.

  16. Verification and testing of the RTOS for safety-critical embedded systems

    International Nuclear Information System (INIS)

    Lee, Na Young; Kim, Jin Hyun; Choi, Jin Young; Sung, Ah Young; Choi, Byung Ju; Lee, Jang Soo

    2003-01-01

    Developments in Instrumentation and Control (I and C) technology provide more convenience and better performance and are thus adopted in many fields. To adopt newly developed technology, the nuclear industry requires rigorous V and V procedures and tests to assure reliable operation. Adoption of a digital system requires verification and testing of the OS for licensing. A commercial real-time operating system (RTOS) is targeted at various, unpredictable needs, which makes it difficult to verify. For this reason, a simple, application-oriented real-time OS was developed for nuclear application. In this work, we show how to verify the developed RTOS at each phase of the development lifecycle. A commercial formal tool is used in the specification and verification of the system. Based on the developed model, software in the C language is automatically generated. Tests are performed for two purposes: one is to check consistency between the verified model and the generated code, the other is to find errors in the generated code. The former assumes that the verified model is correct, the latter that it is incorrect. Test data are generated separately to satisfy each purpose. After testing the RTOS software, we implement a test board embedded with the developed RTOS and the application software, which simulates the safety-critical plant protection function. Testing to determine whether the reliability criteria are satisfied is also designed in this work. The results show that the developed RTOS software works well when embedded in the system

  17. Orion GN&C Fault Management System Verification: Scope And Methodology

    Science.gov (United States)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.

  18. Experimental study on design verification of new concept for integral reactor safety system

    International Nuclear Information System (INIS)

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized light-water-cooled, medium power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. The SMART design verification phase followed, in which various separate-effect tests and comprehensive integral-effect tests are conducted. The high-temperature/high-pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents) has been constructed by KAERI to simulate SMART-P (the one-fifth-scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the Steam Generator (SG) primary side, the secondary system, and the PRHRS. The heat transfers through the PRHRS heat exchanger and the ECT are sufficient to enable natural circulation of the coolant

  19. Flexible prototype of modular multilevel converters for experimental verification of DC transmission and multiterminal systems

    DEFF Research Database (Denmark)

    Konstantinou, Georgios; Ceballos, Salvador; Gabiola, Igor

    2017-01-01

    Testing and verification of high-level and low-level control, modulation, fault handling and converter co-ordination for modular multilevel converters (MMCs) requires development of experimental prototype converters. In this paper, we provide a complete overview of the MMC-based experimental prototype at UNSW Sydney (The University of New South Wales), including the structure of the sub-modules, communication, control and protection functions as well as the possible configurations of the system. The prototype is rated at a dc voltage of up to 800 V and a power of 20 kVA and can be used to study...

  20. Verification of Treatment Planning System (TPS) on Beam Axis of Co-60 Teletherapy

    International Nuclear Information System (INIS)

    Nunung-Nuraeni; Budhy-Kurniawan; Purwanto; Sugiyantari; Heru-Prasetio; Nasukha

    2001-01-01

    Cancer can currently be treated using surgery, chemotherapy and radiotherapy. High precision and accuracy of the radiation dose are essential, and one important task is verification of the treatment planning system (TPS) used for patient treatment. This research was done to verify the TPS on the beam axis of a Co-60 teletherapy unit. The results show that the differences between TPS calculations and measurements are about -2.682% to 1.918% for simple geometry and homogeneous material, 5.278% to 4.990% for complex geometry, and -3.202% to -2.090% for more complex geometry. (author)

  1. Attributing Agency to Automated Systems: Reflections on Human-Robot Collaborations and Responsibility-Loci.

    Science.gov (United States)

    Nyholm, Sven

    2017-07-18

    Many ethicists writing about automated systems (e.g. self-driving cars and autonomous weapons systems) attribute agency to these systems. Not only that; they seemingly attribute an autonomous or independent form of agency to these machines. This leads some ethicists to worry about responsibility-gaps and retribution-gaps in cases where automated systems harm or kill human beings. In this paper, I consider what sorts of agency it makes sense to attribute to most current forms of automated systems, in particular automated cars and military robots. I argue that whereas it indeed makes sense to attribute different forms of fairly sophisticated agency to these machines, we ought not to regard them as acting on their own, independently of any human beings. Rather, the right way to understand the agency exercised by these machines is in terms of human-robot collaborations, where the humans involved initiate, supervise, and manage the agency of their robotic collaborators. This means, I argue, that there is much less room for justified worries about responsibility-gaps and retribution-gaps than many ethicists think.

  2. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for the ITER remote handling system. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is based on the results of the project “Verification and validation (V and V) of the ITER RH system using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of the ITER RH system design utilizing a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype and overviews current trends in the use of virtual prototyping in industry during the early design phase. Based on the survey of best industrial practices, this paper proposes ways to improve the V and V process for the ITER RH system utilizing DMUs.

  3. Considerations for control system software verification and validation specific to implementations using distributed processor architectures

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1993-01-01

    Until recently, digital control systems have been implemented on centralized processing systems to function in one of several ways: (1) as a single processor control system; (2) as a supervisor at the top of a hierarchical network of multiple processors; or (3) in a client-server mode. Each of these architectures uses a very different set of communication protocols. The latter two architectures also belong to the category of distributed control systems. Distributed control systems can have a central focus, as in the cases just cited, or be quite decentralized in a loosely coupled, shared responsibility arrangement. This last architecture is analogous to autonomous hosts on a local area network. Each of the architectures identified above will have a different set of architecture-associated issues to be addressed in the verification and validation activities during software development. This paper summarizes results of efforts to identify, describe, contrast, and compare these issues

  4. Hybrid Decompositional Verification for Discovering Failures in Adaptive Flight Control Systems

    Science.gov (United States)

    Thompson, Sarah; Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    Adaptive flight control systems hold tremendous promise for maintaining the safety of a damaged aircraft and its passengers. However, most currently proposed adaptive control methodologies rely on online learning neural networks (OLNNs), which necessarily have the property that the controller is changing during the flight. These changes tend to be highly nonlinear, and difficult or impossible to analyze using standard techniques. In this paper, we approach the problem with a variant of compositional verification. The overall system is broken into components. Undesirable behavior is fed backwards through the system. Components which can be solved using formal methods techniques explicitly for the ranges of safe and unsafe input bounds are treated as white box components. The remaining black box components are analyzed with heuristic techniques that try to predict a range of component inputs that may lead to unsafe behavior. The composition of these component inputs throughout the system leads to overall system test vectors that may elucidate the undesirable behavior

  5. Multi-Mission System Architecture Platform: Design and Verification of the Remote Engineering Unit

    Science.gov (United States)

    Sartori, John

    2005-01-01

    The Multi-Mission System Architecture Platform (MSAP) represents an effort to bolster efficiency in the spacecraft design process. By incorporating essential spacecraft functionality into a modular, expandable system, the MSAP provides a foundation on which future spacecraft missions can be developed. Once completed, the MSAP will provide support for missions with varying objectives, while maintaining a level of standardization that will minimize redesign of general system components. One subsystem of the MSAP, the Remote Engineering Unit (REU), functions by gathering engineering telemetry from strategic points on the spacecraft and providing these measurements to the spacecraft's Command and Data Handling (C&DH) subsystem. Before the MSAP Project reaches completion, all hardware, including the REU, must be verified. However, the speed and complexity of the REU circuitry rules out the possibility of physical prototyping. Instead, the MSAP hardware is designed and verified using the Verilog Hardware Definition Language (HDL). An increasingly popular means of digital design, HDL programming provides a level of abstraction, which allows the designer to focus on functionality while logic synthesis tools take care of gate-level design and optimization. As verification of the REU proceeds, errors are quickly remedied, preventing costly changes during hardware validation. After undergoing the careful, iterative processes of verification and validation, the REU and MSAP will prove their readiness for use in a multitude of spacecraft missions.

  6. Formal specification and verification of interactive systems with plasticity: Applications to nuclear-plant supervision

    International Nuclear Information System (INIS)

    Oliveira, Raquel Araujo de

    2015-01-01

    The advent of ubiquitous computing and the increasing variety of platforms and devices change user expectations in terms of user interfaces. Systems should be able to adapt themselves to their context of use, i.e., the platform (e.g. a PC or a tablet), the users who interact with the system (e.g. administrators or regular users), and the environment in which the system executes (e.g. a dark room or outdoor). The capacity of a UI to withstand variations in its context of use while preserving usability is called plasticity. Plasticity provides users with different versions of a UI. Although it enhances UI capabilities, plasticity adds complexity to the development of user interfaces: the consistency between multiple versions of a given UI should be ensured. Given the large number of possible versions of a UI, it is time-consuming and error prone to check these requirements by hand. Some automation must be provided to verify plasticity.This complexity is further increased when it comes to UIs of safety-critical systems. Safety-critical systems are systems in which a failure has severe consequences. The complexity of such systems is reflected in the UIs, which are now expected not only to provide correct, intuitive, non-ambiguous and adaptable means for users to accomplish a goal, but also to cope with safety requirements aiming to make sure that systems are reasonably safe before they enter the market. Several techniques to ensure quality of systems in general exist, which can also be used to safety-critical systems. Formal verification provides a rigorous way to perform verification, which is suitable for safety-critical systems. Our contribution is an approach to verify safety-critical interactive systems provided with plastic UIs using formal methods. Using a powerful tool-support, our approach permits:-The verification of sets of properties over a model of the system. Using model checking, our approach permits the verification of properties over the system formal

  7. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Science.gov (United States)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates, thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
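
    The plaintext equivalent of the THRIVE matching rule described above is a Hamming-distance threshold test on fixed-size binary templates; the sketch below shows that check in the clear. In the actual system the comparison is carried out on homomorphically randomized, encrypted templates, which is not reproduced here, and the template sizes and threshold are illustrative.

```python
def hamming_distance(t1: bytes, t2: bytes) -> int:
    """Number of differing bits between two equal-length binary templates."""
    if len(t1) != len(t2):
        raise ValueError("templates must have the same length")
    return sum(bin(a ^ b).count("1") for a, b in zip(t1, t2))

def verify(query: bytes, enrolled: bytes, threshold: int) -> bool:
    """Accept the claimed identity if the templates are close enough."""
    return hamming_distance(query, enrolled) <= threshold

# Hypothetical 256-bit (32-byte) templates and threshold.
enrolled = bytes(32)                      # stored reference template
query = bytes([0x01] + [0x00] * 31)       # fresh template, 1 bit different
print(verify(query, enrolled, threshold=25))  # True
```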

  8. A microcomputer system for prescription, calculation, verification and recording of radiotherapy treatments

    International Nuclear Information System (INIS)

    Morrey, D.; Smith, C.W.; Belcher, R.A.; Harding, T.; Sutherland, W.H.

    1982-01-01

    The design of a microcomputer system for the reduction of mistakes in radiotherapy is described. The system covers prescription entry, prescription and treatment calculations, and verification and recording of the treatment set-up. A telecobalt unit was interfaced to the system and in the first 12 months 400 patients have been prescribed and 5000 treatment fields verified. The prescription is entered by the medical officer using an interactive program and this prescription provides the reference for verifying the treatment set-up. The program allows amendments to the prescription to be made easily during the treatment course. The treatment parameters verified are field size, wedge and treatment time. The system uses bar-codes for patient and field identification. A reduction in the number of mistakes has been achieved and future developments are discussed. (author)
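
    A minimal sketch of the kind of set-up check described above, comparing the parameters read back from the treatment unit with the stored prescription, is given below; the parameter names and tolerances are illustrative assumptions, not the authors' actual record format.

```python
# Hypothetical prescription record and machine read-back for one field.
prescription = {"field_size": (10.0, 8.0), "wedge": "30deg", "time_s": 75.0}
readback     = {"field_size": (10.0, 8.0), "wedge": "30deg", "time_s": 74.6}

def verify_setup(presc, actual, size_tol=0.2, time_tol=1.0):
    """Return a list of discrepancies between prescription and set-up."""
    errors = []
    for axis, (p, a) in enumerate(zip(presc["field_size"], actual["field_size"])):
        if abs(p - a) > size_tol:
            errors.append(f"field size axis {axis}: {a} vs prescribed {p}")
    if presc["wedge"] != actual["wedge"]:
        errors.append(f"wedge: {actual['wedge']} vs prescribed {presc['wedge']}")
    if abs(presc["time_s"] - actual["time_s"]) > time_tol:
        errors.append(f"treatment time: {actual['time_s']} vs {presc['time_s']}")
    return errors

print(verify_setup(prescription, readback) or "set-up verified")
```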

  9. SELF: expert system for supporting verification of network operating constraints in power transmission planning

    Energy Technology Data Exchange (ETDEWEB)

    Cicoria, R; Migliardi, P [Ente Nazionale per l` Energia Elettrica, Milan (Italy); Pogliano, P [Centro Informazioni Studi Esperienze (CISE), Milan (Italy)

    1995-06-01

    Performing planning studies on very large HV transmission systems is a very complex task which requires the use of simulation models and the application of the heuristics acquired by expert planners during previous studies. The ENEL Electric Research Center and the CISE Artificial Intelligence Section have developed a knowledge-based system, named SELF, which is aimed at supporting the transmission system planner. SELF is capable of assisting the engineer both in finding the convergence of the load flow calculation and in determining solutions that respect active power, voltage and VAR operating constraints. This paper describes the overall architecture of the system and shows its integration in a larger planning environment called SPIRA, currently utilized at ENEL. More details are given on the least completed modules, the redispatching and network reinforcement subsystems, which deal with active power constraint verification.

  10. Independent verification of monitor unit calculation for radiation treatment planning system.

    Science.gov (United States)

    Chen, Li; Chen, Li-Xin; Huang, Shao-Min; Sun, Wen-Zhao; Sun, Hong-Qiang; Deng, Xiao-Wu

    2010-02-01

    Ensuring the accuracy of dose calculation for radiation treatment plans is an important part of quality assurance (QA) procedures in radiotherapy. This study evaluated the Monitor Unit (MU) calculation accuracy of a third-party QA software and a 3-dimensional treatment planning system (3D TPS), to investigate the feasibility and reliability of independent verification for radiation treatment planning. Test plans in a homogeneous phantom were designed with the 3-D TPS, according to the International Atomic Energy Agency (IAEA) Technical Report No. 430, including open, blocked, wedge, and multileaf collimator (MLC) fields. Test plans were delivered and measured in the phantom. The delivered doses were input to the QA software and the independently calculated MUs were compared with delivery. All test plans were verified with independent calculation and phantom measurements separately, and the differences between the two kinds of verification were then compared. The deviation of the independent calculation from the measurements was (0.1 +/- 0.9)%; the biggest difference fell on the plans that used block and wedge fields (2.0%). The mean MU difference between the TPS and the QA software was (0.6 +/- 1.0)%, ranging from -0.8% to 2.8%. The deviation in dose of the TPS calculation compared to the measurements was (-0.2 +/- 1.7)%, ranging from -3.9% to 2.9%. The MU accuracy of the third-party QA software is clinically acceptable. Similar results were achieved with the independent calculations and the phantom measurements for all test plans. The tested independent calculation software can be used as an efficient tool for TPS plan verification.
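
    The abstract reports only percentage deviations; a minimal sketch of an independent MU check for a single open field, using a simplified point-dose formalism with hypothetical beam data (not the formalism of the tested software), is shown below to illustrate what such third-party QA software computes.

```python
def independent_mu(prescribed_dose_cgy, output_cgy_per_mu, field_output_factor,
                   pdd, wedge_factor=1.0, tray_factor=1.0):
    """Simplified MU calculation for a single field:
    MU = D / (reference output * Sc,p * PDD * wedge * tray).
    Real QA software also models off-axis ratios, SSD/SAD setup, etc."""
    return prescribed_dose_cgy / (
        output_cgy_per_mu * field_output_factor * pdd * wedge_factor * tray_factor
    )

def percent_deviation(independent, tps):
    return 100.0 * (independent - tps) / tps

# Hypothetical 6 MV open field, 200 cGy prescribed at 10 cm depth.
mu_indep = independent_mu(200.0, 1.0, 0.985, 0.665)
mu_tps = 306.5  # MU reported by the TPS (hypothetical)
print(round(mu_indep, 1), round(percent_deviation(mu_indep, mu_tps), 2))
```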

  11. Verification of criticality safety in on-site spent fuel storage systems

    International Nuclear Information System (INIS)

    Rasmussen, R.W.

    1989-01-01

    On February 15, 1984, Duke Power Company received approval for a two-region, burnup credit, spent fuel storage rack design at both Units 1 and 2 of the McGuire Nuclear Station. Duke also hopes to obtain approval by January of 1990 for a dry spent fuel storage system at the Oconee Nuclear Station, which will incorporate the use of burnup credit in the criticality analysis governing the design of the individual storage units. While experiences in burnup verification for criticality safety for their dry storage system at Oconee are in the future, the methods proposed for burnup verification will be similar to those currently used at the McGuire Nuclear Station in the two-region storage racks installed in both pools. In conclusion, the primary benefit of the McGuire rerack effort has obviously been the amount of storage expansion it provided. A total increase of about 2,000 storage cells was realized, 1,000 of which were the result of pursuing the two-region rather than the conventional poison rack design. Less impacting, but equally as important, however, has been the experience gained during the planning, installation, and operation of these storage racks. This experience should prove useful for future rerack efforts likely to occur at Duke's Catawba Nuclear Station as well as for the current dry storage effort underway for the Oconee Nuclear Station

  12. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
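
    For reference, Classical Guyan Reduction (CGR), which the MGR and HR methods above refine, condenses the stiffness and mass matrices onto a retained (master) DOF set a by statically eliminating the omitted (slave) DOFs o; in standard notation:

```latex
K = \begin{bmatrix} K_{aa} & K_{ao} \\ K_{oa} & K_{oo} \end{bmatrix},
\qquad
T_{\mathrm{CGR}} = \begin{bmatrix} I \\ -K_{oo}^{-1} K_{oa} \end{bmatrix},
\qquad
K_{\mathrm{red}} = T_{\mathrm{CGR}}^{\mathsf{T}} K \, T_{\mathrm{CGR}},
\qquad
M_{\mathrm{red}} = T_{\mathrm{CGR}}^{\mathsf{T}} M \, T_{\mathrm{CGR}}.
```

    The reduction is exact for static loads applied at the retained DOFs but only approximate dynamically, which is what motivates the modified (MGR) and harmonic (HR) variants discussed in the abstract.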

  13. Verification of IMRT dose distributions using a water beam imaging system

    International Nuclear Information System (INIS)

    Li, J.S.; Boyer, Arthur L.; Ma, C.-M.

    2001-01-01

    A water beam imaging system (WBIS) has been developed and used to verify dose distributions for intensity modulated radiotherapy using dynamic multileaf collimator. This system consisted of a water container, a scintillator screen, a charge-coupled device camera, and a portable personal computer. The scintillation image was captured by the camera. The pixel value in this image indicated the dose value in the scintillation screen. Images of radiation fields of known spatial distributions were used to calibrate the device. The verification was performed by comparing the image acquired from the measurement with a dose distribution from the IMRT plan. Because of light scattering in the scintillator screen, the image was blurred. A correction for this was developed by recognizing that the blur function could be fitted to a multiple Gaussian. The blur function was computed using the measured image of a 10 cmx10 cm x-ray beam and the result of the dose distribution calculated using the Monte Carlo method. Based on the blur function derived using this method, an iterative reconstruction algorithm was applied to recover the dose distribution for an IMRT plan from the measured WBIS image. The reconstructed dose distribution was compared with Monte Carlo simulation result. Reasonable agreement was obtained from the comparison. The proposed approach makes it possible to carry out a real-time comparison of the dose distribution in a transverse plane between the measurement and the reference when we do an IMRT dose verification
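
    The abstract does not spell out the iterative reconstruction; a minimal sketch of one common choice, Van Cittert iteration with a single-Gaussian blur kernel standing in for the fitted multi-Gaussian blur function, is shown below (SciPy is used for the convolution; all parameters are illustrative assumptions).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur(image, sigma=3.0):
    """Stand-in for the fitted blur function (single Gaussian here;
    the paper fits a sum of Gaussians)."""
    return gaussian_filter(image, sigma)

def van_cittert(measured, iterations=30, relaxation=0.5, sigma=3.0):
    """Iteratively recover the dose image D from the blurred measurement M:
    D_{k+1} = D_k + relaxation * (M - blur(D_k))."""
    estimate = measured.copy()
    for _ in range(iterations):
        estimate = estimate + relaxation * (measured - blur(estimate, sigma))
        np.clip(estimate, 0.0, None, out=estimate)  # dose cannot be negative
    return estimate

# Synthetic test: a square "field" blurred by the scintillator response.
true_dose = np.zeros((128, 128))
true_dose[44:84, 44:84] = 1.0
measured = blur(true_dose)
recovered = van_cittert(measured)
print(float(np.abs(recovered - true_dose).mean()))  # small residual error
```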

  14. Association Between Patient Value Systems and Physician and Practice Attributes Available Online.

    Science.gov (United States)

    Welshhans, Jamie L; Harmon, Jeffrey J; Papel, Ira; Gentile, Richard; Mangat, Devinder; Byrne, Patrick; Collar, Ryan M

    2018-03-01

    The relative value of facial plastic surgeon personal and practice attributes is relevant to the broader health care system because of increasing out-of-pocket expenses to patients. To determine the relative value of specific facial plastic surgeon personal and practice attributes available online from the perspective of patients. This study consisted of an electronic survey sent to patients by email using choice-based conjoint analysis; surveys were sent between December 2015 and March 2016. Participants had agreed to join email registries to be sent email surveys and promotions at 3 private facial plastic and reconstructive surgery practices. The following surgeon personal and practice attributes and levels were compared: (1) outcome transparency (above average, average, not available); (2) surgical training affiliations (US News and World Reports rankings); (3) online rating site scores (2 [poor], 3, or 4 [excellent] stars); and (4) price ($1×, $2×, and $3× [× = $1500; average cost was set at $2×]). The relative importance of outcome transparency, surgical training affiliations, online rating scores, and price to prospective patients. Overall, 291 patients participated for a completion rate of 68%. Outcome transparency was the most valued attribute (attribute utility range = 141; attribute importance = 35.2%). Price was the least valued attribute (attribute utility range = 58.59; attribute importance = 15.1%). Assuming top-tier affiliations and 4-star ratings, share of market (SOM) was 75.5% for surgeons with above-average outcome transparency priced at $3× compared with those surgeons with no outcomes available priced at $1×. Holding price constant at $2×, surgeons with middle-tier affiliations and 2-star online ratings but above average outcomes achieved 48.4% SOM when compared with those surgeons with top-tier affiliations and 4-star online ratings without available outcomes. Facial plastic surgery patients most value surgeons who
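
    In choice-based conjoint analysis, the reported "importance" of an attribute is typically its part-worth utility range divided by the sum of ranges over all attributes. The sketch below shows that computation using the two ranges reported above and two hypothetical ranges for the remaining attributes (the true values for those are not given in the abstract).

```python
def attribute_importance(utility_ranges):
    """Importance (%) of each attribute = its utility range / sum of ranges."""
    total = sum(utility_ranges.values())
    return {name: 100.0 * r / total for name, r in utility_ranges.items()}

ranges = {
    "outcome_transparency": 141.0,   # reported in the abstract
    "price": 58.59,                  # reported in the abstract
    "training_affiliation": 110.0,   # hypothetical
    "online_rating": 90.0,           # hypothetical
}
for name, imp in attribute_importance(ranges).items():
    print(f"{name}: {imp:.1f}%")
# With the study's actual (unreported) ranges, transparency comes out at
# 35.2% and price at 15.1%, as stated in the abstract.
```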

  15. Integrated verification and testing system (IVTS) for HAL/S programs

    Science.gov (United States)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  16. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) Glass Performance Assessment and many other applications including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway
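
    The abstract does not reproduce the governing equations; a standard 1-D advection-dispersion equation with linear sorption and first-order decay, of the kind commonly used for the vadose-zone solute transport that GPACS calculates (stated here only as the usual textbook form, not necessarily the exact GPACS formulation), is:

```latex
R \,\frac{\partial C}{\partial t}
  = D \,\frac{\partial^{2} C}{\partial z^{2}}
  - v \,\frac{\partial C}{\partial z}
  - \lambda R\, C ,
```

    where C is the solute concentration, v the pore-water velocity, D the dispersion coefficient, R the retardation factor and λ the decay constant.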

  17. Issues of verification and validation of application-specific integrated circuits in reactor trip systems

    International Nuclear Information System (INIS)

    Battle, R.E.; Alley, G.T.

    1993-01-01

    Concepts of using application-specific integrated circuits (ASICs) in nuclear reactor safety systems are evaluated. The motivation for this evaluation stems from the difficulty of proving that software-based protection systems are adequately reliable. Important issues concerning the reliability of computers and software are identified and used to evaluate features of ASICs. These concepts indicate that ASICs have several advantages over software for simple systems. The primary advantage of ASICs over software is that verification and validation (V and V) of ASICs can be done with much higher confidence than is possible with software. A method of performing this V and V on ASICs is being developed at Oak Ridge National Laboratory. The purpose of the method being developed is to help eliminate design and fabrication errors. It will not solve problems with incorrect requirements or specifications

  18. Research on MRV system of iron and steel industry and verification mechanism establishment in China

    Science.gov (United States)

    Guo, Huiting; Chen, Liang; Chen, Jianhua

    2017-12-01

    The national carbon emissions trading market will be launched in China in 2017, and the iron and steel industry will be among the first industries covered. Establishing its MRV system is critical to the development of the iron and steel industry in the carbon trading market. This paper studies the requirements and procedures for accounting, monitoring, reporting and verification in the seven carbon trading pilots that cover the iron and steel industry. The construction and operating mechanisms of the MRV systems are also analyzed. Taking the emission features of the iron and steel industry into account, we study a suitable national MRV system for the whole industry in order to complete the future national carbon trading framework for the iron and steel industry.

  19. An automated portal verification system for the tangential breast portal field

    International Nuclear Information System (INIS)

    Yin, F.-F.; Lai, W.; Chen, C. W.; Nelson, D. F.

    1995-01-01

    Purpose/Objective: In order to ensure that the treatment is delivered as planned, a portal image is acquired at the accelerator and compared to the reference image. At present, this comparison is performed by radiation oncologists based on manually identified features, which is both time-consuming and potentially error-prone. With the introduction of various electronic portal imaging devices, real-time patient positioning correction is becoming clinically feasible, replacing time-delayed analysis using films. However, this procedure requires the presence of radiation oncologists during patient treatment, which is not cost-effective and practically not realistic. Therefore, the efficiency and quality of radiation therapy could be substantially improved if this procedure were automated. The purpose of this study is to develop a fully computerized verification system for the radiation therapy of breast cancer, for which a similar treatment setup is generally employed. Materials/Methods: The automated verification system involves image acquisition, image feature extraction, feature correlation between reference and portal images, and quantitative evaluation of patient setup. In this study, a matrix liquid ion-chamber EPID directly attached to a Varian CL2100C accelerator was used to acquire digital portal images. For effective use of computation memory, the 12-bit gray levels in the original portal images were quantized to a range of 8-bit gray levels. A typical breast portal image includes three important components: breast and lung tissues in the treatment field, air space within the treatment field, and the non-irradiated region. A hierarchical region processing technique was developed to separate these regions sequentially. The inherent hierarchical features were formulated based on the different radiation attenuation of the different regions as: treatment field edge -- breast skin line -- chest wall. Initially, a combination of a Canny edge detector and a constrained

  20. A trace-driven analysis of name and attribute caching in a distributed system

    Science.gov (United States)

    Shirriff, Ken W.; Ousterhout, John K.

    1992-01-01

    This paper presents the results of simulating file name and attribute caching on client machines in a distributed file system. The simulation used trace data gathered on a network of about 40 workstations. Caching was found to be advantageous: a cache on each client containing just 10 directories had a 91 percent hit rate on name look ups. Entry-based name caches (holding individual directory entries) had poorer performance for several reasons, resulting in a maximum hit rate of about 83 percent. File attribute caching obtained a 90 percent hit rate with a cache on each machine of the attributes for 30 files. The simulations show that maintaining cache consistency between machines is not a significant problem; only 1 in 400 name component look ups required invalidation of a remotely cached entry. Process migration to remote machines had little effect on caching. Caching was less successful in heavily shared and modified directories such as /tmp, but there weren't enough references to /tmp overall to affect the results significantly. We estimate that adding name and attribute caching to the Sprite operating system could reduce server load by 36 percent and the number of network packets by 30 percent.
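
    A minimal sketch of the kind of simulation described above, replaying a trace of pathname lookups against a fixed-size per-client LRU directory-name cache and reporting the hit rate, is given below; the trace format and cache policy are assumptions for illustration, not the study's actual trace data.

```python
from collections import OrderedDict

def simulate_name_cache(trace, capacity=10):
    """Replay (client, directory) lookups against per-client LRU caches
    of `capacity` directories and return the overall hit rate."""
    caches = {}          # client -> OrderedDict acting as an LRU cache
    hits = lookups = 0
    for client, directory in trace:
        cache = caches.setdefault(client, OrderedDict())
        lookups += 1
        if directory in cache:
            hits += 1
            cache.move_to_end(directory)      # mark as most recently used
        else:
            cache[directory] = True
            if len(cache) > capacity:
                cache.popitem(last=False)     # evict least recently used
    return hits / lookups

# Hypothetical trace: (client, directory) pairs from pathname lookups.
trace = [("ws1", "/usr/bin"), ("ws1", "/usr/bin"), ("ws1", "/tmp"),
         ("ws2", "/usr/bin"), ("ws1", "/usr/bin"), ("ws2", "/home/u")]
print(simulate_name_cache(trace))
```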

  1. Development of decommissioning management system. 9. Remodeling to PC system and system verification by evaluation of real work

    International Nuclear Information System (INIS)

    Kondo, Hitoshi; Fukuda, Seiji; Okubo, Toshiyuki

    2004-03-01

    When the decommissioning plan for facilities such as nuclear fuel cycle facilities and small-scale research reactors is examined, it is necessary to select the technology and the sequence of the work procedure, and to optimize the indices concerning dismantling the facility (such as the radiation dose, the cost, the amount of waste, the number of workers, and the duration of the work). In our waste management section, development of the decommissioning management system DECMAN, which supports preparation of the decommissioning plan, is in progress. DECMAN automatically calculates these indices using the facility data and the dismantling method. This paper describes the porting of the program to a personal computer and the verification of the system against actual work (dismantling of the liquor dissolver in the old JOYO Waste Treatment Facility (the old JWTF), the glove boxes in the Deuterium Critical Assembly (DCA), and the incinerator in the Waste Dismantling Facility (WDF)). The outline of the remodeling and verification is as follows. (1) Additional functions: 1) equipment arrangement mapping, 2) evaluation of the radiation dose using the air dose rate, 3) data I/O using EXCEL. (2) Comparison of the work amount between the calculated and actual values: the calculated value is 222.67 man·hour against the actual value of 249.40 man·hour in the old JWTF evaluation. (3) Accompanying work can be forecast by multiplying the calculated value by a certain coefficient. (4) A new scheme for estimating the amount of work was constructed using the calculated values from DECMAN. (author)
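
    A minimal sketch of the kind of index calculation DECMAN performs, summing per-equipment unit workloads and scaling the result by an empirical coefficient to forecast accompanying work, is shown below; all unit factors, equipment items and the coefficient are illustrative assumptions (the 1.12 coefficient merely mirrors the 249.40 vs 222.67 man-hour ratio reported above).

```python
# Hypothetical unit workloads (man-hours per unit of size) per dismantling method.
UNIT_WORKLOAD = {"hand_cutting": 1.8, "plasma_cutting": 0.9, "bag_out": 0.4}

def direct_work(equipment):
    """Sum of man-hours over all equipment items:
    size (e.g. weight or surface area) x unit workload of the chosen method."""
    return sum(size * UNIT_WORKLOAD[method] for _, size, method in equipment)

def forecast_total(equipment, accompanying_coeff=1.12):
    """Total forecast = direct work x coefficient for accompanying work."""
    return direct_work(equipment) * accompanying_coeff

inventory = [("dissolver shell", 60.0, "plasma_cutting"),
             ("piping", 35.0, "hand_cutting"),
             ("glove box panels", 80.0, "bag_out")]
calc = direct_work(inventory)
print(round(calc, 1), round(forecast_total(inventory), 1))
```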

  2. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    International Nuclear Information System (INIS)

    Chung, Jin Ho

    2016-01-01

    The objective of a future verification of a FMCT (Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established the IAEA safeguards as a verification system mainly for Non-Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper is designed to explore the potential role of the IAEA in verifying the future treaty by analyzing verification abilities of the Agency in terms of treaty verification and expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA will be examined as a measure of providing a credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to be prepared for playing a leading role in FMCT verifications as a form of multilateral verification by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferative information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and different attitude and culture towards verification between NWSs and NNWSs

  3. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Jin Ho [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-05-15

    The objective of a future verification of a FMCT (Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established the IAEA safeguards as a verification system mainly for Non-Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper is designed to explore the potential role of the IAEA in verifying the future treaty by analyzing verification abilities of the Agency in terms of treaty verification and expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA will be examined as a measure of providing a credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to be prepared for playing a leading role in FMCT verifications as a form of multilateral verification by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferative information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and different attitude and culture towards verification between NWSs and NNWSs.

  4. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen' s University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.
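
    As a concrete illustration of the simplest numerical experiment mentioned above, the sketch below integrates a heave decay test for a floating sphere modeled as a linear mass-spring-damper system. The mass, damping and stiffness values are placeholders chosen for illustration only; they are not the OES Task 10 reference parameters, and the participating codes resolve radiation and diffraction forces with far more detailed models.

      # Minimal heave-decay sketch: (m + a) z'' + b z' + c z = 0, released from
      # a small offset.  Coefficients are illustrative placeholders only.
      import numpy as np
      from scipy.integrate import solve_ivp

      m_plus_a = 2.0e3   # mass plus added mass [kg] (assumed)
      b = 1.0e2          # linear radiation damping [N s/m] (assumed)
      c = 3.0e4          # hydrostatic stiffness [N/m] (assumed)

      def rhs(t, y):
          z, zdot = y
          return [zdot, -(b * zdot + c * z) / m_plus_a]

      sol = solve_ivp(rhs, (0.0, 30.0), [0.1, 0.0], max_step=0.01)
      natural_period = 2.0 * np.pi * np.sqrt(m_plus_a / c)
      print(f"natural period ~ {natural_period:.2f} s, "
            f"heave after 30 s: {sol.y[0, -1]:.4f} m")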

  5. Verification test for radiation reduction effect and material integrity on PWR primary system by zinc injection

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, H.; Nagata, T.; Yamada, M. [Nuclear Power Engineering Corp. (Japan); Kasahara, K.; Tsuruta, T.; Nishimura, T. [Mitsubishi Heavy Industries, Ltd. (Japan); Ishigure, K. [Saitama Inst. of Tech. (Japan)

    2002-07-01

    Zinc injection is known to be an effective method for reducing the radiation source in the primary water system of a PWR. There is a need to verify the effect of Zn injection operation on radiation source reduction and on the materials integrity of the PWR primary circuit. In order to confirm the effectiveness of Zn injection, a verification test, a national program sponsored by the Ministry of Economy, Trade and Industry (METI), was started in 1995 as a seven-year program and will be finished by the end of March 2002. This program consists of an irradiation test and a material integrity test. The irradiation test, an in-pile test managed by AEAT Plc (UK), was performed using the LVR-15 reactor of NRI Rez in the Czech Republic. Furthermore, an out-of-pile test using a film adding unit was also performed at the Takasago Engineering Laboratory of NUPEC to obtain supplemental data for the in-pile test. The material integrity test was planned to perform a constant load test, a constant strain test and a corrosion test at the same time using a large-scale loop and slow strain extension rate testing (SSRT) at the Takasago Engineering Laboratory of NUPEC. In this paper, the present results of the zinc injection verification test are discussed. (authors)

  6. Verification of absorbed dose calculation with XIO Radiotherapy Treatment Planning System

    International Nuclear Information System (INIS)

    Bokulic, T.; Budanec, M.; Frobe, A.; Gregov, M.; Kusic, Z.; Mlinaric, M.; Mrcela, I.

    2013-01-01

    Modern radiotherapy relies on computerized treatment planning systems (TPS) for absorbed dose calculation. Most TPS require a detailed model of a given machine and its therapy beams. The International Atomic Energy Agency (IAEA) recommends acceptance testing for the TPS (IAEA-TECDOC-1540). In this study we present a customization of those tests for the verification of beam models intended for clinical use in our department. Installation of an Elekta Synergy S linear accelerator and data acquisition for the Elekta CMS XiO 4.62 TPS were finished in 2011. After the completion of beam modelling in the TPS, tests were conducted in accordance with the IAEA protocol for TPS dose calculation verification. The deviations between the measured and calculated dose were recorded for 854 points and 11 groups of tests in a homogeneous phantom. Most of the deviations were within tolerance. Similar to previously published results, the results for an irregular L-shaped field and for asymmetric wedged fields were out of tolerance for certain groups of points. (author)
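
    The acceptance-test comparison described above reduces, at each measurement point, to a signed percentage deviation between calculated and measured dose checked against a group tolerance. The sketch below shows that bookkeeping only; the 3% tolerance and the sample doses are illustrative assumptions, not the tolerances or data of the study.

      # Point-by-point dose comparison in the spirit of IAEA TECDOC-1540 style
      # acceptance tests: deviations are expressed relative to the measured dose
      # and checked against a per-group tolerance (values below are illustrative).
      def dose_deviations(measured, calculated, tolerance_pct=3.0):
          results = []
          for m, c in zip(measured, calculated):
              dev = 100.0 * (c - m) / m
              results.append((dev, abs(dev) <= tolerance_pct))
          return results

      measured_gy   = [1.000, 0.812, 0.654]   # example point doses [Gy] (assumed)
      calculated_gy = [1.012, 0.799, 0.690]
      for dev, ok in dose_deviations(measured_gy, calculated_gy):
          print(f"deviation {dev:+.1f}%  ->  {'within' if ok else 'out of'} tolerance")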

  7. Verification and Validation Challenges for Adaptive Flight Control of Complex Autonomous Systems

    Science.gov (United States)

    Nguyen, Nhan T.

    2018-01-01

    Autonomy of aerospace systems requires flight control systems that can adapt to complex, uncertain, dynamic environments. In spite of five decades of research in adaptive control, the fact remains that no adaptive control system has yet been deployed on any safety-critical or human-rated production system such as a passenger transport aircraft. The problem lies in the difficulty of certifying adaptive control systems, since existing certification methods cannot readily be used for nonlinear adaptive control systems. Research addressing metrics for adaptive control has begun to appear in recent years. These metrics, if accepted, could pave a path towards certification that would potentially lead to the adoption of adaptive control as a future control technology for safety-critical and human-rated production systems. Development of certifiable adaptive control systems represents a major challenge to overcome. Adaptive control systems with learning algorithms will never become part of the future unless it can be proven that they are highly safe and reliable. Rigorous methods for adaptive control software verification and validation must therefore be developed to ensure that adaptive control system software failures will not occur, to verify that the adaptive control system functions as required, to eliminate unintended functionality, and to demonstrate that certification requirements imposed by regulatory bodies such as the Federal Aviation Administration (FAA) can be satisfied. This presentation will discuss some of the technical issues with adaptive flight control and the related V&V challenges.

  8. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in a distributed control system. To improve the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model that satisfy the safety requirements of the train control system. Then, in order to cope with the state space explosion problem and the real-time nature of the train control system, an online verification method is proposed to monitor the operating status of high-speed trains. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters that avoid collision between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both normal and abnormal conditions influenced by the packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time features of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic timed Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.
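
    The idea of extracting a constraint set for an uncertain parameter can be illustrated with a deliberately simplified example: finding the position report cycles for which a train can always stop within its movement authority. The brute-force sweep below stands in for the paper's linear hybrid automata analysis, and the speed, deceleration and margin values are assumed for illustration.

      # Toy constraint-set extraction: keep only report-cycle values T for which
      # worst-case travel during one cycle plus braking distance fits inside the
      # movement-authority margin.  All numbers are assumed, not from the study.
      def safe_cycle(T, v_max=83.3, a_brake=1.0, ma_margin=4000.0):
          braking_distance = v_max ** 2 / (2.0 * a_brake)
          return v_max * T + braking_distance <= ma_margin

      candidate_cycles = [t / 10.0 for t in range(1, 101)]       # 0.1 .. 10.0 s
      safe_set = [T for T in candidate_cycles if safe_cycle(T)]
      print(f"largest safe report cycle: {max(safe_set):.1f} s")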

  9. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. Recently Ohio State University received a grant from the Department of Energy's Special Research Grant Program to utilize the methodologies developed for the Operator Advisor for Heavy Water Reactor (HWR) malfunction root cause diagnosis. To aid in the development of this new system, a standardized Verification and Validation (V&V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V&V phases from concept to operation and maintenance. Each phase has specific V&V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V&V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle. 10 refs., 1 fig

  10. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparison of records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality checks. The different measurement methods and their limits of accuracy are discussed in the paper

  11. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  12. Design and verification of computer-based reactor control system modification at Bruce-A candu nuclear generating station

    International Nuclear Information System (INIS)

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  13. Remaining Sites Verification Package for the 1607-F3 Sanitary Sewer System, Waste Site Reclassification Form 2006-047

    Energy Technology Data Exchange (ETDEWEB)

    L. M. Dittmer

    2007-04-26

    The 1607-F3 waste site is the former location of the sanitary sewer system that supported the 182-F Pump Station, the 183-F Water Treatment Plant, and the 151-F Substation. The sanitary sewer system included a septic tank, drain field, and associated pipeline, all in use between 1944 and 1965. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.

  14. Joint verification project on environmentally friendly coal utilization systems. Joint verification project on the water-saving coal preparation system; Kankyo chowagata sekitan riyo system kyodo jissho jigyo. Shosuigata sentan system kyodo jissho jigyo

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    In this verification project, clean coal technologies to be disseminated in China were verified and the infrastructure for their dissemination was prepared, with the purpose of controlling emissions of environmental pollutants associated with coal utilization in China and of contributing to the secure energy supply of Japan. As joint verification projects, a general rehabilitation type coal preparation system was installed in the Wangfenggang coal preparation plant, and a central control coal preparation system was installed in the Qingtan coal preparation plant. In the former, a system is verified in which optimum operation, water saving, high quality and increased efficiency are obtained by introducing two computing systems for operation control and quality control, together with various measuring instruments and analyzers, into coal preparation plants where analog operation is conducted with help from Russia and Poland and which have problems with quality control. In the latter, a central control system achieving water saving is verified by introducing rapid ash meters, scales, densitometers and computers into coal preparation plants equipped with zigzag or heavy-fluid cyclone separators and connecting the various kinds of information through a network. For fiscal 1994, investigation and study were conducted. 51 figs., 9 tabs.

  15. FUSION DECISION FOR A BIMODAL BIOMETRIC VERIFICATION SYSTEM USING SUPPORT VECTOR MACHINE AND ITS VARIATIONS

    Directory of Open Access Journals (Sweden)

    A. Teoh

    2017-12-01

    Full Text Available This paper presents fusion decision technique comparisons based on the support vector machine and its variations for a bimodal biometric verification system that makes use of face images and speech utterances. The system is essentially constructed from a face expert, a speech expert and a fusion decision module. Each individual expert has been optimized to operate in automatic mode and designed for security access applications. The fusion decision schemes considered are linear, weighted Support Vector Machine (SVM) and linear SVM with quadratic transformation. The conditions tested include balanced and unbalanced conditions between the two experts in order to identify, from these techniques, the optimum fusion module best suited to the target application.
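
    A minimal sketch of score-level fusion of the kind compared above: a linear SVM is trained on pairs of (face score, speech score) and used as the accept/reject decision module. The scores are synthetic, scikit-learn is assumed to be available, and the experts and weighting schemes of the paper are not reproduced.

      # Score-level fusion sketch: a linear SVM decides accept/reject from the
      # pair (face score, speech score).  Scores below are synthetic.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      genuine  = rng.normal([0.8, 0.7], 0.1, size=(200, 2))   # (face, speech) scores
      impostor = rng.normal([0.3, 0.4], 0.1, size=(200, 2))
      X = np.vstack([genuine, impostor])
      y = np.array([1] * 200 + [0] * 200)

      fusion = SVC(kernel="linear").fit(X, y)
      print(fusion.predict([[0.75, 0.65], [0.35, 0.45]]))      # expect [1 0]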

  16. Method and practice on safety software verification and validation for digital reactor protection system

    International Nuclear Information System (INIS)

    Li Duo; Zhang Liangju; Feng Junting

    2010-01-01

    The key issue arising from the digitalization of the reactor protection system for a Nuclear Power Plant (NPP) is, in essence, how to carry out Verification and Validation (V and V) to demonstrate and confirm that the software is reliable enough to perform the reactor safety functions. Among others, the most important activity of the software V and V process is unit testing. This paper discusses the basic concepts of safety software V and V and the appropriate techniques for software unit testing, focusing on such aspects as how to ensure test completeness, how to establish a test platform, how to develop test cases and how to carry out unit testing. The technique discussed herein was successfully used in the unit testing of the safety software of a digital reactor protection system. (author)
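
    A sketch of what unit testing of such safety logic can look like, using Python's unittest module and an invented 2-out-of-3 overtemperature trip function. It only illustrates the shape of boundary-value and voting test cases; it is not the safety software or test platform discussed in the paper.

      # Unit-testing sketch: an invented 2-out-of-3 overtemperature trip function
      # exercised with boundary and voting cases.
      import unittest

      SETPOINT_C = 310.0

      def trip(channels):
          """Return True if at least 2 of 3 channels exceed the setpoint."""
          return sum(1 for t in channels if t > SETPOINT_C) >= 2

      class TripLogicTest(unittest.TestCase):
          def test_no_trip_when_all_normal(self):
              self.assertFalse(trip([300.0, 305.0, 299.0]))

          def test_single_failed_high_channel_does_not_trip(self):
              self.assertFalse(trip([400.0, 305.0, 299.0]))

          def test_two_channels_above_setpoint_trip(self):
              self.assertTrue(trip([311.0, 312.0, 299.0]))

          def test_setpoint_itself_is_not_a_trip(self):
              self.assertFalse(trip([310.0, 310.0, 310.0]))

      if __name__ == "__main__":
          unittest.main()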

  17. Verification of HELIOS-MASTER system through benchmark of Halden boiling water reactor (HBWR)

    International Nuclear Information System (INIS)

    Kim, Ha Yong; Song, Jae Seung; Cho, Jin Young; Kim, Kang Seok; Lee, Chung Chan; Zee, Sung Quun

    2004-01-01

    To verify the HELIOS-MASTER computer code system for nuclear design, we have performed benchmark calculations for various reactor cores. The Halden reactor is a boiling, heavy water moderated reactor. At a full power of 18-20 MWt, the moderator temperature is 240 °C and the pressure is 33 bar. This study describes the verification of the HELIOS-MASTER computer code system for nuclear design and the analysis of a hexagonal, D2O-moderated core through a benchmark of the Halden reactor core. HELIOS, developed by Scandpower A/S, is a two-dimensional transport program for the generation of group cross-sections, and MASTER, developed by KAERI, is a three-dimensional nuclear design and analysis code based on two-group diffusion theory. It solves the neutronics model with the TPEN (Triangle-based Polynomial Expansion Nodal) method for a hexagonal geometry

  18. A digital fluoroscopic imaging system for verification during external beam radiotherapy

    International Nuclear Information System (INIS)

    Takai, Michikatsu

    1990-01-01

    A digital fluoroscopic (DF) imaging system has been constructed to obtain portal images for verification during external beam radiotherapy. The imaging device consists of a fluorescent screen viewed by a highly sensitive video camera through a mirror. The video signal is digitized and processed by an image processor which is linked on-line with a host microcomputer. The image quality of the DF system was compared with that of film for portal images of the Burger phantom and the Alderson anthropomorphic phantom using 10 MV X-rays. Contrast resolution of the DF image integrated for 8.5 sec was superior to the film resolution, while spatial resolution was slightly inferior. The DF image of the Alderson phantom processed by adaptive histogram equalization was better at showing anatomical landmarks than the film portal image. The DF image integrated for 1 sec, which is used for movie mode, can show patient movement during treatment. (author)

  19. Specification and Verification of Distributed Embedded Systems: A Traffic Intersection Product Family

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Distributed embedded systems (DESs) are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model checking features.
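
    The core safety property, that conflicting approaches are never green at the same time, can be illustrated with a toy explicit-state search. The sketch below explores every reachable state of an invented token-based controller and asserts the property in each; real time, asynchronous message passing and the Real-Time Maude machinery of the study are deliberately omitted.

      # Explicit-state sketch of "north-south and east-west are never green
      # together" for an invented token-based controller (timing omitted).
      def successors(state):
          ns, ew, token = state          # light states plus a right-of-way token
          nxt = []
          if ns == "red" and token == "free":
              nxt.append(("green", ew, "ns"))
          if ew == "red" and token == "free":
              nxt.append((ns, "green", "ew"))
          if ns == "green":
              nxt.append(("red", ew, "free"))
          if ew == "green":
              nxt.append((ns, "red", "free"))
          return nxt

      def safe(state):
          ns, ew, _ = state
          return not (ns == "green" and ew == "green")

      frontier, seen = [("red", "red", "free")], set()
      while frontier:
          s = frontier.pop()
          if s in seen:
              continue
          seen.add(s)
          assert safe(s), f"safety violated in {s}"
          frontier.extend(successors(s))
      print(f"explored {len(seen)} reachable states; safety property holds")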

  20. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Science.gov (United States)

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  1. Clinical evaluation of a mobile digital specimen radiography system for intraoperative specimen verification.

    Science.gov (United States)

    Wang, Yingbing; Ebuoma, Lilian; Saksena, Mansi; Liu, Bob; Specht, Michelle; Rafferty, Elizabeth

    2014-08-01

    Use of mobile digital specimen radiography systems expedites intraoperative verification of excised breast specimens. The purpose of this study was to evaluate the performance of such a system for verifying targets. A retrospective review included 100 consecutive pairs of breast specimen radiographs. Specimens were imaged in the operating room with a mobile digital specimen radiography system and then with a conventional digital mammography system in the radiology department. Two expert reviewers independently scored each image for image quality on a 3-point scale and confidence in target visualization on a 5-point scale. A target was considered confidently verified only if both reviewers declared the target to be confidently detected. The 100 specimens contained a total of 174 targets, including 85 clips (49%), 53 calcifications (30%), 35 masses (20%), and one architectural distortion (1%). Although a significantly higher percentage of mobile digital specimen radiographs were considered poor quality by at least one reviewer (25%) compared with conventional digital mammograms (1%), 169 targets (97%) were confidently verified with mobile specimen radiography; 172 targets (98%) were verified with conventional digital mammography. Three faint masses were not confidently verified with mobile specimen radiography, and conventional digital mammography was needed for confirmation. One faint mass and one architectural distortion were not confidently verified with either method. Mobile digital specimen radiography allows high diagnostic confidence for verification of target excision in breast specimens across target types, despite lower image quality. Substituting this modality for conventional digital mammography can eliminate delays associated with specimen transport, potentially decreasing surgical duration and increasing operating room throughput.

  2. Practical experience with a local verification system for containment and surveillance sensors

    International Nuclear Information System (INIS)

    Lauppe, W.D.; Richter, B.; Stein, G.

    1984-01-01

    With the growing number of nuclear facilities and a number of large commercial bulk handling facilities steadily coming into operation, the International Atomic Energy Agency is faced with increasing requirements to reduce its inspection efforts. One means of meeting these requirements will be to deploy facility-based remote interrogation methods for its containment and surveillance instrumentation. Such a technical concept of remote interrogation was realized through the so-called LOVER system development, a local verification system for electronic safeguards seal systems. In the present investigations the application was extended to radiation monitoring by introducing an electronic interface between the electronic safeguards seal and the neutron detector electronics of a waste monitoring system. The paper discusses the safeguards motivation and background, the experimental setup of the safeguards system and the performance characteristics of this LOVER system. First conclusions can be drawn from the performance results with respect to the applicability in international safeguards. This comprises in particular the definition of design specifications for an integrated remote interrogation system for various types of containment and surveillance instruments and the specifications of safeguards applications employing such a system

  3. Verification and validation issues for digitally-based NPP safety systems

    International Nuclear Information System (INIS)

    Ets, A.R.

    1993-01-01

    The trend toward standardization, integration and reduced costs has led to increasing use of digital systems in reactor protection systems. While digital systems provide maintenance and performance advantages, their use also introduces new safety issues, in particular with regard to software. Current practice relies on verification and validation (V and V) to ensure the quality of safety software. However, effective V and V must be done in conjunction with a structured software development process and must consider the context of the safety system application. This paper presents some of the issues and concerns that impact on the V and V process. These include documentation of system requirements, common mode failures, hazards analysis and independence. These issues and concerns arose during evaluations of NPP safety systems for advanced reactor designs and digital I and C retrofits for existing nuclear plants in the United States. The pragmatic lessons from actual system reviews can provide a basis for further refinement and development of guidelines for applying V and V to NPP safety systems. (author). 14 refs

  4. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can effectively improve the verification effort by up to 20% for the complex vision chip design while reducing the simulation and debugging overheads.

  5. Standard artifact for the geometric verification of terrestrial laser scanning systems

    Science.gov (United States)

    González-Jorge, H.; Riveiro, B.; Armesto, J.; Arias, P.

    2011-10-01

    Terrestrial laser scanners are geodetic instruments with applications in areas such as architecture, civil engineering or the environment. Although it is common to receive the technical specifications of the systems from their manufacturers, no data verification solutions are available to users on the market. This work proposes a standard artifact and a methodology to perform, in a simple way, the metrological verification of laser scanners. The artifact is manufactured using aluminium and delrin, materials that make the artifact robust and portable. The system consists of a set of five spheres situated at equal distances from one another, and a set of seven cubes of different sizes. A coordinate measuring machine with sub-millimetre precision is used for calibration purposes under controlled environmental conditions. After its calibration, the artifact can be used for the verification of the metrological specifications given by manufacturers of laser scanners. The elements of the artifact are intended to test different metrological characteristics, such as accuracy, precision and resolution. The distance between the centres of the spheres is used to obtain the accuracy data, the standard deviation of the top face of the largest cube is used to establish the precision (repeatability), and the error in the measurement of the cubes provides the resolution value in the X, Y and Z axes. The methodology for the evaluation is mainly supported by least squares fitting algorithms developed using Matlab programming. The artifact and methodology proposed were tested using a Riegl LMSZ-390i terrestrial laser scanner at three different ranges (10, 30 and 50 m) and four stepwidths (0.002°, 0.005°, 0.010° and 0.020°), both for horizontal and vertical displacements. Results obtained are in agreement with the accuracy and precision data given by the manufacturer, 6 and 4 mm, respectively. On the other hand, important influences between resolution and range and between resolution and
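
    The accuracy figure rests on comparing fitted sphere-centre distances with their calibrated values. A generic algebraic least-squares sphere fit of the kind involved is sketched below using numpy; it is a textbook formulation, not the authors' Matlab routines, and the test points are synthetic.

      # Algebraic least-squares sphere fit: solve x^2+y^2+z^2 = 2ax+2by+2cz+d,
      # giving centre (a,b,c) and radius sqrt(d + a^2 + b^2 + c^2).
      import numpy as np

      def fit_sphere(points):
          """points: (N, 3) array of scanned surface points."""
          x, y, z = points.T
          A = np.column_stack([2 * x, 2 * y, 2 * z, np.ones(len(x))])
          b = x**2 + y**2 + z**2
          (cx, cy, cz, d), *_ = np.linalg.lstsq(A, b, rcond=None)
          radius = np.sqrt(d + cx**2 + cy**2 + cz**2)
          return np.array([cx, cy, cz]), radius

      # synthetic check: noisy points on a 0.05 m radius sphere centred at (1, 2, 3)
      rng = np.random.default_rng(1)
      u = rng.normal(size=(500, 3))
      u /= np.linalg.norm(u, axis=1, keepdims=True)
      pts = np.array([1.0, 2.0, 3.0]) + 0.05 * u + rng.normal(0, 1e-4, (500, 3))
      centre, r = fit_sphere(pts)
      print(centre.round(4), round(r, 4))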

  6. Verification of Strength of the Welded Joints by using of the Aramis Video System

    Directory of Open Access Journals (Sweden)

    Pała Tadeusz

    2017-03-01

    Full Text Available This paper presents the results of a strength analysis for two types of welded joints in the high-strength steel S960QC, made according to conventional and laser welding technologies. The hardness distributions, tensile properties and fracture toughness were determined for the weld material and the heat-affected zone material of both types of welded joints. The test results showed the advantage of the laser welded joints over the conventional ones. Tensile properties and fracture toughness in all areas of the laser joints are at a higher level than in the conventional one. The heat-affected zone of the conventional welded joints is a weak area, where the tensile properties are lower in comparison to the base material. Verification of the tensile tests, which were carried out using the Aramis video system, confirmed this assumption. The highest level of strains was observed in the HAZ material, and the destruction process also occurred in the HAZ of the conventional welded joint.

  7. Verification and implications of the multiple pin treatment in the SASSYS-1 LMR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1994-01-01

    As part of a program to obtain realistic, as opposed to excessively conservative, analysis of reactor transients, a multiple pin treatment for the analysis of intra-subassembly thermal hydraulics has been included in the SASSYS-1 liquid metal reactor systems analysis code. This new treatment has made possible a whole new level of verification for the code. The code can now predict the steady-state and transient responses of individual thermocouples within instrumented subassemblies in a reactor, rather than just predicting average temperatures for a subassembly. Very good agreement has been achieved between code predictions and the experimental measurements of steady-state and transient temperatures and flow rates in the Shutdown Heat Removal Tests in the EBR-II Reactor. Detailed multiple pin calculations for blanket subassemblies in the EBR-II reactor demonstrate that the actual steady-state and transient peak temperatures in these subassemblies are significantly lower than those that would be calculated by simpler models

  8. Beam intensity scanner system for three dimensional dose verification of IMRT

    International Nuclear Information System (INIS)

    Vahc, Young W.; Kwon, Ohyun; Park, Kwangyl; Park, Kyung R.; Yi, Byung Y.; Kim, Keun M.

    2003-01-01

    Patient dose verification is clinically one of the most important parts of treatment delivery in radiation therapy. The three dimensional (3D) reconstruction of the dose distribution delivered to the target volume helps to verify the patient dose and determine the physical characteristics of the beams used in IMRT. Here we present the beam intensity scanner (BInS) system for the pre-treatment dosimetric verification of two dimensional photon intensity. The BInS is a radiation detector with custom-made software for dose conversion of the fluorescence signals from a scintillator. The scintillator is used to produce fluorescence from the irradiation of 6 MV photons on a Varian Clinac 21EX. The digitized fluoroscopic signals obtained by the digital video camera-based scintillator (DVCS) are processed by our custom-made software to reproduce the 3D relative dose distribution. For the intensity modulated beam (IMB), the BInS calculates the absorbed dose in absolute beam fluence, which is used for the patient dose distribution. Using the BInS, we performed various measurements related to IMRT and found the following: (1) The 3D dose profiles of the IMBs measured by the BInS demonstrate good agreement with radiographic film, a pin type ionization chamber and Monte Carlo simulation. (2) The delivered beam intensity is altered by the mechanical and dosimetric properties of the collimation of the dynamic and/or step MLC system. This is mostly due to leaf transmission, leaf penumbra, photons scattered from the rounded edges of the leaves, and the leaf geometry. (3) The delivered dose depends on the operational details of how the multileaf opening is formed. These phenomena result in a fluence distribution that can be substantially different from the initially calculated intensity modulation and, therefore, should be taken into account in treatment planning for accurate calculation of the dose delivered to the target volume in IMRT. (author)

  9. [Study on smoking-attributed mortality by using all causes of death surveillance system in Tianjin].

    Science.gov (United States)

    Jiang, Guohong; Zhang, Hui; Li, Wei; Wang, Dezheng; Xu, Zhongliang; Song, Guide; Zhang, Ying; Shen, Chengfeng; Zheng, Wenlong; Xue, Xiaodan; Shen, Wenda

    2016-03-01

    To understand smoking-attributed mortality through the inclusion of smoking information in all-causes-of-death surveillance. Since 2010, information about smoking status, smoking history and the number of cigarettes smoked daily has been added to the death surveillance system. Measures including training, supervision, checking, sampling surveys and telephone verification were taken to increase the death reporting rate and to reduce the data missing rate and the underreporting rate. Multivariate logistic regression analysis was conducted to identify risk factors for smoking-attributed mortality. During the study period (2010-2014), the annual death reporting rates ranged from 6.5‰ to 7.0‰. The reporting rates of smoking status, smoking history and the number of cigarettes smoked daily were 95.53%, 98.63% and 98.58%, respectively. Compared with nonsmokers, the RR for males was 1.38 (1.33-1.43) for all causes of death and 3.07 (2.91-3.24) for lung cancer due to smoking, and the RR for females was 1.46 (1.39-1.54) for all causes of death and 4.07 (3.81-4.35) for lung cancer due to smoking. Smoking-attributed mortality can be studied with little additional investment by using the stable and effective all-causes-of-death surveillance system in Tianjin.

  10. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    Science.gov (United States)

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for the modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can also be verified by using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulae, which can automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.
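
    A bounded-response property of the kind extracted from time-sensitive scenarios can be checked over a recorded event trace with a few lines of code. The sketch below is only a trace checker with invented events and an invented 50 ms deadline; it is not the MARTE/TLSC verification flow of the paper.

      # Toy trace checker for "every cmd is followed by act within 50 ms".
      def bounded_response(trace, trigger, response, deadline_s):
          pending = []                       # timestamps of unanswered triggers
          for t, event in trace:
              if pending and t - pending[0] > deadline_s:
                  return False, pending[0]   # deadline passed without a response
              if event == trigger:
                  pending.append(t)
              elif event == response and pending:
                  pending.pop(0)
          return (not pending), None

      trace = [(0.000, "cmd"), (0.020, "act"), (0.100, "cmd"), (0.180, "act")]
      ok, violated_at = bounded_response(trace, "cmd", "act", 0.05)
      print("property holds" if ok else f"deadline missed for cmd at t={violated_at}")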

  11. Development of the automatic control rod operation system for JOYO. Verification of automatic control rod operation guide system

    International Nuclear Information System (INIS)

    Terakado, Tsuguo; Suzuki, Shinya; Kawai, Masashi; Aoki, Hiroshi; Ohkubo, Toshiyuki

    1999-10-01

    The automatic control rod operation system was developed to control the JOYO reactor power automatically in all operation modes (critical approach, cooling system heat-up, power ascent, power descent); development began in 1989. Prior to applying the system, verification tests of the automatic control rod operation guide system were conducted during the 32nd duty cycle of JOYO from December 1997 to February 1998. The automatic control rod operation guide system consists of the control rod operation guide function and the plant operation guide function. The control rod operation guide function provides information on control rod movement and position, while the plant operation guide function provides guidance for plant operations corresponding to reactor power changes (power ascent or power descent). Control rod insertion and withdrawal are predicted by fuzzy algorithms. (J.P.N.)

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SWINE WASTE ELECTRIC POWER AND HEAT PRODUCTION--CAPSTONE 30KW MICROTURBINE SYSTEM

    Science.gov (United States)

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system was evaluated based on the Capstone 30kW Microturbine developed by Cain Ind...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: RESIDENTIAL ELECTRIC POWER GENERATION USING THE PLUG POWER SU1 FUEL CELL SYSTEM

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Plug Power SU1 Fuel Cell System manufactured by Plug Power. The SU1 is a proton exchange membrane fuel cell that requires hydrogen (H2) as fuel. H2 is generally not available, so the ...

  14. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENTSTRIP™ TEST KIT

    Science.gov (United States)

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test System, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  15. Density scaling of phantom materials for a 3D dose verification system.

    Science.gov (United States)

    Tani, Kensuke; Fujita, Yukio; Wakita, Akihisa; Miyasaka, Ryohei; Uehara, Ryuzo; Kodama, Takumi; Suzuki, Yuya; Aikawa, Ako; Mizuno, Norifumi; Kawamori, Jiro; Saitoh, Hidetoshi

    2018-05-21

    In this study, the optimum density scaling factors of phantom materials for a commercially available three-dimensional (3D) dose verification system (Delta4) were investigated in order to improve the accuracy of the calculated dose distributions in the phantom materials. At field sizes of 10 × 10 and 5 × 5 cm2 with the same geometry, tissue-phantom ratios (TPRs) in water, polymethyl methacrylate (PMMA), and Plastic Water Diagnostic Therapy (PWDT) were measured, and TPRs for various density scaling factors of water were calculated by Monte Carlo simulation, Adaptive Convolve (AdC, Pinnacle3), Collapsed Cone Convolution (CCC, RayStation), and AcurosXB (AXB, Eclipse). Effective linear attenuation coefficients (μeff) were obtained from the TPRs. The ratios of μeff in phantom and water ((μeff)pl,water) were compared between the measurements and calculations. For each phantom material, the density scaling factor proposed in this study (DSF) was set to the value providing a match between the calculated and measured (μeff)pl,water. The optimum density scaling factor was verified through the comparison of the dose distributions measured by Delta4 and calculated with three different density scaling factors: the nominal physical density (PD), the nominal relative electron density (ED), and the DSF. Three plans were used for the verifications: a static field of 10 × 10 cm2 and two intensity modulated radiation therapy (IMRT) treatment plans. The DSF was determined to be 1.13 for PMMA and 0.98 for PWDT. The DSF for PMMA showed good agreement for AdC and CCC with 6 MV x rays, and for AdC with 10 MV x rays. The DSF for PWDT showed good agreement regardless of the dose calculation algorithm and x-ray energy. The DSF can be considered one of the references for the density scaling factor of Delta4 phantom materials and may help improve the accuracy of IMRT dose verification using Delta4.
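
    One simple way to see how a density scaling factor can follow from TPR data: beyond the depth of maximum dose, TPR falls off roughly as exp(-μeff·d), so μeff can be estimated from TPRs at two depths and the scaling factor taken as the phantom-to-water ratio of μeff. The numbers below are illustrative assumptions, not the measured values of the study.

      # Estimate mu_eff from TPRs at two depths and form a density scaling factor
      # as the phantom-to-water ratio of mu_eff.  TPR values are illustrative.
      import math

      def mu_eff(tpr_d1, tpr_d2, d1_cm, d2_cm):
          return math.log(tpr_d1 / tpr_d2) / (d2_cm - d1_cm)

      mu_water   = mu_eff(0.738, 0.530, 10.0, 20.0)   # assumed 6 MV-like TPRs
      mu_phantom = mu_eff(0.713, 0.492, 10.0, 20.0)   # hypothetical plastic

      density_scaling = mu_phantom / mu_water
      print(f"mu_eff(water) = {mu_water:.4f} 1/cm, scaling factor = {density_scaling:.3f}")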

  16. The Effects of information barrier requirements on the trilateral initiative attribute measurement system (AVNG)

    International Nuclear Information System (INIS)

    MacArthur, D.W.; Langner, D.C.; Whiteson, R.; Wolford, J.K.

    2001-01-01

    Although the detection techniques used for measuring classified materials are very similar to those used in unclassified measurements, the surrounding packaging is generally very different. If a classified item is to be measured, an information barrier is required to protect any classified data acquired. This information barrier must protect the classified information while giving the inspector confidence that the unclassified outputs accurately reflect the classified inputs. Both information barrier and authentication considerations must be taken into account during all phases of system design and fabrication. One example of such a measurement system is the attribute measurement system (termed the AVNG) designed for the Trilateral Initiative. We will discuss the integration of information barrier components into this system as well as the effects of information barrier (including authentication) concerns on the implementation of the detector systems.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, JCH FUEL SOLUTIONS, INC., JCH ENVIRO AUTOMATED FUEL CLEANING AND MAINTENANCE SYSTEM

    Science.gov (United States)

    The verification testing was conducted at the Cl facility in North Las Vegas, NV, on July 17 and 18, 2001. During this period, engine emissions, fuel consumption, and fuel quality were evaluated with contaminated and cleaned fuel. To facilitate this verification, JCH repre...

  18. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime
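
    The template idea can be illustrated with a toy consistency check: region sums of the low-resolution spectrum stored at the initial verification are compared with a later measurement within counting-statistics bands. The regions, counts and 3-sigma criterion below are invented for illustration; the actual template regions and acceptance criteria of such a regime are not specified here.

      # Toy template comparison: accept a remeasurement only if every stored
      # spectral-region sum agrees within n sigma of Poisson counting statistics.
      import math

      def consistent_with_template(template, remeasurement, n_sigma=3.0):
          """Both arguments: dict of region name -> summed counts."""
          for name, t in template.items():
              m = remeasurement[name]
              if abs(m - t) > n_sigma * math.sqrt(t + m):   # Poisson uncertainty
                  return False
          return True

      template = {"low": 120500, "mid": 80400, "high": 15100}   # stored at first visit
      revisit  = {"low": 121200, "mid": 80050, "high": 15050}   # later verification
      print("template match" if consistent_with_template(template, revisit)
            else "item flagged for follow-up")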

  19. Verification and validation guidelines for high integrity systems: Appendices A--D, Volume 2

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    The following material is furnished as an experimental guide for the use of risk-based classification for nuclear plant protection systems. As shown in Sections 2 and 3 of this report, safety classifications in the nuclear field are application based (using the function served as the primary criterion), whereas those in use by the process industry and the military are risk based. There are obvious obstacles to the use of risk-based classifications (and the associated integrity levels) for nuclear power plants, yet there are also many potential benefits, including: it considers all capabilities provided for dealing with a specific hazard, thus assigning a lower risk where multiple protection is provided (either at the same or at lower layers); this permits the plant management to perform trade-offs between systems that meet the highest qualification levels or multiple diverse systems at lower qualification levels; it motivates the use (and therefore also the development) of protection systems with demonstrated low failure probability; and it may permit lower cost process industry equipment of an established integrity level to be used in nuclear applications (subject to verification of the integrity level and regulatory approval). The totality of these benefits may reduce the cost of digital protection systems significantly and motivate utilities to upgrade their capabilities much more rapidly than is currently the case. Therefore the outline of a risk-based classification is presented here, to serve as a starting point for further investigation and possible trial application

  20. National Accounts Energy Alliance : Field test and verification of CHP components and systems

    Energy Technology Data Exchange (ETDEWEB)

    Sweetser, R. [Exergy Partners Corporation, Herndon, VA (United States)

    2003-07-01

    Exergy is a consulting firm which specializes in capitalizing on opportunities that result from the nexus of utility deregulation and global climate change in both the construction and energy industries. The firm offers assistance in technical business and market planning, product development and high impact marketing and technology transfer programs. The author discussed the National Accounts Energy Alliance (NAEA) program on distributed energy resources (DER) and identified some advantageous areas such as homeland security (fewer potential terrorist targets to be protected), food safety (protection of the food supply and delivery system), reliability, power quality, energy density, grid congestion and energy price. In the future, an essential role in moderating energy prices for commercial buildings will probably be played by distributed generation (DG) and combined heat and power (CHP). The technical merits of these technologies are being investigated by national accounts and utilities partnering with non-profit organizations, the United States Department of Energy (US DOE), state governments and industry. In that light, an Alliance program was developed in 2001 which allows investors to broaden their knowledge from the application and verification of Advanced Energy Technologies. This program was the result of a synergy between the American Gas Foundation and the Gas Technology Institute (GTI), and it assists investors with their strategic planning. It was proven that a customer-led Energy Technology Test and Verification Program (TA and VP) could be cost-effective and successful. The NAEA activities in five locations were reviewed and discussed. They were: (1) Russell Development, Portland, Oregon; (2) A and P-Waldbaums, Hauppage, New York; (3) HEB, Southern, Texas; (4) Cinemark, Plano, Texas; and (5) McDonald's, Tampa, Florida. 4 tabs., figs.

  1. Portal verification using the KODAK ACR 2000 RT storage phosphor plate system and EC films. A semiquantitative comparison.

    Science.gov (United States)

    Geyer, Peter; Blank, Hilbert; Alheit, Horst

    2006-03-01

    The suitability of the storage phosphor plate system ACR 2000 RT (Eastman Kodak Corp., Rochester, MN, USA), which is intended for portal verification as well as for portal simulation imaging in radiotherapy, had to be proven by comparison with a highly sensitive verification film. The comparison included portal verification images of different regions (head and neck, thorax, abdomen, and pelvis) irradiated with 6- and 15-MV photons and with electrons. Each portal verification image was acquired with the storage screen and with the EC film, using the EC-L cassettes (both: Eastman Kodak Corp., Rochester, MN, USA) for both systems. The soft-tissue and bony contrast and the brightness were evaluated and compared in a ranking of the two images. Different phantoms were irradiated to investigate the high- and low-contrast resolution. To account for quality assurance applications, the short-time exposure of the unpacked and irradiated storage screen to green and red room lasers was also investigated. In general, the quality of the processed ACR images was slightly higher than that of the films, mostly due to cases of insufficient exposure of the film. The storage screen was able to verify electron portals even for low electron energies with only minor photon contamination. The laser lines were sharply and clearly visible on the ACR images. The ACR system may replace the film without any noticeable decrease in image quality, thereby reducing processing time, saving the costs of films and avoiding incorrect exposures.

  2. Rangelands Vegetation under Different Management Systems and Growth Stages in North Darfur State, Sudan (Range Attributes

    Directory of Open Access Journals (Sweden)

    Mohamed AAMA Mohamed

    2014-09-01

    Full Text Available This study was conducted at Um Kaddada, North Darfur State, Sudan, at two sites (closed and open) for two consecutive seasons (2008 and 2009) during the flowering and seed setting stages to evaluate range attributes at the locality. A split plot design was used to study vegetation attributes. Factors studied were management systems (closed and open) and growth stages (flowering and seed setting). Vegetation cover, plant density, carrying capacity, and biomass production were assessed. Chemical analyses were done for selected plants to determine their nutritive values. The results showed highly significant differences in vegetation attributes (density, cover and biomass production) between closed and open areas. Closed areas had higher carrying capacity compared to open rangelands. Crude protein (CP) and ash contents of range vegetation were found to decrease, while crude fiber (CF) and dry matter yield (DM) increased with growth. The study concluded that closed rangelands are better than open rangelands because they are fenced and protected. The erosion index and vegetation degradation rate were very high. Future research is needed to assess rangeland characteristics and habitat condition across different ecological zones in North Darfur State, Sudan. DOI: http://dx.doi.org/10.3126/ije.v3i3.11093 International Journal of Environment Vol.3(3) 2014: 332-343

  3. Multiple-Robot Systems for USAR: Key Design Attributes and Deployment Issues

    Directory of Open Access Journals (Sweden)

    Choon Yue Wong

    2011-03-01

    Full Text Available The interaction between humans and robots is undergoing an evolution. Progress in this evolution means that humans are close to robustly deploying multiple robots. Urban search and rescue (USAR) can benefit greatly from such capability. The review shows that with state of the art artificial intelligence, robots can work autonomously but still require human supervision. It also shows that multiple robot deployment (MRD) is more economical, shortens mission durations, adds reliability as well as addresses missions impossible with one robot and payload constraints. By combining robot autonomy and human supervision, the benefits of MRD can be applied to USAR while at the same time minimizing human exposure to danger. This is achieved with a single-human multiple-robot system (SHMRS). However, designers of the SHMRS must consider key attributes such as the size, composition and organizational structure of the robot collective. Variations in these attributes also induce fluctuations in issues within SHMRS deployment such as robot communication and computational load as well as human cognitive workload and situation awareness (SA). Research is essential to determine how the attributes can be manipulated to mitigate these issues while meeting the requirements of the USAR mission.

  5. Resilience Attributes of Social-Ecological Systems: Framing Metrics for Management

    Directory of Open Access Journals (Sweden)

    David A. Kerner

    2014-12-01

    Full Text Available If resilience theory is to be of practical value for policy makers and resource managers, the theory must be translated into sensible decision-support tools. We present herein a set of resilience attributes, developed to characterize human-managed systems, that helps system stakeholders to make practical use of resilience concepts in tangible applications. In order to build and maintain resilience, these stakeholders must be able to understand what qualities or attributes enhance—or detract from—a system’s resilience. We describe standardized resilience terms that can be incorporated into resource management plans and decision-support tools to derive metrics that help managers assess the current resilience status of their systems, make rational resource allocation decisions, and track progress toward meeting goals. Our intention is to provide an approachable set of terms for both specialists and non-specialists alike to apply to programs that would benefit from a resilience perspective. These resilience terms can facilitate the modeling of resilience behavior within systems, as well as support those lacking access to sophisticated models. Our goal is to enable policy makers and resource managers to put resilience theory to work in the real world.

  6. Chemical and physical soil attributes in integrated crop-livestock system under no-tillage

    Directory of Open Access Journals (Sweden)

    Hernani Alves da Silva

    Full Text Available Although the integrated crop-livestock system (ICLS) under no-tillage (NT) is an attractive practice for intensifying agricultural production, little regional information is available on the effects of animal grazing and trampling, particularly by dairy heifers, on soil chemical and physical attributes. The objective of this study was to evaluate the effects of animal grazing on the chemical and physical attributes of the soil after 21 months of ICLS under NT in a succession of annual winter pastures (2008), soybeans (2008/2009), annual winter pastures (2009), and maize (2009/10). The experiment was performed in the municipality of Castro (PR) in a dystrophic Humic Rhodic Hapludox with a clay texture. The treatments included a combination of two pastures (annual ryegrass monoculture and a multicropping of annual ryegrass, black oat, white clover and red clover) with animal grazing during the fall-winter period with two animal weight categories (light and heavy), in a completely randomized block experimental design with 12 replications. After the maize harvest (21 months after the beginning), soil samples were collected at the 0-10 and 10-20 cm layers to measure soil chemical and physical attributes. The different combinations of pasture and animal weight did not alter the total organic carbon and nitrogen in the soil, but they influenced the attributes of soil acidity and exchangeable cations. The monoculture pasture of ryegrass showed a greater soil acidification process compared to the multicropping pasture. When heavier animals were used, the multicropping pasture showed a lesser increase in soil bulk density and greater macroporosity.

  7. Design verification and acceptance tests of the ASST-A helium refrigeration system

    International Nuclear Information System (INIS)

    Ganni, V.; Apparao, T.V.V.R.

    1993-07-01

    Three similar helium refrigerator systems have been installed at the Superconducting Super Collider Laboratory (SSCL) N15 site: the ASST-A system, which will be used for the accelerator system's full cell string test; the N15-B system, which will be used for string testing in the tunnel; and a third plant, dedicated to magnet testing at the Magnet Testing Laboratory. The ASST-A and N15-B systems will ultimately be a part of the collider's N15 sector station equipment. Each of these three systems has many subsystems, but the design basis for the main refrigerator is the same. Each system has a guaranteed capacity of 2000 W of refrigeration and 20 g/s liquefaction at 4.5 K. The testing and design verification of the ASST-A refrigeration system consisted of parametric tests on the compressors and the total system. A summary of the initial performance test data is given in this paper. The tests were conducted for two cases: in the first, all four compressors were operating; in the second, only one compressor in each stage was operating. In each case, tests were conducted in the three modes of operation described later on. The process design basis supplied by the manufacturers and used in the design of the main components -- the compressor, and the expanders and heat exchangers for the coldbox -- was used to reduce the actual test data using process simulation methodology. In addition, the test results and the process design submitted by the manufacturer were analyzed using exergy analysis. This paper presents both the process and the exergy analyses of the manufacturer's design and the actual test data for Case 1. The process analyses are presented in the form of T-S diagrams. The results of the exergy analyses comparing the exergy losses of each component and the total system for the manufacturer's design and the test data are presented in the tables
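
    The exergy analysis mentioned above weighs each component's losses against the ideal work potential of its streams. As general background only (standard flow-exergy relations, not equations quoted from the paper, and neglecting kinetic, potential, and heat-leak terms), the specific flow exergy and the exergy destruction rate of a component are

        e = (h - h_0) - T_0 (s - s_0)

        \dot{E}_D = T_0 \dot{S}_{gen} = \sum_{in} \dot{m} e - \sum_{out} \dot{m} e + \dot{W}_{in} - \dot{W}_{out}

    where the subscript 0 denotes the ambient (dead) state; comparing \dot{E}_D across the compressors, expanders, and heat exchangers is what permits the component-by-component comparison of the manufacturer's design and the test data described in the abstract.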

  8. Shorter Decentralized Attribute-Based Encryption via Extended Dual System Groups

    Directory of Open Access Journals (Sweden)

    Jie Zhang

    2017-01-01

    Full Text Available Decentralized attribute-based encryption (ABE) is a special form of multiauthority ABE, in which no central authority or global coordination is required other than creating the common reference parameters. In this paper, we propose a new decentralized ABE in prime-order groups by using extended dual system groups. We formulate some assumptions used to prove the security of our scheme. Our proposed scheme is fully secure under the standard k-Lin assumption in the random oracle model and can support any monotone access structure. Compared with existing fully secure decentralized ABE systems, our construction has shorter ciphertexts and secret keys. Moreover, fast decryption is achieved in our system, in which ciphertexts can be decrypted with a constant number of pairings.
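
    As background to the monotone access structures mentioned above (an illustrative sketch only, in no way the cryptographic construction of the paper), such a structure can be pictured as an AND/OR policy tree over attributes, which a given attribute set either satisfies or not:

        # Minimal sketch: evaluating a monotone (AND/OR) access policy over attributes.
        # Policy format and attribute names are hypothetical illustrations.
        def satisfies(policy, attributes):
            """Policy is either an attribute string or a tuple ("AND"|"OR", [subpolicies])."""
            if isinstance(policy, str):
                return policy in attributes
            gate, children = policy
            results = [satisfies(child, attributes) for child in children]
            return all(results) if gate == "AND" else any(results)

        policy = ("OR", [("AND", ["doctor", "cardiology"]), "auditor"])
        print(satisfies(policy, {"doctor", "cardiology"}))  # True
        print(satisfies(policy, {"doctor", "oncology"}))    # False

    In an ABE scheme the same satisfiability condition determines whether a secret key can decrypt a ciphertext, but it is enforced cryptographically rather than by an explicit check like this one.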

  9. Burnup verification measurements at a US nuclear utility using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.; Bosler, G.E.; Walden, G.

    1993-01-01

    The FORK measurement system, designed at Los Alamos National Laboratory (LANL) for the International Atomic Energy Agency (IAEA) safeguards program, has been used to examine spent reactor fuel assemblies at Duke Power Company's Oconee Nuclear Station. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. These measurements can be correlated with burnup and cooling time, and can be used to verify the reactor site records. Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. By taking into account the reduced reactivity of spent fuel due to its burnup in the reactor, burnup credit results in more efficient and economic transport and storage. The objectives of these tests are to demonstrate the applicability of the FORK system to verify reactor records and to develop optimal procedures compatible with utility operations. The test program is a cooperative effort supported by Sandia National Laboratories, the Electric Power Research Institute (EPRI), Los Alamos National Laboratory, and the Duke Power Company
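
    As an illustration of how such passive measurements can be checked against operator declarations (a sketch under assumed numbers, not the FORK system's actual analysis, which also accounts for cooling time, enrichment, and assembly type), one can fit a calibration curve between declared burnup and measured neutron rate and flag assemblies that fall outside a tolerance band:

        # Sketch: verifying declared burnup against passive neutron measurements.
        # All counts, burnups, and the 20% tolerance are hypothetical placeholders.
        import numpy as np

        declared_burnup = np.array([20.0, 28.0, 35.0, 41.0, 47.0])      # GWd/tU (assumed)
        neutron_rate = np.array([1.1e3, 4.0e3, 9.8e3, 1.9e4, 3.3e4])    # counts/s (assumed)

        # Fit a power law, rate = a * burnup**b, in log-log space.
        b, log_a = np.polyfit(np.log(declared_burnup), np.log(neutron_rate), 1)
        a = np.exp(log_a)

        def consistent_with_records(burnup_declared, rate_measured, tolerance=0.20):
            """True if the measured rate agrees with the declared burnup within tolerance."""
            rate_expected = a * burnup_declared ** b
            return abs(rate_measured - rate_expected) / rate_expected <= tolerance

        print(consistent_with_records(30.0, 5.2e3))   # within ~20% of the toy calibration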

  10. Inverse dynamics of underactuated mechanical systems: A simple case study and experimental verification

    Science.gov (United States)

    Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.

    2011-05-01

    Underactuated systems are characterized by fewer control inputs (m) than degrees of freedom (n). Determining a control strategy that forces such a system to complete a set of m specified motion tasks is challenging, and the existence of an explicit solution is conditioned on differential flatness of the problem. The flatness-based solution means that all 2n states and m control inputs can be algebraically expressed in terms of the m specified outputs and their time derivatives up to a certain order, which is in practice attainable only for simple systems. In this contribution the problem is posed in a more practical way as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc, with n = 2 and m = 1. Experimental verification of the inverse simulation control methodology is reported.
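
    For the two-disc example above, the governing equations are simple enough to state explicitly (written here from the description in the abstract, with assumed symbols J_1, J_2 for the disc inertias, k for the torsional spring stiffness, and damping neglected):

        J_1 \ddot{\theta}_1 + k (\theta_1 - \theta_2) = \tau(t)
        J_2 \ddot{\theta}_2 + k (\theta_2 - \theta_1) = 0

    with the servo-constraint \theta_2(t) = \theta_d(t) prescribing the motion of the unactuated disc. The inverse problem is to find the torque \tau(t), and the accompanying \theta_1(t), such that the constraint holds at all times; appending the constraint to the dynamic equations yields the index-three differential-algebraic system referred to in the abstract.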

  11. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan; Ford, Eric C., E-mail: eford@uw.edu [Department of Radiation Oncology, University of Washington, 1959 N. E. Pacific Street, Seattle, Washington 98195 (United States)

    2015-09-15

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4-point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
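
    To make the detectability bookkeeping concrete (a sketch with made-up counts, not the study's incident data), the per-check detectability and the combined "defense in depth" coverage can be tallied as follows:

        # Sketch: detectability = detectable incidents / total incidents, per check and combined.
        # The incident-to-check mapping below is an illustrative placeholder.
        caught_by = {
            1: {"epid_pretreatment"},
            2: {"epid_in_vivo"},
            3: {"epid_in_vivo", "bayesian_network"},
            4: {"rules_based"},
            5: set(),                                  # escapes every check
        }
        checks = ["epid_pretreatment", "epid_in_vivo", "rules_based", "bayesian_network"]
        total = len(caught_by)

        for check in checks:
            detectable = sum(1 for caught in caught_by.values() if check in caught)
            print(f"{check}: {detectable / total:.0%}")

        combined = sum(1 for caught in caught_by.values() if caught)
        print(f"combined defense in depth: {combined / total:.0%}")   # 80% in this toy example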

  12. Software Verification and Validation Report for the 244-AR Vault Interim Stabilization Ventilation System

    International Nuclear Information System (INIS)

    YEH, T.

    2002-01-01

    This document reports on the analysis, testing and conclusions of the software verification and validation for the 244-AR Vault Interim Stabilization ventilation system. The automation control system will use the Allen-Bradley software tools for programming and programmable logic controller (PLC) configuration. The 244-AR Interim Stabilization Ventilation System will be used to control the release of radioactive particles to the environment in the containment tent, located inside the canyon of the 244-AR facility, and to assist the waste stabilization efforts. The HVAC equipment, ducts, instruments, PLC hardware, the ladder logic executable software (documented code), and message display terminal are considered part of the temporary ventilation system. The system consists of a supply air skid, temporary ductwork (to distribute airflow), and two skid-mounted, 500-cfm exhausters connected to the east filter building and the vessel vent system. The Interim Stabilization Ventilation System is a temporary, portable ventilation system consisting of a supply side and an exhaust side. Air is supplied to the containment tent from an air supply skid. This skid contains a constant speed fan, a pre-filter, an electric heating coil, a cooling coil, and a constant flow device (CFD). The CFD uses a passive component that allows a constant flow of air to pass through the device. Air is drawn out of the containment tent, cells, and tanks by two 500-cfm exhauster skids running in parallel. These skids are equipped with fans, filters, a stack, stack monitoring instrumentation, and a PLC for control. The 500-cfm exhaust skids were fabricated and tested previously for saltwell pumping activities. The objective of the temporary ventilation system is to maintain a higher pressure in the containment tent, relative to the canyon and cell areas, to prevent contaminants from reaching the containment tent

  13. A formal design verification and validation on the human factors of a computerized information system in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Park, Jae Chang; Cheon, Se Woo; Jung, Kwang Tae; Baek, Seung Min; Han, Seung; Park, Hee Suk; Son, Ki Chang; Kim, Jung Man; Jung Yung Woo

    1999-11-01

    This report describes a technical transfer under the title of "A formal design verification and validation on the human factors of a computerized information system in nuclear power plants". Human factors requirements for the information system designs are extracted from various regulatory and industrial standards and guidelines, and interpreted into more specific procedures and checklists for verifying the satisfaction of those requirements. A formalized implementation plan is established for human factors verification and validation of a computerized information system in nuclear power plants. Additionally, a computer support system, named DIMS-Web (Design Issue Management System), was developed in a web environment to enhance the implementation of the human factors activities. DIMS-Web has three main functions: supporting requirements review, tracking design issues, and managing issue screening evaluation. DIMS-Web shows its benefits in practice through a trial application to the design review of the CFMS for YGN nuclear units 5 and 6. (author)

  14. Portable system for periodical verification of area monitors for neutrons; Sistema portatil para verificacao periodica de monitores de area para neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu, E-mail: rluciane@ird.gov.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Energia Nuclear; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W., E-mail: karla@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI). Lab. de Neutrons

    2009-07-01

    The Neutron Laboratory is developing a project aimed at the construction of a portable test system for verifying the functioning conditions of neutron area monitors. This device will allow users to verify the calibration maintenance of their instruments at the installations where they are used, avoiding the use of equipment with an inadequate response to the neutron beam

  15. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; Schultz, Richard R.; Crane, Ryan L.

    2011-01-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  16. Image acquisition optimization of a limited-angle intrafraction verification (LIVE) system for lung radiotherapy.

    Science.gov (United States)

    Zhang, Yawei; Deng, Xinchen; Yin, Fang-Fang; Ren, Lei

    2018-01-01

    Limited-angle intrafraction verification (LIVE) has been previously developed for four-dimensional (4D) intrafraction target verification either during arc delivery or between three-dimensional (3D)/IMRT beams. Preliminary studies showed that LIVE can accurately estimate the target volume using kV/MV projections acquired over orthogonal view 30° scan angles. Currently, the LIVE imaging acquisition requires slow gantry rotation and is not clinically optimized. The goal of this study is to optimize the image acquisition parameters of LIVE for different patient respiratory periods and gantry rotation speeds for the effective clinical implementation of the system. Limited-angle intrafraction verification imaging acquisition was optimized using a digital anthropomorphic phantom (XCAT) with simulated respiratory periods varying from 3 s to 6 s and gantry rotation speeds varying from 1°/s to 6°/s. LIVE scanning time was optimized by minimizing the number of respiratory cycles needed for the four-dimensional scan, and imaging dose was optimized by minimizing the number of kV and MV projections needed for four-dimensional estimation. The estimation accuracy was evaluated by calculating both the center-of-mass-shift (COMS) and three-dimensional volume-percentage-difference (VPD) between the tumor in estimated images and the ground truth images. The robustness of LIVE was evaluated with varied respiratory patterns, tumor sizes, and tumor locations in XCAT simulation. A dynamic thoracic phantom (CIRS) was used to further validate the optimized imaging schemes from XCAT study with changes of respiratory patterns, tumor sizes, and imaging scanning directions. Respiratory periods, gantry rotation speeds, number of respiratory cycles scanned and number of kV/MV projections acquired were all positively correlated with the estimation accuracy of LIVE. Faster gantry rotation speed or longer respiratory period allowed less respiratory cycles to be scanned and less kV/MV projections

  17. Soil physical and microbiological attributes cultivated with the common bean under two management systems

    Directory of Open Access Journals (Sweden)

    Lorena Adriana De Gennaro

    Full Text Available Agricultural management systems can alter the physical and biological soil quality, interfering with crop development. The objective of this study was to evaluate the physical and microbiological attributes of a Red Latosol, and their relationship to the biometric parameters of the common bean (Phaseolus vulgaris), irrigated and grown under two management systems (conventional tillage and direct seeding), in Campinas in the state of São Paulo, Brazil. The experimental design was of randomised blocks, with a split-plot arrangement for the management system and soil depth, analysed during the 2006/7 and 2007/8 harvest seasons, with 4 replications. The soil physical and microbiological attributes were evaluated at depths of 0.00-0.05, 0.05-0.10, 0.10-0.20 and 0.20-0.40 m. The following were determined for the crop: density, number of pods per plant, number of beans per pod, thousand seed weight, total weight of the shoots and harvest index. Direct seeding resulted in a lower soil physical quality at a depth of 0.00-0.05 m compared to conventional tillage, while the opposite occurred at a depth of 0.05-0.10 m. The direct seeding showed higher soil biological quality, mainly indicated by the microbial biomass nitrogen, basal respiration and metabolic quotient. The biometric parameters in the bean were higher under the direct seeding compared to conventional tillage.

  18. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinina, Elena Arkadievna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samsa, Michael [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-11-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also on an extensive review of the available literature for similar and past efforts. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, a website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs.
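
    The multi-attribute utility methodology referred to above is, in its most common additive form (given here as general background rather than as the specific model adopted in the report),

        U(x) = \sum_{i=1}^{n} w_i \, u_i(x_i), \qquad \sum_{i} w_i = 1, \quad 0 \le u_i(x_i) \le 1

    where each u_i is a single-attribute utility function over one attribute in the hierarchy and the weights w_i encode stakeholder-elicited trade-offs; alternative waste management system designs are then ranked by their aggregate utility U.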

  19. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations

    International Nuclear Information System (INIS)

    Kalinina, Elena Arkadievna; Samsa, Michael

    2015-01-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to 'ensure it has heard from as many points of view as possible.' The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs conducted by the

  20. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  1. ESTERR-PRO: A Setup Verification Software System Using Electronic Portal Imaging

    Directory of Open Access Journals (Sweden)

    Pantelis A. Asvestas

    2007-01-01

    Full Text Available The purpose of the paper is to present and evaluate the performance of a new software-based registration system for patient setup verification, during radiotherapy, using electronic portal images. The estimation of setup errors, using the proposed system, can be accomplished by means of two alternate registration methods. (a) The portal image of the current fraction of the treatment is registered directly with the reference image (digitally reconstructed radiograph (DRR) or simulator image) using a modified manual technique. (b) The portal image of the current fraction of the treatment is registered with the portal image of the first fraction of the treatment (reference portal image) by applying a nearly automated technique based on self-organizing maps, whereas the reference portal has already been registered with a DRR or a simulator image. The proposed system was tested on phantom data and on data from six patients. The root mean square error (RMSE) of the setup estimates was 0.8±0.3 (mean value ± standard deviation) for the phantom data and 0.3±0.3 for the patient data, respectively, by applying the two methodologies. Furthermore, statistical analysis by means of the Wilcoxon nonparametric signed test showed that the results that were obtained by the two methods did not differ significantly (P value > 0.05).
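
    The reported root mean square error follows the usual definition (restated here for clarity, with the pairing of estimated and reference setup shifts assumed):

        \mathrm{RMSE} = \sqrt{ \frac{1}{N} \sum_{k=1}^{N} \left( d_k^{est} - d_k^{ref} \right)^2 }

    where d_k^{est} is the setup displacement estimated by the registration method for case k, d_k^{ref} is the corresponding known (phantom) or reference displacement, and N is the number of measurements.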

  2. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    Science.gov (United States)

    Yoshida, Toshio

    In the software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns in an early software development stage. However, it is very difficult to verify task cooperation patterns in an early development stage where the task program codes are not yet completed. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that has a function for recording all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.

  3. Preliminary Validation and Verification Plan for CAREM Reactor Protection System; Modelo de Plan Preliminar de Validacion y Verificacion para el Sistema de Proteccion del Reactor CAREM

    Energy Technology Data Exchange (ETDEWEB)

    Fittipaldi, Ana; Felix, Maciel [Comision Nacional de Energia Atomica, Centro Atomico Bariloche (Argentina)

    2000-07-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (computer-based system). These software modules can be either systems of our own design or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLC). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan.

  4. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.

  5. Verification of the burn-up of spent fuel assemblies by means of the Consulha containment/surveillance system

    International Nuclear Information System (INIS)

    Daniel, G.; Gourlez, P.

    1991-01-01

    CONSULHA is a containment/surveillance system which has been developed as part of the French Support Programme for the IAEA Safeguards in cooperation with EURATOM and was designed to meet the IAEA EURATOM requirements for the verification of nuclear materials. This system will make it possible to count movements and verify irradiation of spent fuel assemblies in industrial facilities such as reprocessing plants and nuclear reactors

  6. QuaBingo: A Prediction System for Protein Quaternary Structure Attributes Using Block Composition

    Directory of Open Access Journals (Sweden)

    Chi-Hua Tung

    2016-01-01

    Full Text Available Background. Quaternary structures of proteins are closely relevant to gene regulation, signal transduction, and many other biological functions of proteins. In the current study, a new method for feature extraction based on protein-conserved motif composition in block format, termed block composition, is proposed. Results. The protein quaternary assembly state prediction system that combines blocks with functional domain composition, called QuaBingo, is constructed by three layers of classifiers that can categorize quaternary structural attributes of monomer, homooligomer, and heterooligomer. The first layer classifier uses support vector machines (SVM) based on blocks and functional domains of proteins, and the second layer SVM was utilized to process the outputs of the first layer. Finally, the result is determined by the Random Forest of the third layer. We compared the effectiveness of the combination of block composition, functional domain composition, and pseudoamino acid composition in the model. Across the 11 kinds of functional protein families, QuaBingo achieves a Matthews Correlation Coefficient (MCC) 23% higher than the existing prediction system. The results also revealed the biological characterization of the top five block compositions. Conclusions. QuaBingo provides better predictive ability for predicting the quaternary structural attributes of proteins.
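
    A minimal sketch of the three-layer arrangement described above (an SVM layer, a second SVM acting on its outputs, and a Random Forest making the final call), using generic scikit-learn components and random placeholder features in place of the actual block and functional-domain compositions:

        # Toy three-layer classifier in the spirit of the described architecture.
        # Features/labels are random placeholders; a real pipeline would also use
        # cross-validated layer outputs to avoid overfitting the stacked layers.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 40))              # placeholder feature vectors
        y = rng.integers(0, 3, size=300)            # 0 monomer, 1 homooligomer, 2 heterooligomer
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        svm1 = SVC(probability=True).fit(X_tr, y_tr)                           # layer 1
        svm2 = SVC(probability=True).fit(svm1.predict_proba(X_tr), y_tr)       # layer 2
        rf = RandomForestClassifier(n_estimators=200, random_state=0)          # layer 3
        rf.fit(svm2.predict_proba(svm1.predict_proba(X_tr)), y_tr)

        y_pred = rf.predict(svm2.predict_proba(svm1.predict_proba(X_te)))
        print("toy accuracy:", float(np.mean(y_pred == y_te)))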

  7. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  8. VALGALI: VERIFICATION AND VALIDATION TOOL FOR THE LIBRARIES PRODUCED BY THE GALILEE SYSTEM

    International Nuclear Information System (INIS)

    Mengelle, S.

    2011-01-01

    Full text: In this paper we present VALGALI, the verification and validation tool for the libraries produced by the nuclear data processing system GALILEE. The aim of this system is to provide libraries with consistent physical data for various application codes (the deterministic transport code APOLLO2, the Monte Carlo transport code TRIPOLI-4, the depletion code DARWIN, ...). For each library: are the data stored in the right place with the right format, and so on? Are the libraries used by the various codes consistent? What is the physical quality of the cross sections and data present in the libraries? These three types of tests correspond to the classic stages of V and V. The great strength of VALGALI is that it is generic and not dedicated to one application code. Consequently, it is based on a common physical validation database whose coverage is regularly increased. For all these test cases, the input data are adapted for each relevant application code. Moreover, specific test cases can exist for each application code. At present, VALGALI can check and validate the libraries of APOLLO2 and TRIPOLI-4; in the near future, VALGALI will also treat the libraries of DARWIN.

  9. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong

    2008-03-15

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous work at KAERI on a two-phase, three-field pilot code. Work has been performed to implement the input and output design, TH solver, component models, special TH models, heat structure solver, general tables, trips and controls, and on-line graphics. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The C computer language was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is simpler and lighter than C++. The code has simple and essential features for models and correlations, special components, special TH models and heat structure models. However, the input features are able to simulate various scenarios, such as steady state, non-LOCA transients and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to obtain physical validity of SYSTF code simulations.

  10. IMPACTS OF TIMBER LEGALITY VERIFICATION SYSTEM IMPLEMENTATION ON THE SUSTAINABILITY OF TIMBER INDUSTRY AND PRIVATE FOREST

    Directory of Open Access Journals (Sweden)

    Elvida Yosefi Suryandari

    2017-04-01

    Full Text Available The international market requires producers to prove the legality of their wood products to address the issues of illegal logging and illegal trade. The Timber Legality Verification System (TLVS) has been prepared by the Government of Indonesia, covering the upstream and downstream wood industries. This paper aims to evaluate gaps in the implementation of TLVS policy and its impact on the sustainability of the timber industry. This study used gap, descriptive, and cost-structure analyses. The study was conducted in three provinces, namely DKI Jakarta, West Java and D.I. Yogyakarta. The research found that the effectiveness of the TLVS implementation was low due to relatively rapid policy changes. This situation became a disincentive for investment in the timber business. The private sector perceived that TLVS policy should be applied in the upstream part of the timber business; hence, the downstream industry and market have not fully supported this system. Furthermore, TLVS policy implementation was considered ineffective by the timber industry as well as by private forest managers, especially micro industries and smallholder private forests. This situation threatened the sustainability of the timber industry and private forests. Therefore, institutions should be strengthened in order to improve the quality of human resources and the competitiveness of products.

  11. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    International Nuclear Information System (INIS)

    Chung, Bub Dong

    2008-03-01

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous work at KAERI on a two-phase, three-field pilot code. Work has been performed to implement the input and output design, TH solver, component models, special TH models, heat structure solver, general tables, trips and controls, and on-line graphics. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The C computer language was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is simpler and lighter than C++. The code has simple and essential features for models and correlations, special components, special TH models and heat structure models. However, the input features are able to simulate various scenarios, such as steady state, non-LOCA transients and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to obtain physical validity of SYSTF code simulations

  12. Clinical evaluation and verification of the hyperthermia treatment planning system hyperplan

    International Nuclear Information System (INIS)

    Gellermann, Johanna; Wust, Peter; Stalling, Dether; Seebass, Martin; Nadobny, Jacek; Beck, Rudolf; Hege, Hans-Christian; Deuflhard, Peter; Felix, Roland

    2000-01-01

    Purpose: A prototype of the hyperthermia treatment planning system (HTPS) HyperPlan for the SIGMA-60 applicator (BSD Medical Corp., Salt Lake City, Utah, USA) has been evaluated with respect to clinical practicability and correctness. Materials and Methods: HyperPlan modules extract tissue boundaries from computed tomography (CT) images to generate regular and tetrahedral grids as patient models, to calculate electric field (E-field) distributions, and to visualize three-dimensional data sets. The finite difference time-domain (FDTD) method is applied to calculate the specific absorption rate (SAR) inside the patient. Temperature distributions are calculated by a finite-element code and can be optimized. HyperPlan was tested on 6 patients with pelvic tumors. For verification, measured SAR values were compared with calculated SAR values. Furthermore, intracorporeal E-field scans were performed and compared with calculated profiles. Results: The HTPS can be applied under clinical conditions. Measured absolute SAR (in W/kg), as well as relative E-field scans, correlated well with calculated values (±20%) using the contour-based FDTD method. Values calculated by applying the FDTD method directly on the voxel (CT) grid, were less well correlated with measured data. Conclusion: The HyperPlan system proved to be clinically feasible, and the results were quantitatively and qualitatively verified for the contour-based FDTD method
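
    For reference, the two quantities such a planning system computes are commonly defined as follows (standard relations given as background, not equations taken from HyperPlan itself):

        \mathrm{SAR} = \frac{\sigma \, |\mathbf{E}|^2}{2 \rho}

        \rho c \, \frac{\partial T}{\partial t} = \nabla \cdot (k \nabla T) - W c_b (T - T_{art}) + \rho \, \mathrm{SAR}

    where E is the peak electric-field amplitude, sigma the tissue conductivity and rho its density; the second line is the Pennes bioheat equation with thermal conductivity k, blood perfusion rate W, blood specific heat c_b, and arterial temperature T_art.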

  13. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    Directory of Open Access Journals (Sweden)

    V. A. Karpuhin

    2014-01-01

    Full Text Available Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that the calculation results in COMSOL Multiphysics obtained using the “Finer” grid (the ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm⁻¹) corresponded best, both qualitatively and quantitatively, to the practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of geometrical parameters (thickness of load) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that an increase in the thickness of the load within the range from 0 to 95 mm led to a decrease in the calculated values of the natural resonance frequency of longitudinal oscillations and of the electrical admittance, from 26.58 to 26.35 kHz and from 0.86 to 0.44 mS, respectively.

  14. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    The system safety analysis codes, such as RELAP5, TRAC, and CATHARE, have been developed in the Fortran language during the past few decades. Refactoring of these conventional codes has also been performed to improve code readability and maintenance; TRACE, RELAP5-3D and MARS are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has changed to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP. Although the recent Fortran language also supports OOP, it is considered to have limited functionality compared to modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of the design feasibility.

  15. Design of an Active Multispectral SWIR Camera System for Skin Detection and Face Verification

    Directory of Open Access Journals (Sweden)

    Holger Steiner

    2016-01-01

    Full Text Available Biometric face recognition is becoming more frequently used in different application scenarios. However, spoofing attacks with facial disguises are still a serious problem for state of the art face recognition algorithms. This work proposes an approach to face verification based on spectral signatures of material surfaces in the short wave infrared (SWIR range. They allow distinguishing authentic human skin reliably from other materials, independent of the skin type. We present the design of an active SWIR imaging system that acquires four-band multispectral image stacks in real-time. The system uses pulsed small band illumination, which allows for fast image acquisition and high spectral resolution and renders it widely independent of ambient light. After extracting the spectral signatures from the acquired images, detected faces can be verified or rejected by classifying the material as “skin” or “no-skin.” The approach is extensively evaluated with respect to both acquisition and classification performance. In addition, we present a database containing RGB and multispectral SWIR face images, as well as spectrometer measurements of a variety of subjects, which is used to evaluate our approach and will be made available to the research community by the time this work is published.
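
    A purely illustrative sketch of per-pixel "skin"/"no-skin" classification from a four-band SWIR stack (the band ordering, normalized-difference features, classifier, and all training data below are hypothetical stand-ins, not the method or data of the paper):

        # Toy per-pixel skin classification from a 4-band SWIR reflectance stack.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def normalized_differences(stack):
            """(H, W, 4) reflectance stack -> (H*W, 3) normalized band differences."""
            bands = stack.reshape(-1, 4).astype(float)
            nd = lambda i, j: (bands[:, i] - bands[:, j]) / (bands[:, i] + bands[:, j] + 1e-6)
            return np.stack([nd(0, 1), nd(1, 2), nd(2, 3)], axis=1)

        rng = np.random.default_rng(1)
        train_stack = rng.uniform(0.05, 0.9, size=(64, 64, 4))      # placeholder calibration image
        train_labels = rng.integers(0, 2, size=64 * 64)             # 1 = skin, 0 = other (placeholder)
        clf = LogisticRegression(max_iter=1000).fit(normalized_differences(train_stack), train_labels)

        test_stack = rng.uniform(0.05, 0.9, size=(64, 64, 4))
        skin_mask = clf.predict(normalized_differences(test_stack)).reshape(64, 64)
        print("skin pixels:", int(skin_mask.sum()))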

  16. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  17. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    OpenAIRE

    Jin-Won Park; Sung Bum Pan; Yongwha Chung; Daesung Moon

    2009-01-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification i...

  18. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole

  19. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
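
    A rough way to compare the module-distribution strategies discussed above is to total the estimated instruction counts of each module on whichever processor runs it, plus the cryptographic cost of protecting any fingerprint data sent off the card (every number below is a hypothetical placeholder, not a measurement from the paper):

        # Back-of-the-envelope timing for splitting fingerprint verification
        # between smart card and reader. All figures are assumed placeholders.
        MODULES = {"preprocess": 40e6, "feature_extract": 120e6, "match": 5e6}   # instructions
        CARD_IPS, READER_IPS = 25e6, 400e6      # instructions per second (assumed)
        CRYPTO_OVERHEAD_S = 0.15                # protecting off-card fingerprint data (assumed)

        def total_time(on_card):
            """Seconds per verification given the set of modules executed on the card."""
            t = sum(cost / (CARD_IPS if name in on_card else READER_IPS)
                    for name, cost in MODULES.items())
            off_card = set(MODULES) - set(on_card)
            return t + (CRYPTO_OVERHEAD_S if off_card else 0.0)

        for scenario in [set(), {"match"}, set(MODULES)]:
            print(sorted(scenario) or "nothing on card", f"-> {total_time(scenario):.2f} s")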

  20. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  1. An automatic dose verification system for adaptive radiotherapy for helical tomotherapy

    International Nuclear Information System (INIS)

    Mo, Xiaohu; Chen, Mingli; Parnell, Donald; Olivera, Gustavo; Galmarini, Daniel; Lu, Weiguo

    2014-01-01

    dose verification system that quantifies treatment doses, and provides necessary information for adaptive planning without impeding clinical workflows.

  2. Risk-Informed Monitoring, Verification and Accounting (RI-MVA). An NRAP White Paper Documenting Methods and a Demonstration Model for Risk-Informed MVA System Design and Operations in Geologic Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Unwin, Stephen D.; Sadovsky, Artyom; Sullivan, E. C.; Anderson, Richard M.

    2011-09-30

    This white paper accompanies a demonstration model that implements methods for the risk-informed design of monitoring, verification and accounting (RI-MVA) systems in geologic carbon sequestration projects. The intent is that this model will ultimately be integrated with, or interfaced with, the National Risk Assessment Partnership (NRAP) integrated assessment model (IAM). The RI-MVA methods described here apply optimization techniques in the analytical environment of NRAP risk profiles to allow systematic identification and comparison of the risk and cost attributes of MVA design options.

  3. Fission Meter Information Barrier Attribute Measurement System: Task 1 Report: Document existing Fission Meter neutron IB system

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, P. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-28

    An SNM attribute Information Barrier (IB) system was developed for a 2011 US/UK Exercise. The system was modified and extensively tested in a 2013-2014 US-UK Measurement Campaign. This work demonstrated rapid deployment of an IB system for potential treaty use. The system utilizes an Ortec Fission Meter neutron multiplicity counter and custom computer code. The system demonstrates a proof-of-principle automated Pu-240 mass determination with an information barrier. After a software start command is issued, the system automatically acquires and downloads data, performs an analysis, and displays the results. This system conveys the results of Pu mass threshold measurements in a way that does not reveal sensitive information. In full IB mode, only red/green ‘lights’ are displayed in the software. In test mode, more detailed information is displayed. The code can also read in, analyze, and display results from previously acquired or simulated data. Because the equipment is commercial-off-the-shelf (COTS), the system demonstrates a low-cost, short-lead-time technology for treaty SNM attribute measurements. A deployed system will likely require integration of additional authentication and tamper-indicating technologies. This will be discussed for the project in this and future progress reports.
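
    The pass/fail display described above can be caricatured in a few lines (illustrative only: the threshold, the analysis step, and the direction of the comparison are assumptions, and a real information barrier enforces this separation with certified hardware and software rather than a script):

        # Sketch of an information-barrier style attribute check: the inferred mass
        # never leaves this function; only a red/green result is reported.
        PU240_MASS_THRESHOLD_G = 500.0          # assumed, unclassified threshold value

        def pu240_effective_mass(singles, doubles, triples):
            """Placeholder for the multiplicity analysis inferring Pu-240 effective mass."""
            return 0.01 * doubles               # stand-in relation, not a real calibration

        def attribute_check(singles, doubles, triples):
            mass_g = pu240_effective_mass(singles, doubles, triples)
            return "GREEN" if mass_g >= PU240_MASS_THRESHOLD_G else "RED"

        # Only the binary attribute result is ever shown to the inspector.
        print(attribute_check(singles=1.2e5, doubles=6.4e4, triples=9.0e3))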

  4. Guidelines for the verification and validation of expert system software and conventional software: Volume 5, Rationale and description of verification and validation guideline packages and procedures. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by US NRC and EPRI toward formulating guidelines for V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves (and the accompanying 11 step-by-step procedures) are presented in Volume 7, User's Manual. Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation), (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software), and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, which together define three Classes). A V&V guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  5. Development of prompt gamma measurement system for in vivo proton beam range verification

    International Nuclear Information System (INIS)

    Min, Chul Hee

    2011-02-01

    In radiation therapy, most research has focused on reducing unnecessary radiation dose to normal tissues and critical organs around the target tumor volume. Proton therapy is considered to be one of the most promising radiation therapy methods with its physical characteristics in the dose distribution, delivering most of the dose just before protons come to rest at the so-called Bragg peak; that is, proton therapy allows for a very high radiation dose to the tumor volume, effectively sparing adjacent critical organs. However, the uncertainty in the location of the Bragg peak, coming from not only the uncertainty in the beam delivery system and the treatment planning method but also anatomical changes and organ motions of a patient, could be a critical problem in proton therapy. In spite of the importance of in vivo dose verification to prevent misapplication of the Bragg peak and to guarantee both successful treatment and patient safety, there is no practical methodology for monitoring the in vivo dose distribution; only a few attempts have been made so far. The present dissertation suggests the prompt gamma measurement method for monitoring the in vivo proton dose distribution during treatment. As a key part of the process of establishing the utility of this method, the verification of the clear relationship between the prompt gamma distribution and the proton dose distribution was accomplished by means of Monte Carlo simulations and experimental measurements. First, the physical properties of prompt gammas were investigated on the basis of cross-section data and Monte Carlo simulations. Prompt gammas are generated mainly from proton-induced nuclear interactions, and then emitted isotropically in less than 10^-9 s at energies up to 10 MeV. Simulation results for the prompt gamma yield of the major elements of a human body show that within the optimal energy range of 4-10 MeV the highest number of prompt gammas is generated from oxygen, whereas over the

  6. The Integrated Safety Management System Verification Enhancement Review of the Plutonium Finishing Plant (PFP)

    International Nuclear Information System (INIS)

    BRIGGS, C.R.

    2000-01-01

    The primary purpose of the verification enhancement review was for the DOE Richland Operations Office (RL) to verify contractor readiness for the independent DOE Integrated Safety Management System Verification (ISMSV) on the Plutonium Finishing Plant (PFP). Secondary objectives included: (1) to reinforce the engagement of management and to gauge management commitment and accountability; (2) to evaluate the "value added" benefit of direct public involvement; (3) to evaluate the "value added" benefit of direct worker involvement; (4) to evaluate the "value added" benefit of the panel-to-panel review approach; and (5) to evaluate the utility of the review's methodology/adaptability to periodic assessments of ISM status. The review was conducted on December 6-8, 1999, and involved the conduct of two-hour interviews with five separate panels of individuals with various management and operations responsibilities related to PFP. A semi-structured interview process was employed by a team of five "reviewers" who directed open-ended questions to the panels which focused on: (1) evidence of management commitment, accountability, and involvement; and (2) consideration and demonstration of stakeholder (including worker) information and involvement opportunities. The purpose of a panel-to-panel dialogue approach was to better spotlight: (1) areas of mutual reinforcement and alignment that could serve as good examples of the management commitment and accountability aspects of ISMS implementation, and (2) areas of potential discrepancy that could provide opportunities for improvement. In summary, the Review Team found major strengths to include: (1) the use of multi-disciplinary project work teams to plan and do work; (2) the availability and broad usage of multiple tools to help with planning and integrating work; (3) senior management presence and accessibility; (4) the institutionalization of worker involvement; (5) encouragement of self-reporting and self

  7. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    Science.gov (United States)

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-12-01

    The Microgravity active vibration isolation system (MAIS) is a device to reduce on-orbit vibration and to provide a lower gravity level for certain scientific experiments. The MAIS is made up of a stator and a floater: the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to reduce the vibration transmitted from the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits, and a central controller running the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module handles electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from the accelerometers, measures the displacement between stator and floater from position sensitive detectors, and computes the Lorentz force current for each actuator so as to suppress the vibration of the scientific payload while avoiding collisions between the stator and the floater. This is a motion control technique in 6 degrees of freedom (6-DOF), and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the opportunity to join the DLR 27th parabolic flight campaign and perform experiments to verify the 6-DOF control technique. The experimental results confirm that the 6-DOF motion control technique is effective, and the vibration isolation performance matches what we expected based on theoretical analysis and simulation. The MAIS is planned for use on Chinese manned spacecraft for many microgravity scientific experiments, and the verification on parabolic flights is very important for its upcoming missions. Additionally, we also tested further functions enabled by microgravity electromagnetic suspension, such as automatic capture and locking, and operation in fault mode. The parabolic flights produced much useful data for these experiments.
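
    A minimal sketch of one cycle of such a 6-DOF isolation loop is given below, assuming a proportional-derivative centering term plus an acceleration-feedback term and a least-squares allocation of the commanded wrench to the eight actuators. The gains, actuator geometry matrix, and force constant are invented placeholders, not the MAIS control law.

      import numpy as np

      B = np.random.default_rng(0).normal(size=(6, 8))  # maps 8 actuator forces -> 6-DOF wrench
      KP, KD, KA = 50.0, 10.0, 2.0                      # assumed centering/damping/isolation gains

      def control_cycle(rel_pos, rel_vel, floater_accel):
          """rel_pos, rel_vel: 6-vector stator-floater offset and its rate;
          floater_accel: 6-vector measured acceleration of the floater."""
          # Keep the floater centered (avoid contact with the stator) while
          # attenuating the measured vibration of the payload.
          wrench = -KP * rel_pos - KD * rel_vel - KA * floater_accel
          forces, *_ = np.linalg.lstsq(B, wrench, rcond=None)  # allocate to 8 actuators
          return forces / 4.4                                   # assumed force constant [N/A] -> currents

      print(control_cycle(np.full(6, 1e-3), np.zeros(6), np.full(6, 1e-2)))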

  8. Causal Attributions about Disease-Onset and Relapse in Patients with Systemic Vasculitis

    Science.gov (United States)

    Grayson, Peter C.; Amudala, Naomi A.; McAlear, Carol A.; Leduc, Renée L.; Shereff, Denise; Richesson, Rachel; Fraenkel, Liana; Merkel, Peter A.

    2014-01-01

    Objectives Patients vary in their beliefs related to the cause of serious illness. The impact of these beliefs among patients with systemic vasculitis is not known. This study aimed to describe causal attributions about disease-onset and relapse in systemic vasculitis and to examine whether causal beliefs a) differ by type of vasculitis; and b) are associated with negative health outcomes. Methods Patients with vasculitis were recruited to complete an online questionnaire. Categories of causal beliefs were assessed with the Revised Illness Perception Questionnaire (IPQ-R). Differences in beliefs about disease-onset versus relapse were compared across different forms of vasculitis. Causal beliefs were assessed in association with several health outcomes including fatigue, functional impairments, and personal understanding of the condition. Results 692 patients representing 9 forms of vasculitis completed the questionnaire. The majority (90%) of patients had beliefs about the cause of their illness. Causal attributions were highly variable, but altered immunity and stress were the most commonly agreed upon causal beliefs. Frequencies of causal beliefs were strikingly similar across different forms of vasculitis, with few notable exceptions primarily in Behçet’s disease. Beliefs differed about causes of disease-onset versus relapse. Specific beliefs about disease-onset and relapse were weakly associated with fatigue, functional impairments, and understanding of the condition. Conclusion Patient beliefs related to the cause of systemic vasculitis are highly variable. Patterns of causal beliefs are associated with important negative health outcomes. Clinicians who care for patients with vasculitis should be mindful of these associations and consider asking about patients’ causal beliefs. PMID:24634202

  9. Integration of SPICE with TEK LV500 ASIC Design Verification System

    Directory of Open Access Journals (Sweden)

    A. Srivastava

    1996-01-01

    The present work involves integration of the simulation stage of VLSI circuit design with its testing stage. The SPICE simulator, the TEK LV500 ASIC Design Verification System, and TekWaves, a test program generator for the LV500, were integrated. A software interface in the 'C' language, in a UNIX 'Solaris 1.x' environment, has been developed between SPICE and the testing tools (TekWaves and LV500). The software interface serves multiple functions. It takes input from either SPICE2G.6 or SPICE 3e.1, and the output it generates can be given as input to either TekWaves or LV500. A graphical user interface has also been developed with OpenWindows using the XView toolkit on a Sun workstation. As an example, a two-phase clock generator circuit has been considered and the usefulness of the software demonstrated. The interface software can be easily linked with VLSI design tools such as the MAGIC layout editor.

  10. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  11. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform non-regression tests, which are needed repeatedly during the development process of a thermal-hydraulic system code such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify whether, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. Because the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and human resources and to prevent wasting time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).
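
    The tool described here was written in VBA for Excel; the sketch below expresses the same non-regression idea in Python for illustration: results from a reference run and a candidate run are read from files, and any value that drifts beyond a tolerance is reported. The file format, paths, and tolerance are assumptions.

      import csv, sys

      def read_results(path):
          """Read a CSV of (variable, value) pairs produced by a code run."""
          with open(path, newline="") as f:
              return {row[0]: float(row[1]) for row in csv.reader(f) if row}

      def non_regression(reference_csv, candidate_csv, rel_tol=1e-6):
          ref, cand = read_results(reference_csv), read_results(candidate_csv)
          failures = []
          for key, ref_val in ref.items():
              new_val = cand.get(key)
              if new_val is None or abs(new_val - ref_val) > rel_tol * max(abs(ref_val), 1e-30):
                  failures.append((key, ref_val, new_val))
          return failures

      if __name__ == "__main__":
          diffs = non_regression(sys.argv[1], sys.argv[2])
          for key, old, new in diffs:
              print(f"REGRESSION {key}: {old} -> {new}")
          sys.exit(1 if diffs else 0)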

  12. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  13. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    International Nuclear Information System (INIS)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform non-regression tests, which are needed repeatedly during the development process of a thermal-hydraulic system code such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify whether, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. Because the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and human resources and to prevent wasting time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).

  14. Verification of the computational dosimetry system in JAERI (JCDS) for boron neutron capture therapy

    International Nuclear Information System (INIS)

    Kumada, H; Yamamoto, K; Matsumura, A; Yamamoto, T; Nakagawa, Y; Nakai, K; Kageji, T

    2004-01-01

    Clinical trials for boron neutron capture therapy (BNCT) using the medical irradiation facility installed in Japan Research Reactor No. 4 (JRR-4) at the Japan Atomic Energy Research Institute (JAERI) have been performed since 1999. To carry out the BNCT procedure based on proper treatment planning and its precise implementation, the JAERI computational dosimetry system (JCDS), which is applicable to dose planning, has been developed at JAERI. The aim of this study was to verify the performance of JCDS. The experimental data obtained with a cylindrical water phantom were compared with the calculation results using JCDS. Measurement data obtained from IOBNCT cases at JRR-4 were also compared with retrospective evaluation data from JCDS. In comparison with the phantom experiments, the calculations and the measurements for thermal neutron flux and gamma-ray dose were in good agreement, except at the surface of the phantom. Compared with the measurements for clinical cases, the discrepancy in the JCDS calculations was approximately 10%. These basic and clinical verifications demonstrated that JCDS has sufficient performance for BNCT dosimetry. Further investigations are recommended to obtain more precise dose distributions and a faster calculation environment.

  15. Fault-specific verification (FSV) - An alternative VV&T strategy for high-reliability nuclear software systems

    International Nuclear Information System (INIS)

    Miller, L.A.

    1994-01-01

    The author puts forth an argument that digital instrumentation and control systems can be safely applied in the nuclear industry, but that this will require changes to the way software for such systems is developed and tested. He argues for a fault-specific verification procedure to be applied to software development. This plan includes enumerating and classifying all software faults at all levels of the product, over the whole development process; while collecting these data, developing and validating different methods for software verification, validation and testing, and applying them against all the detected faults; forcing all of this development toward an automated testing product; and continuing to develop, expand, test, and share these testing methods across a wide array of software products.

  16. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal design, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  17. Quantum Mechanics and locality in the K⁰ anti-K⁰ system: experimental verification possibilities

    Energy Technology Data Exchange (ETDEWEB)

    Muller, A.

    1994-11-01

    It is shown that elementary Quantum Mechanics, applied to the K⁰ anti-K⁰ system, predicts peculiar long-range EPR correlations. Possible experimental verifications are discussed, and a concrete experiment with antiproton annihilations at rest is proposed. A pedestrian approach to local models shows that K⁰ anti-K⁰ experimentation could provide arguments to the local realism versus quantum theory controversy. (author). 17 refs., 23 figs.

  18. Accelerating SystemVerilog UVM Based VIP to Improve Methodology for Verification of Image Signal Processing Designs Using HW Emulator

    OpenAIRE

    Jain, Abhishek; Gupta, Piyush Kumar; Gupta, Dr. Hima; Dhar, Sachish

    2014-01-01

    In this paper we present the development of Acceleratable UVCs from standard UVCs in SystemVerilog and their usage in a UVM-based verification environment for Image Signal Processing designs to increase run-time performance. This paper covers the development of Acceleratable UVCs from standard UVCs for internal control and data buses of the ST imaging group by partitioning transaction-level components and cycle-accurate signal-level components between the software simulator and hardware accelerator r...

  19. Hybrid attribute-based recommender system for learning material using genetic algorithm and a multidimensional information model

    Directory of Open Access Journals (Sweden)

    Mojtaba Salehi

    2013-03-01

    In recent years, the explosion of learning materials in web-based educational systems has made it difficult for learners to locate appropriate learning materials. Personalized recommendation is an enabling mechanism to overcome the information overload occurring in these new learning environments and deliver suitable materials to learners. Since users express their opinions based on some specific attributes of items, this paper proposes a hybrid recommender system for learning materials based on their attributes to improve the accuracy and quality of recommendation. The presented system has two main modules: an explicit attribute-based recommender and an implicit attribute-based recommender. In the first module, the weights of implicit or latent attributes of materials for a learner are treated as chromosomes in a genetic algorithm, which optimizes the weights according to historical ratings; recommendations are then generated by a Nearest Neighborhood Algorithm (NNA) using the optimized weight vectors of implicit attributes that represent the opinions of learners. In the second module, a preference matrix (PM) is introduced that can model the interests of a learner based on explicit attributes of learning materials in a multidimensional information model. A new similarity measure between PMs is then introduced and recommendations are generated by the NNA. The experimental results show that the proposed method outperforms current algorithms on accuracy measures and can alleviate problems such as cold-start and sparsity.
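
    A toy Python sketch of the first module's idea follows: attribute-weight vectors act as chromosomes, a small genetic algorithm fits them to historical ratings, and weighted nearest neighbours then produce recommendations. The data, GA operators, and fitness function are simplified placeholders, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(1)
      items = rng.random((30, 5))                       # 30 materials x 5 latent attributes
      ratings = rng.integers(1, 6, 30).astype(float)    # one learner's historical ratings

      def predict(weights):
          score = items @ weights
          return 1 + 4 * (score - score.min()) / (np.ptp(score) + 1e-12)  # map to 1-5 range

      def fitness(weights):
          return -np.mean((predict(weights) - ratings) ** 2)   # negative MSE

      pop = rng.random((20, 5))                         # weight vectors = "chromosomes"
      for _ in range(50):
          order = np.argsort([fitness(w) for w in pop])
          parents = pop[order][-10:]                    # keep the best half
          mates = parents[rng.permutation(10)]
          mask = rng.random((10, 5)) < 0.5
          children = np.where(mask, parents, mates)     # uniform crossover
          children += rng.normal(0.0, 0.05, children.shape)   # mutation
          pop = np.vstack([parents, children])

      best = pop[int(np.argmax([fitness(w) for w in pop]))]

      def recommend(seen_item, k=3):
          """Nearest unseen items in the weighted attribute space."""
          dists = np.linalg.norm((items - items[seen_item]) * best, axis=1)
          return np.argsort(dists)[1:k + 1]

      print("weights:", np.round(best, 2), "recommend after item 0:", recommend(0))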

  20. System for verification in situ of current transformers in high voltage substations; Sistema para verificacao in situ de transformadores de corrente em substacoes de alta tensao

    Energy Technology Data Exchange (ETDEWEB)

    Mendonca, Pedro Henrique; Costa, Marcelo M. da; Dahlke, Diogo B.; Ikeda, Minoru [LACTEC - Instituto de Tecnologia para o Desenvolvimento, Curitiba, PR (Brazil)], Emails: pedro.henrique@lactec.org.br, arinos@lactec.org.br, diogo@lactec.org.br, minoru@lactec.org.br, Celso.melo@copel.com; Carvalho, Joao Claudio D. de [ELETRONORTE, Belem, PR (Brazil)], E-mail: marcelo.melo@eln.gov.br; Teixeira Junior, Jose Arinos [ELETROSUL, Florianopolis, SC (Brazil)], E-mail: jclaudio@eletrosul.gov.br; Melo, Celso F. [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)], E-mail: Celso.melo@copel.com

    2009-07-01

    This work presents an alternative proposal for performing the calibration of conventional current transformers in the field, using a verification system composed of an optical current transformer as a reference standard, suitable for installation on extra-high-voltage buses.

  1. Clinical commissioning of an in vivo range verification system for prostate cancer treatment with anterior and anterior oblique proton beams

    Science.gov (United States)

    Hoesl, M.; Deepak, S.; Moteabbed, M.; Jassens, G.; Orban, J.; Park, Y. K.; Parodi, K.; Bentefour, E. H.; Lu, H. M.

    2016-04-01

    The purpose of this work is the clinical commissioning of a recently developed in vivo range verification system (IRVS) for treatment of prostate cancer with anterior and anterior oblique proton beams. The IRVS is designed to perform a complete workflow for pre-treatment range verification and adjustment. It contains specifically designed dosimetry and electronic hardware and dedicated software for workflow control, with database connections to the treatment and imaging systems. An essential part of the IRVS is an array of Si-diode detectors, designed to be mounted on the endorectal water balloon routinely used for prostate immobilization. The diodes measure dose rate as a function of time, from which the water equivalent path length (WEPL) and the dose received are extracted. The former is used for pre-treatment beam range verification and correction, if necessary, while the latter monitors the dose delivered to the patient rectum during treatment and serves as an additional verification. The entire IRVS workflow was tested for anterior and 30-degree inclined proton beams in both solid water and anthropomorphic pelvic phantoms, with the measured WEPL and rectal doses compared to the treatment plan. Gafchromic films were also used to measure the rectal dose and compared to the IRVS results. The WEPL measurement accuracy was on the order of 1 mm, and after beam range correction the doses received by the rectal wall were within 1.6% and 0.4% of the treatment plan, respectively, for the anterior and anterior oblique fields. We believe the implementation of the IRVS would make the treatment of the prostate with anterior proton beams more accurate and reliable.

  2. Portal verification using the KODAK ACR 2000 RT storage phosphor plate system and EC® films. A semiquantitative comparison

    International Nuclear Information System (INIS)

    Geyer, P.; Blank, H.; Alheit, H.

    2006-01-01

    Background and Purpose: The suitability of the storage phosphor plate system ACR 2000 RT (Eastman Kodak Corp., Rochester, NY, USA), which is intended for portal verification as well as for portal simulation imaging in radiotherapy, had to be proven by comparison with a highly sensitive verification film. Material and Methods: The comparison included portal verification images of different regions (head and neck, thorax, abdomen, and pelvis) irradiated with 6- and 15-MV photons and electrons. Each portal verification image was acquired with both the storage screen and the EC® film, using EC-L® cassettes (both: Eastman Kodak Corp., Rochester, NY, USA) for the two systems. The soft-tissue and bony contrast and the brightness were evaluated and compared in a ranking of the two compared images. Different phantoms were irradiated to investigate the high- and low-contrast resolution. To account for quality assurance applications, the short-time exposure of the unpacked and irradiated storage screen by green and red room lasers was also investigated. Results: In general, the quality of the processed ACR images was slightly higher than that of the films, mostly due to cases of insufficient film exposure. The storage screen was able to verify electron portals even for low electron energies with only minor photon contamination. The laser lines were sharply and clearly visible on the ACR images. Conclusion: The ACR system may replace film without any noticeable decrease in image quality, thereby reducing processing time, saving film costs, and avoiding incorrect exposures. (orig.)

  3. Verification and validation of the THYTAN code for the graphite oxidation analysis in the HTGR systems

    International Nuclear Information System (INIS)

    Shimazaki, Yosuke; Isaka, Kazuyoshi; Nomoto, Yasunobu; Seki, Tomokazu; Ohashi, Hirofumi

    2014-12-01

    The analytical models for the evaluation of graphite oxidation were implemented into the THYTAN code, which employs a mass balance and a node-link computational scheme to evaluate tritium behavior in High Temperature Gas-cooled Reactor (HTGR) systems for hydrogen production, in order to analyze graphite oxidation during air or water ingress accidents in HTGR systems. This report describes the analytical models of the THYTAN code in terms of the graphite oxidation analysis and its verification and validation (V&V) results. Mass transfer from the gas mixture in the coolant channel to the graphite surface, diffusion in the graphite, graphite oxidation by air or water, chemical reaction, and release from the primary circuit to the containment vessel by a safety valve were modeled to calculate the mass balance in the graphite and in the gas mixture in the coolant channel. The computed solutions from the THYTAN code for simple problems were compared to analytical results obtained by hand calculation to verify the algorithms for each implemented analytical model. A representative graphite oxidation experiment was analyzed using the THYTAN code, and the results were compared to the experimental data and to computed solutions from the GRACE code, which was used for the safety analysis of the High Temperature Engineering Test Reactor (HTTR), with regard to graphite corrosion depth and oxygen concentration at the outlet of the test section, in order to validate the analytical models of the THYTAN code. The comparison of the THYTAN code results with the analytical solutions, experimental data and GRACE code results showed good agreement. (author)
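
    To make the node-link mass-balance idea concrete, the toy Python step below advances a single node in which oxygen is transferred from the coolant gas to the graphite surface and consumed by oxidation (C + O2 -> CO2). The rate constants, node volumes, and Arrhenius form are illustrative assumptions and do not represent the THYTAN models.

      import math

      def step(state, dt, T=1200.0):
          """state = (n_o2_bulk, n_o2_surf, m_graphite); amounts in mol, mass in kg."""
          n_bulk, n_surf, m_c = state
          V_bulk, V_surf = 10.0, 0.1             # node volumes [m^3] (assumed)
          k_m, area = 0.02, 5.0                  # mass-transfer coeff. [m/s], surface [m^2]
          k_ox = 1.0e3 * math.exp(-2.0e4 / T)    # assumed Arrhenius oxidation constant [1/s]

          transfer = k_m * area * (n_bulk / V_bulk - n_surf / V_surf)   # mol O2 / s
          reaction = k_ox * n_surf                                      # mol O2 / s
          n_bulk -= dt * transfer
          n_surf += dt * (transfer - reaction)
          m_c -= dt * reaction * 0.012           # 12 g of carbon consumed per mol O2
          return n_bulk, max(n_surf, 0.0), m_c

      state = (50.0, 0.0, 1000.0)                # initial O2 inventory [mol], graphite [kg]
      for _ in range(1000):
          state = step(state, dt=0.1)
      print(state)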

  4. Integration and Interoperability: An Analysis to Identify the Attributes for System of Systems

    Science.gov (United States)

    2008-09-01

    divisions of the enterprise. Examples of the current I2 are: (a) a nightly feed of elearning information is captured through an automated and ... standardized process throughout the enterprise, and (b) the LMS has been integrated with SkillSoft, a third-party elearning software system (http...). ... Command (JITC) is responsible for testing all programs that utilize standard interfaces to specific global nets or systems. Many times programs that

  5. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example of a life cycle development process and V&V of FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence and ensure the reliability of a nuclear FPGA-based safety system, disciplined life cycle processes for specification and design implementation, as well as verification and validation (V&V) against regulations, are needed. A specific example of how to conduct the life cycle development process and V&V of an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for a PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and to provide input data to the under-test FPGA-based CHR protection system and a verified C code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.
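
    The comparison step of such an acceptance test can be pictured as follows: the same scenario samples are applied to the device under test and to a verified reference function, and any trip-decision mismatch is flagged. The signal names, trip criterion, and the stub standing in for the hardware read-back are assumptions, not the actual CPR1000 CHR logic.

      # Illustrative acceptance-test comparison harness (not the real CHR logic).
      def reference_chr_trip(flow_fraction, delta_t_sat):
          """Verified reference: trip on low core flow or lost subcooling margin."""
          return flow_fraction < 0.9 or delta_t_sat < 10.0

      def fpga_chr_trip(flow_fraction, delta_t_sat):
          """Stand-in for reading back the trip output of the FPGA under test."""
          return flow_fraction < 0.9 or delta_t_sat < 10.0

      def run_campaign(scenario):
          return [(i, s) for i, s in enumerate(scenario)
                  if reference_chr_trip(*s) != fpga_chr_trip(*s)]

      scenario = [(1.0, 25.0), (0.95, 12.0), (0.85, 12.0), (0.95, 8.0)]  # (flow, dT_sat)
      print("mismatches:", run_campaign(scenario))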

  6. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of the FMCT verification provisions. This paper explores the general concerns about FMCT verification and demonstrates what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures, such as environmental sampling, might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  7. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    International Nuclear Information System (INIS)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J; Watkins, W

    2016-01-01

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them with EPID-acquired 1024×768 resolution frames captured at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and using images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between the reference and the measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm², and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software, with

  8. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States); Watkins, W

    2016-06-15

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them with EPID-acquired 1024×768 resolution frames captured at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and using images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between the reference and the measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm², and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software, with
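
    A simplified sketch of the per-frame checks described in these abstracts is given below: given a predicted EPID image and a measured frame, it flags fluence outside the expected aperture, missing fluence inside it, and a gross geometric shift estimated from mask centroids. The thresholds and the centroid-based shift estimate are illustrative simplifications of the published method.

      import numpy as np

      def verify_frame(predicted, measured, open_frac=0.1, area_tol_px=50):
          aperture = predicted > open_frac * predicted.max()         # expected open field
          hot = measured > open_frac * max(measured.max(), 1e-9)     # where the beam actually is

          out_of_field_px = int(np.count_nonzero(hot & ~aperture))
          missing_in_field_px = int(np.count_nonzero(aperture & ~hot))

          def centroid(mask):
              idx = np.argwhere(mask)
              return idx.mean(axis=0) if idx.size else np.zeros(2)
          shift_px = float(np.linalg.norm(centroid(hot) - centroid(aperture)))

          ok = (out_of_field_px < area_tol_px and
                missing_in_field_px < area_tol_px and shift_px < 2.0)
          return ok, {"out_px": out_of_field_px, "miss_px": missing_in_field_px, "shift_px": shift_px}

      pred = np.zeros((384, 512)); pred[150:250, 200:300] = 1.0
      print(verify_frame(pred, pred * 0.97))                # matching frame -> OK
      print(verify_frame(pred, np.roll(pred, 5, axis=1)))   # shifted frame -> flagged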

  9. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    Energy Technology Data Exchange (ETDEWEB)

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.; Zuffranieri, Jason V.; Paananen, Orman Henrie; Jones, Scott A.; Ortner, Juergen G. (DLR, German Aerospace, Cologne); Brewer, Jeffrey D.; Valdez, Maximo M.

    2005-10-01

    This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.
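
    A weighted-sum scoring scheme of the kind such a computer aid might use can be sketched in a few lines; the options, criteria, scores, and weights below are invented placeholders rather than values from the study.

      # Toy multi-criteria comparison of generation options (all numbers invented).
      criteria_weights = {"cost": 0.4, "emissions": 0.3, "security": 0.2, "policy_fit": 0.1}

      options = {
          "gas_cc":  {"cost": 8, "emissions": 4, "security": 5, "policy_fit": 7},
          "nuclear": {"cost": 5, "emissions": 9, "security": 8, "policy_fit": 5},
          "wind":    {"cost": 6, "emissions": 9, "security": 6, "policy_fit": 8},
      }

      def score(attrs):
          # Scores are on a 0-10 scale where higher is better for every criterion.
          return sum(criteria_weights[c] * attrs[c] for c in criteria_weights)

      for name, attrs in sorted(options.items(), key=lambda kv: -score(kv[1])):
          print(f"{name:8s} {score(attrs):.2f}")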

  10. Treatment verification system in radiotherapy using a digital portal imaging device. Comparison with screen/film systems

    International Nuclear Information System (INIS)

    Nakata, Manabu; Komai, Yoshinori; Okada, Takashi; Fukumoto, Satoshi; Chadani, Kazuma; Nohara, Hiroki; Kazusa, Chudou.

    1994-01-01

    A digital portal imaging (DPI) system for megavoltage photon beams was installed recently in our department. The purpose of this study is to evaluate the image quality of this system. We analyzed the following properties of the system: the relationship between measured dose rate and DPI pixel values, spatial resolution, detectability of low-contrast objects, and detectability of setup errors. The results were compared with those of conventional screen-film systems. As a result, the relationship between the measured dose rate and the pixel value of the DPI was found to be linear in the dose-rate range between 100 and 400 cGy/min. Spatial resolution was 1.25 and 0.5 mm for the DPI and the screen-film systems, respectively. The slope of the contrast-detail curves differed between the DPI and the screen-film systems; the contrast thresholds were 0.6% and 0.3% for the DPI and the screen-film systems, respectively. The detectability of setup errors of 1 mm and 2 mm with the DPI was lower than that of the screen-film systems, although the difference was not very significant. In conclusion, the image quality of the DPI at the present time is slightly inferior to that of the conventional screen-film systems. However, notable advantages of the DPI system are that any positional changes in patients during irradiation can be detected very quickly, and that quantitative analysis of the setup variation can be obtained. The image quality of the DPI will improve as the technology advances. Therefore, this verification system using the DPI device is expected to be used for clinical radiation therapy in the future. (author)

  11. Microbiological and faunal soil attributes of coffee cultivation under different management systems in Brazil

    Directory of Open Access Journals (Sweden)

    D. R. Lammel

    Brazil is the biggest coffee producer in the world and different plantation management systems have been applied to improve sustainability and soil quality. Little is known about the environmental effects of these different management systems, therefore, the goal of this study was to use soil biological parameters as indicators of changes. Soils from plantations in Southeastern Brazil with conventional (CC), organic (OC) and integrated management systems containing intercropping of Brachiaria decumbens (IB) or Arachis pintoi (IA) were sampled. Total organic carbon (TOC), microbial biomass carbon (MBC) and nitrogen (MBN), microbial activity (C-CO2), metabolic quotient (qCO2), the enzymes dehydrogenase, urease, acid phosphatase and arylsulphatase, arbuscular mycorrhizal fungi (AMF) colonization and number of spores and soil fauna were evaluated. The greatest difference between the management systems was seen in soil organic matter content. The largest quantity of TOC was found in the OC, and the smallest was found in IA. TOC content influenced soil biological parameters. The use of all combined attributes was necessary to distinguish the four systems. Each management presented distinct faunal structure, and the data obtained with the trap method was more reliable than the TSBF (Tropical Soils) method. A canonic correlation analysis showed that Isopoda was correlated with TOC and the most abundant order with OC. Isoptera was the most abundant faunal order in IA and correlated with MBC. Overall, OC had higher values for most of the biological measurements and higher populations of Oligochaeta and Isopoda, corroborating with the concept that the OC is a more sustainable system.

  12. Microbiological and faunal soil attributes of coffee cultivation under different management systems in Brazil.

    Science.gov (United States)

    Lammel, D R; Azevedo, L C B; Paula, A M; Armas, R D; Baretta, D; Cardoso, E J B N

    2015-11-01

    Brazil is the biggest coffee producer in the world and different plantation management systems have been applied to improve sustainability and soil quality. Little is known about the environmental effects of these different management systems, therefore, the goal of this study was to use soil biological parameters as indicators of changes. Soils from plantations in Southeastern Brazil with conventional (CC), organic (OC) and integrated management systems containing intercropping of Brachiaria decumbens (IB) or Arachis pintoi (IA) were sampled. Total organic carbon (TOC), microbial biomass carbon (MBC) and nitrogen (MBN), microbial activity (C-CO2), metabolic quotient (qCO2), the enzymes dehydrogenase, urease, acid phosphatase and arylsulphatase, arbuscular mycorrhizal fungi (AMF) colonization and number of spores and soil fauna were evaluated. The greatest difference between the management systems was seen in soil organic matter content. The largest quantity of TOC was found in the OC, and the smallest was found in IA. TOC content influenced soil biological parameters. The use of all combined attributes was necessary to distinguish the four systems. Each management presented distinct faunal structure, and the data obtained with the trap method was more reliable than the TSBF (Tropical Soils) method. A canonic correlation analysis showed that Isopoda was correlated with TOC and the most abundant order with OC. Isoptera was the most abundant faunal order in IA and correlated with MBC. Overall, OC had higher values for most of the biological measurements and higher populations of Oligochaeta and Isopoda, corroborating with the concept that the OC is a more sustainable system.

  13. Comparison of monitor units calculated by radiotherapy treatment planning system and an independent monitor unit verification software.

    Science.gov (United States)

    Sellakumar, P; Arun, C; Sanjay, S S; Ramesh, S B

    2011-01-01

    In radiation therapy, the monitor units (MU) needed to deliver a treatment plan are calculated by treatment planning systems (TPS). An essential part of quality assurance is to verify the MU with an independent monitor unit calculation to correct any potential errors prior to the start of treatment. In this study, we have compared the MU calculated by the TPS and by independent MU verification software. The MU verification software was commissioned and tested for data integrity to ensure that the correct beam data were considered for MU calculations. The accuracy of the calculations was tested by creating a series of test plans and comparing them with ion chamber measurements. The results show that there is good agreement between the two. The MU difference (MUdiff) between the monitor unit calculations of the TPS and the independent MU verification system was calculated for 623 fields from 245 patients and was analyzed by treatment site for head & neck, thorax, breast, abdomen and pelvis. A mean MUdiff of -0.838% with a standard deviation of 3.04% was observed over all 623 fields. The site-specific standard deviation of MUdiff was as follows: abdomen and pelvis (<1.75%), head & neck (2.5%), thorax (2.32%) and breast (6.01%). The disparities were analyzed and different correction methods were used to reduce the disparity. © 2010 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
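
    The bookkeeping behind such a comparison can be illustrated as follows, assuming MUdiff is expressed as a percentage of the TPS value; the example records and site labels are invented.

      import statistics
      from collections import defaultdict

      def mu_diff_percent(mu_tps, mu_indep):
          return 100.0 * (mu_indep - mu_tps) / mu_tps

      def per_site_stats(fields):
          """fields: iterable of (site, MU_tps, MU_independent) records."""
          by_site = defaultdict(list)
          for site, mu_tps, mu_indep in fields:
              by_site[site].append(mu_diff_percent(mu_tps, mu_indep))
          return {site: (statistics.mean(v), statistics.pstdev(v)) for site, v in by_site.items()}

      fields = [("pelvis", 100.0, 99.2), ("pelvis", 150.0, 151.1),
                ("head_neck", 120.0, 117.5), ("breast", 80.0, 84.0)]
      print(per_site_stats(fields))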

  14. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  15. In situ object counting system (ISOCS™) technique: A cost-effective tool for NDA verification in IAEA Safeguards

    International Nuclear Information System (INIS)

    Nizhnik, V.; Belian, A.; Shephard, A.; Lebrun, A.

    2011-01-01

    Nuclear material measurements using the ISOCS technique are playing an increasing role in IAEA verification activities. The ISOCS capabilities include: a high sensitivity to the presence of U and Pu; the ability to detect very small amounts of material; and the ability to measure items of different shapes and sizes. In addition, the numerical absolute efficiency calibration of a germanium detector used in the technique does not require any calibration standards or reference materials. The ISOCS modelling software performs an absolute efficiency calibration for items with various container shapes, container wall materials, material compositions, material fill-heights, U/Pu weight fractions and even heterogeneously distributed emitting materials. In a number of cases, some key parameters, such as the matrix density and U/Pu weight fraction, can be determined in addition to the emitting material mass and isotopic composition. These capabilities provide a verification solution suitable for a majority of cases where quantitative and isotopic analysis should be performed. Taking into account these advantages, the technique becomes a cost-effective solution for nuclear material non-destructive assay (NDA) verification. At present, the IAEA uses the ISOCS for a wide range of applications including the quantitative analysis of U scrap materials, U/Pu contaminated solid wastes, U fuel elements, U hold-up materials. Additionally, the ISOCS is also applied to some specific verification cases such as the measurement of PuBe neutron sources and the quantification of fission products in solid wastes. In reprocessing facilities with U/Pu waste compaction or facilities with item re-batching, the continuity-of-knowledge can be assured by applying either video surveillance systems together with seals (requiring attaching/detaching and verification activities for each seal) or verification of operator declarations using quantitative measurements for items selected on a random basis

  16. Runtime Instrumentation of SystemC/TLM2 Interfaces for Fault Tolerance Requirements Verification in Software Cosimulation

    Directory of Open Access Journals (Sweden)

    Antonio da Silva

    2014-01-01

    This paper presents the design of a SystemC transaction-level modelling wrapping library that can be used for the assertion of system properties, protocol compliance checking, or fault injection. The library uses C++ virtual table hooks as a dynamic binary instrumentation technique to inline wrappers in the TLM2 transaction path. This technique can be applied after the elaboration phase and needs neither source code modifications nor recompilation of the top-level SystemC modules. The proposed technique has been successfully applied to the robustness verification of the on-board boot software of the Instrument Control Unit of the Solar Orbiter’s Energetic Particle Detector.

  17. TU-FG-BRB-05: A 3 Dimensional Prompt Gamma Imaging System for Range Verification in Proton Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Draeger, E; Chen, H; Polf, J [University of Maryland School of Medicine, Baltimore, MD (United States); Mackin, D; Beddar, S [MD Anderson Cancer Center, Houston, TX (United States); Avery, S [University of Cape Town, Rondebosch (South Africa); Peterson, S

    2016-06-15

    Purpose: To report on the initial development of a clinical 3-dimensional (3D) prompt gamma (PG) imaging system for proton radiotherapy range verification. Methods: The new imaging system under development consists of a prototype Compton camera (CC) to measure PG emission during proton beam irradiation and software to reconstruct, display, and analyze 3D images of the PG emission. For initial tests of the system, PGs were measured with the prototype CC during a 200 cGy dose delivery with clinical proton pencil beams (ranging from 100 to 200 MeV) to a water phantom. Measurements were also carried out with the CC placed 15 cm from the phantom for a full-range 150 MeV pencil beam and with its range shifted by 2 mm. Reconstructed images of the PG emission were displayed by the clinical PG imaging software and compared to the dose distributions of the proton beams calculated by a commercial treatment planning system. Results: Measurements made with the new PG imaging system showed that a 3D image could be reconstructed from PGs measured during the delivery of 200 cGy of dose, and that shifts in the Bragg peak range of as little as 2 mm could be detected. Conclusion: Initial tests of the new PG imaging system show its potential to provide 3D imaging and range verification for proton radiotherapy. Based on these results, we have begun work to improve the system with the goal that images can be produced from delivery of as little as 20 cGy, so that the system could be used for in vivo proton beam range verification on a daily basis.
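
    For context, Compton-camera imaging of the kind described reconstructs each detected prompt gamma from the standard Compton kinematics; the relation below is the textbook form in generic notation and is not quoted from the abstract.

```latex
% Scattering angle recovered from the energies recorded in the camera
% stages (E_0: incident prompt-gamma energy, E_1: scattered photon
% energy, E_e: energy deposited by the recoil electron):
\[
  \cos\theta \;=\; 1 \;-\; m_e c^2 \left(\frac{1}{E_1} - \frac{1}{E_0}\right),
  \qquad
  E_0 \;=\; E_1 + E_e .
\]
% Each event constrains the emission point to a cone of half-angle
% \theta about the scatter axis; intersecting many such cones is what
% yields the reconstructed 3D PG image used for range verification.
```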

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF MICROBIOLOGICAL AND PARTICULATE CONTAMINANTS IN DRINKING WATER : SEPARMATIC™ FLUID SYSTEMS DIATOMACEOUS EARTH PRESSURE TYPE FILTER SYSTEM MODEL 12P-2

    Science.gov (United States)

    The verification test of the Separmatic™ DE Pressure Type Filter System Model 12P-2 was conducted at the UNH Water Treatment Technology Assistance Center (WTTAC) in Durham, New Hampshire. The source water was finished water from the Arthur Rollins Treatment Plant that was pretr...

  19. Dosimetric verification of a dedicated 3D treatment planning system for episcleral plaque therapy

    International Nuclear Information System (INIS)

    Knutsen, Stig; Hafslund, Rune; Monge, Odd R.; Valen, Harald; Muren, Ludvig Paul; Rekstad, Bernt Louni; Krohn, Joergen; Dahl, Olav

    2001-01-01

    Purpose: Episcleral plaque therapy (EPT) is applied in the management of some malignant ocular tumors. A customized configuration of typically 4 to 20 radioactive seeds is fixed in a gold plaque, and the plaque is sutured to the scleral surface corresponding to the base of the intraocular tumor, allowing for localized radiation dose delivery to the tumor. Minimum target doses as high as 100 Gy are directed at malignant tumor sites close to critical normal tissues (e.g., optic disc and macula). Precise dosimetry is therefore fundamental both for tumor dose prescription and for judging the risk of normal tissue toxicity. This paper describes the dosimetric verification of a commercially available dedicated treatment planning system (TPS) for EPT when realistic multiple-seed configurations are applied. Materials and Methods: The TPS Bebig Plaque Simulator is used to plan EPT at our institution. Relative dose distributions in a water phantom, including central-axis depth doses and off-axis dose profiles for three different plaques (the University of Southern California (USC) No. 9 plaque and the Collaborative Ocular Melanoma Study (COMS) 12-mm and 20-mm plaques), were measured with a diode detector. Each plaque was arranged with realistic multiple 125I seed configurations. The measured dose distributions were compared to the corresponding dose profiles calculated with the TPS. All measurements were corrected for the angular sensitivity variation of the diode. Results: Single-seed dose distributions measured with our dosimetry setup agreed with previously published data within 3%. For the three multiple-seed plaque configurations, the measured and calculated dose distributions were in good agreement. For the central-axis depth doses, the agreement was within 4%, whereas deviations up to 11% were observed in single points far off-axis. Conclusions: The Bebig Plaque Simulator is a reliable TPS for calculating relative dose distributions around realistic multiple 125I seed
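
    For reference, dedicated seed-based treatment planning systems of this kind commonly compute dose with the AAPM TG-43 formalism; the 2D line-source form is sketched below in standard notation. The abstract does not state which formalism the Plaque Simulator implements, so this is offered only as background.

```latex
% TG-43 dose rate at a point (r, \theta) around a single seed:
%   S_K            : air-kerma strength of the seed
%   \Lambda        : dose-rate constant
%   G_L(r,\theta)  : line-source geometry function
%   g_L(r)         : radial dose function
%   F(r,\theta)    : 2D anisotropy function
\[
  \dot{D}(r,\theta) \;=\;
  S_K\,\Lambda\,
  \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\,
  g_L(r)\,F(r,\theta),
  \qquad r_0 = 1~\mathrm{cm},\ \theta_0 = 90^{\circ},
\]
% with the total plaque dose obtained by summing such single-seed
% contributions over the 4 to 20 seeds in the configuration.
```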

  20. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)]

    1995-03-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.