WorldWideScience

Sample records for model verification process

  1. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  2. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  3. Mashup Model and Verification Using Mashup Processing Network

    Science.gov (United States)

    Zahoor, Ehtesham; Perrin, Olivier; Godart, Claude

Mashups are lightweight Web applications that aggregate data from different Web services, are built by ad-hoc composition, and are not concerned with long-term stability and robustness. In this paper we present a pattern-based approach, called Mashup Processing Network (MPN). The idea is based on Event Processing Networks and is intended to facilitate the creation, modeling, and verification of mashups. MPN provides a view of how the different actors in mashup development interact, namely the producer, the consumer, the mashup processing agent, and the communication channels. It also supports modeling transformations and validations of data, and offers validation of both functional and non-functional requirements, such as reliable messaging and security, which are key issues in the enterprise context. We have enriched the model with a set of processing operations and categorized them into data composition, transformation, and validation categories. These processing operations can be seen as a set of patterns facilitating the mashup development process. MPN also paves the way for a Mashup Oriented Architecture, where mashups along with services are used as building blocks for application development.

  4. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

Fingerprint changes associated with hand dermatitis are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study was conducted involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model derivation group and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
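The decision rule reported in this abstract can be sketched as a small scoring function; the 25% threshold and the major/minor criteria come from the study, while the function name and encoding are illustrative:

```python
def verification_risk(dystrophy_pct, long_horizontal, long_vertical):
    """Classify fingerprint-verification failure risk per the derived model.

    dystrophy_pct: percent of fingerprint area showing dystrophy
                   (major criterion met at >= 25%)
    long_horizontal, long_vertical: presence of the two minor criteria (bools)
    """
    if dystrophy_pct >= 25:
        return "almost always fails"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:
        return "high risk"
    if minors == 1:
        return "low risk"
    return "almost always passes"
```

The four outcomes map directly to the abstract's statements about the major criterion, both minor criteria, one minor criterion, and no criteria.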

  5. Model-based verification and validation of the SMAP uplink processes

    Science.gov (United States)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increase geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between designing the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow, by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  6. The mediation of mothers' self-fulfilling effects on their children's alcohol use: self-verification, informational conformity, and modeling processes.

    Science.gov (United States)

    Madon, Stephanie; Guyll, Max; Buller, Ashley A; Scherr, Kyle C; Willard, Jennifer; Spoth, Richard

    2008-08-01

This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N1 = 486; N2 = 287), with children's alcohol use as the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. (c) 2008 APA, all rights reserved

  7. The Mediation of Mothers’ Self-Fulfilling Effects on Their Children’s Alcohol Use: Self-Verification, Informational Conformity and Modeling Processes

    Science.gov (United States)

    Madon, Stephanie; Guyll, Max; Buller, Ashley A.; Scherr, Kyle C.; Willard, Jennifer; Spoth, Richard

    2010-01-01

This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N1 = 487; N2 = 287). Children's alcohol use was the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. PMID:18665708

  8. Verification of product quality from process control

    International Nuclear Information System (INIS)

    Drobot, A.; Bunnell, L.R.; Freeborn, W.P.; Macedo, P.B.; Mellinger, G.B.; Pegg, I.L.; Piepel, G.F.; Reimus, M.A.H.; Routt, K.R.; Saad, E.

    1989-01-01

Process models were developed to characterize waste vitrification at West Valley in terms of process operating constraints and achievable glass compositions. The need to verify compliance with the proposed Waste Acceptance Preliminary Specification criteria led to the development of product models, the most critical one being a glass durability model. Both process and product models were used in developing a target composition for the waste glass. This target composition is designed to ensure that glasses made to it will be of acceptable durability after all process variations have been accounted for. 4 refs., 11 figs., 5 tabs

  9. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  10. Experimental verification of the energetic model of the dry mechanical reclamation process

    Directory of Open Access Journals (Sweden)

    R. Dańko

    2008-04-01

The experimental results of the dry mechanical reclamation process, which constituted the basis for verifying the energetic model of this process developed by the author from Rittinger's deterministic hypothesis of the crushing process, are presented in this paper. Reclamation tests were performed on used foundry sands with bentonite, with water-glass from the floster technology, and on used sands with furan FL 105 resin. In the mechanical and mechanical-cryogenic reclamation, a wide range of treatment times and reclamation conditions influencing the intensity of the process, covering all parameters used in industrial devices, was applied. The developed theoretical model constitutes a new tool for selecting optimal reclamation treatment times for a given spent foundry sand at the assumed process intensity, as realized in rotor reclaimers with leaves or rods as grinding elements mounted horizontally on the rotor axis.
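Rittinger's hypothesis, on which the author's energetic model is based, states that comminution energy is proportional to the new surface area created, commonly written as E = C_R (1/d2 - 1/d1) for reduction from grain size d1 to d2. A minimal sketch of that relation (the constant and the sizes below are illustrative, not the paper's values):

```python
def rittinger_energy(c_r, d1, d2):
    """Specific comminution energy for grinding from initial grain size d1
    down to final size d2, per Rittinger's hypothesis:
    E = c_r * (1/d2 - 1/d1), with d2 < d1."""
    return c_r * (1.0 / d2 - 1.0 / d1)
```

Halving the grain size from 0.5 to 0.25 (in arbitrary units, with c_r = 1) gives an energy of 2.0; finer reclamation targets drive the required energy up as 1/d2.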

  11. BProVe: Tool support for business process verification

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

    This demo introduces BProVe, a tool supporting automated verification of Business Process models. BProVe analysis is based on a formal operational semantics defined for the BPMN 2.0 modelling language, and is provided as a freely accessible service that uses open standard formats as input data...

  12. Unified and Modular Modeling and Functional Verification Framework of Real-Time Image Signal Processors

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2016-01-01

In the VLSI industry, image signal processing algorithms are developed and evaluated using software models before implementation of RTL and firmware. After the algorithm is finalized, software models are used as a golden reference for image signal processor (ISP) RTL and firmware development. In this paper, we describe a unified and modular modeling framework for image signal processing algorithms used for different purposes, such as ISP algorithm development, reference for hardware (HW) implementation, reference for firmware (FW) implementation, and bit-true certification. The universal verification methodology (UVM)-based functional verification framework for image signal processors using software reference models is described. Further, IP-XACT-based tools for automatic generation of functional verification environment files and model map files are described. The proposed framework is developed both with a host interface and with a core using the virtual register interface (VRI) approach. This modeling and functional verification framework is used in real-time image signal processing applications including cellphones, smart cameras, and image compression. The main motivation behind this work is to propose an efficient, reusable, and automated framework for modeling and verification of image signal processor (ISP) designs. The proposed framework shows significant improvements in product verification time, verification cost, and design quality.
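At the heart of such a framework sits a scoreboard that compares the DUT's output stream against the golden software reference model sample by sample ("bit-true certification"). A minimal sketch of that comparison, with hypothetical names (the actual framework is UVM/SystemVerilog; this only illustrates the check itself):

```python
def bittrue_check(golden, dut):
    """Compare a golden reference-model output stream against DUT output.

    Returns (passed, mismatches), where each mismatch is a tuple
    (sample_index, golden_value, dut_value). Bit-true certification
    requires an exact match at every sample."""
    mismatches = [(i, g, d)
                  for i, (g, d) in enumerate(zip(golden, dut))
                  if g != d]
    return len(mismatches) == 0, mismatches
```

In a real environment the same comparison runs inside the UVM scoreboard, fed by monitors on the reference model and the RTL.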

  13. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

Choi, Junho

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on ray tracing algorithm, statistical analysis, test on real time system operation, and other technical evaluation process...

  14. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs, and system-on-chip (SoC) buses. Readers will also benefit from the author's practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. · Discusses the entire life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection; · Introduces Verilog from both the implementation and verification points of view; · Demonstrates how to use IP in applications such as memory controllers and SoC buses; · Describes a new ver...

  15. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve the quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.
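The regular monitoring that continued process verification calls for is commonly implemented with control charts: limits are set from a baseline of historical lots, and new lots falling outside are flagged for investigation. A minimal Shewhart-style sketch under those assumptions (3-sigma limits and the sample data are illustrative, not from the BPOG protocol):

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Lower and upper control limits: baseline mean +/- 3 sample std devs."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def flag_lots(baseline, new_lots):
    """Indices of new lots falling outside the baseline control limits."""
    lcl, ucl = control_limits(baseline)
    return [i for i, x in enumerate(new_lots) if not lcl <= x <= ucl]
```

In practice a CPV plan layers further rules (trends, runs) on top of this basic out-of-limits check.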

  16. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  17. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

V&V for all safety-related nuclear facility design, analyses, and operations. In fact, DNFSB 2002-1 recommends to the DOE and National Nuclear Security Administration (NNSA) that a V&V process be performed for all safety-related software and analysis. Model verification and validation are the primary processes for quantifying and building credibility in numerical models. Verification is the process of determining that a model implementation accurately represents the developer's conceptual description of the model and its solution. Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Both verification and validation are processes that accumulate evidence of a model's correctness or accuracy for a specific scenario; thus, V&V cannot prove that a model is correct and accurate for all possible scenarios, but rather can provide evidence that the model is sufficiently accurate for its intended use. Model V&V is fundamentally different from software V&V. Code developers perform software V&V to ensure code correctness, reliability, and robustness. In model V&V, the end product is a predictive model based on the fundamental physics of the problem being solved. In all applications of practical interest, the calculations involved in obtaining solutions with the model require a computer code, e.g., finite element or finite difference analysis. Therefore, engineers seeking to develop credible predictive models critically need model V&V guidelines and procedures. The expected outcome of the model V&V process is the quantified level of agreement between experimental data and model prediction, as well as the predictive accuracy of the model. This report attempts to describe the general philosophy, definitions, concepts, and processes for conducting a successful V&V program. This objective is motivated by the need for
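Verification in this sense ("solving the equations right") is often demonstrated with a step-size convergence study: the discretization error should shrink at the method's theoretical order as the step is refined. A minimal sketch (my example, not from the report) using forward Euler on dy/dt = -y, whose observed order should be close to the theoretical value of 1:

```python
import math

def euler_error(h, t_end=1.0):
    """Error of forward Euler for dy/dt = -y, y(0) = 1, at t_end with step h."""
    n = round(t_end / h)
    y = 1.0
    for _ in range(n):
        y += h * (-y)          # one Euler step: y_{k+1} = y_k + h * f(y_k)
    return abs(math.exp(-t_end) - y)

def observed_order(h):
    """Observed order of accuracy from errors at steps h and h/2."""
    return math.log(euler_error(h) / euler_error(h / 2), 2)
```

An observed order matching the scheme's theoretical order is evidence that the code solves the discretized equations correctly; it says nothing yet about validation against experimental data.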

  18. A survey of formal business process verification : From soundness to variability

    NARCIS (Netherlands)

    Groefsema, Heerko; Bucur, Doina

    2013-01-01

    Formal verification of business process models is of interest to a number of application areas, including checking for basic process correctness, business compliance, and process variability. A large amount of work on these topics exist, while a comprehensive overview of the field and its directions

  19. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  20. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.
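A boolean specification such as "every request is eventually granted" either holds or fails on a trace; a quantitative fitness measure can instead grade how promptly it holds. The following illustrative measure (my construction, not QUAREM's theory) discounts each request's score by its response latency:

```python
def response_fitness(trace, gamma=0.5):
    """Quantitative fitness of a trace: each 'req' event scores
    gamma**latency to the next 'ok'; an unanswered request scores 0.
    Returns the mean score, so 1.0 means every request is granted
    immediately and values near 0 mean slow or missing grants."""
    scores = []
    for i, ev in enumerate(trace):
        if ev == "req":
            latency = next((j - i for j in range(i + 1, len(trace))
                            if trace[j] == "ok"), None)
            scores.append(0.0 if latency is None else gamma ** latency)
    return sum(scores) / len(scores) if scores else 1.0
```

Two traces that both satisfy the boolean property can thus be ranked by responsiveness, which is the kind of nuance the abstract argues for.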

  1. Integrated Aero–Vibroacoustics: The Design Verification Process of Vega-C Launcher

    Directory of Open Access Journals (Sweden)

    Davide Bianco

    2018-01-01

The verification of a space launcher at the design level is a complex issue because of (i) the lack of a detailed modeling capability for the acoustic pressure produced by the rocket and (ii) the difficulties in applying deterministic methods to the large-scale metallic structures. In this paper, an innovative integrated design verification process is described, based on bridging a new semiempirical jet noise model with a hybrid finite-element method/statistical energy analysis (FEM/SEA) approach for calculating the acceleration produced at the payload and equipment level within the structure, vibrating under the external acoustic forcing field. The result is a verification method allowing for accurate prediction of the vibroacoustics in the launcher interior, using limited computational resources and without resorting to computational fluid dynamics (CFD) data. Some examples concerning the Vega-C launcher design are shown.

  2. Formal Verification of Effectiveness of Control Activities in Business Processes

    Science.gov (United States)

    Arimoto, Yasuhito; Iida, Shusaku; Futatsugi, Kokichi

Dealing with risks in business processes is an important issue for achieving companies' goals. This paper introduces a method for applying a formal method to the analysis of risks and control activities in business processes, in order to evaluate control activities consistently and exhaustively and to enable scientific discussion of the evaluation results. We focus on document flows in business activities, and on the control activities and risks related to documents, because documents play important roles in business. In our method, document flows including control activities are modeled, and the OTS/CafeOBJ method is used to verify that risks of document falsification are avoided by the control activities in the model. The verification is done through interaction between humans and the CafeOBJ system using theorem proving, which makes scientific discussion of the result possible because the interaction yields rigorous reasons why the result is derived from the verification.
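The kind of property verified here, that a control activity blocks falsification before a document takes effect, can be illustrated on document-flow traces. The activity names below are hypothetical, and the paper itself uses OTS/CafeOBJ theorem proving rather than this trace checking:

```python
def control_effective(flows):
    """Check that in every document flow, the control activity 'approve'
    occurs before the document is filed. Each flow is the ordered list
    of activities a document passes through."""
    def ok(flow):
        if "file" not in flow:
            return True                      # never filed: nothing to protect
        return "approve" in flow and flow.index("approve") < flow.index("file")
    return all(ok(f) for f in flows)
```

Theorem proving establishes this for all possible flows of the model at once, whereas a trace check like this one only covers the flows actually enumerated.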

  3. Virtual reality verification of workplace design guidelines for the process plant control room

    International Nuclear Information System (INIS)

    Droeivoldsmo, Asgeir; Nystad, Espen; Helgar, Stein

    2001-02-01

Early identification of potential human factors guideline violations and corrective input into the design process are desired for efficient and cost-effective control room design. Virtual reality (VR) technology makes it possible to evaluate the design of the control room at an early stage of the design process, but can we trust the results of such evaluations? This paper describes an experimental validation of a VR model against the real world in five different guideline verification tasks. Results indicate that guideline verification in the VR model can be done with satisfactory accuracy for a number of evaluations. However, some guideline categories require further development of measurement tools and the use of a model with higher resolution than the model used in this study. (Author). 30 refs., 4 figs., 1 tab

  4. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  5. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

A credible and cost-effective environmental data collection process should produce analytical data that meet regulatory and program-specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested; the verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consist of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainties that can be statistically compared to identify the significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations
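The statistical comparison the authors describe (two results, each with a reported uncertainty, tested for a significant difference) can be sketched as follows; the coverage factor k and the quadrature combination of uncertainties are conventional choices, not details from this paper:

```python
def differ_significantly(x1, u1, x2, u2, k=1.96):
    """Compare two radiochemical results x1, x2 with 1-sigma uncertainties
    u1, u2: flag a significant difference when the gap exceeds k times the
    combined (root-sum-square) uncertainty."""
    combined = (u1 ** 2 + u2 ** 2) ** 0.5
    return abs(x1 - x2) > k * combined
```

This is what makes radiochemical validation more quantitative than chemical-data validation: the decision uses the reported uncertainty rather than a fixed qualitative flag.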

  6. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen when testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
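The core of any hydrostatic column model is subtracting the hydrostatic head of each stacked fluid column (e.g. brine, oil, nitrogen) from the bottom-hole pressure to predict the wellhead pressure. A minimal sketch of that balance, with illustrative densities and heights rather than SPR data:

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_bottom, columns):
    """Wellhead pressure (Pa) from bottom-hole pressure and fluid columns.

    columns: list of (density_kg_per_m3, height_m), ordered bottom to top.
    Each column's hydrostatic head rho * g * h is subtracted in turn."""
    p = p_bottom
    for rho, h in columns:
        p -= rho * G * h
    return p
```

A leak shows up as an interface movement, i.e. a change in the column heights, which this balance translates into the distinct wellhead-pressure trend the test monitors.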

  7. Simulation-based design process for the verification of ITER remote handling systems

    International Nuclear Information System (INIS)

    Sibois, Romain; Määttä, Timo; Siuko, Mikko; Mattila, Jouni

    2014-01-01

Highlights: •Verification and validation process for ITER remote handling systems. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around a simulation lifecycle management system. •Verification and validation roadmap for the digital modelling phase. -- Abstract: The work behind this paper takes place in EFDA's European Goal Oriented Training programme on Remote Handling (RH), "GOT-RH". The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V&V) of ITER RH system requirements using digital mock-ups (DMU). The purpose of this project is to study and develop an efficient approach to using DMUs in the V&V process of ITER RH system design, within a Systems Engineering (SE) framework. For complex engineering systems such as ITER facilities, manufacturing a full-scale prototype raises costs substantially. In the V&V process for ITER RH equipment, physical tests are required to ensure that the system complies with the required operation. It is therefore essential to verify the developed system virtually before starting the prototype manufacturing phase. This paper gives an overview of current trends in using digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and V&V processes, in order to increase their cost efficiency and reliability

  8. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from systems thinking, supported by systems engineering techniques. We propose a multi-level, multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for performance assessment and formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  9. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and the second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with higher completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road database verification can be designed.

  10. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model...... is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification...... how this model can be refined to target both verification and implementation....

  11. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
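The idea of treating workflow patterns as logical primitives can be sketched as a lookup of temporal-logic templates; the pattern names, formulas, and task labels below are illustrative, not the paper's actual pattern set:

```python
# A minimal sketch of the pattern idea: each predefined workflow pattern
# carries a temporal-logic template, and a model's logical specification is
# the set of instantiated templates, one per pattern occurrence in the model.

PATTERNS = {
    # sequence: whenever task a completes, task b eventually runs
    "Seq": lambda a, b: f"G({a} -> F {b})",
    # exclusive choice: after the gateway, exactly one of the two branches runs
    "ExclChoice": lambda g, a, b: f"G({g} -> F(({a} & !{b}) | ({b} & !{a})))",
}

def build_spec(pattern_instances):
    """pattern_instances: list of (pattern name, argument tuple)."""
    return [PATTERNS[name](*args) for name, args in pattern_instances]

# Hypothetical BPMN fragment: receive_order, then an exclusive ship/reject choice.
spec = build_spec([("Seq", ("receive_order", "check_stock")),
                   ("ExclChoice", ("check_stock", "ship", "reject"))])
```

Each formula in `spec` would then be handed to the semantic tableaux prover as part of the generated logical specification.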

  12. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb{sub 3}Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb{sub 3}Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb{sub 3}Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  13. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  14. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  15. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  16. Verification of a three-dimensional resin transfer molding process simulation model

    Science.gov (United States)

    Fingerson, John C.; Loos, Alfred C.; Dexter, H. Benson

    1995-01-01

    Experimental evidence was obtained to complete the verification of the parameters needed for input to a three-dimensional finite element model simulating the resin flow and cure through an orthotropic fabric preform. The material characterizations completed include resin kinetics and viscosity models, as well as preform permeability and compaction models. The steady-state and advancing front permeability measurement methods are compared. The results indicate that both methods yield similar permeabilities for a plain weave, bi-axial fiberglass fabric. Also, a method to determine principal directions and permeabilities is discussed and results are shown for a multi-axial warp knit preform. The flow of resin through a blade-stiffened preform was modeled and experiments were completed to verify the results. The predicted inlet pressure was approximately 65% of the measured value. A parametric study was performed to explain differences in measured and predicted flow front advancement and inlet pressures. Furthermore, PR-500 epoxy resin/IM7 8HS carbon fabric flat panels were fabricated by the Resin Transfer Molding process. Tests were completed utilizing both perimeter injection and center-port injection as resin inlet boundary conditions. The mold was instrumented with FDEMS sensors, pressure transducers, and thermocouples to monitor the process conditions. Results include a comparison of predicted and measured inlet pressures and flow front position. For the perimeter injection case, the measured inlet pressure and flow front results compared well to the predicted results. The results of the center-port injection case showed that the predicted inlet pressure was approximately 50% of the measured inlet pressure. Also, measured flow front position data did not agree well with the predicted results. Possible reasons for error include fiber deformation at the resin inlet and a lag in FDEMS sensor wet-out due to low mold pressures.
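For intuition about the flow being simulated, the classic 1-D constant-pressure filling solution of Darcy's law (a textbook relation, not the paper's 3-D finite element model) predicts a flow front that advances with the square root of time:

```python
import math

# 1-D constant-pressure resin infusion under Darcy's law: the flow front
# position is x_f = sqrt(2 K dP t / (phi mu)). Doubling permeability or
# pressure, or quadrupling time, doubles the front position.

def flow_front(K, dP, t, phi, mu):
    """K permeability (m^2), dP injection pressure (Pa), t time (s),
    phi porosity (-), mu resin viscosity (Pa*s); returns front position (m)."""
    return math.sqrt(2.0 * K * dP * t / (phi * mu))

# Illustrative values for a fabric preform infused with an epoxy resin.
x_60s = flow_front(K=1e-10, dP=2e5, t=60.0, phi=0.5, mu=0.2)
x_240s = flow_front(K=1e-10, dP=2e5, t=240.0, phi=0.5, mu=0.2)
```

The square-root slowdown of the front is one reason measured inlet pressures and sensor wet-out times are sensitive to permeability errors, as the mismatches reported in the abstract suggest.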

  17. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available Handwritten signatures are broadly utilized for personal verification in financial institutions, which creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which to judge a signature verification tool in banking and other financial institutions.
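The matching step of such a technique can be sketched as follows (extracting SURF/SIFT descriptors from the signature image would use an image library such as OpenCV; the descriptors here are synthetic and all thresholds are made up, not the paper's):

```python
import math
import random

# Local feature descriptors of the presented signature are matched against
# the enrolled ones with Lowe's ratio test, and the fraction of unambiguous
# matches serves as a crude verification score.

def match_ratio(query, enrolled, ratio=0.75):
    """query, enrolled: lists of descriptor vectors (lists of floats)."""
    good = 0
    for q in query:
        dists = sorted(math.dist(q, e) for e in enrolled)
        if dists[0] < ratio * dists[1]:  # nearest neighbour is unambiguous
            good += 1
    return good / len(query)

def verify(query, enrolled, threshold=0.6):
    return match_ratio(query, enrolled) >= threshold

random.seed(0)
enrolled = [[random.gauss(0, 1) for _ in range(64)] for _ in range(40)]
# a genuine presentation: the same descriptors with tiny noise
genuine = [[v + random.gauss(0, 0.01) for v in d] for d in enrolled]
# a forgery attempt: unrelated descriptors
forged = [[random.gauss(0, 1) for _ in range(64)] for _ in range(40)]
```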

  18. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a Model Driven Engineering based process to automate the Verification and Validation of on-board satellite software, implemented in the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of Finite Timed Automata. When the system is deployed on target, verification evidence is extracted at instrumentation points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
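The final checking step described above can be sketched as a comparison of instrumented timing evidence against scheduling-analysis assumptions; the task names, units, and the simplistic response-time check are mine, not the paper's tooling:

```python
# Scheduling-analysis assumptions (worst-case execution time and deadline per
# task) are checked against timing evidence collected at instrumentation
# points; any measurement exceeding an assumption invalidates the analysis.

def check_evidence(assumptions, evidence):
    """assumptions: {task: (wcet_us, deadline_us)};
    evidence: {task: [(start_us, end_us), ...]} from instrumentation points."""
    violations = []
    for task, (wcet, deadline) in assumptions.items():
        for start, end in evidence.get(task, []):
            if end - start > wcet:
                violations.append((task, "wcet", end - start))
            if end - start > deadline:  # simplistic: response time from job start
                violations.append((task, "deadline", end - start))
    return violations

ok = check_evidence({"detector_read": (200, 500)},
                    {"detector_read": [(0, 180), (1000, 1150)]})
bad = check_evidence({"detector_read": (200, 500)},
                     {"detector_read": [(0, 260)]})
```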

  19. A verification strategy for web services composition using enhanced stacked automata model.

    Science.gov (United States)

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework for an Extended Markup Language (XML) specification language for defining and implementing business practice workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services, which has to rely on formal verification methods to ensure the correctness of the composed services. A few research works have been carried out in the literature on the verification of web services for deterministic systems. Moreover, the existing models did not address verification properties like dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system have been evaluated based on properties like dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)) and then transformed into Promela, the input language for the Simple ProMeLa Interpreter (SPIN) tool.
The model is verified using the SPIN tool, and the results revealed better performance in terms of finding dead transitions and deadlock in contrast to the

  20. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.
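The kind of code-to-reference comparison reported here can be illustrated in miniature: a finite-difference solution of a steady 1-D heat conduction problem is verified against its exact solution (a generic sketch, unrelated to FRAPCON's actual models or data):

```python
# Verify a numerical thermal solver against an analytic reference: solve the
# steady 1-D heat equation -k T'' = q with fixed end temperatures by
# Gauss-Seidel iteration and compare with the exact quadratic profile.

def solve_fd(n, L, k, q, T0, TL):
    """Finite-difference grid temperatures on n nodes over length L."""
    h = L / (n - 1)
    T = [T0 + (TL - T0) * i / (n - 1) for i in range(n)]  # linear initial guess
    for _ in range(20000):  # iterate interior nodes to convergence
        for i in range(1, n - 1):
            T[i] = 0.5 * (T[i - 1] + T[i + 1] + q * h * h / k)
    return T

def exact(x, L, k, q, T0, TL):
    # analytic solution of -k T'' = q with T(0)=T0, T(L)=TL
    return T0 + (TL - T0) * x / L + q * x * (L - x) / (2.0 * k)

n, L, k, q, T0, TL = 21, 0.1, 5.0, 4.0e5, 600.0, 550.0
T = solve_fd(n, L, k, q, T0, TL)
err = max(abs(T[i] - exact(i * L / (n - 1), L, k, q, T0, TL)) for i in range(n))
```

For this quadratic solution the second-order discretization is exact at the nodes, so the remaining error measures only iteration convergence, which is the agreement a verification exercise like the FRAPCON-versus-ABAQUS comparison looks for.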

  1. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  2. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and propagate those uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model.
We employ this simple heat model to illustrate verification
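The parameter-selection step described in the abstract can be sketched as ranking parameters by a normalized local sensitivity and flagging negligible ones as candidates to fix before calibration; the toy model and threshold here are illustrative, not the dissertation's:

```python
# Rank parameters by a normalized finite-difference sensitivity of the model
# response; parameters with negligible influence are candidates to fix,
# reducing the calibration problem to the influential (identifiable) ones.

def sensitivities(model, params, h=1e-6):
    base = model(params)
    sens = {}
    for name, value in params.items():
        bumped = dict(params)
        bumped[name] = value * (1.0 + h)
        # relative change in output per relative change in the parameter
        sens[name] = abs((model(bumped) - base) / (base * h))
    return sens

def toy_response(p):
    # output depends strongly on 'beta', weakly on 'delta', not at all on 'c'
    return p["beta"] ** 2 + 0.001 * p["delta"] + 0.0 * p["c"]

s = sensitivities(toy_response, {"beta": 2.0, "delta": 3.0, "c": 1.0})
selected = [name for name, value in s.items() if value > 0.01]
```

In a real calibration, the response would be a misfit over a full model trajectory rather than a scalar, but the ranking logic is the same.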

  3. Behavior equivalence and compatibility of business process models with complex correspondences

    NARCIS (Netherlands)

    Weidlich, M.; Dijkman, R.M.; Weske, M.H.

    2012-01-01

    Once multiple models of a business process are created for different purposes or to capture different variants, verification of behaviour equivalence or compatibility is needed. Equivalence verification ensures that two business process models specify the same behaviour. Since different process

  4. Software Quality Assurance and Verification for the MPACT Library Generation Process

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yuxuan [Univ. of Michigan, Ann Arbor, MI (United States); Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wiarda, Dorothea [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clarno, Kevin T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Celik, Cihangir [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-05-01

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree, in order to (1) ensure that it can be run without user intervention and (2) ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  5. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. While testing, the best PBM is first selected for the test utterance in the maximum likelihood (ML) sense... We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types: target-wrong and impostor-wrong, while it maintains comparable TD-SV performance when impostors speak a correct utterance with respect to the conventional system...

  6. Formal verification of reactor process control software using assertion checking environment

    International Nuclear Information System (INIS)

    Sharma, Babita; Balaji, Sowmya; John, Ajith K.; Bhattacharjee, A.K.; Dhodapkar, S.D.

    2005-01-01

    Assertion Checking Environment (ACE) was developed in-house for carrying out formal (rigorous/mathematical) functional verification of embedded software written in MISRA C. MISRA C is an industrially sponsored safe subset of the C programming language and is well accepted in the automotive and aerospace industries. ACE uses the static assertion checking technique for verification of MISRA C programs. First, the functional specifications of the program are expressed in the form of pre- and post-conditions for each C function. These pre- and post-conditions are then introduced as assertions (formal comments) in the program code. The annotated C code is then formally verified using ACE. In this paper we present our experience of using ACE for the formal verification of the process control software of a nuclear reactor. The Software Requirements Document (SRD) contained textual specifications of the process control software. The SRD was used by the designers to draw logic diagrams, which were given as input to a code generator. The verification of the generated C code was done at two levels, viz. (i) verification against specifications derived from the logic diagrams, and (ii) verification against specifications derived from the SRD. In this work we checked approximately 600 functional specifications of the software, which has roughly 15000 lines of code. (author)

  7. Very fast road database verification using textured 3D city models obtained from airborne imagery

    Science.gov (United States)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV-images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of roads. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. 
Depending on the time issue and availability of a geo-database for buildings, the urban terrain reconstruction procedure has semantic models

  8. From Wireless Sensor Networks to Wireless Body Area Networks: Formal Modeling and Verification on Security Using PAT

    Directory of Open Access Journals (Sweden)

    Tieming Chen

    2016-01-01

    Full Text Available Model checking has successfully been applied to the verification of security protocols, but the modeling process is always tedious, and proficient knowledge of formal methods is also needed, although the final verification can be automatic depending on the specific tool. At the same time, due to the appearance of novel kinds of networks, such as wireless sensor networks (WSN) and wireless body area networks (WBAN), formal modeling and verification for these domain-specific systems are quite challenging. In this paper, a specific and novel formal modeling and verification method is proposed and implemented using an expandable tool called PAT to do WSN-specific security verification. First, an abstract modeling data structure for CSP#, which is built into PAT, is developed to support the node-mobility-related specification for modeling location-based node activity. Then, the traditional Dolev-Yao model is redefined to facilitate modeling of location-specific attack behaviors on the security mechanism. A thorough formal verification application on a location-based security protocol in WSN is described in detail to show the usability and effectiveness of the proposed methodology. Furthermore, a novel location-based authentication security protocol in WBAN can also be successfully modeled and verified directly using our method, which is, to the best of our knowledge, the first effort at employing model checking for the automatic analysis of an authentication protocol for WBAN.

  9. Knowledge-based inspection:modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    The increased level of complexity in almost every discipline and operation today raises the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain represents ever more a matter of great importance to world safety and security. Demand for knowledge about nuclear processes and the verification activities used to offset potential misuse of nuclear technology will intensify with the growth of the subject technology. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also contributes to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between the practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory.
Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  10. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to benefit the operations lock-and-tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF.

  11. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Full Text Available Business processes are usually defined by business experts, who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a mapping between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining a formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is performed to ensure that certain functional properties, namely liveness and reachability, are satisfied by the model under investigation. The main advantage of our approach over existing ones is that it takes the time component of business process models into account. An example is used throughout the paper to illustrate the proposed method.
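The liveness and reachability checks described above can be sketched, for an ordinary (untimed) Petri net, as a breadth-first exploration of the reachability graph. The two-transition net below is a made-up illustration, not the TPN translation of any BPMN model from the paper, and it omits the time component entirely.

```python
from collections import deque

# A tiny untimed Petri net: places hold token counts; a transition
# consumes tokens from `pre` and produces tokens into `post`.
# The net structure is illustrative only.
TRANSITIONS = {
    "start_task":  {"pre": {"ready": 1},   "post": {"running": 1}},
    "finish_task": {"pre": {"running": 1}, "post": {"done": 1}},
}

def enabled(marking, t):
    return all(marking.get(p, 0) >= n for p, n in TRANSITIONS[t]["pre"].items())

def fire(marking, t):
    m = dict(marking)
    for p, n in TRANSITIONS[t]["pre"].items():
        m[p] -= n
    for p, n in TRANSITIONS[t]["post"].items():
        m[p] = m.get(p, 0) + n
    return m

def reachable_markings(initial):
    """Breadth-first enumeration of the reachability graph."""
    seen = {frozenset(initial.items())}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        yield m
        for t in TRANSITIONS:
            if enabled(m, t):
                m2 = {p: n for p, n in fire(m, t).items() if n > 0}
                key = frozenset(m2.items())
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)

markings = list(reachable_markings({"ready": 1}))
# Reachability: is the terminal marking {done: 1} reachable?
print(any(m == {"done": 1} for m in markings))                          # True
# Quasi-liveness: is every transition enabled in some reachable marking?
print(all(any(enabled(m, t) for m in markings) for t in TRANSITIONS))   # True
```

For real models the reachability graph explodes combinatorially, which is why the paper leans on the analytical techniques available for TPN rather than naive enumeration.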

  12. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    Science.gov (United States)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels makes it possible to capture and manage iterative hypothesis-and-verification processes automatically and efficiently through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives, and a TMS (Truth Maintenance System) mechanism is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem, and concludes with a discussion of future issues.

  13. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF₆ cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  15. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

    Full Text Available Numerical simulation of the temperature field of cooled turbine blades is a required element of the gas turbine engine design process. Verification is usually performed against test results for a full-size blade prototype on a gas-dynamic test bench. This paper proposes a method of calorimetric measurement in a molten-metal thermostat for verification of a thermal model of a cooled blade. The method yields local values of heat flux at each point of the blade surface within a single experiment. The error in determining local heat transfer coefficients with this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals the zinc melting point. The developed method was used to verify the thermal-hydraulic model of a high-pressure turbine blade whose cooling allows asymmetrical heat removal from the pressure and suction sides. An analysis of the heat transfer coefficients confirmed the high level of heat transfer at the leading edge, comparable with jet-impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge toward the pressure side.

  16. Numerical modeling of nitrogen oxide emission and experimental verification

    Directory of Open Access Journals (Sweden)

    Szecowka Lech

    2003-12-01

    Full Text Available This paper presents the results of nitrogen oxide reduction in the combustion process using primary methods. The reduction of NOx emission by recirculation of combustion gases and by staging of fuel and air was investigated, followed by the reduction of NOx emission through simultaneous use of the above-mentioned primary methods with pulsatory disturbances. The investigation comprises numerical modeling of NOx reduction and experimental verification of the numerical results.

  17. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on: - compliance with new standards and regulations, - lessons learned from operating experience. This goal can be achieved more effectively on the basis of a valid methodology of analysis and a consistent process of collection, storage and retrieval of operating data. The general backfitting problem, the verification process and the use of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant are presented as well, using some specially designed software tools. The focus is on management rather than hardware problems. Some general conclusions are then presented as the final results of the whole work.

  18. Verification and Planning for Stochastic Processes with Asynchronous Events

    National Research Council Canada - National Science Library

    Younes, Hakan L

    2005-01-01

    .... The most common assumption is that of history-independence: the Markov assumption. In this thesis, the author considers the problems of verification and planning for stochastic processes with asynchronous events, without relying on the Markov assumption...

  19. TARDEC FIXED HEEL POINT (FHP): DRIVER CAD ACCOMMODATION MODEL VERIFICATION REPORT

    Science.gov (United States)

    2017-11-09

    Public Release Disclaimer: Reference herein to any specific commercial company, product, process, or service by trade name, trademark, manufacturer, or... not actively engaged HSI until MSB or the Engineering Manufacturing and Development (EMD) Phase, resulting in significant design and cost changes... and shall not be used for advertising or product endorsement purposes. TARDEC Fixed Heel Point (FHP): Driver CAD Accommodation Model Verification

  20. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  1. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...
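The SAT-based idea sketched above, encoding model constraints as propositional formulas over Boolean variables and searching for satisfying assignments, can be illustrated with a toy brute-force enumeration. A real flow would emit CNF for a SAT solver rather than enumerate, and the attribute and invariant names below are invented, not taken from USE or the paper.

```python
from itertools import product

# Three Boolean variables standing in for class attributes of a toy model.
VARS = ("has_account", "is_active", "is_admin")

def constraints(a):
    """Invented OCL-style invariants encoded as propositional implications."""
    return (
        (not a["is_admin"] or a["is_active"])        # admins must be active
        and (not a["is_active"] or a["has_account"]) # active users need an account
    )

# Enumerate all assignments and keep the satisfying ones.
satisfying = [dict(zip(VARS, bits))
              for bits in product([False, True], repeat=len(VARS))
              if constraints(dict(zip(VARS, bits)))]

# A satisfying assignment exists, so the constrained model is consistent.
print(len(satisfying) > 0)   # True
```

Consistency checking (does any valid instance of the model exist?) is exactly the question the SAT encoding answers, only at a scale where enumeration is replaced by a solver.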

  2. Tools and Methods for RTCP-Nets Modeling and Verification

    Directory of Open Access Journals (Sweden)

    Szpyrka Marcin

    2016-09-01

    Full Text Available RTCP-nets are high-level Petri nets similar to timed colored Petri nets, but with a different time model and some structural restrictions. The paper deals with practical aspects of using RTCP-nets for modeling and verification of real-time systems. It contains a survey of software tools developed to support RTCP-nets. Verification of RTCP-nets is based on coverability graphs, which represent the set of reachable states in the form of a directed graph. Two approaches to verification of RTCP-nets are considered in the paper. The former is oriented towards states and is based on translation of a coverability graph into a nuXmv (NuSMV) finite state model. The latter is oriented towards transitions and uses the CADP toolkit to check whether requirements given as μ-calculus formulae hold for a given coverability graph. All presented concepts are discussed using illustrative examples.

  3. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Represented in an appropriate representation language, these models describe process plants and plant automatics in a unified way that allows verification and computer-aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report investigates how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long-term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated by the potential of such a combination in safety analysis based on models comprising software, hardware, and the operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  4. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for ITER remote handling system. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for digital modelling phase. • Importance of the product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of the EFDA's programme of European Goal Oriented Training programme on remote handling (RH) “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is written based on the results of a project “verification and validation (V and V) of ITER RH system using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of the ITER RH system design utilizing a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype and overviews the current trends of using virtual prototyping in industry during the early design phase. Based on the survey of best industrial practices, this paper proposes ways to improve the V and V process for the ITER RH system utilizing DMUs.

  5. 300 Area Process Trenches Verification Package

    International Nuclear Information System (INIS)

    Lerch, J.A.

    1998-03-01

    The purpose of this verification package is to document achievement of the remedial action objectives for the 300 Area Process Trenches (300 APT) located within the 300-FF-1 Operable Unit (OU). The 300 APT became active in 1975 as a replacement for the North and South Process Pond system that is also part of the 300-FF-1 OU. The trenches received 300 Area process effluent from the uranium fuel fabrication facilities. Waste from the 300 Area laboratories that was determined to be below discharge limits based on monitoring performed at the 307 retention basin was also released to the trenches. Effluent flowed through the headworks sluice gates, down a concrete apron, and into the trenches. From the beginning of operations in 1975 until 1993, a continuous, composite sampler was located at the headwork structure to analyze process effluent at the point of discharge to the environment

  6. Constrained structural dynamic model verification using free vehicle suspension testing methods

    Science.gov (United States)

    Blair, Mark A.; Vadlamudi, Nagarjuna

    1988-01-01

    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads, such as the 12-ton Hubble Space Telescope. The development of equations in the paper will show that by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced that will have minima that are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.
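The transfer-function property derived in the paper can be checked numerically on a minimal two-degree-of-freedom stand-in: an interface mass and a payload mass joined by a spring, with all values illustrative rather than drawn from any real payload. The drive-point FRF of the unconstrained ("free-free") system has its minimum at the frequency the payload would have if the interface were held fixed.

```python
import numpy as np

# Illustrative masses and stiffness (kg, kg, N/m), not real payload data.
m1, m2, k = 100.0, 50.0, 2.0e6
M = np.diag([m1, m2])
K = np.array([[k, -k], [-k, k]])   # free-free: no ground spring

def drive_point_frf(omega):
    """|H11|: response at the interface DOF per unit force applied there."""
    H = np.linalg.inv(K - omega**2 * M)
    return abs(H[0, 0])

# Fixed-interface (launch-constrained) natural frequency of the payload mass.
omega_fixed = np.sqrt(k / m2)

# Sweep around it and locate the FRF minimum (antiresonance).
omegas = np.linspace(0.5 * omega_fixed, 1.5 * omega_fixed, 20001)
omega_min = omegas[np.argmin([drive_point_frf(w) for w in omegas])]

print(abs(omega_min - omega_fixed) / omega_fixed < 1e-3)   # True
```

The antiresonance of the free-free drive-point FRF coinciding with the constrained mode is what lets the suspension test stand in for the hard-to-realize fixed-base test.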

  7. The virtual product-process design laboratory to manage the complexity in the verification of formulated products

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Malik, Tahir I.

    2011-01-01

    ... mixtures need to be predicted. This complexity has to be managed through decomposition of the problem into sub-problems. Each sub-problem is solved and analyzed and, from the knowledge gained, an overall evaluation of the complex chemical system representing the product is made. The virtual Product-Process Design laboratory (virtual PPD-lab) software is based on this decomposition strategy for the design of formulated liquid products. When the needed models are available in the software, the solution of formulation design/verification problems is straightforward, while when models are not available in the software library, they need to be developed and/or implemented. The potential of the virtual PPD-lab in managing the complexity in the verification of formulated products, after the needed models have been developed and implemented, is highlighted in this paper through a case study from industry dealing...

  8. Calibration and Verification of the Hydraulic Model for Blue Nile River from Roseires Dam to Khartoum City

    Directory of Open Access Journals (Sweden)

    Kamal edin ELsidig Bashar

    2015-12-01

    Full Text Available This research is a practical attempt to calibrate and verify a hydraulic model of the Blue Nile River. Calibration is performed by running the model with observed data for a past period and comparing the results with measurements, while verification applies observed data from another period and compares them with the model output. The study also examined the relationship between the river terrain and the distance between the assumed dam-failure points along the river length. The computed model values and the observed data should conform to the theoretical analysis, and the overall performance of the model is verified by comparison with another set of data. The model was calibrated using data from the gauging stations at Khartoum, Wad Medani, downstream Sennar and downstream Roseires for the period from 1 May to 31 October 1988, and verified using data from the same gauging stations for the same period of 2003 and 2010. The required data from these stations were collected, processed and used in the model calibration. The geometry input files for the HEC-RAS model were created using a combination of ArcGIS and HEC-GeoRAS. The results revealed high correlation (R2 > 0.9) between the observed and calibrated water levels at all gauging stations during 1988, and likewise between the observed and verified water levels in 2003 and 2010. The verification results, together with the correlation equations and degrees of correlation, can be used to predict future data for the same stations.
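The goodness-of-fit criterion the study reports (R² between observed and simulated water levels) reduces to a few lines. The stage values below are invented placeholders, not data from the Blue Nile gauging stations.

```python
# Invented observed and modeled water levels (m) for one gauging station.
observed = [372.1, 372.8, 373.5, 374.9, 376.2, 377.0, 376.4, 375.1]
modeled  = [372.3, 372.7, 373.8, 374.6, 376.0, 377.3, 376.1, 375.2]

def r_squared(obs, sim):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

# The study's acceptance threshold: R^2 > 0.9 between observed and modeled.
print(r_squared(observed, modeled) > 0.9)   # True
```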

  9. Verification of thermo-fluidic CVD reactor model

    International Nuclear Information System (INIS)

    Lisik, Z; Turczynski, M; Ruta, L; Raj, E

    2014-01-01

    The presented paper describes a numerical model of a CVD (Chemical Vapour Deposition) reactor created in ANSYS CFX, whose main purpose is the evaluation of numerical approaches used for modelling heat and mass transfer inside the reactor chamber. Verification of the developed CVD model has been conducted against measurements under various thermal, pressure and gas flow rate conditions. Good agreement between experimental and numerical results confirms the correctness of the elaborated model.

  10. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses, including detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not previously been reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  11. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
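The integrity half of the problem, proving that captured evidence has not been altered, can be sketched with a keyed hash. PKIDEV itself relies on X.509 public-key signatures and trusted time-stamping; the shared-key HMAC below is only a minimal stand-in for that machinery, and all names are placeholders.

```python
import hashlib
import hmac

# Placeholder key; a real system would use per-examiner asymmetric keys.
KEY = b"examiner-shared-secret"

def seal(evidence: bytes) -> str:
    """Produce an integrity tag over the captured evidence bytes."""
    return hmac.new(KEY, evidence, hashlib.sha256).hexdigest()

def verify(evidence: bytes, tag: str) -> bool:
    """Constant-time check that the evidence still matches its tag."""
    return hmac.compare_digest(seal(evidence), tag)

image = b"dd-image-of-suspect-disk"   # stand-in for an acquired disk image
tag = seal(image)

print(verify(image, tag))                   # True
print(verify(image + b"tampered", tag))     # False
```

The point of PKIDEV's public-key construction is that, unlike this HMAC sketch, verification does not require trusting whoever holds the sealing key.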

  12. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure-effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
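The failure-effect propagation an FFM encodes can be sketched as reachability over a directed graph: diagnosis then asks which failure modes can reach a given observation point. The component and observation names below are invented for illustration and are not drawn from the AGSM project.

```python
# Edges carry failure effects from failure modes toward observation points.
PROPAGATION = {
    "valve_stuck_closed": ["line_pressure_low"],
    "pump_degraded":      ["line_pressure_low", "motor_current_high"],
    "line_pressure_low":  ["thrust_low_obs"],        # observation point
    "motor_current_high": ["current_sensor_obs"],    # observation point
}

FAILURE_MODES = ("valve_stuck_closed", "pump_degraded")

def effects(failure_mode):
    """All downstream nodes reachable from a failure mode."""
    reached, stack = set(), [failure_mode]
    while stack:
        node = stack.pop()
        for nxt in PROPAGATION.get(node, []):
            if nxt not in reached:
                reached.add(nxt)
                stack.append(nxt)
    return reached

def candidate_faults(observation):
    """Diagnosis: failure modes whose effect set contains the observation."""
    return sorted(m for m in FAILURE_MODES if observation in effects(m))

print(candidate_faults("thrust_low_obs"))      # ['pump_degraded', 'valve_stuck_closed']
print(candidate_faults("current_sensor_obs"))  # ['pump_degraded']
```

An ambiguous observation (two candidate faults for low thrust) versus a discriminating one (only the pump explains high motor current) is precisely the diagnostic behavior the generic component models are meant to standardize.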

  13. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
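The fine-grained "unit" testing the authors advocate might look like the sketch below: a single numerical kernel checked against properties it must satisfy, with no full simulation involved. The kernel here is a generic textbook upwind advection step, not code from any particular climate model.

```python
def upwind_step(q, courant):
    """One first-order upwind advection step on a periodic 1-D grid,
    valid for 0 < courant <= 1 (flow in the +i direction)."""
    n = len(q)
    return [q[i] - courant * (q[i] - q[(i - 1) % n]) for i in range(n)]

def test_mass_conservation():
    # On a periodic grid the flux terms telescope, so total mass is conserved.
    q = [0.0, 1.0, 4.0, 2.0, 0.5, 0.0]
    q_next = upwind_step(q, courant=0.4)
    assert abs(sum(q_next) - sum(q)) < 1e-12

def test_constant_field_unchanged():
    # Advecting a uniform field must leave it exactly unchanged.
    q = [3.0] * 8
    assert upwind_step(q, courant=0.7) == q

test_mass_conservation()
test_constant_field_unchanged()
print("all kernel tests passed")
```

Property tests like these run in milliseconds and localize a defect to one routine, in contrast to diffing the output of a multi-day climate simulation.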

  14. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance - Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  15. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has proven to be one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging for PICs because it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends in SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models avoid time-consuming 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the model parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions were fabricated using electron beam lithography and measured. The measurements of the fabricated devices were compared with the derived models and show very good agreement; the matching can reach 100% by calibrating a certain parameter in the model.
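For orientation, standard coupled-mode theory gives the cross-port power of a lossless directional coupler as sin²(κL). This is the textbook relation, not the authors' empirical SOI model, and the κ value below is an assumed figure rather than a measured one.

```python
import math

def cross_coupling(kappa_per_um, length_um):
    """Fraction of input power transferred to the cross port of a
    lossless directional coupler (coupled-mode theory): sin^2(kappa * L)."""
    return math.sin(kappa_per_um * length_um) ** 2

# Assumed coupling coefficient (rad/um) for illustration only.
kappa = 0.05

# 3 dB point: kappa * L = pi/4, so half the power crosses over.
L_3dB = math.pi / (4 * kappa)
print(abs(cross_coupling(kappa, L_3dB) - 0.5) < 1e-12)   # True
```

A closed-form model of this kind is what lets a PV tool check coupler geometry straight from layout parameters instead of launching a 3D EM simulation per instance.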

  16. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. Concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  17. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and an additional user-defined meteorological option for accident dispersion.

  18. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Full Text Available Abstract Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks.

  19. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V&V) process for the safety software of the Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents the DRPS V&V experience according to the software development life cycle. The main activities of the DRPS V&V process are the preparation of software planning documentation; verification of the Software Requirement Specification (SRS), Software Design Specification (SDS), and code; and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. The SRS V&V activities for the DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated system test plan, software safety analysis, and software configuration management. Likewise, the SDS V&V activities for the RPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated software test plan, software safety analysis, and software configuration management. The code V&V activities for the DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V&V activity of the software integration and system integration phases. Software safety analysis uses the Hazard and Operability (HAZOP) method at the SRS phase, HAZOP and Fault Tree Analysis (FTA) at the SDS phase, and FTA at the implementation phase. Finally, software configuration management is performed using the Nu-SCM (Nuclear Software Configuration Management) tool developed by the KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V

  20. NbTi Strands Verification for ITER PF CICC Process Qualification of CNDA

    Science.gov (United States)

    Liu, F.; Liu, H.; Liu, S.; Liu, B.; Lei, L.; Wu, Y.

    2014-05-01

    China is responsible for most of the Poloidal Field (PF) conductor production for the International Thermonuclear Experimental Reactor (ITER). The execution of the PF conductors proceeds in three main phases. According to the ITER Procurement Arrangement (PA), the Domestic Agency (DA) is required to verify the room- and low-temperature acceptance tests carried out by the strand suppliers. As the reference laboratory of the Chinese DA (CNDA), the superconducting strand test laboratory of the Institute of Plasma Physics, Chinese Academy of Sciences (ASIPP) undertook the task of strand verification for the ITER conductors. The verification tests include: diameter, nickel plating thickness, copper-to-non-copper volume ratio, twist pitch direction and length, standard critical current (IC) and resistive transition index (n), residual resistance ratio (RRR), and hysteresis loss. 48 NbTi strands from 7 billets were supplied for the PF Cable-In-Conduit Conductor (CICC) process qualification. In total, 54 samples were measured. The verification level for the PF CICC process qualification was 100%. The test method, facility and results for each item are described in detail in this publication.

  1. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  2. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  3. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the ... This model generally states the numerical value of knowledge .... procedures found in the field of software engineering should be ...

  4. Linear models to perform treaty verification tasks for enhanced information security

    International Nuclear Information System (INIS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-01-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
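
    The Hotelling observer described above can be sketched numerically. The following minimal example uses synthetic Gaussian data (not the paper's GEANT4-simulated measurements); all dimensions and parameter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binned detector data under two hypotheses (illustrative only;
# the paper uses GEANT4-simulated fission-neutron measurements).
n, dim = 500, 8
mu0, mu1 = np.zeros(dim), np.full(dim, 1.0)
cov = np.eye(dim) + 0.2          # shared covariance with channel correlations
g0 = rng.multivariate_normal(mu0, cov, n)   # hypothesis 0: not accountable
g1 = rng.multivariate_normal(mu1, cov, n)   # hypothesis 1: accountable

# Hotelling observer: optimal linear template w = S^-1 (mu1 - mu0).
w = np.linalg.inv(cov) @ (mu1 - mu0)

# Scalar test statistic per measurement; thresholding it makes the decision.
t0, t1 = g0 @ w, g1 @ w

# Area under the ROC curve via the pairwise (Mann-Whitney) estimator.
auc = np.mean(t1[:, None] > t0[None, :])
print(bool(auc > 0.8))  # -> True: the template separates the two hypotheses
```

    A channelized variant would insert a channelizing matrix before the template, reducing the data passed to the monitor; the paper's contribution is in how that matrix is optimized to limit sensitive information.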

  5. Linear models to perform treaty verification tasks for enhanced information security

    Energy Technology Data Exchange (ETDEWEB)

    MacGahan, Christopher J., E-mail: cmacgahan@optics.arizona.edu [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Sandia National Laboratories, Livermore, CA 94551 (United States); Kupinski, Matthew A. [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  6. Formal verification of automated teller machine systems using SPIN

    Science.gov (United States)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formula. This model checker accepts models written in Process Meta Language (PROMELA), and its specifications are specified in LTL formulas.
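
    The verification loop in the abstract can be illustrated without SPIN: a minimal explicit-state search over a toy ATM transition system. The states, events and the safety property below are assumptions for illustration, not the authors' PROMELA model:

```python
from collections import deque

# Toy ATM transition system. SPIN would explore this state space
# exhaustively from a PROMELA model; here a plain BFS does the same
# for one safety property.
transitions = {
    "idle":          {"insert_card": "card_inserted"},
    "card_inserted": {"enter_pin": "pin_entered", "eject": "idle"},
    "pin_entered":   {"pin_ok": "authenticated", "pin_bad": "card_inserted"},
    "authenticated": {"withdraw": "dispensing", "eject": "idle"},
    "dispensing":    {"done": "idle"},
}

def violates_safety(path):
    # Safety property: 'withdraw' must never occur before 'pin_ok'.
    authed = False
    for event in path:
        if event == "pin_ok":
            authed = True
        if event == "withdraw" and not authed:
            return True
    return False

def check(init="idle"):
    """Breadth-first search over (state, authenticated) pairs."""
    seen, queue = set(), deque([(init, ())])
    while queue:
        state, path = queue.popleft()
        if violates_safety(path):
            return path  # counterexample trace, as SPIN would report
        key = (state, "pin_ok" in path)
        if key in seen:
            continue
        seen.add(key)
        for event, nxt in transitions[state].items():
            queue.append((nxt, path + (event,)))
    return None

print(check())  # -> None: no unauthenticated withdrawal is reachable
```

    In SPIN the same property would be written as an LTL formula over the PROMELA model, and the model checker would search the product automaton instead of this hand-rolled product of state and flag.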

  7. Verification of the cross-section and depletion chain processing module of DRAGON 3.06

    International Nuclear Information System (INIS)

    Chambon, R.; Marleau, G.; Zkiek, A.

    2008-01-01

    In this paper we present a verification of the module of the lattice code DRAGON 3.06 used for processing microscopic cross-section libraries, including their associated depletion chains. This verification is performed by reprogramming the capabilities of DRAGON in another language (MATLAB) and testing them on different problems typical of the CANDU reactor. The verification procedure consists of first programming MATLAB m-files to read the different cross-section libraries in ASCII format and to compute the reference cross sections and depletion chains. The same information is also recovered from the output files of DRAGON (using different m-files), and the resulting cross sections and depletion chains are compared with the reference library, the differences being evaluated and tabulated. The results show that the cross-section calculations and the depletion chains are correctly processed in version 3.06 of DRAGON. (author)
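
    The comparison step, recovering values from the code's output and tabulating differences against an independently computed reference, reduces to an elementwise tolerance check. A minimal sketch with an invented three-group example (the paper compares full ASCII libraries and depletion chains):

```python
import numpy as np

# Hypothetical three-group cross sections: a reference set recomputed
# independently (the paper uses MATLAB m-files) versus values recovered
# from the lattice code's output files. All numbers are invented.
groups    = ["fast", "resonance", "thermal"]
reference = np.array([1.234e-2, 5.678e-1, 2.345e+0])
recovered = np.array([1.234e-2, 5.679e-1, 2.345e+0])

rel_diff = np.abs(recovered - reference) / np.abs(reference)
for name, d in zip(groups, rel_diff):
    print(f"{name:10s} rel. diff = {d:.2e}")

# Verification passes if every group agrees within the chosen tolerance.
print(bool(np.all(rel_diff < 1e-3)))  # -> True
```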

  8. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Science.gov (United States)

    2013-05-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Hazardous Materials Safety Administration, DOT. ACTION: Notice of public meeting. SUMMARY: This notice is announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity...

  9. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
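
    IVSEM's integration of the four technologies can be illustrated by the simplest possible fusion rule: under an independence assumption (a simplification; IVSEM's actual model is not reproduced here), the integrated probability of detection is one minus the product of the individual miss probabilities.

```python
def integrated_pd(per_tech_pd):
    """Probability that at least one technology detects an event,
    assuming independent sensor technologies (a simplification of
    what IVSEM actually models)."""
    miss = 1.0
    for p in per_tech_pd:
        miss *= (1.0 - p)
    return 1.0 - miss

# Hypothetical per-technology detection probabilities for one event.
pd = {"seismic": 0.90, "infrasound": 0.60, "radionuclide": 0.50, "hydroacoustic": 0.30}
print(round(integrated_pd(pd.values()), 4))  # -> 0.986
```

    The example shows the synergy the abstract mentions: no single technology exceeds 0.9, yet the integrated system approaches 0.99.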

  10. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  11. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  12. Predictive Simulation of Material Failure Using Peridynamics -- Advanced Constitutive Modeling, Verification and Validation

    Science.gov (United States)

    2016-03-31

    AFRL-AFOSR-VA-TR-2016-0309. Predictive simulation of material failure using peridynamics: advanced constitutive modeling, verification, and validation. Approved for public release. John T

  13. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases

  14. Raman laser spectrometer optical head: qualification model assembly and integration verification

    Science.gov (United States)

    Ramos, G.; Sanz-Palomino, M.; Moral, A. G.; Canora, C. P.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Santiago, A.; Gordillo, C.; Escribano, D.; Lopez-Reyes, G.; Rull, F.

    2017-08-01

    Raman Laser Spectrometer (RLS) is the Pasteur Payload instrument of the ExoMars mission, within ESA's Aurora Exploration Programme, that will perform Raman spectroscopy for the first time on a planetary exploration mission. RLS is composed of the SPU (Spectrometer Unit), iOH (Internal Optical Head), and ICEU (Instrument Control and Excitation Unit). The iOH focuses the excitation laser on the samples (excitation path) and collects the Raman emission from the sample (collection path, composed of a collimation system and a filtering system). Its original design let a high level of laser trace reach the detector, and although a certain level of laser trace is required for calibration purposes, the high level degraded the signal-to-noise ratio, confounding some Raman peaks. After the breadboard campaign, some slight design modifications were therefore implemented in order to fix the desired amount of laser trace, and after fabrication and procurement of the commercial elements, the assembly and integration verification process was carried out. A brief description of the iOH design update for the engineering and qualification model (iOH EQM), as well as the assembly process, is given in this paper. In addition, the integration verification and the first functional test results, obtained with the RLS calibration target (CT), are reported.

  15. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
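
    The density measurement underlying the technique follows directly from hydrostatics: two bubbler probes at a known vertical separation give the mean density of the liquid column between them. A minimal sketch with invented pressures (not plant data):

```python
G = 9.80665  # standard gravity, m/s^2

def density_from_bubblers(p_lower_pa, p_upper_pa, probe_separation_m):
    """Mean solution density between two bubbler probes from the
    differential pressure measured by an electromanometer:
    rho = dP / (g * dh). Input values in this sketch are illustrative."""
    return (p_lower_pa - p_upper_pa) / (G * probe_separation_m)

# Two probes 0.5 m apart in a process solution.
rho = density_from_bubblers(18_000.0, 12_000.0, 0.5)
print(round(rho, 1))  # -> 1223.7 (kg/m^3), a plausible process-liquid density
```

    Homogeneity can then be judged by whether the densities computed from adjacent probe pairs at different heights agree within measurement uncertainty, which is the equilibrium criterion the abstract describes.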

  16. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  17. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  18. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  19. Simulation-based MDP verification for leading-edge masks

    Science.gov (United States)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification which became a necessity for OPC a decade ago, we see the same trend in MDP today. Simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask check, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based mask MDP verification

  20. Advances in the Processing of VHR Optical Imagery in Support of Safeguards Verification

    International Nuclear Information System (INIS)

    Niemeyer, I.; Listner, C.; Canty, M.

    2015-01-01

    Under the Additional Protocol of the Non-Proliferation Treaty (NPT) complementing the safeguards agreements between States and the International Atomic Energy Agency, commercial satellite imagery, preferably acquired by very high-resolution (VHR) satellite sensors, is an important source of safeguards-relevant information. Satellite imagery can assist in the evaluation of site declarations, design information verification, the detection of undeclared nuclear facilities, and the preparation of inspections or other visits. With the IAEA's Geospatial Exploitation System (GES), satellite imagery and other geospatial information such as site plans of nuclear facilities are available to a broad range of inspectors, analysts and country officers. The demand for spatial information and new tools to analyze this data is growing, together with the rising number of nuclear facilities under safeguards worldwide. Automated, computer-driven processing of satellite imagery could therefore add considerable value to the safeguards verification process. Examples include satellite imagery pre-processing algorithms specially developed for new sensors, tools for pixel- or object-based image analysis, and geoprocessing tools that generate additional safeguards-relevant information. In the last decade, procedures for automated (pre-)processing of satellite imagery have evolved considerably. This paper aims at testing some pixel-based and object-based procedures for automated change detection and classification in support of safeguards verification. Taking different nuclear sites as examples, these methods will be evaluated and compared with regard to their suitability to (semi-)automatically extract safeguards-relevant information. (author)
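
    The simplest pixel-based change-detection scheme can be sketched on synthetic data. The operational algorithms evaluated in such studies (e.g. the MAD/IR-MAD transformation) are more sophisticated; everything below, including the image sizes and threshold, is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two co-registered single-band acquisitions (synthetic stand-ins for
# VHR satellite imagery). A bright new structure appears in the second.
t1 = rng.normal(100.0, 5.0, (64, 64))
t2 = t1 + rng.normal(0.0, 5.0, (64, 64))
t2[20:30, 20:30] += 60.0  # simulated construction activity

# Pixel-based change detection: difference image standardized and
# thresholded at 3 standard deviations.
diff = t2 - t1
z = (diff - diff.mean()) / diff.std()
change_mask = np.abs(z) > 3.0

print(bool(change_mask[20:30, 20:30].mean() > 0.9))  # changed block flagged
print(bool(change_mask.mean() < 0.05))               # few false alarms overall
```

    Object-based variants would group the flagged pixels into segments and classify them, which is closer to how safeguards analysts reason about site changes.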

  1. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems that are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS project (Method and supporting toolset advancing embedded systems quality) aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, the MODUS toolset performs model verification by producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  2. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon admits a mathematical description of its behavior, and thus an economic-mathematical model can be drawn up with the following stages: formulation of the problem, analysis of the process being modeled, production of the model, and design verification, validation and implementation of the model. This article presents an economic model and its modeling using mathematical equations and the MatLab software package, which helps us approximate an effective solution. As input data we consider the net cost, the direct cost, the total cost and the links between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with graphic representation and interpretation of the results achieved in terms of our specific problem.

  3. Hybrid Control and Verification of a Pulsed Welding Process

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Larsen, Jesper Abildgaard; Izadi-Zamanabadi, Roozbeh

    Currently, systems to be controlled are becoming more and more complex, and classical control theory objectives, such as stability or sensitivity, are often not sufficient to cover the control objectives of the systems. In this paper it is shown how the dynamics of a pulsed welding...... process can be reformulated into a timed automaton hybrid setting, and subsequently properties such as reachability and deadlock absence are verified by the simulation and verification tool UPPAAL....

  4. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
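
The recursive compensation idea in this record can be made concrete. The publicly reported winning strategy of the DARPA Red Balloon Challenge paid each balloon's finder a fixed reward and each ancestor up the referral chain half of what the person they recruited received, so the total payout stays bounded. The function below is our own sketch of that geometric split contract, not code from the paper.

```python
def referral_payments(chain_length, finder_reward=2000.0):
    """Geometric split-contract payments along a referral chain.

    The finder gets finder_reward; each ancestor up the chain gets
    half of what the person they recruited received. The total payout
    is bounded by 2 * finder_reward regardless of chain length.
    """
    payments = []
    reward = finder_reward
    for _ in range(chain_length):
        payments.append(reward)
        reward /= 2.0
    return payments

chain = referral_payments(4)
print(chain)       # [2000.0, 1000.0, 500.0, 250.0]
print(sum(chain))  # 3750.0, below the 4000.0 bound
```

The bounded total is what makes the scheme viable: the principal can commit to a maximum budget per answer before knowing how long the winning referral chain will be.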

  5. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider...

  6. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity......The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  7. New possibilities of digital luminescence radiography (DLR) and digital image processing for verification and portal imaging

    International Nuclear Information System (INIS)

    Zimmermann, J.S.; Blume, J.; Wendhausen, H.; Hebbinghaus, D.; Kovacs, G.; Eilf, K.; Schultze, J.; Kimmig, B.N.

    1995-01-01

    We developed a method, using digital luminescence radiography (DLR), not only for portal imaging of photon beams in excellent quality, but also for verification of electron beams. Furthermore, DLR was used as the basic instrument for image fusion of portal and verification films with simulation films, and for image processing in ''beams-eye-view'' verification (BEVV) of rotating beams or conformation therapy. Digital radiographs of excellent quality are obtained for verification of photon and electron beams. In photon beams, the quality improvement vs. conventional portal imaging may be dramatic, even more for high-energy beams (e.g. 15-MV photon beams) than for Co-60. In electron beams, excellent results may be easily obtained. By digital image fusion of one or more verification films with the simulation film or MRI planning film, more precise judgement even of small differences between simulation and verification films becomes possible. Using BEVV, it is possible to compare computer-aided simulation in rotating beams or conformation therapy with the actually applied treatment. The basic principle of BEVV is also suitable for dynamic multileaf collimation. (orig.)

  8. Pneumatic Adaptive Absorber: Mathematical Modelling with Experimental Verification

    Directory of Open Access Journals (Sweden)

    Grzegorz Mikułowski

    2016-01-01

    Full Text Available Many of the mechanical energy absorbers utilized in engineering structures are hydraulic dampers, since they are simple and highly efficient and have a favourable volume-to-load-capacity ratio. However, there exist fields of application where the threat of toxic contamination with the hydraulic fluid contents must be avoided, for example, the food or pharmaceutical industries. A solution here can be a Pneumatic Adaptive Absorber (PAA), which is characterized by a high dissipation efficiency and an inactive medium. In order to properly analyse the characteristics of a PAA, an adequate mathematical model is required. This paper proposes a concept for the mathematical modelling of a PAA with experimental verification. The PAA is considered as a piston-cylinder device with a controllable valve incorporated inside the piston. The objective of this paper is to describe a thermodynamic model of a double-chamber cylinder with gas migration between the inner volumes of the device. The specific situation considered here is that the process cannot be defined as polytropic, i.e. characterized by thermodynamic coefficients that are constant in time. Instead, the coefficients of the proposed model are updated during the analysis. The results of the experimental research reveal that the proposed mathematical model is able to accurately reflect the physical behaviour of the fabricated demonstrator of the shock absorber.

  9. VerifEYE: a real-time meat inspection system for the beef processing industry

    Science.gov (United States)

    Kocak, Donna M.; Caimi, Frank M.; Flick, Rick L.; Elharti, Abdelmoula

    2003-02-01

    Described is a real-time meat inspection system developed for the beef processing industry by eMerge Interactive. Designed to detect and localize trace amounts of contamination on cattle carcasses in the packing process, the system affords the beef industry an accurate, high-speed, passive optical method of inspection. Using a method patented by the United States Department of Agriculture and Iowa State University, the system takes advantage of fluorescing chlorophyll found in the animal's diet, and therefore the digestive tract, to allow detection and imaging of contaminated areas that may harbor potentially dangerous microbial pathogens. Featuring real-time image processing and documentation of performance, the system can be easily integrated into a processing facility's Hazard Analysis and Critical Control Point quality assurance program. This paper describes the VerifEYE carcass inspection and removal verification system. Results indicating the feasibility of the method, as well as field data collected using a prototype system during four university trials conducted in 2001, are presented. Two successful demonstrations using the prototype system were held at a major U.S. meat processing facility in early 2002.

  10. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Full Text Available Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects for systems of systems: (1) the model of the SoS, which includes stochastic aspects; (2) the formalization of the SoS requirements in the form of contracts; (3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the functional mock-up interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.
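
The core of statistical model checking is Monte Carlo estimation: simulate many random traces, count how many satisfy the property, and choose the sample size from a Chernoff-Hoeffding bound so the estimate has a known precision and confidence. The sketch below illustrates that idea with an invented toy model (three independent hops, each delivering with probability 0.9); it is not PLASMA's implementation.

```python
import math
import random

def smc_estimate(sample_trace, satisfies, n):
    """Estimate P(property holds) by simulating n random traces."""
    hits = sum(satisfies(sample_trace()) for _ in range(n))
    return hits / n

def chernoff_samples(epsilon, delta):
    """Samples needed so the estimate is within epsilon of the true
    probability with confidence 1 - delta (Chernoff-Hoeffding bound)."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

# Toy stochastic SoS: a message crosses 3 hops, each succeeding w.p. 0.9.
random.seed(0)

def trace():
    return [random.random() < 0.9 for _ in range(3)]

prop = all  # property: "all hops delivered"
n = chernoff_samples(epsilon=0.02, delta=0.05)
p_hat = smc_estimate(trace, prop, n)
print(n, round(p_hat, 2))  # true probability is 0.9**3 = 0.729
```

The appeal for SoS is that the cost grows with the required precision, not with the (possibly enormous) state space of the composed system.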

  11. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking appears to be an appropriate complementary method. However, it is not common to use model checking in industry yet, as this method needs typically formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  12. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    Science.gov (United States)

    2012-04-01


  13. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is aimed at automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  14. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Contamination with Microorganisms... § 381.94 Contamination with Microorganisms; process control verification criteria and testing; pathogen... maintaining process controls sufficient to prevent fecal contamination. FSIS shall take further action as...

  15. MODELS CONCERNING PREVENTIVE VERIFICATION OF TECHNICAL EQUIPMENT

    Directory of Open Access Journals (Sweden)

    CÂRLAN M.

    2016-12-01

    Full Text Available The paper presents three operative models whose purpose is to improve the practice of preventive maintenance for a wide range of technical installations. Although the calculation criteria are different, the goal is the same: to determine the optimum time between two consecutive preventive interventions. The optimum criteria of these models are: the maximum share of technical entity operating probabilities, in the case of the Ackoff-Sasieni [1] method; the optimum time interval for preventive verification depending on the preventive-corrective maintenance costs imposed by the deciding factor, for the AsturioBaldin [2] model; and the minimum number of renewals, i.e. preventive and/or corrective maintenance operations [3].

  16. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    Science.gov (United States)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradually mature of SOA and the rapid development of Internet, SaaS has become a popular software service mode. The customized action of SaaS is usually subject to internal and external dependency relationships. This paper first introduces a method for modeling customization process based on Temporal Logic of Actions, and then proposes a verification algorithm to assure that each step in customization will not cause unpredictable influence on system and follow the related rules defined by SaaS provider.

  17. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    Full Text Available The Multifunction Vehicle Bus (MVB) is a critical component in the Train Communication Network (TCN), which is widely used in most modern train techniques of the transportation system. How to ensure the security of the MVB has become an important issue. Traditional testing cannot ensure system correctness. This paper is concerned with the modeling and verification of the MVB system. Petri net and model checking methods are used to verify the MVB system. A Hierarchy Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of the MVB. Synchronous and asynchronous methods are proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.

  18. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  19. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the knowledge base verification of systems takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  20. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the knowledge base verification of systems takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)
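
The Petri-net verification idea in the two records above can be illustrated with a plain (uncolored) place/transition net: rules consume and produce tokens, and exhaustively enumerating the reachable markings exposes anomalies such as rules that can never fire. This is a simplified sketch for a bounded net; the place names and the toy rule base are invented for illustration.

```python
from collections import deque

def reachable_markings(places, transitions, initial):
    """Enumerate all reachable markings of a bounded place/transition net.

    transitions: list of (consume, produce) pairs, each a dict
    mapping place name -> token count.
    """
    seen = {initial}
    queue = deque([initial])
    while queue:
        marking = queue.popleft()
        m = dict(zip(places, marking))
        for consume, produce in transitions:
            # A transition is enabled if every input place has enough tokens.
            if all(m.get(p, 0) >= k for p, k in consume.items()):
                new = dict(m)
                for p, k in consume.items():
                    new[p] -= k
                for p, k in produce.items():
                    new[p] = new.get(p, 0) + k
                t = tuple(new[p] for p in places)
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
    return seen

# Toy rule base: an "alarm" token fires a rule producing a "diagnosis";
# a rule whose input place never receives a token is dead logic.
places = ("alarm", "diagnosis", "orphan")
rules = [({"alarm": 1}, {"diagnosis": 1}),   # reachable rule
         ({"orphan": 1}, {"diagnosis": 1})]  # dead rule: orphan never marked
marks = reachable_markings(places, rules, (1, 0, 0))
dead = all(m[places.index("orphan")] == 0 for m in marks)
print(len(marks), dead)  # 2 reachable markings; the orphan rule never fires
```

Colored Petri nets extend this by attaching typed data to tokens, which is what makes them expressive enough for realistic rule bases, but the reachability analysis follows the same pattern.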

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - REMOVAL OF PRECURSORS TO DISINFECTION BY-PRODUCTS IN DRINKING WATER, PCI MEMBRANE SYSTEMS FYNE PROCESS MODEL ROP 1434 WITH AFC-30 NANOFILTRATION AT BARROW, AK - NSF 00/19/EPADW395

    Science.gov (United States)

    Equipment testing and verification of the PCI Membrane Systems Inc. Fyne Process nanofiltration system Model ROP 1434, equipped with a C10 module containing AFC-30 tubular membranes, was conducted from 3/16-5/11/2000 in Barrow, AK. The source water was a moderate alkalinity, moderately...

  2. Electric and mechanical basic parameters to elaborate a process for a technical verification of safety related design modifications

    International Nuclear Information System (INIS)

    Lamuno Fernandez, Mercedes; La Roca Mallofre, GISEL; Bano Azcon, Alberto

    2010-01-01

    This paper presents a systematic process for checking a design against all the requirements that regulations demand. Nuclear engineers must verify that a design complies with the safety requirements, and this paper presents how we have elaborated a process to improve technical project verification. For a faster, better and easier verification process, we summarize how to select the basic electric and mechanical parameters which ensure the correct verification of safety-related design modifications. This process considers different aspects which guarantee that the design preserves the availability, reliability and functional capability of the Structures, Systems and Components needed to operate the nuclear power station safely. Electric and mechanical reference parameters are identified and discussed, as well as other related parameters which are critical to safety. Having a procedure for the tasks performed is a requirement in any company with a quality plan. In the engineering business, a technical analysis of a project should not rely on personal criteria alone; yet it is often the checker's criteria and knowledge that ensure the correct development of a design modification, so the checker's capabilities are the basis of the modification verification. Developing this kind of procedure is not easy, because an engineering project with important technical content presents multiple scenarios; however, many of them share a common basis. If we can identify the common technical basis of these projects, we can achieve good project verification, though there are many difficulties to encounter along the way. (authors)

  3. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    Full Text Available The article presents the results of verification of a drive system dynamic model with gear. Tests were carried out on the real object in different operating conditions. Simulation studies were also carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research by simulations using the dynamic model.

  4. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Full Text Available Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects becomes a more and more important issue. In this paper, a public Hamming code based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the Hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming code based watermark can be verified by using Hamming code checking, without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. They also show that the proposed method can improve security and achieve low distortion of the stego object.
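
The mechanics of Hamming-code checking plus LSB embedding can be shown in miniature. Below, a Hamming(7,4) codeword is embedded into the least significant bits of quantized vertex coordinates; a zero syndrome on extraction verifies integrity, and a nonzero syndrome localizes a single tampered bit. The parity layout, the fixed (rather than adaptive) watermark, and the toy coordinates are our own assumptions, not the paper's scheme.

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming(7,4) codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_check(c):
    """Return the syndrome; 0 means the codeword is intact,
    otherwise the syndrome gives the 1-based error position."""
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    return s1 + 2 * s2 + 4 * s3

def embed_lsb(coords, bits):
    """Embed watermark bits into the LSBs of quantized coordinates."""
    return [(c & ~1) | b for c, b in zip(coords, bits)]

watermark = hamming74_encode([1, 0, 1, 1])
vertices = [104, 37, 250, 91, 66, 13, 178]  # quantized vertex coordinates
stego = embed_lsb(vertices, watermark)
extracted = [v & 1 for v in stego]
print(hamming74_check(extracted))  # 0: model verified as unmodified
stego[3] ^= 1                      # tamper with one vertex's LSB
print(hamming74_check([v & 1 for v in stego]))  # 4: error at position 4
```

Because the check needs only the codeword itself, no separate verification data has to be embedded, which is the property the abstract highlights.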

  5. Undecidability of model-checking branching-time properties of stateless probabilistic pushdown process

    OpenAIRE

    Lin, T.

    2014-01-01

    In this paper, we settle a problem in probabilistic verification of infinite-state processes (specifically, probabilistic pushdown processes). We show that model checking stateless probabilistic pushdown processes (pBPA) against probabilistic computational tree logic (PCTL) is undecidable.

  6. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in distributed control systems. To improve the reusability of the control model, the proposed approach supports obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model that satisfy the safety requirements of the train control system. Then, to address the state-space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains, given the real-time nature of the train control system. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters that avoid collisions between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both normal conditions and abnormal conditions influenced by a packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time features of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic time Petri nets.
    - Highlights:
    • We solve the parameter uncertainty problem by using a model-based method.
    • We acquire the parameter constraint sets by verifying linear hybrid automata models.
    • Online verification algorithms are designed to monitor the high-speed trains.
    • We analyze the correlation of key parameters and uncritical parameters.
    • The transient probabilities are obtained by using reliability analysis.

  7. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    DEFF Research Database (Denmark)

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...

  8. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is a necessary task in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the knowledge base verification of systems takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, which is a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. By applying this tool to the knowledge base of a nuclear power plant, it was found that it can successfully check most of the anomalies that can occur in a knowledge base.

  9. Dynamic knowledge representation using agent-based modeling: ontology instantiation and verification of conceptual models.

    Science.gov (United States)

    An, Gary

    2009-01-01

    The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally qualitatively expressed using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused at determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.

  10. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOV) are widely used in many applications due to their fast dynamic responses, cost effectiveness, and less contamination-sensitive characteristics. In this paper, we provide a convenient design verification method for SOVs to design engineers who otherwise depend on experience and experiment during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method of verifying this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured based on this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe that this verification process is novel and useful for saving time and expense during development of SOVs, because verification tests with manufactured specimens may be partly substituted by this verification methodology.
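
A safety-factor check on the attraction force, of the kind this record describes, can be sketched with the textbook air-gap approximation F = (N·I)²·μ₀·A / (2·g²). This formula and all the numbers below are generic illustrations, not the paper's design relationships or any vendor's data.

```python
import math

MU0 = 4e-7 * math.pi  # permeability of free space [H/m]

def attraction_force(turns, current, gap, area):
    """Magnetic attraction force on a solenoid plunger [N], using the
    standard air-gap approximation F = (N*I)^2 * mu0 * A / (2 * g^2)."""
    return (turns * current) ** 2 * MU0 * area / (2.0 * gap ** 2)

def safety_factor(force, load):
    """Ratio of available attraction force to the opposing load."""
    return force / load

# Illustrative numbers: 800 turns, 0.5 A, 1 mm air gap, 1 cm^2 pole face.
F = attraction_force(turns=800, current=0.5, gap=1e-3, area=1e-4)
sf = safety_factor(F, load=4.0)  # assumed spring + friction load of 4 N
print(round(F, 1), round(sf, 2))  # about 10.1 N, safety factor about 2.51
```

A verification pass like this lets the engineer confirm, before any prototype is built, that the worst-case design point (maximum gap, minimum current) still leaves an adequate margin over the opposing loads.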

  11. Modelling coupled microbial processes in the subsurface: Model development, verification, evaluation and application

    Science.gov (United States)

    Masum, Shakil A.; Thomas, Hywel R.

    2018-06-01

    To study subsurface microbial processes, a coupled model which has been developed within a Thermal-Hydraulic-Chemical-Mechanical (THCM) framework is presented. The work presented here focuses on microbial transport, growth and decay mechanisms under the influence of multiphase flow and bio-geochemical reactions. In this paper, theoretical formulations and numerical implementations of the microbial model are presented. The model has been verified and also evaluated against relevant experimental results. Simulated results show that the microbial processes have been accurately implemented and their impacts on porous media properties can be predicted either qualitatively or quantitatively or both. The model has been applied to investigate biofilm growth in a sandstone core that is subjected to a two-phase flow and variable pH conditions. The results indicate that biofilm growth (if not limited by substrates) in a multiphase system largely depends on the hydraulic properties of the medium. When the change in porewater pH which occurred due to dissolution of carbon dioxide gas is considered, growth processes are affected. For the given parameter regime, it has been shown that the net biofilm growth is favoured by higher pH; whilst the processes are considerably retarded at lower pH values. The capabilities of the model to predict microbial respiration in a fully coupled multiphase flow condition and microbial fermentation leading to production of a gas phase are also demonstrated.

  12. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    Science.gov (United States)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
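
    Grid-refinement studies of the kind the paper calls for are usually summarized by an observed order of accuracy. The following textbook recipe is a generic sketch, not taken from the paper; the drag-coefficient values are invented so that the scheme behaves as exactly second order:

```python
from math import log

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from three solutions on grids refined
    by a constant ratio r (spacings h, h/r, h/r^2)."""
    return log((f_coarse - f_medium) / (f_medium - f_fine)) / log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Estimate of the grid-converged value from the two finest solutions."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1)

# invented drag coefficients behaving as c + C*h^2 on grids with r = 2
f_coarse, f_medium, f_fine = 0.0316, 0.0304, 0.0301
p = observed_order(f_coarse, f_medium, f_fine, 2)  # ≈ 2 for a second-order scheme
print(round(p, 6), round(richardson_extrapolate(f_medium, f_fine, 2, p), 6))
```

    Consistency across codes then means that both codes recover the formal order of the scheme and extrapolate to the same grid-converged value.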

  13. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The ever-increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
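
    The structure-implements-specification idea can be miniaturized as follows. This sketch is mine, not output of the HDL-to-HOL translator: it checks a half-adder "netlist" against its arithmetic specification by exhaustive enumeration, which is exactly what stops scaling as inputs grow and what proof in HOL is meant to replace.

```python
from itertools import product

def half_adder_structure(a, b):
    """Gate-level 'netlist': an XOR gate for sum, an AND gate for carry."""
    return a ^ b, a & b

def half_adder_spec(a, b):
    """Behavioral specification: binary addition of two 1-bit inputs."""
    total = a + b
    return total % 2, total // 2

# exhaustive check over all input vectors - feasible only for tiny circuits
assert all(half_adder_structure(a, b) == half_adder_spec(a, b)
           for a, b in product((0, 1), repeat=2))
```

    For realistic circuits the input space is astronomically large, which is why the paper translates the structural description into HOL and proves the implication once, for all inputs.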

  14. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides, and it is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results; exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost efficient and easy to use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, the biosphere modules for dose assessment used in the earlier safety assessment project SR 97 were implemented. Results acquired for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the
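
    A minimal stand-in for the differential equation systems Tensit solves is a two-member decay chain, where the analytic Bateman solution can be checked against an independent numerical integration, mirroring the report's verification-by-recalculation approach. Decay constants and times are arbitrary illustrative values:

```python
from math import exp

def bateman_two_chain(n1_0, lam1, lam2, t):
    """Analytic Bateman solution for a decay chain N1 -> N2 -> (stable)."""
    n1 = n1_0 * exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (exp(-lam1 * t) - exp(-lam2 * t))
    return n1, n2

def euler_two_chain(n1_0, lam1, lam2, t, steps=100_000):
    """Independent numerical integration of the same ODEs, for cross-checking."""
    dt = t / steps
    n1, n2 = n1_0, 0.0
    for _ in range(steps):
        n1, n2 = n1 - lam1 * n1 * dt, n2 + (lam1 * n1 - lam2 * n2) * dt
    return n1, n2

# arbitrary decay constants (1/day) and a 10-day horizon
na = bateman_two_chain(1.0, 0.1, 0.05, 10.0)
print(round(na[0], 4), round(na[1], 4))  # → 0.3679 0.4773
```

    Agreement between the two routes is the same style of evidence the report collects by comparing Tensit against AMBER and the PSACOIN Level 1B test case.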

  15. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    International Nuclear Information System (INIS)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides, and it is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results; exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost efficient and easy to use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, the biosphere modules for dose assessment used in the earlier safety assessment project SR 97 were implemented. Results acquired for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the perspective of

  16. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.
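
    Two of the simplest objective verification measures used for forecasts of this kind are sketched below; they are generic textbook metrics, not SWPC's actual verification suite, and the forecast values are invented:

```python
def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against 0/1 outcomes;
    0 is a perfect score."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def frequency_bias(forecasts, observations):
    """Ratio of forecast 'yes' events to observed 'yes' events for a
    binary event; 1 means no systematic over- or underforecasting."""
    return sum(forecasts) / sum(observations)

print(round(brier_score([0.9, 0.1, 0.7, 0.3], [1, 0, 1, 1]), 4))  # → 0.15
print(round(frequency_bias([1, 0, 1, 0], [1, 0, 1, 1]), 2))       # → 0.67
```

    A verification program layers many such scores (plus subjective review) so that users can attach confidence levels to each product.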

  17. Automatic Generation of Object Models for Process Planning and Control Purposes using an International standard for Information Exchange

    Directory of Open Access Journals (Sweden)

    Petter Falkman

    2003-10-01

    Full Text Available In this paper a formal mapping between static information models and dynamic models is presented. The static information models are given according to an international standard for product, process and resource information exchange (ISO 10303-214). The dynamic models are described as Discrete Event Systems. The product, process and resource information is automatically converted into product routes and used for simulation, controller synthesis and verification. A high-level language, combining Petri nets and process algebra, is presented and used for specification of desired routes. A main implication of the presented method is that it enables the reuse of process information when creating dynamic models for process control. The method also enables simulation and verification to be conducted early in the development chain.
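
    The token-game semantics underlying such Petri-net route specifications can be sketched in a few lines; the two-place "route" at the bottom is a made-up example, not drawn from ISO 10303-214 data:

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from input places, produce tokens
    in output places, returning the new marking."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# hypothetical one-step product route: raw part is machined on a free machine
m0 = {"raw": 1, "machine_free": 1}
m1 = fire(m0, {"raw": 1, "machine_free": 1}, {"machined": 1, "machine_free": 1})
```

    Controller synthesis and verification then work over the reachable markings of nets like this, generated automatically from the static product/process/resource data.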

  18. Issues to be considered on obtaining plant models for formal verification purposes

    Science.gov (United States)

    Pacheco, R.; Gonzalez, L.; Intriago, M.; Machado, J.; Prisacaru, G.; Olaru, D.

    2016-08-01

    The development of dependable software for mechatronic systems can be a very complex and hard task. To facilitate obtaining dependable software for industrial controllers, some powerful software tools and analysis techniques can be used. In particular, when using simulation and formal verification analysis techniques, it is necessary to develop plant models in order to describe the plant behavior of those systems. However, developing a plant model implies that the designer makes decisions concerning the granularity and level of abstraction of the models, the modeling approach to adopt (global or modular), and the strategies to follow for simulation and formal verification tasks. This paper intends to highlight some aspects that should be considered when making those decisions. For this purpose, a case study is presented, and some very important aspects concerning the issues above are illustrated and discussed.

  19. On the Verification of a WiMax Design Using Symbolic Simulation

    Directory of Open Access Journals (Sweden)

    Gabriela Nicolescu

    2013-07-01

    Full Text Available In top-down multi-level design methodologies, design descriptions at higher levels of abstraction are incrementally refined to the final realizations. Simulation-based techniques have traditionally been used to verify that such model refinements do not change the design functionality. Unfortunately, with computer simulations it is not possible to completely check that a design transformation is correct in a reasonable amount of time, as the number of test patterns required to do so increases exponentially with the number of system state variables. In this paper, we propose a methodology for verifying that the models generated at higher levels of abstraction in the design process conform to the design specifications. We model the system behavior using sequences of recurrence equations. We then use symbolic simulation together with equivalence checking and property checking techniques for design verification. Using the proposed method, we have verified the equivalence of three WiMax system models at different levels of design abstraction, and the correctness of various system properties on those models. Our symbolic modeling and verification experiments show that the proposed verification methodology provides a performance advantage over its numerical counterpart.
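
    The advantage of symbolic over numerical checking can be seen in miniature: for blocks whose behavior is polynomial of known degree, agreement on degree+1 points is a complete equivalence check, unlike any finite amount of random simulation. This toy (not the paper's recurrence-equation method) compares a "specification" polynomial with a Horner-form "refinement":

```python
def poly_equivalent(f, g, degree_bound, points=None):
    """Two polynomial functions of degree <= d are identical iff they agree
    on d+1 distinct points - a finite, complete check, in contrast to
    sampling random numerical test patterns."""
    pts = points or range(degree_bound + 1)
    return all(f(x) == g(x) for x in pts)

# the same behavior at two refinement levels: direct form vs Horner form
spec = lambda x: 2 * x ** 2 + 3 * x + 1
impl = lambda x: (2 * x + 3) * x + 1
assert poly_equivalent(spec, impl, degree_bound=2)
```

    Symbolic simulation generalizes this idea: it manipulates expressions rather than sampled values, so one run covers whole classes of inputs.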

  20. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    Science.gov (United States)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in a collaboration; as a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is first necessary to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on the application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. First, this entails translating the collaborative process model from BPMN into the UPPAAL modelling language, a 'Network of Timed Automata'. Second, the interoperability requirements must be formalised into properties in the dedicated UPPAAL language, i.e. the temporal logic TCTL.
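
    At its core, checking a reachability property over a network of automata amounts to a search of the product state space (UPPAAL additionally handles clocks symbolically and queries in TCTL). A toy explicit-state sketch with an invented two-partner protocol:

```python
from collections import deque

def reachable(initial, transitions, goal):
    """Breadth-first search over a finite product state space - the core of
    explicit-state model checking of a reachability property."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if state == goal:
            return True
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# invented product of two partners' local states: (requester, provider)
t = {
    ("idle", "idle"): [("sent", "idle")],
    ("sent", "idle"): [("sent", "received")],
    ("sent", "received"): [("done", "acked")],
}
assert reachable(("idle", "idle"), t, ("done", "acked"))
```

    An interoperability requirement such as "every request is eventually acknowledged" would be phrased in TCTL and checked by UPPAAL against the translated BPMN model in a similar spirit.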

  1. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  2. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  3. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands a thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily with regard to how effectively and by whom these four stages are carried out.

  4. Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.

    Science.gov (United States)

    Dasbach, Erik J; Elbasha, Elamin H

    2017-07-01

    Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
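
    One concrete software-engineering practice the authors advocate is automated verification tests on model components. A hypothetical example for a Markov cohort model, with invented transition probabilities; real guidance would specify many more checks (trace conservation, half-cycle correction, discounting, etc.):

```python
def verify_transition_matrix(matrix, tol=1e-9):
    """Basic verification checks for a Markov cohort model: every
    probability lies in [0, 1] and every row sums to 1."""
    for row in matrix:
        if any(p < 0 or p > 1 for p in row):
            return False
        if abs(sum(row) - 1.0) > tol:
            return False
    return True

# hypothetical 3-state model: healthy, sick, dead (absorbing)
ok = [[0.9, 0.08, 0.02], [0.0, 0.7, 0.3], [0.0, 0.0, 1.0]]
bad = [[0.9, 0.08, 0.01], [0.0, 0.7, 0.3], [0.0, 0.0, 1.0]]
assert verify_transition_matrix(ok) and not verify_transition_matrix(bad)
```

    Checks like this catch the silent spreadsheet and coding errors the paper warns about, and they double as executable documentation of the model's assumptions.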

  5. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
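
    Guyan (static) reduction, the basis of the second methodology listed, condenses slave degrees of freedom out of a stiffness matrix via a Schur complement. For a single slave DOF this fits in a few lines; the spring-chain numbers below are an invented sanity check, unrelated to the SLS models:

```python
def guyan_reduce_last_dof(k):
    """Static (Guyan) condensation of the last DOF of a symmetric stiffness
    matrix: K_red = K_mm - K_ms * K_sm / K_ss (scalar Schur complement)."""
    n = len(k) - 1
    kss = k[n][n]
    return [[k[i][j] - k[i][n] * k[n][j] / kss for j in range(n)]
            for i in range(n)]

# 3-DOF spring chain with unit stiffnesses; condensing the free end DOF
# must leave the single spring connecting the remaining two DOFs
k_full = [[1.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]]
print(guyan_reduce_last_dof(k_full))  # → [[1.0, -1.0], [-1.0, 1.0]]
```

    Modified Guyan and Harmonic Reduction refine this static condensation to better preserve dynamic behavior, which is what the V&V effort must demonstrate.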

  6. Functional verification of a safety class controller for NPPs using a UVM register Model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyu Chull [Dept. of Applied Computer Engineering, Dankook University, Cheonan (Korea, Republic of)

    2014-06-15

    A highly reliable safety class controller for NPPs (Nuclear Power Plants) is mandatory, as even a minor malfunction can lead to disastrous consequences for people, the environment or the facility. In order to enhance the reliability of a safety class digital controller for NPPs, we employed a diversity approach, in which a PLC-type controller and a PLD-type controller are operated in parallel. We built and used structured testbenches based on the classes supported by UVM for functional verification of the PLD-type controller designed for NPPs. We incorporated a UVM register model into the testbenches in order to increase the controllability and the observability of the DUT (Device Under Test). With the increased testability, we could easily verify the datapaths between I/O ports and the register sets of the DUT; otherwise we would have had to perform black-box tests for the datapaths, which are very cumbersome and time consuming. We were also able to perform constrained random verification very easily and systematically. From the study, we confirmed the various advantages of using the UVM register model in verification, such as scalability, reusability and interoperability, and set some design guidelines for verification of the NPP controllers.

  7. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models used in MACCS 1.5. The reviews uncovered errors, which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  8. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
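
    A code verification benchmark based on a manufactured solution works by choosing an analytic solution, deriving the forcing it induces, and confirming that the discrete solver converges to it at the expected rate. A minimal sketch for -u'' = f on [0, 1] with homogeneous Dirichlet conditions (a toy example, not one of the paper's recommended benchmarks):

```python
from math import pi, sin

def solve_poisson_dirichlet(f, n):
    """Second-order finite differences for -u'' = f, u(0) = u(1) = 0,
    solved with the Thomas algorithm on the tridiagonal system."""
    h = 1.0 / (n + 1)
    a, b, c = [-1.0] * n, [2.0] * n, [-1.0] * n  # sub-, main, super-diagonal
    d = [f((i + 1) * h) * h * h for i in range(n)]
    for i in range(1, n):                 # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u

# manufactured solution u(x) = sin(pi x)  =>  f(x) = pi^2 sin(pi x)
def max_error(n):
    h = 1.0 / (n + 1)
    u = solve_poisson_dirichlet(lambda x: pi ** 2 * sin(pi * x), n)
    return max(abs(ui - sin(pi * (i + 1) * h)) for i, ui in enumerate(u))

print(max_error(15) / max_error(31))  # ≈ 4: halving h quarters the error
```

    The pass criterion of such a benchmark is precisely this observed convergence rate matching the formal order of the scheme, which tests the coding, not the physics.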

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: TRITON SYSTEMS, LLC SOLID BOWL CENTRIFUGE, MODEL TS-5000

    Science.gov (United States)

    Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft³. ...

  10. VERIFICATION OF 3D BUILDING MODELS USING MUTUAL INFORMATION IN AIRBORNE OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    A. P. Nyaruhuma

    2012-07-01

    Full Text Available This paper describes a method for automatic verification of 3D building models using airborne oblique images. The problem being tackled is identifying buildings that have been demolished or changed since the models were constructed, or identifying wrong models, using the images. The models verified are of CityGML LOD2 or higher, since their edges are expected to coincide with actual building edges. The verification approach is based on information theory. Corresponding variables between building models and oblique images are used for deriving mutual information for individual edges, faces or whole buildings, and combined over all perspective images available for the building. The wireframe model edges are projected to the images and verified using low-level image features - the image pixel gradient directions. A building part is only checked against images in which it may be visible. The method has been tested with models constructed using laser points against Pictometry images that are available for most cities of Europe and may be publicly viewed in the so-called Bird's Eye view of Microsoft Bing Maps. Results are that nearly all buildings are correctly categorised as existing or demolished. Because we now concentrate only on roofs, we also used the method to test and compare results from nadir images. This comparison made clear that height errors in models, especially, can be more reliably detected in oblique images because of the tilted view. Besides overall building verification, results per individual edge can be used for improving the 3D building models.
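
    The mutual-information idea can be sketched with an empirical estimate over joint samples. The "matched" and "uniform" samples below are invented stand-ins for binned model-edge directions paired with binned image gradient directions; the paper's actual estimator over pixel gradients is more involved:

```python
from math import log2
from collections import Counter

def mutual_information(pairs):
    """Empirical I(X;Y) in bits from joint samples, via joint and
    marginal relative frequencies."""
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# model edge direction vs. image gradient direction, binned to h/v
matched = [("h", "h")] * 40 + [("v", "v")] * 40 + [("h", "v")] * 10 + [("v", "h")] * 10
uniform = [("h", "h")] * 25 + [("h", "v")] * 25 + [("v", "h")] * 25 + [("v", "v")] * 25
print(round(mutual_information(matched), 3), mutual_information(uniform))  # → 0.278 0.0
```

    An existing building yields high mutual information between projected edges and gradients; a demolished one yields values near the independent (zero) baseline, which is the basis of the categorisation.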

  11. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

    Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibration/verification on the basis of the data series available, which generates elements of sheer guessing - unless the universality of the model is based on induction, i.e. experience from the sum of all previous investigations. There is a need to deal more explicitly with uncertainty...

  12. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  13. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release... Using SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.
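
    Bounded model checking explores only executions up to a fixed depth, which is why it is efficient at bug-finding. A toy explicit-state version (the tool suite described uses SMT solving over a symbolic model, not enumeration), with an invented two-train occupancy model:

```python
def bmc_find_violation(init, step, bad, k):
    """Search all executions of length <= k for a state violating a safety
    property (here expressed as a 'bad' predicate becoming true)."""
    frontier = [init]
    for depth in range(k + 1):
        for state in frontier:
            if bad(state):
                return depth, state
        frontier = [s2 for s in frontier for s2 in step(s)]
    return None

# invented interlocking: two trains on 3 sections; same section = collision
def step(state):
    a, b = state
    return [((a + 1) % 3, b), (a, (b + 1) % 3)]  # either train advances

v = bmc_find_violation((0, 1), step, lambda s: s[0] == s[1], k=4)
print(v)  # → (1, (1, 1)): the trains can collide after one move
```

    A real interlocking design would forbid that move; BMC reports the shortest such counterexample, while proving its absence for all depths requires the inductive reasoning the paper combines with BMC.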

  14. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release... Using SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.

  15. Formal modeling and verification of systems with self-x properties

    OpenAIRE

    Reif, Wolfgang

    2006-01-01

    Formal modeling and verification of systems with self-x properties / Matthias Güdemann, Frank Ortmeier and Wolfgang Reif. - In: Autonomic and trusted computing : third international conference, ATC 2006, Wuhan, China, September 3-6, 2006 ; proceedings / Laurence T. Yang ... (eds.). - Berlin [u.a.] : Springer, 2006. - S. 38-47. - (Lecture notes in computer science ; 4158)

  16. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  17. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  18. Space Weather Models and Their Validation and Verification at the CCMC

    Science.gov (United States)

    Hesse, Michael

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.

  19. BProVe: A formal verification framework for business process models

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

    Business Process Modelling has acquired increasing relevance in software development. Available notations, such as BPMN, permit to describe activities of complex organisations. On the one hand, this shortens the communication gap between domain experts and IT specialists. On the other hand...

  20. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

    Most of CERN's industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work, an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, a precise, mathematically based method for automatically checking formalized requirements against the system.
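A minimal sketch of the model-checking idea behind this approach, assuming a toy motor-interlock logic (invented here, not UNICOS code): the PLC scan cycle is abstracted as a transition function, and every reachable state is checked against a safety requirement.

```python
from itertools import product

# Hypothetical PLC logic: the motor may only run while the guard is closed
# and a start command is latched; stop always wins over start.
def step(state, inputs):
    start, stop, guard_closed = inputs
    latched = (state["latched"] or start) and not stop
    return {"latched": latched, "motor": latched and guard_closed}

def check_safety(initial):
    """Explicit-state exploration: verify that 'motor implies guard closed'
    holds in every reachable state, for every input combination."""
    seen, frontier = set(), [initial]
    while frontier:
        state = frontier.pop()
        key = tuple(sorted(state.items()))
        if key in seen:
            continue
        seen.add(key)
        for inputs in product([False, True], repeat=3):
            nxt = step(state, inputs)
            guard_closed = inputs[2]
            if nxt["motor"] and not guard_closed:
                return False  # counterexample found
            frontier.append(nxt)
    return True

print(check_safety({"latched": False, "motor": False}))
```

Real model checkers such as the one targeted in the paper work on the same principle, but symbolically and at far larger scale.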

  1. Translating Activity Diagram from Duration Calculus for Modeling of Real-Time Systems and its Formal Verification using UPPAAL and DiVinE

    Directory of Open Access Journals (Sweden)

    Muhammad Abdul Basit Ur Rehman

    2016-01-01

    The RTS (Real-Time Systems) are widely used in industry, home appliances, life saving systems, aircrafts, and automatic weapons. These systems need more accuracy, safety, and reliability. An accurate graphical modeling and verification of such systems is really challenging. The formal methods made it possible to model such systems with more accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it on a case study of fuel filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of methodology for modeling and verification of large scale real-time systems with reduced verification cost.

  2. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    International Nuclear Information System (INIS)

    Rahim, M.A.B.U.; Arif, F.

    2016-01-01

    The RTS (Real-Time Systems) are widely used in industry, home appliances, life saving systems, aircrafts, and automatic weapons. These systems need more accuracy, safety, and reliability. An accurate graphical modeling and verification of such systems is really challenging. The formal methods made it possible to model such systems with more accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it on a case study of fuel filling machine. We have defined DC (Duration Calculus) implementation based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of methodology for modeling and verification of large scale real-time systems with reduced verification cost. (author)

  3. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and include 7009 tools in reporting VV&C status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VV&C updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. 
Reporting tools have evolved over the lifetime of

  4. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
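The way an integrated system can outperform each technology subsystem individually can be illustrated with the standard at-least-one-detects formula. This is only a sketch under an independence assumption; the per-technology probabilities below are invented, not IVSEM outputs.

```python
def integrated_detection(p_by_technology):
    """Probability that at least one technology detects the event,
    assuming independent subsystems: 1 minus the product of miss probabilities."""
    miss = 1.0
    for p in p_by_technology.values():
        miss *= (1.0 - p)
    return 1.0 - miss

# Hypothetical per-technology detection probabilities for one scenario
probs = {"seismic": 0.90, "infrasound": 0.40,
         "radionuclide": 0.60, "hydroacoustic": 0.10}
print(round(integrated_detection(probs), 4))
```

Even weak subsystems raise the integrated probability, which is the synergy effect the abstract refers to; a real model would also have to treat correlated misses and location accuracy.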

  5. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  6. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
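The per-requirement Verification Plan structure described above could be approximated in code as follows. The field names mirror the paper's terminology, but the classes, IDs, and the traceability check are hypothetical illustrations, not the LSST SysML model or the Primavera P6 interface.

```python
from dataclasses import dataclass, field

@dataclass
class VerificationPlan:
    requirement: str
    success_criteria: str
    methods: list          # e.g. ["Test", "Analysis", "Inspection"]
    level: str             # e.g. "System" or "Subsystem"
    owner: str
    activities: list = field(default_factory=list)  # Verification Activity IDs

def untraced(plans, scheduled_activities):
    """Requirements whose verification activities are not all mapped to a
    scheduled (costed, resource-loaded) task in the project schedule."""
    return [p.requirement for p in plans
            if not p.activities
            or any(a not in scheduled_activities for a in p.activities)]

# Hypothetical plans and schedule
plans = [
    VerificationPlan("REQ-001", "Image quality meets spec", ["Test"],
                     "System", "Systems Eng.", ["ACT-10"]),
    VerificationPlan("REQ-002", "Slew time < 5 s", ["Analysis"],
                     "Subsystem", "Telescope Eng.", []),
]
print(untraced(plans, {"ACT-10"}))
```

A check like `untraced` is the programmatic counterpart of the full traceability claim at the end of the abstract: it surfaces any requirement that would not be verified by a scheduled activity.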

  7. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  8. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  9. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    International Nuclear Information System (INIS)

    Awadalla, N.G.; Eaton, S.C.F.

    1996-01-01

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  10. The Fundamentals of the Air Sampler Calibration-Verification Process

    International Nuclear Information System (INIS)

    Gavila, F.M.

    2011-01-01

    The calibration of an air sampling instrument using a reference air flow calibrator requires attention to scientific detail in order to establish that the instrument's reported values are correctly stated and valid under the actual operating conditions of the air sampling instrument. The primary objective of an air flow calibration-verification is to ensure that the device under test (DUT) is within the manufacturer's stated accuracy range of temperature, pressure and humidity conditions under which the instrument was designed to operate. The DUT output values are compared to those obtained from a reference instrument (REF) measuring the sample physical parameter that the DUT is measuring. An accurate comparison of air flow rates or air volumes requires that the comparison of the DUT and REF values be made under the same temperature and pressure conditions. It is absolutely necessary that the REF be more accurate than the DUT; otherwise, it can not be considered a reference instrument. The REF should be at least twice as accurate and, if possible, it should be four times as accurate as the DUT. Upon confirmation that the DUT meets the manufacturer's accuracy criteria, the technician must place a calibration sticker or label indicating the date of calibration, the expiration date of the calibration and an authorized signature. If it is a limited-use instrument, the label should state the limited-use operating range. The serial number and model number of the instrument should also be shown on the calibration sticker. A specific calibration file for each instrument by serial number should be kept in the calibration laboratory file records. Instruments that display gas flow or gas volume values corrected to a reference temperature and pressure are very desirable. The ideal situation is when both the DUT and the REF output flow rate or volume values are at the same conditions of T and P. The calibration-verification is, then, a simple process. 
The credibility of an air
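Comparing DUT and REF readings "under the same temperature and pressure conditions" amounts to scaling both to a common reference state. The sketch below assumes simple ideal-gas scaling and invented readings; it is not a prescribed calibration procedure.

```python
def to_standard_flow(q_obs, p_obs_kpa, t_obs_c, p_std_kpa=101.325, t_std_c=25.0):
    """Convert an observed volumetric flow to standard conditions so DUT
    and REF readings can be compared on the same basis (ideal-gas scaling)."""
    t_obs_k, t_std_k = t_obs_c + 273.15, t_std_c + 273.15
    return q_obs * (p_obs_kpa / p_std_kpa) * (t_std_k / t_obs_k)

def acceptable_reference(dut_accuracy_pct, ref_accuracy_pct, ratio=2.0):
    """The REF must be at least `ratio` times as accurate as the DUT
    (the abstract recommends 2x, ideally 4x)."""
    return ref_accuracy_pct * ratio <= dut_accuracy_pct

# Hypothetical: DUT read 30.0 L/min at 95.0 kPa and 30 C
q_std = to_standard_flow(30.0, 95.0, 30.0)
print(round(q_std, 2), acceptable_reference(5.0, 2.0))
```

With both instruments expressed at the same reference conditions, the DUT-versus-REF comparison reduces to a simple tolerance check against the manufacturer's stated accuracy.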

  11. Design of Service Net based Correctness Verification Approach for Multimedia Conferencing Service Orchestration

    Directory of Open Access Journals (Sweden)

    Cheng Bo

    2012-02-01

    Multimedia conferencing is increasingly becoming a very important and popular application over the Internet. Because of the complexity of asynchronous communication and the need to handle large, dynamically concurrent processes in multimedia conferencing, achieving sufficient correctness guarantees is a relevant challenge, and supporting effective verification methods for multimedia conferencing service orchestration is an extremely difficult problem. In this paper, we first present the Business Process Execution Language (BPEL) based conferencing service orchestration, and mainly focus on the service net based correctness verification approach for multimedia conferencing service orchestration, which automatically translates the BPEL-based service orchestration into a corresponding Petri net model in the Petri Net Markup Language (PNML); we also present the BPEL service net reduction rules and the correctness verification algorithms for multimedia conferencing service orchestration. We perform the correctness analysis and verification using service net properties such as safeness, reachability, and deadlock freedom, and also provide an automated support tool for the formal analysis and soundness verification of multimedia conferencing service orchestration scenarios. Finally, we give a comparison and evaluation.
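The reachability and deadlock checks performed on the derived Petri net can be illustrated on a toy net. This is a generic explicit-state sketch over an invented two-step flow, not the authors' reduction rules or tool.

```python
from collections import deque

def reachable_markings(initial, transitions):
    """Explicit-state exploration of a Petri net given as
    {name: (consume, produce)} maps over place names."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        marking = dict(queue.popleft())
        for consume, produce in transitions.values():
            if all(marking.get(p, 0) >= n for p, n in consume.items()):
                nxt = dict(marking)
                for p, n in consume.items():
                    nxt[p] -= n
                for p, n in produce.items():
                    nxt[p] = nxt.get(p, 0) + n
                frozen = frozenset(nxt.items())
                if frozen not in seen:
                    seen.add(frozen)
                    queue.append(frozen)
    return seen

def deadlocks(markings, transitions):
    """Markings in which no transition is enabled."""
    dead = []
    for m in markings:
        marking = dict(m)
        if not any(all(marking.get(p, 0) >= n for p, n in c.items())
                   for c, _ in transitions.values()):
            dead.append(marking)
    return dead

# Toy BPEL-like flow: invoke a service, then receive its reply
net = {
    "invoke": ({"start": 1}, {"waiting": 1}),
    "reply":  ({"waiting": 1}, {"done": 1}),
}
marks = reachable_markings(frozenset({"start": 1}.items()), net)
print(len(marks), deadlocks(marks, net))
```

Here the only "deadlock" is the final marking with a token in `done`, which is the desired termination; soundness-style analyses therefore distinguish the intended final marking from genuine deadlocks.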

  12. GTE blade injection moulding modeling and verification of models during process approbation

    Science.gov (United States)

    Stepanenko, I. S.; Khaimovich, A. I.

    2017-02-01

    The simulation model for filling the mould was developed using Moldex3D, and it was experimentally verified in order to perform further optimization calculations of the moulding process conditions. The method described in the article allows the finite-element model to be calibrated by minimizing the difference between the simulated and experimental melt-front profiles across the airfoil, achieved through differentiated changes in the power supplied to the heating elements of the injection mould in the simulation. As a result of calibrating the injection mould for the gas-turbine engine blade, a mean difference of no more than 4% between the simulated melt-front profile and the experimental airfoil profile was achieved.

  13. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
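Run-time monitoring combined with a fault-tolerant fallback, as mentioned at the end of the survey, can be sketched generically. The controller functions and bounds below are invented for illustration; they stand in for a fast primary routine and a simple, independently verified safe routine.

```python
def monitored(primary, fallback, invariant):
    """Run-time monitoring: accept the primary routine's result only when it
    satisfies the invariant; otherwise fall back to a simpler safe routine.
    Exceptions in the primary also trigger the fallback."""
    def wrapped(*args):
        try:
            result = primary(*args)
        except Exception:
            return fallback(*args)
        return result if invariant(result) else fallback(*args)
    return wrapped

# Hypothetical rate command that must stay within safe bounds
fast_controller = lambda err: err * 12.0                  # possibly buggy gain
safe_controller = lambda err: max(-1.0, min(1.0, err))    # clamped, conservative
in_bounds = lambda cmd: -5.0 <= cmd <= 5.0

control = monitored(fast_controller, safe_controller, in_bounds)
print(control(0.1), control(2.0))
```

The value of this pattern in the small-satellite setting is that the monitor catches errors that escaped pre-launch verification, including faults induced after launch by ionizing radiation.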

  14. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project "Verification of the thermal design of electronic equipment" studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the "Cool Electronics" research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  15. Engineering within the assembly, verification, and integration (AIV) process in ALMA

    Science.gov (United States)

    Lopez, Bernhard; McMullin, Joseph P.; Whyborn, Nicholas D.; Duvall, Eugene

    2010-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an interferometer in the millimeter and sub-millimeter wavelength range. It will be located at an altitude above 5000m in the Chilean Atacama desert. As part of the ALMA construction phase the Assembly, Verification and Integration (AIV) team receives antennas and instrumentation from Integrated Product Teams (IPTs), verifies that the sub-systems perform as expected, performs the assembly and integration of the scientific instrumentation and verifies that functional and performance requirements are met. This paper aims to describe those aspects related to the AIV Engineering team: its role within the 4-station AIV process, the different phases the group underwent, lessons learned, and potential areas for improvement. AIV Engineering initially focused on the preparation of the necessary site infrastructure for AIV activities, on the purchase of tools and equipment and on the first ALMA system installations. With the first antennas arriving on site the team started to gather experience with AIV Station 1 beacon holography measurements for the assessment of the overall antenna surface quality, and with optical pointing to confirm the antenna pointing and tracking capabilities. With the arrival of the first receiver, AIV Station 2 was developed, which focuses on the installation of electrical and cryogenic systems and incrementally establishes the full connectivity of the antenna as an observing platform. Further antenna deliveries then allowed the team to refine the related procedures, develop staff expertise, and transition towards a more routine production process. 
Stations 3 and 4 deal with verification of the antenna with integrated electronics by the AIV Science Team and is not covered

  16. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.
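A hypothesis such as "the two runs agree to within 2%" can be tested with a simple quantitative metric before any visual comparison. The sketch below uses an invented maximum-relative-difference measure and made-up profiles; it is not the article's visualization pipeline.

```python
def relative_difference(field_a, field_b):
    """Pointwise comparison of two simulation fields: returns the maximum
    relative difference, a simple quantitative verification metric."""
    assert len(field_a) == len(field_b)
    worst = 0.0
    for a, b in zip(field_a, field_b):
        denom = max(abs(a), abs(b), 1e-30)  # guard against division by zero
        worst = max(worst, abs(a - b) / denom)
    return worst

# Hypothetical density profiles: a reference run vs. a refactored code
reference = [1.00, 0.98, 0.95, 0.91]
candidate = [1.00, 0.98, 0.94, 0.91]
diff = relative_difference(reference, candidate)
print(diff <= 0.02)  # hypothesis: the runs agree to within 2%
```

When such a metric rejects the hypothesis, comparative and feature visualization then helps locate where and why the runs diverge.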

  17. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  18. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in depth case study...... of a leader election protocol; Modeling it in Ecdar as Timed input/output automata Specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification showing a huge difference...

  19. Verification of the safety communication protocol in train control system using colored Petri net

    International Nuclear Information System (INIS)

    Chen Lijie; Tang Tao; Zhao Xianqiong; Schnieder, Eckehard

    2012-01-01

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and its modeling flexibility under changing system conditions, this paper proposes a compositional Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved to be safe by means of state space analysis: the dead markings are correct, there are no dead transitions, and the net is fair. Further analysis results have been obtained using a formal and simulation-based verification approach. The timed models for the open transmit system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also provide relevant timed and stochastic factors, such as time delay and packet loss, which may influence the time needed to establish a safety connection. The time to establish a safety connection in the normal state is verified formally, and the time to establish a safety connection under different packet-loss probabilities is then simulated. The verification shows that the time needed to establish a safety connection satisfies the safety requirements.

  20. A Formal Verification Method of Function Block Diagram

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun; Jee, Eun Kyoung; Jeon, Seung Jae; Park, Gee Yong; Kwon, Kee Choon

    2007-01-01

    Programmable Logic Controller (PLC), an industrial computer specialized for real-time applications, is widely used in diverse control systems in chemical processing plants, nuclear power plants and traffic control systems. As a PLC is often used to implement safety-critical embedded software, rigorous safety demonstration of PLC code is necessary. Function block diagram (FBD) is a standard application programming language for the PLC and is currently being used in the development of a fully digitalized reactor protection system (RPS), called the IDiPS, under the KNICS project. The verification of FBD programs is therefore a pressing problem of great importance. In this paper, we propose a formal verification method for FBD programs: we define FBD programs formally in compliance with IEC 61131-3, translate the programs into a Verilog model, and finally verify the model using the model checker SMV. To demonstrate the feasibility and effectiveness of this approach, we applied it to the IDiPS, which is currently being developed under the KNICS project. The remainder of this paper is organized as follows. Section 2 briefly describes Verilog and Cadence SMV. In Section 3, we introduce FBD2V, a tool implemented to support the proposed FBD verification framework. A summary and conclusion are provided in Section 4.
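The FBD-to-model-checker pipeline can be miniaturized as follows. Instead of Verilog and SMV, this sketch exhaustively checks a hypothetical 2-out-of-3 voting block, typical of RPS trip logic, against its specification; the function and property are illustrative, not IDiPS code:

```python
from itertools import product

def vote_2oo3(a, b, c):
    # Hypothetical 2-out-of-3 trip voter built from AND/OR blocks
    # (illustrative of RPS trip logic; not the IDiPS FBD code).
    return (a and b) or (a and c) or (b and c)

def spec(a, b, c):
    # Safety specification: trip whenever at least two channels demand it.
    return sum((a, b, c)) >= 2

# Exhaustive check over the whole (finite) input space -- the same idea
# a model checker applies to the translated model's state space.
violations = [bits for bits in product([False, True], repeat=3)
              if bool(vote_2oo3(*bits)) != spec(*bits)]
```

An empty `violations` list plays the role of the model checker reporting that the property holds in every reachable state.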

  1. Nuclear power plant C and I design verification by simulation

    International Nuclear Information System (INIS)

    Storm, Joachim; Yu, Kim; Lee, D.Y

    2003-01-01

    An important part of the Advanced Boiling Water Reactor (ABWR) project for the Taiwan NPP Lungmen Units no.1 and no.2 is the Full Scope Simulator (FSS). The simulator was built according to design data; therefore, apart from the training aspect, a major part of the development was to apply a simulation-based test bed for the verification, validation and improvement of plant design in the control and instrumentation (C and I) areas of unit control room equipment, the operator Man Machine Interface (MMI), process computer functions and plant procedures. Furthermore, the Full Scope Simulator will afterwards be used for training of the plant operators two years before Unit no.1 fuel load. The article describes the scope, methods and results of the advanced verification and validation process and highlights the advantages of test bed simulation for real power plant design and implementation. The application of advanced simulation software tools such as instrumentation and control translators, graphical model builders, process models, graphical on-line test tools and screen-based or projected soft panels allowed the team to complete the C and I verification in time, before the implementation of the Distributed Control and Information System (DCIS) started. An additional area of activity was Human Factors Engineering (HFE) for the operator MMI. Because the ABWR design incorporates display-based operation of most plant components, a dedicated verification and validation process is required by NUREG-0711. To support this activity, an engineering test system was installed for all the necessary HFE investigations. All detected improvements were properly documented and used to update the plant design documentation through a defined process. The Full Scope Simulator (FSS), with hard panels and a stimulated digital control and information system, is in the final acceptance test process with the end customer, Taiwan Power Company

  2. Finite element code FENIA verification and application for 3D modelling of thermal state of radioactive waste deep geological repository

    Science.gov (United States)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, U. N.

    2017-11-01

    The verification of the FENIA finite element code on some problems and an example of its application are presented in the paper. The code is being developed for 3D modelling of thermal, mechanical and hydrodynamical (THM) problems related to the functioning of deep geological repositories. Verification of the code has been performed for two analytical problems: the first is a point heat source with exponentially decreasing heat output, the second a linear heat source with similar behavior. The analytical solutions have been obtained by the authors. These problems were chosen because they reflect the processes influencing the thermal state of a deep geological repository of radioactive waste. Verification was performed for several meshes with different resolution, and good convergence between the analytical and numerical solutions was achieved. The application of the FENIA code is illustrated by 3D modelling of the thermal state of a prototypic deep geological repository of radioactive waste. The repository is designed for disposal of radioactive waste in rock at a depth of several hundred meters with no intention of later retrieval. Vitrified radioactive waste is placed in containers, which are placed in vertical boreholes. The residual decay heat of the radioactive waste heats the containers, the engineered safety barriers and the host rock. Maximum temperatures and the corresponding times at which they are established have been determined.
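The code-verification pattern described here, comparing numerical results against an analytical solution on successively refined meshes, can be sketched on a 1-D heat equation with a known exact solution. The solver below is a generic explicit finite-difference scheme, not FENIA:

```python
import math

def heat_error(n, steps=2000, t_end=0.1):
    """Explicit finite differences for u_t = u_xx on [0, 1] with
    u(0)=u(1)=0 and u(x,0)=sin(pi*x); the exact solution is
    exp(-pi^2*t)*sin(pi*x). Returns the max nodal error at t_end."""
    dx, dt = 1.0 / n, t_end / steps
    assert dt <= 0.5 * dx * dx, "explicit-scheme stability limit"
    u = [math.sin(math.pi * i * dx) for i in range(n + 1)]
    for _ in range(steps):
        u = ([0.0]
             + [u[i] + dt / dx ** 2 * (u[i - 1] - 2 * u[i] + u[i + 1])
                for i in range(1, n)]
             + [0.0])
    return max(abs(u[i] - math.exp(-math.pi ** 2 * t_end)
                   * math.sin(math.pi * i * dx)) for i in range(n + 1))

err_coarse = heat_error(10)
err_fine = heat_error(20)   # halved mesh size: error should shrink markedly
```

Agreement of the error decay with the scheme's theoretical order is exactly the kind of mesh-refinement evidence the abstract refers to.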

  3. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    International Nuclear Information System (INIS)

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

    Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process using Colored Petri Nets to verify the TTDMS functional safety and to evaluate the system performance. The paper makes three main contributions. Firstly, it proposes a formalized TTDMS model whose correctness is validated using state space analysis and simulation-based verification. Secondly, corresponding checking queries are proposed for the purpose of functional safety verification, and the TTDMS performance is evaluated by applying parameters in the formal model. Thirdly, the reliability of a functional TTDMS prototype is estimated. The procedure can accompany the system development, and both formal and simulation-based verifications are performed. Compared to executable code and purely mathematical methods, using this process to evaluate and verify a system is easier to read and more reliable. - Highlights: • A new Train to Train Distance Measurement System. • A new approach verifying system functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance applying simulation-based analysis.

  4. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  5. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm on learning probabilistic automata to reactive systems, where the observed system behavior is in the form of alternating sequences of inputs and outputs. We propose an algorithm for automatically learning a deterministic labeled Markov decision process model from the observed behavior of a reactive system. The proposed learning algorithm is adapted from algorithms for learning deterministic probabilistic finite automata, and extended to include both probabilistic and nondeterministic transitions. The algorithm is empirically analyzed and evaluated by learning system models of slot machines. The evaluation...
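The core of such a learning algorithm, estimating transition probabilities from alternating input/output observations by relative frequencies, can be sketched as follows. The traces and the state encoding (state = last output) are hypothetical simplifications, not the paper's algorithm:

```python
from collections import Counter, defaultdict

# Hypothetical observed traces of a reactive system (a slot-machine-style
# example): alternating (input, output) pairs.
traces = [
    [("coin", "ready"), ("pull", "win")],
    [("coin", "ready"), ("pull", "lose")],
    [("coin", "ready"), ("pull", "lose")],
    [("coin", "ready"), ("pull", "lose")],
]

counts = defaultdict(Counter)
for trace in traces:
    state = "init"
    for inp, out in trace:
        counts[(state, inp)][out] += 1   # frequency of (state, input) -> output
        state = out                      # identify the next state with the output

def transition_probs(state, inp):
    """Maximum-likelihood estimate of the output distribution."""
    c = counts[(state, inp)]
    total = sum(c.values())
    return {out: n / total for out, n in c.items()}
```

Inputs remain nondeterministic (chosen by the environment), while outputs get empirical probabilities, mirroring the probabilistic/nondeterministic split in a labeled MDP.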

  6. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Verification calculations were therefore focused on a catalytic cracking reactor developed by Neste. The studied reactor consists of prefluidisation and reaction zones. The verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)
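A reaction chain of this kind can be sketched as a set of first-order rate equations integrated in time. The rate constants below are purely illustrative, not values from the TREF data files, and the chain is collapsed to liquid -> gas -> product:

```python
# Forward-Euler integration of a simplified cracking chain
# oil(liquid) -> oil(gas) -> product, with hypothetical first-order
# rate constants (1/s); values are illustrative, not Neste's data.
k_vap, k_crack = 5.0, 2.0
dt, t_end = 1e-4, 2.0

liquid, gas, product = 1.0, 0.0, 0.0   # normalized mass fractions
for _ in range(int(t_end / dt)):
    r_vap = k_vap * liquid         # vaporization rate
    r_crack = k_crack * gas        # cracking pseudoreaction rate
    liquid -= dt * r_vap
    gas += dt * (r_vap - r_crack)
    product += dt * r_crack

total_mass = liquid + gas + product    # should stay 1 (mass conservation)
```

Because the three increments sum to zero at every step, total mass is conserved exactly up to rounding, a cheap sanity check of the kind such verification calculations rely on.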

  7. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  8. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  9. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; the terrestrial environment including foodchains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer at long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. The majority of these models are not currently available

  10. Towards a Generic Information Data Model for Verification, Validation & Accreditation VV&A

    NARCIS (Netherlands)

    Roza, Z.C.; Voogd, J.M.; Giannoulis, C.

    2008-01-01

    The Generic Methodology for Verification, Validation and Acceptance (GM-VV) is intended to provide a common generic framework for making formal and well-balanced acceptance decisions on a specific usage of models, simulations and data. GM-VV will provide the international M&S community with a

  11. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to accurately represent land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated
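Model verification capabilities of the kind transitioned to KMD typically start from point metrics such as bias and RMSE of forecasts against observations. A minimal sketch with made-up numbers (not KMD data):

```python
import math

# Illustrative 2 m temperature forecasts vs. observations (deg C);
# the values are invented for demonstration only.
forecast = [24.1, 25.3, 23.8, 26.0, 24.9]
observed = [23.5, 25.0, 24.2, 26.4, 24.7]

n = len(forecast)
# Mean error (bias): positive means the model runs warm on average.
bias = sum(f - o for f, o in zip(forecast, observed)) / n
# Root-mean-square error: penalizes large misses more heavily.
rmse = math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed)) / n)
```

Operational verification packages aggregate exactly these quantities over stations, lead times, and variables.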

  12. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Christina [Bundesanstalt fuer Materialforschung und -pruefung (BAM), Berlin (Germany); Oeberg, Tomas [Tomas Oeberg Konsult AB, Lyckeby (Sweden)

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters with Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which here is considered to consist of the sealing of the canister by welding followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate the main process factors and their interactions. Response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered. Experiences from other fields of the materials sciences suggest that both of these approaches are useful. The initial verification experiments of the two welding technologies are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and optimisation of the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a data base suitable for a baseline estimate of future performance.
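The factorial-design analysis mentioned in the strategy can be illustrated with a 2^2 example: main effects and the interaction are simple contrasts of the run averages. All factor names and responses below are hypothetical, not data from the report:

```python
# Hypothetical 2^2 factorial weld experiment: factor A = weld speed,
# factor B = beam power, coded -1/+1; responses are illustrative
# weld-depth measurements (mm).
runs = {(-1, -1): 4.2, (+1, -1): 5.1, (-1, +1): 6.3, (+1, +1): 7.4}

def main_effect(factor):
    """Average response at the +1 level minus the average at -1."""
    hi = [y for levels, y in runs.items() if levels[factor] == +1]
    lo = [y for levels, y in runs.items() if levels[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effect_A = main_effect(0)
effect_B = main_effect(1)
# Interaction AB: half the change in A's effect between high and low B.
interaction_AB = ((runs[(+1, +1)] - runs[(-1, +1)])
                  - (runs[(+1, -1)] - runs[(-1, -1)])) / 2
```

With only four runs (matching the four complete lid welds per plan) both main effects and their interaction are already estimable.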

  13. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    International Nuclear Information System (INIS)

    Mueller, Christina; Oeberg, Tomas

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters with Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which here is considered to consist of the sealing of the canister by welding followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate the main process factors and their interactions. Response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered. Experiences from other fields of the materials sciences suggest that both of these approaches are useful. The initial verification experiments of the two welding technologies are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and optimisation of the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a data base suitable for a baseline estimate of future performance. This estimate can

  15. Verification of the karst flow model under laboratory controlled conditions

    Science.gov (United States)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They consist of an extremely complex structure defined by a slow, laminar porous medium with small fissures, and usually fast, turbulent conduits/karst channels. Apart from simple lumped hydrological models that ignore the high karst heterogeneity, full hydraulic (distributive) models have been developed, exclusively with conventional finite element and finite volume methods, considering the complete karst heterogeneity structure; these improve our understanding of the complex processes in karst. Groundwater flow modeling in complex karst aquifers is faced with many difficulties, such as a lack of knowledge of the heterogeneity (especially the conduits), the resolution of different spatial/temporal scales, the connectivity between matrix and conduits, the setting of appropriate boundary conditions, and many others. A particular problem of karst flow modeling is the verification of distributive models under real aquifer conditions, due to the lack of the above-mentioned information. Therefore, we show here the possibility to verify karst flow models under laboratory controlled conditions. A special 3-D karst flow model (5.6*2.6*2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs and other supply equipment. The model is filled with fine sand (3-D porous matrix) and drainage plastic pipes (1-D conduits). This physical model provides knowledge of the full heterogeneity structure, including the positions of the different sand layers as well as the conduit locations and geometry. Moreover, we know the geometry of the conduit perforations, which enables analysis of the interaction between matrix and conduits. In addition, the pressure and precipitation distribution and the discharge flow rates from both phases can be measured very accurately. These possibilities are not available at real sites, which makes this physical model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions such as different

  16. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    Science.gov (United States)

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  18. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    Science.gov (United States)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory has formed part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning, but also configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. Subject of the safety verification is, on the one hand, the on-orbit configuration itself, including the hardware and software products, and, on the other hand, the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through the physical and operational interaction of experiments. A major challenge is the implementation of a Safety process which is rigid enough to provide reliable verification of on-board Safety and which likewise provides enough

  19. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi...
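Tolerance verification at any scale ultimately reduces to checking measured features against a tolerance zone. A minimal sketch with illustrative nominal/tolerance values (hypothetical, not the CD specification):

```python
# Hypothetical pit-width check on an optical-disc sample: a nominal
# dimension with a symmetric tolerance zone (all numbers illustrative).
nominal_nm, tol_nm = 400.0, 30.0
measured_nm = [392.5, 404.1, 398.7, 411.0, 388.9]

lower, upper = nominal_nm - tol_nm, nominal_nm + tol_nm
out_of_tolerance = [m for m in measured_nm if not lower <= m <= upper]
conforms = not out_of_tolerance
```

In practice the tolerance zone would additionally be shrunk by the measurement uncertainty before conformance is declared, which is exactly why a smaller zone makes nanoscale verification harder.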

  20. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  1. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real time, by decision makers?

  2. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  3. Sealing of process valves for the HEU downblending verification experiment at Portsmouth

    International Nuclear Information System (INIS)

    Baldwin, G.T.; Bartberger, J.C.; Jenkins, C.D.; Perlinski, A.W.; Schoeneman, J.L.; Gordon, D.M.; Whiting, N.E.; Bonner, T.N.; Castle, J.M.

    1998-01-01

    At the Portsmouth Gaseous Diffusion Plant in Piketon, Ohio, USA, excess inventory of highly-enriched uranium (HEU) from US defense programs is being diluted to low-enriched uranium (LEU) for commercial use. The conversion is subject to a Verification Experiment overseen by the International Atomic Energy Agency (IAEA). The Verification Experiment is making use of monitoring technologies developed and installed by several DOE laboratories. One of the measures is a system for sealing valves in the process piping, which secures the path followed by uranium hexafluoride gas (UF6) from cylinders at the feed stations to the blend point, where the HEU is diluted with LEU. The Authenticated Item Monitoring System (AIMS) was the alternative proposed by Sandia National Laboratories that was selected by the IAEA. Approximately 30 valves were sealed by the IAEA using AIMS fiber-optic seals (AFOS). The seals employ single-core plastic fiber rated to 125 °C to withstand the high-temperature conditions of the heated piping enclosures at Portsmouth. Each AFOS broadcasts authenticated seal status and state-of-health messages via a tamper-protected radio-frequency transmitter mounted outside of the heated enclosure. The messages are received by two collection stations, operated redundantly

  4. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before the final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. The Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most potential measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal; e.g., criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage. The other option is the encapsulation plant. Crucial considerations are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to stand the handling operations of the verification measurements. (orig.)

  5. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  6. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  7. Real-Time Kennedy Space Center and Cape Canaveral Air Force Station High-Resolution Model Implementation and Verification

    Science.gov (United States)

    Shafer, Jaclyn A.; Watson, Leela R.

    2015-01-01

    This report describes 1.33-kilometer domain model performance for the 2014 warm season (May-September). Verification statistics were computed using the Model Evaluation Tools, which compared the model forecasts to observations. The mean error values were close to 0 and the root mean square error values were less than 1.8 for mean sea-level pressure (millibars), temperature (kelvin), dewpoint temperature (kelvin), and wind speed (meters per second), all very small differences between the forecast and observations considering the normal magnitudes of the parameters. The precipitation forecast verification results showed consistent under-forecasting of the precipitation object size. This could be an artifact of calculating the statistics for each hour rather than for the entire 12-hour period. The AMU will continue to generate verification statistics for the 1.33-kilometer WRF-EMS domain as data become available in future cool and warm seasons. More data will produce more robust statistics and reveal a more accurate assessment of model performance. Once the formal task was complete, the AMU conducted additional work to better understand the wind direction results. The results were stratified diurnally and by wind speed to determine what effects the stratifications would have on the model wind direction verification statistics. The results are summarized in the addendum at the end of this report. In addition to verifying the model's performance, the AMU also made the output available in the Advanced Weather Interactive Processing System II (AWIPS II). This allows the 45 WS and AMU staff to customize the model output display on the AMU and Range Weather Operations AWIPS II client computers and conduct real-time subjective analyses. In the future, the AMU will implement an updated version of the WRF-EMS model that incorporates local data assimilation. This model will also run in real-time and be made available in AWIPS II.
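The point-verification statistics referenced above (mean error near 0, RMSE below 1.8) can be sketched in a few lines. The real computation was done with the Model Evaluation Tools; the code and the paired temperature values below are illustrative assumptions, not the study's data.

```python
# Sketch of continuous point-verification statistics: mean error (bias) and
# root mean square error over matched forecast/observation pairs.
import math

def mean_error(forecasts, observations):
    """Bias: mean of (forecast - observation) over matched pairs."""
    return sum(f - o for f, o in zip(forecasts, observations)) / len(forecasts)

def rmse(forecasts, observations):
    """Root mean square error over matched pairs."""
    n = len(forecasts)
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecasts, observations)) / n)

# Hypothetical 2-m temperature pairs (kelvin)
fcst = [301.2, 302.0, 300.5, 299.8]
obs  = [301.0, 302.4, 300.9, 299.5]
print(mean_error(fcst, obs))  # near 0: little systematic bias
print(rmse(fcst, obs))        # small relative to the ~300 K magnitudes
```

A bias near zero with a small RMSE is exactly the "very small differences" pattern the abstract reports for pressure, temperature, dewpoint, and wind speed.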

  8. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C system of NPP. • RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, still remain important issues to be resolved, which have now become a global research point of interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition decision coverage (MC/DC) test, module test – MATLAB/Simulink co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed
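A bistable fixed-setpoint trip and its verification against an independent reference model can be illustrated with a minimal sketch. The setpoint, deadband, and both models below are assumptions for illustration only, not the paper's actual RPS design or its VHDL/Simulink artifacts; the sketch mirrors the spirit of the module-level co-simulation test, where an implementation is swept against a golden model.

```python
# Illustrative bistable trip with a reset deadband, checked against a
# separately written reference model over a sweep of process values.
TRIP_SETPOINT = 120.0   # hypothetical process value
DEADBAND = 2.0          # hypothetical reset hysteresis

def bistable_trip(value, tripped):
    """Implementation under test: trip at/above setpoint, reset at/below setpoint - deadband."""
    if value >= TRIP_SETPOINT:
        return True
    if value <= TRIP_SETPOINT - DEADBAND:
        return False
    return tripped  # hold previous state inside the deadband

def reference_model(value, tripped):
    # Independent "golden" formulation of the same behavior.
    return value >= TRIP_SETPOINT or (tripped and value > TRIP_SETPOINT - DEADBAND)

state_impl = state_ref = False
for v in [100, 119, 120, 119, 118.5, 118, 117, 121, 118]:
    state_impl = bistable_trip(v, state_impl)
    state_ref = reference_model(v, state_ref)
    assert state_impl == state_ref, f"mismatch at input {v}"
print("all states matched")
```

In an FPGA flow the implementation side would be the HDL module exercised through co-simulation rather than a Python function, but the comparison loop plays the same role.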

  9. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jaecheon; Heo, Gyunyoung

    2017-01-01

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C system of NPP. • RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, still remain important issues to be resolved, which have now become a global research point of interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition decision coverage (MC/DC) test, module test – MATLAB/Simulink co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed

  10. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.
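The first verification objective, independently counting active pins, can be sketched as a thresholding-and-scoring step over a reconstructed activity map. The activity values, threshold, and scoring metric below are illustrative assumptions, not the actual tomographic reconstruction or the IAEA scoring metrics described in the paper.

```python
# Classify each pin position as "active" or "missing" by thresholding a
# reconstructed per-pin activity map, then score against simulated truth.
def classify_pins(activity, threshold):
    return [a >= threshold for a in activity]

def pin_detection_score(predicted, truth):
    """Fraction of pin positions classified correctly."""
    correct = sum(p == t for p, t in zip(predicted, truth))
    return correct / len(truth)

truth    = [True, True, False, True, False, True]   # two pins removed ("virtual" assembly)
activity = [0.92, 0.88, 0.15, 0.95, 0.40, 0.90]     # reconstructed pin intensities
pred = classify_pins(activity, threshold=0.5)
print(pin_detection_score(pred, truth))
```

In the framework described above, such a score would be computed over large populations of simulated assemblies to compare reconstruction methods and instrument designs.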

  11. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers write test benches that involve various stages of verification activities at the register-transfer level (RTL), gate level, and place and route. Writing the test benches is considerably time-consuming and requires a lot of effort to achieve satisfactory results. Furthermore, performing the verification at each stage is a major bottleneck and demands many activities and much time. In addition, verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to the verification of an FPGA-based I and C system in an NPP that simultaneously verified the whole design modules using MATLAB/Simulink HDL co-simulation models. We introduced and discussed how the application of an integrated verification technique to the verification and testing of an FPGA-based I and C system design in an NPP can facilitate the verification processes and verify the entire design modules of the system simultaneously. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks.

  12. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers write test benches that involve various stages of verification activities at the register-transfer level (RTL), gate level, and place and route. Writing the test benches is considerably time-consuming and requires a lot of effort to achieve satisfactory results. Furthermore, performing the verification at each stage is a major bottleneck and demands many activities and much time. In addition, verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to the verification of an FPGA-based I and C system in an NPP that simultaneously verified the whole design modules using MATLAB/Simulink HDL co-simulation models. We introduced and discussed how the application of an integrated verification technique to the verification and testing of an FPGA-based I and C system design in an NPP can facilitate the verification processes and verify the entire design modules of the system simultaneously. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks.

  13. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference.
The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  14. Fuzzy Verification of Lower Dimensional Information in a Numerical Simulation of Sea Ice

    Science.gov (United States)

    Sulsky, D.; Levy, G.

    2010-12-01

    Ideally, a verification and validation scheme should be able to evaluate and incorporate lower dimensional features (e.g., discontinuities) contained within a bulk simulation even when not directly observed or represented by model variables. Nonetheless, lower dimensional features are often ignored. Conversely, models that resolve such features and the associated physics well, yet imprecisely, are penalized by traditional validation schemes. This can lead to (perceived or real) poor model performance and predictability and can become deleterious in model improvements when observations are sparse, fuzzy, or irregular. We present novel algorithms and a general framework for using information from available satellite data through fuzzy verification that efficiently and effectively remedy the known problems mentioned above. As a proof of concept, we use a sea-ice model with remotely sensed observations of leads in a one-step initialization cycle. Using the new scheme in a sixteen-day simulation experiment introduces model skill (against persistence) several days earlier than in the control run, improves the overall model skill and delays its drop-off at later stages of the simulation. Although sea-ice models are currently a weak link in climate models, the appropriate choice of data to use, and the fuzzy verification and evaluation of a system’s skill in reproducing lower dimensional features are important beyond the initial application to sea ice. Our strategy and framework for fuzzy verification, selective use of information, and feature extraction could be extended globally and to other disciplines. It can be incorporated in and complement existing verification and validation schemes, increasing their computational efficiency and the information they use. It can be used for model development and improvements, upscaling/downscaling models, and for modeling processes not directly represented by model variables or direct observations. Finally, if successful, it can
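One concrete instance of fuzzy (neighborhood) verification is the fractions skill score, a standard metric offered here as an illustration and not necessarily the scheme these authors use. Instead of penalizing a forecast point-by-point, it compares fractional feature coverage within a sliding window, so a slightly displaced feature (e.g., a lead) still earns credit. The 1-D binary fields below are made up for the example.

```python
# Fractions skill score (FSS) on 1-D binary fields: 1.0 is perfect,
# 0.0 is no skill. Larger windows reward near-misses.
def fractions(field, window):
    half = window // 2
    out = []
    for i in range(len(field)):
        lo, hi = max(0, i - half), min(len(field), i + half + 1)
        seg = field[lo:hi]
        out.append(sum(seg) / len(seg))
    return out

def fss(forecast, observed, window):
    pf, po = fractions(forecast, window), fractions(observed, window)
    n = len(pf)
    mse = sum((f - o) ** 2 for f, o in zip(pf, po)) / n
    mse_ref = sum(f * f + o * o for f, o in zip(pf, po)) / n
    return 1.0 - mse / mse_ref if mse_ref else 1.0

obs  = [0, 0, 1, 1, 0, 0, 0, 0]
fcst = [0, 0, 0, 1, 1, 0, 0, 0]   # same feature, displaced by one cell
print(fss(fcst, obs, window=1))   # strict point-wise view penalizes the shift
print(fss(fcst, obs, window=3))   # fuzzy window credits the near-miss
```

The displaced feature scores markedly higher under the window-3 comparison, which is the behavior the abstract argues for: resolving a feature "well, yet imprecisely" should not be punished as a total miss.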

  15. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with development international standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. 
The functional validation can

  16. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can effectively improve the verification effort by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.
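The property-driven idea, deriving all verification components from a small set of design properties rather than writing stepwise test benches by hand, can be sketched as follows. The design under test here is a hypothetical 8-bit saturating adder, a stand-in chosen for brevity, not the vision chip processor of the paper.

```python
# Each property is a named predicate over the DUT's inputs/outputs; the
# checker sweeps an input grid and reports any property that fails.
SAT_MAX = 255

def sat_add(a, b):           # design under test (hypothetical)
    return min(a + b, SAT_MAX)

properties = [
    ("result never exceeds SAT_MAX", lambda a, b: sat_add(a, b) <= SAT_MAX),
    ("commutativity",                lambda a, b: sat_add(a, b) == sat_add(b, a)),
    ("exact when no overflow",       lambda a, b: a + b > SAT_MAX or sat_add(a, b) == a + b),
]

failures = [name for name, prop in properties
            for a in range(0, 256, 17) for b in range(0, 256, 17)
            if not prop(a, b)]
print("failures:", failures)
```

Because the checks are generated from the property list, adding a property extends the verification without touching any test bench, which is the kind of leverage a low-dimension property space is meant to provide.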

  17. A transformation of SDL specifications : a step towards the verification

    NARCIS (Netherlands)

    Ioustinova, N.; Sidorova, N.; Bjorner, D.; Broy, M.; Zamulin, A.

    2001-01-01

    Industrial-size specifications/models (whose state space is often infinite) cannot be model checked in a direct way; a verification model of the system is model checked instead. Program transformation is a way to build a finite-state verification model that can be submitted to a model checker.

  18. Hearing aids in children: the importance of the verification and validation processes.

    Science.gov (United States)

    Rissatto, Mara Renata; Novaes, Beatriz Cavalcanti de Albuquerque Caiuby

    2009-01-01

    During the fitting of hearing aids in children it is important, besides using a verification protocol, to have a validation process. The aim was to describe and discuss the use of a protocol for the fitting and verification of hearing aids in children, as well as the impact of the adjustment of the acoustic characteristics in speech perception tasks. Ten children aged three to eleven years were enrolled in this study. All children presented bilateral sensorineural hearing impairment, were users of hearing aids, and were followed at a public hearing health care service in Bahia. The children were submitted to the following procedures: pure tone air and bone conduction thresholds; real-ear-to-coupler difference (RECD); verification with real-ear measurement equipment: coupler gain/output and insertion gain; and speech perception tasks: 'The Six-Sound Test' (Ling, 2006) and the 'Word Associations for Syllable Perception' (WASP - Koch, 1999). The programmed electroacoustic characteristics of the hearing aids were compared to the electroacoustic characteristics prescribed by the DSL [i/o] v4.1 software. The speech perception tasks were reapplied on three occasions: straight after the modification of the electroacoustic characteristics, after 30 days, and after 60 days. For more than 50% of the tested children, the programmed electroacoustic characteristics of the hearing aids did not correspond to those suggested by the DSL [i/o] software. Adequate prescription was verified in 70% of the investigated sample; this was also confirmed by the results in the speech perception tasks (p=0.000). These data confirmed that the mean percentage of correct answers increased after the modification of the electroacoustic characteristics. The use of a protocol that verifies and validates the fitting of hearing aids in children is necessary.

  19. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise, among others: precision, accuracy, comparability, carryover, background and
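
One of the verification items listed above, carryover, is commonly quantified by measuring a high-concentration sample in triplicate followed immediately by a low-concentration sample in triplicate. The sketch below shows the usual three-high/three-low calculation; the sample values are invented, and the exact protocol and acceptance limits should be taken from the applicable guideline rather than from this illustration:

```python
def carryover_percent(high: list[float], low: list[float]) -> float:
    """Estimate carryover as (l1 - l3) / (h3 - l3) * 100, where the
    high-concentration sample is measured three times (h1..h3) and the
    low-concentration sample three times (l1..l3) right after.
    l1 may carry contamination from the high run; l3 is assumed clean."""
    h3 = high[2]
    l1, l3 = low[0], low[2]
    return (l1 - l3) / (h3 - l3) * 100.0

# Example: WBC counts (10^9/L) for a high sample and a following low sample.
high_runs = [98.5, 99.1, 100.0]
low_runs = [2.0, 1.2, 1.0]
print(round(carryover_percent(high_runs, low_runs), 3))  # ~1.01% carryover
```

A result above the analyzer's stated carryover limit would fail this verification item.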

  20. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

of Iraq was a case of late detection of undeclared activities, the case of DPRK was a case of prompt detection of discrepancies in the initial declaration through implementation of modern detection techniques, such as environmental sampling, and access to information. Access to the Security Council became important in view of the protracted process of non-compliance. The Model Additional Protocol (INFCIRC 540) agreed in 1997 incorporates the results of the efforts to strengthen the safeguards system and as such provides the possibility for more transparency by the States and more access to locations by the inspectors on the basis of information. It does not provide the broad and intrusive access rights as in the case of Iraq, since such rights are unprecedented and the result of a cease-fire arrangement involving the Security Council. But the expectations are that the broad implementation of the Additional Protocol will result in an effective and efficient safeguards verification system for the future. The on-site verification systems on a national, regional or multinational basis that have been put into operation in the past or are being discussed by States for the implementation of disarmament and non-proliferation conventions related to weapons of mass destruction, whether nuclear, chemical or biological, have benefited and will benefit in the future from the guiding experience - both from the strengths and weaknesses - of the IAEA verification system. This is hopefully a legacy for the future of verification

  1. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  2. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    Science.gov (United States)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.

  3. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    Science.gov (United States)

    Li, Haoxiang; Hua, Gang

    2018-04-01

Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube video face database, and the CMU Multi-PIE dataset.
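
The descriptor-selection step of the PEP representation can be sketched as follows: each local descriptor is augmented with its (x, y) location, and for every spherical Gaussian component the augmented descriptor with the highest component likelihood is selected and concatenated. The sketch below assumes an already-trained GMM (component means and spherical variances) and uses invented array shapes and random data; it illustrates the idea, not the authors' implementation:

```python
import numpy as np

def pep_representation(descriptors, locations, means, variances):
    """Build a PEP-style representation.

    descriptors : (N, D) local appearance descriptors (e.g., LBP/SIFT)
    locations   : (N, 2) patch centers used to augment each descriptor
    means       : (K, D+2) means of the trained GMM components
    variances   : (K,) spherical variance of each component

    For each of the K components, the augmented descriptor with maximum
    likelihood under that spherical Gaussian is selected; the selected
    descriptors are concatenated into one long vector.
    """
    aug = np.hstack([descriptors, locations])            # (N, D+2)
    parts = []
    for mu, var in zip(means, variances):
        # For a spherical Gaussian, log-likelihood differs from the negative
        # squared distance only by terms constant within the component, so
        # argmax of likelihood == argmin of squared distance to the mean.
        d2 = np.sum((aug - mu) ** 2, axis=1) / var
        parts.append(aug[np.argmin(d2)])
    return np.concatenate(parts)                         # (K * (D+2),)

rng = np.random.default_rng(0)
desc = rng.normal(size=(50, 8))       # 50 patches, 8-dim toy descriptors
locs = rng.uniform(0, 1, size=(50, 2))
means = rng.normal(size=(4, 10))      # 4 toy "parts"
vars_ = np.full(4, 0.5)
rep = pep_representation(desc, locs, means, vars_)
print(rep.shape)  # (40,)
```

Because each component always contributes exactly one descriptor, the representation has a fixed length regardless of how many patches the input image yields.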

  4. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification

  5. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    Science.gov (United States)

    Hoffarth, Canio; Harrington, Joseph; Rajan, Subramaniam D.; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2015-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material

  6. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter-driven processing of both batch and...

  7. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually manually constructed by analysts, a review by other analysts is widely used for verifying constructed fault tree models. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, the modeling of a system for system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that of the RGGG (Reliability Graph with General Gates) methodology. The difference between the two is that the RBDGG methodology focuses on block failures, while the RGGG methodology focuses on connection line failures. However, it is known that an RGGG model can be converted to an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated
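
As background for what such a verification compares, the quantitative content of a simple fault tree is the top-event probability, computed from basic-event probabilities through AND/OR gates under an independence assumption. The tiny evaluator and the pump/valve tree below are hypothetical illustrations, not the RBDGG algorithm itself:

```python
def and_gate(*probs):
    """AND gate: all inputs must fail, so multiply failure probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """OR gate: any input failing suffices; complement of all surviving."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: TOP = AND(OR(pump_a, pump_b), valve)
pump_a, pump_b, valve = 0.1, 0.2, 0.5
top = and_gate(or_gate(pump_a, pump_b), valve)
print(round(top, 6))  # 0.14
```

A verification method like the one proposed in the record checks that the manually built fault tree encodes the same failure logic as a model (here, an RBDGG) whose structure mirrors the physical system.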

  8. An analytical model of SAGD process considering the effect of threshold pressure gradient

    Science.gov (United States)

    Morozov, P.; Abdullin, A.; Khairullin, M.

    2018-05-01

    An analytical model is proposed for the development of super-viscous oil deposits by the method of steam-assisted gravity drainage, taking into account the nonlinear filtration law with the limiting gradient. The influence of non-Newtonian properties of oil on the productivity of a horizontal well and the cumulative steam-oil ratio are studied. Verification of the proposed model based on the results of physical modeling of the SAGD process was carried out.
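
The "nonlinear filtration law with a limiting (threshold) gradient" mentioned above is commonly written as a piecewise Darcy-type relation: no flow while the pressure gradient stays below the threshold G0, and flow proportional to the excess above it. The permeability, viscosity and threshold values in this sketch are invented for illustration; it is not the authors' analytical SAGD model:

```python
def filtration_velocity(grad_p, k, mu, g0):
    """Piecewise filtration law with limiting gradient g0 (magnitudes only):
    v = 0                          if |grad_p| <= g0
    v = (k / mu) * (|grad_p| - g0) otherwise."""
    excess = abs(grad_p) - g0
    return (k / mu) * excess if excess > 0 else 0.0

k = 1e-12    # permeability, m^2 (assumed)
mu = 5.0     # heavy-oil viscosity, Pa*s (assumed)
g0 = 1.0e4   # threshold pressure gradient, Pa/m (assumed)

print(filtration_velocity(5.0e3, k, mu, g0))  # below threshold -> 0.0
print(filtration_velocity(3.0e4, k, mu, g0))  # ~ 4e-9 m/s
```

The no-flow region below g0 is what shrinks the drainage zone around the well pair and degrades the steam-oil ratio relative to a Newtonian prediction.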

  9. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

In this paper the potential of smartphone sensor data for the verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique concerning the verification of people trajectories

  10. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods vary both in their operation and in the way they achieve results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are emphasized. The article considers a classification of testing techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in the code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis, such as testing, monitoring and profiling, are presented and analyzed, together with some kinds of tools that can be applied to the software when using dynamic analysis methods.
Based on this work, a conclusion is drawn that describes the most relevant problems of analysis techniques, methods of their solution and

  11. On the need for data for the verification of service life models for frost damage

    DEFF Research Database (Denmark)

    Geiker, Mette Rica; Engelund, Sven

    1999-01-01

    The purpose of this paper is to draw the attention to the need for the verification of service life models for frost attack on concrete and the collection of relevant data. To illustrate the type of data needed the paper presents models for internal freeze/thaw damage (internal cracking including...

  12. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

This article describes the procedure for verification of an IMRT (intensity-modulated radiation therapy) plan, which is used in the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, of treatment plan development with the IMRT planning system CORVUS 6.0 and the MIMiC device (multilamellar intensity-modulated collimator), and of the overall process of verifying the created plan. The aim of verification is, in particular, a thorough check of the functions of MIMiC and an evaluation of the overall reliability of IMRT planning. (author)

  13. Compromises produced by the dialectic between self-verification and self-enhancement.

    Science.gov (United States)

    Morling, B; Epstein, S

    1997-12-01

Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self-appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  14. Building a Simulated Environment for the Study of Multilateral Approaches to Nuclear Materials Verification

    International Nuclear Information System (INIS)

    Moul, R.; Persbo, A.; Keir, D.

    2015-01-01

Verification research can be resource-intensive, particularly when it relies on practical or field exercises. These exercises can also involve substantial logistical preparations and are difficult to run in an iterative manner to produce data sets that can be later utilized in verification research. This paper presents the conceptual framework, methodology and preliminary findings from part of a multi-year research project, led by VERTIC. The multi-component simulated environment that we have generated, using existing computer models for nuclear reactors and other components of fuel cycles, can be used to investigate options for future multilateral nuclear verification, at a variety of locations and time points in a nuclear complex. We have constructed detailed fuel cycle simulations for two fictional, and very different, states. In addition to these mass-flow models, a 3-dimensional, avatar-based simulation of a nuclear facility is under development. We have also developed accompanying scenarios that provide legal and procedural assumptions that will control the process of our fictional verification solutions. These tools have all been produced using open source information and software. While these tools are valuable for research purposes, they can also play an important role in support of training and education in the field of nuclear materials verification, in a variety of settings and circumstances. (author)

  15. Design verification testing for fuel element type CAREM

    International Nuclear Information System (INIS)

    Martin Ghiselli, A.; Bonifacio Pulido, K.; Villabrille, G.; Rozembaum, I.

    2013-01-01

The hydraulic and hydrodynamic characterization tests are part of the design verification process of a nuclear fuel element prototype and its components. These tests are performed in a low pressure and temperature facility. The tests require the definition of simulation parameters for setting the test conditions, the evaluation of results to feed back into mathematical models, the extrapolation of results to reactor conditions and, finally, the decision on the acceptability of the tested prototype. (author)

  16. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

One of the main obstacles that prevent model checking from being widely used in industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which can produce models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.

  17. Field verification of advanced transport models of radionuclides in heterogeneous soils

    International Nuclear Information System (INIS)

    Visser, W.; Meurs, G.A.M.; Weststrate, F.A.

    1991-01-01

This report deals with a verification study of advanced transport models of radionuclides in heterogeneous soils. The study reported here is the third phase of a research program carried out by Delft Geotechnics concerning the influence of soil heterogeneities on the migration of radionuclides in the soil and soil-water system. Phases 1 and 2 have been reported earlier in the EC Nuclear Science and Technology series (EUR 12111 EN, 1989). The verification study involves the predictive modelling of a field tracer experiment carried out by the British Geological Survey (BGS) at Drigg, Cumbria (UK). Conservative (I-131, Cl-, H-3) as well as non-conservative (Co-EDTA) tracers were used. The inverse modelling shows that micro dispersion may be considered a soil constant related to grain size. Micro dispersion shows a slow increase with distance from the source. This increase is caused by mass transfer between adjacent layers of different permeability. Macro dispersion is observed when sampling over a larger interval than permitted by the detail with which the heterogeneity is described in the model. The prediction of the migration of radionuclides through heterogeneous soils is possible. The advection-dispersion equation seems to be an adequate description of the migration of conservative tracers. The models based on this equation give comparable results on a small field test scale (3.5 m). The prediction of the migration of adsorbing species is more difficult. The mathematical descriptions seem appropriate, but the heterogeneity of soils seems to create a higher order of uncertainty which cannot be described as yet with the calculation strategies available at this moment

  18. Verification Testing: Meet User Needs Figure of Merit

    Science.gov (United States)

    Kelly, Bryan W.; Welch, Bryan W.

    2017-01-01

Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as how accurate and reliable the test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the data necessary to make an informed decision were not there, and the result was the complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed its recommendations, requirements and rationale for the different components of VV&C. This was done with the intention that it would allow people receiving M&S software to clearly understand, and have data from, the past development effort. This in turn would allow people who had not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform verification on several MATLAB (Registered Trademark) (The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to take note of and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts.
This is intended to prevent the code from attempting to make incorrect or impossible

  19. Parallel verification of dynamic systems with rich configurations

    OpenAIRE

    Pessoa, Eduardo José Dias

    2016-01-01

Master's dissertation in Informatics Engineering (specialization in Informatics). Model checking is a technique used to automatically verify a model which represents the specification of some system. To ensure the correctness of the system, the verification of both static and dynamic properties is often needed. The specification of a system is made through modeling languages, while the respective verification is made by its model checker. Most modeling frameworks are not...

  20. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. The seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. Analysis of the verification of forecasts from INTW and the NWP models at 15 sites showed that the integrated weighted model produced more accurate forecasts for the seven selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
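
The multi-categorical Heidke skill score used in the verification above compares the proportion of correct forecasts with the proportion expected by chance from the marginal totals of the contingency table. A small self-contained calculation on an invented 3x3 table (rows = forecast category, columns = observed category), not data from the SNOW-V10 study:

```python
def heidke_skill_score(table):
    """Multi-category HSS from an n x n contingency table of counts:
    HSS = (pc - pe) / (1 - pe), where pc is the fraction of hits
    (diagonal) and pe the chance agreement from row/column marginals."""
    n = sum(sum(row) for row in table)
    k = len(table)
    pc = sum(table[i][i] for i in range(k)) / n
    row = [sum(table[i]) for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]
    pe = sum(row[i] * col[i] for i in range(k)) / (n * n)
    return (pc - pe) / (1.0 - pe)

# Invented verification counts for a 3-category variable
# (e.g., visibility classed as good / marginal / poor):
table = [[30, 5, 2],
         [4, 20, 6],
         [1, 5, 27]]
print(round(heidke_skill_score(table), 3))  # 0.654
```

HSS is 1 for perfect forecasts, 0 for no skill beyond chance, and negative when the forecasts do worse than chance.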

  1. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

Once FPGA (Field-Programmable Gate Array) designers write Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on HDL designs and their functional correctness. Nuclear regulation authorities, however, require a more considered demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality' and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the works of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can be assured that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV', which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving behavioral equivalence. We performed a case study with an example of a preliminary version of the RPS in a Korean nuclear power plant in order to show the efficiency of the proposed formal verification technique and the implemented translator. It

  2. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

Once FPGA (Field-Programmable Gate Array) designers write Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on HDL designs and their functional correctness. Nuclear regulation authorities, however, require a more considered demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality' and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the works of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can be assured that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV', which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving behavioral equivalence. We performed a case study with an example of a preliminary version of the RPS in a Korean nuclear power plant in order to show the efficiency of the proposed formal verification technique and the implemented translator. It
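
The core idea of the equivalence check described in the two records above is behavioral comparison: two combinational designs are equivalent iff they agree on every input assignment. Tools such as VIS do this symbolically (e.g., with BDDs); for small circuits the idea can be sketched with brute-force truth-table enumeration. The two full-adder variants below are invented examples, not the RPS designs from the case study:

```python
from itertools import product

def equivalent(f, g, n_inputs):
    """Exhaustively compare two combinational functions over n_inputs bits.
    Feasible only for small n; real tools use symbolic methods (BDDs/SAT)."""
    return all(f(*bits) == g(*bits) for bits in product((0, 1), repeat=n_inputs))

# "Golden" full adder and a gate-level re-synthesis of the same function.
def adder_ref(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def adder_syn(a, b, cin):
    p = a ^ b                      # propagate signal
    return p ^ cin, (a & b) | (p & cin)

def adder_buggy(a, b, cin):
    return a ^ b ^ cin, a & b      # drops the propagate term of carry-out

print(equivalent(adder_ref, adder_syn, 3))    # True
print(equivalent(adder_ref, adder_buggy, 3))  # False
```

A failing comparison pinpoints a synthesis (or translation) step that changed behavior, which is exactly what the Verilog-versus-netlist check is meant to expose.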

  3. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
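
One verification measure that transfers directly from forecast verification to simulation ensembles is the continuous ranked probability score (CRPS), which for a finite ensemble has a closed empirical form. The sketch below uses invented streamflow numbers, not the SAC-SMA ensembles of the paper:

```python
def crps_ensemble(members, obs):
    """Empirical CRPS for one observation and an m-member ensemble:
    CRPS = mean|x_i - y| - 0.5 * mean|x_i - x_j|
    (lower is better; reduces to absolute error when m = 1)."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(x - y) for x in members for y in members) / (m * m)
    return term1 - 0.5 * term2

# Invented 5-member streamflow ensemble (m3/s) and the observed value.
ensemble = [12.0, 15.0, 14.0, 18.0, 13.0]
print(round(crps_ensemble(ensemble, 14.5), 4))  # 0.58
```

Averaged over many time steps, the score rewards ensembles that are both accurate and appropriately spread, which is exactly the parameter-uncertainty behavior the GLUE/SCEM comparison probes.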

  4. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: Reason(s) why a posteriori verification is to be performed; Scope and objectives for the level of verification selected; Development products to be used for the review; Availability and use of user experience; and Actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. 
The development products that were used

  5. Using Reversed MFCC and IT-EM for Automatic Speaker Verification

    Directory of Open Access Journals (Sweden)

    Sheeraz Memon

    2012-01-01

    Full Text Available This paper proposes a text independent automatic speaker verification system using IMFCC (Inverse/Reverse Mel Frequency Coefficients) and IT-EM (Information Theoretic Expectation Maximization). To perform speaker verification, feature extraction using the Mel scale has been widely applied and has established better results. The IMFCC is based on an inverse Mel scale: it effectively captures information available in the high-frequency formants that is ignored by the MFCC. In this paper the fusion of MFCC and IMFCC at the input level is proposed. GMMs (Gaussian Mixture Models) trained with EM (Expectation Maximization) have been widely used for classification in text independent verification; however, EM suffers from slow convergence. In this paper we use our proposed IT-EM, which converges faster, to train speaker models. IT-EM uses information theory principles such as PDE (Parzen Density Estimation) and the KL (Kullback-Leibler) divergence measure. Like EM, IT-EM adapts the weights, means and covariances; however, the IT-EM process operates not on the feature vector sets but on a set of centroids obtained using an IT (Information Theoretic) metric. The IT-EM process simultaneously minimizes the divergence measure between the PDE estimate of the feature distribution within a given class and the centroid distribution within the same class. The feature level fusion and IT-EM are tested for the task of speaker verification using NIST2001 and NIST2004. The experimental evaluation shows that the fused MFCC/IMFCC features give better results than the conventional delta/MFCC feature set. The MFCC/IMFCC feature vector is also much smaller than the delta MFCC, reducing the computational burden as well. The IT-EM method also showed faster convergence than the conventional EM method, and thus leads to higher speaker recognition scores.
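
    The baseline that IT-EM improves on is plain EM training of a Gaussian mixture speaker model. A minimal one-dimensional, two-component EM loop (a toy illustration of the conventional algorithm, not the authors' IT-EM) can be sketched as:

    ```python
    import math

    def em_gmm_1d(data, iters=50):
        """Plain EM for a two-component 1-D Gaussian mixture (toy baseline).

        Deterministic initialization: means at the data extremes, unit variances,
        uniform weights. Returns (weights, means, variances).
        """
        means = [min(data), max(data)]
        variances = [1.0, 1.0]
        weights = [0.5, 0.5]
        for _ in range(iters):
            # E-step: posterior responsibility of each component for each point
            resp = []
            for x in data:
                dens = [w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
                        for w, m, v in zip(weights, means, variances)]
                total = sum(dens)
                resp.append([d / total for d in dens])
            # M-step: re-estimate weights, means, variances from responsibilities
            for j in range(2):
                nj = sum(r[j] for r in resp)
                weights[j] = nj / len(data)
                means[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
                variances[j] = max(sum(r[j] * (x - means[j]) ** 2
                                       for r, x in zip(resp, data)) / nj, 1e-6)
        return weights, means, variances
    ```

    IT-EM replaces the per-vector E-step above with updates over a reduced centroid set scored by Parzen density estimates and KL divergence, which is where its speed advantage comes from.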

  6. Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?

    Science.gov (United States)

    Rosen, Lisa H; Principe, Connor P; Langlois, Judith H

    2013-02-13

    The authors examined whether early adolescents (N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence.

  7. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  8. Game Theory Models for the Verification of the Collective Behaviour of Autonomous Cars

    OpenAIRE

    Varga, László Z.

    2017-01-01

    The collective of autonomous cars is expected to generate almost optimal traffic. In this position paper we discuss the multi-agent models and the verification results of the collective behaviour of autonomous cars. We argue that non-cooperative autonomous adaptation cannot guarantee optimal behaviour. The conjecture is that intention aware adaptation with a constraint on simultaneous decision making has the potential to avoid unwanted behaviour. The online routing game model is expected to b...

  9. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  10. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. 
The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL
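
    The cascade from SL down to AC in such a control logic diagram can be caricatured as a decision function. The predicate names below are hypothetical and the real TWRS checklists are far more detailed; this is only a sketch of the decision structure:

    ```python
    def control_level(keeps_process_variable_below_safety_boundary=False,
                      is_setting_on_safety_system=False,
                      is_equipment_operability_condition=False,
                      is_passive_design_feature=False):
        """Hypothetical TSR control-level cascade (illustrative only).

        Mirrors the ordering SL > LCS > LCO > Design Feature, with
        Administrative Control as the default when no stronger criterion applies.
        """
        if keeps_process_variable_below_safety_boundary:
            return "SL"
        if is_setting_on_safety_system:
            return "LCS"
        if is_equipment_operability_condition:
            return "LCO"
        if is_passive_design_feature:
            return "Design Feature"
        return "AC"
    ```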

  11. Self-verification and social anxiety: preference for negative social feedback and low social self-esteem.

    Science.gov (United States)

    Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A

    2011-10-01

    A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.

  12. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) to evaluate waste management scenarios with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA shows waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (i.e., transportation, small sites, new streams, etc.). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures. The RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline on system compliance to requirements and also an opportunity to reevaluate what requirements need to be satisfied in FY-98

  13. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V comprises two separate tasks: verification, a mathematical exercise that assesses whether the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn comprises code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
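
    The solution-verification half of this methodology rests on Richardson extrapolation. A minimal sketch (generic, not taken from GBS) of estimating the observed order of convergence from three successively coarsened solutions, here using a central-difference derivative as the "code":

    ```python
    import math

    def central_diff(f, x, h):
        """Second-order central finite difference, standing in for a simulation."""
        return (f(x + h) - f(x - h)) / (2 * h)

    def observed_order(f_h, f_2h, f_4h, r=2.0):
        """Observed convergence order from solutions on grids h, r*h, r*r*h."""
        return math.log(abs(f_4h - f_2h) / abs(f_2h - f_h)) / math.log(r)

    def richardson(f_h, f_2h, p, r=2.0):
        """Richardson-extrapolated estimate of the exact solution."""
        return f_h + (f_h - f_2h) / (r ** p - 1)
    ```

    Comparing the observed order against the scheme's formal order is the acceptance test; the extrapolated value also yields a numerical-error estimate for the finest grid.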

  14. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  15. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
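
    The statistical transfer from elementary test data to a full-scale structure mentioned above is conventionally done with two-parameter Weibull statistics. A generic weakest-link scaling sketch (not the guideline's full method, which also covers sub-critical crack growth):

    ```python
    import math

    def weibull_failure_prob(stress, volume, sigma0, m, v0=1.0):
        """Two-parameter Weibull failure probability with volume scaling.

        sigma0 is the characteristic strength of the reference volume v0,
        m is the Weibull modulus; larger stressed volumes fail more readily.
        """
        return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

    def scaled_characteristic_strength(sigma0, m, volume, v0=1.0):
        """Stress at which P_f = 1 - 1/e for the larger stressed volume."""
        return sigma0 * (v0 / volume) ** (1.0 / m)
    ```

    The size effect is the practical point: a full-scale ceramic structure has a lower allowable stress than the coupons it was characterized with, by the factor (V0/V)^(1/m).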

  16. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

    The Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates the V and V process followed, the document submission requirements in each stage, the V and V activities, the checklists used for reviews in each stage, and the reports

  17. Experimental verification of dynamic radioecological models established after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Voigt, G.; Mueller, H.; Proehl, G.; Stocke, H.; Paretzke, H.G.

    1991-01-01

    The experiments reported were carried out for a verification of existing, dynamic radioecological models, especially of the ECOSYS model. The database used for the verification covers the radioactivity concentrations of Cs-134, Cs-137, I-131 measured after the Chernobyl reactor accident in foodstuffs and environmental samples, the results of field experiments on radionuclide translocation after foliar uptake or absorption by the roots of edible plants. The measured data were compared with the model predictions for the radionuclides under review. The Cs-134 and Cs-137 translocation factors which describe the redistribution of these radionuclides in the plant after foliar uptake were experimentally determined by a single sprinkling with Chernobyl rainwater, and were measured to be the following as a function of sprinkling time: winter wheat, 0.002-0.13; spring wheat, 0.003-0.09; winter rye, 0.002-0.27; barley, 0.002-0.04; potatoes, 0.05-0.35; carrots, 0.02-0.07; bush beans, 0.04-0.3; cabbage, 0.1-0.5. The weathering half-life of the radionuclides in lettuce was determined to be ten days. Transfer factors determined for root absorption of Cs-137 were measured to be an average of 0.002 for grains, 0.002 for potatoes, 0.004 for white cabbage, 0.003 for bush beans and carrots, and 0.007 for lettuce. There was an agreement between the ECOSYS model predictions and the measured radioactivity concentrations of the corresponding radionuclides. (orig./HP) [de
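
    Two of the measured quantities enter forward models such as ECOSYS through simple standard relations: exponential weathering loss of activity deposited on foliage, and a linear soil-to-plant transfer factor for root uptake. A sketch under those textbook assumptions (not the ECOSYS implementation itself):

    ```python
    import math

    def weathering_decay(a0, t_days, half_life_days=10.0):
        """Activity remaining on foliage after weathering.

        Default half-life of 10 days matches the value measured for lettuce
        in the abstract above; radioactive decay is treated separately.
        """
        return a0 * math.exp(-math.log(2.0) * t_days / half_life_days)

    def root_uptake(c_soil, transfer_factor):
        """Plant concentration from soil activity via a transfer factor,
        e.g. the measured Cs-137 value of about 0.002 for grains."""
        return c_soil * transfer_factor
    ```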

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT LASER TOUCH AND TECHNOLOGIES, LLC LASER TOUCH MODEL LT-B512

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of Laser Touch model LT-B512 targeting device manufactured by Laser Touch and Technologies, LLC, for manual spray painting operations. The relative transfer efficiency (TE) improved an avera...

  19. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
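
    The thresholded correlation decision can be illustrated digitally in one dimension: normalize the cross-spectrum to unit magnitude, inverse-transform, and threshold the peak. This is a generic phase-only correlation sketch (plain DFT, not the paper's dual-correlation optical implementation):

    ```python
    import cmath
    import random

    def dft(x, sign=-1):
        """Naive O(n^2) discrete Fourier transform; sign=+1 gives the inverse kernel."""
        n = len(x)
        return [sum(x[t] * cmath.exp(sign * 2j * cmath.pi * k * t / n)
                    for t in range(n)) for k in range(n)]

    def phase_only_correlation(a, b):
        """1-D phase-only correlation: unit-magnitude cross-spectrum, then inverse DFT."""
        fa, fb = dft(a), dft(b)
        cross = [x * y.conjugate() for x, y in zip(fa, fb)]
        cross = [c / (abs(c) + 1e-12) for c in cross]  # keep phase, discard magnitude
        n = len(cross)
        return [(v / n).real for v in dft(cross, sign=+1)]

    def verify(key, lock, threshold=0.5):
        """Accept only if the correlation peak clears the preset threshold."""
        return max(phase_only_correlation(key, lock)) > threshold
    ```

    A matching key produces a sharp near-unity peak at zero shift, while an unrelated signal scatters the unit phasors and stays well below the threshold.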

  20. Verification of SAP reference models

    NARCIS (Netherlands)

    Dongen, van B.F.; Jansen-Vullers, M.H.; Aalst, van der W.M.P.; Benatallah, B.; Casati, F.

    2005-01-01

    To configure a process-aware information system (e.g., a workflow system, an ERP system), a business model needs to be transformed into an executable process model. Due to similarities in these transformations for different companies, databases with reference models, such as ARIS for MySAP, have

  1. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
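
    An autoverification rule engine of the kind surveyed combines verification limits with a delta check against the patient's previous result. A minimal sketch with illustrative limits (the thresholds are placeholders, not clinical guidance):

    ```python
    def autoverify(result, ref_low, ref_high, previous=None, delta_limit=0.25):
        """Decide whether a result may be released without manual review.

        Returns (released, reasons): released is True only when no rule fires.
        delta_limit is the maximum allowed relative change from the previous
        result (Delta Check); both limits here are illustrative.
        """
        reasons = []
        if not (ref_low <= result <= ref_high):
            reasons.append("outside verification limits")
        if previous is not None and previous != 0:
            if abs(result - previous) / abs(previous) > delta_limit:
                reasons.append("delta check failed")
        return (not reasons, reasons)
    ```

    Real systems layer further rules on top (instrument flags, sample-quality indices, cross-parameter concordance), which is exactly the standardization gap the survey documents.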

  2. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units consider sulfate attack on concrete and utilize degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.

  3. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  4. Modeling the Creep of Rib-Reinforced Composite Media Made from Nonlinear Hereditary Phase Materials 2. Verification of the Model

    Science.gov (United States)

    Yankovskii, A. P.

    2015-05-01

    An indirect verification of a structural model describing the creep of a composite medium reinforced by honeycombs and made of nonlinear hereditary phase materials obeying the Rabotnov theory of creep is presented. It is shown that the structural model proposed is trustworthy and can be used in practical calculations. For different kinds of loading, creep curves for a honeycomb core made of a D16T aluminum alloy are calculated.

  5. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined, namely the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  6. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
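
    The dose/distance-to-agreement criterion recommended above is commonly evaluated as a gamma index. A one-dimensional brute-force sketch with a 3%/3 mm global criterion (illustration only, not the paper's analysis software):

    ```python
    def gamma_index_1d(measured, reference, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
        """Per-point 1-D gamma values for a dose/distance criterion (global normalization).

        For each measured point, search all reference points for the minimum
        combined dose-difference / distance metric; gamma <= 1 is a pass.
        """
        d_max = max(reference)
        gammas = []
        for i, dm in enumerate(measured):
            best = float("inf")
            for j, dr in enumerate(reference):
                dose_term = ((dm - dr) / (dose_tol * d_max)) ** 2
                dist_term = ((i - j) * spacing_mm / dist_tol_mm) ** 2
                best = min(best, (dose_term + dist_term) ** 0.5)
            gammas.append(best)
        return gammas
    ```

    Reporting the fraction of points with gamma <= 1, alongside a map of where failures occur, gives exactly the quantitative-plus-spatial display the paper argues for.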

  7. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.
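
    The translation idea can be caricatured in a few lines: map a boolean ST assignment onto a NuSMV-style next-state rule. This toy handles only AND/OR/NOT with a string substitution; the actual Xtext-based tool parses full ST into an automata intermediate model first, and the variable names here are made up:

    ```python
    def st_to_nusmv(var, expr):
        """Translate a boolean ST assignment 'var := expr;' into a NuSMV next() rule.

        Toy subset: only AND/OR/NOT operators and parentheses are recognized.
        """
        table = {"AND": "&", "OR": "|", "NOT": "!"}
        tokens = expr.replace("(", " ( ").replace(")", " ) ").split()
        translated = " ".join(table.get(t, t) for t in tokens)
        return f"next({var}) := {translated};"
    ```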

  8. Software verification, model validation, and hydrogeologic modelling aspects in nuclear waste disposal system simulations. A paradigm shift

    International Nuclear Information System (INIS)

    Sheng, G.M.

    1994-01-01

    This work reviewed the current concept of nuclear waste disposal in stable, terrestrial geologic media with a system of natural and man-made multi-barriers. Various aspects of this concept and supporting research were examined with the emphasis on the Canadian Nuclear Fuel Waste Management Program. Several of the crucial issues and challenges facing the current concept were discussed. These include: The difficulties inherent in a concept that centres around lithologic studies; the unsatisfactory state of software quality assurance in the present computer simulation programs; and the lack of a standardized, comprehensive, and systematic procedure to carry out a rigorous process of model validation and assessment of simulation studies. An outline of such an approach was presented and some of the principles, tools and techniques for software verification were introduced and described. A case study involving an evaluation of the Canadian performance assessment computer program is presented. A new paradigm for nuclear waste disposal was advocated to address the challenges facing the existing concept. The RRC (Regional Recharge Concept) was introduced and its many advantages were described and shown through a modelling exercise. (orig./HP)

  9. Treatment of dynamical processes in two-dimensional models of the troposphere and stratosphere

    International Nuclear Information System (INIS)

    Wuebbles, D.J.

    1980-07-01

    The physical structure of the troposphere and stratosphere is the result of an intricate interplay among a large number of radiative, chemical, and dynamical processes. Because it is not possible to model the global environment in the laboratory, theoretical models must be relied on, subject to observational verification, to simulate atmospheric processes. Of particular concern in recent years has been the modeling of those processes affecting the structure of ozone and other trace species in the stratosphere and troposphere. Zonally averaged two-dimensional models with spatial resolution in the vertical and meridional directions can provide a much more realistic representation of tracer transport than one-dimensional models, yet are capable of the detailed representation of chemical and radiative processes contained in the one-dimensional models. The purpose of this study is to describe and analyze existing approaches to representing global atmospheric transport processes in two-dimensional models and to discuss possible alternatives to these approaches. A general description of the processes controlling the transport of trace constituents in the troposphere and stratosphere is given
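In its simplest form, the zonally averaged transport described above reduces to a two-dimensional advection-diffusion equation for a tracer on a (latitude, height) grid. The following is a deliberately crude finite-difference sketch, with periodic boundaries and made-up parameters, intended only to show the shape of such a model step, not real atmospheric physics.

```python
import numpy as np

def transport_step(c, v_y, k_z, dy, dz, dt):
    """One explicit step of a toy zonally averaged tracer equation,
    dc/dt = -v_y * dc/dy + k_z * d2c/dz2,
    on a (latitude, height) grid with periodic boundaries."""
    dcdy = (np.roll(c, -1, axis=0) - np.roll(c, 1, axis=0)) / (2.0 * dy)
    d2cdz2 = (np.roll(c, -1, axis=1) - 2.0 * c + np.roll(c, 1, axis=1)) / dz**2
    return c + dt * (-v_y * dcdy + k_z * d2cdz2)

c0 = np.zeros((16, 8))
c0[8, 4] = 1.0    # a point release of tracer
c1 = transport_step(c0, v_y=1.0, k_z=0.5, dy=1.0, dz=1.0, dt=0.1)
```

With periodic boundaries both terms conserve the total tracer mass, which is one of the basic checks applied when verifying such transport schemes against observations.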

  10. Specification, Verification and Optimisation of Business Processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas

    is extended with stochastic branching, message passing and reward annotations which allow for the modelling of resources consumed during the execution of a business process. Further, it is shown how this structure can be used to formalise the established business process modelling language Business Process...... fault tree analysis and the automated optimisation of business processes by means of an evolutionary algorithm. This work is motivated by problems that stem from the healthcare sector, and examples encountered in this field are used to illustrate these developments....

  11. Results of verification and investigation of wind velocity field forecast. Verification of wind velocity field forecast model

    International Nuclear Information System (INIS)

    Ogawa, Takeshi; Kayano, Mitsunaga; Kikuchi, Hideo; Abe, Takeo; Saga, Kyoji

    1995-01-01

    In Environmental Radioactivity Research Institute, the verification and investigation of the wind velocity field forecast model 'EXPRESS-1' have been carried out since 1991. In fiscal year 1994, as part of the general analysis, the validity of the weather observation data, the local features of the wind field, and the suitability of the monitoring station positions were investigated. The EXPRESS model, which had until then used a 500 m mesh, was refined to a 250 m mesh, the resulting improvement in forecast accuracy was examined, and a comparison with another wind velocity field forecast model, 'SPEEDI', was carried out. The results showed that the correlation with other measurement points is high at some locations and low at others, and that the forecast accuracy of the wind velocity field can be improved by excluding the data of points with low correlation or by installing simplified observation stations and incorporating their data. The outline of the investigation, the general analysis of weather observation data and the improvements of the wind velocity field forecast model and forecast accuracy are reported. (K.I.)

  12. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  13. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    The signature of a person is one of the most popular and legally accepted behavioral biometrics that provides a secure means for verification and personal identification in many applications such as financial, commercial and legal transactions. The objective of the signature verification system is to classify between genuine and forged signatures that are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features; it contains diacritics, ligatures, and overlapping. Because of the lack of any form of dynamic information during the Arabic signature's writing process, it is more difficult to obtain higher verification accuracy. This paper addresses the above difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve the verification accuracy. State-of-the-art techniques adopt the fuzzy set to describe the properties of the extracted features to handle a signature's uncertainty; this work also employs the fuzzy variables to describe the degree of similarity of the signature's features to deal with the ambiguity of the questioned document examiner's judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).
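The FAR/FRR trade-off mentioned at the end is easy to make concrete: given similarity scores for genuine and forged signatures, both rates follow from a single accept/reject threshold. The scores below are invented purely for illustration; they are not from the paper's experiments.

```python
def far_frr(genuine_scores, forged_scores, threshold):
    """False Acceptance Rate and False Rejection Rate for a
    similarity-score classifier that accepts when score >= threshold."""
    far = sum(s >= threshold for s in forged_scores) / len(forged_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# toy fused similarity scores (hypothetical values)
genuine = [0.9, 0.8, 0.85, 0.6]
forged = [0.3, 0.55, 0.2, 0.4]
far, frr = far_frr(genuine, forged, threshold=0.5)
```

Sweeping the threshold trades one rate against the other; a verification system is typically reported at its equal-error point or at an application-specific operating threshold.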

  14. Modeling of geochemical processes in the submarine discharge zone of hydrothermal solutions

    Directory of Open Access Journals (Sweden)

    С. М. Судариков

    2017-06-01

    The paper reviews the main methods and analyzes modeling results for geochemical processes in the submarine discharge zone of hydrothermal solutions of mid-ocean ridges. Initial data for modeling have been obtained during several marine expeditions, including the Russian-French expedition SERPENTINE on the research vessel «Pourquoi Pas?» (2007). Results of field observations, laboratory experiments and theoretical developments are supported by the analysis of a regression model of mixing between hydrothermal solutions and sea water. Verification of the model has been carried out and the quality of chemical analysis has been assessed; the degree and character of participation of solution components in the hydrothermal process have been defined; the content of end members has been calculated based on reverse forecasting of element concentration, depending on regression character; data for thermodynamic modeling have been prepared. The regression model of acid-base properties and chloridity of mineralizing thermal springs confirms the adequacy of the model of double-diffusive convection for forming the composition of hydrothermal solutions. Differentiation of solutions according to concentrations of chloride-ion, depending on temperature and pH within this model, is associated with phase conversions and mixing of fluids from two convection cells, one of which is a zone of brine circulation. In order to carry out computer thermodynamic modeling, hydrogeochemical and physicochemical models of the hydrothermal discharge zone have been created. Verification of the model has been carried out based on changes of Mn concentration in the hydrothermal plume. Prevailing forms of Mn migration in the plume are Mn2+, MnCl+, MnCl2. Two zones have been identified in the geochemical structure of the plume: (1) a high-temperature zone (350-100 °C) with prevalence of chloride complexes, the ascending plume; and (2) a low-temperature zone (100-2 °C), where the predominant form of
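The "reverse forecasting" of end-member content from a regression model of mixing is, in the standard treatment of vent fluids, a linear extrapolation of each species against Mg to zero Mg (taken as the pure hydrothermal component). A hedged numerical sketch, with invented concentrations and not the paper's data:

```python
import numpy as np

def end_member(mg, conc):
    """Estimate a hydrothermal end-member concentration by linear
    regression of a species against Mg and extrapolation to Mg = 0,
    following the usual mixing-line treatment of vent-fluid samples."""
    slope, intercept = np.polyfit(mg, conc, 1)
    return intercept  # value of the species at Mg = 0

# toy mixing line from seawater (Mg = 52.7, Cl = 540 mmol/kg)
# toward a hypothetical fluid with Cl = 300 mmol/kg
mg = np.array([52.7, 40.0, 30.0, 20.0])
cl = 300.0 + (540.0 - 300.0) / 52.7 * mg
em = end_member(mg, cl)
```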

  15. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods on quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
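As one concrete example of the verification items listed above, carryover is commonly estimated from three high-concentration samples measured immediately before three low-concentration samples. The exact formula and acceptance limit vary between protocols and are set in the laboratory's verification plan; the counts below are invented.

```python
def carryover_percent(high_runs, low_runs):
    """Carryover estimate after three high samples followed by three low
    samples, using the commonly quoted formula
        carryover % = (L1 - L3) / (H3 - L3) * 100,
    where L1/L3 are the first/third low results and H3 the third high."""
    h3 = high_runs[2]
    l1, l3 = low_runs[0], low_runs[2]
    return (l1 - l3) / (h3 - l3) * 100.0

# hypothetical WBC counts (10^9/L): high pool then low pool
carry = carryover_percent([450.0, 452.0, 451.0], [2.2, 2.0, 2.0])
```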

  16. A pilot scale demonstration of the DWPF process control and product verification strategy

    International Nuclear Information System (INIS)

    Hutson, N.D.; Jantzen, C.M.; Beam, D.C.

    1992-01-01

    The Defense Waste Processing Facility (DWPF) has been designed and constructed to immobilize Savannah River Site high-level liquid waste within a durable borosilicate glass matrix for permanent storage. The DWPF will be operated to produce a glass product which must meet a number of product property constraints which are dependent upon the final product composition. During actual operations, the DWPF will control the properties of the glass product by the controlled blending of the waste streams with a glass-forming frit to produce the final melter feed slurry. The DWPF will verify control of the glass product through analysis of vitrified samples of slurry material. In order to demonstrate the DWPF process control and product verification strategy, a pilot-scale vitrification research facility was operated in three discrete batches using simulated DWPF waste streams. All of the DWPF process control methodologies were followed and the glass product from each experiment was leached according to the Product Consistency Test. Results of the campaign are summarized.

  17. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of verification run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to ease the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output files and the old output files. Any difference between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
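The compile-run-compare check described above can be sketched generically: run the sample problems, then diff each freshly generated output file against its stored reference. This is an illustration of the pattern, not the actual MCO V and V scripts; all file names are invented.

```python
import filecmp
import tempfile
from pathlib import Path

def verify_outputs(new_dir, ref_dir, pattern="*.out"):
    """Compare new sample-problem outputs against reference outputs.
    Returns the names of files that are missing or differ; an empty list
    means 'verified', while, as noted above, a difference still needs
    manual inspection before being treated as an installation problem."""
    diffs = []
    for ref in sorted(Path(ref_dir).glob(pattern)):
        new = Path(new_dir) / ref.name
        if not new.exists() or not filecmp.cmp(new, ref, shallow=False):
            diffs.append(ref.name)
    return diffs

# self-check with two throwaway directories standing in for old/new runs
with tempfile.TemporaryDirectory() as ref, tempfile.TemporaryDirectory() as new:
    (Path(ref) / "case1.out").write_text("k_eff = 1.0000")
    (Path(new) / "case1.out").write_text("k_eff = 1.0000")
    (Path(ref) / "case2.out").write_text("k_eff = 0.9990")
    (Path(new) / "case2.out").write_text("k_eff = 0.9991")  # differs
    failures = verify_outputs(new, ref)
```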

  18. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  19. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  20. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification or to evaluate, using independent software, the performance of specific functions in the code. Here, we will describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We will also discuss some of the lessons we learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  1. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision I. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files

  2. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
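The expected TMR topology centres on a majority voter. A minimal sketch of the voter, together with the kind of exhaustive equivalence check a formal tool performs (trivially small at this scale), might look like the following; it illustrates the concept only and is unrelated to the paper's actual tool.

```python
def tmr_vote(a, b, c):
    """Majority voter of a TMR topology: the output follows at least two
    of the three redundant copies, so any single corrupted copy is masked."""
    return (a & b) | (a & c) | (b & c)

# exhaustive check that the gate-level voter equals the majority function;
# for real designs this equivalence is established by formal analysis tools
ok = all(tmr_vote(a, b, c) == int(a + b + c >= 2)
         for a in (0, 1) for b in (0, 1) for c in (0, 1))
```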

  3. Caliver: An R package for CALIbration and VERification of forest fire gridded model outputs.

    Science.gov (United States)

    Vitolo, Claudia; Di Giuseppe, Francesca; D'Andrea, Mirko

    2018-01-01

    The name caliver stands for CALIbration and VERification of forest fire gridded model outputs. This is a package developed for the R programming language and available under an APACHE-2 license from a public repository. In this paper we describe the functionalities of the package and give examples using publicly available datasets. Fire danger model outputs are taken from the modeling components of the European Forest Fire Information System (EFFIS) and observed burned areas from the Global Fire Emission Database (GFED). Complete documentation, including a vignette, is also available within the package.

  4. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  5. Developing a NASA strategy for the verification of large space telescope observatories

    Science.gov (United States)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  6. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe

  7. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  8. Model checking as an aid to procedure design

    International Nuclear Information System (INIS)

    Zhang, Wenhu

    2001-01-01

    The OECD Halden Reactor Project has been actively working on computer assisted operating procedures for many years. The objective of the research has been to provide computerised assistance for procedure design, verification and validation, implementation and maintenance. For the verification purpose, the application of formal methods has been considered in several reports. The recent formal verification activity conducted at the Halden Project is based on applying model checking to the verification of procedures. This report presents verification approaches based on different model checking techniques and tools for the formalization and verification of operating procedures. Possible problems and relative merits of the different approaches are discussed. A case study of one of the approaches is presented to show the practical application of formal verification. Application of formal verification in the traditional procedure design process can reduce the human resources involved in reviews and simulations, and hence reduce the cost of verification and validation. A discussion of the integration of the formal verification with the traditional procedure design process is given at the end of this report. (Author)

  9. Identification and verification of critical performance dimensions. Phase 1 of the systematic process redesign of drug distribution.

    Science.gov (United States)

    Colen, Hadewig B; Neef, Cees; Schuring, Roel W

    2003-06-01

    Worldwide, patient safety has become a major social policy problem for healthcare organisations. As in other organisations, the patients in our hospital also suffer from an inadequate distribution process, as becomes clear from incident reports involving medication errors. Medisch Spectrum Twente is a top primary-care, clinical, teaching hospital. The hospital pharmacy takes care of 1070 internal beds and 1120 beds in an affiliated psychiatric hospital and nursing homes. In the beginning of 1999, our pharmacy group started a large interdisciplinary research project to develop a safe, effective and efficient drug distribution system by using systematic process redesign. The process redesign includes both organisational and technological components. This article describes the identification and verification of critical performance dimensions for the design of drug distribution processes in hospitals (phase 1 of the systematic process redesign of drug distribution). Based on reported errors and related causes, we suggested six generic performance domains. To assess the role of the performance dimensions, we used three approaches: flowcharts, interviews with stakeholders and review of the existing performance using time studies and medication error studies. We were able to set targets for costs, quality of information, responsiveness, employee satisfaction, and degree of innovation. We still have to establish which drug distribution system represents the best and most cost-effective way of preventing medication errors. We intend to develop an evaluation model, using the critical performance dimensions as a starting point. This model can be used as a simulation template to compare different drug distribution concepts in order to define the differences in quality and cost-effectiveness.

  10. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    International Nuclear Information System (INIS)

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to 'hand' comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).

  11. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Crowell, Michael W [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).

  12. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  13. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  14. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  15. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows, and of their possible unification, has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, to understand their commonality and differences, and to provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.

  16. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model checker may produce a corresponding diagnostic trace, which can be interpreted as a feasible schedule for many scheduling and planning problems, response-time optimization, etc. We propose swarm verification to accelerate time optimal reachability using the real-time model checker Uppaal. In swarm verification, a large number of model-checker instances execute in parallel on a computer cluster using different, typically randomized, search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging the costs of intermediate solutions to prune the search using a branch-and-bound approach.
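The cooperative branch-and-bound idea, parallel randomized searchers sharing the best cost found so far, can be sketched as follows. This is an illustrative Python sketch, not the paper's Uppaal-based implementation (which distributes instances over a cluster); the graph, worker count and iteration budget are invented for the example:

```python
import random

def swarm_search(graph, start, goal, workers=4, iters=200, seed=0):
    """Simulated swarm: several randomized searchers look for a minimum-cost
    path to the goal and share the best cost found so far, pruning any walk
    whose accumulated cost already meets the bound (branch-and-bound)."""
    rng = random.Random(seed)
    best = float("inf")            # shared upper bound on the optimal cost
    for _ in range(workers * iters):
        node, cost = start, 0
        while node != goal:
            nxt = rng.choice(list(graph[node]))
            cost += graph[node][nxt]
            node = nxt
            if cost >= best:       # prune: this walk cannot improve the bound
                break
        else:                      # reached the goal without being pruned
            best = min(best, cost)
    return best

# Toy scheduling graph with edge durations; the optimal s -> g cost is 3.
graph = {"s": {"a": 1, "b": 2}, "a": {"g": 2}, "b": {"g": 2}, "g": {}}
print(swarm_search(graph, "s", "g"))  # → 3
```

In the real tool the shared bound is exchanged between cluster nodes rather than kept in a local variable.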

  17. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies-the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW)-share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States-India, Pakistan and Israel-from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty.

  18. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  19. Design and Verification of Fault-Tolerant Components

    DEFF Research Database (Denmark)

    Zhang, Miaomiao; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    We present a systematic approach to the design and verification of fault-tolerant components with real-time properties, as found in embedded systems. A state machine model of the correct component is augmented with internal transitions that represent hypothesized faults. The models are kept relatively detailed, such that they can serve directly as blueprints for engineering, and yet remain amenable to exhaustive verification. The approach is illustrated with the design of a triple modular fault-tolerant system, a real case we received from our collaborators in the aerospace field. We use UPPAAL to model and check this design. Model checking uses concrete parameters, so we extend the result with parametric analysis using abstractions of the automata in a rigorous verification.

  20. Design and implementation of embedded hardware accelerator for diagnosing HDL-CODE in assertion-based verification environment

    Directory of Open Access Journals (Sweden)

    C. U. Ngene

    2013-08-01

    Full Text Available The use of assertions for monitoring the designer's intention in a hardware description language (HDL) model is gaining popularity, as it helps the designer to observe internal errors at the output ports of the device under verification. During verification, assertions are synthesised and the generated data are represented in tabular form. The amount of data generated can be enormous, depending on the size of the code and the number of modules that constitute the code. Furthermore, manually inspecting these data to diagnose the module with a functional violation is a time-consuming process which negatively affects the overall product development time. To locate the module with a functional violation within an acceptable diagnostic time, the data processing and analysis procedure must be accelerated. In this paper a multi-array processor (hardware accelerator) was designed and implemented in a Virtex-6 field-programmable gate array (FPGA), and it can be integrated into the verification environment. The design was captured in very high speed integrated circuit HDL (VHDL), synthesised with the Xilinx design suite ISE 13.1 and simulated with Xilinx ISIM. The multi-array processor (MAP) executes three logical operations (AND, OR, XOR) and a one's compaction operation on arrays of data in parallel. An improvement in processing and analysis time was recorded, as compared to the manual procedure, after the multi-array processor was integrated into the verification environment. It was also found that the multi-array processor, which was developed as an Intellectual Property (IP) core, can also be used in applications where output responses and a golden model represented in the form of matrices must be compared for searching, recognition and decision-making.
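As a rough functional illustration of the accelerated data path (not the VHDL implementation itself), the four MAP operations can be modelled with vectorised NumPy operations; the module/bit matrices below are invented for the example:

```python
import numpy as np

def map_ops(a, b):
    """Model of the multi-array processor's data path: elementwise AND, OR
    and XOR over two bit matrices, plus a one's compaction (per-row count
    of set bits), applied here to the XOR result."""
    xor = a ^ b
    return {"and": a & b, "or": a | b, "xor": xor, "ones": xor.sum(axis=1)}

# Rows = modules, columns = sampled output bits: compare DUT responses
# against a golden model to locate the module with a functional violation.
dut    = np.array([[1, 0, 1, 1], [0, 1, 1, 0]], dtype=np.uint8)
golden = np.array([[1, 0, 1, 1], [0, 1, 0, 0]], dtype=np.uint8)
result = map_ops(dut, golden)
print(result["ones"])  # → [0 1]: module 1 has one mismatching bit
```

The hardware performs these comparisons in parallel per array element, which is what yields the reported speed-up over manual inspection.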

  1. Formal Specification and Verification of Fully Asynchronous Implementations of the Data Encryption Standard

    Directory of Open Access Journals (Sweden)

    Wendelin Serwe

    2015-11-01

    Full Text Available This paper presents two formal models of the Data Encryption Standard (DES), the first using the international standard LOTOS, and the second using the more recent process calculus LNT. Both models encode the DES in the style of asynchronous circuits, i.e., the data-flow blocks of the DES algorithm are represented by processes communicating via rendezvous. To ensure correctness of the models, several techniques have been applied, including model checking, equivalence checking, and comparing the results produced by a prototype automatically generated from the formal model with those of existing implementations of the DES. The complete code of the models is provided as appendices and is also available on the website of the CADP verification toolbox.

  2. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
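A CDV loop in miniature: random legal stimuli, a scoreboard that checks the DUT against a reference model, and coverage bins that decide when to stop. This is a hedged Python sketch of the flow only, not UVM/SystemVerilog code; the 8-bit adder DUT and the two coverage bins are invented for illustration:

```python
import random

def dut_add(a, b):
    """Stand-in device under test: an 8-bit adder (correct by construction)."""
    return (a + b) & 0xFF

def run_cdv(seed=1, max_tx=500):
    """Coverage-driven flow: generate random stimulus, compare each DUT
    response against a reference model (scoreboard), and record coverage
    bins until the coverage goal is met."""
    rng = random.Random(seed)
    goal_bins = {"no_carry", "carry"}
    covered, errors = set(), 0
    for _ in range(max_tx):
        a, b = rng.randrange(256), rng.randrange(256)
        covered.add("carry" if a + b > 0xFF else "no_carry")  # coverage monitor
        if dut_add(a, b) != (a + b) % 256:                    # scoreboard check
            errors += 1
        if covered >= goal_bins:   # coverage goal reached: stop stimulating
            break
    return covered, errors

cov, errs = run_cdv()
print(sorted(cov), errs)  # both bins hit, no scoreboard mismatches
```

In a real UVM environment the same roles are played by sequencers (stimulus), covergroups (bins) and scoreboard components.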

  3. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is the task of checking whether a given quantum state is close to an ideal state. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not assume that independently and identically distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among the copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.

  4. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety-critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging: dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends, with a focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded software.

  5. Verification and Validation of a Three-Dimensional Orthotropic Plasticity Constitutive Model Using a Unidirectional Composite

    Directory of Open Access Journals (Sweden)

    Canio Hoffarth

    2017-03-01

    Full Text Available A three-dimensional constitutive model has been developed for modeling orthotropic composites subject to impact loads. It has three distinct components: a deformation model involving elastic and plastic deformations, a damage model, and a failure model. The model is driven by tabular data generated either from laboratory tests or via virtual testing. A unidirectional composite, T800/F3900, commonly used in the aerospace industry, is used in the verification and validation tests. While the failure model is still under development, these tests indicate that the implementation of the deformation and damage models in a commercial finite element program, LS-DYNA, is efficient, robust and accurate.

  6. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  7. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  8. A pilot scale demonstration of the DWPF process control and product verification strategy

    International Nuclear Information System (INIS)

    Hutson, N.D.; Jantzen, C.M.; Beam, D.C.

    1992-01-01

    The Defense Waste Processing Facility (DWPF) has been designed and constructed to immobilize Savannah River Site high-level liquid waste within a durable borosilicate glass matrix for permanent storage. The DWPF will be operated to produce a glass product which must meet a number of product property constraints that depend upon the final product composition. During actual operations, the DWPF will control the properties of the glass product by the controlled blending of the waste streams with a glass-forming frit to produce the final melter feed slurry. The DWPF will verify control of the glass product through analysis of vitrified samples of slurry material. In order to demonstrate the DWPF process control and product verification strategy, a pilot-scale vitrification research facility was operated in three discrete batches using simulated DWPF waste streams. All of the DWPF process control methodologies were followed and the glass product from each experiment was leached according to the Product Consistency Test. In this paper, the results of the campaign are summarized.

  9. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics appears as one of the solutions capable of addressing problems that occur with password-based data access; for example, passwords may be forgotten, and recalling many different passwords is hard. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data or not. Facial biometrics was chosen for its low implementation cost and its fairly accurate results for user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce the dimension of, as well as encrypt, the facial test image, where the image is represented by sparse signals. The encrypted data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, namely Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signals are then compared, via the Euclidean norm, with the sparse signal of the user previously saved in the system to determine the validity of the facial test image. The accuracies obtained in this research are 99% for IRLS with a face verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds using a non-optimized sensing matrix, and 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds using an optimized sensing matrix.
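The OMP recovery step at the core of such a pipeline can be sketched as below. The dictionary size, sparsity level and acceptance threshold are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily select k dictionary atoms by
    correlation with the residual, then least-squares fit y on that support."""
    residual, support = y.copy(), []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        sub = D[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        residual = y - sub @ coef
    x[support] = coef
    return x

# Hypothetical sensing setup: random dictionary D and a 2-sparse signal.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 10))
x_true = np.zeros(10); x_true[[2, 7]] = [1.5, -0.8]
y = D @ x_true
x_hat = omp(D, y, k=2)
# Verification decision: accept if the Euclidean norm of the error is small.
print(np.linalg.norm(x_hat - x_true) < 1e-6)  # → True
```

In the face verification setting, the recovered sparse vector would be compared (again via the Euclidean norm) against the enrolled user's stored sparse signal.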

  10. Report on Stage 1 of project CHEMVAL/MIRAGE: verification of speciation models

    International Nuclear Information System (INIS)

    Read, D.; Broyd, T.W.

    1989-01-01

    This report describes the results of CHEMVAL Stage 1, an international chemical model verification exercise involving the active participation of 14 organisations within the CEC countries, Sweden, Switzerland and Finland. Five case systems were studied, namely, cement, clay, sandstone, granite and limestone. Overall, good agreement was obtained even for conceptually difficult geochemical simulations. Reasons for divergence in results have been explored and recommendations are made at the appropriate stages for enhancement of the thermodynamic database. A listing of the preliminary CHEMVAL Project Database is provided. (author)

  11. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  12. Desublimation process: verification and applications of a theoretical model

    International Nuclear Information System (INIS)

    Eby, R.S.

    1979-01-01

    A theoretical model simulating the simultaneous heat and mass transfer which takes place during the desublimation of a gas to a solid is presented. Desublimer column loading profiles to experimentally verify the model were obtained using a gamma scintillation technique. The data indicate that, if the physical parameters of the desublimed frost material are known, the model can accurately predict the desublimation phenomenon. The usefulness of the model in different engineering applications is also addressed

  13. Operational flood-forecasting in the Piemonte region – development and verification of a fully distributed physically-oriented hydrological model

    Directory of Open Access Journals (Sweden)

    D. Rabuffetti

    2009-03-01

    Full Text Available A hydrological model for real-time flood forecasting for Civil Protection services requires reliability and rapidity. At present, computational capabilities overcome the rapidity needs even when a fully distributed hydrological model is adopted for a large river catchment such as the Upper Po river basin closed at Ponte Becca (nearly 40 000 km2). This approach allows simulating the whole domain and obtaining the responses of large as well as medium and small sub-catchments. The FEST-WB hydrological model (Mancini, 1990; Montaldo et al., 2007; Rabuffetti et al., 2008) is implemented. The calibration and verification activities are based on more than 100 flood events that occurred along the main tributaries of the Po river in the period 2000-2003. More than 300 meteorological stations are used to obtain the forcing fields; 10 cross-sections with continuous and reliable discharge time series are used for calibration, while verification is performed on about 40 monitored cross-sections. Furthermore, meteorological forecasting models are used to force the hydrological model with Quantitative Precipitation Forecasts (QPFs) over a 36 h horizon in "operational setting" experiments. Particular care is devoted to understanding how the QPF affects the accuracy of the Quantitative Discharge Forecasts (QDFs) and to assessing the impact of QDF uncertainty on the reliability of the warning system. Results are presented both in terms of QDF and of warning issues, highlighting the importance of an "operational based" verification approach.

  14. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The paper considers why verification of software products is necessary throughout the software life cycle. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed.

  15. Calibration and verification of models of organic carbon removal kinetics in Aerated Submerged Fixed-Bed Biofilm Reactors (ASFBBR): a case study of wastewater from an oil-refinery.

    Science.gov (United States)

    Trojanowicz, Karol; Wójcik, Włodzimierz

    2011-01-01

    The article presents a case study on the calibration and verification of mathematical models of organic carbon removal kinetics in biofilm. The chosen Harremöes and Wanner & Reichert models were calibrated with a set of model parameters obtained both during dedicated studies conducted at pilot and lab scales under petrochemical wastewater conditions and from the literature. Next, the models were successfully verified through studies carried out using a pilot ASFBBR-type bioreactor installed in an oil-refinery wastewater treatment plant. During verification the pilot biofilm reactor worked under varying surface organic loading rates (SOL), dissolved oxygen concentrations and temperatures. The verification proved that the models can be applied in practice to petrochemical wastewater treatment engineering, e.g. for the dimensioning of biofilm bioreactors.
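For reference, the Harremöes model's well-known half-order result (for a biofilm only partly penetrated by substrate, intrinsic zero-order kinetics combined with diffusion gives a surface flux proportional to the square root of the bulk concentration) can be sketched as follows; the parameter values below are purely illustrative:

```python
from math import sqrt

def half_order_flux(D_f, k0f, S_b):
    """Harremöes half-order flux for a partially penetrated biofilm:
    J = sqrt(2 * D_f * k0f * S_b), with D_f the diffusion coefficient in
    the biofilm, k0f the intrinsic zero-order rate, and S_b the bulk
    substrate concentration (consistent units assumed)."""
    return sqrt(2.0 * D_f * k0f * S_b)

# Half-order behaviour: quadrupling the bulk concentration doubles the flux.
j1 = half_order_flux(D_f=1e-4, k0f=5.0, S_b=1.0)
j4 = half_order_flux(D_f=1e-4, k0f=5.0, S_b=4.0)
print(round(j4 / j1, 6))  # → 2.0
```

Calibration in the study above amounts to fitting parameters such as D_f and k0f to the pilot- and lab-scale data.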

  16. Research on Linux Trusted Boot Method Based on Reverse Integrity Verification

    Directory of Open Access Journals (Sweden)

    Chenlin Huang

    2016-01-01

    Full Text Available Trusted computing aims to build a trusted computing environment for information systems with the help of the secure hardware TPM, which has been proved to be an effective way to counter network security threats. However, TPM chips are not yet widely deployed in most computing devices, thus limiting the applied scope of trusted computing technology. To solve the problem of lacking trusted hardware in existing computing platforms, an alternative piece of security hardware, USBKey, is introduced in this paper to simulate the basic functions of the TPM, and a new reverse USBKey-based integrity verification model is proposed to implement reverse integrity verification of the operating system boot process, which achieves a trusted boot of the operating system in end systems without TPMs. A Linux operating system booting method based on reverse integrity verification is designed and implemented in this paper, with which the integrity of data and executable files in the operating system is verified and protected during the trusted boot process, phase by phase. It implements the trusted boot of the operating system without a TPM and supports remote attestation of the platform. Enhanced by our method, the flexibility of trusted computing technology is greatly improved and it becomes possible for trusted computing to be applied in large-scale computing environments.
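A phase-by-phase integrity check of this general kind can be sketched as a hash comparison per boot stage. This is an illustrative hashlib sketch, not the paper's USBKey protocol; the stage names and images are invented:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 measurement of a boot-stage image."""
    return hashlib.sha256(data).hexdigest()

def verify_boot_chain(stages, reference):
    """Phase-by-phase integrity verification: each stage's image is hashed
    and compared against a reference measurement before the next stage may
    run. Returns the first failing stage name, or None if all pass."""
    for name, image in stages:
        if digest(image) != reference[name]:
            return name
    return None

stages = [("bootloader", b"stage1"), ("kernel", b"stage2"), ("init", b"stage3")]
reference = {name: digest(image) for name, image in stages}
print(verify_boot_chain(stages, reference))                     # → None
print(verify_boot_chain([("kernel", b"tampered")], reference))  # → kernel
```

In the paper's scheme, the reference measurements would be protected by the USBKey rather than held in plain memory, and verification proceeds in the reverse direction of the boot chain.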

  17. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the generation, from a single description, of the value of a function and its gradient at a given point, and of interval estimates of a function and its gradient on a given box. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
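Deriving both the value and the gradient of a function from a single expression description can be illustrated with forward-mode dual numbers. The interval enclosure below is a crude sampling stand-in for true interval arithmetic, and all names are invented for this sketch (the paper's library is C++):

```python
class Dual:
    """Forward-mode AD value: .v is the function value, .d the derivative."""
    def __init__(self, v, d=0.0):
        self.v, self.d = v, d
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v + o.v, self.d + o.d)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v * o.v, self.v * o.d + self.d * o.v)  # product rule
    __rmul__ = __mul__

def interval_eval(f, lo, hi, n=1000):
    """Crude enclosure of f over [lo, hi] by dense sampling: a stand-in
    for genuine interval arithmetic."""
    vals = [f(lo + (hi - lo) * i / n) for i in range(n + 1)]
    return min(vals), max(vals)

f = lambda x: x * x + 3 * x        # the single expression description
y = f(Dual(2.0, 1.0))              # seed derivative dx/dx = 1
print(y.v, y.d)                    # → 10.0 7.0 (value and gradient at x = 2)
print(interval_eval(f, 0.0, 1.0))  # → (0.0, 4.0)
```

The same expression object thus drives point evaluation, differentiation and enclosure, which is what makes automatic cross-checking of benchmark descriptions possible.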

  18. Timed verification with µCRL

    NARCIS (Netherlands)

    Blom, S.C.C.; Ioustinova, N.; Sidorova, N.; Broy, M.; Zamulin, A.V.

    2003-01-01

    µCRL is a process-algebraic language for the specification and verification of distributed systems. µCRL makes it possible to describe temporal properties of distributed systems, but it has no explicit reference to time. In this work we propose a manner of introducing discrete time without extending the language.

  19. The NASA Commercial Crew Program (CCP) Mission Assurance Process

    Science.gov (United States)

    Canfield, Amy

    2016-01-01

    In 2010, NASA established the Commercial Crew Program in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge for NASA has been how to determine that a commercial provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted Hazard Reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100 percent of these safety-critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (SMA) model does not suit the nature of the Commercial Crew Program. To that end, NASA SMA is implementing a Risk Based Assurance (RBA) process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is being developed to use the available resources efficiently in executing the verifications. This paper describes the evolution of the CCP Mission Assurance process from the beginning of the Program to its current incarnation. Topics covered include a short history of the CCP; the development of the programmatic mission assurance requirements; the current safety review process; and a description of the RBA process and its products, ending with a description of the Shared Assurance Model.

  20. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our current knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  1. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Full Text Available Testing is the de facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness and is thus suitable for the verification of safety-critical systems; however, it generally requires more expertise and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.

  2. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. 
A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  3. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.

  4. Functions of social support and self-verification in association with loneliness, depression, and stress.

    Science.gov (United States)

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.
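The mediation structure tested in the model above (social support -> self-verification -> health outcomes) can be illustrated numerically: the indirect effect is the product of the two path coefficients. The sketch below uses deterministic, made-up data, not the study's dataset; a real SEM analysis would estimate all paths simultaneously with latent variables and fit statistics.

```python
def slope(xs, ys):
    """Ordinary least-squares slope through the origin: sum(x*y)/sum(x*x)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Made-up, noise-free data: support -> self-verification -> loneliness
support = [1.0, 2.0, 3.0, 4.0, 5.0]
self_ver = [0.5 * s for s in support]     # path a = 0.5
lonely = [-0.8 * v for v in self_ver]     # path b = -0.8

a = slope(support, self_ver)
b = slope(self_ver, lonely)
indirect = a * b  # the mediated effect of support on loneliness, approx. -0.4
print(a, b, indirect)
```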

  5. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    Full Text Available With the development of wireless technology, much data communication and processing is conducted on mobile devices with wireless connections. Mobile devices will always be resource-poor relative to static ones, even as they improve in absolute capability, so they cannot perform some expensive computational tasks with their constrained computational resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices can outsource expensive computations to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair to be verified, which is undesirable, especially when the message includes secret information. In this paper, we study server-aided verification signatures with privacy, in which the message-signature pair to be verified is protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed, both of which are proved secure.

  6. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    OpenAIRE

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting requirements set by CATS consortium based on requirements in Euro NCAP AEB protocols regarding accuracy, repeatability and reproducibility using the developed test hardware. For the cases where verification t...

  7. Dense time discretization technique for verification of real time systems

    International Nuclear Information System (INIS)

    Makackas, Dalius; Miseviciene, Regina

    2016-01-01

    When verifying a real-time system, two different models can be used to handle time: discrete and dense time based models. This paper proposes a novel verification technique that calculates discrete time intervals from dense time in order to create all the system states that can be reached from the initial system state. The technique is designed for real-time systems specified by a piece-linear aggregate approach. Key words: real-time system, dense time, verification, model checking, piece-linear aggregate
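The core of such a technique is an exhaustive exploration of the states reachable from the initial state once dense time has been discretized into intervals. A minimal sketch of that reachability computation, with a hypothetical `transitions` function standing in for the aggregate model's one-interval time step:

```python
from collections import deque

def reachable_states(initial, transitions, max_steps=1000):
    """Breadth-first exploration of all states reachable from `initial`.

    `transitions(state)` must return the successor states produced by
    advancing the system over one discretized time interval.
    """
    seen = {initial}
    frontier = deque([initial])
    steps = 0
    while frontier and steps < max_steps:
        for _ in range(len(frontier)):
            state = frontier.popleft()
            for nxt in transitions(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        steps += 1
    return seen

# Toy system: a clock that wraps around after four ticks
print(sorted(reachable_states(0, lambda s: [(s + 1) % 4])))  # → [0, 1, 2, 3]
```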

  8. Simulation based mask defect repair verification and disposition

    Science.gov (United States)

    Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo

    2009-10-01

    As the industry moves towards sub-65nm technology nodes, mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real; among the real defects, which should be repaired; and how to verify the defects after repair. In this paper, we address the challenges in mask defect verification and disposition, in particular in post-repair defect verification, with an efficient methodology using SEM mask defect images and optical inspection mask defect images (the latter only for verification of phase- and transmission-related defects). We demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. An SEM image was taken for each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation based Mask Defect Disposition), has been used in this study. The software extracts edges from the mask SEM images and converts them into polygons saved in GDSII format. The converted polygons from the SEM images are then filled with the correct tone to form mask patterns and merged back into the original GDSII design file. This merge is for the purpose of contour simulation, since the SEM images normally cover only a small area (~1 μm) and accurate simulation requires including a larger area to capture optical proximity effects. With a lithography process model, the resist contour of the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such complicated model is not available, a simple

  9. TLM.open: a SystemC/TLM Frontend for the CADP Verification Toolbox

    Directory of Open Access Journals (Sweden)

    Claude Helmstetter

    2014-04-01

    Full Text Available SystemC/TLM models, which are C++ programs, allow the simulation of embedded software before hardware low-level descriptions are available and are used as golden models for hardware verification. The verification of SystemC/TLM models is an important issue, since an error in the model can mislead the system designers or reveal an error in the specifications. An open-source simulator for SystemC/TLM is provided, but there are no tools for its formal verification. In order to apply model checking to a SystemC/TLM model, a semantics for standard C++ code and for specific SystemC/TLM features must be provided. The usual approach relies on translating the SystemC/TLM code into a formal language for which a model checker is available. We propose another approach that avoids this error-prone translation effort. Given a SystemC/TLM program, the transitions are obtained by executing the original code using g++ and an extended SystemC library, and we ask the user to provide additional functions to store the current model state. These additional functions generally represent less than 20% of the size of the original model, and they make it possible to apply all CADP verification tools to the SystemC/TLM model itself.

  10. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  11. BEval: A Plug-in to Extend Atelier B with Current Verification Technologies

    Directory of Open Access Journals (Sweden)

    Valério Medeiros Jr.

    2014-01-01

    Full Text Available This paper presents BEval, an extension of Atelier B to improve automation of the verification activities in the B method or Event-B. It combines a tool for managing and verifying software projects (Atelier B) with a model checker/animator (ProB), so that the verification conditions generated in the former are evaluated with the latter. In our experiments, the two main verification strategies (manual and automatic) showed significant improvement, as ProB's evaluator proved complementary to Atelier B's built-in provers. We conducted experiments with the B model of a microcontroller instruction set; several verification conditions that we were not able to discharge automatically or manually with Atelier B's provers were automatically verified using BEval.

  12. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed verification, validation and credibility (VV and C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  13. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    Science.gov (United States)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the

  14. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

    Full Text Available The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that the literature contains mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
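Generating a function value and its gradient from a single description, as the library does, is commonly implemented with forward-mode automatic differentiation. A minimal illustrative sketch in Python (not the paper's C++ library) using dual numbers, applied to the Rosenbrock function, a standard benchmark whose gradient vanishes at its global minimum (1, 1):

```python
class Dual:
    """Dual number a + b*eps for forward-mode automatic differentiation."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._lift(o)
        return Dual(self.val - o.val, self.der - o.der)
    def __rsub__(self, o):
        return Dual(o).__sub__(self)
    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def __pow__(self, n):  # integer powers, enough for polynomial benchmarks
        return Dual(self.val ** n, n * self.val ** (n - 1) * self.der)

def gradient(f, point):
    """Evaluate grad f at `point`, seeding one coordinate direction at a time."""
    grads = []
    for i in range(len(point)):
        args = [Dual(x, 1.0 if j == i else 0.0) for j, x in enumerate(point)]
        grads.append(f(*args).der)
    return grads

# Rosenbrock: f(x, y) = (1 - x)^2 + 100 (y - x^2)^2
rosen = lambda x, y: (1 - x) ** 2 + 100 * (y - x ** 2) ** 2
print(gradient(rosen, [1.0, 1.0]))  # → [0.0, 0.0]
```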

  15. Analysing the Logic and Rigor in the Process of Verification of HACCP Plan

    Institute of Scientific and Technical Information of China (English)

    秦红

    2013-01-01

    The HACCP system is a systematic, preventive food safety control system that is receiving increasing attention from export food processing enterprises; it is widely applied in many enterprises and clearly improves quality. With the continuing development of HACCP, more and more enterprises are applying for HACCP verification or certification, whether or not domestic or foreign authorities require it. When drawing up HACCP plans, enterprises generally adopt the model given in the HACCP curriculum compiled by the US National Seafood HACCP Alliance for Training and Education, whose requirements are very strict. This paper analyses the verification of HACCP plans under this model from the two aspects of logic and rigor, with the aim of helping auditors improve the HACCP plan verification process and ensure the effective implementation of the HACCP plan.

  16. The use of measurement uncertainty in nuclear materials accuracy and verification

    International Nuclear Information System (INIS)

    Alique, O.; Vaccaro, S.; Svedkauskaite, J.

    2015-01-01

    EURATOM nuclear safeguards are based on nuclear operators accounting for and declaring the amounts of nuclear materials in their possession, and on the European Commission verifying the correctness and completeness of such declarations by means of conformity assessment practices. Both the accountancy and the verification processes involve measurements of the amounts and characteristics of nuclear materials. The uncertainties associated with these measurements play an important role in the reliability of the results of nuclear material accountancy and verification. The document "JCGM 100:2008 Evaluation of measurement data – Guide to the expression of uncertainty in measurement", issued jointly by the International Bureau of Weights and Measures (BIPM) and international organisations for metrology, standardisation and accreditation in chemistry, physics and electrotechnology, describes a universal, internally consistent, transparent and applicable method for the evaluation and expression of uncertainty in measurement. This paper discusses different processes of nuclear materials accountancy and verification in which measurement uncertainty plays a significant role. It also suggests how measurement uncertainty could be used to enhance the reliability of the results of the nuclear materials accountancy and verification processes.
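For uncorrelated inputs, the GUM referenced above combines the standard uncertainties u_i of the inputs through the sensitivity coefficients ∂f/∂x_i: u_c(y) = sqrt(Σ (∂f/∂x_i · u_i)²). A small sketch of this propagation; the measurement function and the numbers are invented for illustration, not taken from the paper:

```python
import math

def combined_uncertainty(f, values, uncertainties, eps=1e-6):
    """GUM-style combined standard uncertainty for y = f(x1, ..., xn),
    assuming uncorrelated inputs: u_c = sqrt(sum((df/dxi * u_i)**2)).
    Sensitivity coefficients are estimated by central differences.
    """
    total = 0.0
    for i, u in enumerate(uncertainties):
        hi, lo = list(values), list(values)
        hi[i] += eps
        lo[i] -= eps
        ci = (f(*hi) - f(*lo)) / (2 * eps)  # sensitivity coefficient df/dxi
        total += (ci * u) ** 2
    return math.sqrt(total)

# Invented example: mass of U-235 in a sample, m235 = m * e
u_c = combined_uncertainty(lambda m, e: m * e,
                           values=[1000.0, 0.045],       # grams, enrichment
                           uncertainties=[2.0, 0.0005])  # standard uncertainties
print(round(u_c, 3))  # → 0.508
```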

  17. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    NARCIS (Netherlands)

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting

  18. A Proof-checked Verification of a Real-Time Communication Protocol

    NARCIS (Netherlands)

    Polak, I.

    We present an analysis of a protocol developed by Philips to connect several components of an audio-system. The verification of the protocol is carried out using the timed I/O-automata model of Lynch and Vaandrager. The verification has been partially proof-checked with the interactive proof

  19. Verification Process of Behavioral Consistency between Design and Implementation programs of pSET using HW-CBMC

    International Nuclear Information System (INIS)

    Lee, Dong Ah; Lee, Jong Hoon; Yoo, Jun Beom

    2011-01-01

    Controllers in safety-critical systems such as nuclear power plants often use Function Block Diagrams (FBDs) to design embedded software. The design is implemented in a programming language such as C, which is then compiled for the particular target hardware. The implementation must have the same behavior as the design, and this behavior should be verified explicitly. For example, the pSET (POSAFE-Q Software Engineering Tool) is loader software to program the POSAFE-Q PLC (Programmable Logic Controller) and was developed as a part of the KNICS (Korea Nuclear Instrumentation and Control System R and D Center) project. It uses FBDs to design PLC software and generates ANSI-C code to be compiled into specific machine code. To verify the equivalence between the FBDs and the ANSI-C code, a mathematical proof of the code generator or a verification tool such as RETRANS can help guarantee the equivalence. Mathematical proof, however, has the weakness that it is expensive and must be repeated whenever the translator is modified. On the other hand, RETRANS reconstructs the generated source code without consideration of the generator; it has the weakness that the reconstruction of generated code needs additional analysis. This paper introduces a process for verifying behavioral consistency between the design and its implementation for the pSET using HW-CBMC. HW-CBMC is a formal verification tool that verifies equivalence between hardware and software descriptions. It requires two inputs for checking equivalence: Verilog for hardware and ANSI-C for software. In this approach, the FBDs are translated into a semantically equivalent Verilog program, and HW-CBMC verifies equivalence between the Verilog program and the ANSI-C program generated from the FBDs.

  20. Verification Process of Behavioral Consistency between Design and Implementation programs of pSET using HW-CBMC

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Ah; Lee, Jong Hoon; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2011-05-15

    Controllers in safety-critical systems such as nuclear power plants often use Function Block Diagrams (FBDs) to design embedded software. The design is implemented in a programming language such as C, which is then compiled for the particular target hardware. The implementation must have the same behavior as the design, and this behavior should be verified explicitly. For example, the pSET (POSAFE-Q Software Engineering Tool) is loader software to program the POSAFE-Q PLC (Programmable Logic Controller) and was developed as a part of the KNICS (Korea Nuclear Instrumentation and Control System R and D Center) project. It uses FBDs to design PLC software and generates ANSI-C code to be compiled into specific machine code. To verify the equivalence between the FBDs and the ANSI-C code, a mathematical proof of the code generator or a verification tool such as RETRANS can help guarantee the equivalence. Mathematical proof, however, has the weakness that it is expensive and must be repeated whenever the translator is modified. On the other hand, RETRANS reconstructs the generated source code without consideration of the generator; it has the weakness that the reconstruction of generated code needs additional analysis. This paper introduces a process for verifying behavioral consistency between the design and its implementation for the pSET using HW-CBMC. HW-CBMC is a formal verification tool that verifies equivalence between hardware and software descriptions. It requires two inputs for checking equivalence: Verilog for hardware and ANSI-C for software. In this approach, the FBDs are translated into a semantically equivalent Verilog program, and HW-CBMC verifies equivalence between the Verilog program and the ANSI-C program generated from the FBDs.
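The equivalence check described above can be pictured as a miter: feed the same inputs to the design and the implementation and assert that the outputs never differ. A toy Python sketch with a hypothetical 2-out-of-3 voter block; HW-CBMC does this symbolically over Verilog and ANSI-C rather than by enumeration:

```python
from itertools import product

# Hypothetical FBD "design": a 2-out-of-3 voter block, stated declaratively
def voter_design(a, b, c):
    return (a and b) or (b and c) or (a and c)

# Hypothetical generated "implementation", as a code generator might emit it
def voter_impl(a, b, c):
    count = int(a) + int(b) + int(c)
    return count >= 2

def equivalent(f, g, arity):
    """Exhaustive equivalence check over all Boolean inputs: the same
    miter idea a tool like HW-CBMC applies symbolically."""
    return all(bool(f(*xs)) == bool(g(*xs))
               for xs in product([False, True], repeat=arity))

print(equivalent(voter_design, voter_impl, 3))  # → True
```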

  1. Formal Verification Method for Configuration of Integrated Modular Avionics System Using MARTE

    Directory of Open Access Journals (Sweden)

    Lisong Wang

    2018-01-01

    Full Text Available The configuration information of an Integrated Modular Avionics (IMA) system includes almost all details of the whole system architecture; it is used to configure the hardware interfaces, the operating system, and the interactions among applications so that an IMA system works correctly and reliably. It is very important to ensure the correctness and integrity of the configuration in the IMA system design phase. In this paper, we focus on modelling and verification of the configuration information of an IMA/ARINC653 system based on MARTE (Modelling and Analysis for Real-time and Embedded Systems). Firstly, we define a semantic mapping from the key concepts of the configuration (such as modules, partitions, memory, processes, and communications) to MARTE model elements, and propose a method for model transformation between XML-formatted configuration information and MARTE models. Then we present a formal verification framework for ARINC653 system configuration based on theorem-proving techniques, including the construction of corresponding REAL theorems according to the semantics of the key components of the configuration information, and formal verification of theorems for properties of the IMA system such as time constraints, spatial isolation, and health monitoring. After that, the special issue of schedulability analysis of the ARINC653 system is studied. We design a hierarchical scheduling strategy that takes the characteristics of the ARINC653 system into account, and the scheduling analyzer MAST-2 is used to implement the hierarchical schedule analysis. Lastly, we design a prototype tool, called Configuration Checker for ARINC653 (CC653); two case studies show that the methods proposed in this paper are feasible and efficient.
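In hierarchical scheduling of the kind analyzed above, each partition receives a time window within a repeating major frame, and the tasks inside the partition must fit that budget. A deliberately simplified utilization-style check (illustrative only; it is not the paper's MAST-2-based analysis, and the function and parameter names are invented):

```python
def partition_schedulable(tasks, partition_window, major_frame):
    """Rough necessary condition for one ARINC 653 partition: the CPU
    utilization demanded by the partition's periodic tasks must fit the
    fraction of the major frame granted to the partition.

    `tasks` is a list of (execution_time, period) pairs.
    """
    demand = sum(c / t for c, t in tasks)
    supply = partition_window / major_frame
    return demand <= supply

# Two tasks needing 30% of the CPU, in a partition granted 40% of each frame
print(partition_schedulable([(1, 10), (4, 20)],
                            partition_window=20, major_frame=50))  # → True
```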

  2. Multi-dimensional boron transport modeling in subchannel approach: Part I. Model selection, implementation and verification of COBRA-TF boron tracking model

    Energy Technology Data Exchange (ETDEWEB)

    Ozdemir, Ozkan Emre, E-mail: ozdemir@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Avramova, Maria N., E-mail: mna109@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Sato, Kenya, E-mail: kenya_sato@mhi.co.jp [Mitsubishi Heavy Industries (MHI), Kobe (Japan)

    2014-10-15

    Highlights: ► Implementation of a multidimensional boron transport model in a subchannel approach. ► Studies on the cross-flow mechanism, heat transfer and lateral pressure drop effects. ► Verification of the implemented model via code-to-code comparison with a CFD code. - Abstract: The risk of reflux condensation, especially during a Small Break Loss Of Coolant Accident (SB-LOCA), and the difficulty of tracking the boron concentration experimentally inside the primary coolant system have motivated many computational studies on boron tracking simulations in nuclear reactors. This paper presents the development and implementation of a multidimensional boron transport model with a Modified Godunov Scheme within a thermal-hydraulic code based on a subchannel approach. The cross-flow mechanism in multiple-subchannel rod bundle geometry, as well as heat transfer and lateral pressure drop effects, are considered in the performed studies on simulations of deboration and boration cases. The Pennsylvania State University (PSU) version of the COBRA-TF (CTF) code was chosen for the implementation of three different boron tracking models: a First Order Accurate Upwind Difference Scheme, a Second Order Accurate Godunov Scheme, and a Modified Godunov Scheme. Based on the performed nodalization sensitivity studies, the Modified Godunov Scheme approach with a physical diffusion term was determined to provide the best solution in terms of precision and accuracy. As a part of the verification and validation activities, a code-to-code comparison was carried out with the STAR-CD computational fluid dynamics (CFD) code and is presented here. The objective of this study was two-fold: (1) to verify the accuracy of the newly developed CTF boron tracking model against CFD calculations; and (2) to investigate its numerical advantages as compared to other thermal-hydraulics codes.
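The simplest of the three schemes compared above, the first-order upwind difference, can be sketched in a few lines for a single 1-D channel. This is a deliberately simplified illustration; CTF's implementation covers multiple subchannels with cross flow, heat transfer, and lateral pressure drop:

```python
def upwind_step(c, velocity, dx, dt):
    """One explicit first-order upwind step for a boron concentration
    profile `c` advected with positive axial velocity; the inlet node
    c[0] is treated as a boundary condition and left unchanged."""
    cfl = velocity * dt / dx
    assert 0.0 <= cfl <= 1.0, "CFL condition violated"
    new = c[:]
    for i in range(1, len(c)):
        new[i] = c[i] - cfl * (c[i] - c[i - 1])
    return new

# Boration front entering a 10-node channel, inlet held at 1.0
c = [1.0] + [0.0] * 9
for _ in range(5):
    c = upwind_step(c, velocity=1.0, dx=0.1, dt=0.1)  # CFL = 1: exact advection
print(c)  # → [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
```

At a CFL number below 1 the scheme remains stable but smears the front, which is the numerical diffusion the higher-order Godunov schemes are introduced to reduce.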

  3. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated using the RADTRAN 4 code and the NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those of the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.
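The spreadsheet verification described above amounts to checking a code's outputs against independent closed-form hand calculations. A toy illustration of that pattern (the model and numbers are invented; these are not RISKIND's equations):

```python
import math

def dose_rate(source_strength, r):
    """Toy point-source model: dose rate falls off as 1/r**2.
    Purely illustrative of a closed-form spreadsheet check used to
    verify a transport-risk code."""
    return source_strength / (4.0 * math.pi * r ** 2)

# Spreadsheet-style check: doubling the distance must quarter the dose rate
near = dose_rate(1.0, 1.0)
far = dose_rate(1.0, 2.0)
print(math.isclose(near / far, 4.0))  # → True
```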

  4. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated using the RADTRAN 4 code and the NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those of the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  5. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of an FMCT verification provision. This paper will explore the general concerns on FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Some safeguards measures, such as environmental sampling, might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed.
Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  6. Experience in non-proliferation verification: The Treaty of Raratonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Raratonga are subdivided into two categories: those performed by the IAEA and those performed by other entities. A final provision of the Treaty of Raratonga is relevant to IAEA safeguards in that it supports the continued effectiveness of the international non-proliferation system based on the Non-Proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well

  7. Initial Verification and Validation Assessment for VERA

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States); Athe, Paridhi [North Carolina State Univ., Raleigh, NC (United States); Jones, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetzler, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-04-01

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  8. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for the evaluation of the measurement system and the determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  9. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property-preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
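    The Cone of Influence reduction named above can be sketched in a few lines: starting from the variables a property mentions, keep only variables that can influence them through the model's dependency relation. This is an illustrative sketch only, not the authors' implementation; the variable names and dependencies below are invented.

```python
# Cone of Influence (COI) sketch: backward closure over a dependency relation.
def cone_of_influence(deps, property_vars):
    """deps maps each variable to the set of variables its next-state value
    depends on. Returns the set of variables relevant to the property."""
    coi = set(property_vars)
    frontier = list(property_vars)
    while frontier:
        v = frontier.pop()
        for u in deps.get(v, set()):
            if u not in coi:
                coi.add(u)
                frontier.append(u)
    return coi

# Toy PLC-like model (invented): 'alarm' depends on 'sensor' and 'mode'.
deps = {
    "alarm": {"sensor", "mode"},
    "lamp": {"alarm"},
    "counter": {"counter", "clock"},  # irrelevant to the property below
}

# A property mentioning only 'alarm' lets 'lamp', 'counter', 'clock' be pruned.
print(sorted(cone_of_influence(deps, {"alarm"})))  # → ['alarm', 'mode', 'sensor']
```

    Model checking then runs on the pruned variable set, which is where the reported speed-ups come from.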

  10. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. Readers are enabled with this simulation-only layer to maintain verification productivity by abstracting away the physical details of the FPGA fabric.  Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  11. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  12. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  13. VBMC: a formal verification tool for VHDL programs

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-01-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into subparts and implementing each subpart in hardware and/or software as appropriate. With increasing use of programmable devices like FPGA, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since the functional bugs in such hardware subsystems used in safety critical C and I systems have disastrous consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This paper describes an indigenously developed software tool named VBMC (VHDL Bounded Model Checker) for mathematically proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts hardware design as VHDL program file, functional property in PSL, and verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason for the violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)
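    The bounded model checking idea behind tools of this kind can be illustrated with a toy sketch: unroll the transition system for at most k cycles and search for a property violation. Real BMC tools such as VBMC encode the unrolling as a SAT/SMT formula rather than enumerating paths; the model and property below are invented for illustration.

```python
# Explicit-enumeration sketch of bounded model checking (BMC).
def bmc(init, step, prop, bound):
    """Search all executions of length <= bound for a state violating prop.
    Returns a counterexample trace, or None if prop holds within the bound."""
    paths = [[s] for s in init]
    for _ in range(bound + 1):
        next_paths = []
        for path in paths:
            if not prop(path[-1]):
                return path          # counterexample found
            next_paths.extend(path + [s] for s in step(path[-1]))
        paths = next_paths
    return None                      # prop holds for the given bound

# Toy 2-bit counter that must never reach 3 (it can, so BMC finds a trace).
trace = bmc([0], lambda s: [(s + 1) % 4], lambda s: s != 3, bound=5)
print(trace)  # → [0, 1, 2, 3]
```

    As in the tool's description, a returned counterexample explains the violation, while a None result only certifies the property up to the given bound.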

  14. VBMC: a formal verification tool for VHDL program

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-08-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into sub-parts and implementing each sub-part in hardware and/or software as appropriate. With increasing use of programmable devices like FPGA, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since the functional bugs in such hardware subsystems used in safety critical C and I systems have serious consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This report describes the design of a software tool named VBMC (VHDL Bounded Model Checker). The capability of this tool is in proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts design as a VHDL program file, functional property in PSL, and verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason for the violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)

  15. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: Predicted PET images based on an analytical filtering approach for proton range verification have been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification with the GATE/GEANT4 Monte Carlo simulation codes. Methods: In this study, we performed two experiments to validate the β+-isotope predictions of the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of the predicted β+-yields as a function of irradiated proton energies. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The filtered β+-yield distributions from the analytical model are compared with the MC-simulated β+-yields in the proximal and distal fall-off ranges. Results: The results compare the filtered and MC-simulated β+-yield distributions under different conditions. First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range differences between the filtered and MC-simulated β+-yields in the distal fall-off region are within 1.5 mm for all materials used. The findings validate the usefulness of the analytical filtering model for range verification of proton therapy in GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+-yields using the GATE code, especially in the proximal region. This discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies between the MC-simulated and predicted β+-yield distributions were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using

  16. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example on life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence in and ensure the reliability of a nuclear FPGA-based safety system, disciplined life cycle processes covering specification, design implementation, and verification and validation (V&V) are needed. A specific example of how to conduct the life cycle development process and V&V for an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and provide input data to the under-test FPGA-based CHR protection system and a verified C code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  17. Final Report for 'Verification and Validation of Radiation Hydrodynamics for Astrophysical Applications'

    International Nuclear Information System (INIS)

    Zingale, M.; Howell, L.H.

    2010-01-01

    The motivation for this work is to gain experience in the methodology of verification and validation (V and V) of astrophysical radiation hydrodynamics codes. In the first period of this work, we focused on building the infrastructure to test a single astrophysical application code, Castro, developed in collaboration between Lawrence Livermore National Laboratory (LLNL) and Lawrence Berkeley Laboratory (LBL). We delivered several hydrodynamic test problems, in the form of coded initial conditions and documentation for verification, routines to perform data analysis, and a generalized regression test suite to allow for continued automated testing. Astrophysical simulation codes aim to model phenomena that elude direct experimentation. Our only direct information about these systems comes from what we observe, and may be transient. Simulation can help further our understanding by allowing virtual experimentation of these systems. However, to have confidence in our simulations requires us to have confidence in the tools we use. Verification and Validation is a process by which we work to build confidence that a simulation code is accurately representing reality. V and V is a multistep process, and is never really complete. Once a single test problem is working as desired (i.e. that problem is verified), one wants to ensure that subsequent code changes do not break that test. At the same time, one must also search for new verification problems that test the code in a new way. It can be rather tedious to manually retest each of the problems, so before going too far with V and V, it is desirable to have an automated test suite. Our project aims to provide these basic tools for astrophysical radiation hydrodynamics codes.

  18. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
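    The product-of-likelihood-ratio fusion mentioned at the end of the abstract is simple to sketch: independent likelihood ratios multiply, and the product is compared against a decision threshold. The scores and threshold below are invented for illustration and do not come from the paper.

```python
# Soft-biometric score fusion sketch: product of likelihood ratios.
def fused_decision(lr_face, lr_kin, threshold=1.0):
    """Accept a match when the product of the face and kinship likelihood
    ratios exceeds the decision threshold."""
    return lr_face * lr_kin > threshold

# A borderline face score (LR just below 1) is rescued by strong kinship
# evidence, while weak kinship evidence leaves it rejected.
print(fused_decision(0.9, 3.0))   # → True
print(fused_decision(0.9, 0.5))   # → False
```

    This is the general fusion rule for independent evidence sources; the paper's actual pipeline additionally uses an SVM-based combination.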

  19. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  20. Verification of Liveness Properties on Closed Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Andersen, Mathias; Larsen, Heine G.; Srba, Jiri

    2012-01-01

    Verification of closed timed models by explicit state-space exploration methods is an alternative to the wide-spread symbolic techniques based on difference bound matrices (DBMs). A few experiments found in the literature confirm that for the reachability analysis of timed automata explicit...... techniques can compete with DBM-based algorithms, at least for situations where the constants used in the models are relatively small. To the best of our knowledge, the explicit methods have not yet been employed in the verification of liveness properties in Petri net models extended with time. We present...

  1. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the modeling languages of different verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  2. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete ''design verification'' strategy comprises: informing the Agency of any changes in the plant system which are defined as ''safeguards relevant''; and reverification by the Agency, upon receiving notice of any changes from the Operator, of the ''design information''. 13 refs

  3. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Full Text Available Abstract Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast a convergence to a limit value takes place (convergence speed); whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties found in an analytic mathematical manner can be used for verification of the model by checking them for the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models will be described and illustrated for the Hebbian learning model, and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
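    The verification idea described above can be sketched concretely: derive an equilibrium analytically, then check that simulated trajectories converge to it. The Hebbian update used below (learning rate eta, extinction rate zeta, fixed activations a1, a2) is an assumed simple form for illustration, not necessarily the paper's exact model.

```python
# Verify an analytically derived equilibrium against simulation (Euler steps).
def simulate_hebbian(a1, a2, eta, zeta, w0, dt=0.01, steps=20000):
    """Simulate dw/dt = eta*a1*a2*(1 - w) - zeta*w from initial weight w0."""
    w = w0
    for _ in range(steps):
        w += dt * (eta * a1 * a2 * (1 - w) - zeta * w)
    return w

a1, a2, eta, zeta = 0.8, 0.9, 0.4, 0.1
# Stationary point: dw/dt = 0  =>  w* = eta*a1*a2 / (eta*a1*a2 + zeta)
w_star = eta * a1 * a2 / (eta * a1 * a2 + zeta)

# Verification step: trajectories from different initial values reach w*.
# A mismatch here would indicate an error in the model implementation.
for w0 in (0.0, 0.5, 1.0):
    assert abs(simulate_hebbian(a1, a2, eta, zeta, w0) - w_star) < 1e-3
print(round(w_star, 4))  # → 0.7423
```

    The same pattern extends to the other properties listed: monotonicity and convergence speed can be checked on the simulated trajectory against their analytic predictions.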

  4. Studies on plant dynamics of sodium-cooled fast breeder reactors - verification of a plant model

    International Nuclear Information System (INIS)

    Schubert, B.

    1988-01-01

    For the analysis of sodium-cooled FBR safety and dynamics theoretical models are used, which have to be verified. In this report the verification of the plant model SSC-L is conducted by the comparison of calculated data with measurements of the experimental reactors KNK II and RAPSODIE. For this the plant model is extended and adapted. In general only small differences between calculated and measured data are recognized. The results are used to improve and complete the plant model. The extensions of the plant model applicability are used for the calculation of a loss of heat sink transient with reactor scram, considering pipes as passive heat sinks. (orig./HP) With 69 figs., 10 tabs [de

  5. Statistical Modeling, Simulation, and Experimental Verification of Wideband Indoor Mobile Radio Channels

    Directory of Open Access Journals (Sweden)

    Yuanyuan Ma

    2018-01-01

    Full Text Available This paper focuses on the modeling, simulation, and experimental verification of wideband single-input single-output (SISO) mobile fading channels for indoor propagation environments. The indoor reference channel model is derived from a geometrical rectangle scattering model, which consists of an infinite number of scatterers. It is assumed that the scatterers are exponentially distributed over the two-dimensional (2D) horizontal plane of a rectangular room. Analytical expressions are derived for the probability density function (PDF) of the angle of arrival (AOA), the PDF of the propagation path length, the power delay profile (PDP), and the frequency correlation function (FCF). An efficient sum-of-cisoids (SOC) channel simulator is derived from the nonrealizable reference model by employing the SOC principle. It is shown that the SOC channel simulator approximates closely the reference model with respect to the FCF. The SOC channel simulator enables the performance evaluation of wideband indoor wireless communication systems with reduced realization expenditure. Moreover, the rationality and usefulness of the derived indoor channel model are confirmed by various measurements at 2.4, 5, and 60 GHz.
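    The sum-of-cisoids principle itself is simple enough to sketch: the channel gain is a finite sum of complex exponentials with fixed gains, fixed Doppler frequencies, and random phases. The parameter values below (20 cisoids, 91 Hz maximum Doppler, equal gains, cosine-spaced Dopplers) are illustrative assumptions, not the paper's fitted values.

```python
# Minimal sum-of-cisoids (SOC) channel simulator sketch.
import cmath, math, random

def soc_gain(t, gains, dopplers, phases):
    """Complex channel gain h(t) = sum_n c_n * exp(j*(2*pi*f_n*t + theta_n))."""
    return sum(c * cmath.exp(1j * (2 * math.pi * f * t + th))
               for c, f, th in zip(gains, dopplers, phases))

random.seed(0)
N = 20                                   # number of cisoids
f_max = 91.0                             # assumed maximum Doppler frequency (Hz)
gains = [math.sqrt(2.0 / N)] * N         # equal gains => average power sum(c_n^2) = 2
dopplers = [f_max * math.cos(math.pi * (n + 0.5) / (2 * N)) for n in range(N)]
phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]

# Time-averaged power of |h(t)|^2 should be close to sum(c_n^2) = 2.
samples = [abs(soc_gain(t * 1e-3, gains, dopplers, phases)) ** 2
           for t in range(2000)]
print(round(sum(samples) / len(samples), 2))
```

    With fixed gains and Dopplers the simulator is deterministic apart from the random phases, which is what keeps the realization expenditure low.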

  6. Multibody dynamical modeling for spacecraft docking process with spring-damper buffering device: A new validation approach

    Science.gov (United States)

    Daneshjou, Kamran; Alibakhshi, Reza

    2018-01-01

    In the current manuscript, the process of spacecraft docking, one of the main risky operations in an on-orbit servicing mission, is modeled based on unconstrained multibody dynamics. The spring-damper buffering device is utilized here in the docking probe-cone system for micro-satellites. Since impact occurs inevitably during the docking process and the motion characteristics of multibody systems are remarkably affected by this phenomenon, a continuous contact force model needs to be considered. The spring-damper buffering device, which keeps the spacecraft stable in orbit when impact occurs, connects a base (cylinder) inserted in the chaser satellite to the end of the docking probe. Furthermore, by considering a revolute joint equipped with a torsional shock absorber between the base and the chaser satellite, the docking probe can experience both translational and rotational motions simultaneously. Although the spacecraft docking process accompanied by the buffering mechanisms may be modeled by constrained multibody dynamics, this paper presents a simple and efficient formulation to eliminate the surplus generalized coordinates and solve the impact docking problem based on unconstrained Lagrangian mechanics. In an example problem, model verification is first accomplished by comparing the computed results with those recently reported in the literature. Second, according to a new alternative validation approach, which is based on the constrained multibody problem, the accuracy of the presented model can also be evaluated. This proposed verification approach can be applied to indirectly solve constrained multibody problems with minimum effort. The time history of the impact force, the influence of system flexibility, and the physical interaction between the shock absorber and the penetration depth caused by impact are the issues followed in this paper.
Third, the MATLAB/SIMULINK multibody dynamic analysis software will be applied to build impact docking model to validate computed results and
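The continuous contact force model invoked above is commonly written in a Hunt-Crossley spring-damper form, F = k*d^n + lam*d^n*d_dot with lam = 3k(1 - e^2)/(2*v_minus). A minimal sketch with illustrative parameter values (not those of the paper):

```python
def hunt_crossley_force(delta, delta_dot, k=1e5, e=0.8, v_minus=1.0, n=1.5):
    """Continuous contact force: Hertzian stiffness term plus a
    penetration-dependent damping term (Hunt-Crossley form).
    delta: penetration depth [m], delta_dot: penetration rate [m/s],
    e: restitution coefficient, v_minus: approach velocity at impact [m/s]."""
    if delta <= 0.0:
        return 0.0  # bodies separated: no contact force
    lam = 3.0 * k * (1.0 - e**2) / (2.0 * v_minus)
    return k * delta**n + lam * delta**n * delta_dot
```

During compression (delta_dot > 0) the damping term adds to the elastic force; during restitution it subtracts, which is how the model dissipates impact energy.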

  7. Methods to model-check parallel systems software

    International Nuclear Information System (INIS)

    Matlin, O. S.; McCune, W.; Lusk, E.

    2003-01-01

    We report on an effort to develop methodologies for formal verification of parts of the Multi-Purpose Daemon (MPD) parallel process management system. MPD is a distributed collection of communicating processes. While the individual components of the collection execute simple algorithms, their interaction leads to unexpected errors that are difficult to uncover by conventional means. Two verification approaches are discussed here: the standard model checking approach using the software model checker SPIN and the nonstandard use of a general-purpose first-order resolution-style theorem prover OTTER to conduct the traditional state space exploration. We compare modeling methodology and analyze performance and scalability of the two methods with respect to verification of MPD
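The state-space exploration that SPIN (and, here, OTTER) performs can be illustrated in miniature: an explicit-state breadth-first search over a toy two-process lock model, checking a mutual-exclusion invariant. The toy model and all names are illustrative, not part of MPD:

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state breadth-first search: visit every reachable
    state; return the first invariant-violating state, else None."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return state                       # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None                                # property holds everywhere

# Toy model: two processes enter a critical section guarded by a shared lock.
def successors(state):
    p0, p1, lock = state
    out = []
    if p0 == 'idle' and not lock: out.append(('crit', p1, True))
    if p0 == 'crit':              out.append(('idle', p1, False))
    if p1 == 'idle' and not lock: out.append((p0, 'crit', True))
    if p1 == 'crit':              out.append((p0, 'idle', False))
    return out

mutex = lambda s: not (s[0] == 'crit' and s[1] == 'crit')
```

Here `check_invariant(('idle', 'idle', False), successors, mutex)` returns None, i.e. the lock preserves mutual exclusion over the whole reachable state space; removing the lock guard would make the search return a counterexample state instead.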

  8. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project
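The benchmarking idea, estimating numerical accuracy by comparing several independent codes' solutions of the same problem, can be sketched on a toy problem, with two hand-rolled integrators standing in for the participating flow codes:

```python
def euler(f, y0, t_end, n):
    """Explicit Euler: one 'code' solving dy/dt = f(t, y)."""
    h, y, t = t_end / n, y0, 0.0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

def heun(f, y0, t_end, n):
    """Heun's method: an independent second 'code' for the same problem."""
    h, y, t = t_end / n, y0, 0.0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y += 0.5 * h * (k1 + k2)
        t += h
    return y

# Benchmark-style intercomparison on a shared test problem, dy/dt = -y.
f = lambda t, y: -y
results = {name: solver(f, 1.0, 1.0, 1000)
           for name, solver in [("euler", euler), ("heun", heun)]}
# The spread between independent solutions bounds the numerical error
# even when no analytic solution is available for comparison.
spread = abs(results["euler"] - results["heun"])
```

The spread between the two codes is a proxy for numerical accuracy; in a real benchmark many participants would report solutions and the intercomparison would flag outliers.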

  9. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    Science.gov (United States)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limitation of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, post-OPC verification solutions for cross-confirming the contour image against the target layout continue to be developed - contour generation and matching to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors by excluding false errors is the most important capability for an accurate and fast verification process - it saves not only review time and engineering resources, but also whole-wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to the borderless design, so it is too difficult to review and correct all of them. This can cause the OPC engineer to miss real defects and, at the least, delay time to market. In this paper, we studied methods for increasing the efficiency of post-OPC verification, especially for the CC check. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming the contact/via coverage. Through the optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the time needed to review and find real errors. In this paper, we suggest increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model
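A pitch-dependent CD biasing rule of the kind described reduces to a simple lookup. The rule table below is entirely hypothetical; real bias values would be calibrated against the etch behavior of the specific process:

```python
# Hypothetical pitch-dependent final-CD bias table (all values in nm):
# (min_pitch, max_pitch, bias). Dense lines etch smaller, so the bias
# is more negative at tight pitch. Illustrative numbers only.
BIAS_RULES = [(0.0, 100.0, -8.0), (100.0, 200.0, -4.0), (200.0, float("inf"), -1.0)]

def biased_width(drawn_width_nm, pitch_nm):
    """Apply a rule-based final-CD bias to a metal line, standing in
    for a full etch-model simulation during the coverage check."""
    for lo, hi, bias in BIAS_RULES:
        if lo <= pitch_nm < hi:
            return drawn_width_nm + bias
    raise ValueError("pitch not covered by rule table")
```

The coverage check would then intersect contacts/vias with the biased metal shapes rather than the drawn ones.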

  10. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing", and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix, resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
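Classical Guyan Reduction, which MGR and HR refine, condenses the stiffness and mass matrices onto a set of retained master DOFs via K_red = K_mm - K_ms * inv(K_ss) * K_sm, with the mass matrix reduced congruently. A minimal numpy sketch (illustrative, not the paper's implementation):

```python
import numpy as np

def guyan_reduce(K, M, master):
    """Classical Guyan Reduction: retain the `master` DOFs and
    statically condense out the remaining (slave) DOFs."""
    n = K.shape[0]
    slave = [i for i in range(n) if i not in master]
    order = master + slave                     # permute to [master; slave]
    Kp = K[np.ix_(order, order)]
    Mp = M[np.ix_(order, order)]
    m = len(master)
    Kss, Ksm = Kp[m:, m:], Kp[m:, :m]
    # Static transformation: slave DOFs follow the master DOFs.
    T = np.vstack([np.eye(m), -np.linalg.solve(Kss, Ksm)])
    return T.T @ Kp @ T, T.T @ Mp @ T
```

For two springs of stiffness 2 in series with the far node grounded, condensing onto the free end recovers the series stiffness 2*2/(2+2) = 1, which is a quick sanity check of the implementation.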

  11. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    Science.gov (United States)

    Montgomery, Todd L.

    1995-01-01

This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP disproves this. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LANs). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design, implementation, and verification models. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.

  12. Modeling the dynamics of internal flooding - verification analysis

    International Nuclear Information System (INIS)

    Filipov, K.

    2011-01-01

The results of the verification analysis of the WATERFOW software, developed for the purposes of reactor building internal flooding analysis, are presented. The integrated code MELCOR was selected for benchmarking. Considering the complex structure of the reactor building, sample tests were used to cover the characteristic points of the internal flooding analysis. The inapplicability of MELCOR to the internal flooding study has been demonstrated

  13. Experimental verification of preset time count rate meters based on adaptive digital signal processing algorithms

    Directory of Open Access Journals (Sweden)

    Žigić Aleksandar D.

    2005-01-01

Full Text Available Experimental verifications of two optimized adaptive digital signal processing algorithms implemented in two preset time count rate meters were performed according to appropriate standards. The random pulse generator, realized using a personal computer, was used as an artificial radiation source for preliminary system tests and performance evaluations of the proposed algorithms. Then measurement results for background radiation levels were obtained. Finally, measurements with a natural radiation source, the radioisotope 90Sr-90Y, were carried out. Measurements conducted without and with radioisotopes for the specified errors of 10% and 5% showed good agreement with theoretical predictions.
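The statistics behind preset-time counting can be checked with a quick Monte Carlo sketch (illustrative, not the instrument's firmware): for Poisson counting the relative standard error is 1/sqrt(N), so a specified 10% error calls for about 100 counts in the gate.

```python
import random

def preset_time_rate(true_rate_cps, gate_s, rng):
    """One preset-time measurement: count Poisson pulses arriving in a
    fixed gate, then return the estimated rate N / T."""
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(true_rate_cps)   # exponential inter-arrival times
        if t > gate_s:
            return n / gate_s
        n += 1

# 100 cps source, 1 s gate -> ~100 counts, so ~10 % relative error expected.
rng = random.Random(42)
rates = [preset_time_rate(100.0, 1.0, rng) for _ in range(2000)]
mean = sum(rates) / len(rates)
spread = (sum((r - mean) ** 2 for r in rates) / len(rates)) ** 0.5
```

The observed spread of the estimated rates should be close to sqrt(100) = 10 cps, i.e. 10% of the true rate, matching the theoretical prediction the record refers to.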

  14. Independent tube verification and dynamic tracking in et inspection of nuclear steam generator

    International Nuclear Information System (INIS)

    Xiongzi, Li; Zhongxue, Gan; Lance, Fitzgibbons

    2001-01-01

The full text follows. In the examination of pressure boundary tubes in steam generators of commercial pressurized water nuclear power plants (PWRs), it is critical to know exactly which particular tube is being accessed. There are no definitive landmarks or markings on the individual tubes. Today this is done manually; it is tedious, interrupts the normal inspection work, and is difficult due to the presence of water on the tube surface, plug ends instead of tube openings in the field of view, and varying lighting quality. In order to eliminate human error and increase the efficiency of operation, there is a need to identify tube position during the inspection process, independent of robot encoder position and motion. A process based on a Cognex MVS-8200 system and its application function package has been developed to independently identify tube locations. ABB Combustion Engineering Nuclear Power's Outage Services group, USPPL, in collaboration with ABB Power Plant Laboratories' Advanced Computers and Controls department, has developed a new vision-based Independent Tube Verification system (GENESIS-ITVS-TM). The system employs a model-based tube-shape detection algorithm and a dynamic tracking methodology to detect the true tool position and its offsets from the identified tube location. GENESIS-ITVS-TM is an automatic Independent Tube Verification System (ITVS). Independent tube verification is a tube validation technique using computer vision, without using any robot position parameters. This process independently counts the tubes along the horizontal and vertical axes of the plane of the steam generator tube sheet as the work tool is moved. Thus it knows the true position in the steam generator, given a known starting point. This is analogous to the operator's method of counting tubes for verification, but it is automated. GENESIS-ITVS-TM works independently of the robot position, velocity, or acceleration.
The tube position information is solely obtained from

  15. An Integrated Approach to Conversion, Verification, Validation and Integrity of AFRL Generic Engine Model and Simulation (Postprint)

    Science.gov (United States)

    2007-02-01

Excerpted slide titles (block-diagram port labels and page residue omitted): Motivation for Modeling and Simulation Work; the Augmented Generic Engine Model (AGEM); Model Verification and Validation (V&V); Assessment of AGEM V&V

  16. Electric and hybrid vehicle self-certification and verification procedures: Market Demonstration Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-03-01

The process by which a manufacturer of an electric or hybrid vehicle certifies that his vehicle meets the DOE Performance Standards for Demonstration is described. Such certification is required for any vehicles to be purchased under the Market Demonstration Program. The document also explains the verification testing process followed by DOE to verify compliance. Finally, it outlines manufacturer responsibilities and presents procedures for the recertification of vehicles that have failed verification testing.

  17. Self-Verification of Ability through Biased Performance Memory.

    Science.gov (United States)

    Karabenick, Stuart A.; LeBlanc, Daniel

    Evidence points to a pervasive tendency for persons to behave to maintain their existing cognitive structures. One strategy by which this self-verification is made more probable involves information processing. Through attention, encoding and retrieval, and the interpretation of events, persons process information so that self-confirmatory…

  18. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    Science.gov (United States)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS(Registered Trademark) for the probabilistic analysis, and NASGRO(Registered Trademark) for the fracture

  19. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    Science.gov (United States)

    Herbuś, K.; Ociepka, P.

    2017-08-01

This work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The considered problem concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller, where the way the actuators operate is verified against the logical relationships assumed in the control system. The actuators of the considered control system are three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of operation of the electro-pneumatic control system was implemented in the Automation Studio software of the B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using an OPC server as the data-exchange tool.

  20. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density has been used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and the cost-sensitive classifier was found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  1. CFD modeling and experimental verification of oscillating flow and heat transfer processes in the micro coaxial Stirling-type pulse tube cryocooler operating at 90-170 Hz

    Science.gov (United States)

    Zhao, Yibo; Yu, Guorui; Tan, Jun; Mao, Xiaochen; Li, Jiaqi; Zha, Rui; Li, Ning; Dang, Haizheng

    2018-03-01

    This paper presents the CFD modeling and experimental verifications of oscillating flow and heat transfer processes in the micro coaxial Stirling-type pulse tube cryocooler (MCSPTC) operating at 90-170 Hz. It uses neither double-inlet nor multi-bypass while the inertance tube with a gas reservoir becomes the only phase-shifter. The effects of the frequency on flow and heat transfer processes in the pulse tube are investigated, which indicates that a low enough frequency would lead to a strong mixing between warm and cold fluids, thereby significantly deteriorating the cooling performance, whereas a high enough frequency would produce the downward sloping streams flowing from the warm end to the axis and almost puncturing the gas displacer from the warm end, thereby creating larger temperature gradients in radial directions and thus undermining the cooling performance. The influence of the pulse tube length on the temperature and velocity when the frequencies are much higher than the optimal one are also discussed. A MCSPTC with an overall mass of 1.1 kg is worked out and tested. With an input electric power of 59 W and operating at 144 Hz, it achieves a no-load temperature of 61.4 K and a cooling capacity of 1.0 W at 77 K. The changing tendencies of tested results are in good agreement with the simulations. The above studies will help to thoroughly understand the underlying mechanism of the inertance MCSPTC operating at very high frequencies.

  2. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. 
We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
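The 2x2 contingency-table assessment mentioned for CME arrival forecasts reduces to a few standard terrestrial-verification metrics; a small illustrative sketch (not MOSWOC code):

```python
def contingency_scores(hits, false_alarms, misses, correct_negatives):
    """Standard 2x2 verification metrics for yes/no forecasts."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)                       # probability of detection
    far = b / (a + b)                       # false alarm ratio
    # Heidke skill score: accuracy relative to random-chance agreement.
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = (a + d - expected) / (n - expected)
    return pod, far, hss
```

For example, 20 hits, 5 false alarms, 10 misses, and 65 correct negatives give a detection probability of 2/3 with a 20% false alarm ratio; a skill score near zero would indicate no improvement over chance, which is what a benchmark comparison is meant to expose.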

  3. Verification of atmospheric diffusion models using data of long term atmospheric diffusion experiments

    International Nuclear Information System (INIS)

    Tamura, Junji; Kido, Hiroko; Hato, Shinji; Homma, Toshimitsu

    2009-03-01

Straight-line or segmented plume models are commonly used as atmospheric diffusion models in probabilistic accident consequence assessment (PCA) codes due to cost and time savings. The PCA code OSCAAR, developed by the Japan Atomic Energy Research Institute (present: Japan Atomic Energy Agency), uses the variable puff trajectory model to calculate the atmospheric transport and dispersion of released radionuclides. In order to investigate uncertainties involved with the structure of the atmospheric dispersion/deposition model in OSCAAR, we introduced the more sophisticated computer codes RAMS (a regional meteorological model) and HYPACT (an atmospheric transport model), which were developed by Colorado State University, and comparative analyses between OSCAAR and RAMS/HYPACT have been performed. In this study, model verification of OSCAAR and RAMS/HYPACT was conducted using data from long-term atmospheric diffusion experiments carried out in Tokai-mura, Ibaraki-ken. The model predictions and the results of the atmospheric diffusion experiments showed relatively good agreement, and the model performance of OSCAAR was comparable to that of RAMS/HYPACT. (author)

  4. Verification of industrial x-ray machine: MINT's experience

    International Nuclear Information System (INIS)

    Aziz Amat; Saidi Rajab; Eesan Pasupathi; Saipo Bahari Abdul Ratan; Shaharudin Sayuti; Abd Nassir Ibrahim; Abd Razak Hamzah

    2005-01-01

The radiation and electrical safety of industrial x-ray equipment is required to meet Atomic Energy Licensing Board (AELB) guidelines (LEM/TEK/42) at the time of installation, and periodic verification should subsequently be ensured. The purpose of the guide is to explain the requirements employed in conducting tests on industrial x-ray apparatus so that it can be certified as meeting local legislation and regulations. Verification is aimed at providing safety assurance information on electrical requirements and the minimum radiation exposure to the operator. This regulation is applied to new models imported into the Malaysian market. Since June 1997, the Malaysian Institute for Nuclear Technology Research (MINT) has been approved by the AELB to provide verification services to private companies, government, and corporate bodies throughout Malaysia. In early January 1997, the AELB made it mandatory that all x-ray equipment for industrial purposes (especially industrial radiography) must fulfil certain performance tests based on the LEM/TEK/42 guidelines. MINT, as the third-party verifier, encourages users to improve maintenance of the equipment. MINT's experience in measuring the performance of intermittent and continuous duty rating single-phase industrial x-ray machines in the year 2004 indicated that all of the irradiating apparatus tested passed the test and met the requirements of the guideline. From MINT records, from 1997 to 2005, three x-ray models did not meet the requirements and thus were not allowed to be used unless the manufacturers were willing to modify them to meet AELB requirements. These verification procedures on the electrical and radiation safety of industrial x-ray equipment have significantly improved the maintenance culture and safety awareness in the usage of x-ray apparatus in the industrial environment. (Author)

  5. Time-space modal logic for verification of bit-slice circuits

    Science.gov (United States)

    Hiraishi, Hiromi

    1996-03-01

    The major goal of this paper is to propose a new modal logic aiming at formal verification of bit-slice circuits. The new logic is called as time-space modal logic and its major feature is that it can handle two transition relations: one for time transition and the other for space transition. As for a verification algorithm, a symbolic model checking algorithm of the new logic is shown. This could be applicable to verification of bit-slice microprocessor of infinite bit width and 1D systolic array of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.

  6. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
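The swarm idea, many small diversified searches in place of one exhaustive run, can be caricatured in a few lines; the toy state space and seeds below are purely illustrative:

```python
import random

def swarm_worker(successors, initial, is_error, seed, max_steps=10_000):
    """One memory-light randomized searcher; its seed makes it probe a
    different slice of the state space than its sibling workers."""
    rng = random.Random(seed)
    state = initial
    for _ in range(max_steps):
        if is_error(state):
            return state
        state = rng.choice(successors(state))
    return None

# Toy state space: integers mod 100; the 'needle' is one particular state.
succ = lambda s: [(3 * s + 1) % 100, (s + 7) % 100]
found = [swarm_worker(succ, 0, lambda s: s == 23, seed) for seed in range(20)]
hits = [f for f in found if f is not None]
```

Real swarm verification diversifies far more aggressively (different search orders, hash functions, and property subsets per worker), but the principle is the same: many cheap, independent, incomplete searches collectively cover the haystack.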

  7. The Evolution of the NASA Commercial Crew Program Mission Assurance Process

    Science.gov (United States)

    Canfield, Amy C.

    2016-01-01

    In 2010, the National Aeronautics and Space Administration (NASA) established the Commercial Crew Program (CCP) in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the Commercial Provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board which reviews and approves provider submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100% of these safety critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (S&MA) model does not support the nature of the CCP. To that end, NASA S&MA is implementing a Risk Based Assurance process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications.

  8. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  9. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  10. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  11. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
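    For fixed-length feature vectors, the likelihood-ratio test described in this record can be sketched as follows. This is a minimal illustration with hypothetical diagonal-covariance Gaussian user and background models, not the authors' actual matcher:

```python
import math

def log_likelihood_ratio(x, user_mean, user_var, bg_mean, bg_var):
    """Log-likelihood ratio of feature vector x under a user-specific
    Gaussian model versus a background (impostor) Gaussian model,
    assuming diagonal covariances."""
    llr = 0.0
    for xi, mu_u, vu, mu_b, vb in zip(x, user_mean, user_var, bg_mean, bg_var):
        log_p_user = -0.5 * (math.log(2 * math.pi * vu) + (xi - mu_u) ** 2 / vu)
        log_p_bg = -0.5 * (math.log(2 * math.pi * vb) + (xi - mu_b) ** 2 / vb)
        llr += log_p_user - log_p_bg
    return llr

def verify(x, user_mean, user_var, bg_mean, bg_var, threshold=0.0):
    """Accept the claimed identity when the log-likelihood ratio
    exceeds the decision threshold."""
    return log_likelihood_ratio(x, user_mean, user_var, bg_mean, bg_var) >= threshold

# Hypothetical two-dimensional models and probe samples.
user_mean, user_var = [1.0, 2.0], [0.25, 0.25]   # claimed user's model
bg_mean, bg_var = [0.0, 0.0], [4.0, 4.0]         # background population
genuine = [1.1, 1.9]     # close to the user's mean -> high LLR
impostor = [-2.0, -3.0]  # far from the user's mean -> low LLR
```

    Moving the threshold trades off false acceptances against false rejections, which is why the likelihood ratio is the optimal statistic for this detection problem.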

  12. Class 1E software verification and validation: Past, present, and future

    Energy Technology Data Exchange (ETDEWEB)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  13. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)

  14. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    Energy Technology Data Exchange (ETDEWEB)

    Mills, Brantley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria have been compared to simulations using the commercial CFD software ANSYS Fluent and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort. His advice and feedback were instrumental to its completion. Thanks also goes to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks goes to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.
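    A mesh refinement study of this kind typically reports an observed order of accuracy computed from results on three systematically refined meshes; a minimal sketch with hypothetical solution values (not the Aria data):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy p from a scalar result computed on
    three systematically refined meshes (refinement ratio r):
        p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
    For a second-order scheme, p approaches 2 as the meshes are refined."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Hypothetical peak temperatures from coarse, medium and fine meshes whose
# errors shrink by a factor of 4 per refinement (i.e., second-order behavior).
p = observed_order(412.8, 403.2, 400.8)
```

    An observed order matching the scheme's formal order is the usual evidence of solution convergence in such studies.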

  15. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

    The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominant tool used to validate a design in industry. Formal verification overcomes the weakness of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses upon a technique that integrates the best characteristics of both simulation and formal verification methods to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty in this approach consists of three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioning decompose a large design into sub-components and feed them to different verification and/or simulation tools based upon the known strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of design validation and improves the coverage by further partitioning. Various simulation and verification tools comprising IDV are evaluated, and an example is used to illustrate the overall validation process. The overall process successfully validates the example to a high coverage rate within a short time. The experimental result shows that our approach is a very promising design validation method.

  16. The role of the United Nations in the field of verification

    International Nuclear Information System (INIS)

    1991-01-01

    By resolution 43/81 B of 7 December 1988, the General Assembly requested the Secretary General to undertake, with the assistance of a group of qualified governmental experts, an in-depth study of the role of the United Nations in the field of verification. In August 1990, the Secretary-General transmitted to the General Assembly the unanimously approved report of the experts. The report is structured in six chapters and contains a bibliographic appendix on technical aspects of verification. The Introduction provides a brief historical background on the development of the question of verification in the United Nations context, culminating with the adoption by the General Assembly of resolution 43/81 B, which requested the study. Chapters II and III address the definition and functions of verification and the various approaches, methods, procedures and techniques used in the process of verification. Chapters IV and V examine the existing activities of the United Nations in the field of verification, possibilities for improvements in those activities as well as possible additional activities, while addressing the organizational, technical, legal, operational and financial implications of each of the possibilities discussed. Chapter VI presents the conclusions and recommendations of the Group

  17. Verification and Validation of the Tritium Transport Code TMAP7

    International Nuclear Information System (INIS)

    Longhurst, Glen R.; Ambrosek, James

    2005-01-01

    The TMAP code has been upgraded to version 7, which includes radioactive decay along with many features implemented in prior versions. Pursuant to acceptance and release for distribution, the code was exercised in a variety of problem types to demonstrate that it provides results in agreement with theoretical results for cases where those are available. It has also been used to model certain experimental results. In this paper, the capabilities of the TMAP7 code are demonstrated by presenting some of the results from the verification and validation process

  18. Focussed approach to verification under FMCT

    International Nuclear Information System (INIS)

    Bragin, V.; Carlson, J.; Bardsley, J.; Hill, J.

    1998-01-01

    FMCT will have different impacts on individual states due to the enormous variance in their nuclear fuel cycles and the associated fissile material inventories. The problem is how to negotiate a treaty that would achieve results favourable for all participants, given that interests and priorities vary so much. We believe that focussed verification, confined to safeguarding of enrichment and reprocessing facilities in NWS and TS, coupled with verification of unirradiated direct-use material produced after entry-into-force of a FMCT and supported with measures to detect possible undeclared enrichment and reprocessing activities, is technically adequate for the FMCT. Eventually this would become the appropriate model for all states party to the NPT

  19. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  20. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  2. Accelerating SystemVerilog UVM Based VIP to Improve Methodology for Verification of Image Signal Processing Designs Using HW Emulator

    OpenAIRE

    Jain, Abhishek; Gupta, Piyush Kumar; Gupta, Dr. Hima; Dhar, Sachish

    2014-01-01

    In this paper we present the development of Acceleratable UVCs from standard UVCs in SystemVerilog and their usage in UVM based Verification Environment of Image Signal Processing designs to increase run time performance. This paper covers development of Acceleratable UVCs from standard UVCs for internal control and data buses of ST imaging group by partitioning of transaction-level components and cycle-accurate signal-level components between the software simulator and hardware accelerator r...

  3. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    Seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method for constructing function models and the IDEF1x method for making information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  4. Remaining Sites Verification Package for the 100-F-26:12, 1.8-m (72-in.) Main Process Sewer Pipeline. Attachment to Waste Site Reclassification Form 2007-034

    International Nuclear Information System (INIS)

    Capron, J.M.

    2008-01-01

    The 100-F-26:12 waste site was an approximately 308-m-long, 1.8-m-diameter east-west-trending reinforced concrete pipe that joined the North Process Sewer Pipelines (100-F-26:1) and the South Process Pipelines (100-F-26:4) with the 1.8-m reactor cooling water effluent pipeline (100-F-19). In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River

  5. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  6. Response to "Improving Patient Safety With Error Identification in Chemotherapy Orders by Verification Nurses"

    Science.gov (United States)

    Zhu, Ling-Ling; Lv, Na; Zhou, Quan

    2016-12-01

    We read, with great interest, the study by Baldwin and Rodriguez (2016), which described the role of the verification nurse and details the verification process in identifying errors related to chemotherapy orders. We strongly agree with their findings that a verification nurse, collaborating closely with the prescribing physician, pharmacist, and treating nurse, can better identify errors and maintain safety during chemotherapy administration.

  7. Lessons learned in the verification, validation and application of a coupled heat and fluid flow code

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1986-01-01

    A summary is given of the author's recent studies in the verification, validation and application of a coupled heat and fluid flow code. Verification has been done against eight analytic and semi-analytic solutions, including solutions involving thermal buoyancy flow and fracture flow. Comprehensive field validation studies over a period of four years are discussed. The studies are divided into three stages: (1) history matching, (2) double-blind prediction and confirmation, and (3) design optimization. At each stage, parameter sensitivity studies are performed. To study the applications of mathematical models, a problem proposed by the International Energy Agency (IEA) is solved using this verified and validated numerical model as well as two simpler models. One of the simpler models is a semi-analytic method that assumes the uncoupling of the heat and fluid flow processes; the other is a graphical method based on a large number of approximations. Variations are added to the basic IEA problem to point out the limits of the ranges of application of each model. A number of lessons have been learned from the above investigations; these are listed and discussed

  8. Verification of the MOTIF code version 3.0

    International Nuclear Information System (INIS)

    Chan, T.; Guvanasen, V.; Nakka, B.W.; Reid, J.A.K.; Scheier, N.W.; Stanchell, F.W.

    1996-12-01

    As part of the Canadian Nuclear Fuel Waste Management Program (CNFWMP), AECL has developed a three-dimensional finite-element code, MOTIF (Model Of Transport In Fractured/porous media), for detailed modelling of groundwater flow, heat transport and solute transport in a fractured rock mass. The code solves the transient and steady-state equations of groundwater flow, solute (including one-species radionuclide) transport, and heat transport in variably saturated fractured/porous media. The initial development was completed in 1985 (Guvanasen 1985) and version 3.0 was completed in 1986. This version is documented in detail in Guvanasen and Chan (in preparation). This report describes a series of fourteen verification cases which have been used to test the numerical solution techniques and coding of MOTIF, as well as to demonstrate some of the MOTIF analysis capabilities. For each case the MOTIF solution has been compared with a corresponding analytical or independently developed alternate numerical solution. Several of the verification cases were included in Level 1 of the International Hydrologic Code Intercomparison Project (HYDROCOIN). The MOTIF results for these cases were also described in the HYDROCOIN Secretariat's compilation and comparison of results submitted by the various project teams (Swedish Nuclear Power Inspectorate 1988). It is evident from the graphical comparisons presented that the MOTIF solutions for the fourteen verification cases are generally in excellent agreement with known analytical or numerical solutions obtained from independent sources. This series of verification studies has established the ability of the MOTIF finite-element code to accurately model the groundwater flow and solute and heat transport phenomena for which it is intended. (author). 20 refs., 14 tabs., 32 figs

  9. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for the design and verification of Field Programmable Gate Array (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C systems design, development, verification and validation (V and V) are discussed, considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic design for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms, used in FPGA-based reactor trip systems, is proposed. The results of applying test-based techniques to the assessment of FPGA electronic design algorithms for the reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  10. North Korea's nuclear weapons program: verification priorities and new challenges.

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Duk-ho (Korean Consulate General in New York)

    2003-12-01

    A comprehensive settlement of the North Korean nuclear issue may involve military, economic, political, and diplomatic components, many of which will require verification to ensure reciprocal implementation. This paper sets out potential verification methodologies that might address a wide range of objectives. The inspection requirements set by the International Atomic Energy Agency form the foundation, first as defined at the time of the Agreed Framework in 1994, and now as modified by the events since revelation of the North Korean uranium enrichment program in October 2002. In addition, refreezing the reprocessing facility and 5 MWe reactor, taking possession of possible weapons components and destroying weaponization capabilities add many new verification tasks. The paper also considers several measures for the short-term freezing of the North's nuclear weapon program during the process of negotiations, should that process be protracted. New inspection technologies and monitoring tools are applicable to North Korean facilities and may offer improved approaches over those envisioned just a few years ago. These are noted, and potential bilateral and regional verification regimes are examined.

  11. Virtual optical network provisioning with unified service logic processing model for software-defined multidomain optical networks

    Science.gov (United States)

    Zhao, Yongli; Li, Shikun; Song, Yinan; Sun, Ji; Zhang, Jie

    2015-12-01

    Hierarchical control architecture is designed for software-defined multidomain optical networks (SD-MDONs), and a unified service logic processing model (USLPM) is first proposed for various applications. USLPM-based virtual optical network (VON) provisioning process is designed, and two VON mapping algorithms are proposed: random node selection and per controller computation (RNS&PCC) and balanced node selection and hierarchical controller computation (BNS&HCC). Then an SD-MDON testbed is built with OpenFlow extension in order to support optical transport equipment. Finally, VON provisioning service is experimentally demonstrated on the testbed along with performance verification.

  12. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for the verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models, and a combination of the models is also taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.)
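    The significance test at the core of such materials balance verification can be sketched as follows; the numbers are hypothetical, and the paper's two diversion models and sampling plans go well beyond this minimal sketch:

```python
def muf_test(begin_inventory, inputs, outputs, end_inventory, sigma_muf, z=1.645):
    """Close a materials balance: MUF (material unaccounted for) is the
    book inventory minus the measured ending inventory.  The balance is
    flagged when MUF exceeds z standard deviations of its combined
    measurement uncertainty (z = 1.645 gives a ~5% false-alarm rate for
    a one-sided test under normally distributed measurement errors)."""
    muf = begin_inventory + sum(inputs) - sum(outputs) - end_inventory
    return muf, muf > z * sigma_muf

# Hypothetical balance data for one inventory period (kg of uranium).
muf, alarm = muf_test(100.0, [50.0, 50.0], [120.0], 78.0, sigma_muf=1.0)
```

    In this example the balance closes with MUF = 2.0 kg against a 1 kg measurement standard deviation, so the period is flagged for follow-up.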

  13. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process; not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical
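    The "gold standard" comparison to analytical solutions can be illustrated with a minimal code-verification sketch: a toy explicit heat-equation solver checked against its exact single-mode solution. This illustrates the practice, not the Fluidity code itself:

```python
import math

def solve_heat_explicit(n=50, steps=200, alpha=1.0, t_end=0.01):
    """Explicit finite-difference solution of u_t = alpha * u_xx on [0, 1]
    with u(0, t) = u(1, t) = 0 and u(x, 0) = sin(pi * x)."""
    dx = 1.0 / n
    dt = t_end / steps
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme is unstable for r > 0.5"
    u = [math.sin(math.pi * i * dx) for i in range(n + 1)]
    for _ in range(steps):
        u = [0.0] + [u[i] + r * (u[i + 1] - 2.0 * u[i] + u[i - 1])
                     for i in range(1, n)] + [0.0]
    return u

def analytical(x, t, alpha=1.0):
    """Exact solution: the initial sine mode decays as exp(-alpha*pi^2*t)."""
    return math.exp(-alpha * math.pi ** 2 * t) * math.sin(math.pi * x)

# Code-verification check: maximum pointwise error against the exact solution.
n, t_end = 50, 0.01
u = solve_heat_explicit(n=n, t_end=t_end)
err = max(abs(u[i] - analytical(i / n, t_end)) for i in range(n + 1))
```

    Wrapped in an assertion on `err`, a check like this runs on every commit, which is what turns verification into a continuous process rather than a one-off event.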

  14. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    Science.gov (United States)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  15. Study and characterization of arrays of detectors for dosimetric verification of radiotherapy, analysis of business solutions

    International Nuclear Information System (INIS)

    Gago Arias, A.; Brualla Gonzalez, L.; Gomez Rodriguez, F.; Gonzalez Castano, D. M.; Pardo Montero, J.; Luna Vega, V.; Mosquera Sueiro, J.; Sanchez Garcia, M.

    2011-01-01

    This paper presents a comparative study of the detector arrays developed by different commercial vendors in response to the demand for devices that speed up the verification process. It analyzes the effect of the spatial response of the individual detectors on the measurement of dose distributions, models that response, and assesses the ability of the arrays to detect variations in a treatment.

  16. Formal verification an essential toolkit for modern VLSI design

    CERN Document Server

    Seligman, Erik; Kumar, M V Achutha Kiran

    2015-01-01

    Formal Verification: An Essential Toolkit for Modern VLSI Design presents practical approaches for design and validation, with hands-on advice for working engineers integrating these techniques into their work. Building on a basic knowledge of System Verilog, this book demystifies FV and presents the practical applications that are bringing it into mainstream design and validation processes at Intel and other companies. The text prepares readers to effectively introduce FV in their organization and deploy FV techniques to increase design and validation productivity. Presents formal verific

  17. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
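    The loop-invariant idea at the core of such a framework can be illustrated concretely (a toy Python sketch, not the authors' Java PathFinder machinery): a candidate invariant is asserted on every iteration, and together with the loop exit condition it implies the postcondition.

```python
def sum_to(n):
    """Sum 0 + 1 + ... + (n - 1) while checking a candidate loop
    invariant on every iteration; the invariant, combined with the
    exit condition i == n, implies the postcondition."""
    s, i = 0, 0
    while i < n:
        assert s == i * (i - 1) // 2          # candidate loop invariant
        s += i
        i += 1
    assert i == n and s == n * (n - 1) // 2   # postcondition at loop exit
    return s
```

    Symbolic execution explores such assertions over symbolic rather than concrete inputs, so one analysis covers all executions up to a bound instead of a single test run.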

  18. Verification of SuperMC with ITER C-Lite neutronic model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Shu [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui, 230027 (China); Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); Yu, Shengpeng [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); He, Peng, E-mail: peng.he@fds.org.cn [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China)

    2016-12-15

    Highlights: • Verification of the SuperMC Monte Carlo transport code with the ITER C-Lite model. • The modeling of the ITER C-Lite model using the latest SuperMC/MCAM. • All the calculated quantities agree well with MCNP. • Efficient variance reduction methods are adopted to accelerate the calculation. - Abstract: In pursuit of accurate and high-fidelity simulation, the reference model of ITER is becoming more and more detailed and complicated. Due to the complexity in geometry and the thick shielding of the reference model, accurate modeling and precise simulation of fusion neutronics are very challenging. Facing these difficulties, SuperMC, the Monte Carlo simulation software system developed by the FDS Team, has optimized its CAD interface for the automatic conversion of more complicated models and increased its calculation efficiency with advanced variance reduction methods. To demonstrate its capabilities of automatic modeling, neutron/photon coupled simulation and visual analysis for the ITER facility, numerical benchmarks using the ITER C-Lite neutronic model were performed. The nuclear heating in the divertor and the inboard toroidal field (TF) coils and a global neutron flux map were evaluated. All the calculated nuclear heating results are compared with those of the MCNP code, and good consistency between the two codes is shown. Using the global variance reduction methods in SuperMC, the average speed-up is 292 times for the calculation of the inboard TF coils nuclear heating, and 91 times for the calculation of the global flux map, compared with the analog run. These tests show that SuperMC is suitable for the design and analysis of the ITER facility.
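    Speed-ups such as the 292x quoted above are conventionally measured with the Monte Carlo figure of merit, FOM = 1/(R^2*T), where R is the relative error of the tally and T is the computing time; the numbers below are purely illustrative, not taken from the paper:

```python
# Figure of merit FOM = 1 / (R^2 * T): higher is better; the ratio of FOMs
# gives the effective speed-up of a variance-reduction scheme. Invented numbers.

def fom(rel_err, cpu_time_s):
    return 1.0 / (rel_err ** 2 * cpu_time_s)

analog  = fom(rel_err=0.10, cpu_time_s=3600.0)   # analog (unbiased, slow) run
reduced = fom(rel_err=0.01, cpu_time_s=1200.0)   # run with variance reduction

speedup = reduced / analog
# speedup == (0.10 / 0.01) ** 2 * (3600.0 / 1200.0) == 300.0
```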

  19. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  20. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  1. Formal Verification of Quasi-Synchronous Systems

    Science.gov (United States)

    2015-07-01

    pg. 215-226, Springer-Verlag: London, UK, 2001. [4] Nicolas Halbwachs and Louis Mandel, Simulation and Verification of Asynchronous Systems by...Huang, S. A. Smolka, W. Tan, and S. Tripakis, Deep Random Search for Efficient Model Checking of Timed Automata, in Proceedings of the 13th Monterey

  2. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and a discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented, along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  3. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1994-01-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and a discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented, along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  4. Modelling Embedded Systems by Non-Monotonic Refinement

    NARCIS (Netherlands)

    Mader, Angelika H.; Marincic, J.; Wupper, H.

    2008-01-01

    This paper addresses the process of modelling embedded systems for formal verification. We propose a modelling process built on non-monotonic refinement and a number of guidelines. The outcome of the modelling process is a model, together with a correctness argument that justifies our modelling

  5. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    Science.gov (United States)

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims to explore the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  6. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen' s University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  7. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models in a way that restricts human interaction to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so the consistency of these diagrams must be verified in order to identify requirement errors at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
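    As a flavor of what such a consistency rule can look like, the sketch below (an invented, drastically simplified diagram encoding, not the paper's actual rule set) checks that every message in a sequence diagram names an operation declared by the receiver's class:

```python
# Invented, drastically simplified encoding of two UML views, used to check
# one consistency rule: each sequence-diagram message must name an operation
# declared by the receiver's class in the class diagram.

class_diagram = {
    "Order":   {"operations": {"addItem", "total"}},
    "Invoice": {"operations": {"issue"}},
}

sequence_diagram = [
    ("Customer", "Order",   "addItem"),
    ("Customer", "Order",   "total"),
    ("Order",    "Invoice", "issue"),
    ("Order",    "Invoice", "cancel"),   # inconsistent: Invoice declares no cancel()
]

def consistency_errors(classes, messages):
    empty = {"operations": set()}
    return [
        f"{recv}.{op} called but not declared"
        for _sender, recv, op in messages
        if op not in classes.get(recv, empty)["operations"]
    ]

errors = consistency_errors(class_diagram, sequence_diagram)
# errors == ["Invoice.cancel called but not declared"]
```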

  8. Modelling horizontal steam generator with ATHLET. Verification of different nodalization schemes and implementation of verified constitutive equations

    Energy Technology Data Exchange (ETDEWEB)

    Beliaev, J.; Trunov, N.; Tschekin, I. [OKB Gidropress (Russian Federation); Luther, W. [GRS Garching (Germany); Spolitak, S. [RNC-KI (Russian Federation)

    1995-12-31

    Currently the ATHLET code is widely applied for modelling of several Power Plants of WWER type with horizontal steam generators. A main drawback of all these applications is the insufficient verification of the models for the steam generator. This paper presents the nodalization schemes for the secondary side of the steam generator, the results of stationary calculations, and preliminary comparisons to experimental data. The consideration of circulation in the water inventory of the secondary side is proved to be necessary. (orig.). 3 refs.

  9. Modelling horizontal steam generator with ATHLET. Verification of different nodalization schemes and implementation of verified constitutive equations

    Energy Technology Data Exchange (ETDEWEB)

    Beliaev, J; Trunov, N; Tschekin, I [OKB Gidropress (Russian Federation); Luther, W [GRS Garching (Germany); Spolitak, S [RNC-KI (Russian Federation)

    1996-12-31

    Currently the ATHLET code is widely applied for modelling of several Power Plants of WWER type with horizontal steam generators. A main drawback of all these applications is the insufficient verification of the models for the steam generator. This paper presents the nodalization schemes for the secondary side of the steam generator, the results of stationary calculations, and preliminary comparisons to experimental data. The consideration of circulation in the water inventory of the secondary side is proved to be necessary. (orig.). 3 refs.

  10. SoC Design Approach Using Convertibility Verification

    Directory of Open Access Journals (Sweden)

    Basu Samik

    2008-01-01

    Compositional design of systems on chip from preverified components helps to achieve shorter design cycles and time to market. However, the design process is affected by the issue of protocol mismatches, where two components fail to communicate with each other due to protocol differences. Convertibility verification, which involves the automatic generation of a converter to facilitate communication between two mismatched components, is a collection of techniques to address protocol mismatches. We present an approach to convertibility verification using module checking. We use Kripke structures to represent protocols and a temporal logic to describe desired system behavior. A tableau-based converter generation algorithm is presented and shown to be sound and complete. We have developed a prototype implementation of the proposed algorithm and have used it to show that it can handle many classical protocol mismatch problems as well as SoC problems. The initial idea for this approach to convertibility verification was presented at SLA++P '07, as described in the work by Roopak Sinha et al., 2008.
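    The effect of a generated converter can be sketched in a few lines (an invented width-mismatch example showing only what a converter does, not the tableau-based generation algorithm over Kripke structures described in the abstract):

```python
# Invented width-mismatch example: a producer speaks 16-bit words, the consumer
# accepts only bytes; the converter bridges the protocols by splitting words.

def producer():
    yield ("word", 0xBEEF)
    yield ("word", 0x1234)

def converter(events):
    for kind, value in events:
        assert kind == "word"            # converter's input protocol
        yield ("byte", value >> 8)       # high byte first
        yield ("byte", value & 0xFF)

received = list(converter(producer()))
assert all(kind == "byte" for kind, _ in received)   # consumer protocol respected
payload = [v for _, v in received]
# payload == [0xBE, 0xEF, 0x12, 0x34]
```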

  11. Towards Verification of Constituent Systems through Automated Proof

    DEFF Research Database (Denmark)

    Couto, Luis Diogo Monteiro Duarte; Foster, Simon; Payne, R

    2014-01-01

    This paper explores verification of constituent systems within the context of the Symphony tool platform for Systems of Systems (SoS). Our SoS modelling language, CML, supports various contractual specification elements, such as state invariants and operation preconditions, which can be used to specify contractual obligations on the constituent systems of a SoS. To support verification of these obligations we have developed a proof obligation generator and theorem prover plugin for Symphony. The latter uses the Isabelle/HOL theorem prover to automatically discharge the proof obligations arising from a CML model. Our hope is that the resulting proofs can then be used to formally verify the conformance of each constituent system, which in turn would result in a dependable SoS.

  12. Automated Image Acquisition System for the Verification of Copper-Brass Seal Images

    International Nuclear Information System (INIS)

    Stringa, E.; Bergonzi, C.; Littmann, F.; Marszalek, Y.; Tempesta, S.

    2015-01-01

    This paper describes a system for the verification of copper-brass seals realized by JRC according to DG ENER requirements. DG ENER processes about 20,000 metal seals per year. The verification of metal seals consists in visually checking the identity of a removed seal. The identity of a copper-brass seal is defined by a random stain pattern created by the seal producer, together with random scratches engraved when the seals are initialized ('seal production'). In order to verify that the seal returned from the field is the expected one, its pattern is compared with an image taken during seal production. Formerly, seal initialization and verification were very labour-intensive tasks, as seal pictures were acquired with a camera one by one in both the initialization and verification stages. During initialization, the Nuclear Safeguards technicians had to place new seals one by one under a camera and acquire the related reference images. During verification, the technician had to take used seals and place them one by one under a camera to take new pictures. The new images were presented to the technicians without any preprocessing, and the technicians had to recognize the seal. The new station described in this paper has an automated image acquisition system that allows seals to be processed in batches of 100. To simplify verification, software automatically centres and rotates the newly acquired seal image so that it overlaps exactly with the reference image acquired during the production phase. The new system significantly speeds up seal production and helps particularly with the demanding task of seal verification. As a large part of the seals is dealt with by a joint Euratom-IAEA team, the IAEA directly profits from this development. The new tool has been in routine use since mid-2013. (author)
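    The automatic centring step can be sketched with FFT phase correlation (translation only; the rotation search is omitted, NumPy is assumed, and the images below are synthetic stand-ins, not real seal images):

```python
# Translation-only image registration via FFT phase correlation; a synthetic
# stand-in for re-centring a seal image on its reference (rotation omitted).
import numpy as np

rng = np.random.default_rng(1)
ref = rng.random((64, 64))                           # "reference" seal image
acquired = np.roll(ref, shift=(5, -3), axis=(0, 1))  # newly acquired, shifted

# Peak of the inverse FFT of the normalized cross-power spectrum gives the shift
F_ref, F_acq = np.fft.fft2(ref), np.fft.fft2(acquired)
cross = F_ref * np.conj(F_acq)
corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real

dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
dy = dy - 64 if dy > 32 else dy                      # wrap to signed shifts
dx = dx - 64 if dx > 32 else dx

recentred = np.roll(acquired, shift=(dy, dx), axis=(0, 1))
# (dy, dx) == (-5, 3) and recentred matches ref exactly
```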

  13. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  14. M3 version 3.0: Verification and validation; Hydrochemical model of ground water at repository site

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Javier B. (Dept. of Earth Sciences, Univ. of Zaragoza, Zaragoza (Spain)); Laaksoharju, Marcus (Geopoint AB, Sollentuna (Sweden)); Skaarman, Erik (Abscondo, Bromma (Sweden)); Gurban, Ioana (3D-Terra (Canada))

    2009-01-15

    Hydrochemical evaluation is a complex type of work that is carried out by specialists. The outcome of this work is generally presented as qualitative models and process descriptions of a site. To support and help to quantify the processes in an objective way, a multivariate mathematical tool entitled M3 (Multivariate Mixing and Mass balance calculations) has been constructed. The computer code can be used to trace the origin of the groundwater, and to calculate the mixing proportions and mass balances from groundwater data. The M3 code is a groundwater response model, which means that changes in the groundwater chemistry in terms of sources and sinks are traced in relation to an ideal mixing model. The complexity of the measured groundwater data determines the configuration of the ideal mixing model. Deviations from the ideal mixing model are interpreted as being due to reactions. Assumptions concerning important mineral phases altering the groundwater or uncertainties associated with thermodynamic constants do not affect the modelling because the calculations are solely based on the measured groundwater composition. M3 uses the opposite approach to that of many standard hydrochemical models. In M3, mixing is evaluated and calculated first. The constituents that cannot be described by mixing are described by reactions. The M3 model consists of three steps: the first is a standard principal component analysis, followed by mixing and finally mass balance calculations. The measured groundwater composition can be described in terms of mixing proportions (%), while the sinks and sources of an element associated with reactions are reported in mg/L. This report contains a set of verification and validation exercises with the intention of building confidence in the use of the M3 methodology. At the same time, clear answers are given to questions related to the accuracy and the precision of the results, including the inherent uncertainties and the errors that can be made
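    The mixing-first order of M3 can be illustrated with a small least-squares sketch (NumPy assumed; the end-member compositions are invented): mixing proportions are estimated first, and any residual would then be reported as reaction-driven sources or sinks in mg/L.

```python
# Mixing proportions first, residuals as "reactions": a least-squares sketch
# of the M3 idea with invented end-member (reference water) compositions.
import numpy as np

# Rows: reference waters; columns: Cl, Na, HCO3 in mg/L
end_members = np.array([
    [19000.0, 10500.0, 140.0],   # brine
    [    5.0,     3.0,  25.0],   # glacial meltwater
    [  180.0,   100.0, 280.0],   # meteoric water
])

# A sample that happens to be an exact 30/30/40 mixture of the three waters
sample = np.array([5773.5, 3190.9, 161.5])

# Least squares with an extra, heavily weighted row enforcing sum(p) = 1
A = np.vstack([end_members.T, 1000.0 * np.ones(3)])
b = np.append(sample, 1000.0)
p, *_ = np.linalg.lstsq(A, b, rcond=None)

reactions = sample - end_members.T @ p   # sources/sinks not explained by mixing
# p is close to [0.3, 0.3, 0.4] and reactions is near 0: pure mixing
```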

  15. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.; Ryabov, Yu.

    2003-01-01

    Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components that depend on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators, who are not aware of system functionality details. It is therefore necessary to help the operator control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components, as well as for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed in the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. A test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about component hierarchy and dependencies, and allows the operator to verify the fun...
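    The dependency-aware sequencing of tests can be sketched with a topological sort over an invented component graph (Python's graphlib; the names below are illustrative, not the actual ATLAS TDAQ configuration):

```python
# Dependency-aware test sequencing as a topological sort: each component is
# tested only after the components it depends on. Component names invented.
from graphlib import TopologicalSorter

depends_on = {
    "ReadoutApp":  {"DataFlowMgr"},
    "DataFlowMgr": {"OnlineInfra"},
    "TriggerApp":  {"OnlineInfra"},
    "OnlineInfra": set(),
}

order = list(TopologicalSorter(depends_on).static_order())
# "OnlineInfra" is tested first; every component follows its dependencies
```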

  16. Mathematical description for the measurement and verification of energy efficiency improvement

    International Nuclear Information System (INIS)

    Xia, Xiaohua; Zhang, Jiangfeng

    2013-01-01

    Highlights: • A mathematical model for the measurement and verification problem is established. • Criteria to choose the four measurement and verification options are given. • An optimal measurement and verification plan is defined. • Calculus of variations and optimal control can be further applied. - Abstract: Insufficient energy supply is a problem faced by many countries, and energy efficiency improvement is identified as the quickest and most effective solution to this problem. Many energy efficiency projects are therefore initiated to reach various energy saving targets. These energy saving targets need to be measured and verified, and in many countries such measurement and verification (M&V) activity is guided by the International Performance Measurement and Verification Protocol (IPMVP). However, M&V is widely regarded as an inaccurate science: an engineering practice relying heavily on professional judgement. This paper presents a mathematical description of the energy efficiency M&V problem and thus casts the basic M&V concepts, propositions, techniques and methodologies into a scientific framework. For this purpose, a general description of energy system modeling is provided to facilitate the discussion, strict mathematical definitions for the baseline and baseline adjustment are given, and M&V plan development is formulated as an M&V modeling problem. An optimal M&V plan is therefore obtained by solving a calculus of variations problem or, equivalently, an optimal control problem. This approach provides a fruitful source of research problems through which optimal M&V plans under various practical constraints can be determined. With the aid of linear control system models, this mathematical description also provides sufficient conditions for M&V practitioners to determine which one of the four M&V options in IPMVP should be used in a practical M&V project.
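    A minimal sketch of the baseline-model idea formalized in the paper, assuming a simple linear baseline regressed on heating degree days (all numbers invented): savings are the adjusted baseline minus the measured reporting-period energy.

```python
# IPMVP-style savings sketch: fit a baseline model E = a + b*HDD on
# pre-retrofit data, adjust it to reporting-period conditions, and subtract
# the measured energy. All numbers are invented.
import numpy as np

hdd_base = np.array([120.0, 200.0, 310.0, 150.0])  # baseline heating degree days
e_base   = np.array([ 70.0,  95.0, 130.0,  80.0])  # baseline energy use, MWh

slope, intercept = np.polyfit(hdd_base, e_base, 1)

hdd_rep = np.array([180.0, 260.0])                 # reporting-period conditions
e_rep   = np.array([ 72.0,  94.0])                 # measured post-retrofit energy

adjusted_baseline = intercept + slope * hdd_rep    # energy we would have used
savings = float(np.sum(adjusted_baseline - e_rep)) # avoided energy use, MWh
```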

  17. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    Energy Technology Data Exchange (ETDEWEB)

    Itano, M; Yamazaki, T [Inagi Municipal Hospital, Inagi, Tokyo (Japan); Tachibana, R; Uchida, Y [National Cancer Center Hospital East, Kashiwa, Chiba (Japan); Yamashita, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Shimizu, H [Kitasato University Medical Center, Kitamoto, Saitama (Japan); Sugawara, Y; Kotabe, K [National Center for Global Health and Medicine, Shinjuku, Tokyo (Japan); Kamima, T [Cancer Institute Hospital Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Ishibashi, S [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: In general, beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using the individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. The first was individually measured data (Original Beam Data, OBD). The others were generated from all measurements for the same linac model (Model-GBD) and for all linac models (All-GBD). The three different beam data sets were registered to the independent verification software program at each institute. Subsequently, patients' plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to using the OBD, the variation using the Model-GBD based calculation and the All-GBD was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. For the plans with a variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
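    The summary statistic used above can be reproduced in a few lines (the dose values are invented): percent variation of the golden-beam-data calculation relative to the per-linac (OBD) calculation, reported as mean, SD, and a 2SD confidence limit.

```python
# Percent variation of golden-beam-data doses against per-linac (OBD) doses,
# summarized as mean, SD, and a 2SD confidence limit. Dose values invented.
import statistics

dose_obd = [2.000, 1.500, 1.800, 2.200, 1.950]   # Gy, per-linac beam data
dose_gbd = [2.004, 1.497, 1.806, 2.195, 1.952]   # Gy, golden beam data

var_pct = [100.0 * (g - o) / o for g, o in zip(dose_gbd, dose_obd)]
mean_pct = statistics.mean(var_pct)
sd_pct = statistics.stdev(var_pct)
confidence_limit = abs(mean_pct) + 2.0 * sd_pct  # %, as in the abstract's 2SD
```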

  18. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    International Nuclear Information System (INIS)

    Itano, M; Yamazaki, T; Tachibana, R; Uchida, Y; Yamashita, M; Shimizu, H; Sugawara, Y; Kotabe, K; Kamima, T; Takahashi, R; Ishibashi, S; Tachibana, H

    2016-01-01

    Purpose: In general, beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using the individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. The first was individually measured data (Original Beam Data, OBD). The others were generated from all measurements for the same linac model (Model-GBD) and for all linac models (All-GBD). The three different beam data sets were registered to the independent verification software program at each institute. Subsequently, patients' plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to using the OBD, the variation using the Model-GBD based calculation and the All-GBD was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. For the plans with a variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  19. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    Science.gov (United States)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  20. Verification of Ensemble Forecasts for the New York City Operations Support Tool

    Science.gov (United States)

    Day, G.; Schaake, J. C.; Thiemann, M.; Draijer, S.; Wang, L.

    2012-12-01

    The New York City water supply system operated by the Department of Environmental Protection (DEP) serves nine million people. It covers 2,000 square miles of portions of the Catskill, Delaware, and Croton watersheds, and it includes nineteen reservoirs and three controlled lakes. DEP is developing an Operations Support Tool (OST) to support its water supply operations and planning activities. OST includes historical and real-time data, a model of the water supply system complete with operating rules, and lake water quality models developed to evaluate alternatives for managing turbidity in the New York City Catskill reservoirs. OST will enable DEP to manage turbidity in its unfiltered system while satisfying its primary objective of meeting the City's water supply needs, in addition to considering secondary objectives of maintaining ecological flows, supporting fishery and recreation releases, and mitigating downstream flood peaks. The current version of OST relies on statistical forecasts of flows in the system based on recent observed flows. To improve short-term decision making, plans are being made to transition to National Weather Service (NWS) ensemble forecasts based on hydrologic models that account for short-term weather forecast skill, longer-term climate information, as well as the hydrologic state of the watersheds and recent observed flows. To ensure that the ensemble forecasts are unbiased and that the ensemble spread reflects the actual uncertainty of the forecasts, a statistical model has been developed to post-process the NWS ensemble forecasts to account for hydrologic model error as well as any inherent bias and uncertainty in initial model states, meteorological data and forecasts. The post-processor is designed to produce adjusted ensemble forecasts that are consistent with the DEP historical flow sequences that were used to develop the system operating rules. A set of historical hindcasts that is representative of the real-time ensemble
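
The post-processing idea described above can be caricatured as a bias shift plus a spread rescaling of the raw ensemble. This is a deliberately minimal sketch; the actual NWS post-processor models hydrologic and forcing errors in far more detail, and the parameter values here are invented:

```python
import statistics

def postprocess(ensemble, bias, spread_factor):
    """Adjust a raw flow ensemble: remove an estimated bias and
    rescale member spread about the ensemble mean."""
    m = statistics.mean(ensemble)
    center = m - bias
    return [center + spread_factor * (x - m) for x in ensemble]

# Hypothetical raw inflow ensemble (m^3/s) with a +5 bias and an
# under-dispersive spread that we widen by a factor of 2.
adjusted = postprocess([90.0, 100.0, 110.0], bias=5.0, spread_factor=2.0)
```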

  1. Artificial neural network modelling approach for a biomass gasification process in fixed bed gasifiers

    International Nuclear Information System (INIS)

    Mikulandrić, Robert; Lončar, Dražen; Böhning, Dorith; Böhme, Rene; Beckmann, Michael

    2014-01-01

    Highlights: • 2 Different equilibrium models are developed and their performance is analysed. • Neural network prediction models for 2 different fixed bed gasifier types are developed. • The influence of different input parameters on neural network model performance is analysed. • Methodology for neural network model development for different gasifier types is described. • Neural network models are verified for various operating conditions based on measured data. - Abstract: The number of small and middle-scale biomass gasification combined heat and power plants, as well as syngas production plants, has increased significantly in the last decade, mostly due to extensive incentives. However, existing issues regarding syngas quality, process efficiency, emissions and environmental standards are preventing biomass gasification technology from becoming more economically viable. To address these issues, special attention is given to the development of mathematical models which can be used for process analysis or plant control purposes. This paper analyses the ability of neural networks to predict process parameters with high speed and accuracy. After a review of the related literature and an analysis of the measurement data, different modelling approaches for process parameter prediction that can be used for on-line process control were developed and their performance was analysed. Neural network models showed a good capability to predict biomass gasification process parameters with reasonable accuracy and speed. Measurement data for model development, verification and performance analysis were derived from a biomass gasification plant operated by Technical University Dresden
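
The kind of feed-forward prediction model the abstract describes can be sketched as a one-hidden-layer network. The weights below are zero-valued placeholders and the input names are invented for illustration; in the paper they would be fitted to plant measurement data:

```python
import math

def mlp_predict(x, W1, b1, W2, b2):
    """One-hidden-layer regression network: tanh hidden units, linear output.
    W1 is a list of per-hidden-unit weight vectors over the inputs."""
    hidden = [math.tanh(sum(xi * wij for xi, wij in zip(x, col)) + bj)
              for col, bj in zip(W1, b1)]
    return sum(h * w for h, w in zip(hidden, W2)) + b2

# Hypothetical inputs: fuel moisture (%), air-fuel ratio, bed temperature (K).
x = [12.0, 1.6, 1050.0]
W1 = [[0.0] * 3 for _ in range(4)]   # untrained placeholder weights
b1 = [0.0] * 4
W2 = [0.0] * 4
b2 = 5.0                             # output bias only, so prediction == b2
predicted = mlp_predict(x, W1, b1, W2, b2)
```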

  2. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1999-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection system is developed. This method utilizes Colored Petri Net (CPN) for modeling and Prototype Verification System (PVS) for mathematical verification. In order to help flow-through from modeling by CPN to mathematical proof by PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  3. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection system is developed. This method utilizes Colored Petri Net (CPN) for modeling and Prototype Verification System (PVS) for mathematical verification. In order to help flow-through from modeling by CPN to mathematical proof by PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  4. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the software verification and validation (SVV) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), in the development of the Korea Nuclear Instrumentation and Control System (KNICS). The SVV criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028 and BTP-14, and they form the acceptance framework provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), Formal Verification and Theorem Proving, and Automated Testing, are applied to the safety software, and automated SVV tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; a New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and the Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA) and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase

  5. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  6. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  7. Active Learning of Markov Decision Processes for System Verification

    DEFF Research Database (Denmark)

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

    Manually constructing system models is a demanding process, and this shortcoming has motivated the development of algorithms for automatically learning system models from observed system behaviors. Recently, algorithms have been proposed for learning Markov decision process representations of reactive systems based on alternating sequences of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning deterministic Markov decision processes from data by actively guiding the selection of input actions. The algorithm is empirically analyzed by learning system models of slot machines, and it is demonstrated that the proposed active learning procedure can significantly reduce the amount of data required.
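
The core idea, frequency-estimating transition probabilities from input/output data and steering exploration toward under-sampled actions, can be sketched as follows. This is a toy illustration, not the paper's algorithm, and the state/action names are invented:

```python
from collections import Counter, defaultdict

def estimate_transitions(triples):
    """Frequency estimate of P(s' | s, a) from observed (s, a, s') triples."""
    counts = defaultdict(Counter)
    for s, a, s2 in triples:
        counts[(s, a)][s2] += 1
    return {sa: {s2: n / sum(c.values()) for s2, n in c.items()}
            for sa, c in counts.items()}

def least_sampled_action(triples, state, actions):
    """Active-learning heuristic: query the action tried fewest times in `state`."""
    tried = Counter(a for s, a, _ in triples if s == state)
    return min(actions, key=lambda a: tried[a])

# Hypothetical slot-machine observations.
obs = [("idle", "spin", "win"), ("idle", "spin", "lose"),
       ("idle", "spin", "lose"), ("idle", "quit", "idle")]
model = estimate_transitions(obs)
next_query = least_sampled_action(obs, "idle", ["spin", "quit"])
```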

  8. 49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview?

    Science.gov (United States)

    2010-10-01

    ... beginning of the verification interview? 40.135 Section 40.135 Transportation Office of the Secretary of... verification interview? (a) As the MRO, you must tell the employee that the laboratory has determined that the... finding of adulteration or substitution. (b) You must explain the verification interview process to the...

  9. SIMMER-III code-verification. Phase 1

    International Nuclear Information System (INIS)

    Maschek, W.

    1996-05-01

    SIMMER-III is a computer code to investigate core disruptive accidents in liquid metal fast reactors, but it should also be usable for safety-related problems in other types of advanced reactors. The code is developed by PNC in cooperation with the European partners FZK, CEA and AEA-T. SIMMER-III is a two-dimensional, three-velocity-field, multiphase, multicomponent, Eulerian, fluid-dynamics code coupled with a space-, time-, and energy-dependent neutron dynamics model. In order to model complex flow situations in a postulated disrupting core, mass and energy conservation equations are solved for 27 density components and 16 energy components, respectively. Three velocity fields (two liquid and one vapor) are modeled to simulate the relative motion of different fluid components. An additional static field takes into account the structures present in a reactor (pins, hexcans, vessel structures, internal structures, etc.). The neutronics is based on the discrete ordinate method (S N method) coupled into a quasistatic dynamic model. The code assessment and verification of the fluid-dynamic/thermohydraulic parts of the code are performed in several steps in a joint effort of all partners. The results of the FZK contributions to the first assessment and verification phase are reported. (orig.) [de

  10. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  11. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  12. Cubic Bezier Curve Approach for Automated Offline Signature Verification with Intrusion Identification

    Directory of Open Access Journals (Sweden)

    Arun Vijayaragavan

    2014-01-01

    Full Text Available Authentication is a process of identifying a person's rights over a system. Many authentication types are used in various systems, and biometric authentication systems are of special concern. Signature verification is a widely used basic biometric authentication technique. Signature matching algorithms use image correlation and graph matching techniques, which can produce false rejections or acceptances. We propose a model that compares knowledge derived from the signature. An intrusion into the signature repository system yields a copy of the signature that leads to false acceptance. Our approach uses a cubic Bezier curve algorithm to identify the curve points and uses the behavior of the signature for verification. An analyzing mobile agent is used to identify the input signature parameters and compare them with the reference signature repository. It identifies duplication of a signature due to intrusion and rejects it. Experiments were conducted on a database with thousands of signature images from various sources, and the results are favorable.
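
A cubic Bezier curve of the kind the method relies on is evaluated from four control points via the Bernstein form B(t) = (1-t)³P0 + 3(1-t)²t P1 + 3(1-t)t² P2 + t³P3. A minimal sketch of that evaluation (not the authors' implementation):

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Point on a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

# Sample a stroke's curve at its parametric midpoint.
midpoint = cubic_bezier((0, 0), (0, 1), (1, 1), (1, 0), 0.5)
```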

  13. Verification and Validation for Flight-Critical Systems (VVFCS)

    Science.gov (United States)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%) and government agencies (27%).

  14. Verification on spray simulation of a pintle injector for liquid rocket engine

    Science.gov (United States)

    Son, Min; Yu, Kijeong; Radhakrishnan, Kanmaniraja; Shin, Bongchul; Koo, Jaye

    2016-02-01

    The pintle injector used in liquid rocket engines is an injection system that has recently attracted renewed attention, known for its wide throttling ability with high efficiency. The pintle injector has many variations with complex inner structures due to its moving parts. In order to study the rotating flow near the injector tip, which was observed in a cold-flow experiment using water and air, a numerical simulation was adopted and a verification of the numerical model was subsequently conducted. For the verification process, three types of experimental data, including velocity distributions of gas flows, spray angles and liquid distribution, were compared with the simulated results. The numerical simulation was performed using a commercial simulation program with the Eulerian multiphase model and axisymmetric two-dimensional grids. The maximum and minimum gas velocities were within the acceptable range of agreement; however, the spray angles showed up to 25% error as the momentum ratios increased. The spray density distributions were quantitatively measured and showed good agreement. As a result of this study, it was concluded that the simulation method was properly constructed to study the specific flow characteristics of the pintle injector, despite the limitations of two-dimensional and coarse grids.

  15. DIMENSIONAL VERIFICATION AND QUALITY CONTROL OF IMPLANTS PRODUCED BY ADDITIVE MANUFACTURING

    Directory of Open Access Journals (Sweden)

    Teodor Toth

    2015-07-01

    Full Text Available Purpose: The development of computer technology and alternative manufacturing methods in the form of additive manufacturing leads to the manufacture of products with complex shapes. In the field of medicine they include, inter alia, custom-made implants manufactured for a particular patient, such as cranial implants, maxillofacial implants, etc. Since such implants are inserted into a patient's body, verification is necessary, including shape and dimensional verification. The article deals with the application of industrial computed tomography within the process of inspection and verification of selected custom-made implant types. Methodology/Approach: The Department of Biomedical Engineering and Measurement performs the verification of medical products manufactured by additive manufacturing technologies from the Ti-6Al-4V (Grade 5) titanium alloy, using the coordinate measuring machine Carl Zeiss Contura G2 and the industrial computed tomography machine Carl Zeiss Metrotom 1500. This equipment fulfils the requirements for the identification and evaluation of dimensions of both the external and the internal structures. Findings: The article presents the possibilities of computed tomography utilisation in the inspection of individual implants manufactured using additive manufacturing technologies. The results indicate that, with the adjustment of appropriate input parameters (alignment), this technology is appropriate for the analysis of shape deviations when compared with the CAD model. Research Limitation/implication: With increasing distance of the measured object from the X-ray source, the machine's resolution decreases. Decreasing resolution has a minor impact on the measured dimensions (relatively high tolerances) but a significant impact on the evaluation of porosity and inclusions. Originality/Value of paper: Currently, the verification of a manufactured implant can be

  16. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm

  17. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  18. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  19. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for the certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code is an integral part of this process. This document identifies the work performed and the documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed along with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete

  20. Quantile Acoustic Vectors vs. MFCC Applied to Speaker Verification

    Directory of Open Access Journals (Sweden)

    Mayorga-Ortiz Pedro

    2014-02-01

    Full Text Available In this paper we describe experiments related to speaker and command recognition using quantile vectors and Gaussian Mixture Modelling (GMM). Over the past several years, GMM and MFCC have become two of the dominant approaches for modelling speaker and speech recognition applications. However, memory and computational costs are important drawbacks, because autonomous systems suffer processing and power consumption constraints; thus, a good trade-off between accuracy and computational requirements is mandatory. We decided to explore another approach (quantile vectors) in several tasks, and a comparison with MFCC was made. Quantile acoustic vectors are proposed for speaker verification and command recognition tasks, and the results showed very good recognition efficiency. This method offers a good trade-off between computation time, characteristic vector complexity and overall efficiency.
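
A quantile acoustic vector is, at heart, a set of cut points of the sample distribution within a signal frame. The standard library's statistics.quantiles illustrates the computation; this is a simplified sketch of the feature idea, not the authors' exact front end:

```python
import statistics

def quantile_vector(frame, n=4):
    """n-quantile feature vector of one signal frame: the n-1 cut points
    that split the sample distribution into n equal-probability bins."""
    return statistics.quantiles(frame, n=n)

# Toy 'frame' of 11 samples; quartile cut points summarize its distribution.
features = quantile_vector(list(range(1, 12)), n=4)
```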

  1. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we adapt spatial forecast verification to simulators; namely, we address the discrepancy that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
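
The power-law smoothing step can be sketched as spreading each simulated event's seismicity over a grid with a kernel that decays with epicentral distance. This is a schematic with invented kernel parameters, not the paper's calibrated ETAS form:

```python
import math

def smoothed_rate_map(grid_cells, events, r0=1.0, p=2.0):
    """Rate at each grid cell: every event (x, y, weight) contributes
    w * (r + r0)**-p, where r is the epicentral distance to the cell."""
    rates = []
    for gx, gy in grid_cells:
        rate = 0.0
        for ex, ey, w in events:
            r = math.hypot(gx - ex, gy - ey)
            rate += w * (r + r0) ** (-p)
        rates.append(rate)
    return rates

# One simulated event at the origin; its contribution decays over two cells.
rates = smoothed_rate_map([(0.0, 0.0), (3.0, 4.0)], [(0.0, 0.0, 1.0)])
```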

  2. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays the procedure on the computer screen in the form of a flow chart, and displays plant operating information along with the procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error could lead to operator misjudgement and inadequate control. In this paper we present a modeling method for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Net (STPN) also supports modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri Nets when both are applied to an emergency operating computerized procedure. A program for converting a Computerized Procedure (CP) to an STPN has also been developed. The formal verification and validation of CPs with STPN increases the safety of a nuclear power plant and provides the digital quality assurance means that are needed as the role and function of the CPS grow.
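
The token-flow semantics underlying such Petri-net models reduces to a simple firing rule: a transition is enabled when its input places hold enough tokens, and firing moves tokens from input to output places. A generic place/transition sketch (not the STPN formalism itself; the place names are invented):

```python
def fire(marking, transition):
    """Fire a Petri-net transition if enabled; return the new marking,
    or None when some input place lacks the required tokens."""
    pre, post = transition
    if any(marking.get(place, 0) < n for place, n in pre.items()):
        return None  # transition not enabled
    new = dict(marking)
    for place, n in pre.items():
        new[place] -= n
    for place, n in post.items():
        new[place] = new.get(place, 0) + n
    return new

# A procedure step consumes the 'awaiting' token and produces 'done'.
step = ({"awaiting": 1}, {"done": 1})
after = fire({"awaiting": 1}, step)
blocked = fire({"awaiting": 0}, step)
```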

  3. Verification of communication protocols in web services model-checking service compositions

    CERN Document Server

    Tari, Zahir; Mukherjee, Anshuman

    2014-01-01

    In the near future, wireless sensor networks will become an integral part of our day-to-day life. To solve different sensor networking related issues, researchers have put a great deal of effort into coming up with innovative ideas. Verification of Communication Protocols in Web Services: Model-Checking Service Compositions gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with essential, state-of-the-art information about sensor networking. It introduces current technological trends, particularly in node organization, and provides implementation details of each networking type to help readers set up sensor networks in their related job fields. In addition, it identifies the limitations of current technologies, as well as future research directions.

  4. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    Science.gov (United States)

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with the Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatch between calculated control points and the detection grid in the verification process were discussed. Beyond the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in a VMAT-specific phantom for obtaining a more reliable DVH on the patient CT was also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre
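
The gamma passing rates quoted above combine a dose-difference and a distance-to-agreement criterion. A one-dimensional sketch of the global gamma test (simplified geometry with invented profile values; clinical gamma is computed in 2D/3D):

```python
import math

def gamma_pass_rate(positions, measured, calculated, dd=2.0, dta=2.0):
    """Percentage of measured points with gamma <= 1, where gamma is the
    minimum over calculation points of hypot(dose_diff/dd, distance/dta).
    Doses in % of prescription, positions in mm (2%/2 mm criteria)."""
    passed = 0
    for xm, dm in zip(positions, measured):
        gamma = min(math.hypot((dc - dm) / dd, (xc - xm) / dta)
                    for xc, dc in zip(positions, calculated))
        if gamma <= 1.0:
            passed += 1
    return 100.0 * passed / len(measured)

# Two hypothetical profile points, each within 1% dose difference.
rate = gamma_pass_rate([0.0, 10.0], [100.0, 50.0], [101.0, 49.0])
```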

  5. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun

    1999-01-01

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for system modeling and the Prototype Verification System (PVS) for mathematical verification. In order to ease the flow from modeling in CPN to mathematical proof in PVS, an information extractor from CPN models has been developed in this work. A translator has also been developed to convert the extracted information to the PVS specification language. The information extractor and translator are programmed in ML, a higher-order functional language. This combined method has been applied to a protection system function of the Wolsung NPP SDS2 (Steam Generator Low Level Trip). As a result of this application, we could logically prove the completeness and consistency of the requirements. In short, through this work an axiom- or lemma-based analysis method for CPN models is newly suggested in order to complement CPN analysis methods, and a guideline for the use of formal methods is proposed in order to apply them to NPP software verification and validation. (author). 9 refs., 15 figs

  6. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  7. Environmental Testing Campaign and Verification of Satellite Deimos-2 at INTA

    Science.gov (United States)

    Hernandez, Daniel; Vazquez, Mercedes; Anon, Manuel; Olivo, Esperanza; Gallego, Pablo; Morillo, Pablo; Parra, Javier; Capraro; Luengo, Mar; Garcia, Beatriz; Villacorta, Pablo

    2014-06-01

    In this paper the environmental test campaign and verification of the DEIMOS-2 (DM2) satellite will be presented and described. DM2 will be ready for launch in 2014. Firstly, a short description of the satellite is presented, including its physical characteristics and intended optical performance. DEIMOS-2 is a LEO satellite for Earth observation that will provide high-resolution imaging services for agriculture, civil protection, environmental issues, disaster monitoring, climate change, urban planning, cartography, security and intelligence. Then, the verification and test campaign carried out on the SM and FM models at INTA is described, including mechanical tests for the SM and climatic, mechanical and electromagnetic compatibility tests for the FM. In addition, this paper includes Centre of Gravity and Moment of Inertia measurements for both models, and other verification activities carried out in order to ensure the satellite's health during launch and its in-orbit performance.

  8. Radiochromic film in the dosimetric verification of intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Zhou Yingjuan; Huang Shaomin; Deng Xiaowu

    2007-01-01

    Objective: To investigate the dose-response behavior of a new type of radiochromic film (GAFCHROMIC EBT) and explore its clinical application and the precision of dose measurement, in order to: (1) perform plan-specific dosimetric verification for intensity modulated radiation therapy, (2) simplify the quality assurance process of the traditional radiographic film dosimetric system, and (3) establish a more reliable, more efficient dosimetric verification system for intensity modulated radiation therapy. Methods: (1) The step-wedge calibration technique was used to calibrate EBT radiochromic film and EDR2 radiographic film. The dose characteristics, measurement consistency and quality assurance process of the two methods were compared. (2) The in-phantom dose-measurement based verification technique was adopted. EBT film and EDR2 film were used, respectively, to measure the same dose plane of IMRT treatment plans. The resulting dose maps, dose profiles and isodose curves were compared with those calculated by the CORVUS treatment planning system to evaluate the suitability of EBT film for dosimetric verification of intensity modulated radiation therapy. Results: (1) Over the external beam dosimetric range of 0-500 cGy, the EBT/VXR-16 and EDR2/VXR-16 film dosimetric systems showed comparable measurement consistency, with measurement variability less than 0.70%. The mean measurement variability of the two systems was 0.37% and 0.68%, respectively. The former proved superior in measurement consistency, reliability and efficiency over the dynamic clinical dose range; furthermore, its quality assurance required fewer processing steps than the latter. (2) The dosimetric verification of IMRT planes measured with EBT film was quite similar to that with EDR2 film processed under strict quality control. In a plane of the phantom, the maximal off-axis dose deviation between the EBT film measurement and the TPS calculation was

  9. Formal Analysis of BPMN Models Using Event-B

    Science.gov (United States)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.

  10. Modeling the Process of Purchase Payment as a Constituent of Information Security in E-commerce

    Directory of Open Access Journals (Sweden)

    Volodymyr Skitsko

    2016-01-01

    Full Text Available A mathematical model of the process of payment for purchases in an online store has been presented. The model belongs to the class of semi-open queueing networks with four phases of exponential servers and Poisson arrivals. The authors describe in detail the derivation of the equations describing the system. Analytic expressions are derived on the basis of the proposed model for the average number of online store customers who have already paid for goods. Practical implementation of the model allows us to determine the number of clients who have added goods to their cart, but have not yet passed through the payment verification system, and thus determine the stream of real customers of the online store. (original abstract
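    As an illustration of this class of model, the stationary mean number of customers at each phase of a tandem of exponential servers with Poisson arrivals can be computed from the per-station utilizations. This uses the open Jackson-network simplification (the paper's network is semi-open), and all rates below are hypothetical:

```python
def mean_customers(arrival_rate, service_rates):
    """Mean number of customers at each station of a tandem of M/M/1
    queues (open Jackson network): L_i = rho_i / (1 - rho_i),
    where rho_i = lambda / mu_i."""
    means = []
    for mu in service_rates:
        rho = arrival_rate / mu
        if rho >= 1.0:
            raise ValueError("unstable station: utilization >= 1")
        means.append(rho / (1 - rho))
    return means

# Hypothetical rates (customers/min) for four phases of an online-store
# checkout: cart, payment details, payment verification, confirmation.
L = mean_customers(2.0, [5.0, 4.0, 3.0, 6.0])
total = sum(L)
print([round(x, 3) for x in L], round(total, 3))
```

The per-phase means identify the bottleneck (here the slowest server), which is exactly the kind of quantity the paper derives for customers who have already paid.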

  11. Analysis of an indirect neutron signature for enhanced UF6 cylinder verification

    International Nuclear Information System (INIS)

    Kulisek, J.A.; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-01-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA_NT). HEVA_NT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA_NT in terms of the individual contributions from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings for the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA_NT signature to manipulation by the nearby placement of neutron-conversion materials.
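    The full-volume advantage of the neutron signature can be illustrated with a toy Monte Carlo: particles born uniformly over a cylinder cross-section escape without interacting with probability exp(-d/mfp), so a longer mean free path lets signal from deep inside the material reach an external detector. The geometry and mean-free-path values below are invented for illustration and are far simpler than the HEVA physics models:

```python
import math
import random

def escape_fraction(radius_cm, mfp_cm, n=100_000, seed=1):
    """Toy Monte Carlo: fraction of particles born uniformly over a
    cylinder cross-section that escape radially without interacting,
    assuming straight-line flight and exponential attenuation."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n):
        r = radius_cm * math.sqrt(rng.random())  # uniform over disc area
        path = radius_cm - r                     # radial distance to wall
        if rng.random() < math.exp(-path / mfp_cm):
            escaped += 1
    return escaped / n

# Longer mean free path -> far larger escape fraction: a cartoon of why
# neutrons support full-volume assay while the 186-keV gamma signature,
# with its short mean free path, samples only a thin outer layer.
f_neutron = escape_fraction(38.0, 20.0)   # hypothetical values, cm
f_gamma   = escape_fraction(38.0, 0.5)
print(round(f_neutron, 3), round(f_gamma, 3))
```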

  12. VERIFICATION OF THE SENTINEL-4 FOCAL PLANE SUBSYSTEM

    Directory of Open Access Journals (Sweden)

    C. Williges

    2017-05-01

    Full Text Available The Sentinel-4 payload is a multi-spectral camera system which is designed to monitor atmospheric conditions over Europe. The German Aerospace Center (DLR) in Berlin, Germany, conducted the verification campaign of the Focal Plane Subsystem (FPS) on behalf of Airbus Defense and Space GmbH, Ottobrunn, Germany. The FPS consists, inter alia, of two Focal Plane Assemblies (FPAs), one for the UV-VIS spectral range (305 nm … 500 nm), the second for NIR (750 nm … 775 nm). In this publication, we will present in detail the opto-mechanical laboratory set-up of the verification campaign of the Sentinel-4 Qualification Model (QM), which will also be used for the upcoming Flight Model (FM) verification. The test campaign consists mainly of radiometric tests performed with an integrating sphere as a homogeneous light source. The FPAs have to be operated at 215 K ± 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the test accomplishment. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors as well as a reference detector homogeneously over a distance of approximately 1 m from outside the TVC. Furthermore, selected test analyses and results will be presented, showing that the Sentinel-4 FPS meets its specifications.

  13. Verification of the Sentinel-4 focal plane subsystem

    Science.gov (United States)

    Williges, Christian; Uhlig, Mathias; Hilbert, Stefan; Rossmann, Hannes; Buchwinkler, Kevin; Babben, Steffen; Sebastian, Ilse; Hohn, Rüdiger; Reulke, Ralf

    2017-09-01

    The Sentinel-4 payload is a multi-spectral camera system, designed to monitor atmospheric conditions over Europe from a geostationary orbit. The German Aerospace Center, DLR Berlin, conducted the verification campaign of the Focal Plane Subsystem (FPS) during the second half of 2016. The FPS consists of two Focal Plane Assemblies (FPAs), two Front End Electronics (FEEs), one Front End Support Electronic (FSE) and one Instrument Control Unit (ICU). The FPAs are designed for two spectral ranges: UV-VIS (305 nm - 500 nm) and NIR (750 nm - 775 nm). In this publication, we present in detail the set-up of the verification campaign of the Sentinel-4 Qualification Model (QM). This set-up will also be used for the upcoming Flight Model (FM) verification, planned for early 2018. The FPAs have to be operated at 215 K +/- 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the test accomplishment. The test campaign consists mainly of radiometric tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors as well as a reference detector homogeneously over a distance of approximately 1 m from outside the TVC. Selected test analyses and results will be presented.

  14. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that transforms the bytecode into static single assignment (SSA) form.
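    The type-safety property that bytecode verification must establish can be illustrated with a minimal abstract interpreter for an invented toy instruction set, tracking the types on the operand stack. Real JVM verification, whether iterative data-flow or SSA-based, is far more involved:

```python
# Toy stack-machine type checker (illustrative only; not JVM bytecode).
def verify(code):
    """Abstractly interpret the code, tracking operand types.
    Returns True if every instruction is applied to well-typed operands."""
    stack = []
    for op in code:
        if op == "iconst":          # push an int constant
            stack.append("int")
        elif op == "aconst":        # push a reference constant
            stack.append("ref")
        elif op == "iadd":          # pop two ints, push their sum
            if len(stack) < 2 or stack.pop() != "int" or stack.pop() != "int":
                return False
            stack.append("int")
        elif op == "areturn":       # return a reference
            return bool(stack) and stack.pop() == "ref"
        else:
            return False            # unknown opcode
    return True

ok  = verify(["iconst", "iconst", "iadd"])
bad = verify(["iconst", "aconst", "iadd"])   # int + ref is ill-typed
print(ok, bad)
```

A verifier of this kind rejects `bad` statically, before execution, which is exactly the guarantee that makes untyped temporaries safe to run.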

  15. Application of surface-enhanced Raman spectroscopy (SERS) for cleaning verification in pharmaceutical manufacture.

    Science.gov (United States)

    Corrigan, Damion K; Cauchi, Michael; Piletsky, Sergey; Mccrossen, Sean

    2009-01-01

    Cleaning verification is the process by which pharmaceutical manufacturing equipment is determined as sufficiently clean to allow manufacture to continue. Surface-enhanced Raman spectroscopy (SERS) is a very sensitive spectroscopic technique capable of detection at levels appropriate for cleaning verification. In this paper, commercially available Klarite SERS substrates were employed in order to obtain the necessary enhancement of signal for the identification of chemical species at concentrations of 1 to 10 ng/cm2, which are relevant to cleaning verification. The SERS approach was combined with principal component analysis in the identification of drug compounds recovered from a contaminated steel surface.
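    A much-simplified stand-in for the identification step is spectral matching by Pearson correlation against a library of reference spectra; the paper uses principal component analysis, and the compounds, peak positions and spectra below are all invented for illustration:

```python
import math

def correlate(a, b):
    """Pearson correlation between two spectra of equal length."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Hypothetical reference spectra built from Gaussian peaks on a Raman
# shift axis (cm^-1); peak positions are invented.
shifts = list(range(400, 1800, 10))
def spectrum(peaks):
    return [sum(math.exp(-((s - p) / 15) ** 2) for p in peaks)
            for s in shifts]

library = {"compound_A": spectrum([520, 1001, 1450]),
           "compound_B": spectrum([610, 880, 1350])}
unknown = spectrum([520, 1001, 1450])   # residue recovered from the surface

best = max(library, key=lambda name: correlate(library[name], unknown))
print(best)
```

In practice PCA (or another multivariate method) is preferred because real SERS spectra carry baseline drift and substrate variability that simple correlation handles poorly.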

  16. Verification of experimental modal modeling using HDR (Heissdampfreaktor) dynamic test data

    International Nuclear Information System (INIS)

    Srinivasan, M.G.; Kot, C.A.; Hsieh, B.J.

    1983-01-01

    Experimental modal modeling involves the determination of the modal parameters of the model of a structure from recorded input-output data from dynamic tests. Though commercial modal analysis algorithms are being widely used in many industries their ability to identify a set of reliable modal parameters of an as-built nuclear power plant structure has not been systematically verified. This paper describes the effort to verify MODAL-PLUS, a widely used modal analysis code, using recorded data from the dynamic tests performed on the reactor building of the Heissdampfreaktor, situated near Frankfurt, Federal Republic of Germany. In the series of dynamic tests on HDR in 1979, the reactor building was subjected to forced vibrations from different types and levels of dynamic excitations. Two sets of HDR containment building input-output data were chosen for MODAL-PLUS analyses. To reduce the influence of nonlinear behavior on the results, these sets were chosen so that the levels of excitation are relatively low and about the same in the two sets. The attempted verification was only partially successful in that only one modal model, with a limited range of validity, could be synthesized and in that the goodness of fit could be verified only in this limited range

  17. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. 
A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
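    A standard code-verification calculation of the kind discussed is the observed order of accuracy, obtained from the errors against a highly accurate (e.g. analytical) solution on two systematically refined grids. The error values below are hypothetical:

```python
import math

def observed_order(e_coarse, e_fine, refinement=2.0):
    """Observed order of accuracy from discretization errors on two
    grids related by a refinement ratio r:
        p = ln(e_coarse / e_fine) / ln(r)."""
    return math.log(e_coarse / e_fine) / math.log(refinement)

# Hypothetical errors of a discretization scheme against an analytical
# solution on grids of spacing h and h/2, consistent with e ~ C*h^2.
e_h, e_h2 = 4.0e-3, 1.0e-3
p = observed_order(e_h, e_h2)
print(f"observed order of accuracy: {p:.2f}")
```

Agreement between the observed order and the formal order of the scheme is evidence that the code solves its equations correctly; a mismatch points to a coding error or an unresolved solution.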

  18. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Roč. 18, č. 6 (2012), s. 572-587 ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  19. Verification of seiching processes in a large and deep lake (Trichonis, Greece)

    Directory of Open Access Journals (Sweden)

    I. ZACHARIAS

    2000-06-01

    Experimental verification is based on recordings taken during spring. Visual observation of the records permits identification of the five lowest-order modes, including inter-station phase shift. Power spectral analysis of two time series, together with inter-station phase-difference and coherence spectra, allows identification of the same five modes. Agreement between the theoretically predicted and the experimentally determined periods was excellent for most of the calculated modes.
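    For a closed rectangular basin, the theoretically predicted seiche periods follow Merian's formula T_n = 2L / (n * sqrt(g * h)). The sketch below uses hypothetical basin dimensions, not the actual Trichonis bathymetry (real lakes need numerical modal solutions, which is why the spectra above were compared against model predictions):

```python
import math

def merian_period(length_m, depth_m, mode=1, g=9.81):
    """Merian's formula for the free-oscillation (seiche) period of a
    closed rectangular basin of uniform depth:
        T_n = 2 * L / (n * sqrt(g * h))."""
    return 2.0 * length_m / (mode * math.sqrt(g * depth_m))

# Hypothetical basin: 19 km long, 30 m mean depth.
basin_length, basin_depth = 19000.0, 30.0
for n in range(1, 6):
    t_min = merian_period(basin_length, basin_depth, n) / 60.0
    print(f"mode {n}: T = {t_min:.1f} min")
```

The higher modes scale as T_1 / n, which is the pattern one looks for when assigning spectral peaks to mode numbers.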

  20. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  1. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. 
The focus in the early design process shifts from the development and

  2. Software reliability evaluation of digital plant protection system development process using V and V

    International Nuclear Information System (INIS)

    Lee, Na Young; Hwang, Il Soon; Seong, Seung Hwan; Oh, Seung Rok

    2001-01-01

    In the nuclear power industry, digital technology has recently been introduced for the Instrumentation and Control (I and C) of reactor systems. For its application to safety-critical systems such as the Reactor Protection System (RPS), a reliability assessment is indispensable. Unlike traditional reliability models, software reliability is hard to evaluate and should be evaluated throughout the development lifecycle. In the development process of the Digital Plant Protection System (DPPS), the concept of verification and validation (V and V) was introduced to assure the quality of the product. Tests should also be performed to assure reliability. The verification procedure with model checking is relatively well defined; testing, however, is labor intensive and not well organized. In this paper, we developed a methodological process for combining verification with validation test case generation. For this, we used PVS for the table specification and for the theorem proving. As a result, we could not only save time in designing test cases but also obtain a more effective and complete set of verification-related test cases. In addition, we could extract some meaningful factors useful for reliability evaluation from both the V and V process and the verification-combined tests

  3. Arms control verification costs: the need for a comparative analysis

    International Nuclear Information System (INIS)

    MacLean, G.; Fergusson, J.

    1998-01-01

    The end of the Cold War era has presented practitioners and analysts of international non-proliferation, arms control and disarmament (NACD) the opportunity to focus more intently on the range and scope of NACD treaties and their verification. Aside from obvious favorable and well-publicized developments in the field of nuclear non-proliferation, progress also has been made in a wide variety of arenas, ranging from chemical and biological weapons, fissile material, conventional forces, ballistic missiles, to anti-personnel landmines. Indeed, breaking from the constraints imposed by the Cold War United States-Soviet adversarial zero-sum relationship that impeded the progress of arms control, particularly on a multilateral level, the post Cold War period has witnessed significant developments in NACD commitments, initiatives, and implementation. The goals of this project - in its final iteration - will be fourfold. First, it will lead to the creation of a costing analysis model adjustable for uses in several current and future arms control verification tasks. Second, the project will identify data accumulated in the cost categories outlined in Table 1 in each of the five cases. By comparing costs to overall effectiveness, the application of the model will demonstrate desirability in each of the cases (see Chart 1). Third, the project will identify and scrutinize 'political costs' as well as real expenditures and investment in the verification regimes (see Chart 2). And, finally, the project will offer some analysis on the relationship between national and multilateral forms of arms control verification, as well as the applicability of multilateralism as an effective tool in the verification of international non-proliferation, arms control, and disarmament agreements. (author)

  4. δ18O water isotope in the iLOVECLIM model (version 1.0) – Part 1: Implementation and verification

    Directory of Open Access Journals (Sweden)

    D. M. Roche

    2013-09-01

    Full Text Available A new 18O stable water isotope scheme is developed for three components of the iLOVECLIM coupled climate model: atmosphere, ocean and land surface. The equations required to reproduce the fractionation of stable water isotopes in the simplified atmospheric model ECBilt are developed consistently with its moisture scheme. Simplifications are made in the processes to account for the simplified vertical structure, which includes only one moist layer. Implementation of these equations, together with a passive tracer scheme for the ocean and an equilibrium fractionation scheme for the land surface, leads to the closure of the (isotopic) water budget in our climate system. Following the implementation, the existence of the usual δ18O-climate relationships is verified for the Rayleigh distillation, the Dansgaard relationship and the δ18O-salinity relationship. Advantages and caveats of the approach taken are outlined. The simulated isotopic fields are shown to reproduce most expected oxygen-18-climate relationships, with the notable exception of the isotopic composition in Antarctica.
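    The Rayleigh distillation relationship mentioned above can be written as R = R0 * f^(alpha - 1) for the isotope ratio of the vapour remaining after a fraction f has rained out. The sketch below converts this to delta notation; the fractionation factor is an assumed illustrative value, not the one used in iLOVECLIM:

```python
import math

def delta18o_rayleigh(delta0_permil, f, alpha=1.0099):
    """Rayleigh distillation of a vapour reservoir:
        R = R0 * f**(alpha - 1),
    expressed in permil delta notation. alpha ~ 1.0099 is an assumed
    liquid-vapour fractionation factor (illustrative)."""
    r = (1.0 + delta0_permil / 1000.0) * f ** (alpha - 1.0)
    return (r - 1.0) * 1000.0

# Vapour starting at -10 permil becomes progressively depleted in 18O
# as the air mass rains out (f = remaining vapour fraction).
for f in (1.0, 0.5, 0.1):
    print(f"f = {f:.1f}: delta18O = {delta18o_rayleigh(-10.0, f):.1f} permil")
```

This monotonic depletion with rain-out is the first of the delta18O-climate relationships that the implementation was checked against.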

  5. Bio-inspired Artificial Intelligence: A Generalized Net Model of the Regularization Process in MLP

    Directory of Open Access Journals (Sweden)

    Stanimir Surchev

    2013-10-01

    Full Text Available Many objects and processes inspired by nature have been recreated by scientists. The inspiration to create a multilayer neural network came from the human brain. The brain possesses a complicated structure that is difficult to recreate, because of the many processes involved, each requiring different solution methods. The aim of this paper is to describe one of the methods that improve the learning process of an artificial neural network. The proposed generalized net model presents the regularization process in a multilayer neural network. The purpose of regularization is to protect the neural network from overfitting, and it is commonly used in the neural network training process. Among the many available methods, the subject of interest here is the one known as regularization: it employs a penalty function that keeps the weights and biases small, in order to protect against overfitting.
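    The effect described above can be shown on a one-parameter toy problem: adding an L2 (weight-decay) penalty to the loss pulls the optimum toward smaller weights. All numbers are illustrative:

```python
def train_weight(lam, steps=500, lr=0.01):
    """Gradient descent on a one-parameter least-squares problem with an
    L2 penalty:  loss = (w*x - y)**2 + lam * w**2.
    Illustrates how regularization shrinks the learned weight."""
    x, y, w = 1.0, 3.0, 0.0
    for _ in range(steps):
        grad = 2.0 * (w * x - y) * x + 2.0 * lam * w   # d(loss)/dw
        w -= lr * grad
    return w

w_plain = train_weight(lam=0.0)   # converges to the unregularized optimum, w = 3
w_reg = train_weight(lam=0.5)     # converges to x*y / (x**2 + lam) = 2
print(round(w_plain, 4), round(w_reg, 4))
```

In a full MLP the same penalty is summed over all weights and biases; the shrinkage effect on each parameter is the one shown here.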

  6. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    Abstract This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA-modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one

  7. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  8. Wu’s Characteristic Set Method for SystemVerilog Assertions Verification

    Directory of Open Access Journals (Sweden)

    Xinyan Gao

    2013-01-01

    Full Text Available We propose a verification solution based on the characteristic set of Wu's method for SystemVerilog assertion checking over digital circuit systems. We define a suitable subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithmic framework based on algebraic representations using the characteristic set of a polynomial system. This symbolic algebraic approach is a useful supplement to existing simulation-based verification methods.
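    The polynomial modeling mechanism can be illustrated by encoding gates as polynomials that agree with the Boolean functions on {0,1} (XOR as x + y - 2xy, AND as xy, OR as x + y - xy, NOT as 1 - x) and checking an assertion by evaluating the polynomial difference at all input points. Wu's characteristic-set method performs this check symbolically by polynomial reduction; the exhaustive evaluation below is only a toy stand-in:

```python
from itertools import product

# Polynomial encodings of Boolean gates, valid for inputs in {0, 1}.
def xor(x, y): return x + y - 2 * x * y
def and_(x, y): return x * y
def or_(x, y): return x + y - x * y
def not_(x): return 1 - x

# Circuit under verification: the sum output of a half adder.
def half_adder_sum(x, y):
    return xor(x, y)

# Assertion (SVA-style specification, rewritten as a polynomial):
# the sum must equal (x AND NOT y) OR (NOT x AND y).
def spec_sum(x, y):
    return or_(and_(x, not_(y)), and_(not_(x), y))

equivalent = all(half_adder_sum(x, y) == spec_sum(x, y)
                 for x, y in product((0, 1), repeat=2))
print(equivalent)
```

For real circuits the exhaustive check is infeasible; the point of the characteristic-set approach is to decide `implementation - spec = 0` by reducing one polynomial modulo the other's characteristic set instead of enumerating inputs.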

  9. Project W-030 flammable gas verification monitoring test

    International Nuclear Information System (INIS)

    BARKER, S.A.

    1999-01-01

    This document describes the verification monitoring campaign used to document the ability of the new ventilation system to mitigate flammable gas accumulation under steady state tank conditions. This document reports the results of the monitoring campaign. The ventilation system configuration, process data, and data analysis are presented

  10. Formal Verification of Annotated Textual Use-Cases

    Czech Academy of Sciences Publication Activity Database

    Šimko, V.; Hauzar, D.; Hnětynka, P.; Bureš, Tomáš; Plášil, F.

    2015-01-01

    Roč. 58, č. 7 (2015), s. 1495-1529 ISSN 0010-4620 Grant - others:GA AV ČR(CZ) GAP103/11/1489 Institutional support: RVO:67985807 Keywords : specification * use-cases * behavior modeling * verification * temporal logic * formalization Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.000, year: 2015

  11. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
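    The most widely known accountancy statistic of this kind is Material Unaccounted For (MUF), the closure of the material balance over one material balance area. All quantities below are hypothetical:

```python
def muf(beginning_inventory, inputs, outputs, ending_inventory):
    """Material Unaccounted For, the basic safeguards accountancy
    statistic:  MUF = (BI + receipts) - (shipments + EI).
    A nonzero value reflects measurement error, unmeasured losses,
    or diversion."""
    return ((beginning_inventory + sum(inputs))
            - (sum(outputs) + ending_inventory))

# Hypothetical balance for one material balance area (kg of material).
m = muf(beginning_inventory=120.0,
        inputs=[40.0, 35.5],
        outputs=[60.2, 30.0],
        ending_inventory=105.0)
print(round(m, 2))
```

Statistics of the kind discussed in the paper are built from operator-declared and inspector-verified versions of these same balance components; the point at issue is which comparisons (e.g. flow verification of receipts and shipments) are actually required for a comprehensive test set.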

  12. Effective verification of confidentiality for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Minh Tri; Stoelinga, Mariëlle Ida Antoinette; Huisman, Marieke

    2014-01-01

    This paper studies how confidentiality properties of multi-threaded programs can be verified efficiently by a combination of newly developed and existing model checking algorithms. In particular, we study the verification of scheduler-specific observational determinism (SSOD), a property that

  13. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence or ''AI'' concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  14. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  15. A Simulation for the Punchless Piercing Process using Lemaitre Damage Model

    International Nuclear Information System (INIS)

    Lee, Sang Wook; Pourboghrat, Farhang

    2005-01-01

    Punchless piercing is a process that uses highly pressurized fluid instead of a conventional punch to make holes in sheet metal. This process has many advantages over the conventional method for piercing variously shaped holes in very thin strips of metal, composite, etc. An important cost advantage comes from not having to use a punch. Another important advantage is the top quality of the pierced holes, as no secondary finishing process is needed to remove the burrs typically found in conventional cutting processes. The ABAQUS/Explicit FEM code coupled with the Lemaitre damage model has been used to more precisely characterize the punchless piercing process. The formulation adopted for this purpose focuses on the development of an efficient stress integration algorithm and of a user material subroutine (VUMAT). For verification, the computed results have been compared with experimental results in the literature and shown to be in good agreement. The results obtained from this work are expected to be of significant interest to automotive and aerospace industries interested in using the punchless piercing process.
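
    For readers unfamiliar with the damage model, a minimal sketch of an explicit Lemaitre damage update of the kind a VUMAT might perform is shown below; the variable names and parameter values are illustrative assumptions, not those of the cited work.

```python
def lemaitre_update(damage, Y, dp, S=1.0, s=1.0):
    """One explicit increment of the isotropic Lemaitre damage variable D.

    damage: current damage D in [0, 1)
    Y:      strain energy density release rate
    dp:     increment of accumulated plastic strain
    S, s:   material parameters (placeholder values, not calibrated)
    """
    dD = (Y / S) ** s * dp          # Lemaitre evolution law: dD = (Y/S)^s dp
    return min(damage + dD, 1.0)    # D is capped at full damage

D = lemaitre_update(0.0, Y=0.5, dp=0.1)
failed = D >= 0.3                   # e.g. element deletion once D reaches D_crit
```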

  16. Assessment of Automated Measurement and Verification (M&V) Methods

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Custodio, Claudine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sohn, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jump, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
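
    In whole-building M&V, savings are the difference between the baseline model's prediction projected into the post-retrofit period and metered consumption; a schematic sketch with illustrative names and values:

```python
def avoided_energy_use(baseline_predicted, metered):
    """Whole-building savings: baseline-model projections minus metered use,
    summed over the reporting period (e.g. a sequence of daily kWh values)."""
    return sum(b - m for b, m in zip(baseline_predicted, metered))

# Two reporting-period intervals, illustrative kWh values.
savings = avoided_energy_use([120.0, 130.0], [100.0, 115.0])
```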

  17. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    Science.gov (United States)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    In this article the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a hardware-software system that provides automated verification and calibration. The hardware part of the system switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the specified algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors and compiles protocols. The system can be used for checking the controllers of the secondary equipment of oil metering units in the automatic verification mode (with an open communication protocol) or in the semi-automatic verification mode (without it). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering units' secondary equipment. Automated verification with this hardware-software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
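
    The error-calculation step can be sketched as a reduced (percent-of-span) error check of a controller reading against the calibrator setpoint; the names and the accuracy-class limit below are illustrative assumptions, not details from the article.

```python
def reduced_error_percent(setpoint, measured, span):
    """Reduced error of one measuring channel, in percent of span."""
    return 100.0 * abs(measured - setpoint) / span

def channel_passes(setpoint, measured, span, limit=0.25):
    """Pass/fail against an assumed accuracy-class limit (percent of span)."""
    return reduced_error_percent(setpoint, measured, span) <= limit

# A 4-20 mA channel (span 16 mA): calibrator sets 4.0 mA, controller reads 4.5 mA,
# giving a reduced error of 3.125 % of span, which fails a 0.25 % limit.
err = reduced_error_percent(4.0, 4.5, 16.0)
ok = channel_passes(4.0, 4.5, 16.0)
```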

  18. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Directory of Open Access Journals (Sweden)

    M. P. Mittermaier

    2008-05-01

    A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced to the Numerical Weather Prediction (NWP) verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scales using a spatially based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS) and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar composite were used.

    The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.
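
    The traditional scores mentioned are computed from a 2x2 contingency table of forecast versus observed threshold exceedances; the sketch below uses the standard textbook definitions, not code from the study.

```python
import math

def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
    """ETS (Gilbert skill score): threat score corrected for random hits."""
    n = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / n
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

def log_odds_ratio(hits, misses, false_alarms, correct_negatives):
    """Log of the odds ratio; values above 0 indicate forecast skill."""
    return math.log((hits * correct_negatives) / (misses * false_alarms))

# Illustrative contingency table: 50 hits, 10 misses, 20 false alarms,
# 120 correct negatives.
ets = equitable_threat_score(50, 10, 20, 120)
lor = log_odds_ratio(50, 10, 20, 120)
```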

  19. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal year 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single-shell tanks, i.e., the electrical distribution, HVAC, leak detection, and radiation monitoring systems. The tasks required to meet these objectives include the following: identify system boundaries or scope for each drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by ''smart'' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings.

  20. Efficient Development and Verification of Safe Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2013-01-01

    the monitoring process; hydraulic absorbers as dampers to dissipate the energy of oscillations in railway electric equipment; development of train fare calculation and adjustment systems using VDM++; efficient development and verification of safe railway control software; and evolution of the connectivity...