WorldWideScience

Sample records for integrated verification experiment

  1. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    Energy Technology Data Exchange (ETDEWEB)

    Whitaker, R.W.; Noel, S.D.

    1992-12-01

The summary report by Tom Weaver gives the overall background for the series of IVE (Integrated Verification Experiment) experiments, including information on the full set of measurements made. This appendix presents details of the infrasound data and discusses certain aspects of a few special experiments. Prior to FY90, the emphasis of the Infrasound Program was on underground nuclear test (UGT) detection and yield estimation. During this time the Infrasound Program was a separate program at Los Alamos, and it was suggested to DOE/OAC that a regional infrasound network be established around NTS. The IVE experiments took place in a time frame that allowed simultaneous testing of possible network sites and examination of propagation in different directions. Whenever possible, infrasound stations were combined with seismic stations so that a large number could be efficiently fielded. The regional infrasound network was not pursued by DOE, as world events began to change the direction of verification toward non-proliferation. Starting in FY90 the infrasound activity became part of the Source Region Program, which has a goal of understanding how energy is transported from the UGT to a variety of measurement locations.

  2. Data storage accounting and verification at LHC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Huang, C. H. [Fermilab; Lanciotti, E. [CERN; Magini, N. [CERN; Ratnikova, N. [Moscow, ITEP; Sanchez-Hernandez, A. [CINVESTAV, IPN; Serfon, C. [Munich U.; Wildish, T. [Princeton U.; Zhang, X. [Beijing, Inst. High Energy Phys.

    2012-01-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
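The catalog-consistency check described in this record can be illustrated with a small sketch. The record layout and file names below are invented for illustration and are not any LHC experiment's actual schema or tooling:

```python
# Sketch: reconcile a central file catalog against a site storage dump.
# Each record maps a logical file name to (size in bytes, checksum).
# All names and formats here are illustrative only.

def reconcile(catalog: dict, storage_dump: dict):
    """Return files missing on storage, dark data (on storage but not
    in the catalog), and files whose size/checksum disagree."""
    missing = sorted(set(catalog) - set(storage_dump))
    dark = sorted(set(storage_dump) - set(catalog))
    corrupt = sorted(
        lfn for lfn in set(catalog) & set(storage_dump)
        if catalog[lfn] != storage_dump[lfn]
    )
    return missing, dark, corrupt

catalog = {
    "/store/data/run1/a.root": (1024, "ad3f"),
    "/store/data/run1/b.root": (2048, "9c01"),
}
dump = {
    "/store/data/run1/b.root": (2048, "ffff"),   # checksum mismatch
    "/store/data/run1/c.root": (4096, "77aa"),   # not in catalog
}
missing, dark, corrupt = reconcile(catalog, dump)
print(missing)  # ['/store/data/run1/a.root']
print(dark)     # ['/store/data/run1/c.root']
print(corrupt)  # ['/store/data/run1/b.root']
```

In practice such dumps are provided at regular intervals by the sites, and the reconciliation runs centrally against the experiment's catalogs.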

  3. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominant tool used in industry to validate a design. Formal verification overcomes the weakness of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses on a technique that integrates the best characteristics of both simulation and formal verification to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty of this approach lies in three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioner decompose a large design into sub-components and feed them to different verification and/or simulation tools based on the known strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of design validation and improves the coverage by further partitioning. The various simulation and verification tools comprising IDV are evaluated, and an example is used to illustrate the overall validation process, which successfully validates the example to a high coverage rate within a short time. The experimental results show that our approach is a promising design validation method.

  4. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks, from knowledge acquisition to knowledge verification, consistently.

  5. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation in which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  6. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

The rapid development of avionics systems is driving the application of integrated modular avionics (IMA). While IMA improves avionics system integration, it also increases the complexity of system test, so the method of IMA system test needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, an IMA system makes failures harder to isolate. The critical problem for IMA system verification is therefore how to test shared resources used by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but for a large, highly integrated avionics system, complete testing is hard to achieve. This paper therefore applies compositional-verification theory to IMA system test, reducing the number of test processes and improving efficiency, and consequently lowering the cost of IMA system integration.

  7. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    Science.gov (United States)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of the Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  8. Data storage accounting and verification in LHC experiments

    CERN Document Server

Ratnikova, Natalia

    2012-01-01

All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of the central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The common solutions developed help to reduce the maintenance costs, both at the large Tier-1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements...

  9. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  10. Verification and validation as an integral part of the development of digital systems for nuclear applications

    International Nuclear Information System (INIS)

    Straker, E.A.; Thomas, N.C.

    1983-01-01

The nuclear industry's current attitude toward verification and validation (V and V) has been shaped by the experiences gained to date. On the basis of these experiences, V and V can be applied effectively as an integral part of digital system development for nuclear electric power applications. An overview of a typical approach for integrating V and V with system development is presented. This approach represents a balance between V and V as applied in the aerospace industry and the standard practice commonly applied within the nuclear industry today.

  11. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
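The multi-technology synergy that IVSEM explores can be illustrated with a toy calculation. Assuming, purely for illustration, that the subsystems detect an event independently (this is not IVSEM's actual model, and the probabilities below are invented), the integrated detection probability is one minus the product of the individual miss probabilities:

```python
# Toy illustration of multi-sensor synergy: if each technology detects an
# event independently with probability p_i, the integrated system detects
# it with probability 1 - prod(1 - p_i). IVSEM's actual model is far more
# detailed; the numbers below are hypothetical.
from math import prod

def integrated_detection_probability(p_subsystems):
    return 1.0 - prod(1.0 - p for p in p_subsystems)

# Hypothetical per-technology detection probabilities for one event:
p = {"seismic": 0.80, "infrasound": 0.50, "radionuclide": 0.30, "hydroacoustic": 0.10}
print(round(integrated_detection_probability(p.values()), 4))  # 0.937
```

Even weak subsystems raise the integrated probability, which is the qualitative sense in which the technologies exhibit synergy.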

  12. The joint verification experiments as a global non-proliferation exercise

    International Nuclear Information System (INIS)

    Shaner, J.W.

    1998-01-01

This conference commemorates the 10th anniversary of the second of two Joint Verification Experiments conducted by the Soviet Union and the US. These two experiments, one at the Nevada test site in the US and the second here at the Semipalatinsk test site, were designed to test verification of a nuclear testing treaty limiting the size of underground explosions to 150 kilotons. By building trust and technical respect between the weapons scientists of the two most powerful adversaries, the Joint Verification Experiment (JVE) had the unanticipated result of initiating a suite of cooperative projects and programs aimed at reducing Cold War threats and preventing the proliferation of weapons of mass destruction.

  13. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-03-02

Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  14. IAEA verification experiment at the Portsmouth Gaseous Diffusion Plant

    International Nuclear Information System (INIS)

    Gordon, D.M.; Subudhi, M.; Calvert, O.L.; Bonner, T.N.; Cherry, R.C.; Whiting, N.E.

    1998-01-01

In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a Verification Experiment at the plant with respect to the downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of UF6. This material is part of the 226 metric tons of fissile material that President Clinton has declared to be excess to US national-security needs and which will be permanently withdrawn from the US nuclear stockpile. In September 1997, the IAEA agreed to carry out this experiment, and during the first three weeks of December 1997, the IAEA verified the design information concerning the downblending process. The plant has been subject to short-notice random inspections since December 17, 1997. This paper provides an overview of the Verification Experiment, the monitoring technologies used in the verification approach, and some of the experience gained to date.

  15. Top-down design and verification methodology for analog mixed-signal integrated circuits

    NARCIS (Netherlands)

    Beviz, P.

    2016-01-01

This report introduces a novel Top-Down Design and Verification methodology for AMS integrated circuits. With the introduction of the new design and verification flow, more reliable and efficient development of AMS ICs is possible. The assignment incorporated the research on the

  16. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  17. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  18. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Science.gov (United States)

    2013-05-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Hazardous Materials Safety Administration, DOT. ACTION: Notice of public meeting. SUMMARY: This notice is announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity...

  19. Research on Linux Trusted Boot Method Based on Reverse Integrity Verification

    Directory of Open Access Journals (Sweden)

    Chenlin Huang

    2016-01-01

Trusted computing aims to build a trusted computing environment for information systems with the help of the secure hardware TPM, which has been proved to be an effective way to counter network security threats. However, TPM chips are not yet widely deployed in most computing devices, limiting the applicability of trusted computing technology. To solve the problem of lacking trusted hardware in existing computing platforms, an alternative security hardware, USBKey, is introduced in this paper to simulate the basic functions of a TPM, and a new reverse USBKey-based integrity verification model is proposed to implement reverse integrity verification of the operating system boot process, which can achieve trusted boot of the operating system on end systems without TPMs. A Linux booting method based on reverse integrity verification is designed and implemented in this paper, with which the integrity of data and executable files in the operating system is verified and protected during the trusted boot process phase by phase. It implements trusted boot of the operating system without a TPM and supports remote attestation of the platform. Enhanced by our method, the flexibility of trusted computing technology is greatly improved, and it becomes possible to apply trusted computing in large-scale computing environments.
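The phase-by-phase measurement underlying trusted-boot schemes like this can be sketched as a hash chain: each boot stage's image is measured into a running digest before control passes to it. This is a generic sketch of measured boot, mimicking a TPM PCR extend operation, not the paper's exact USBKey-based reverse verification protocol:

```python
# Generic sketch of a measured-boot hash chain: each stage's image is
# hashed and extended into a running measurement, mimicking a TPM PCR.
# The paper's USBKey-based reverse verification differs in detail.
import hashlib

def extend(measurement: bytes, stage_image: bytes) -> bytes:
    """PCR-style extend: new = H(old || H(stage))."""
    return hashlib.sha256(measurement + hashlib.sha256(stage_image).digest()).digest()

def measure_boot(stages):
    m = b"\x00" * 32           # initial measurement
    for image in stages:
        m = extend(m, image)   # measure each stage before handing over control
    return m

stages = [b"bootloader", b"kernel", b"initrd"]
good = measure_boot(stages)
tampered = measure_boot([b"bootloader", b"EVIL kernel", b"initrd"])
print(good != tampered)  # True: any modified stage changes the final digest
```

Because the chain is order- and content-sensitive, a verifier holding the expected final digest can detect tampering in any boot phase.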

  20. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

In order to verify data integrity in a mobile multicloud computing environment, a MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and an sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified with lightweight computing and low data transmission. The scheme compensates for the limited communication and computing power of mobile devices, supports dynamic data operations in mobile multicloud environments, and allows data integrity to be verified without using the direct source file block. Experimental results also demonstrate that this scheme can achieve a lower cost of computing and communications.
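The Merkle-hash-tree component of such schemes can be illustrated with a minimal example: the verifier holds only the root, and any block's integrity can be checked with a logarithmic-size proof of sibling hashes. This is a plain Merkle tree sketch, not the sequence-enforced sMHT or the full MMCDIV construction:

```python
# Minimal Merkle tree: root computation, proof generation, verification.
# Illustrates the idea behind MHT-based integrity checking only.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:               # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes from leaf to root for the block at `index`."""
    level, proof = [h(x) for x in leaves], []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], sib < index))   # (hash, is_left_sibling)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(root, block, proof):
    node = h(block)
    for sib, is_left in proof:
        node = h(sib + node) if is_left else h(node + sib)
    return node == root

blocks = [b"block0", b"block1", b"block2", b"block3"]
root = merkle_root(blocks)
proof = merkle_proof(blocks, 2)
print(verify(root, b"block2", proof))      # True
print(verify(root, b"tampered", proof))    # False
```

The proof size grows only logarithmically with the number of blocks, which is what makes the approach attractive for resource-constrained mobile verifiers.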

  1. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    Science.gov (United States)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As loss or corruption of stored files may occur, many researchers focus on the verification of data integrity. However, massive numbers of users often bring large numbers of verification tasks to the auditor. Moreover, users also need to pay an extra fee for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is for users to routinely verify the data integrity themselves and for the auditor to arbitrate challenges between the user and cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme markedly decreases the auditor's verification tasks and the ratio of wrong arbitration.
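The user-side routine check in such schemes can be sketched with a keyed MAC: at upload time the user stores a MAC per block, and to audit, retrieves a block from the cloud and recompares it. This is a minimal sketch of MAC-based spot-checking with hypothetical names, not the paper's exact two-level arbitration protocol (which additionally involves the auditor and the ϕ values):

```python
# Minimal sketch of user-side MAC spot-checking for cloud-stored blocks.
# Names and the checking policy are illustrative; the paper's scheme adds
# an auditor that arbitrates disputes using the stored MACs and phi values.
import hashlib
import hmac
import secrets

KEY = secrets.token_bytes(32)   # kept by the user, never sent to the cloud

def tag(block: bytes) -> bytes:
    return hmac.new(KEY, block, hashlib.sha256).digest()

# At upload time: the user keeps the MACs, the cloud keeps the blocks.
blocks = {i: f"block-{i}".encode() for i in range(8)}
tags = {i: tag(b) for i, b in blocks.items()}

def spot_check(cloud_blocks, i) -> bool:
    """Retrieve block i from the cloud and verify it against the stored MAC."""
    return hmac.compare_digest(tag(cloud_blocks[i]), tags[i])

cloud = dict(blocks)
print(spot_check(cloud, 3))     # True: block intact
cloud[3] = b"corrupted"
print(spot_check(cloud, 3))     # False: corruption detected
```

Because the key never leaves the user, the cloud cannot forge a passing MAC for a corrupted block, which is what gives the user-level check its evidentiary value when a dispute is escalated to the auditor.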

  2. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations held by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation.

  3. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations held by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation.

  4. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as rapid obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In conventional FPGA design verification, designers write test benches covering the various stages of verification activity: register-transfer level (RTL), gate level, and place and route. Writing these test benches is considerably time-consuming and requires a great deal of effort to achieve satisfactory results, and performing verification at each stage is a major bottleneck demanding much activity and time. Verification is arguably the most difficult and complicated aspect of any design, and an FPGA design is no exception. In view of this, this work applied an integrated verification approach to an FPGA-based I and C system in an NPP, simultaneously verifying all design modules using MATLAB/Simulink HDL co-simulation models. We introduce and discuss how this approach can facilitate the verification processes and verify the entire set of design modules of the system at once. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  5. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as rapid obsolescence, vulnerability to cyber-attack, and other issues associated with software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In conventional FPGA design verification, designers write test benches covering the various stages of verification: register-transfer level (RTL), gate level, and place and route. Writing these test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results, and performing verification separately at each stage is a major bottleneck that demands much activity and time. Moreover, verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to the verification and testing of an FPGA-based I and C system design in an NPP that verifies all of the design modules simultaneously using MATLAB/Simulink HDL co-simulation models, thereby facilitating the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks
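
The co-simulation idea described above, driving a reference ("golden") model and the implementation with the same stimulus and comparing outputs at every step for all modules at once, can be sketched in miniature. The following Python toy is purely illustrative: the two trip-logic functions are invented stand-ins, not the paper's SMART I and C design or a MATLAB/Simulink model.

```python
# Hypothetical sketch of integrated co-simulation: one stimulus stream
# drives both a reference model and the implementation under verification,
# and every output pair is compared. All names and setpoints are invented.

def golden_trip(pressure, temperature):
    # Reference specification: trip if either parameter exceeds its setpoint.
    return pressure > 160 or temperature > 320

def impl_trip(pressure, temperature):
    # "Implementation under verification" (here deliberately equivalent).
    return not (pressure <= 160 and temperature <= 320)

# One shared stimulus exercises both models simultaneously.
stimulus = [(150, 300), (165, 300), (150, 330), (170, 340)]
mismatches = [(p, t) for p, t in stimulus
              if golden_trip(p, t) != impl_trip(p, t)]
print("PASS" if not mismatches else f"FAIL at {mismatches}")
```

A real flow would replace the golden function with a Simulink model and the implementation with the HDL simulation, but the comparison loop is the essence of the approach.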

  6. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  7. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    Science.gov (United States)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often used independently to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states in most cases lack an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers users the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
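
The link between threshold choice and a geometric property can be illustrated with a toy computation: for a synthetic heavy-tailed "precipitation" field, the exceedance area shrinks as the quantile threshold rises. This Python sketch is only a loose analogue of the paper's three-step methodology; the field, quantiles, and grid are invented.

```python
# Toy illustration: exceedance area (a geometric property of the field)
# as a function of a high quantile threshold. Not the paper's method.
import random

random.seed(42)
# Synthetic "precipitation" field on a 50x50 grid (mm), heavy-tailed.
field = [random.expovariate(1 / 3.0) for _ in range(2500)]

def quantile(xs, q):
    s = sorted(xs)
    return s[min(len(s) - 1, int(q * len(s)))]

areas = {}
for q in (0.90, 0.95, 0.99):
    thr = quantile(field, q)
    # Exceedance area in grid cells: how much of the domain is "extreme".
    areas[q] = sum(1 for v in field if v > thr)
    print(f"q={q:.2f} threshold={thr:.2f}mm area={areas[q]}")
```

Raising the quantile from 0.90 to 0.99 concentrates the exceedance region, which is the kind of geometrical behavior a user would trade off when picking a threshold.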

  8. Accelerating functional verification of an integrated circuit

    Science.gov (United States)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, where the serial operation is configured to perform a bit-shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.
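
The speed-up comes from trading one simulated cycle per bit for a single whole-word write. A hypothetical register model makes the cycle accounting concrete; the class and method names below are invented for illustration, not taken from the patent.

```python
# Hypothetical sketch: a serial scan-style write shifts one bit per
# simulated cycle, while a direct register access sets the register in a
# single step. The register model is invented for illustration.

class SimRegister:
    def __init__(self, width):
        self.width = width
        self.value = 0
        self.cycles = 0  # simulated clock cycles consumed

    def serial_shift_in(self, bits):
        """Shift bits in MSB-first, one simulated cycle per bit."""
        for b in bits:
            self.value = ((self.value << 1) | b) & ((1 << self.width) - 1)
            self.cycles += 1

    def direct_write(self, value):
        """Direct register access: one simulated step for the whole word."""
        self.value = value & ((1 << self.width) - 1)
        self.cycles += 1

slow = SimRegister(8)
slow.serial_shift_in([1, 0, 1, 0, 1, 0, 1, 0])   # 8 cycles
fast = SimRegister(8)
fast.direct_write(0b10101010)                     # 1 cycle
print(slow.value == fast.value, slow.cycles, fast.cycles)
```

Both paths leave the same register state, which is why the serial operation can be blocked and substituted without changing simulation results.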

  9. Integrated Aero–Vibroacoustics: The Design Verification Process of Vega-C Launcher

    Directory of Open Access Journals (Sweden)

    Davide Bianco

    2018-01-01

    The verification of a space launcher at the design level is a complex issue because of (i) the lack of a detailed modeling capability of the acoustic pressure produced by the rocket; and (ii) the difficulties in applying deterministic methods to the large-scale metallic structures. In this paper, an innovative integrated design verification process is described, based on the bridging between a new semiempirical jet noise model and a hybrid finite-element method/statistical energy analysis (FEM/SEA) approach for calculating the acceleration produced at the payload and equipment level within the structure, vibrating under the external acoustic forcing field. The result is a verification method allowing for accurate prediction of the vibroacoustics in the launcher interior, using limited computational resources and without resorting to computational fluid dynamics (CFD) data. Some examples concerning the Vega-C launcher design are shown.

  10. Verification of industrial x-ray machine: MINTs experience

    International Nuclear Information System (INIS)

    Aziz Amat; Saidi Rajab; Eesan Pasupathi; Saipo Bahari Abdul Ratan; Shaharudin Sayuti; Abd Nassir Ibrahim; Abd Razak Hamzah

    2005-01-01

    The radiation and electrical safety of industrial x-ray equipment is required to meet Atomic Energy Licensing Board (AELB) guidelines (LEM/TEK/42) at the time of installation, and periodic verification should subsequently be ensured. The purpose of the guide is to explain the requirements employed in conducting tests on industrial x-ray apparatus so that it can be certified as complying with local legislation and regulations. Verification is aimed at providing safety assurance information on electrical requirements and the minimum radiation exposure to the operator. This regulation is applied to new models imported into the Malaysian market. Since June 1997, the Malaysian Institute for Nuclear Technology Research (MINT) has been approved by the AELB to provide verification services to private companies, government and corporate bodies throughout Malaysia. In early January 1997, the AELB made it mandatory that all x-ray equipment for industrial purposes (especially industrial radiography) must fulfil certain performance tests based on the LEM/TEK/42 guidelines. MINT, as the third-party verifier, encourages users to improve maintenance of the equipment. MINT's experience in measuring the performance of intermittent and continuous duty rating single-phase industrial x-ray machines in the year 2004 indicated that all of the irradiating apparatus tested passed and met the requirements of the guideline. From MINT records from 1997 to 2005, three x-ray models did not meet the requirements and were thus not allowed to be used unless the manufacturers were willing to modify them to meet AELB requirements. These verification procedures on the electrical and radiation safety of industrial x-ray equipment have significantly improved the maintenance culture and safety awareness in the use of x-ray apparatus in the industrial environment. (Author)

  11. Engineering within the assembly, verification, and integration (AIV) process in ALMA

    Science.gov (United States)

    Lopez, Bernhard; McMullin, Joseph P.; Whyborn, Nicholas D.; Duvall, Eugene

    2010-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an interferometer in the millimeter and sub-millimeter wavelength range. It will be located at an altitude above 5000 m in the Chilean Atacama desert. As part of the ALMA construction phase, the Assembly, Verification and Integration (AIV) team receives antennas and instrumentation from Integrated Product Teams (IPTs), verifies that the sub-systems perform as expected, performs the assembly and integration of the scientific instrumentation, and verifies that functional and performance requirements are met. This paper aims to describe those aspects related to the AIV Engineering team, its role within the 4-station AIV process, the different phases the group underwent, lessons learned and potential space for improvement. AIV Engineering initially focused on the preparation of the necessary site infrastructure for AIV activities, on the purchase of tools and equipment, and on the first ALMA system installations. With the first antennas arriving on site, the team started to gather experience with AIV Station 1 beacon holography measurements for the assessment of the overall antenna surface quality, and with optical pointing to confirm the antenna pointing and tracking capabilities. With the arrival of the first receiver, AIV Station 2 was developed, which focuses on the installation of electrical and cryogenic systems and incrementally establishes the full connectivity of the antenna as an observing platform. Further antenna deliveries then allowed the team to refine the related procedures, develop staff expertise and transition towards a more routine production process. Stations 3 and 4 deal with verification of the antenna with integrated electronics by the AIV Science Team and are not covered in this paper.

  12. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees transparent access to formal verification technology for modelers of genetic regulatory networks.

  13. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  14. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  15. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short-range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high-frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid spacing) were selected as the background gridded data for generating the integrated nowcasts. The seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. Analysis of the forecasts from INTW and the NWP models at 15 sites showed that the integrated weighted model produced more accurate forecasts for the seven selected variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
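
The multi-categorical Heidke skill score used for this comparison measures forecast accuracy relative to the number of hits expected by chance, derived from the marginal totals of the contingency table. A minimal Python implementation, with an invented 3-category example table (not data from SNOW-V10), might look like:

```python
# Multi-category Heidke skill score (HSS) from a contingency table.
# The sample table is invented for illustration.

def heidke_skill_score(table):
    """table[i][j] = count of cases with observed category i, forecast category j."""
    n = sum(sum(row) for row in table)
    correct = sum(table[i][i] for i in range(len(table)))
    # Number of correct forecasts expected by chance, from the marginals.
    obs_marg = [sum(row) for row in table]
    fcst_marg = [sum(col) for col in zip(*table)]
    expected = sum(o * f for o, f in zip(obs_marg, fcst_marg)) / n
    return (correct - expected) / (n - expected)

# 3-category example (e.g. light / moderate / heavy precipitation rate)
table = [[50, 10, 5],
         [8, 30, 7],
         [2, 6, 20]]
hss = heidke_skill_score(table)
print(round(hss, 3))
```

An HSS of 1 indicates a perfect forecast, 0 no skill beyond chance, and negative values worse than chance, which is why it is suitable for ranking INTW against the raw NWP forecasts.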

  16. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  17. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    Kim, S.-H.; Kim, K. K.; Chang, M. H.; Kang, C. S.; Park, G.-C.

    2002-01-01

    Safety analyses for the SMART design have been performed, and the results demonstrated that the key safety parameters for the limiting design basis events do not violate the safety limits. Various fundamental thermal-hydraulic experiments were carried out during design concept development to confirm the fundamental behavior of the major concepts of the SMART systems. Most technologies implemented in the SMART concept have been proven through the design and operation of existing PWRs. Advanced design features require tests to confirm the performance of the design and to produce data for design code verification. Tests including a core flow distribution test, a test of flow instability in the steam generator, a test of self-pressurizer performance, a two-phase critical flow test with non-condensable gases, and a high-temperature/high-pressure integral thermal-hydraulic test are currently under preparation, with equipment and facilities being installed. Performance tests for key parts of the MCP (Main Coolant Pump) and CEDM (Control Element Drive Mechanism) were performed, and mechanical performance tests for the reactor assembly and major primary components will also be carried out. A technical and economic evaluation for the commercialization of SMART was conducted by the Korean Nuclear Society (KNS) from August 2000 to July 2001. Based on the results of the evaluation, the SMART technology is technically sound and has sufficient economic incentives to pursue further development. Upon completion of the basic design phase in March 2002, the SMART design/engineering verification phase will follow, conducting various separate effect tests and comprehensive integral tests as well as the construction of a one-fifth-scale pilot plant for demonstration of overall SMART performance. (author)

  18. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    CERN Document Server

    Parsons, J E

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  19. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    International Nuclear Information System (INIS)

    PARSONS, J.E.

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  20. Experimental study on design verification of new concept for integral reactor safety system

    International Nuclear Information System (INIS)

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized light-water-cooled, medium power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. The SMART design verification phase followed, to conduct various separate effect tests and comprehensive integral effect tests. The high-temperature/high-pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transient and Accidents) has been constructed by KAERI to simulate SMART-P (the one-fifth-scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and the natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the steam generator (SG) primary side, the secondary system, and the PRHRS. The heat transfer through the PRHRS heat exchanger and ECT is sufficient to enable natural circulation of the coolant

  1. Experiences in the formalisation and verification of medical protocols

    OpenAIRE

    Balser, Michael

    2003-01-01

    Experiences in the formalisation and verification of medical protocols / M. Balser ... - In: Artificial intelligence in medicine : 9th Conference on Artificial Intelligence in Medicine in Europe, AIME 2003, Protaras, Cyprus, October 18 - 22, 2003 ; proceedings / Michel Dojat ... (eds.). - Berlin u.a. : Springer, 2003. - S. 132-141. - (Lecture notes in computer science ; 2780 : Lecture notes in artificial intelligence)

  2. Verification study of the FORE-2M nuclear/thermal-hydraulic analysis computer code

    International Nuclear Information System (INIS)

    Coffield, R.D.; Tang, Y.S.; Markley, R.A.

    1982-01-01

    The verification of the LMFBR core transient performance code, FORE-2M, was performed in two steps. Different components of the computation (individual models) were verified by comparing with analytical solutions and with results obtained from other conventionally accepted computer codes (e.g., TRUMP, LIFE, etc.). For verification of the integral computation method of the code, experimental data in TREAT, SEFOR and natural circulation experiments in EBR-II were compared with the code calculations. Good agreement was obtained for both of these steps. Confirmation of the code verification for undercooling transients is provided by comparisons with the recent FFTF natural circulation experiments. (orig.)

  3. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  4. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  5. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
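
The flavor of such a data-flow type analysis can be sketched with a toy stack machine: abstractly execute instructions while tracking the type in each stack slot, and merge stacks at control-flow joins to the least informative common type ("top"). The instruction set and type lattice below are invented for illustration; the verifiers discussed in the paper handle full Java bytecode.

```python
# Toy flavor of bytecode verification by abstract interpretation:
# track types instead of values. Instruction set and lattice invented.

def verify(code):
    """Abstractly execute a straight-line stack program; return final stack types."""
    stack = []
    for op, *_ in code:
        if op == "iconst":
            stack.append("int")       # push an int constant
        elif op == "aconst":
            stack.append("ref")       # push a reference
        elif op == "iadd":
            b, a = stack.pop(), stack.pop()
            if a != "int" or b != "int":
                raise TypeError("iadd needs two ints on the stack")
            stack.append("int")
        elif op == "dup":
            stack.append(stack[-1])
    return stack

def merge_stacks(s1, s2):
    """Join-point merge: keep a type only where both paths agree."""
    return [a if a == b else "top" for a, b in zip(s1, s2)]

final = verify([("iconst",), ("iconst",), ("iadd",)])
merged = merge_stacks(["int", "ref"], ["int", "int"])
print(final, merged)   # ['int'] ['int', 'top']
```

A real verifier iterates this analysis to a fixed point over branches and loops; the merge is what forces the iteration that the paper's algorithm seeks to streamline.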

  6. A simple reliability block diagram method for safety integrity verification

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2007-01-01

    IEC 61508 requires safety integrity verification for safety-related systems as a necessary procedure in the safety life cycle. PFD_avg (the average probability of failure on demand) must be calculated to verify the safety integrity level (SIL). Since IEC 61508-6 does not give detailed explanations of the definitions and PFD_avg calculations for its examples, it is difficult for reliability or safety engineers to understand them when using the standard as guidance in practice. A method using reliability block diagrams is investigated in this study in order to provide a clear and feasible way of calculating PFD_avg and to help those who take IEC 61508-6 as their guidance. The method finds the mean down times (MDTs) of both the channel and the voted group first, and then PFD_avg. The calculated results for various voted groups are compared with those in IEC 61508-6 and in Ref. [Zhang T, Long W, Sato Y. Availability of systems with self-diagnostic components - applying Markov model to IEC 61508-6. Reliab Eng Syst Saf 2003;80(2):133-41]. An interesting outcome can be realized from the comparison. Furthermore, although differences in the MDT of voted groups exist between IEC 61508-6 and this paper, the PFD_avg values of the voted groups are comparatively close. With its detailed description, the RBD method presented can be applied to quantitative SIL verification, showing a similarity to the method in IEC 61508-6
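
For orientation, the widely quoted simplified PFD_avg expressions (ignoring common-cause failures, diagnostics and repair, all of which the full IEC 61508-6 formulas and the paper's RBD method do account for) can be coded directly. The failure rate and proof-test interval below are assumed example values, not figures from the paper.

```python
# Simplified PFD_avg formulas for common voted groups, neglecting
# common-cause (beta) factors, MTTR and diagnostic coverage. A bare
# sketch of the kind of calculation IEC 61508-6 tabulates in full.

def pfd_avg(architecture, lambda_du, proof_test_interval):
    """Average probability of failure on demand for a voted group.

    lambda_du: dangerous undetected failure rate (per hour)
    proof_test_interval: T1 (hours)
    """
    x = lambda_du * proof_test_interval
    if architecture == "1oo1":
        return x / 2
    if architecture == "1oo2":
        return x ** 2 / 3
    if architecture == "2oo3":
        return x ** 2
    raise ValueError(architecture)

lam = 2.5e-6      # dangerous undetected failures per hour (assumed)
t1 = 8760.0       # one-year proof test interval (assumed)

for arch in ("1oo1", "1oo2", "2oo3"):
    print(arch, pfd_avg(arch, lam, t1))
```

With these assumed numbers, the 1oo1 channel lands in the SIL 1 band while the redundant 1oo2 group improves by roughly two orders of magnitude, illustrating why the voted-group MDT analysis matters.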

  7. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses. Addendum

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  8. Experience in verification regimes. United States On-Site Inspection Agency

    International Nuclear Information System (INIS)

    Reppert, J.

    1998-01-01

    The experiences of the United States On-Site Inspection Agency in verification regimes applied all over the world during the last 30 years are described. The challenge for the future is to extend the benefits of the applied tools to all states in all regions, to enhance stability and to create conditions for peace at lower levels of armaments than currently exist. The USA needs to engage states currently caught in cycles of violence and arms escalation. It must examine technologies which, together with on-site aspects of verification or transparency regimes, can provide a comprehensive picture at affordable cost. A growth is foreseen in combined training, with new states entering for the first time into regimes that include arms control and transparency measures.

  9. Formal Verification Method for Configuration of Integrated Modular Avionics System Using MARTE

    Directory of Open Access Journals (Sweden)

    Lisong Wang

    2018-01-01

    Full Text Available The configuration information of an Integrated Modular Avionics (IMA) system includes almost all details of the whole system architecture; it is used to configure the hardware interfaces, the operating system, and the interactions among applications so that an IMA system works correctly and reliably. It is very important to ensure the correctness and integrity of the configuration in the IMA system design phase. In this paper, we focus on modelling and verification of the configuration information of an IMA/ARINC653 system based on MARTE (Modelling and Analysis of Real-Time and Embedded systems). Firstly, we define a semantic mapping from key concepts of the configuration (such as modules, partitions, memory, processes, and communications) to MARTE model elements, and propose a method for model transformation between XML-formatted configuration information and MARTE models. Then we present a formal verification framework for ARINC653 system configuration based on theorem-proving techniques, including the construction of corresponding REAL theorems according to the semantics of the key components of the configuration information, and formal verification of those theorems for properties of the IMA system such as time constraints, spatial isolation, and health monitoring. After that, the special issue of schedulability analysis of an ARINC653 system is studied. We design a hierarchical scheduling strategy that takes the characteristics of the ARINC653 system into account, and the scheduling analyzer MAST-2 is used to implement the hierarchical schedule analysis. Lastly, we design a prototype tool, called Configuration Checker for ARINC653 (CC653); two case studies show that the methods proposed in this paper are feasible and efficient.
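
    To make one of the verified properties concrete, a spatial-isolation check amounts to proving that no two partitions' memory windows overlap. A minimal sketch of such a check on a toy, invented XML schema (a real ARINC 653 configuration is far richer; all element and attribute names below are assumptions for illustration):

```python
# Hypothetical sketch: checking spatial isolation on a toy XML
# configuration. Element/attribute names are invented, not ARINC 653.
import xml.etree.ElementTree as ET

CONFIG = """
<module>
  <partition name="P1" memBase="0x1000" memSize="0x1000"/>
  <partition name="P2" memBase="0x2000" memSize="0x0800"/>
  <partition name="P3" memBase="0x2400" memSize="0x0400"/>
</module>
"""

def partitions(xml_text):
    """Yield (name, start, end) memory windows from the toy config."""
    root = ET.fromstring(xml_text)
    for p in root.findall("partition"):
        base = int(p.get("memBase"), 16)
        size = int(p.get("memSize"), 16)
        yield p.get("name"), base, base + size

def overlapping_pairs(xml_text):
    """Return pairs of partitions whose memory windows overlap."""
    parts = list(partitions(xml_text))
    return [(a, b) for i, (a, sa, ea) in enumerate(parts)
            for (b, sb, eb) in parts[i + 1:]
            if sa < eb and sb < ea]

# P2 occupies [0x2000, 0x2800) and P3 [0x2400, 0x2800): isolation violated.
print(overlapping_pairs(CONFIG))
```

    A theorem-proving framework such as the one described would establish this property symbolically rather than by enumeration, but the property itself is the same pairwise-disjointness condition.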

  10. Raman laser spectrometer optical head: qualification model assembly and integration verification

    Science.gov (United States)

    Ramos, G.; Sanz-Palomino, M.; Moral, A. G.; Canora, C. P.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Santiago, A.; Gordillo, C.; Escribano, D.; Lopez-Reyes, G.; Rull, F.

    2017-08-01

    The Raman Laser Spectrometer (RLS) is the Pasteur Payload instrument of the ExoMars mission, within ESA's Aurora Exploration Programme, that will perform Raman spectroscopy for the first time on a planetary exploration mission. RLS is composed of the SPU (Spectrometer Unit), iOH (Internal Optical Head), and ICEU (Instrument Control and Excitation Unit). The iOH focuses the excitation laser on the samples (excitation path) and collects the Raman emission from the sample (collection path, composed of a collimation system and a filtering system). Its original design let a high level of laser trace reach the detector; although a certain level of laser trace was required for calibration purposes, this high level degraded the signal-to-noise ratio and confounded some Raman peaks. So, after the breadboard campaign, some light design modifications were implemented in order to fix the desired amount of laser trace, and after fabrication and procurement of the commercial elements, the assembly and integration verification process was carried out. The iOH design update for the engineering and qualification model (iOH EQM) as well as the assembly process are briefly described in this paper. In addition, the results of the integration verification and of the first functional tests, carried out with the RLS calibration target (CT), are reported.

  11. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification, but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  12. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  13. Verification of the Korsar code on results of experiments executed on the PSB-VVER facility

    International Nuclear Information System (INIS)

    Roginskaya, V.L.; Pylev, S.S.; Elkin, I.V.

    2005-01-01

    Full text of publication follows: This paper presents some results of computational research carried out within the framework of the verification of the KORSAR thermal-hydraulic code. The code was designed at the A.P. Aleksandrov NITI (Russia). The general purpose of the work was the development of a nodalization scheme of the PSB-VVER integral facility, scheme testing, and computational modelling of the experiment 'The PSB-VVER Natural Circulation Test With Stepwise Reduction of the Primary Inventory'. The NC test was performed within the framework of the OECD PSB-VVER Project (task no. 3). This project is focused upon the provision of experimental data for code assessment with regard to VVER analysis. The paper presents the nodalization scheme of the PSB-VVER facility and the results of pre- and post-test calculations of the specified experiment, obtained with the KORSAR code. The experiment data and the KORSAR pre-test calculation results are in good agreement. A post-test calculation of the experiment with the KORSAR code was performed in order to assess the code's capability to simulate the phenomena relevant to the test. The code showed a reasonable prediction of the phenomena measured in the experiment. (authors)

  14. Verification of the Korsar code on results of experiments executed on the PSB-VVER facility

    Energy Technology Data Exchange (ETDEWEB)

    Roginskaya, V.L.; Pylev, S.S.; Elkin, I.V. [NSI RRC ' Kurchatov Institute' , Kurchatov Sq., 1, Moscow, 123182 (Russian Federation)

    2005-07-01

    Full text of publication follows: This paper presents some results of computational research carried out within the framework of the verification of the KORSAR thermal-hydraulic code. The code was designed at the A.P. Aleksandrov NITI (Russia). The general purpose of the work was the development of a nodalization scheme of the PSB-VVER integral facility, scheme testing, and computational modelling of the experiment 'The PSB-VVER Natural Circulation Test With Stepwise Reduction of the Primary Inventory'. The NC test was performed within the framework of the OECD PSB-VVER Project (task no. 3). This project is focused upon the provision of experimental data for code assessment with regard to VVER analysis. The paper presents the nodalization scheme of the PSB-VVER facility and the results of pre- and post-test calculations of the specified experiment, obtained with the KORSAR code. The experiment data and the KORSAR pre-test calculation results are in good agreement. A post-test calculation of the experiment with the KORSAR code was performed in order to assess the code's capability to simulate the phenomena relevant to the test. The code showed a reasonable prediction of the phenomena measured in the experiment. (authors)

  15. Verification of Monte Carlo transport codes by activation experiments

    OpenAIRE

    Chetvertkova, Vera

    2013-01-01

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of excessive activation of accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago, therefore verification is needed to ensure that they give reasonable results. The present work is...

  16. The Mailbox Computer System for the IAEA verification experiment on HEU downblending at the Portsmouth Gaseous Diffusion Plant

    International Nuclear Information System (INIS)

    Aronson, A.L.; Gordon, D.M.

    2000-01-01

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a ''verification experiment'' at the plant with respect to downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of uranium hexafluoride (UF6). During the period December 1997 through July 1998, the IAEA carried out the requested verification experiment. The verification approach used for this experiment included, among other measures, the entry of process-operational data by the facility operator on a near-real-time basis into a ''mailbox'' computer located within a tamper-indicating enclosure sealed by the IAEA.
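
    The core idea of such a mailbox — operator declarations accumulated in a way that later edits become detectable — can be conveyed by a hash-chained, append-only log. This is an illustrative sketch only; the actual IAEA system's design is not described in the abstract, and every field name below is an assumption:

```python
# Illustrative sketch of a tamper-evident ''mailbox'' log: each entry's
# digest chains over the previous digest, so a retroactive edit breaks
# every subsequent digest. Not the actual IAEA/BNL implementation.
import hashlib
import json

def append_entry(log, declaration):
    """Append an operator declaration, chaining its hash to the log."""
    prev = log[-1]["digest"] if log else "0" * 64
    body = json.dumps(declaration, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    log.append({"declaration": declaration, "digest": digest})

def verify_chain(log):
    """Recompute the chain; any edited entry invalidates it."""
    prev = "0" * 64
    for entry in log:
        body = json.dumps(entry["declaration"], sort_keys=True)
        if hashlib.sha256((prev + body).encode()).hexdigest() != entry["digest"]:
            return False
        prev = entry["digest"]
    return True

log = []
append_entry(log, {"cylinder": "A-101", "assay_pct": 92.6, "ts": "1998-02-01T08:00"})
append_entry(log, {"cylinder": "A-102", "assay_pct": 92.1, "ts": "1998-02-01T09:30"})
print(verify_chain(log))                   # True
log[0]["declaration"]["assay_pct"] = 4.4   # retroactive edit...
print(verify_chain(log))                   # False: tampering detected
```

    In the real system the tamper-indicating enclosure and IAEA seal play the role that the hash chain plays here: declarations, once deposited, cannot be silently revised after the fact.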

  17. Data-driven property verification of grey-box systems by Bayesian experiment design

    NARCIS (Netherlands)

    Haesaert, S.; Van den Hof, P.M.J.; Abate, A.

    2015-01-01

    A measurement-based statistical verification approach is developed for systems with partly unknown dynamics. These grey-box systems are subject to identification experiments which, new in this contribution, enable accepting or rejecting system properties expressed in a linear-time logic. We employ a

  18. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by a developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces UVM briefly and presents a set of tips and advice applicable at different stages of the verification process cycle.
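
    The CDV loop described above — constrained-random stimulus, a self-checking scoreboard against a reference model, and coverage bins that reveal unexercised functionality — can be mirrored in a toy form. UVM itself is a SystemVerilog methodology; this plain-Python sketch only illustrates the flow, with an invented 8-bit saturating adder standing in for the DUT:

```python
# Conceptual CDV sketch (not UVM/SystemVerilog): random stimulus ->
# DUT -> scoreboard check against a golden model + coverage bins.
import random

def dut_saturating_add(a, b):
    """Toy 'device under test': an 8-bit saturating adder."""
    return min(a + b, 255)

def reference_model(a, b):
    """Golden model used by the scoreboard."""
    return min(a + b, 255)

coverage = {"no_overflow": 0, "overflow": 0, "max_operand": 0}
mismatches = 0
rng = random.Random(1234)              # seeded for reproducibility

for _ in range(1000):
    a, b = rng.randint(0, 255), rng.randint(0, 255)   # constrained-random stimulus
    if dut_saturating_add(a, b) != reference_model(a, b):
        mismatches += 1                # scoreboard mismatch
    coverage["overflow" if a + b > 255 else "no_overflow"] += 1
    if 255 in (a, b):
        coverage["max_operand"] += 1   # corner-case bin

holes = [bin_ for bin_, hits in coverage.items() if hits == 0]
print("mismatches:", mismatches, "coverage holes:", holes)
```

    Any bin left at zero hits marks functionality the random stimulus never exercised, which is exactly the signal the coverage monitors in the paper's environments provide.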

  19. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, Vicky L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bacon, Diana H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fang, Yilin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  20. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking for a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. The book outlines both theoretical and practical issues.

  1. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V) process for the safety software of the Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents the DRPS V and V experience according to the software development life cycle. The main activities of the DRPS V and V process are preparation of software planning documentation; verification of the Software Requirement Specification (SRS), Software Design Specification (SDS) and codes; and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. The SRS V and V activities of the DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated system test plan, software safety analysis, and software configuration management. Likewise, the SDS V and V activities of the DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated software test plan, software safety analysis, and software configuration management. The code V and V activities of the DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of the software integration and system integration phases. Software safety analysis uses the Hazard and Operability (HAZOP) method at the SRS phase, HAZOP and Fault Tree Analysis (FTA) at the SDS phase, and FTA at the implementation phase. Finally, software configuration management is performed using the Nu-SCM (Nuclear Software Configuration Management) tool developed by the KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V

  2. Verification of Monte Carlo transport codes by activation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera

    2012-12-18

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of excessive activation of accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago, therefore verification is needed to ensure that they give reasonable results. The present work is focused on obtaining experimental data on the activation of targets by heavy-ion beams. Several experiments were performed at GSI Helmholtzzentrum fuer Schwerionenforschung. The interaction of nitrogen, argon and uranium beams with aluminum targets, as well as the interaction of nitrogen and argon beams with copper targets, was studied. After the irradiation of the targets by different ion beams from the SIS18 synchrotron at GSI, γ-spectroscopy analysis was done: the γ-spectra of the residual activity were measured, the radioactive nuclides were identified, and their amounts and depth distributions were determined. The obtained experimental results were compared with the results of Monte Carlo simulations using FLUKA, MARS and SHIELD. The discrepancies and agreements between experiment and simulations are pointed out, and the origin of the discrepancies is discussed. The obtained results allow for better verification of the Monte Carlo transport codes and also provide information for their further development. The necessity of activation studies for accelerator applications is discussed. The limits of applicability of the heavy-ion beam-loss criteria were studied using the FLUKA code. FLUKA simulations were done to determine the materials most preferable, from the radiation protection point of view, for use in accelerator components.
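
    One building block of such nuclide identification is the exponential decay law, N(t) = N0·exp(-λt): the half-life inferred from how a γ-line's count rate falls between two measurements can be matched against known nuclides. A minimal sketch with synthetic numbers (the nuclide match in the comment is only an example, not data from the thesis):

```python
# Simple illustration (not the thesis' analysis chain): estimating a
# half-life from the decay of a gamma-line count rate between two
# measurements, using N(t) = N0 * exp(-lambda * t).
import math

def half_life_from_rates(rate1, rate2, dt_hours):
    """Half-life implied by two count rates separated by dt_hours."""
    decay_const = math.log(rate1 / rate2) / dt_hours
    return math.log(2) / decay_const

# Synthetic data: a rate falling from 1000 to 250 counts/s over 30 h is
# two half-lives, i.e. T1/2 = 15 h (close to, e.g., Na-24 at 14.96 h).
print(round(half_life_from_rates(1000.0, 250.0, 30.0), 6))  # 15.0
```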

  3. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío

    2014-10-01

    Full Text Available XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define which tool to embed in XJML to execute FSPV.

  4. DABIE: a data banking system of integral experiments for reactor core characteristics computer codes

    International Nuclear Information System (INIS)

    Matsumoto, Kiyoshi; Naito, Yoshitaka; Ohkubo, Shuji; Aoyanagi, Hideo.

    1987-05-01

    A data banking system of integral experiments for reactor core characteristics computer codes, DABIE, has been developed to lighten the burden of searching through many documents to obtain the experiment data required for verification of reactor core characteristics computer codes. This data banking system, DABIE, has capabilities for the systematic classification, registration and easy retrieval of experiment data. DABIE consists of a data bank and supporting programs; the supporting programs are a data registration program, a data reference program and a maintenance program. The system is designed so that a user can easily register information on experiment systems, including figures as well as geometry data and measured data, or obtain those data interactively through a TSS terminal. This manual describes the system structure, usage and sample uses of this code system. (author)
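
    The registration-and-retrieval idea at the heart of such a data bank can be sketched in a few lines. The field and facility names below are invented for illustration; DABIE itself was a mainframe/TSS system with its own record format:

```python
# Toy sketch of a data bank for integral-experiment records:
# systematic registration plus keyword retrieval. All names invented.
records = []

def register(name, facility, keywords, data):
    """Register one experiment record in the bank."""
    records.append({"name": name, "facility": facility,
                    "keywords": set(keywords), "data": data})

def retrieve(*keywords):
    """Return names of records matching ALL requested keywords."""
    wanted = set(keywords)
    return [r["name"] for r in records if wanted <= r["keywords"]]

register("TCA-keff-01", "TCA", ["criticality", "light-water", "UO2"],
         {"k_eff": 1.0002})
register("FCA-void-03", "FCA", ["fast-reactor", "sodium-void"],
         {"reactivity_pcm": -120})
print(retrieve("criticality", "UO2"))   # ['TCA-keff-01']
```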

  5. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    Energy Technology Data Exchange (ETDEWEB)

    Bravenec, Ronald [Fourth State Research, Austin, TX (United States)

    2017-11-14

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  6. CTBT verification-related technologies for peaceful purposes: the French experiences of international cooperation

    International Nuclear Information System (INIS)

    Massinon, B.

    1999-01-01

    The French experience concerning CTBT verification-related technologies for peaceful purposes, as well as the international cooperation in this field, is presented. Possible objectives and cooperation program needs are cited. French experience in international cooperation relates to seismology and seismic hazards in Bolivia, Madagascar, Nepal and Indonesia and is considered very constructive: technical experience is developed, consistent scientific results are obtained, and a considerable contribution to the CTBT task is achieved. Large scientific benefits are expected from the CTBTO.

  7. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    International Nuclear Information System (INIS)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-01-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. Odometry is widely used as a method based on internal sensors, but it suffers from cumulative errors. In methods using a laser range sensor (LRS), which is a kind of external sensor, the estimation accuracy is affected by the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization, integrating the LRS measurement data and the odometry information, with their relative weightings balanced adaptively according to the number of available LRS measurement data. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10-scale vehicle. The verification is conducted in situations where the vehicle position cannot be localized uniquely in a certain direction using the LRS measurement data only. We achieve accurate localization even in such a situation by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF). (paper)
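
    The flavor of this fusion can be shown in one dimension (this is a minimal sketch, not the authors' formulation): over a horizon, a least-squares cost balances odometry increments against position fixes that are only sometimes available, and the quadratic problem is solved here by simple Gauss-Seidel sweeps:

```python
# Minimal 1-D MHE-style illustration (not the paper's method): minimize
#   sum_k w_z*(x_k - z_k)^2          where an LRS-like fix z_k exists
# + sum_k w_u*(x_{k+1} - x_k - u_k)^2   over odometry increments u_k,
# via Gauss-Seidel sweeps on the normal equations.

def mhe_1d(u, z, w_u=1.0, w_z=100.0, sweeps=500):
    n = len(u) + 1                       # number of states in the horizon
    x = [0.0] * n
    for _ in range(sweeps):
        for k in range(n):
            num, den = 0.0, 0.0
            if z[k] is not None:         # measurement term
                num += w_z * z[k]; den += w_z
            if k > 0:                    # odometry link to predecessor
                num += w_u * (x[k - 1] + u[k - 1]); den += w_u
            if k < n - 1:                # odometry link to successor
                num += w_u * (x[k + 1] - u[k]); den += w_u
            x[k] = num / den
    return x

# Truth: unit steps, x_k = k. Odometry drifts (+0.05 per step); position
# fixes exist only at k = 0, 5, 10 (sparse LRS availability).
u = [1.05] * 10
z = [0.0, None, None, None, None, 5.0, None, None, None, None, 10.0]
x = mhe_1d(u, z)
dead_reckoning = sum(u)                  # odometry alone drifts to ~10.5
print(round(x[10], 2), round(dead_reckoning, 2))
```

    Between fixes the estimate follows the odometry, while each fix pulls the trajectory back toward truth; odometry alone would end 0.5 off, whereas the fused estimate ends within a few hundredths. This is the intuition behind weighting the two sources by how many LRS data are available.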

  8. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    International Nuclear Information System (INIS)

    Magazzù, G; Borgese, G; Costantino, N; Fanucci, L; Saponara, S; Incandela, J

    2013-01-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After a description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setup of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performances when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.
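
    What the TX and RX interfaces of any such protocol must agree on can be shown with a toy framing scheme. The actual FF-LYNX frame format is not described in the abstract; the start-of-frame marker, length field and CRC-8 polynomial below are all arbitrary illustrative choices:

```python
# Toy framing illustration only (not the FF-LYNX format): a frame is
# SOF | length | payload | CRC-8, and the receiver rejects corruption.

CRC8_POLY = 0x07  # x^8 + x^2 + x + 1 (CRC-8/ATM); an arbitrary choice here

def crc8(data):
    """Bitwise CRC-8 over a byte sequence."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ CRC8_POLY) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def tx_frame(payload):
    """Encoder side: frame = SOF | length | payload | CRC."""
    body = bytes([len(payload)]) + payload
    return b"\xA5" + body + bytes([crc8(body)])

def rx_frame(frame):
    """Decoder side: return the payload, or None on any corruption."""
    if len(frame) < 3 or frame[0] != 0xA5:
        return None
    body, crc = frame[1:-1], frame[-1]
    if crc8(body) != crc or body[0] != len(body) - 1:
        return None
    return bytes(body[1:])

data = b"\x01\x02\x03"
frame = bytearray(tx_frame(data))
assert rx_frame(bytes(frame)) == data   # clean link: payload recovered
frame[2] ^= 0x10                        # single bit flip on the link
assert rx_frame(bytes(frame)) is None   # corruption caught by the CRC
```

    A SystemC ISE like the one described lets such framing choices (marker, length, checksum strength) be explored and validated at high level before committing to VHDL IP cores.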

  9. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    Science.gov (United States)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After a description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setup of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performances when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  10. The verification basis of the PM-ALPHA code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Angelini, S. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    An overall verification approach for the PM-ALPHA code is presented and implemented. The approach consists of a stepwise testing procedure focused principally on the multifield aspects of the premixing phenomenon. Breakup is treated empirically, but it is shown that, through reasonable choices of the breakup parameters, consistent interpretations of existing integral premixing experiments can be obtained. The present capability is deemed adequate for bounding energetics evaluations. (author)

  11. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and their changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core that traditional physics testing programs require. This program also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  12. Sealing of process valves for the HEU downblending verification experiment at Portsmouth

    International Nuclear Information System (INIS)

    Baldwin, G.T.; Bartberger, J.C.; Jenkins, C.D.; Perlinski, A.W.; Schoeneman, J.L.; Gordon, D.M.; Whiting, N.E.; Bonner, T.N.; Castle, J.M.

    1998-01-01

    At the Portsmouth Gaseous Diffusion Plant in Piketon, Ohio, USA, excess inventory of highly enriched uranium (HEU) from US defense programs is being diluted to low-enriched uranium (LEU) for commercial use. The conversion is subject to a Verification Experiment overseen by the International Atomic Energy Agency (IAEA). The Verification Experiment is making use of monitoring technologies developed and installed by several DOE laboratories. One of the measures is a system for sealing valves in the process piping, which secures the path followed by uranium hexafluoride gas (UF6) from cylinders at the feed stations to the blend point, where the HEU is diluted with LEU. The Authenticated Item Monitoring System (AIMS) was the alternative proposed by Sandia National Laboratories that was selected by the IAEA. Approximately 30 valves were sealed by the IAEA using AIMS fiber-optic seals (AFOS). The seals employ single-core plastic fiber rated to 125 °C to withstand the high-temperature conditions of the heated piping enclosures at Portsmouth. Each AFOS broadcasts authenticated seal-status and state-of-health messages via a tamper-protected radio-frequency transmitter mounted outside the heated enclosure. The messages are received by two collection stations, operated redundantly.

  13. US monitoring and verification technology: on-site inspection experience and future challenges

    International Nuclear Information System (INIS)

    Gullickson, R.L.; Carlson, D.; Ingraham, J.; Laird, B.

    2013-01-01

    The United States has a long and successful history of cooperation with treaty partners in monitoring and verification. For strategic arms reduction treaties, our collaboration has resulted in the development and application of systems with limited complexity and intrusiveness. As we progress beyond New START (NST) along the 'road to zero', the reduced number of nuclear weapons is likely to require increased confidence in monitoring and verification techniques. This may place increased demands on the technology to verify the presence of a nuclear weapon and even confirm the presence of a certain type. Simultaneously, this technology must include the ability to protect each treaty partner's sensitive nuclear weapons information. Mutual development of this technology by treaty partners offers the best approach for acceptance in treaty negotiations. This same approach of mutual cooperation and development is essential for developing nuclear test monitoring technology in support of the Comprehensive Nuclear Test Ban Treaty (CTBT). Our ability to detect low yield and evasive testing will be enhanced through mutually developed techniques and experiments using laboratory laser experiments and high explosives tests in a variety of locations and geologies. (authors)

  14. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The ever-increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
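The translation idea, a structural HDL description becoming a conjunction of relational predicates in the HOL style, can be illustrated with a drastically simplified sketch. The netlist encoding, gate set, and output syntax below are invented for illustration and are not the paper's actual translator.

```python
# Toy illustration: convert a simple gate-level netlist into relational
# predicates in the style used by HOL hardware verification, where each
# gate instance becomes one logical constraint on its port signals.
GATE_RELATIONS = {
    "AND": lambda ins, out: f"{out} = ({ins[0]} /\\ {ins[1]})",
    "OR":  lambda ins, out: f"{out} = ({ins[0]} \\/ {ins[1]})",
    "NOT": lambda ins, out: f"{out} = ~{ins[0]}",
}

def netlist_to_hol(netlist):
    """Each gate (type, inputs, output) becomes one conjunct of the
    structural specification; the circuit is their conjunction."""
    conjuncts = [GATE_RELATIONS[g](ins, out) for g, ins, out in netlist]
    return " /\\ ".join(f"({c})" for c in conjuncts)

# Half adder: carry = a AND b; sum = (a OR b) AND NOT carry
netlist = [
    ("AND", ["a", "b"], "carry"),
    ("NOT", ["carry"], "n1"),
    ("OR",  ["a", "b"], "o1"),
    ("AND", ["o1", "n1"], "sum"),
]
hol = netlist_to_hol(netlist)
print(hol)
```

Verification then amounts to proving that this conjunction implies the behavioral specification (here, that `sum` and `carry` compute binary addition of `a` and `b`).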

  15. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure involves many inherent misconceptions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the statement and checking of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified, and possible improvements proposed.

  16. Verification test for radiation reduction effect and material integrity on PWR primary system by zinc injection

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, H.; Nagata, T.; Yamada, M. [Nuclear Power Engineering Corp. (Japan); Kasahara, K.; Tsuruta, T.; Nishimura, T. [Mitsubishi Heavy Industries, Ltd. (Japan); Ishigure, K. [Saitama Inst. of Tech. (Japan)

    2002-07-01

    Zinc injection is known to be an effective method for reducing the radiation source in the primary water system of a PWR. There is a need to verify the effect of Zn injection operation on radiation source reduction and on the materials integrity of the PWR primary circuit. In order to confirm the effectiveness of Zn injection, a verification test, a national program sponsored by the Ministry of Economy, Trade and Industry (METI), was started in 1995 as a 7-year program, to be finished by the end of March 2002. This program consists of an irradiation test and a material integrity test. The irradiation test, an in-pile test managed by AEAT plc (UK), was performed using the LVR-15 reactor of NRI Rez in the Czech Republic. Furthermore, an out-of-pile test using a film adding unit was also performed to obtain supplemental data for the in-pile test at the Takasago Engineering Laboratory of NUPEC. The material integrity test was planned to perform constant load tests, constant strain tests, and corrosion tests at the same time using a large-scale loop and slow strain rate testing (SSRT) at the Takasago Engineering Laboratory of NUPEC. In this paper, the present results of the verification test for the zinc program are discussed. (authors)

  17. Automated radiotherapy treatment plan integrity verification

    Energy Technology Data Exchange (ETDEWEB)

    Yang Deshan; Moore, Kevin L. [Department of Radiation Oncology, School of Medicine, Washington University in Saint Louis, St. Louis, Missouri 63110 (United States)

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.

  18. Automated radiotherapy treatment plan integrity verification

    International Nuclear Information System (INIS)

    Yang Deshan; Moore, Kevin L.

    2012-01-01

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
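The rule-driven core of a plan integrity checker like the one described in the two records above (plan fields evaluated against predefined logical rules, with findings collected into a report) can be sketched generically. This is not the Pinnacle/Perl implementation the abstracts describe; the field names, limits, and rules below are placeholders.

```python
# Illustrative sketch of a rule-based treatment plan QC check. Each rule is
# (plan field, predicate that must hold, message emitted on violation).
RULES = [
    ("dose_per_fraction_Gy", lambda v: 0 < v <= 20, "unusual dose per fraction"),
    ("num_fractions", lambda v: 1 <= v <= 45, "fraction count out of range"),
    ("couch_removed", lambda v: v is True, "couch not removed from CT"),
]

def check_plan(plan):
    """Return a list of human-readable findings for rule violations."""
    findings = []
    for field, ok, message in RULES:
        if not ok(plan.get(field)):
            findings.append(f"{field}: {message}")
    return findings

plan = {"dose_per_fraction_Gy": 2.0, "num_fractions": 30, "couch_removed": False}
findings = check_plan(plan)
print(findings)  # only the couch rule fails for this plan
```

A production tool would render the findings list into an HTML report, as the abstracts describe, and pull the plan fields from the TPS at run time rather than from a dict.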

  19. Integration of SPICE with TEK LV500 ASIC Design Verification System

    Directory of Open Access Journals (Sweden)

    A. Srivastava

    1996-01-01

    Full Text Available The present work involves integration of the simulation stage of design of a VLSI circuit and its testing stage. The SPICE simulator, the TEK LV500 ASIC Design Verification System, and TekWaves, a test program generator for the LV500, were integrated. A software interface in the 'C' language in the UNIX 'Solaris 1.x' environment has been developed between SPICE and the testing tools (TekWaves and LV500). The function of the software interface developed is multifold. It takes input from either SPICE 2G.6 or SPICE 3e.1. The output generated by the interface software can be given as an input to either TekWaves or the LV500. A graphical user interface has also been developed with OpenWindows using the XView toolkit on a Sun workstation. As an example, a two-phase clock generator circuit has been considered and the usefulness of the software demonstrated. The interface software could be easily linked with VLSI design tools such as the MAGIC layout editor.
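One core task of such an interface, turning analog SPICE transient output into the logic-level test vectors a tester consumes, might be sketched as follows. The sampling scheme, threshold, and names are illustrative assumptions, not details from the paper.

```python
import bisect

# Hypothetical sketch: sample a SPICE transient waveform at the tester's
# clock period and threshold each sample into a logic level that a test
# program generator could consume.
def waveform_to_vectors(samples, period, vdd=5.0):
    """samples: list of (time, voltage) pairs in time order. Returns one
    logic bit per period, using the sample closest after each boundary."""
    times = [t for t, _ in samples]
    t_end = times[-1]
    bits = []
    t = period
    while t <= t_end:
        i = min(bisect.bisect_left(times, t), len(times) - 1)
        bits.append(1 if samples[i][1] > vdd / 2 else 0)
        t += period
    return bits

samples = [(0.0, 0.1), (1.0, 4.9), (2.0, 4.8), (3.0, 0.2), (4.0, 4.9)]
bits = waveform_to_vectors(samples, period=1.0)
print(bits)  # [1, 1, 0, 1]
```

A real bridge would also have to emit the vectors in the tester's input format and handle undefined levels near the switching threshold.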

  20. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
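The flavor of fault-tolerant clock synchronization discussed above can be illustrated with the fault-tolerant averaging step used in Welch-Lynch-style algorithms. This is a generic sketch, not necessarily the exact algorithm verified in the paper: discarding the f highest and f lowest readings ensures that up to f Byzantine clocks cannot drag the average outside the range spanned by the good clocks.

```python
# Fault-tolerant averaging: with n > 2f readings, trim the f extremes at
# each end so that f arbitrarily faulty values cannot corrupt the result.
def fault_tolerant_average(readings, f):
    assert len(readings) > 2 * f, "need more than 2f readings"
    trimmed = sorted(readings)[f:len(readings) - f]
    return sum(trimmed) / len(trimmed)

# Four clock readings, one faulty (the wild 990.0 value) -> tolerate f = 1
readings = [1000.2, 999.8, 1000.1, 990.0]
corrected = fault_tolerant_average(readings, f=1)
print(corrected)
```

The formal verification effort described in the record is about proving bounds on exactly this kind of step: that after each resynchronization round, the skew between correct clocks stays within a provable envelope despite Byzantine faults.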

  1. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information such as medical and financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  2. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information such as medical and financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
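The trade-off the two records above evaluate (instruction counts per step, plus cryptographic cost for data crossing the card/reader boundary) can be captured in a back-of-the-envelope model. All numbers below are made-up placeholders, not measurements from the paper.

```python
# Sketch of the partitioning trade-off: total time = card-side compute +
# reader-side compute + cost of encrypting the data that must cross the
# card/reader boundary. MIPS and byte counts are illustrative only.
def scenario_time(card_mips, reader_mips, card_mi, reader_mi,
                  bytes_transferred, crypto_us_per_byte):
    card_s = card_mi / card_mips          # million instructions / MIPS
    reader_s = reader_mi / reader_mips
    crypto_s = bytes_transferred * crypto_us_per_byte * 1e-6
    return card_s + reader_s + crypto_s

# Scenario A: match entirely on the card (no fingerprint data leaves it).
a = scenario_time(5, 200, card_mi=40, reader_mi=0,
                  bytes_transferred=0, crypto_us_per_byte=2.0)
# Scenario B: feature extraction on the reader, matching on the card
# (extracted features are encrypted before crossing to the card).
b = scenario_time(5, 200, card_mi=5, reader_mi=35,
                  bytes_transferred=512, crypto_us_per_byte=2.0)
print(round(a, 3), round(b, 3))
```

The model makes the paper's point visible: offloading work to the faster reader wins on time, but at the price of moving sensitive fingerprint data off the card, which is exactly the security-versus-latency tension the authors analyze.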

  3. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Full Text Available Testing is a de facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness, and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.
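The essence of property-based slicing, keeping only the statements that can affect the variables a property mentions, can be shown on a toy straight-line program. The def/use representation and the single backward pass below are a simplification invented for illustration, not the paper's analysis.

```python
# Simplified property-based slicing: compute a backward closure over each
# statement's defined/used variables, starting from the property's variables.
def slice_program(stmts, property_vars):
    """stmts: list of (defined_var, used_vars) in program order. Returns the
    indices of statements kept in the slice, in program order."""
    relevant = set(property_vars)
    keep = []
    for i in range(len(stmts) - 1, -1, -1):  # backward pass
        d, uses = stmts[i]
        if d in relevant:
            keep.append(i)
            relevant |= set(uses)
    return sorted(keep)

stmts = [
    ("a", []),        # 0: a = input()
    ("b", ["a"]),     # 1: b = f(a)
    ("c", []),        # 2: c = 0   (irrelevant to the property)
    ("x", ["b"]),     # 3: x = g(b)
]
kept = slice_program(stmts, {"x"})
print(kept)  # statement 2 is sliced away
```

Shrinking the program this way is what makes model checking tractable: the model checker only has to explore the state space of the statements that can influence the property.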

  4. Three-dimensional thermal hydraulic best estimate code BAGIRA: new results of verification

    International Nuclear Information System (INIS)

    Peter Kohut; Sergey D Kalinichenko; Alexander E Kroshilin; Vladimir E Kroshilin; Alexander V Smirnov

    2005-01-01

    Full text of publication follows: BAGIRA is a three-dimensional, inhomogeneous, two-velocity, two-temperature best-estimate thermal hydraulic code, developed at VNIIAES for modeling two-phase flows in the primary circuit and steam generators of VVER-type nuclear reactors under various accident, transient, or normal operation conditions. In this talk we present verification results for the BAGIRA code, obtained on the basis of different experiments performed on special and integral thermal hydraulic experimental facilities as well as on real NPPs. Special attention is paid to the verification of three-dimensional flow models. In addition, we present new results of the code benchmark analysis made on the basis of two recent LOCA-type experiments - 'Leak 2 x 25% from the hot leg double-side rupture' and 'Leak 3% from the cold leg' - performed on the PSB-VVER integral test facility (Electrogorsk Research and Engineering Center, Electrogorsk, Russia), the most up-to-date Russian large-scale four-loop unit, which has been designed for modeling the primary circuit of VVER-1000 type reactors. (authors)

  5. Design, Development, and Automated Verification of an Integrity-Protected Hypervisor

    Science.gov (United States)

    2012-07-16

    also require considerable manual effort. For example, the verification of the seL4 operating system [45] required several person-years of effort. In...Winwood. seL4: formal verification of an OS kernel. In Proc. of SOSP, 2009. [46] K. Kortchinsky. Cloudburst: A VMware guest to host escape story

  6. Integrated safeguards: Australian views and experience

    International Nuclear Information System (INIS)

    Carlson, J.; Bragin, V.; Leslie, R.

    2001-01-01

    Full text: Australia has had a pioneering role in assisting the IAEA to develop the procedures and methods for strengthened safeguards, both before and after the conclusion of Australia's additional protocol. Australia played a key role in the negotiation of the model additional protocol, and made ratification a high priority in order to encourage early ratification by other States. Australia was the first State to ratify an additional protocol, on 10 December 1997, and was the first State in which the IAEA exercised complementary access and managed access under an additional protocol. Australia has undergone three full cycles of evaluation under strengthened safeguards measures, enabling the Agency to conclude it was appropriate to commence implementation of integrated safeguards. In January 2001 Australia became the first State in which integrated safeguards are being applied. As such, Australia's experience will be of interest to other States as they consult with the IAEA on the modalities for the introduction of integrated safeguards in their jurisdictions. The purpose of the paper is to outline Australia's experience with strengthened safeguards and Australia's views on the implementation of integrated safeguards. Australia has five Material Balance Areas (MBAs), the principal one covering the 10 MWt research reactor at Lucas Heights and the associated inventory of fresh and irradiated HEU fuel. Under classical safeguards, generally Australia was subject to annual Physical Inventory Verifications (PIVs) for the four MBAs at Lucas Heights, plus quarterly interim inspections, making a total of four inspections a year (PIVs for the different MBAs were conducted concurrently with each other or with interim inspections in other MBAs), although there was a period when the fresh fuel inventory exceeded one SQ, requiring monthly inspections. Under strengthened safeguards, this pattern of four inspections a year was maintained, with the addition of complementary

  7. Development of a data bank system for LWR integral experiment

    International Nuclear Information System (INIS)

    Naito, Yoshitaka; Aoyagi, Hideo

    1983-01-01

    A data bank system for LWR integral experiments has been developed for the purpose of alleviating the various efforts associated with the verification of computer codes. The final aim of this system is that the input data for the code to be verified can be easily obtained, and the results of a calculation can be obtained in the form of a comparison with measurement. Geometry and material composition as well as measured data are stored in the data bank. This data bank system is composed of four sub-programs: (1) a registration program, (2) an information retrieval program, (3) a maintenance program, and (4) a figure representation program. In this report, the structure of this data bank system and how to use the system are explained. An example of the use of this system is also included. (Aoki, K.)

  8. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands a thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis, and (4) projection. It is shown that verification arrangements differ primarily in how effectively, and by whom, these four stages are carried out

  9. U.S. integral and benchmark experiments

    International Nuclear Information System (INIS)

    Maienschein, F.C.

    1978-01-01

    Verification of methods for analysis of radiation-transport (shielding) problems in Liquid-Metal Fast Breeder Reactors has required a series of experiments that can be classified as benchmark, parametric, or design-confirmation experiments. These experiments, performed at the Oak Ridge Tower Shielding Facility, have included measurements of neutron transport in bulk shields of sodium, steel, and inconel and in configurations that simulate lower axial shields, pipe chases, and top-head shields. They have also included measurements of the effects of fuel stored within the reactor vessel and of gamma-ray energy deposition (heating). The paper consists of brief comments on these experiments, and also on a recent experiment in which neutron streaming problems in a Gas-Cooled Fast Breeder Reactor were studied. The need for additional experiments for a few areas of LMFBR shielding is also cited

  10. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase 1) and ISMS implementation (Phase 2) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high-risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  11. Experiences in Building Python Automation Framework for Verification and Data Collections

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available This paper describes our experiences in building a Python automation framework. Specifically, the automation framework is used to support verification and data collection scripts. The scripts control various test equipment in addition to the device under test (DUT) to characterize a specific performance with a specific configuration or to evaluate the correctness of the behaviour of the DUT. The specific focus of this paper is on documenting our experiences in building an automation framework using Python: on the purposes, goals and benefits, rather than on a tutorial of how to build such a framework.
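The skeleton of such a framework (a common instrument interface plus a script runner that logs each step's result for a verification report) might look like the sketch below. The equipment class, command strings, and logging shape are invented placeholders, not the framework the record describes.

```python
# Minimal sketch of a test-equipment automation core: a driver base class
# and a runner that executes (command, expected reply) steps against it.
class Instrument:
    """Base class: real drivers would wrap GPIB/serial/socket transports."""
    def __init__(self, name):
        self.name = name
    def command(self, cmd):
        raise NotImplementedError

class FakeSupply(Instrument):
    """Stand-in power supply used here so the sketch is self-contained."""
    def __init__(self, name):
        super().__init__(name)
        self.volts = 0.0
    def command(self, cmd):
        if cmd.startswith("VOLT "):
            self.volts = float(cmd.split()[1])
            return "OK"
        if cmd == "VOLT?":
            return str(self.volts)
        return "ERR"

def run_script(instrument, steps):
    """steps: list of (command, expected_reply or None). Returns a log of
    (command, reply, passed) tuples suitable for a verification report."""
    log = []
    for cmd, expected in steps:
        reply = instrument.command(cmd)
        log.append((cmd, reply, expected is None or reply == expected))
    return log

psu = FakeSupply("psu1")
log = run_script(psu, [("VOLT 3.3", "OK"), ("VOLT?", "3.3")])
print(all(passed for _, _, passed in log))  # True
```

Keeping the drivers behind one interface is what lets the same runner serve both verification scripts (assert on replies) and data-collection scripts (just record them).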

  12. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL
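A checklist-driven control-level designation of the kind described above can be pictured as a small decision function. The criteria below are invented, heavily simplified placeholders for illustration only; they are not the actual DOE criteria or the team's logic diagrams.

```python
# Hypothetical sketch of a control-level decision checklist. The three
# yes/no criteria and the mapping to levels are illustrative assumptions.
def control_level(protects_public, directly_measurable, is_physical_feature):
    """Return a TSR control level for one simplified set of criteria."""
    if is_physical_feature:
        return "Design Feature"
    if protects_public and directly_measurable:
        return "Safety Limit"
    if protects_public:
        return "Limiting Condition for Operation"
    return "Administrative Control"

print(control_level(True, True, False))    # Safety Limit
print(control_level(True, False, False))   # Limiting Condition for Operation
print(control_level(False, False, False))  # Administrative Control
```

Encoding the checklist this way is useful mainly as a consistency aid: every control is evaluated against the same questions in the same order, which is the property the team-consensus process described above was designed to enforce.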

  13. Experience in non-proliferation verification: The Treaty of Rarotonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Rarotonga are subdivided into two categories: those performed by the IAEA and those performed by other entities. A final provision of the Treaty of Rarotonga is relevant to IAEA safeguards in that it supports the continued effectiveness of the international non-proliferation system based on the Non-Proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well.

  14. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    To meet the real-time, reliability, and safety requirements of aerospace experiments, a single-center cloud computing technology application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on an analysis of the test results, a preliminary conclusion is drawn: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads. For I/O-intensive workloads, it is recommended to use traditional physical machines.

  15. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities for the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and for the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive high-energy gamma emission tomography and the Fork detector combined with gamma spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as smooth as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for final disposal; e.g., criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other is the encapsulation plant. Crucial viewpoints include which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  16. Integrated Virtual Environment Test Concepts and Objectives

    National Research Council Canada - National Science Library

    Tackett, Gregory

    2001-01-01

    ...), a series of integration and verification tests were conducted to provide development milestones for the simulation architecture and tools that would be needed for the full-up live/virtual field experiment...

  17. X447 EBR-II Experiment Benchmark for Verification of Audit Code of SFR Metal Fuel

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Won; Bae, Moo-Hoon; Shin, Andong; Suh, Namduk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    In KINS (Korea Institute of Nuclear Safety), a project has been started to develop regulatory technology for the SFR system, including the fuel area, in preparation for the audit calculation of the PGSFR licensing review. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for the audit calculation. In this study, a benchmark analysis is performed to verify the new code system, using X447 EBR-II experiment data; additionally, a sensitivity analysis with respect to changes in coolant mass flux is performed. For LWR fuel performance modeling, various advanced models have been proposed and validated based on extensive in-reactor test results. However, due to the limited experience with SFR operation, the current understanding of SFR fuel behavior is limited. The fuel composition of the X447 assembly is U-10Zr, which the PGSFR also uses in its initial phase, so the X447 EBR-II experiment was selected for the benchmark analysis. In order to prepare for the licensing of the PGSFR, regulatory audit technologies for SFRs must be secured. In terms of verification, the results of the benchmark and sensitivity analyses are considered reasonable.
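
    As a toy illustration of the kind of coolant mass-flux sensitivity study the abstract describes, the sketch below sweeps a flow multiplier through a single-channel steady-state energy balance. All names and values (inlet temperature, channel power, sodium heat capacity) are illustrative assumptions, not data from the X447 benchmark or the audit code.

```python
# Hedged sketch of a coolant mass-flux sensitivity sweep (illustrative values).

def outlet_temperature(t_in_c, power_w, mass_flow_kg_s, cp_j_kg_k):
    """Coolant outlet temperature from a steady-state energy balance."""
    return t_in_c + power_w / (mass_flow_kg_s * cp_j_kg_k)

def sensitivity_sweep(nominal_flow, fractions, t_in_c=390.0,
                      power_w=30e3, cp_j_kg_k=1260.0):
    """Evaluate outlet temperature for a range of mass-flow multipliers."""
    return {f: outlet_temperature(t_in_c, power_w, nominal_flow * f, cp_j_kg_k)
            for f in fractions}

results = sensitivity_sweep(nominal_flow=0.15, fractions=[0.8, 0.9, 1.0, 1.1, 1.2])
for f, t in sorted(results.items()):
    print(f"flow x{f:.1f}: outlet {t:.1f} C")
```

    A real sensitivity analysis would drive the fuel performance code itself in such a loop; the energy balance here merely shows the expected trend (higher mass flux, lower coolant temperature).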

  18. X447 EBR-II Experiment Benchmark for Verification of Audit Code of SFR Metal Fuel

    International Nuclear Information System (INIS)

    Choi, Yong Won; Bae, Moo-Hoon; Shin, Andong; Suh, Namduk

    2016-01-01

    In KINS (Korea Institute of Nuclear Safety), a project has been started to develop regulatory technology for the SFR system, including the fuel area, in preparation for the audit calculation of the PGSFR licensing review. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for the audit calculation. In this study, a benchmark analysis is performed to verify the new code system, using X447 EBR-II experiment data; additionally, a sensitivity analysis with respect to changes in coolant mass flux is performed. For LWR fuel performance modeling, various advanced models have been proposed and validated based on extensive in-reactor test results. However, due to the limited experience with SFR operation, the current understanding of SFR fuel behavior is limited. The fuel composition of the X447 assembly is U-10Zr, which the PGSFR also uses in its initial phase, so the X447 EBR-II experiment was selected for the benchmark analysis. In order to prepare for the licensing of the PGSFR, regulatory audit technologies for SFRs must be secured. In terms of verification, the results of the benchmark and sensitivity analyses are considered reasonable.

  19. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA's experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear-weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system, based as it was on nuclear material accountancy, was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. The inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation from taking place. To be credible, they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  20. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  1. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference.
The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  2. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.
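
    The quantitative-comparison step of such a process can be sketched as follows: given two simulation outputs on the same grid, compute summary metrics that either support or refute a "codes agree to tolerance" hypothesis. This is a generic illustration, not the authors' tooling; the field data and tolerances are made up.

```python
# Hedged sketch of hypothesis-driven quantitative comparison of two simulations.
import math

def compare_fields(a, b):
    """Return (RMS difference, maximum absolute difference) for equal-length fields."""
    assert len(a) == len(b)
    diffs = [x - y for x, y in zip(a, b)]
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return rms, max(abs(d) for d in diffs)

def hypothesis_holds(a, b, rms_tol, max_tol):
    """Test the hypothesis that the two codes agree within the given tolerances."""
    rms, max_abs = compare_fields(a, b)
    return rms <= rms_tol and max_abs <= max_tol

reference = [1.00, 2.00, 3.00, 4.00]   # output of the trusted code (illustrative)
candidate = [1.01, 1.99, 3.02, 3.98]   # output of the code under verification
print(hypothesis_holds(reference, candidate, rms_tol=0.05, max_tol=0.05))
```

    In practice these metrics would be computed per feature (e.g., halo or eddy) and visualized side by side, with the visualization guiding where the next, refined hypothesis is posed.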

  3. Integration of KESS III models in ATHLET-CD and contributions to program verification. Final report

    International Nuclear Information System (INIS)

    Bruder, M.; Schatz, A.

    1994-07-01

    The development of the computer code ATHLET-CD is a contribution to reactor safety research. ATHLET-CD is an extension of the system code ATHLET by core degradation models, especially from the modular software package KESS. The aim of the ATHLET-CD development is the simulation of severe accident sequences from their initiation to severe core degradation in a continuous manner. In the framework of this project, the ATHLET-CD development has been focused on the integration of KESS models such as the control rod model, as well as the models describing chemical interactions, material relocation along a rod, and fission product release. The present ATHLET-CD version is able to describe severe accidents in a PWR up to early core degradation (relocation of material along a rod surface in the axial direction). Contributions to the verification of ATHLET-CD comprised calculations of the experiments PHEBUS AIC and PBF SFD 1-4. The PHEBUS AIC calculation was focused on the examination of the control rod model, whereas the PBF SFD 1-4 calculation served to check the models describing melting, material relocation, and fission product release. (orig.)

  4. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of an a posteriori V and V review that takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: reason(s) why a posteriori verification is to be performed; the scope and objectives for the level of verification selected; development products to be used for the review; the availability and use of user experience; and actions to be taken to supplement missing or unavailable development products. The purpose, scope, and objectives for the level of verification selected are described in this section of the Verification Report.
The development products that were used

  5. Testing and verification of a novel single-channel IGBT driver circuit

    OpenAIRE

    Lukić, Milan; Ninković, Predrag

    2016-01-01

    This paper presents a novel single-channel IGBT driver circuit together with a procedure for testing and verification. It is based on a specialized integrated circuit with complete range of protective functions. Experiments are performed to test and verify its behaviour. Experimental results are presented in the form of oscilloscope recordings. It is concluded that the new driver circuit is compatible with modern IGBT transistors and power converter demands and that it can be applied in new d...

  6. OECD/NEA data bank scientific and integral experiments databases in support of knowledge preservation and transfer

    International Nuclear Information System (INIS)

    Sartori, E.; Kodeli, I.; Mompean, F.J.; Briggs, J.B.; Gado, J.; Hasegawa, A.; D'hondt, P.; Wiesenack, W.; Zaetta, A.

    2004-01-01

    The OECD/Nuclear Energy Data Bank was established by its member countries as an institution to allow effective sharing of knowledge and its basic underlying information and data in key areas of nuclear science and technology. The activities as regards preserving and transferring knowledge consist of the: 1) Acquisition of basic nuclear data, computer codes and experimental system data needed over a wide range of nuclear and radiation applications; 2) Independent verification and validation of these data using quality assurance methods, adding value through international benchmark exercises, workshops and meetings and by issuing relevant reports with conclusions and recommendations, as well as by organising training courses to ensure their qualified and competent use; 3) Dissemination of the different products to authorised establishments in member countries and collecting and integrating user feedback. Of particular importance has been the establishment of basic and integral experiments databases and the methodology developed with the aim of knowledge preservation and transfer. Databases established thus far include: 1) IRPhE - International Reactor Physics Experimental Benchmarks Evaluations, 2) SINBAD - a radiation shielding experiments database (nuclear reactors, fusion neutronics and accelerators), 3) IFPE - International Fuel Performance Benchmark Experiments Database, 4) TDB - The Thermochemical Database Project, 5) ICSBE - International Nuclear Criticality Safety Benchmark Evaluations, 6) CCVM - CSNI Code Validation Matrix of Thermal-hydraulic Codes for LWR LOCA and Transients. This paper will concentrate on knowledge preservation and transfer concepts and methods related to some of the integral experiments and TDB. (author)

  7. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    OpenAIRE

    Jin-Won Park; Sung Bum Pan; Yongwha Chung; Daesung Moon

    2009-01-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification i...

  8. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we aim to provide a convenient method of design verification of SOVs to design engineers who depend on their experience and experiments during the design and development process of SOVs. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Secondly, we suggest a verification method for this design using theoretical relationships, which enables optimal design of the SOV from the viewpoint of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured based on this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe that this verification process is novel and useful for saving time and expense during the development of SOVs, because verification tests with manufactured specimens may be partly substituted by this verification methodology.
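
    The force-based check at the heart of such a verification can be sketched with the standard Maxwell pull-force relation F = B²A/(2μ₀): estimate the attraction force from the air-gap flux density and pole area, then express it as a safety factor against the required actuation force. The numbers below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of attraction-force design verification for a solenoid.
MU0 = 4e-7 * 3.141592653589793  # vacuum permeability, H/m

def attraction_force(b_tesla, pole_area_m2):
    """Maxwell pull force across an air gap: F = B^2 * A / (2 * mu0)."""
    return b_tesla ** 2 * pole_area_m2 / (2.0 * MU0)

def safety_factor(b_tesla, pole_area_m2, required_force_n):
    """Ratio of available magnetic force to the force the valve actually needs."""
    return attraction_force(b_tesla, pole_area_m2) / required_force_n

sf = safety_factor(b_tesla=1.2, pole_area_m2=1.0e-4, required_force_n=40.0)
print(f"attraction-force safety factor: {sf:.2f}")
```

    A design would pass this check if the safety factor exceeds some margin (say 1.2) across the worst-case combination of supply voltage, coil temperature, and spring load; the paper's full procedure also accounts for the electromagnetic relationships behind B.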

  9. Ride comfort optimization of a multi-axle heavy motorized wheel dump truck based on virtual and real prototype experiment integrated Kriging model

    Directory of Open Access Journals (Sweden)

    Bian Gong

    2015-06-01

    The optimization of the hydro-pneumatic suspension parameters of a multi-axle heavy motorized wheel dump truck is carried out based on a virtual and real prototype experiment integrated Kriging model in this article. The root mean square of the vertical vibration acceleration at the center of the sprung mass is assigned as the optimization objective. The constraints are the natural frequency, the working stroke, and the dynamic load of the wheels. The suspension structure for the truck is an adjustable hydro-pneumatic suspension with ideal vehicle nonlinear characteristics, integrating elastic and damping elements; the hydraulic systems of two adjacent hydro-pneumatic suspensions are interconnected. Considering the high complexity of the engineering model, a novel kind of meta-model, called virtual and real prototype experiment integrated Kriging, is proposed in this article. The interpolation principle and the construction of the virtual and real prototype experiment integrated Kriging model are elucidated. Differing from traditional Kriging, virtual and real prototype experiment integrated Kriging combines the respective advantages of actual tests and Computer Aided Engineering simulation. Based on the virtual and real prototype experiment integrated Kriging model, the optimization results, confirmed by experimental verification, showed a significant improvement in ride comfort: 12.48% for the front suspension and 11.79% for the rear suspension. Compared with traditional Kriging, the optimization effect was improved by 3.05% and 3.38%, respectively. Virtual and real prototype experiment integrated Kriging provides an effective way to approach the optimal solution for the optimization of highly complex engineering problems.
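
    The interpolation idea behind a Kriging surrogate can be sketched as below: an interpolator with a Gaussian correlation model is fit to a pool of samples, which in the article's method would mix virtual (CAE) and real (prototype test) points. This is a simplified stand-in for the authors' model; the sample data, correlation length, and the absence of a regression trend term are all illustrative assumptions.

```python
# Hedged sketch of a Kriging-style surrogate with a Gaussian correlation model.
import math

def gauss_corr(x1, x2, theta):
    """Gaussian correlation between two 1-D sample locations."""
    return math.exp(-theta * (x1 - x2) ** 2)

def solve(a, b):
    """Solve the linear system a w = b by Gaussian elimination with pivoting."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            m[r] = [mr - f * mc for mr, mc in zip(m[r], m[col])]
    w = [0.0] * n
    for i in range(n - 1, -1, -1):
        w[i] = (m[i][n] - sum(m[i][j] * w[j] for j in range(i + 1, n))) / m[i][i]
    return w

def fit_predict(xs, ys, x_new, theta=5.0):
    """Interpolate y(x_new) through all samples (virtual and real alike)."""
    r = [[gauss_corr(a, b, theta) for b in xs] for a in xs]
    w = solve(r, ys)
    return sum(wi * gauss_corr(x_new, xi, theta) for wi, xi in zip(w, xs))

xs = [0.0, 0.25, 0.5, 0.75, 1.0]   # suspension parameter (normalized, illustrative)
ys = [1.8, 1.2, 0.9, 1.1, 1.6]     # RMS vertical acceleration samples (m/s^2)
print(f"predicted response at 0.5: {fit_predict(xs, ys, 0.5):.3f}")
```

    An optimizer would then search this cheap surrogate instead of the expensive multi-body model, refitting as new test or simulation points arrive. Note that the interpolation property holds exactly: predicting at any sample location returns that sample's value.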

  10. Integrated Safety Management System Phase 1 and 2 Verification for the Environmental Restoration Contractor Volumes 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    CARTER, R.P.

    2000-04-04

    DOE Policy 450.4 mandates that safety be integrated into all aspects of the management and operations of its facilities. The goal of an institutionalized Integrated Safety Management System (ISMS) is to have a single integrated system that includes Environment, Safety, and Health requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the life cycle of the Environmental Restoration (ER) Project. The purpose of this Environmental Restoration Contractor (ERC) ISMS Phase I/II Verification was to determine whether ISMS programs and processes were institutionalized within the ER Project, whether these programs and processes were implemented, and whether the system had promoted the development of a safety-conscious work culture.

  11. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
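
    The statistical-sampling component of such an inventory verification is commonly sized with the hypergeometric "attribute sampling" relation: for a stratum of N items of which d are assumed diverted or defective, a random sample of n items catches at least one with probability 1 − C(N−d, n)/C(N, n). The sketch below finds the smallest sample meeting a detection goal; the population, defect count, and goal are illustrative, not from the report.

```python
# Hedged sketch of sample-size planning for NDA inventory verification.
from math import comb

def detection_probability(population, defects, sample_size):
    """P(sample contains at least one of the `defects` items), hypergeometric."""
    if defects == 0:
        return 0.0
    return 1.0 - comb(population - defects, sample_size) / comb(population, sample_size)

def sample_size_for(population, defects, goal=0.95):
    """Smallest sample size achieving the goal detection probability."""
    for n in range(1, population + 1):
        if detection_probability(population, defects, n) >= goal:
            return n
    return population

n = sample_size_for(population=400, defects=20, goal=0.95)
print(f"sample {n} of 400 items to detect 20 diverted items with 95% probability")
```

    Combining such sampling with the integral reactivity measurements described above is what makes the scheme stronger than either technique alone: the reactor parameters bound the total fissile inventory while the sample bounds item-level substitution.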

  12. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for each subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logical system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
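
    The marginal-risk allocation idea can be sketched greedily: spend examination effort one unit at a time, always on the subsystem whose next examination buys the largest risk reduction, until the budget (or a stop criterion) is reached. The risk model below (residual risk halves per examination) and the subsystem data are illustrative assumptions, not the paper's verification risk function.

```python
# Hedged sketch of examination-effort allocation by marginal verification risk.

def allocate(subsystems, budget):
    """subsystems: {name: (failure_prob, loss)}; returns examinations per subsystem."""
    efforts = {name: 0 for name in subsystems}

    def residual_risk(name, n):
        p, loss = subsystems[name]
        return p * loss * 0.5 ** n  # assumption: each examination halves residual risk

    for _ in range(budget):
        # pick the subsystem with the largest marginal risk reduction
        best = max(subsystems,
                   key=lambda s: residual_risk(s, efforts[s]) -
                                 residual_risk(s, efforts[s] + 1))
        efforts[best] += 1
    return efforts

plan = allocate({"propulsion": (0.02, 100.0),
                 "steering":   (0.01, 500.0),
                 "ballast":    (0.05, 10.0)}, budget=6)
print(plan)
```

    Note how the high-loss steering subsystem absorbs most of the effort even though its failure probability is the lowest; this is exactly the behavior a loss-weighted verification risk function is meant to produce.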

  13. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting a Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with an structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...
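
    The coverage-driven loop the abstract describes can be illustrated in plain Python (UVM itself is a SystemVerilog library): random legal stimuli are generated, a self-checking reference model plays the scoreboard role, and coverage bins measure progress toward the verification goal. The toy "DUT", bin definition, and goal are all illustrative assumptions.

```python
# Hedged sketch of a coverage-driven verification loop, UVM-style in miniature.
import random

def dut_add(a, b):
    """Toy device under test: an 8-bit adder."""
    return (a + b) & 0xFF

def run_cdv(seed=1, goal_bins=8, max_iters=10_000):
    """Drive random stimuli until all coverage bins are hit or iterations run out."""
    rng = random.Random(seed)
    covered = set()  # coverage bins: the high 3 bits of the result
    for _ in range(max_iters):
        a, b = rng.randrange(256), rng.randrange(256)
        result = dut_add(a, b)
        assert result == (a + b) % 256, "scoreboard mismatch"  # self-checking step
        covered.add(result >> 5)
        if len(covered) == goal_bins:
            break
    return len(covered)

print(f"coverage bins hit: {run_cdv()}/8")
```

    The contrast with directed testing is visible in the structure: no stimulus is hand-written; instead, the coverage monitor (`covered`) tells the testbench when the verification goals have been met.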

  14. Engineering a static verification tool for GPU kernels

    OpenAIRE

    Bardsley, E; Betts, A; Chong, N; Collingbourne, P; Deligiannis, P; Donaldson, AF; Ketema, J; Liew, D; Qadeer, S

    2014-01-01

    We report on practical experiences over the last 2.5 years related to the engineering of GPUVerify, a static verification tool for OpenCL and CUDA GPU kernels, plotting the progress of GPUVerify from a prototype to a fully functional and relatively efficient analysis tool. Our hope is that this experience report will serve the verification community by helping to inform future tooling efforts. © 2014 Springer International Publishing.

  15. The concept verification testing of materials science payloads

    Science.gov (United States)

    Griner, C. S.; Johnston, M. H.; Whitaker, A.

    1976-01-01

    The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle payload projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other-agency experiment and support system concepts that may be used on the Shuttle. A dedicated materials science payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test, designated CVT Test IVA, was also held. The purpose of this test was to repeat the Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.

  16. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, and MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  17. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VVandC)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. 
Reporting tools have evolved over the lifetime of

  18. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  19. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  20. Expose : procedure and results of the joint experiment verification tests

    Science.gov (United States)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility, accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch is planned for 2005, with a stay in space of 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The vented or sealed experiment pockets will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a thorough ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. The results are essential for the success of the EXPOSE mission; the tests have been performed in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of organic chemistry processes in space, of biological adaptation strategies to extreme conditions, e.g. on the early Earth and Mars, and of the distribution of life beyond its planet of origin.

  1. Sustainable Development Impacts of NAMAs: An integrated approach to assessment of co-benefits based on experience with the CDM

    DEFF Research Database (Denmark)

    Olsen, Karen Holm

    to assess the SD impacts of NAMAs. This paper argues for a new integrated approach to assess NAMAs' SD impacts that consists of SD indicators, procedures for stakeholder involvement and safeguards against negative impacts. The argument is based on a review of experience with the CDM’s contribution to SD...... and a comparison of similarities and differences between NAMAs and the CDM. Five elements of a new approach towards assessment of NAMAs' SD impacts are suggested based on emerging approaches and methodologies for monitoring, reporting and verification (MRV) of greenhouse gas reductions and SD impacts of NAMAs....

  2. Integrating software testing and run-time checking in an assertion verification framework

    OpenAIRE

    Mera, E.; López García, Pedro; Hermenegildo, Manuel V.

    2009-01-01

    We have designed and implemented a framework that unifies unit testing and run-time verification (as well as static verification and static debugging). A key contribution of our approach is that a unified assertion language is used for all of these tasks. We first propose methods for compiling runtime checks for (parts of) assertions which cannot be verified at compile-time via program transformation. This transformation allows checking preconditions and postconditions, including conditional...
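    The core idea of this record, compiling assertions that cannot be discharged at compile time into run-time checks, can be sketched in Python. The decorator below is a hypothetical stand-in: the actual framework works by program transformation on logic programs with its own unified assertion language, not Python decorators.

```python
import functools

def contract(pre=None, post=None):
    """Turn declarative pre/postconditions into run-time checks for
    whatever could not be proven statically (a toy sketch)."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition of {fn.__name__} failed"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition of {fn.__name__} failed"
            return result
        return wrapper
    return deco

@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def isqrt(x):
    # Integer square root by Newton's method.
    if x < 2:
        return x
    r = x
    while r * r > x:
        r = (r + x // r) // 2
    return r
```

    The same annotated function can then serve as a unit-test oracle: calling it on test inputs exercises both the code and its assertions, which is the unification of testing and run-time verification the abstract describes.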

  3. Testing and verification of a novel single-channel IGBT driver circuit

    Directory of Open Access Journals (Sweden)

    Lukić Milan

    2016-01-01

    Full Text Available This paper presents a novel single-channel IGBT driver circuit together with a procedure for its testing and verification. The driver is based on a specialized integrated circuit with a complete range of protective functions. Experiments are performed to test and verify its behaviour, and experimental results are presented in the form of oscilloscope recordings. It is concluded that the new driver circuit is compatible with modern IGBT transistors and power converter demands and that it can be applied in new designs. It is part of a new 20 kW industrial-grade boost converter.

  4. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    Science.gov (United States)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory forms part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning but also configuration changes on short notice, which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. Subject to the safety verification are, on the one hand, the on-orbit configuration itself, including the hardware and software products, and, on the other hand, the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a Safety process which possesses considerable rigidity, in order to provide reliable verification of on-board Safety, and which likewise provides enough

  5. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  6. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50...). Four games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  7. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as the necessary numerical simulations are carried out to support the proposed verification method. (paper)
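    The single-step building block of this scheme, phase-only correlation, can be sketched numerically: normalize the cross-power spectrum to unit magnitude so that only phase information survives, then invert. The paper cascades two such correlations and implements them optically; the pure-Python DFT and the random "lock"/"key" data below are stand-ins for illustration.

```python
import cmath
import random

def dft(x):
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n) for m in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * m / n) for k in range(n)) / n
            for m in range(n)]

def phase_only_correlation(a, b):
    # Cross-power spectrum, magnitude stripped: only phase carries information.
    A, B = dft(a), dft(b)
    cross = [p * q.conjugate() for p, q in zip(A, B)]
    phase = [c / abs(c) if abs(c) > 1e-12 else 0j for c in cross]
    return [v.real for v in idft(phase)]

rng = random.Random(0)
key = [rng.random() for _ in range(64)]
genuine = phase_only_correlation(key, key)   # sharp peak (~1.0) at index 0
impostor = phase_only_correlation(key, [rng.random() for _ in range(64)])
```

    A matching key yields a delta-like peak; a wrong key yields only low-level noise, which is the discrimination property the verification threshold relies on.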

  8. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
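    One of the code-verification techniques recommended above, the method of manufactured solutions, can be demonstrated end to end on a toy problem: pick an exact solution, derive the forcing term it implies, and confirm that the discretization converges at its theoretical order. The 1D Poisson solver below is purely illustrative and unrelated to any specific code discussed in the record.

```python
import math

def solve_poisson(n, f, ua, ub):
    """Solve -u'' = f on (0,1), u(0)=ua, u(1)=ub, with second-order
    central differences, via the Thomas (tridiagonal) algorithm."""
    h = 1.0 / n
    a = [-1.0] * (n - 1)                      # sub-diagonal
    b = [2.0] * (n - 1)                       # diagonal
    c = [-1.0] * (n - 1)                      # super-diagonal
    d = [h * h * f((i + 1) * h) for i in range(n - 1)]
    d[0] += ua
    d[-1] += ub
    for i in range(1, n - 1):                 # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):            # back substitution
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return [ua] + u + [ub]

# Manufactured solution u(x) = sin(pi x)  =>  f(x) = pi^2 sin(pi x)
exact = lambda x: math.sin(math.pi * x)
f = lambda x: math.pi ** 2 * math.sin(math.pi * x)

def max_error(n):
    u = solve_poisson(n, f, exact(0.0), exact(1.0))
    return max(abs(u[i] - exact(i / n)) for i in range(n + 1))

# Observed order of accuracy from two grid refinements; should be ~2.
order = math.log(max_error(20) / max_error(40)) / math.log(2)
```

    Matching the observed order to the theoretical order is exactly the kind of quantitative pass/fail criterion the paper argues a code-verification benchmark should state explicitly.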

  9. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated to integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties and (3) comparison

  10. Verification and validation guidelines for high integrity systems: Appendices A--D, Volume 2

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    The following material is furnished as an experimental guide for the use of risk-based classification for nuclear plant protection systems. As shown in Sections 2 and 3 of this report, safety classifications for the nuclear field are application based (using the function served as the primary criterion), whereas those in use by the process industry and the military are risk based. There are obvious obstacles to the use of risk-based classifications (and the associated integrity levels) for nuclear power plants, yet there are also many potential benefits, including: it considers all capabilities provided for dealing with a specific hazard, thus assigning a lower risk where multiple protection is provided (either at the same or at lower layers), which permits plant management to perform trade-offs between systems that meet the highest qualification levels and multiple diverse systems at lower qualification levels; it motivates the use (and therefore also the development) of protection systems with demonstrated low failure probability; and it may permit lower-cost process industry equipment of an established integrity level to be used in nuclear applications (subject to verification of the integrity level and regulatory approval). The totality of these benefits may reduce the cost of digital protection systems significantly and motivate utilities to upgrade capabilities much more rapidly than is currently the case. Therefore the outline of a risk-based classification is presented here, to serve as a starting point for further investigation and possible trial application.

  11. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept and the selection of fertilizer, rooting medium, and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted: lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, and plants were harvested, frozen, stored, and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  12. Development of small and medium integral reactor: Development of fluid system design for small and medium integral reactor

    International Nuclear Information System (INIS)

    Lee, D. J.; Chang, M. H.; Kim, K. K.; Kim, J. P.; Yoon, J. H.; Lee, Y. J.; Park, C. T.; Bae, Y. Y.; Kang, D. J.; Lee, K. H.; Lee, J.; Kim, H. Y.; Cho, B. H.; Seo, J. K.; Kang, K. S.; Kang, H. O.

    1997-07-01

    The purpose of this study is to develop the system design technology of an integral reactor, a new design concept of a small and medium reactor with enhanced safety and economy, and to establish design assessment/verification technology through basic thermal hydraulic experiments. This report describes the following: 1) basic requirements for the integral reactor system design; 2) conceptual design of the primary and secondary circuits of the NSSS, the emergency core cooling system, the passive residual heat removal system, the severe accident mitigation system, and other auxiliary systems; 3) requirements and test program for the basic thermal hydraulic experiments, including a CHF test for the hexagonal fuel assembly, flow instability tests for the once-through steam generator, a core flow distribution test, and a verification test for the non-condensable gas model in the RELAP-5 code. The results of this study can be utilized as foundation technology in the next basic design phase and as design technology for future advanced reactors. (author). 30 refs., 24 tabs., 56 figs

  13. Development of small and medium integral reactor: Development of fluid system design for small and medium integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, D. J.; Chang, M. H.; Kim, K. K.; Kim, J. P.; Yoon, J. H.; Lee, Y. J.; Park, C. T.; Bae, Y. Y.; Kang, D. J.; Lee, K. H.; Lee, J.; Kim, H. Y.; Cho, B. H.; Seo, J. K.; Kang, K. S.; Kang, H. O.

    1997-07-01

    The purpose of this study is to develop the system design technology of an integral reactor, a new design concept of a small and medium reactor with enhanced safety and economy, and to establish design assessment/verification technology through basic thermal hydraulic experiments. This report describes the following: (1) basic requirements for the integral reactor system design; (2) conceptual design of the primary and secondary circuits of the NSSS, the emergency core cooling system, the passive residual heat removal system, the severe accident mitigation system, and other auxiliary systems; (3) requirements and test program for the basic thermal hydraulic experiments, including a CHF test for the hexagonal fuel assembly, flow instability tests for the once-through steam generator, a core flow distribution test, and a verification test for the non-condensable gas model in the RELAP-5 code. The results of this study can be utilized as foundation technology in the next basic design phase and as design technology for future advanced reactors. (author). 30 refs., 24 tabs., 56 figs.

  14. Class 1E software verification and validation: Past, present, and future

    Energy Technology Data Exchange (ETDEWEB)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  15. The regional energy integration: the latin-american experiences; L'integration energetique regionale: les experiences latino-americaines

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    The paths of regional economic integration are not identical and have different repercussions on markets and on the evolution of the energy industries. The example of Latin America offers a variety of experiences for evaluating the stakes and the limits of each regional integration. These limits prompt searches for solutions that include undeniable convergences. The first part of this document presents the genesis of these regional economic integration experiences in Latin America; the second part studies the energy consequences of the liberal ALENA (NAFTA) and of the more political MERCOSUR. (A.L.B.)

  16. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology that supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs, as it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and waveguide bends in SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the substantial time of 3D EM simulations and can be easily included in any electronic design automation (EDA) flow, as the equation parameters can be easily extracted from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The measurement results for the fabricated devices have been compared with the derived models and show very good agreement; the match can reach 100% by calibrating certain parameters in the model.
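    The kind of closed-form model this record refers to can be illustrated with the textbook coupled-mode expression for a lossless directional coupler. The formula and the coupling-length value below are generic assumptions for illustration, not the authors' fitted SOI models.

```python
import math

def coupler_cross_power(length_um, coupling_length_um):
    """Textbook coupled-mode estimate of the cross-port power fraction
    of a lossless directional coupler:
        P_cross = sin^2(pi * L / (2 * Lc))
    where Lc is the length for full power transfer (hypothetical here)."""
    return math.sin(math.pi * length_um / (2.0 * coupling_length_um)) ** 2

Lc = 12.0                                 # hypothetical coupling length, um
full = coupler_cross_power(Lc, Lc)        # full transfer at L = Lc -> 1.0
half = coupler_cross_power(Lc / 2, Lc)    # 3 dB point at L = Lc/2 -> 0.5
```

    Because such an expression evaluates in microseconds, a PV tool can sweep every coupler in a layout against it, reserving full 3D EM simulation for the few devices that fall outside the model's validated range.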

  17. Secure Oblivious Hiding, Authentication, Tamper Proofing, and Verification Techniques

    National Research Council Canada - National Science Library

    Fridrich, Jessica

    2002-01-01

    In this report, we describe an algorithm for robust visual hash functions with applications to digital image watermarking for authentication and integrity verification of video data and still images...

  18. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP's planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II Verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I Verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, committed to be complete by the end of FY-2000.

  19. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, model checking is not yet common in industry, as it typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif
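    At its core, model checking a PLC program means exploring its reachable state space and searching for violations of a safety property. The minimal explicit-state sketch below is a toy stand-in for the mature model checkers a tool like PLCverif delegates to; the two-valve interlock logic is invented for illustration.

```python
from collections import deque

def check_safety(initial, step, bad):
    """Breadth-first reachability: explore every state reachable from
    `initial` via `step`; return a counterexample trace if some reachable
    state satisfies `bad`, else None (property holds)."""
    frontier = deque([(initial, [initial])])
    seen = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if bad(state):
            return trace                 # counterexample trace
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

# Hypothetical two-valve interlock. A state is (req_a, req_b, open_a, open_b).
# Requests are nondeterministic inputs; the controller gives valve A priority,
# and valve B may open only while A is closed.
def step(state):
    for ra in (False, True):
        for rb in (False, True):
            open_a = ra
            open_b = rb and not open_a
            yield (ra, rb, open_a, open_b)

# Safety property: the two valves are never open simultaneously.
trace = check_safety((False, False, False, False), step,
                     lambda s: s[2] and s[3])
```

    `trace` is `None` here, i.e. the interlock holds over all input sequences; a faulty controller would instead yield a concrete trace leading to the unsafe state, which is the diagnostic value model checking adds over testing.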

  20. BEval: A Plug-in to Extend Atelier B with Current Verification Technologies

    Directory of Open Access Journals (Sweden)

    Valério Medeiros Jr.

    2014-01-01

    Full Text Available This paper presents BEval, an extension of Atelier B that improves automation of the verification activities in the B method and Event-B. It combines a tool for managing and verifying software projects (Atelier B) with a model checker/animator (ProB) so that the verification conditions generated in the former are evaluated with the latter. In our experiments, the two main verification strategies (manual and automatic) showed significant improvement, as ProB's evaluator proves complementary to Atelier B's built-in provers. We conducted experiments with the B model of a micro-controller instruction set; several verification conditions that we were not able to discharge automatically or manually with Atelier B's provers were automatically verified using BEval.

  1. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  2. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1994-01-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and a discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented, along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  3. Issues of verification and validation of application-specific integrated circuits in reactor trip systems

    International Nuclear Information System (INIS)

    Battle, R.E.; Alley, G.T.

    1993-01-01

    Concepts of using application-specific integrated circuits (ASICs) in nuclear reactor safety systems are evaluated. The motivation for this evaluation stems from the difficulty of proving that software-based protection systems are adequately reliable. Important issues concerning the reliability of computers and software are identified and used to evaluate features of ASICs. These concepts indicate that ASICs have several advantages over software for simple systems. The primary advantage is that verification and validation (V&V) of ASICs can be done with much higher confidence than is possible for software. A method of performing this V&V on ASICs is being developed at Oak Ridge National Laboratory. The method being developed is intended to help eliminate design and fabrication errors; it will not solve problems with incorrect requirements or specifications

  4. Technology Integration Experiences of Teachers

    Science.gov (United States)

    Çoklar, Ahmet Naci; Yurdakul, Isil Kabakçi

    2017-01-01

    Teachers are important providers of educational sustainability. Teachers' ability to adapt themselves to rapidly developing technologies applicable to learning environments is connected with technology integration. The purpose of this study is to investigate teachers' technology integration experiences in the course of learning and teaching…

  5. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the
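
    The manufactured-solution approach recommended above can be illustrated on a one-dimensional Poisson problem: choose an analytic solution, derive the matching source term, and confirm that the observed order of accuracy matches the scheme's formal order. The sketch below (second-order central differences solved with the Thomas algorithm) is a generic textbook illustration, not taken from the paper.

    ```python
    import math

    # Manufactured solution u(x) = sin(pi x) on [0,1], so -u'' = pi^2 sin(pi x).
    # Solve -u'' = f with second-order central differences and check that the
    # observed order of accuracy matches the scheme's formal order (2).

    def max_error(n):
        h = 1.0 / n
        f = [math.pi**2 * math.sin(math.pi * i * h) for i in range(1, n)]
        a, b, c = -1.0, 2.0, -1.0            # tridiagonal stencil (-1, 2, -1)/h^2
        cp = [0.0] * (n - 1)
        dp = [0.0] * (n - 1)
        cp[0] = c / b
        dp[0] = f[0] * h * h / b
        for i in range(1, n - 1):            # Thomas algorithm, forward sweep
            m = b - a * cp[i - 1]
            cp[i] = c / m
            dp[i] = (f[i] * h * h - a * dp[i - 1]) / m
        u = [0.0] * (n - 1)
        u[-1] = dp[-1]
        for i in range(n - 3, -1, -1):       # back substitution
            u[i] = dp[i] - cp[i] * u[i + 1]
        return max(abs(u[i] - math.sin(math.pi * (i + 1) * h)) for i in range(n - 1))

    e1, e2 = max_error(32), max_error(64)
    order = math.log(e1 / e2) / math.log(2)  # observed order, expected near 2
    ```

    Halving the mesh spacing should reduce the error by roughly a factor of four; a computed order far from 2 would signal a coding error, which is exactly what a code verification benchmark is meant to expose.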

  6. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  7. The regional energy integration: the latin-american experiences; L'integration energetique regionale: les experiences latino-americaines

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    Paths to regional economic integration are not identical, and they have different repercussions on markets and on the evolution of the energy industries. Latin America offers a variety of experiences for evaluating the stakes and the limits of each form of regional integration. These limits have prompted a search for solutions that display undeniable convergences. The first part of this document presents the genesis of these regional economic integration experiences in Latin America; the second part examines the energy consequences of the liberal ALENA (NAFTA) and of the more political MERCOSUR. (A.L.B.)

  8. Formal verification an essential toolkit for modern VLSI design

    CERN Document Server

    Seligman, Erik; Kumar, M V Achutha Kiran

    2015-01-01

    Formal Verification: An Essential Toolkit for Modern VLSI Design presents practical approaches for design and validation, with hands-on advice for working engineers integrating these techniques into their work. Building on a basic knowledge of System Verilog, this book demystifies FV and presents the practical applications that are bringing it into mainstream design and validation processes at Intel and other companies. The text prepares readers to effectively introduce FV in their organization and deploy FV techniques to increase design and validation productivity. Presents formal verific

  9. Verification of 3-D generation code package for neutronic calculations of WWERs

    International Nuclear Information System (INIS)

    Sidorenko, V.D.; Aleshin, S.S.; Bolobov, P.A.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Morozov, V.V.; Syslov, A.A.; Tsvetkov, V.M.

    2000-01-01

    Materials on verification of the third-generation code package for WWER neutronic calculations are presented. The package includes: - the spectral code TVS-M; - the 2-D fine-mesh diffusion code PERMAK-A for 4- or 6-group calculation of WWER core burnup; - the 3-D coarse-mesh diffusion code BIPR-7A for 2-group calculations of quasi-stationary WWER regimes. The materials include both TVS-M verification data and verification data for the PERMAK-A and BIPR-7A codes using constant libraries generated with TVS-M. All materials relate to fuel without Gd. The TVS-M verification materials include comparisons both with benchmark calculations obtained by other codes and with experiments carried out at the ZR-6 critical facility. The PERMAK-A verification materials contain comparisons with TVS-M calculations and with ZR-6 experiments. The BIPR-7A materials include comparisons with operating data for the Dukovany-2 and Loviisa-1 NPPs (WWER-440) and for Balakovo NPP Unit 4 (WWER-1000). The verification materials demonstrate rather good accuracy of the calculations obtained with the third-generation code package. (Authors)

  10. The Integrated Safety Management System Verification Enhancement Review of the Plutonium Finishing Plant (PFP)

    International Nuclear Information System (INIS)

    BRIGGS, C.R.

    2000-01-01

    The primary purpose of the verification enhancement review was for the DOE Richland Operations Office (RL) to verify contractor readiness for the independent DOE Integrated Safety Management System Verification (ISMSV) on the Plutonium Finishing Plant (PFP). Secondary objectives included: (1) to reinforce the engagement of management and to gauge management commitment and accountability; (2) to evaluate the "value added" benefit of direct public involvement; (3) to evaluate the "value added" benefit of direct worker involvement; (4) to evaluate the "value added" benefit of the panel-to-panel review approach; and (5) to evaluate the utility of the review's methodology and its adaptability to periodic assessments of ISM status. The review was conducted on December 6-8, 1999, and involved two-hour interviews with five separate panels of individuals with various management and operations responsibilities related to PFP. A semi-structured interview process was employed by a team of five "reviewers" who directed open-ended questions to the panels, focused on: (1) evidence of management commitment, accountability, and involvement; and (2) consideration and demonstration of stakeholder (including worker) information and involvement opportunities. The purpose of the panel-to-panel dialogue approach was to better spotlight: (1) areas of mutual reinforcement and alignment that could serve as good examples of the management commitment and accountability aspects of ISMS implementation; and (2) areas of potential discrepancy that could provide opportunities for improvement. In summary, the Review Team found major strengths to include: (1) the use of multi-disciplinary project work teams to plan and do work; (2) the availability and broad usage of multiple tools to help with planning and integrating work; (3) senior management presence and accessibility; (4) the institutionalization of worker involvement; (5) encouragement of self-reporting and self

  11. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
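
    One standard solution-verification technique treated in this literature is Richardson extrapolation over systematically refined grids: from three grid solutions one recovers the observed order of accuracy and an extrapolated estimate of the grid-converged value. A minimal sketch with synthetic data (the numbers below are invented for illustration, not results from the paper):

    ```python
    import math

    # Solution verification by Richardson extrapolation. Given solutions on
    # three grids with constant refinement ratio r, estimate the observed
    # order of accuracy p and an extrapolated "exact" value.

    def richardson(f_coarse, f_medium, f_fine, r=2.0):
        p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
        f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
        return p, f_exact

    # Synthetic grid-convergence data generated from f(h) = 1.0 + 0.5*h^2
    h = [0.4, 0.2, 0.1]
    f = [1.0 + 0.5 * hi**2 for hi in h]
    p, f_exact = richardson(f[0], f[1], f[2])
    print(round(p, 3), round(f_exact, 6))  # 2.0 1.0
    ```

    With real code output, p near the scheme's formal order supports the claim that the solutions are in the asymptotic range; a badly mismatched p warns that the error estimate is unreliable.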

  12. Verification and Validation for Flight-Critical Systems (VVFCS)

    Science.gov (United States)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009, a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).

  13. Statistics and integral experiments in the verification of LOCA calculations models

    International Nuclear Information System (INIS)

    Margolis, S.G.

    1978-01-01

    The LOCA (loss of coolant accident) is a hypothesized, low-probability accident used as a licensing basis for nuclear power plants. Computer codes which have been under development for at least a decade have been the principal tools used to assess the consequences of the hypothesized LOCA. Models exist in two versions. In EMs (Evaluation Models) the basic engineering calculations are constrained by a detailed set of assumptions spelled out in the Code of Federal Regulations (10 CFR 50, Appendix K). In BE Models (Best Estimate Models) the calculations are based on fundamental physical laws and available empirical correlations. Evaluation Models are intended to have a pessimistic bias; Best Estimate Models are intended to be unbiased. Because Evaluation Models play a key role in reactor licensing, they must be conservative. A long-sought objective has been to assess this conservatism by combining Best Estimate Models with statistically established error bounds, based on experiment. Within the last few years, an extensive international program of LOCA experiments has been established to provide the needed data. This program has already produced millions of measurements of temperature, density, and flow, and millions more measurements are yet to come
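
    Combining best-estimate codes with statistically established bounds is today commonly done with order statistics (Wilks' formula): run the code n times with sampled inputs and use the maximum observed output as a one-sided upper tolerance limit. The arithmetic behind the classic run count is easy to sketch; note this recipe is standard best-estimate-plus-uncertainty practice, not something prescribed by the abstract above.

    ```python
    # Wilks' formula for a one-sided upper tolerance limit based on the
    # maximum of n independent code runs: coverage `gamma` is achieved with
    # confidence `beta` once 1 - gamma**n >= beta. Find the smallest such n.

    def wilks_runs(gamma=0.95, beta=0.95):
        n = 1
        while 1.0 - gamma**n < beta:
            n += 1
        return n

    print(wilks_runs())  # 59 runs for the classic 95%/95% criterion
    ```

    That is, 59 best-estimate runs suffice for a 95% coverage / 95% confidence statement when the maximum is taken as the bound; tightening the confidence to 99% raises the count to 90.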

  14. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  15. Testing, verification and application of CONTAIN for severe accident analysis of LMFBR-containments

    International Nuclear Information System (INIS)

    Langhans, J.

    1991-01-01

    Severe accident analysis for LMFBR containments has to consider various phenomena influencing the development of containment loads, such as pressures and temperatures, as well as the generation, transport, depletion, and release of aerosols and radioactive materials. As most of these phenomena are linked together, their feedback has to be taken into account in the calculation of severe accident consequences; otherwise no best-estimate results can be assured. Under the sponsorship of the German BMFT, the US code CONTAIN is being developed, verified, and applied at GRS for future fast breeder reactor concepts. In the first step of verification, the basic calculation models of a containment code have been proven: (i) flow calculation for different flow situations, (ii) heat transfer from and to structures, (iii) coolant evaporation, boiling, and condensation, (iv) material properties. In the second step, the interaction of coupled phenomena has been checked. The calculation of integrated containment experiments involving natural convection flow, structure heating, and coolant condensation, as well as parallel calculation of results obtained with another code, gives detailed information on the applicability of CONTAIN. The current verification status allows the following conclusion: a cautious analyst experienced in containment accident modelling, using the proven parts of CONTAIN, will obtain results with the same accuracy as other well-optimized and detailed lumped-parameter containment codes can achieve. Further code development, additional verification, and international exchange of experience and results will assure an adequate code for application in safety analyses for LMFBRs. (orig.)

  16. Verification of atmospheric diffusion models using data of long term atmospheric diffusion experiments

    International Nuclear Information System (INIS)

    Tamura, Junji; Kido, Hiroko; Hato, Shinji; Homma, Toshimitsu

    2009-03-01

    Straight-line or segmented plume models are commonly used as atmospheric diffusion models in probabilistic accident consequence assessment (PCA) codes because of their cost and time savings. The PCA code OSCAAR, developed by the Japan Atomic Energy Research Institute (now the Japan Atomic Energy Agency), uses a variable puff trajectory model to calculate atmospheric transport and dispersion of released radionuclides. In order to investigate uncertainties associated with the structure of the atmospheric dispersion/deposition model in OSCAAR, we introduced more sophisticated computer codes, the regional meteorological model RAMS and the atmospheric transport model HYPACT, both developed by Colorado State University, and performed comparative analyses between OSCAAR and RAMS/HYPACT. In this study, model verification of OSCAAR and RAMS/HYPACT was conducted using data from long-term atmospheric diffusion experiments carried out in Tokai-mura, Ibaraki-ken. The model predictions and the results of the atmospheric diffusion experiments showed relatively good agreement, and the performance of OSCAAR was shown to be comparable to that of RAMS/HYPACT. (author)
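
    The straight-line plume model mentioned in the abstract reduces, in its simplest form, to the Gaussian plume equation for ground-level concentration downwind of an elevated point source. The sketch below uses hypothetical power-law dispersion coefficients for illustration; they are not the OSCAAR or RAMS/HYPACT parameterizations.

    ```python
    import math

    # Gaussian plume: ground-level concentration at downwind distance x and
    # crosswind offset y, for release rate Q, wind speed u, and effective
    # release height H, with full ground reflection.

    def concentration(Q, u, x, y, H):
        sigma_y = 0.08 * x**0.9    # assumed crosswind dispersion fit (illustrative)
        sigma_z = 0.06 * x**0.85   # assumed vertical dispersion fit (illustrative)
        return (Q / (math.pi * u * sigma_y * sigma_z)
                * math.exp(-y**2 / (2 * sigma_y**2))
                * math.exp(-H**2 / (2 * sigma_z**2)))

    # Centerline value 1 km downwind of a 50 m stack in a 3 m/s wind:
    c = concentration(Q=1.0, u=3.0, x=1000.0, y=0.0, H=50.0)
    ```

    Comparing such closed-form predictions against tracer measurements, as the long-term Tokai-mura experiments allow, is exactly the kind of model verification the abstract describes.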

  17. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan; FINAL

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP's planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns the actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II Verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I Verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, committed to be complete by the end of FY 2000

  18. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact on aviation safety risk of compositional verification research conducted by the National Aeronautics and Space Administration's System-Wide Safety and Assurance Technologies project was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification; they were grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and thirteen enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
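
    The "divide and conquer" idea can be made concrete with a toy assume-guarantee argument: instead of checking the composed system directly, check that component A satisfies an interface guarantee G, and that component B satisfies the system property P whenever its input satisfies G. The components below are plain functions over integers; real compositional verifiers work on transition systems, so this is only a shape-of-the-argument sketch.

    ```python
    # Toy assume-guarantee check. check(component, prop, inputs) exhaustively
    # tests a property of a component's output over a finite input set.

    def check(component, prop, inputs):
        return all(prop(component(i)) for i in inputs)

    A = lambda x: x % 10        # component A; in Python, x % 10 is always 0..9
    B = lambda y: y + 1         # component B, consuming A's output

    G = lambda v: 0 <= v <= 9   # interface guarantee on A's output
    P = lambda v: 1 <= v <= 10  # system-level property on B's output

    inputs = range(-100, 100)
    step1 = check(A, G, inputs)                        # A satisfies G
    step2 = check(B, P, [v for v in inputs if G(v)])   # under G, B satisfies P
    print(step1 and step2)  # True: the composition B(A(x)) satisfies P
    ```

    Neither step ever explores the composed state space, which is the productivity argument the abstract makes for large, nondeterministic systems.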

  19. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    International Nuclear Information System (INIS)

    Chung, Jin Ho

    2016-01-01

    The objective of future verification of an FMCT (Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to the ban on producing fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established IAEA safeguards as a verification system, mainly for Non-Nuclear Weapon States (NNWSs), the IAEA's experience and expertise in this field are expected to make a significant contribution to setting up a future treaty's verification regime. This paper explores the potential role of the IAEA in verifying the future treaty by analyzing the Agency's verification abilities and the challenges to be expected. Furthermore, the concept of multilateral verification facilitated by the IAEA is examined as a means of providing credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to prepare to play a leading role in FMCT verification, as a form of multilateral verification, by taking advantage of its existing verification concepts, methods, and tools. Several challenges that the Agency faces today also need to be overcome, including dealing with sensitive and proliferation-relevant information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and the different attitudes and cultures towards verification of NWSs and NNWSs

  20. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Jin Ho [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-05-15

    The objective of future verification of an FMCT (Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to the ban on producing fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established IAEA safeguards as a verification system, mainly for Non-Nuclear Weapon States (NNWSs), the IAEA's experience and expertise in this field are expected to make a significant contribution to setting up a future treaty's verification regime. This paper explores the potential role of the IAEA in verifying the future treaty by analyzing the Agency's verification abilities and the challenges to be expected. Furthermore, the concept of multilateral verification facilitated by the IAEA is examined as a means of providing credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to prepare to play a leading role in FMCT verification, as a form of multilateral verification, by taking advantage of its existing verification concepts, methods, and tools. Several challenges that the Agency faces today also need to be overcome, including dealing with sensitive and proliferation-relevant information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and the different attitudes and cultures towards verification of NWSs and NNWSs.

  1. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    already awash in fissile material and is increasingly threatened by the possible consequences of illicit trafficking in such material. The chemical field poses fewer problems. The ban on chemical weapons is a virtually complete post-Cold War regime, with state-of-the-art concepts and procedures of verification resulting from decades of negotiation. The detection of prohibited materials and activities is the common goal of the nuclear and chemical regimes for which the most intrusive and intensive procedures are activated by the three organizations. Accounting for the strictly peaceful application of dual-use items constitutes the bulk of the work of the inspectorates at the IAEA and the OPCW. A common challenge in both fields is the advance of science and technology in the vast nuclear and chemical industries and the ingenuity of some determined proliferators to deceive by concealing illicit activities under legitimate ones. Inspection procedures and technologies need to keep up with the requirement for flexibility and adaptation to change. The common objective of the three organizations is to assemble and analyze all relevant information in order to conclude reliably whether a State is or is not complying with its treaty obligations. The positive lessons learned from the IAEA's verification experience today are valuable in advancing concepts and technologies that might also benefit the other areas of WMD verification. Together with the emerging, more comprehensive verification practice of the OPCW, they may provide a useful basis for developing common standards, which may in turn help in evaluating the cost-effectiveness of verification methods for the Biological and Toxin Weapons Convention and other components of a WMD control regime

  2. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper reports on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparison of records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality checks. The different measurement methods and their limits of accuracy are analyzed and discussed in the paper

  3. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  4. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code is an integral part of this process. This document identifies the work performed and the documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed along with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete.

  5. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C system of NPP. • RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other software-related issues have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, remain important issues to be resolved, and have become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can enhance the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition decision coverage (MC/DC) test, module test – MATLAB/Simulink co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed

  6. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jaecheon; Heo, Gyunyoung

    2017-01-01

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C system of NPP. • RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other software-related issues have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, remain important issues to be resolved, and have become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can enhance the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition decision coverage (MC/DC) test, module test – MATLAB/Simulink co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed

  7. Symposium on International Safeguards: Preparing for Future Verification Challenges

    International Nuclear Information System (INIS)

    2010-01-01

    The purpose of the symposium is to foster dialogue and information exchange involving Member States, the nuclear industry and members of the broader nuclear non-proliferation community to prepare for future verification challenges. Topics addressed during the 2010 symposium include the following: - Supporting the global nuclear non-proliferation regime: Building support for strengthening international safeguards; Enhancing confidence in compliance with safeguards obligations; Legal authority as a means to enhance effectiveness and efficiency; Verification roles in support of arms control and disarmament. - Building collaboration and partnerships with other international forums: Other verification and non-proliferation regimes; Synergies between safety, security and safeguards regimes. - Improving cooperation between IAEA and States for safeguards implementation: Strengthening State systems for meeting safeguards obligations; Enhancing safeguards effectiveness and efficiency through greater cooperation; Lessons learned: recommendations for enhancing integrated safeguards implementation. - Addressing safeguards challenges in an increasingly interconnected world: Non-State actors and covert trade networks; Globalization of nuclear information and technology. - Preparing for the global nuclear expansion and increasing safeguards workload: Furthering implementation of the State-level concept and integrated safeguards; Information-driven safeguards; Remote data-driven safeguards inspections; Safeguards in States without comprehensive safeguards agreements. - Safeguarding advanced nuclear facilities and innovative fuel cycles: Proliferation resistance; Safeguards by design; Safeguards approaches for advanced facilities. - Advanced technologies and methodologies: For verifying nuclear material and activities; For detecting undeclared nuclear material and activities; For information collection, analysis and integration. - Enhancing the development and use of safeguards

  8. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    This is the '94 annual report of the CANDU advanced fuel verification test project. This report describes the out-of-pile hydraulic tests at the CANDU hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows; (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) Pressure drop tests at reactor operation condition (b) Strength test during reload at static condition (c) Impact test during reload at impact load condition (d) Endurance test for verification of fuel integrity during lifetime (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) Critical channel power sensitivity analysis (b) CANDU-6 channel flow analysis (c) Flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author).

  9. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    International Nuclear Information System (INIS)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh

    1995-07-01

    This is the '94 annual report of the CANDU advanced fuel verification test project. This report describes the out-of-pile hydraulic tests at the CANDU hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows; (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) Pressure drop tests at reactor operation condition (b) Strength test during reload at static condition (c) Impact test during reload at impact load condition (d) Endurance test for verification of fuel integrity during lifetime (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) Critical channel power sensitivity analysis (b) CANDU-6 channel flow analysis (c) Flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author)

  10. Practical experience with a local verification system for containment and surveillance sensors

    International Nuclear Information System (INIS)

    Lauppe, W.D.; Richter, B.; Stein, G.

    1984-01-01

    With the growing number of nuclear facilities, and with a number of large commercial bulk-handling facilities steadily coming into operation, the International Atomic Energy Agency faces increasing pressure to reduce its inspection effort. One means of meeting this requirement will be to deploy facility-based remote interrogation methods for its containment and surveillance instrumentation. Such a technical concept of remote interrogation was realized through the so-called LOVER system development, a local verification system for electronic safeguards seal systems. In the present investigations the application was extended to radiation monitoring by introducing an electronic interface between the electronic safeguards seal and the neutron detector electronics of a waste monitoring system. The paper discusses the safeguards motivation and background, the experimental setup of the safeguards system, and the performance characteristics of this LOVER system. First conclusions can be drawn from the performance results with respect to applicability in international safeguards. This comprises in particular the definition of design specifications for an integrated remote interrogation system for various types of containment and surveillance instruments, and the specification of safeguards applications employing such a system.

  11. The developments and verifications of trace model for IIST LOCA experiments

    Energy Technology Data Exchange (ETDEWEB)

    Zhuang, W. X. [Inst. of Nuclear Engineering and Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Wang, J. R.; Lin, H. T. [Inst. of Nuclear Energy Research, Taiwan, No. 1000, Wenhua Rd., Longtan Township, Taoyuan County 32546, Taiwan (China); Shih, C.; Huang, K. C. [Inst. of Nuclear Engineering and Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Dept. of Engineering and System Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China)

    2012-07-01

    The test facility IIST (INER Integral System Test) is a Reduced-Height and Reduced-Pressure (RHRP) integral test loop, which was constructed for the purpose of conducting thermal-hydraulic and safety analysis of Westinghouse three-loop PWR nuclear power plants. The main purpose of this study is to develop and verify TRACE models of IIST through the IIST small break loss of coolant accident (SBLOCA) experiments. First, two different IIST TRACE models, one with a pipe-vessel model and one with a 3-D vessel component, were built. The steady state and transient calculation results show that both TRACE models have the ability to simulate the related IIST experiments. Compared against the IIST SBLOCA experiment data, the 3-D vessel component model has shown better simulation capabilities, so it has been chosen for all further thermal-hydraulic studies. The second step is sensitivity studies of the two-phase multiplier and the subcooled liquid multiplier in the choked flow model, and of two correlation constants in the CCFL model. As a result, an appropriate set of multipliers and constants can be determined. In summary, a verified IIST TRACE model with a 3-D vessel component and fine-tuned choked flow and CCFL models is established for further studies on IIST experiments in the future. (authors)

  12. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is often painstakingly slow, but from time to time 'windows of opportunity' - moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  13. Examining the examiners: an online eyebrow verification experiment inspired by FISWG

    NARCIS (Netherlands)

    Zeinstra, Christopher Gerard; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2015-01-01

    In forensic face comparison, one of the features taken into account are the eyebrows. In this paper, we investigate human performance on an eyebrow verification task. This task is executed twice by participants: a "best-effort" approach and an approach using features based on forensic knowledge. The

  14. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    Science.gov (United States)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA, and point measurements collected in a solid-water phantom were used for comparison. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
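An independent MU check of this kind reduces to a handful of multiplicative beam factors. The sketch below is a minimal, hypothetical illustration in the spirit of the AAPM TG-114 formalism for an isocentric photon field; the factor set and all numeric values (reference dose rate, TMR, action level) are illustrative placeholders, not the authors' commissioned beam data.

```python
# Minimal sketch of an independent MU check for an isocentric (SAD) photon field,
# loosely following the AAPM TG-114 formalism. All factor values are illustrative.

def independent_mu(dose_cgy, dose_rate_ref=1.0, scp=1.0, tmr=1.0, wedge=1.0, off_axis=1.0):
    """MU = D / (D'_ref * Sc,p * TMR * WF * OAR)."""
    return dose_cgy / (dose_rate_ref * scp * tmr * wedge * off_axis)

# Example: 200 cGy per fraction, reference dose rate 1 cGy/MU,
# 10x10 cm field (Sc,p = 1.000), depth 10 cm (TMR = 0.785, illustrative).
mu = independent_mu(200.0, tmr=0.785)
print(round(mu, 1))  # → 254.8

# Agreement check against the TPS value at a hypothetical 5% action level:
tps_mu = 250.0
within_tolerance = abs(mu - tps_mu) / tps_mu <= 0.05
```

A deviation beyond the action level would flag the plan for manual review, which is the usual role of such a secondary check.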

  15. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical exercise that assesses whether the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
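The solution-verification step named above (Richardson extrapolation) can be sketched in a few lines. The following is a generic illustration, not code from GBS: given results on three grids refined by a constant ratio, it estimates the observed order of convergence and an extrapolated grid-converged value; the synthetic data mimic a second-order scheme.

```python
import math

# Solution verification via Richardson extrapolation: estimate the observed
# order of convergence p from solutions f1, f2, f3 on grids of spacing
# h/r^2, h/r, h (f1 on the finest grid), then extrapolate toward h -> 0.

def observed_order(f1, f2, f3, r=2.0):
    """p = ln|(f3 - f2) / (f2 - f1)| / ln(r)."""
    return math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)

def richardson_extrapolate(f1, f2, p, r=2.0):
    """Grid-converged estimate from the two finest solutions."""
    return f1 + (f1 - f2) / (r**p - 1.0)

# Synthetic second-order data: f(h) = 1 + 0.04 * h^2 at h = 1, 0.5, 0.25
f3, f2, f1 = 1.04, 1.01, 1.0025
p = observed_order(f1, f2, f3)               # ≈ 2.0
f_converged = richardson_extrapolate(f1, f2, p)  # ≈ 1.0
```

Comparing the observed order p with the formal order of the scheme is the standard acceptance test in solution verification; a mismatch points to a coding or convergence problem.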

  16. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  17. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
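At its core, a hydrostatic column calculation of this kind subtracts the weight of the fluid columns from the pressure at cavern depth. The sketch below is a simplified, hypothetical illustration of that balance, not the SNL HCM itself: densities and depths are invented, and a real model would treat the nitrogen density as varying with pressure and temperature.

```python
# Simplified hydrostatic column balance: wellhead pressure equals the pressure
# at cavern depth minus the weight of the fluid columns above it.
# Values are illustrative; the real HCM handles compressible nitrogen and
# detailed well geometry.

G = 9.80665  # standard gravity, m/s^2

def wellhead_pressure(p_at_depth_pa, layers):
    """layers: list of (density in kg/m^3, column height in m), ordered top-down."""
    p = p_at_depth_pa
    for rho, h in layers:
        p -= rho * G * h
    return p

# Illustrative: 12 MPa at depth, under 600 m of compressed nitrogen (~90 kg/m^3)
# sitting above 400 m of crude oil (~850 kg/m^3).
p_wh = wellhead_pressure(12.0e6, [(90.0, 600.0), (850.0, 400.0)])
print(round(p_wh / 1e6, 2), "MPa")  # → 8.14 MPa
```

A sustained drift of the measured wellhead pressure away from such a prediction, or movement of the nitrogen/oil interface, is what distinguishes small-leak behavior from a tight well.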

  18. Analytical verification of requirements for safe and timely lay-down of ...

    African Journals Online (AJOL)

    Analytical verification of requirements for safe and timely lay-down of an offshore S-lay pipeline abandonment head during some pipe-lay stops: a case study of the Forcados Yokri integrated pipeline project in the Nigerian shallow offshore.

  19. Mathematical Verification for Transmission Performance of Centralized Lightwave WDM-RoF-PON with Quintuple Services Integrated in Each Wavelength Channel

    Directory of Open Access Journals (Sweden)

    Shuai Chen

    2015-01-01

    Wavelength-division-multiplexing passive-optical-network (WDM-PON) has been recognized as a promising solution for "last mile" access as well as multi-broadband data services access for end users, and WDM-RoF-PON, which employs the radio-over-fiber (RoF) technique in WDM-PON, is an even more attractive approach for future broadband fiber and wireless access because of its support for centralized multiservice transmission and its transparency to bandwidth and signal modulation formats. As for multiservice development in WDM-RoF-PON, various system designs have been reported and verified via simulation or experiment to date, and the scheme with multiple services transmitted in each single wavelength channel is believed to have the highest bandwidth efficiency; however, the corresponding mathematical verification is still hard to find in the state-of-the-art literature. In this paper, the system design and data transmission performance of a quintuple-service integrated WDM-RoF-PON, which jointly employs carrier multiplexing and orthogonal modulation techniques, have been theoretically analyzed and verified in detail; moreover, the system design has been duplicated and verified experimentally, and a theoretical framework for such a WDM-RoF-PON scheme has thus been formed.

  20. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  1. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The necessity of verifying software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed.

  2. Role of experiments in soil-structure interaction methodology verification

    International Nuclear Information System (INIS)

    Srinivasan, M.G.; Kot, C.A.; Hsieh, B.J.

    1986-01-01

    Different kinds of experimental data may be useful for partial or full verification of SSI analysis methods. The great bulk of existing data comes from earthquake records and dynamic testing of as-built structures. However, much of this data may not be suitable for the present purpose, as the measurement locations were not selected with the verification of SSI analysis in mind and hence are too few in number or inappropriate in character. Data from scale-model testing that includes the soil in the model - both in-situ and laboratory - are relatively scarce. If the difficulty of satisfying the requirements of similitude laws on the one hand and simulating realistic soil behavior on the other can be resolved, scale-model testing may generate very useful data at relatively low cost. The current NRC-sponsored programs are expected to generate data very useful for verifying analysis methods for SSI. A systematic effort to inventory, evaluate and classify existing data is first necessary. This effort would probably show that more data are needed for a better understanding of SSI aspects such as the spatial variation of ground motion and the related issues of foundation input motion and soil stiffness. Collection of response data from in-structure and free-field (surface and downhole) locations through instrumentation of selected as-built structures in seismically active regions may be the most efficient way to obtain the needed data. Augmentation of this data with properly designed scale-model tests should also be considered.

  3. A hand held photo identity verification system for mobile applications

    International Nuclear Information System (INIS)

    Kumar, Ranajit; Upreti, Anil; Mahaptra, U.; Bhattacharya, S.; Srivastava, G.P.

    2009-01-01

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against the database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details such as name, designation and division. All transactions are recorded in the Simputer with time and date for future record. This system finds extensive applications in mobile identity verification in nuclear and other industries. (author)

  4. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  5. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  6. Cassini's Test Methodology for Flight Software Verification and Operations

    Science.gov (United States)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft comprises various subsystems, including the Attitude and Articulation Control Subsystem (AACS). Development of the AACS Flight Software (FSW) has been an ongoing effort, from design and development through operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight-experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  7. Engineering Physics Division integral experiments and their analyses

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    Integral experiments are performed as part of the Engineering Physics Division's ongoing research in the development and application of radiation shielding methods. Integral experiments performed at the Oak Ridge Electron Linear Accelerator (ORELA) under the Division's Magnetic Fusion program are designed to provide data against which ORNL and all other organizations involved in shielding calculations for fusion devices can test their calculational methods and interaction data. The Tower Shielding Facility (TSF) continues to be the primary source of integral data for fission reactor shielding design. The experiments performed at the TSF during the last few years have been sponsored by the Gas Cooled Fast Reactor (GCFR) program. During this reporting period, final documentation was also prepared for the remaining LMFBR shielding experiments, including an examination of streaming through annular slits and measurement of secondary gamma-ray production in reinforced concrete.

  8. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  9. Experimental Verification of Boyle's Law and the Ideal Gas Law

    Science.gov (United States)

    Ivanov, Dragia Trifonov

    2007-01-01

    Two new experiments are offered concerning the experimental verification of Boyle's law and the ideal gas law. To carry out the experiments, glass tubes, water, a syringe and a metal manometer are used. The pressure of the saturated water vapour is taken into consideration. For educational purposes, the experiments are characterized by their…

  10. Subsurface barrier verification technologies, informal report

    International Nuclear Information System (INIS)

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Uses of subsurface barriers include surrounding and/or containing buried waste, serving as secondary confinement of underground storage tanks, directing or containing subsurface contaminant plumes, and restricting remediation methods, such as vacuum extraction, to a limited area. To be most effective, the barriers should be continuous and, depending on use, have few or no breaches. A breach may be formed through numerous pathways, including discontinuous grout application, joints between panels, and cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector, and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers make detection of leaks challenging, and the problem is magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification.

  11. Experience of Integrated Safeguards Approach for Large-scale Hot Cell Laboratory

    International Nuclear Information System (INIS)

    Miyaji, N.; Kawakami, Y.; Koizumi, A.; Otsuji, A.; Sasaki, K.

    2010-01-01

    The Japan Atomic Energy Agency (JAEA) has been operating a large-scale hot cell laboratory, the Fuels Monitoring Facility (FMF), located near the experimental fast reactor Joyo at the Oarai Research and Development Center (JNC-2 site). The FMF conducts post-irradiation examinations (PIE) of fuel assemblies irradiated in Joyo. The assemblies are disassembled and non-destructive examinations, such as X-ray computed tomography tests, are carried out. Some of the fuel pins are cut into specimens and destructive examinations, such as ceramography and X-ray micro-analyses, are performed. Following PIE, the tested material, in the form of pins or segments, is shipped back to the Joyo spent fuel pond. In some cases, after reassembly of the examined irradiated fuel pins is completed, the fuel assemblies are shipped back to Joyo for further irradiation. For the IAEA to apply the integrated safeguards approach (ISA) to the FMF, a new verification system for the material shipping and receiving process between Joyo and the FMF has been established by the IAEA under technical collaboration among the Japan Safeguards Office (JSGO) of MEXT, the Nuclear Material Control Center (NMCC) and the JAEA. The main concept of receipt/shipment verification under the ISA for the JNC-2 site is as follows: under integrated safeguards, the FMF is treated as a Joyo-associated facility in terms of its safeguards system because it deals with the same spent fuels. Verification of the material shipping and receiving process between Joyo and the FMF is applied only to the declared transport routes and transport casks. The verification of the nuclear material contained in the cask is performed by the gross-defect method at the time of short-notice random interim inspections (RIIs), by measuring the surface neutron dose rate of the cask, which is filled with water to reduce radiation.
The JAEA performed a series of preliminary tests with the IAEA, the JSGO and the NMCC, and confirmed from the standpoint of the operator that this

  12. Verification experiment on the downblending of high enriched uranium (HEU) at the Portsmouth Gaseous Diffusion Plant. Digital video surveillance of the HEU feed stations

    International Nuclear Information System (INIS)

    Martinez, R.L.; Tolk, K.; Whiting, N.; Castleberry, K.; Lenarduzzi, R.

    1998-01-01

    As part of a Safeguards Agreement between the US and the International Atomic Energy Agency (IAEA), the Portsmouth Gaseous Diffusion Plant, Piketon, Ohio, was added to the list of facilities eligible for the application of IAEA safeguards. Currently, the facility is in the process of downblending excess inventory of HEU to low enriched uranium (LEU) from US defense related programs for commercial use. An agreement was reached between the US and the IAEA that would allow the IAEA to conduct an independent verification experiment at the Portsmouth facility, resulting in the confirmation that the HEU was in fact downblended. The experiment provided an opportunity for the DOE laboratories to recommend solutions/measures for new IAEA safeguards applications. One of the measures recommended by Sandia National Laboratories (SNL), and selected by the IAEA, was a digital video surveillance system for monitoring activity at the HEU feed stations. This paper describes the SNL implementation of the digital video system and its integration with the Load Cell Based Weighing System (LCBWS) from Oak Ridge National Laboratory (ORNL). The implementation was based on commercially available technology that also satisfied IAEA criteria for tamper protection and data authentication. The core of the Portsmouth digital video surveillance system was based on two Digital Camera Modules (DMC-14) from Neumann Consultants, Germany

  13. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services

    Directory of Open Access Journals (Sweden)

    Alexandre Pinheiro

    2018-03-01

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.
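    The core idea of verifying stored data without retaining it can be illustrated with a minimal challenge-response sketch, assuming a simple salted-digest scheme (an illustrative simplification, not the protocol proposed in the paper; all names are hypothetical):

```python
import hashlib
import os

def precompute_tokens(data: bytes, n_challenges: int):
    # Client side, before upload: derive n salted digests of the file.
    # Only the (nonce, digest) pairs are kept locally -- not the file.
    tokens = []
    for _ in range(n_challenges):
        nonce = os.urandom(16)
        digest = hashlib.sha256(nonce + data).hexdigest()
        tokens.append((nonce, digest))
    return tokens

def provider_response(stored_data: bytes, nonce: bytes) -> str:
    # Service side: recompute the digest over the bytes actually stored.
    return hashlib.sha256(nonce + stored_data).hexdigest()

def verify_integrity(tokens, respond) -> bool:
    # Client side: spend one unused token per verification round.
    nonce, expected = tokens.pop()
    return respond(nonce) == expected

# Usage: an intact copy passes the check, a tampered copy fails.
original = b"archived records"
tokens = precompute_tokens(original, n_challenges=2)
ok = verify_integrity(tokens, lambda n: provider_response(original, n))
bad = verify_integrity(tokens, lambda n: provider_response(b"tampered!", n))
```

    Each nonce is single-use, so the provider cannot cache answers; the trade-off is that the client must provision tokens in advance.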

  14. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services.

    Science.gov (United States)

    Pinheiro, Alexandre; Dias Canedo, Edna; de Sousa Junior, Rafael Timoteo; de Oliveira Albuquerque, Robson; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-03-02

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.

  15. A Secure Framework for Location Verification in Pervasive Computing

    Science.gov (United States)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The relatively new pervasive computing paradigm has changed the way people use computing devices. For example, a person can use a mobile device to obtain its location information at any time and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, or may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework, VerPer, for secure location verification in a pervasive computing environment. Real-world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.
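    The basic consistency check behind such schemes can be sketched as follows; this is a generic illustration, not the VerPer design: a claimed position is accepted only if it agrees, within tolerance, with ranges independently measured by trusted anchor nodes.

```python
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def verify_claim(claimed_pos, anchors, measured_ranges, tolerance=1.0):
    # Accept the claim only if it is geometrically consistent with the
    # range each trusted anchor measured independently (e.g. from
    # signal time-of-flight).
    return all(
        abs(distance(claimed_pos, a) - r) <= tolerance
        for a, r in zip(anchors, measured_ranges)
    )

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
honest = (3.0, 4.0)
ranges = [distance(honest, a) for a in anchors]

accepted = verify_claim(honest, anchors, ranges)      # honest claim
rejected = verify_claim((8.0, 8.0), anchors, ranges)  # forged claim
```

    A forged position fails because no single point can match all anchor ranges at once, which is why multiple independent anchors are essential.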

  16. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating-point computations integrated within the core, designed to reduce cost and complexity. The 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single-precision floating-point multiplier, a floating-point adder/subtractor for floating-point operations, and a 32 x 32 Booth multiplier added to the ARM7 integer core. The binary representati...

  17. The regional energy integration: the latin-american experiences

    International Nuclear Information System (INIS)

    2003-01-01

    Paths to regional economic integration are not identical and have different repercussions on markets and on the evolution of the energy industries. Latin America offers a variety of experiences for evaluating the stakes and limits of each regional integration. These limits lead to searches for solutions that include indisputable convergences. The first part of this document presents the genesis of these regional economic integration experiences in Latin America; the second part studies the energy consequences of the liberal ALENA (NAFTA) and of the more political MERCOSUR. (A.L.B.)

  18. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    Science.gov (United States)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the Glass and Façade Technology Research Group at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing, comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.

  19. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in situations where both qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and deterrence of diversion. Payoffs used in the theory were quantified for conventional safeguards and integrated safeguards using the Analytic Hierarchy Process (AHP). It then became possible to estimate the detection probability under integrated safeguards that has deterrence capability equivalent to the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP involves some ambiguity in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering detection probabilities under integrated safeguards. (author)
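    The AHP step, turning pairwise importance judgements into a priority vector, can be sketched as follows; the judgement values here are hypothetical and have nothing to do with the study's actual payoff data:

```python
def ahp_weights(M, iters=100):
    # Priority weights of a pairwise-comparison matrix, approximated as
    # the principal eigenvector via power iteration with normalisation.
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical judgements on Saaty's 1-9 scale: factor A is 3x as
# important as B and 5x as important as C; B is 2x as important as C.
M = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(M)
```

    For a near-consistent matrix like this one, the resulting weights come out roughly (0.65, 0.23, 0.12), i.e. factor A dominates; in practice a consistency ratio is also computed to reject contradictory judgements.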

  20. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Stager, Ron

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada, and experience gained here may be applied to other mines decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine whether protocols were followed and whether the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site, and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed, as are aspects of the data management and analysis methods required for the large amount of data collected during these surveys. Recommendations were made for implementing future surveys and reporting their data in order to confirm that remediation is complete. (authors)

  1. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    Baier, Roman; Zander, Andre

    2008-01-01

    For aging and plant life management, the integrity of mechanical components and structures is one of the key objectives. In order to ensure this integrity it is essential to implement comprehensive aging management, applied to all safety-relevant mechanical systems and components, civil structures, electrical systems, and instrumentation and control (I and C). The following aspects should be covered: - Identification and assessment of relevant degradation mechanisms; - Verification and evaluation of the quality status of all safety-relevant systems, structures and components (SSCs); - Verification and modernization of I and C and electrical systems; - Reliable and up-to-date documentation. To support these tasks, AREVA NP GmbH has developed the computer program COMSY, which draws on more than 30 years of experience from research activities and operational practice. The program provides the option to perform a plant-wide screening to identify system areas that are sensitive to specific degradation mechanisms. Another function is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of individual staff availability. (authors)

  2. Horn clause verification with convex polyhedral abstraction and tree automata-based refinement

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly, we handle tree automata rather than string automata and can thereby capture traces in any Horn clause derivations … underlying the Horn clauses. Experiments using linear constraint problems and the abstract domain of convex polyhedra show that the refinement technique is practical and that iteration of abstract interpretation with tree automata-based refinement solves many challenging Horn clause verification problems. We … compare the results with other state-of-the-art Horn clause verification tools.

  3. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    How to manage the trade-off between the need for transparency and the concern about disclosure of sensitive information would be a key issue during negotiation of the FMCT verification provisions. This paper explores general concerns about FMCT verification and demonstrates what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states, and the three states that operate unsafeguarded nuclear facilities, become parties. One focus in negotiating the FMCT will be verification, and appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating, to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down; thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper focuses on the verification of such shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities, and some safeguards measures, such as environmental sampling, might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed.
Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  4. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine vision-based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results demonstrate that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.
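    The pixel-equivalent idea, calibrating how much physical length one image pixel represents and then converting measured pixel spans into lengths, can be sketched in a few lines; the numbers are hypothetical and the paper's actual calibration method is more involved:

```python
def pixel_equivalent(known_length_mm: float, measured_pixels: float) -> float:
    # Calibration: physical length represented by one pixel, from a
    # reference interval of known length imaged by the camera.
    return known_length_mm / measured_pixels

def graduation_error_mm(nominal_mm: float, measured_pixels: float,
                        mm_per_px: float) -> float:
    # Deviation of a measured graduation interval from its nominal length.
    return measured_pixels * mm_per_px - nominal_mm

# Hypothetical calibration: a 10 mm reference interval spans 2000 pixels.
mm_per_px = pixel_equivalent(10.0, 2000)          # 0.005 mm per pixel
# A graduation nominally 10 mm long measures 2003 pixels in the image.
err = graduation_error_mm(10.0, 2003, mm_per_px)  # positive -> too long
```

    The achievable precision is bounded by the pixel equivalent itself, which is one reason the calibration step is singled out in the paper.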

  5. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and of the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluating the measurement system and determining systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  6. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  7. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  8. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    Science.gov (United States)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear. Even simplifications of the equations describing nature usually end up as nonlinear partial differential equations. The transport (advection-diffusion-reaction, ADR) equation is a pivotal equation in atmospheric sciences and water quality. This nonlinear equation must be solved numerically for practical purposes, so academics and engineers rely heavily on numerical codes. Numerical codes therefore require verification before they are used for applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is solved as described in the design document. CFD verification is not a straightforward, well-defined course: only a complete test suite can uncover all the limitations and bugs, and results must be assessed to distinguish bug-induced defects from the innate limitations of a numerical scheme. As Roache (2009) said, numerical verification is a state-of-the-art procedure; sometimes novel tricks work out. This study conveys a synopsis of the experience we gained during a comprehensive verification process for a transport solver. A test suite was designed, including unit tests and algorithmic tests, layered in complexity along several dimensions from simple to complex. Acceptance criteria were defined for the desired capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial shape preservation. At the beginning, a mesh convergence study, which is the main craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered, and a new solution was derived. In more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and manufactured solutions. Then, two bugs which were concealed during the mesh convergence
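    The core of such a mesh convergence study is checking the observed order of accuracy against the scheme's theoretical order, computed from solutions on three systematically refined grids; a minimal sketch with synthetic data (not the solver's results):

```python
import math

def observed_order(f_coarse: float, f_medium: float, f_fine: float,
                   r: float) -> float:
    # Observed order of accuracy from solutions on three grids whose
    # spacing is refined by a constant ratio r (Roache-style formula):
    # p = ln(|f1 - f2| / |f2 - f3|) / ln(r), f1 being the coarsest.
    return math.log(abs(f_coarse - f_medium) /
                    abs(f_medium - f_fine)) / math.log(r)

# Synthetic scheme whose discretization error behaves exactly as C*h^2.
exact, C = 1.0, 0.3
f = lambda h: exact + C * h ** 2

# Grid spacings 0.4, 0.2, 0.1 (refinement ratio r = 2).
p = observed_order(f(0.4), f(0.2), f(0.1), r=2.0)
```

    A healthy second-order scheme yields p close to 2; a hidden bug typically shows up as p degrading toward 1 (or worse) under refinement, which is how convergence studies expose defects that single-grid tests miss.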

  9. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) remains at the software simulation stage, and few verification platforms for WVSNs are available for use. This situation seriously restricts the transition from WVSN theory to practical application. Therefore, it is necessary to study the construction of WVSN verification platforms. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, selects AODV as the routing protocol, and sets up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve image acquisition, coding and wireless transmission, and can obtain effective inter-node distance parameters, which lays a good foundation for follow-up applications of WVSNs.

  10. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.

    2003-01-01

    Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components depending on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators who are not aware of system functionality details. It is therefore necessary to help the operator control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components, and also for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed in the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. A test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about component hierarchy and dependencies, and allowing the operator to verify the fun...

  11. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and the documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed along with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is complete.

  12. Verification on reliability of heat exchanger for primary cooling system

    International Nuclear Information System (INIS)

    Koike, Sumio; Gorai, Shigeru; Onoue, Ryuji; Ohtsuka, Kaoru

    2010-07-01

    Prior to the JMTR refurbishment, verification of the reliability of the heat exchangers for the primary cooling system was carried out to investigate the integrity of components in continuous use. No significant corrosion, reduction of tube thickness, or cracking was observed in the heat exchangers, and their integrity was confirmed. For long-term use of the heat exchangers, maintenance based on periodical inspection and a long-term maintenance plan is scheduled. (author)

  13. Formal verification of reactor process control software using assertion checking environment

    International Nuclear Information System (INIS)

    Sharma, Babita; Balaji, Sowmya; John, Ajith K.; Bhattacharjee, A.K.; Dhodapkar, S.D.

    2005-01-01

    The Assertion Checking Environment (ACE) was developed in-house for carrying out formal (rigorous/mathematical) functional verification of embedded software written in MISRA C. MISRA C is an industrially sponsored safe subset of the C programming language and is well accepted in the automotive and aerospace industries. ACE uses a static assertion checking technique for verification of MISRA C programs. First, the functional specifications of the program are derived in the form of pre- and post-conditions for each C function. These pre- and post-conditions are then introduced as assertions (formal comments) in the program code. The annotated C code is then formally verified using ACE. In this paper we present our experience of using ACE for the formal verification of process control software of a nuclear reactor. The Software Requirements Document (SRD) contained textual specifications of the process control software. The SRD was used by the designers to draw logic diagrams, which were given as input to a code generator. The verification of the generated C code was done at two levels: (i) verification against specifications derived from the logic diagrams, and (ii) verification against specifications derived from the SRD. In this work we checked approximately 600 functional specifications of software comprising roughly 15,000 lines of code. (author)
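    ACE checks such pre- and post-conditions statically on MISRA C; the annotation style itself can be illustrated with runtime assertions in Python (a hypothetical example, not code from the reactor software):

```python
def saturating_add(x: int, y: int, limit: int) -> int:
    # Pre-conditions (the ACE-style formal comments, checked here at
    # runtime rather than proved statically):
    assert x >= 0 and y >= 0, "pre: inputs must be non-negative"
    assert limit > 0, "pre: saturation limit must be positive"

    result = min(x + y, limit)

    # Post-conditions: the result never exceeds the limit, and it is
    # the exact sum whenever no saturation occurs.
    assert result <= limit, "post: bounded by limit"
    assert (x + y > limit) or (result == x + y), "post: exact if unsaturated"
    return result

unsaturated = saturating_add(3, 4, 10)  # sum below the limit
saturated = saturating_add(7, 9, 10)    # sum clipped at the limit
```

    A static checker like ACE proves such assertions hold for all admissible inputs, whereas the runtime form above only checks the inputs actually supplied; the annotation discipline is the same in both cases.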

  14. Modeling the dynamics of internal flooding - verification analysis

    International Nuclear Information System (INIS)

    Filipov, K.

    2011-01-01

    The results of a verification analysis of the WATERFOW software, developed for the purposes of reactor building internal flooding analysis, are presented. The integrated code MELCOR was selected for benchmarking. Considering the complex structure of the reactor building, sample tests were used to cover the characteristic points of the internal flooding analysis. The inapplicability of MELCOR to the internal flooding study has been demonstrated

  15. Verification of kinetic parameters of coupled fast-thermal core HERBE

    International Nuclear Information System (INIS)

    Pesic, M.; Marinkovic, P.; Milosevic, M.; Nikolic, D.; Zavaljevski, N.; Milovanovic, S.; Ljubenov, V.

    1997-03-01

    The HERBE system is a new coupled fast-thermal core constructed in 1989 at the RB critical heavy water assembly of the VINCA Institute. It was designed to improve experimental possibilities in fast neutron fields and to allow experimental verification of reactor design-oriented methods. This paper gives an overview of the experiments for verification of kinetic parameters carried out at the HERBE system, with a short description of each and a comparison of experimental and calculated results. A brief introduction to the computer codes used in the calculations is also presented. (author)

  16. Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities

    Science.gov (United States)

    Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.

    2009-01-01

    Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software, which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.

  17. Design and Realization of Avionics Integration Simulation System Based on RTX

    Directory of Open Access Journals (Sweden)

    Wang Liang

    2016-01-01

    Full Text Available As aircraft avionics systems become more and more complicated, it is hard to test and verify real avionics systems. A design and realization method for an avionics integration simulation system based on RTX was developed to address this problem. In this simulation system, computer software and hardware resources are fully utilized. All kinds of aircraft avionics system hardware-in-the-loop (HIL) simulations can be implemented on this platform. The simulation method provides the technical foundation for testing and verifying real avionics systems, and the research has recorded valuable data using the newly developed method. The experimental results show that the avionics integration simulation system performed well in a helicopter avionics HIL simulation experiment, and the simulation results provided the necessary basis for verification of the real helicopter avionics system.

  18. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services †

    Science.gov (United States)

    2018-01-01

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client's point of view and to implement this CSS in public clouds, since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should be able to entrust their data to the cloud for a long period of time, without the burden of keeping copies of the original data, and without accessing the whole content for verifications regarding data preservation. Given these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider's behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance. PMID:29498641
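The abstract does not specify the paper's protocol, but the goal — checking integrity of cloud-held data without downloading all of it — can be illustrated with a minimal challenge-response sketch using stdlib HMACs. The block size, key handling, and function names here are illustrative assumptions, not the paper's design.

```python
import hmac, hashlib

# Minimal challenge-response integrity check: the owner keeps one small MAC
# tag per block, challenges a random block index, and verifies only that
# block. This is NOT the paper's protocol, just the general idea.
BLOCK = 4  # bytes per block (toy value)

def precompute_tags(key, data):
    """Owner side: one MAC tag per block, kept locally (small footprint)."""
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    return [hmac.new(key, b, hashlib.sha256).hexdigest() for b in blocks]

def prove(data, index):
    """Cloud side: return only the challenged block."""
    return data[index * BLOCK:(index + 1) * BLOCK]

def verify(key, tags, index, block):
    """Owner side: recompute the MAC of the returned block."""
    expected = hmac.new(key, block, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tags[index], expected)

key = b"owner-secret"
data = b"ABCDEFGHIJKLMNOP"        # pretend this lives in the cloud
tags = precompute_tags(key, data)

idx = 2                           # would be chosen at random in practice
assert verify(key, tags, idx, prove(data, idx))                      # intact
assert not verify(key, tags, idx, prove(b"ABCDEFGHxJKLMNOP", idx))   # tampered
```

Real schemes of this kind add unpredictable challenges, aggregated proofs, and protection against the provider precomputing answers; the sketch shows only the core verification step.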

  19. Sustaining a verification regime in a nuclear weapon-free world. VERTIC research report no. 4

    International Nuclear Information System (INIS)

    Moyland, S. van

    1999-01-01

    Sustaining high levels of commitment to and enthusiasm for the verification regime in a nuclear weapon-free world (NWFW) would be a considerable challenge, but the price of failure would be high. No verification system for a complete ban on a whole class of weapons of mass destruction (WMD) has been in existence long enough to provide a precedent or the requisite experience. Nevertheless, lessons from the International Atomic Energy Agency's (IAEA) nuclear safeguards system are instructive. A potential problem over the long haul is the gradual erosion of the deterrent effect of verification that may result from the continual overlooking of minor instances of non-compliance. Flaws in the verification system must be identified and dealt with early, lest they also corrode the system. To achieve this, the verification organisation's inspectors and analytical staff will need sustained support, encouragement, resources and training. In drawing attention to weaknesses, they must be supported by management and at the political level. The leaking of sensitive information, either industrial or military, by staff of the verification regime is a potential problem. 'Managed access' techniques should be constantly examined and improved. The verification organisation and states parties will need to sustain close co-operation with the nuclear and related industries. Frequent review mechanisms must be established, and states must invest time and effort to make them effective. Another potential problem is the withering of resources for sustained verification. Verification organisations tend to be pressured by states to cut, or at least cap, costs even if the verification workload increases. The verification system must be as effective as knowledge and experience allow. The organisation will need continuously to update its scientific methods and technology. This requires in-house resources plus external research and development (R and D). 
Universities, laboratories and industry need incentives to

  20. Verification of Liveness Properties on Closed Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Andersen, Mathias; Larsen, Heine G.; Srba, Jiri

    2012-01-01

    Verification of closed timed models by explicit state-space exploration methods is an alternative to the widespread symbolic techniques based on difference bound matrices (DBMs). A few experiments found in the literature confirm that, for the reachability analysis of timed automata, explicit techniques can compete with DBM-based algorithms, at least for situations where the constants used in the models are relatively small. To the best of our knowledge, the explicit methods have not yet been employed in the verification of liveness properties in Petri net models extended with time. We present…

  1. Experience Supporting the Integration of LHC Experiments Software Framework with the LCG Middleware

    CERN Document Server

    Santinelli, Roberto

    2006-01-01

    The LHC experiments are currently preparing for data acquisition in 2007 and, because of the large amount of required computing and storage resources, they have decided to embrace the grid paradigm. The LHC Computing Grid (LCG) project provides and operates a computing infrastructure suitable for data handling, Monte Carlo production and analysis. While LCG offers a set of high-level services intended to be generic enough to accommodate the needs of different Virtual Organizations, the LHC experiments' software frameworks and applications are very specific and focused on their computing and data models. The LCG Experiment Integration Support (EIS) team works in close contact with the experiments, the middleware developers and the LCG certification and operations teams to integrate the underlying grid middleware with the experiment-specific components. This strategic position between the experiments and the middleware suppliers allows the EIS team to play a key role at the communications level between the customers and the service provi...

  2. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
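PKIDEV itself relies on PKI digital signatures and trusted timestamping, which need key infrastructure beyond a short sketch. The stdlib sketch below shows only the tamper-evidence side of such schemes: each evidence record binds a content hash to a timestamp and to the previous record, so any later alteration breaks the chain. The record fields and function names are illustrative, not PKIDEV's.

```python
import hashlib, json

# Toy hash-chained evidence log. Real PKIDEV adds PKI signatures and secure
# time-stamping by a trusted authority; both are omitted here.
def add_record(chain, evidence: bytes, ts: float):
    prev = chain[-1]["digest"] if chain else "0" * 64
    body = {"ts": ts,
            "evidence": hashlib.sha256(evidence).hexdigest(),
            "prev": prev}
    body["digest"] = hashlib.sha256(
        json.dumps({k: body[k] for k in ("ts", "evidence", "prev")},
                   sort_keys=True).encode()).hexdigest()
    chain.append(body)

def chain_valid(chain):
    prev = "0" * 64
    for rec in chain:
        body = {k: rec[k] for k in ("ts", "evidence", "prev")}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["digest"] != recomputed:
            return False
        prev = rec["digest"]
    return True

chain = []
add_record(chain, b"disk image #1", ts=1700000000.0)
add_record(chain, b"network capture", ts=1700000100.0)
assert chain_valid(chain)

chain[0]["evidence"] = "tampered"   # any edit invalidates the chain
assert not chain_valid(chain)
```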

  3. A Method to Integrate GMM, SVM and DTW for Speaker Recognition

    Directory of Open Access Journals (Sweden)

    Ing-Jr Ding

    2014-01-01

    Full Text Available This paper develops an effective and efficient scheme to integrate the Gaussian mixture model (GMM), support vector machine (SVM), and dynamic time warping (DTW) for automatic speaker recognition. GMM and SVM are two popular classifiers for speaker recognition applications. DTW is a fast and simple template matching method frequently seen in speech recognition applications. In this work, DTW does not perform speech recognition; instead, it is employed as a verifier for verification of valid speakers. The proposed combination scheme of GMM, SVM and DTW, called SVMGMM-DTW, is a two-phase verification process: GMM-SVM verification in the first phase and DTW verification in the second phase. By providing a double check of a speaker's identity, it becomes difficult for impostors to pass the security protection, so the safety of speaker recognition systems is largely increased. A series of experiments on door access control applications demonstrated the superiority of the developed SVMGMM-DTW in speaker recognition accuracy.
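The second-phase verifier is plain dynamic time warping, which can be sketched in a few lines. The 1-D toy sequences and the 0.5 threshold below are illustrative; real speaker features would be frames of e.g. MFCC vectors.

```python
# Classic DTW distance between two sequences, the template-matching step
# used as the second-phase verifier in the SVMGMM-DTW scheme.
def dtw_distance(a, b):
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# A claimed speaker passes the second phase when the DTW distance to the
# enrolled template is below a threshold (0.5 is an arbitrary choice here).
template = [1.0, 2.0, 3.0, 2.0]
assert dtw_distance(template, [1.0, 1.9, 3.1, 2.0]) < 0.5   # accept
assert dtw_distance(template, [5.0, 5.0, 5.0, 5.0]) > 0.5   # reject
```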

  4. Verification of RRC Ki code package for neutronic calculations of WWER core with GD

    International Nuclear Information System (INIS)

    Aleshin, S.S.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Pavlov, V.I.; Pavlovitchev, A.M.; Sidorenko, V.D.; Tsvetkov, V.M.

    2001-01-01

    The report presents verification results for the TVS-M/PERMAK-A/BIPR-7A code package for WWER neutronic calculations, as applied to systems containing U-Gd pins. The verification is based on corresponding benchmark calculations, data from critical experiments, and operational data obtained from WWER units with Gd. The comparison results are discussed (Authors)

  5. INTEGRATION POLICY TOWARDS IMMIGRANTS: CURRENT EXPERIENCE

    Directory of Open Access Journals (Sweden)

    Nadiia Bureiko

    2012-03-01

    Full Text Available In the contemporary world the intensity of immigration movements is constantly increasing. Countries that experience large immigrant flows face numerous problems that must be solved. The article studies the current immigration flows in EU countries, the United States of America and Canada and presents three main models of integration policy towards immigrants: political assimilation, functional integration and the multicultural model. Separate models are distinguished for the integration of Muslims. The author examines the peculiarities of every model and the conclusions provided by the Migrant Integration Policy Index (MIPEX) concerning the situation of immigrants' integration in 31 countries in 2011. Among the policy indicators, the first to be defined are political participation, education, labour market mobility and anti-discrimination. The situation with immigrants' integration in Ukraine is also studied, as it is gaining considerable attention from the authorities and the public. The measures and practical steps taken regarding this situation in Ukraine in recent years are analyzed using information provided by the State Migration Service of Ukraine.

  6. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  7. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
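The likelihood-ratio rule from the two records above is easy to state concretely: accept a claimed identity when the ratio of the user-model likelihood to the background-model likelihood exceeds a threshold. The sketch below uses 1-D Gaussian models; the paper works with fixed-length feature vectors, and these particular means, variances, and threshold are illustrative assumptions.

```python
import math

# Likelihood-ratio verification with 1-D Gaussian user/background models.
def gauss_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def accept(x, user=(0.0, 1.0), background=(3.0, 4.0), threshold=1.0):
    # Accept the claim when P(x | user) / P(x | background) > threshold.
    lr = gauss_pdf(x, *user) / gauss_pdf(x, *background)
    return lr > threshold

assert accept(0.1)      # feature near the user model -> genuine
assert not accept(3.0)  # feature near the background model -> impostor
```

The Neyman-Pearson lemma is what makes this ratio the optimal similarity measure for a single-user (detection) problem, which is the equivalence both records establish.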

  8. Initial Verification and Validation Assessment for VERA

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States); Athe, Paridhi [North Carolina State Univ., Raleigh, NC (United States); Jones, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetzler, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-04-01

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  9. Experiences of giving and receiving care in traumatic brain injury: An integrative review.

    Science.gov (United States)

    Kivunja, Stephen; River, Jo; Gullick, Janice

    2018-04-01

    To synthesise the literature on the experiences of giving or receiving care for traumatic brain injury for people with traumatic brain injury, their family members and nurses in hospital and rehabilitation settings. Traumatic brain injury represents a major source of physical, social and economic burden. In the hospital setting, people with traumatic brain injury feel excluded from decision-making processes and perceive impatient care. Families describe inadequate information and support for psychological distress. Nurses find the care of people with traumatic brain injury challenging particularly when experiencing heavy workloads. To date, a contemporary synthesis of the literature on people with traumatic brain injury, family and nurse experiences of traumatic brain injury care has not been conducted. Integrative literature review. A systematic search strategy guided by the PRISMA statement was conducted in CINAHL, PubMed, Proquest, EMBASE and Google Scholar. Whittemore and Knafl's (Journal of Advanced Nursing, 52, 2005, 546) integrative review framework guided data reduction, data display, data comparison and conclusion verification. Across the three participant categories (people with traumatic brain injury/family members/nurses) and sixteen subcategories, six cross-cutting themes emerged: seeking personhood, navigating challenging behaviour, valuing skills and competence, struggling with changed family responsibilities, maintaining productive partnerships and reflecting on workplace culture. Traumatic brain injury creates changes in physical, cognitive and emotional function that challenge known ways of being in the world for people. This alters relationship dynamics within families and requires a specific skill set among nurses. Recommendations include the following: (i) formal inclusion of people with traumatic brain injury and families in care planning, (ii) routine risk screening for falls and challenging behaviour to ensure that controls are based on

  10. Sustainable Development Impacts of Nationally Appropriate Mitigation Actions: An integrated approach to assessment of co-benefits based on experience with the Clean Development Mechanism

    DEFF Research Database (Denmark)

    Olsen, Karen Holm

    to assess the SD impacts of NAMAs. This paper argues for a new integrated approach to assess NAMAs' SD impacts that consists of SD indicators, procedures for stakeholder involvement and safeguards against negative impacts. The argument is based on a review of experience with the CDM's contribution to SD… particularly how a combined process and results approach known from the CDM SD Tool can be applied to develop a strong approach for SD assessment of NAMAs, based on a comparison of similarities and differences between NAMAs and the CDM. Five elements of a new approach towards assessment of NAMAs' SD impacts… are suggested based on emerging approaches and methodologies for monitoring, reporting and verification (MRV) of greenhouse gas reductions and SD impacts of NAMAs…

  11. VERIFICATION OF THE SENTINEL-4 FOCAL PLANE SUBSYSTEM

    Directory of Open Access Journals (Sweden)

    C. Williges

    2017-05-01

    Full Text Available The Sentinel-4 payload is a multi-spectral camera system designed to monitor atmospheric conditions over Europe. The German Aerospace Center (DLR) in Berlin, Germany conducted the verification campaign of the Focal Plane Subsystem (FPS) on behalf of Airbus Defense and Space GmbH, Ottobrunn, Germany. The FPS consists, inter alia, of two Focal Plane Assemblies (FPAs), one for the UV-VIS spectral range (305 nm … 500 nm), the second for the NIR (750 nm … 775 nm). In this publication, we present in detail the opto-mechanical laboratory set-up of the verification campaign of the Sentinel-4 Qualification Model (QM), which will also be used for the upcoming Flight Model (FM) verification. The test campaign consists mainly of radiometric tests performed with an integrating sphere as a homogeneous light source. The FPAs have to be operated at 215 K ± 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors as well as a reference detector homogeneously over a distance of approximately 1 m from outside the TVC. Furthermore, selected test analyses and results are presented, showing that the Sentinel-4 FPS meets its specifications.

  12. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first for the state of a database object (correct or incorrect), and the second for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with higher completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road database verification can be designed.
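The fusion step described above can be sketched concretely: applicability converts each module's verdict into a mass function over {correct, incorrect, unknown} (mass on "unknown" standing for the whole frame), and modules are combined with Dempster's rule. The particular probabilities below are invented, and the paper's exact mapping may differ.

```python
# Dempster-Shafer style fusion of two road-verification modules.
def to_masses(p_correct, p_applicable):
    """Map (P(correct), P(model applicable)) to masses on C, I, U."""
    return {"C": p_correct * p_applicable,
            "I": (1.0 - p_correct) * p_applicable,
            "U": 1.0 - p_applicable}    # mass on the whole frame {C, I}

def dempster(m1, m2):
    """Dempster's rule of combination over the frame {C, I}."""
    conflict = m1["C"] * m2["I"] + m1["I"] * m2["C"]
    norm = 1.0 - conflict
    return {"C": (m1["C"] * m2["C"] + m1["C"] * m2["U"] + m1["U"] * m2["C"]) / norm,
            "I": (m1["I"] * m2["I"] + m1["I"] * m2["U"] + m1["U"] * m2["I"]) / norm,
            "U": m1["U"] * m2["U"] / norm}

module_a = to_masses(p_correct=0.9, p_applicable=0.8)  # confident, applicable
module_b = to_masses(p_correct=0.6, p_applicable=0.3)  # model barely applies
fused = dempster(module_a, module_b)

assert abs(sum(fused.values()) - 1.0) < 1e-9
assert fused["C"] > fused["I"]          # road object judged correct
```

Note how low applicability pushes mass toward "unknown" instead of forcing a correct/incorrect decision, which is the point of the three-class state space.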

  13. The Healy Clean Coal Project: Design verification tests

    International Nuclear Information System (INIS)

    Guidetti, R.H.; Sheppard, D.B.; Ubhayakar, S.K.; Weede, J.J.; McCrohan, D.V.; Rosendahl, S.M.

    1993-01-01

    As part of the Healy Clean Coal Project, TRW Inc., the supplier of the advanced slagging coal combustors, has successfully completed design verification tests on the major components of the combustion system at its Southern California test facility. These tests, which included the firing of a full-scale precombustor with a new non-storage direct coal feed system, supported the design of the Healy combustion system and its auxiliaries performed under Phase 1 of the project. Two 350 million BTU/hr combustion systems have been designed and are now ready for fabrication and erection, as part of Phase 2 of the project. These systems, along with a back-end Spray Dryer Absorber system, designed and supplied by Joy Technologies, will be integrated with a Foster Wheeler boiler for the 50 MWe power plant at Healy, Alaska. This paper describes the design verification tests and the current status of the project

  14. Clinical Skills Verification, Formative Feedback, and Psychiatry Residency Trainees

    Science.gov (United States)

    Dalack, Gregory W.; Jibson, Michael D.

    2012-01-01

    Objective: The authors describe the implementation of Clinical Skills Verification (CSV) in their program as an in-training assessment intended primarily to provide formative feedback to trainees, strengthen the supervisory experience, and identify the need for remediation of interviewing skills, and secondarily to demonstrate resident competence…

  15. Entanglement verification and its applications in quantum communication

    International Nuclear Information System (INIS)

    Haeseler, Hauke

    2010-01-01

    coherent storage of light, we focus on the storage of squeezed light. This situation requires an extension of our verification procedure to sources of mixed input states. We propose such an extension, and give a detailed analysis of its application to squeezed thermal states, displaced thermal states and mixed qubit states. This is supplemented by finding the optimal entanglement-breaking channels for each of these situations, which provides us with an indication of the strength of the extension to our entanglement criterion. The subject of Chapter 6 is also the benchmarking of quantum memory or teleportation experiments. Considering a number of recently published benchmark criteria, we investigate the question which one is most useful to actual experiments. We first compare the different criteria for typical settings and sort them according to their resilience to excess noise. Then, we introduce a further improvement to the Expectation Value Matrix method, which results in the desired optimal benchmark criterion. Finally, we investigate naturally occurring phase fluctuations and find them to further simplify the implementation of our criterion. Thus, we formulate the first truly useful way of validating experiments for the quantum storage or transmission of light. (orig.)

  16. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    Full Text Available This work presents an efficient solution using a computer algebra system to perform verification of linear temporal properties for synchronous digital systems. The method is essentially based on Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking-algorithm framework based on these algebraic representations, using Groebner bases. The computational experience reported in this work shows that the algebraic approach is a quite competitive checking method and will be a useful supplement to existing verification methods based on simulation.
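The paper's canonical representations are built with Groebner bases; as a stdlib-only stand-in, the sketch below uses a simpler canonical polynomial form over GF(2), the algebraic normal form (ANF) computed with the Moebius transform. The principle it illustrates is the same one the abstract relies on: a circuit matches an assertion exactly when their canonical polynomials coincide. The example circuits are invented.

```python
# Canonical-polynomial equivalence checking over GF(2) via ANF
# (a simplified stand-in for the paper's Groebner-basis representations).
def anf(fn, n):
    """ANF coefficient vector of an n-input boolean function."""
    table = [fn(*(((i >> k) & 1) for k in range(n))) & 1 for i in range(1 << n)]
    for k in range(n):                 # in-place Moebius transform
        for i in range(1 << n):
            if i & (1 << k):
                table[i] ^= table[i ^ (1 << k)]
    return table                       # entry for each monomial (subset of inputs)

circuit   = lambda a, b, c: (a & b) ^ c   # implementation
assertion = lambda a, b, c: c ^ (b & a)   # specification, written differently
buggy     = lambda a, b, c: (a | b) ^ c   # faulty variant

assert anf(circuit, 3) == anf(assertion, 3)   # equivalent: same polynomial
assert anf(circuit, 3) != anf(buggy, 3)       # mismatch is detected
```

Because the ANF is canonical, equivalence checking reduces to comparing coefficient vectors, with no case analysis over input patterns beyond building the table.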

  17. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of the forecasts, and to assess the forecasters' added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center, a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. 
We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
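The 2x2 contingency-table verification mentioned for the CME arrival forecasts reduces to a few standard scores. The sketch below computes probability of detection, false-alarm ratio, and the Heidke skill score; the counts are invented for illustration.

```python
# Standard 2x2 contingency-table skill scores (hit = event forecast and
# observed; miss = observed but not forecast; false alarm = forecast but
# not observed). Counts below are made up.
def scores(hits, misses, false_alarms, correct_negatives):
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false-alarm ratio
    n = hits + misses + false_alarms + correct_negatives
    # Heidke skill score: accuracy relative to random chance
    expect = ((hits + misses) * (hits + false_alarms)
              + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
    hss = (hits + correct_negatives - expect) / (n - expect)
    return pod, far, hss

pod, far, hss = scores(hits=18, misses=7, false_alarms=5, correct_negatives=70)
assert 0.0 <= pod <= 1.0 and 0.0 <= far <= 1.0
assert hss > 0.0   # better than chance for these counts
```

Probabilistic products such as the geomagnetic storm forecasts need the rank-probability-type scores named in the abstract instead, since hit/miss categories do not apply directly.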

  18. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

    Full Text Available Numerical simulation of the temperature field of cooled turbine blades is a required element of the gas turbine engine design process. Verification is usually performed on the basis of test results for a full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten-metal thermostat for verification of the thermal model of a cooled blade is proposed in this paper. The method allows obtaining local values of heat flux at each point of the blade surface within a single experiment. The error in determining local heat transfer coefficients with this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals the zinc melting point. The verification of a thermal-hydraulic model of a high-pressure turbine blade with cooling that allows asymmetrical heat removal from the pressure and suction sides was carried out using the developed method. An analysis of the heat transfer coefficients confirmed the high level of heat transfer at the leading edge, whose value is comparable with jet-impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge towards the pressure side.

  19. Trends in integrated circuit design for particle physics experiments

    International Nuclear Information System (INIS)

    Atkin, E V

    2017-01-01

    Integrated circuits are one of the key complex units available to designers of multichannel detector setups. A number of factors make Application Specific Integrated Circuits (ASICs) valuable for particle physics and astrophysics experiments, the most important being integration scale, low power dissipation, and radiation tolerance. To make possible future experiments at the intensity, cosmic, and energy frontiers, today's ASICs should provide a new level of functionality under a new set of constraints and trade-offs: low-noise, high-dynamic-range amplification and pulse shaping, high-speed waveform sampling, low-power digitization, fast digital data processing, serialization, and data transmission. All integrated circuits necessary for physics instrumentation should be radiation tolerant at a previously unreached level of total ionizing dose (hundreds of Mrad) and allow minute, almost 3D assemblies. The paper is based on an analysis of the literature and presents an overview of the state of the art and trends in present-day chip design, drawing in part on the authors' own ASIC lab experience. It outlines the next stage of using micro- and nanoelectronics in physics instrumentation. (paper)

  20. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.
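Acquisition path analysis, the key element mentioned above, can be caricatured as path enumeration over a directed graph of materials and processing steps, so that verification measures can be placed to cover every route. The graph below is entirely hypothetical and far simpler than any real analysis:

```python
# Toy sketch of acquisition path analysis: nodes are materials/facilities,
# edges are possible processing steps, and every route from source material to
# a weapon-usable product is enumerated. The graph is entirely hypothetical.

def all_paths(graph, start, goal, path=None):
    """Enumerate all acyclic paths from start to goal in a directed graph."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    paths = []
    for nxt in graph.get(start, []):
        if nxt not in path:  # avoid cycles
            paths += all_paths(graph, nxt, goal, path)
    return paths

graph = {
    "ore": ["conversion"],
    "conversion": ["enrichment", "reactor"],
    "enrichment": ["weaponization"],
    "reactor": ["reprocessing"],
    "reprocessing": ["weaponization"],
}
paths = all_paths(graph, "ore", "weaponization")
for p in paths:
    print(" -> ".join(p))
```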

  1. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  2. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach.
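The dual tolerance rule proposed above (deviation within a percentage of the prescribed dose, or within an absolute dose limit) can be sketched as a simple check. The thresholds follow the abstract, but the function name and example doses are our own illustration:

```python
# Sketch of the percentage-or-absolute tolerance check described above: a point
# passes if the MUV-vs-TPS deviation is within pct_limit of the prescribed dose
# OR within abs_limit_cgy. Looser limits would apply off-axis / in low-dose
# regions. Function name and example doses are illustrative.

def within_tolerance(d_muv, d_tps, prescribed,
                     pct_limit=3.0, abs_limit_cgy=6.0):
    diff = abs(d_muv - d_tps)
    pct = 100.0 * diff / prescribed
    return pct <= pct_limit or diff <= abs_limit_cgy

print(within_tolerance(d_muv=198.0, d_tps=203.0, prescribed=200.0))  # True: 2.5%
print(within_tolerance(d_muv=20.0, d_tps=40.0, prescribed=200.0))    # False
```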

  3. Clinical Experience and Evaluation of Patient Treatment Verification With a Transit Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Ricketts, Kate, E-mail: k.ricketts@ucl.ac.uk [Division of Surgery and Interventional Sciences, University College London, London (United Kingdom); Department of Radiotherapy Physics, Royal Berkshire NHS Foundation Trust, Reading (United Kingdom); Navarro, Clara; Lane, Katherine; Blowfield, Claire; Cotten, Gary; Tomala, Dee; Lord, Christine; Jones, Joanne; Adeyemi, Abiodun [Department of Radiotherapy Physics, Royal Berkshire NHS Foundation Trust, Reading (United Kingdom)

    2016-08-01

    Purpose: To prospectively evaluate a protocol for transit dosimetry on a patient population undergoing intensity modulated radiation therapy (IMRT) and to assess the issues in clinical implementation of electronic portal imaging devices (EPIDs) for treatment verification. Methods and Materials: Fifty-eight patients were enrolled in the study. Amorphous silicon EPIDs were calibrated for dose and used to acquire images of delivered fields. Measured EPID dose maps were back-projected using the planning computed tomographic (CT) images to calculate dose at prespecified points within the patient and compared with treatment planning system dose offline using point dose difference and point γ analysis. The deviation of the results was used to inform future action levels. Results: Two hundred twenty-five transit images were analyzed, composed of breast, prostate, and head and neck IMRT fields. Patient measurements demonstrated the potential of the dose verification protocol to model dose well under complex conditions: 83.8% of all delivered beams achieved the initial set tolerance level of Δ_D of 0 ± 5 cGy or %Δ_D of 0% ± 5%. Importantly, the protocol was also sensitive to anatomic changes and detected that 3 of 20 measured prostate patients had undergone anatomic change in comparison with the planning CT. Patient data suggested an EPID-reconstructed versus treatment planning system dose difference action level of 0% ± 7% for breast fields. Asymmetric action levels were more appropriate for inverse-planned IMRT fields, using absolute dose difference (−2 ± 5 cGy) or summed field percentage dose difference (−6% ± 7%). Conclusions: The in vivo dose verification method was easy to use and simple to implement, and it could detect patient anatomic changes that impacted dose delivery. The system required no extra dose to the patient or treatment time delay and so could be used throughout the course of treatment to identify and limit...

  4. The data requirements for the verification and validation of a fuel performance code - the transuranus perspective

    International Nuclear Information System (INIS)

    Schubert, A.; Di Marcello, V.; Rondinella, V.; Van De Laar, J.; Van Uffelen, P.

    2013-01-01

    In general, the verification and validation (V and V) of a fuel performance code like TRANSURANUS consists of three basic steps: a) verifying the correctness and numerical stability of the sub-models; b) comparing the sub-models with experimental data; c) comparing the results of the integral fuel performance code with experimental data. Only the second and third steps of the V and V rely on experimental information. This scheme can be further detailed according to the physical origin of the data: on one hand, in-reactor ('in-pile') experimental data are generated in the course of the irradiation; on the other hand, ex-reactor ('out-of-pile') experimental data are obtained, for instance, from various post-irradiation examinations (PIE) or dedicated experiments with fresh samples. For both categories, we will first discuss the V and V of sub-models of TRANSURANUS related to separate aspects of the fuel behaviour: this includes the radial variation of the composition and fissile isotopes, the thermal properties of the fuel (e.g. thermal conductivity, melting temperature, etc.), the mechanical properties of fuel and cladding (e.g. elastic constants, creep properties), as well as the models for the fission product behaviour. Secondly, the integral code verification will be addressed as it treats various aspects of the fuel behaviour, including the geometrical changes in the fuel and the gas pressure and composition of the free volume in the rod. (authors)

  5. An Integrated Approach to Conversion, Verification, Validation and Integrity of AFRL Generic Engine Model and Simulation (Postprint)

    Science.gov (United States)

    2007-02-01

    [Extraction residue: model signal labels omitted.] Topics include the motivation for the modeling and simulation work, the Augmented Generic Engine Model (AGEM), and model verification and validation (V&V) assessment of AGEM.

  6. Age verification cards fail to fully prevent minors from accessing tobacco products.

    Science.gov (United States)

    Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi

    2011-03-01

    Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of-age system utilising cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between the amount smoked by high school students and the usage of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchased cigarettes, whether or not they had used age verification cards, and if so, how they obtained the card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Somebody outside the family was the top source for obtaining cards. Surprisingly, around 5% of males and females belonging to the group with the highest smoking levels had applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban on tobacco vending machines, not an age verification system, is needed to prevent sales to minors.

  7. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation, integrated into situational operations management systems, can be readily deployed and optimized for changing application scenarios. Developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species, owing to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements.

  8. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  9. Wavelet-based verification of the quantitative precipitation forecast

    Science.gov (United States)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using the two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localizations and associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by the two indices for the scale and the localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially-localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object" oriented verification methods, as the latter tend to represent strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further developments of the wavelet-based methods, especially towards a goal of identifying a weak physical process contributing to forecast error, are also pointed out.
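The localization-plus-scale property that makes wavelets attractive for QPF verification can be illustrated with a minimal 1-D Haar decomposition. This is our own sketch, not the authors' code, and real verification would use 2-D fields:

```python
# Minimal 1-D Haar transform sketch: each wavelet coefficient carries both a
# scale index (decomposition level) and a position index, which is what lets
# wavelet-based verification report where AND at what scale a forecast errs.

def haar_step(signal):
    """One Haar analysis step: pairwise averages (coarse) and differences (detail)."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def haar_decompose(signal):
    """Full decomposition: detail bands (fine -> coarse) plus the overall mean."""
    details = []
    while len(signal) > 1:
        signal, det = haar_step(signal)
        details.append(det)
    return details, signal[0]

# A localized 'precipitation' spike shows up at a specific scale and position
details, mean = haar_decompose([0, 0, 0, 0, 8, 8, 0, 0])
print(details[0])  # finest scale: [0.0, 0.0, 0.0, 0.0]
print(details[1])  # spike localized at the second position of this scale
```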

  10. Guidance and Control Software Project Data - Volume 3: Verification Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  11. Design and implementation of embedded hardware accelerator for diagnosing HDL-CODE in assertion-based verification environment

    Directory of Open Access Journals (Sweden)

    C. U. Ngene

    2013-08-01

    Full Text Available The use of assertions for monitoring the designer's intention in a hardware description language (HDL) model is gaining popularity, as it helps the designer to observe internal errors at the output ports of the device under verification. During verification, assertions are synthesised and the generated data are represented in tabular form. The amount of data generated can be enormous, depending on the size of the code and the number of modules that constitute the code. Furthermore, manually inspecting these data to diagnose the module with a functional violation is a time-consuming process which negatively affects the overall product development time. To locate the module with a functional violation within acceptable diagnostic time, the data processing and analysis procedure must be accelerated. In this paper a multi-array processor (hardware accelerator) was designed and implemented in a Virtex6 field programmable gate array (FPGA); it can be integrated into the verification environment. The design was captured in very high speed integrated circuit HDL (VHDL), synthesised with the Xilinx design suite ISE 13.1, and simulated with Xilinx ISIM. The multi-array processor (MAP) executes three logical operations (AND, OR, XOR) and a one's compaction operation on arrays of data in parallel. An improvement in processing and analysis time was recorded, compared with the manual procedure, after the multi-array processor was integrated into the verification environment. It was also found that the multi-array processor, developed as an Intellectual Property (IP) core, can also be used in applications where output responses and a golden model represented in the form of matrices are compared for searching, recognition and decision-making.
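A software sketch of the MAP's operations may help fix ideas: bitwise AND, OR, and XOR reductions across words of assertion data, plus a one's compaction, which we interpret here as counting set bits. The data layout and the compaction interpretation are our own assumptions:

```python
# Software sketch of the multi-array processor (MAP) operations described
# above: AND / OR / XOR reductions over words of assertion data, plus a one's
# compaction (interpreted here as a set-bit count of the OR result).
# Data layout and interpretation are illustrative assumptions.

def map_ops(rows):
    """Reduce a list of integer bit-vectors with AND, OR, XOR; compact the OR."""
    acc_and = acc_or = acc_xor = None
    for word in rows:
        acc_and = word if acc_and is None else acc_and & word
        acc_or = word if acc_or is None else acc_or | word
        acc_xor = word if acc_xor is None else acc_xor ^ word
    ones = bin(acc_or).count("1")  # one's compaction of the OR result
    return acc_and, acc_or, acc_xor, ones

print(map_ops([0b1100, 0b1010, 0b1001]))
```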

  12. A research on the verification of models used in the computational codes and the uncertainty reduction method for the containment integrity evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Moo Hwan; Seo, Kyoung Woo [POSTECH, Pohang (Korea, Republic of)

    2001-03-15

    In the probability approach, the calculated CCFPs of all the scenarios were zero, meaning that for every accident scenario the maximum pressure load induced by DCH was expected to be lower than the containment failure pressure obtained from the fragility curve. Thus, it can be stated that the KSNP containment is robust to the DCH threat. The uncertainty of the computer codes used in the two (deterministic and probabilistic) approaches was reduced by sensitivity tests and by research involving the verification and comparison of the DCH models in each code. This research thus evaluated the overall results of the DCH issue and set out an accurate methodology for assessing the containment integrity of operating PWRs in Korea.

  13. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  14. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  15. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) to evaluate waste management scenarios with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA shows waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (i.e., transportation, small sites, new streams, etc.). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures. The RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline on system compliance to requirements and also an opportunity to reevaluate what requirements need to be satisfied in FY-98

  16. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  17. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  18. Further optimisations of constant Q cepstral processing for integrated utterance and text-dependent speaker verification

    DEFF Research Database (Denmark)

    Delgado, Hector; Todisco, Massimiliano; Sahidullah, Md

    2016-01-01

    Many authentication applications involving automatic speaker verification (ASV) demand robust performance using short-duration, fixed or prompted text utterances. Text constraints not only reduce the phone-mismatch between enrollment and test utterances, which generally leads to improved performa...

  19. Integrating conceptualizations of experience into the interaction design process

    DEFF Research Database (Denmark)

    Dalsgaard, Peter

    2010-01-01

    From a design perspective, the increasing awareness of experiential aspects of interactive systems prompts the question of how conceptualizations of experience can inform and potentially be integrated into the interaction design process. This paper presents one approach to integrating theoretical...

  20. Experiences of technology integration in home care nursing.

    Science.gov (United States)

    Johnson, K A; Valdez, R S; Casper, G R; Kossman, S P; Carayon, P; Or, C K L; Burke, L J; Brennan, P F

    2008-11-06

    The infusion of health care technologies into the home leads to substantial changes in the nature of work for home care nurses and their patients. Nurses and nursing practice must change to capitalize on these innovations. As part of a randomized field experiment evaluating web-based support for home care of patients with chronic heart disease, we engaged nine nurses in a dialogue about their experience integrating this modification of care delivery into their practice. They shared their perceptions of the work they needed to do and their perceptions and expectations for patients and themselves in using technologies to promote and manage self-care. We document three overarching themes that identify preexisting factors that influenced integration or represent the consequences of technology integration into home care: doing tasks differently, making accommodations in the home for devices and computers, and being mindful of existing expectations and skills of both nurses and patients.

  1. Selection and verification of safety parameters in safety parameter display system for nuclear power plants

    International Nuclear Information System (INIS)

    Zhang Yuangfang

    1992-02-01

    The method and results for safety parameter selection and its verification in the safety parameter display system of nuclear power plants are introduced. Based on safety analysis, overall safety is divided into six critical safety functions, and a set of safety parameters that represent the degree of integrity of each function and the causes of change is strictly selected. The verification of the safety parameter selection is carried out from the viewpoint of applying the plant emergency procedures and in accident manoeuvres on a full-scale nuclear power plant simulator.

  2. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and influence factors of verification quality are established. Software optimality verification is analyzed, and metrics are defined for the verification process.

  3. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  4. Multi-Disciplinary Research Experiences Integrated with Industry –Field Experiences

    Directory of Open Access Journals (Sweden)

    Suzanne Lunsford

    2015-10-01

    Full Text Available The purpose of this environmental, inquiry-based lab was to allow students to engage with real-world concepts that integrate an industry setting (the Ohio Aggregate Industrial Mineral Association) with the academic setting. Our students took part in a field trip to a mining site to begin problem-based learning on how heavy metals leak during the mining process. Heavy metals such as lead and indium in the groundwater are a serious environmental concern (Environmental Protection Agency). The field experience at the mine builds the students' interest in developing sensors that detect heavy metals of concern, such as lead and indium, simultaneously by a unique electrochemical technique called Square Wave Anodic Stripping Voltammetry (SWASV), and in deciding what qualities the electrochemical sensor must possess to be successful for real-world usage. During the field trip the students also learned about novel instrumentation such as an SEM (Scanning Electron Microscope), used to better understand the surface morphology of the developed working-electrode sensor. The integration of an industry setting with academia has been a positive experience for our students, helping them understand the research qualities needed to succeed in an industrial setting.

  5. A feasible method for clinical delivery verification and dose reconstruction in tomotherapy

    International Nuclear Information System (INIS)

    Kapatoes, J.M.; Olivera, G.H.; Ruchala, K.J.; Smilowitz, J.B.; Reckwerdt, P.J.; Mackie, T.R.

    2001-01-01

    Delivery verification is the process in which the energy fluence delivered during a treatment is verified. This verified energy fluence can be used in conjunction with an image in the treatment position to reconstruct the full three-dimensional dose deposited. A method for delivery verification that utilizes a measured database of detector signal is described in this work. This database is a function of two parameters, radiological path-length and detector-to-phantom distance, both of which are computed from a CT image taken at the time of delivery. Such a database was generated and used to perform delivery verification and dose reconstruction. Two experiments were conducted: a simulated prostate delivery on an inhomogeneous abdominal phantom, and a nasopharyngeal delivery on a dog cadaver. For both cases, it was found that the verified fluence and dose results using the database approach agreed very well with those using previously developed and proven techniques. Delivery verification with a measured database and CT image at the time of treatment is an accurate procedure for tomotherapy. The database eliminates the need for any patient-specific, pre- or post-treatment measurements. Moreover, such an approach creates an opportunity for accurate, real-time delivery verification and dose reconstruction given fast image reconstruction and dose computation tools.
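    The database lookup at the heart of this method can be sketched as a two-dimensional interpolation over the two parameters named above. The grid values, spacings, and the helper name `lookup_signal` below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical measured database of detector signal, tabulated against
# radiological path-length (rows, cm) and detector-to-phantom distance
# (columns, cm). All numbers here are invented for illustration.
path_grid = np.array([0.0, 10.0, 20.0, 30.0])
dist_grid = np.array([5.0, 15.0, 25.0])
signal_db = np.array([
    [1.00, 0.95, 0.90],
    [0.70, 0.66, 0.62],
    [0.49, 0.46, 0.43],
    [0.34, 0.32, 0.30],
])

def lookup_signal(path_len, dist):
    """Bilinear interpolation of the expected detector signal."""
    i = int(np.clip(np.searchsorted(path_grid, path_len) - 1, 0, len(path_grid) - 2))
    j = int(np.clip(np.searchsorted(dist_grid, dist) - 1, 0, len(dist_grid) - 2))
    tp = (path_len - path_grid[i]) / (path_grid[i + 1] - path_grid[i])
    td = (dist - dist_grid[j]) / (dist_grid[j + 1] - dist_grid[j])
    return ((1 - tp) * (1 - td) * signal_db[i, j]
            + tp * (1 - td) * signal_db[i + 1, j]
            + (1 - tp) * td * signal_db[i, j + 1]
            + tp * td * signal_db[i + 1, j + 1])

# Both coordinates are computed from the CT image taken at delivery time;
# the verified fluence compares the measured signal with this expectation.
expected = lookup_signal(15.0, 10.0)
```

    A real implementation would interpolate over a much denser measured grid, but the lookup structure is the same.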

  6. On the Verification of a WiMax Design Using Symbolic Simulation

    Directory of Open Access Journals (Sweden)

    Gabriela Nicolescu

    2013-07-01

    Full Text Available In top-down multi-level design methodologies, design descriptions at higher levels of abstraction are incrementally refined to the final realizations. Simulation-based techniques have traditionally been used to verify that such model refinements do not change the design functionality. Unfortunately, with computer simulations it is not possible to completely check that a design transformation is correct in a reasonable amount of time, as the number of test patterns required to do so increases exponentially with the number of system state variables. In this paper, we propose a methodology for the verification of conformance of models generated at higher levels of abstraction in the design process to the design specifications. We model the system behavior using sequences of recurrence equations. We then use symbolic simulation together with equivalence checking and property checking techniques for design verification. Using our proposed method, we have verified the equivalence of three WiMax system models at different levels of design abstraction, and the correctness of various system properties on those models. Our symbolic modeling and verification experiments show that the proposed verification methodology provides a performance advantage over its numerical counterpart.
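    The equivalence-checking step can be illustrated with a much-simplified stand-in for symbolic simulation: two refinements of a recurrence step are compared exactly at many random rational points, which for polynomial expressions catches any inequivalence with high probability. The function names and the example equations are invented for illustration:

```python
import random
from fractions import Fraction

def spec_step(x, a, b):
    """High-level specification of one recurrence step (invented example)."""
    return a * (x + b)

def impl_step(x, a, b):
    """Refined implementation after a distributive transformation."""
    return a * x + a * b

def equivalent(f, g, trials=200, seed=1):
    """Exact agreement at many random rational points.

    This is a sampling sketch; true symbolic simulation, as in the paper,
    proves equivalence for all inputs rather than sampled ones.
    """
    rng = random.Random(seed)
    for _ in range(trials):
        args = [Fraction(rng.randint(-50, 50), rng.randint(1, 50))
                for _ in range(3)]
        if f(*args) != g(*args):
            return False
    return True
```

    Exact rational arithmetic avoids the false mismatches that floating-point rounding would introduce.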

  7. A Feature Subtraction Method for Image Based Kinship Verification under Uncontrolled Environments

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    The most fundamental problem of local feature based kinship verification methods is that a local feature can capture the variations of environmental conditions and the differences between two persons having a kin relation, which can significantly decrease the performance. To address this problem...... the feature distance between face image pairs with kinship and maximize the distance between non-kinship pairs. Based on the subtracted feature, the verification is realized through a simple Gaussian based distance comparison method. Experiments on two public databases show that the feature subtraction method...

  8. Technology choices for the Integrated Beam Experiment (IBX)

    Energy Technology Data Exchange (ETDEWEB)

    Leitner, M.A.; Celata, C.M.; Lee, E.P.; Sabbi, G.; Waldron, W.L.; Barnard, J.J.

    2002-10-31

    Over the next three years the research program of the Heavy Ion Fusion Virtual National Laboratory (HIF-VNL), a collaboration among LBNL, LLNL, and PPPL, is focused on separate scientific experiments in the injection, transport and focusing of intense heavy ion beams at currents from 100 mA to 1 A. As a next major step in the HIF-VNL program, we aim for a complete ''source-to-target'' experiment, the Integrated Beam Experiment (IBX). By combining the experience gained in the current separate beam experiments IBX would allow the integrated scientific study of the evolution of a single heavy ion beam at high current ({approx}1 A) through all sections of a possible heavy ion fusion accelerator: the injection, acceleration, compression, and beam focusing. This paper describes the main parameters and technology choices of the planned IBX experiment. IBX will accelerate singly charged potassium or argon ion beams up to 10 MeV final energy and a longitudinal beam compression ratio of 10, resulting in a beam current at target of more than 10 Amperes. Different accelerator cell design options are described in detail: induction cores incorporating either room-temperature pulsed focusing magnets or superconducting magnets.

  9. An Unattended Verification Station for UF6 Cylinders: Development Status

    International Nuclear Information System (INIS)

    Smith, E.; McDonald, B.; Miller, K.; Garner, J.; March-Leuba, J.; Poland, R.

    2015-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by advanced centrifuge technologies and the growth in separative work unit capacity at modern centrifuge enrichment plants. These measures would include permanently installed, unattended instruments capable of performing the routine and repetitive measurements previously performed by inspectors. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide independent verification of the declared relative enrichment, U-235 mass and total uranium mass of all declared cylinders moving through the plant, as well as the application and verification of a ''Non-destructive Assay Fingerprint'' to preserve verification knowledge on the contents of each cylinder throughout its life in the facility. As the IAEA's vision for a UCVS has evolved, Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory have been developing and testing candidate non-destructive assay (NDA) methods for inclusion in a UCVS. Modeling and multiple field campaigns have indicated that these methods are capable of assaying relative cylinder enrichment with a precision comparable to or substantially better than today's high-resolution handheld devices, without the need for manual wall-thickness corrections. In addition, the methods interrogate the full volume of the cylinder, thereby offering the IAEA a new capability to assay the absolute U-235 mass in the cylinder, and much-improved sensitivity to substituted or removed material. Building on this prior work, and under the auspices of the United States Support Programme to the IAEA, a UCVS field prototype is being developed and tested. This paper provides an overview of: a) hardware and software design of the prototypes, b) preparation

  10. Research activities on radioecology for the past ten years: Experiments and modeling at KAERI

    International Nuclear Information System (INIS)

    Lee, H.; Choi, H.J.; Yu, D.H.; Kang, H.S.; Lim, K.M.; Choi, Y.H.; Lee, C.W.

    2003-01-01

    Experiments in a greenhouse have been conducted to evaluate the effects of radionuclides on various plants. Transfer factors, translocation factors, and other parameters have been measured particularly for major foodstuffs, such as rice and vegetables. A computer code was established to assess the environment in case of acute radionuclide release by accident. Verification and sensitivity analysis have been carried out for the integrity of this code. (author)

  11. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans......Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism...

  12. Further development and verification of the calculating programme Felix for the simulation of criticality excursions

    International Nuclear Information System (INIS)

    Weber, J.; Denk, W.

    1985-01-01

    An improved version of the FELIX programme was used to verify the code against excursion experiments 01, 03 through 07, and 13. Agreement between the experiments and the calculations was good. Programme points to be further developed are identified. (orig.) [de

  13. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    Science.gov (United States)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The problem considered concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller, where the behaviour of the actuators is verified against the logical relationships assumed in the control system. The actuators of the considered control system are three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of the electro-pneumatic control system was implemented in the Automation Studio software of B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the model, the two programs were integrated using a data-exchange tool in the form of an OPC server.

  14. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  15. An evaluation of the management system verification pilot at Hanford

    International Nuclear Information System (INIS)

    Briggs, C.R.; Ramonas, L.; Westendorf, W.

    1998-01-01

    The Chemical Management System (CMS), currently under development at Hanford, was used as the ''test program'' for pilot testing the value added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial sector ''best practice'' for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved the conduct of two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS including the Department of Energy Richland Operations (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCs). A semi-structured interview process was employed by the team of three ''verifiers'' who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An ''MSV Pilot Effectiveness Survey'' also was completed by each panel participant immediately following the interview.

  16. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  17. Two high accuracy digital integrators for Rogowski current transducers

    Science.gov (United States)

    Luo, Pan-dian; Li, Hong-bin; Li, Zhen-hua

    2014-01-01

    Rogowski current transducers have been widely used in AC current measurement, but their accuracy is mainly limited by the analog integrators, which suffer from poor long-term stability and susceptibility to environmental conditions. Digital integrators are an alternative, but a conventional one cannot produce a stable, accurate output because any DC component in the original signal accumulates, leading to output DC drift; unknown initial conditions likewise produce a DC offset in the integral output. This paper proposes two improved digital integrators for Rogowski current transducers, replacing traditional analog integrators for high measuring accuracy. A proportional-integral-derivative (PID) feedback controller and an attenuation coefficient are applied to the Al-Alaoui integrator to change its DC response and obtain an ideal frequency response. Owing to this dedicated digital signal processing design, the improved digital integrators have better performance than analog integrators. Simulation models are built for verification and comparison. The experiments show that the designed integrators achieve higher accuracy than analog integrators in steady-state response, transient-state response, and under changing temperature.
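    The idea of taming DC drift with an attenuation coefficient can be sketched as a leaky trapezoidal integrator: a coefficient slightly below 1 bleeds off accumulated DC so a constant offset saturates instead of drifting without bound. The coefficient value and signal parameters below are illustrative, not the paper's design (which additionally uses a PID feedback controller on the Al-Alaoui integrator):

```python
import math

def attenuated_integrator(samples, dt, attenuation=0.999):
    """Trapezoidal integrator with an attenuation (leakage) coefficient.

    The attenuation term turns the pure accumulator into a leaky one,
    bounding the response to any DC component of the input.
    """
    out, y = [], 0.0
    prev = samples[0]
    for s in samples[1:]:
        y = attenuation * y + dt * (s + prev) / 2.0  # leaky trapezoidal step
        out.append(y)
        prev = s
    return out

# Integrating a 50 Hz cosine: the output approximates sin(2*pi*50*t)/(2*pi*50),
# with a small amplitude/phase error introduced by the leakage.
f, dt = 50.0, 1e-5
sig = [math.cos(2 * math.pi * f * i * dt) for i in range(4000)]
y = attenuated_integrator(sig, dt)
```

    The leakage trades a small low-frequency gain error for drift-free operation; the paper's PID correction is aimed at recovering the ideal frequency response that this simple sketch gives up.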

  18. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
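    The accountancy statistics discussed here are built from material-balance quantities; a minimal sketch of the familiar material-unaccounted-for (MUF) statistic, with invented figures:

```python
def muf(begin_inventory, receipts, shipments, end_inventory):
    """Material unaccounted for: book inventory minus measured ending inventory."""
    return begin_inventory + receipts - shipments - end_inventory

# Invented figures: 0.5 units unaccounted for across the balance period.
balance = muf(100.0, 40.0, 30.0, 109.5)
```

    The comprehensive statistic sets described in the paper combine such balance terms, with and without inspector-verified flow data, so that any diversion perturbs at least one statistic.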

  19. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (''AI'') concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce human error in a complex, high-stress situation.

  20. Rule Systems for Runtime Verification: A Short Tutorial

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on-line and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
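    A toy trace monitor can convey the flavor of rule-based log analysis in the spirit of RuleR/LogScope. The event names and the single rule below (every COMMAND must be acknowledged by a SUCCESS before the next COMMAND) are invented for illustration, not taken from either tool:

```python
def monitor(trace):
    """Check one temporal rule over an event trace and return violations.

    Rule (invented for this sketch): after a COMMAND event, a matching
    SUCCESS must appear before the next COMMAND is issued.
    """
    pending = None          # command still awaiting its SUCCESS
    violations = []
    for kind, name in trace:
        if kind == "COMMAND":
            if pending is not None:
                violations.append(f"{pending} not acknowledged before {name}")
            pending = name
        elif kind == "SUCCESS" and name == pending:
            pending = None
    if pending is not None:
        violations.append(f"{pending} never acknowledged")
    return violations

trace = [("COMMAND", "open"), ("SUCCESS", "open"),
         ("COMMAND", "move"), ("COMMAND", "close"), ("SUCCESS", "close")]
```

    Real rule systems generalize this single hard-coded check into a language of data-parameterized rules compiled into one monitoring automaton.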

  1. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  2. Striving to be known by significant others: automatic activation of self-verification goals in relationship contexts.

    Science.gov (United States)

    Kraus, Michael W; Chen, Serena

    2009-07-01

    Extending research on the automatic activation of goals associated with significant others, the authors hypothesized that self-verification goals typically pursued with significant others are automatically elicited when a significant-other representation is activated. Supporting this hypothesis, the activation of a significant-other representation through priming (Experiments 1 and 3) or through a transference encounter (Experiment 2) led participants to seek feedback that verifies their preexisting self-views. Specifically, significant-other primed participants desired self-verifying feedback, in general (Experiment 1), from an upcoming interaction partner (Experiment 2), and relative to acquaintance-primed participants and favorable feedback (Experiment 3). Finally, self-verification goals were activated, especially for relational self-views deemed high in importance to participants' self-concepts (Experiment 2) and held with high certainty (Experiment 3). Implications for research on self-evaluative goals, the relational self, and the automatic goal activation literature are discussed, as are consequences for close relationships. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  3. R6 assessment of IPIRG-2 programme experiments

    International Nuclear Information System (INIS)

    Sharples, J.K.; France, C.C.; Budden, P.J.

    1999-01-01

    The International Piping Integrity Research Group (IPIRG) Programme was an international group programme managed by the US Nuclear Regulatory Commission (US NRC) and was aimed at developing a better understanding of the fracture behaviour of pressurised nuclear plant piping. The second-stage IPIRG experiments (IPIRG-2) included the development of data for the verification of fracture analyses for cracked pipes and fittings subjected to dynamic and/or cyclic load histories. This paper describes the results of work undertaken on analysing selected IPIRG-2 experiments using the UK R6 fracture assessment methodology. The level of conservatism of the R6 methodology is presented by comparing predicted applied bending moments at initiation and instability with the experimentally determined values. (author)

  4. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration dat...... standardized railway control systems ERTMS/ETCS Level 2. Experiments showed that the method can be used for specification, verification and validation of systems of industrial size....

  5. ORNL fusion reactor shielding integral experiments

    International Nuclear Information System (INIS)

    Santoro, R.T.; Alsmiller, R.G. Jr.; Barnes, J.M.; Chapman, G.T.

    1980-01-01

    Integral experiments that measure the neutron and gamma-ray energy spectra resulting from the attenuation of approx. 14 MeV T(D,n)4He reaction neutrons in laminated slabs of stainless steel type 304, borated polyethylene, and a tungsten alloy (Hevimet) and from neutrons streaming through a 30-cm-diameter iron duct (L/D = 3) imbedded in a concrete shield have been performed. The facility, the NE-213 liquid scintillator detector system, and the experimental techniques used to obtain the measured data are described. The two-dimensional discrete ordinates radiation transport codes, calculational models, and nuclear data used in the analysis of the experiments are reviewed

  6. Investigation of experimental methods for EMP transient-state disturbance effects on integrated circuits (ICs)

    International Nuclear Information System (INIS)

    Li Xiaowei

    2004-01-01

    A study of the transient-state disturbance characteristics of integrated circuits (ICs) must start from the coupling path. Through cable (antenna) coupling, an EMP is converted into a pulsed current and voltage that impacts the I/O ports of the integrated circuit via the cable. Considering the construction features of armament systems, the EMP effect on integrated circuits inside such systems is analysed. The current-injection method for IC EMP effect experiments is investigated and several experimental methods are given. (authors)

  7. Interpolant Tree Automata and their Application in Horn Clause Verification

    Directory of Open Access Journals (Sweden)

    Bishoksan Kafle

    2016-07-01

    Full Text Available This paper investigates the combination of abstract interpretation over the domain of convex polyhedra with interpolant tree automata, in an abstraction-refinement scheme for Horn clause verification. These techniques have been previously applied separately, but are combined in a new way in this paper. The role of an interpolant tree automaton is to provide a generalisation of a spurious counterexample during refinement, capturing a possibly infinite set of spurious counterexample traces. In our approach these traces are then eliminated using a transformation of the Horn clauses. We compare this approach with two other methods; one of them uses interpolant tree automata in an algorithm for trace abstraction and refinement, while the other uses abstract interpretation over the domain of convex polyhedra without the generalisation step. Evaluation of the results of experiments on a number of Horn clause verification problems indicates that the combination of interpolant tree automata with abstract interpretation gives some increase in the power of the verification tool, while sometimes incurring a performance overhead.
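    The abstract-interpretation side of this approach can be illustrated with intervals, a much simpler domain than the convex polyhedra used in the paper. The analysed loop, the helper names, and the widening-free fixed-point iteration are all invented for this sketch:

```python
def join(a, b):
    """Least upper bound of two intervals (lo, hi)."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def analyse(limit):
    """Abstractly execute the invented program: x = 0; while x < limit: x = x + 1.

    Iterates the interval transfer function of the loop body to a fixed
    point, yielding the invariant 0 <= x <= limit on exit.
    """
    x = (0, 0)                       # abstract value of x after initialisation
    while True:
        # One loop iteration under the guard x < limit
        guarded = (x[0], min(x[1], limit - 1))
        body = (guarded[0] + 1, guarded[1] + 1)
        new = join(x, body)
        if new == x:                 # fixed point reached
            break
        x = new
    return x
```

    Polyhedral domains generalize this by tracking linear relations among several variables, at correspondingly higher cost; a widening operator would be needed to guarantee termination on unbounded loops.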

  8. Reverse Engineering Integrated Circuits Using Finite State Machine Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oler, Kiri J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miller, Carl H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-12

    In this paper, we present a methodology for reverse engineering integrated circuits, including a mathematical verification of a scalable algorithm used to generate minimal finite state machine representations of integrated circuits.
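    A minimal finite state machine representation is classically obtained by partition refinement (Moore's algorithm). The sketch below, with an invented example machine rather than a real recovered netlist, shows the core idea: repeatedly split states until states in the same class agree on output and on the classes of their successors:

```python
def minimize(states, alphabet, delta, accepting):
    """Return the number of states in the minimal equivalent machine.

    delta maps state -> {symbol -> next state}; partition refinement merges
    behaviourally indistinguishable states (Moore's algorithm).
    """
    # Initial partition: accepting vs non-accepting states
    part = {s: (s in accepting) for s in states}
    while True:
        # Signature: own class plus the classes of all successors
        sig = {s: (part[s], tuple(part[delta[s][a]] for a in alphabet))
               for s in states}
        classes = {v: i for i, v in enumerate(sorted(set(sig.values())))}
        new_part = {s: classes[sig[s]] for s in states}
        if len(set(new_part.values())) == len(set(part.values())):
            return len(set(new_part.values()))
        part = new_part

# Two pairs of behaviourally identical states collapse to a 2-state machine.
two_cycles = {0: {'a': 1}, 1: {'a': 0}, 2: {'a': 3}, 3: {'a': 2}}
minimal_size = minimize([0, 1, 2, 3], ['a'], two_cycles, {0, 2})
```

    The paper's contribution is a scalable variant with a mathematical verification; this sketch only shows the textbook algorithm the minimal representations are based on.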

  9. Planning for an Integrated Research Experiment

    International Nuclear Information System (INIS)

    Barnard, J.J.; Ahle, L.E.; Bangerter, R.O.; Bieniosek, F.M.; Celata, C.M.; Faltens, A.; Friedman, A.; Grote, D.P.; Haber, I.; Henestroza, E.; Kishek, R.A.; Hoon, M.J.L. de; Karpenko, V.P.; Kwan, J.W.; Lee, E.P.; Logan, B.G.; Lund, S.M.; Meier, W.R.; Molvik, A.W.; Sangster, T.C.; Seidl, P.A.; Sharp, W.M.

    2000-01-01

    The authors describe the goals and research program leading to the Heavy Ion Integrated Research Experiment (IRE). They review the basic constraints which lead to a design and give examples of parameters and capabilities of an IRE. They also show design tradeoffs generated by the systems code IBEAM. A multi-pronged Phase 1 research effort is laying the groundwork for the Integrated Research Experiment. Experiment, technology development, theory, simulation, and systems studies are all playing major roles in this Phase I research. The key research areas are: (1) Source and injector (for investigation of a high brightness, multiple beam, low cost injector); (2) High current transport (to examine effects at full driver-scale line charge density, including the maximization of the beam filling-factor and control of electrons); (3) Enabling technology development (low cost and high performance magnetic core material, superconducting magnetic quadrupole arrays, insulators, and pulsers); (4) Beam simulations and theory (for investigations of beam matching, specification of accelerator errors, studies of emittance growth, halo, and bunch compression in the accelerator, and neutralization methods, stripping effects, and spot size minimization in the chamber); and (5) Systems optimization (minimization of cost and maximization of pulse energy and beam intensity). They have begun the process of designing, simulating, and optimizing the next major heavy-ion induction accelerator, the IRE. This accelerator facility will, in turn, help provide the basis to proceed to the next step in the development of IFE as an attractive source of fusion energy.

  10. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" implies a user matching a fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  11. Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)

    Science.gov (United States)

    2016-03-01

    up game Binary Fission, which was deployed during Phase Two of CHEKOFV. Xylem: The Code of Plants is a casual game for players using mobile ...there are the design and engineering challenges of building a game infrastructure that integrates verification technology with crowd participation...the backend processes that annotate the originating software. Allowing players to construct their own equations opened up the flexibility to receive

  12. Viability Study for an Unattended UF_6 Cylinder Verification Station: Phase I Final Report

    International Nuclear Information System (INIS)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.; Branney, Sean; McDonald, Benjamin S.; Webster, Jennifer B.; Zalavadia, Mital A.; Todd, Lindsay C.; Kulisek, Jonathan A.; Nordquist, Heather; Deshmukh, Nikhil S.; Stewart, Scott

    2016-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF_6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 ''typical'' Type 30B cylinders, and the viability of an ''NDA Fingerprint'' concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument

  13. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)
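
    The drift-flux treatment of the boiling sections typically rests on the Zuber-Findlay relation. A minimal sketch follows; the distribution parameter C0 and drift velocity Vgj defaults are illustrative, since the abstract does not give BOILER-2's actual correlations:

```python
def drift_flux_void_fraction(j_g, j_l, C0=1.13, Vgj=0.2):
    """Zuber-Findlay drift-flux relation:
        alpha = j_g / (C0 * (j_g + j_l) + Vgj)
    where j_g, j_l are gas and liquid superficial velocities (m/s).
    C0 and Vgj values here are illustrative placeholders."""
    return j_g / (C0 * (j_g + j_l) + Vgj)
```

    With C0 = 1 and Vgj = 0 the relation reduces to the homogeneous-flow void fraction, a useful sanity check on any implementation.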

  14. Software for the Integration of Multiomics Experiments in Bioconductor.

    Science.gov (United States)

    Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi

    2017-11-01

    Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 American Association for Cancer Research.

  15. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    Science.gov (United States)

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  16. BDI: the Cadarache data bank for LMBFR integral experiment data

    International Nuclear Information System (INIS)

    Rimpault, G.; Reynaud, G.

    1986-09-01

    The Integral Data Bank is part of the procedure to create the so-called neutronic formulaire with which every design calculation is performed with associated uncertainty. A modern way to store the integral data has been set up in order to handle and recalculate easily with a standard procedure (fig. 1) each experimental programme. A direct access way to read the data allows an automatic way to obtain the calculation/experiment discrepancies associated with a particular data base. The BDI has proved to be fully operational and has been used with the new nuclear data file JEF1. In the present version of the BDI more than 140 experiments (critical mass, spectrum indexes, buckling, etc.) both from MASURCA and SNEAK critical experiments are documented and stored in an easy-to-retrieve form. Also included are irradiation experiments in PHENIX and the STEK fission product related experiments. Future plans of development concern reactivity measurements in critical assemblies, irradiation experiments and start-up experiments of SUPER PHENIX and ZEBRA critical experiments
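
    The calculation/experiment (C/E) discrepancies such a data bank reports automatically can be sketched as a simple ratio table; the record names and data layout below are illustrative, not BDI's actual format:

```python
def ce_table(records):
    """Compute C/E - 1 in percent for each integral experiment.

    records maps an experiment name to a (calculated, measured) pair;
    the output is the per-experiment discrepancy a formulaire data bank
    would associate with a given nuclear data file (illustrative sketch)."""
    return {name: 100.0 * (c / e - 1.0) for name, (c, e) in records.items()}
```

    For example, a calculated critical mass 0.2% above experiment yields an entry of +0.2, making systematic biases of a data library visible at a glance.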

  17. Methods of Verification, Accountability and Control of Special Nuclear Material

    International Nuclear Information System (INIS)

    Stewart, J.E.

    1999-01-01

    This session demonstrates nondestructive assay (NDA) measurement, surveillance and analysis technology required to protect, control and account (MPC and A) for special nuclear materials (SNM) in sealed containers. These measurements, observations and analyses comprise state-of-the-art, strengthened SNM safeguards systems. Staff member specialists, actively involved in research, development, training and implementation worldwide, will present six NDA verification systems and two software tools for integration and analysis of facility MPC and A data

  18. Vehicle Integrated Performance Analysis, the VIPA Experience: Reconnecting with Technical Integration

    Science.gov (United States)

    McGhee, David S.

    2005-01-01

    Today's NASA is facing significant challenges and changes. The Exploration initiative indicates a large increase in projects with limited increase in budget. The Columbia report has criticized NASA for its lack of insight and technical integration, impacting its ability to provide safety. The Aldridge report is advocating NASA find new ways of doing business. Very early in the Space Launch Initiative (SLI) program, a small team of engineers at MSFC were asked to propose a process for performing a system-level assessment of a launch vehicle. The request was aimed primarily at providing insight and making NASA a "smart buyer." Out of this effort the VIPA team was created. The difference between the VIPA effort and many integration attempts is that VIPA focuses on using experienced people from various disciplines and a process which focuses them on a technically integrated assessment. Most previous attempts have focused on developing an all-encompassing software tool. In addition, VIPA anchored its process formulation in the experience of its members and in early developmental Space Shuttle experience. The primary reference for this is NASA-TP-2001-210092, "Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned," and discussions with its authors. The foundations of VIPA's process are described. The VIPA team also recognized the need to drive detailed analysis earlier in the design process. Analyses and techniques typically done in later design phases are brought forward using improved computing technology. The intent is to allow the identification of significant sensitivities, trades, and design issues much earlier in the program. This process is driven by the T-model for Technical Integration described in the aforementioned reference. VIPA's approach to performing system-level technical integration is discussed in detail. Proposed definitions are offered to clarify this discussion and the general systems integration dialog. VIPA

  19. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. The sequence in which the several phases of this test procedure shall take place is shown in Figure 1, but the sequence can be in any order.

  20. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan); Hideaki, Koike [Advance Soft Corporation (Japan)

    2003-07-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, is defined by supporting the entire range of problem solving activity from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is based on a new architecture called TASK FLOW. It integrates the computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed and the validation and verification of an integrated platform will be scheduled by using the prototype in 2003. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  1. Development of integrated platform for computational material design

    International Nuclear Information System (INIS)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato; Hideaki, Koike

    2003-01-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, is defined by supporting the entire range of problem solving activity from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is based on a new architecture called TASK FLOW. It integrates the computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed and the validation and verification of an integrated platform will be scheduled by using the prototype in 2003. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned

  2. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis associated fingerprint changes is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. Predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification in patients with hand dermatitis. © 2014 The International Society of Dermatology.
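
    The prediction rule described above can be sketched as a small decision function. The 25% dystrophy-area threshold and the two minor criteria come from the abstract; the function name and return labels are illustrative:

```python
def verification_risk(dystrophy_area_pct, long_horizontal_lines, long_vertical_lines):
    """Classify fingerprint-verification failure risk following the rule
    structure described in the abstract (labels are illustrative).

    Major criterion: dystrophy area >= 25% of the fingerprint.
    Minor criteria: presence of long horizontal / long vertical lines."""
    if dystrophy_area_pct >= 25:  # major criterion met
        return "almost always fails"
    minors = int(long_horizontal_lines) + int(long_vertical_lines)
    if minors == 2:
        return "high risk of failure"
    if minors == 1:
        return "low risk of failure"
    return "almost always passes"   # no criteria met
```

    The appeal of such a model clinically is that it needs only three bedside observations, no scanner required, to anticipate whether a patient's thumbprint will verify.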

  3. Certificateless Public Auditing Protocol with Constant Verification Time

    Directory of Open Access Journals (Sweden)

    Dongmin Kim

    2017-01-01

    To provide the integrity of outsourced data in the cloud storage services, many public auditing schemes which allow a user to check the integrity of the outsourced data have been proposed. Since most of the schemes are constructed on Public Key Infrastructure (PKI), they suffer from several concerns like management of certificates. To resolve the problems, certificateless public auditing schemes have also been studied in recent years. In this paper, we propose a certificateless public auditing scheme which has a constant-time verification algorithm. Therefore, our scheme is more efficient than previous certificateless public auditing schemes. To prove the security of our certificateless public auditing scheme, we first define three formal security models and prove the security of our scheme under the three security models.

  4. Developing a revenue integrity improvement plan.

    Science.gov (United States)

    Banks, Kate

    2010-11-01

    A revenue integrity plan should address five key areas: Accuracy of patient information. Verification of payer information and policies. Accuracy of documentation. Processing of claims. Accuracy of payment.

  5. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

    This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway Station Alingsås by using the decision procedure which...... express thoughts on what is needed in order to be able to successfully verify large real-life systems....

  6. Integrated verification and testing system (IVTS) for HAL/S programs

    Science.gov (United States)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.
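
    As a flavor of the static-analysis component, a pass over an intermediate program form might flag variables read before any assignment. The three-address IR below is invented for illustration; HALMAT's actual format is not shown in the record:

```python
def uses_before_set(ir):
    """Toy static-analysis pass over a made-up three-address IR.

    Each instruction is a tuple (op, dest, *sources); string sources are
    variable reads, anything else is a literal. Returns variables that are
    read before any 'set' assigns them (straight-line code assumed)."""
    assigned, flagged = set(), []
    for op, dest, *sources in ir:
        for src in sources:
            if isinstance(src, str) and src not in assigned:
                flagged.append(src)        # read before definition
        if op == "set":
            assigned.add(dest)             # dest is defined from here on
    return flagged
```

    Running tools like this on the compiler's internal form, rather than on source text, is exactly why IVTS splits the HAL/S compiler after the parser phase: the analysis sees resolved names instead of raw tokens.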

  7. Post-test simulation and analysis of the second full scale CHAN 28-element experiment (validations of CHAN-II (MOD 6) against experiments)

    Energy Technology Data Exchange (ETDEWEB)

    Bayoumi, M H; Muir, W C [Ontario Hydro, Toronto, ON (Canada)

    1996-12-31

    An experimental program, the CHAN Thermal Chemical Experimental Program, has been set up at WNRE under COG/CANDEV to assess and verify the physical and mathematical models of the CHAN codes. The program has been progressing from studying separate effects in single-element experiments to a full integrated mode in a CANDU 28-element bundle geometry. The CHAN-II series codes are used in the licensing analysis of CANDU reactors. The basic code provides an efficient tool to predict the thermal response of a fuel channel during postulated loss-of-coolant accidents (LOCA) with and without a loss of emergency coolant injection (LOECI) in which the transport of heat by convection is greatly reduced. The code models the progression of the event including fuel channel geometry deformation due to severe overheating. It is the main objective of this paper to discuss further verification of the CHAN-II (MOD 6) computer code against the second full scale 28-element experiment performed at WNRE under COG/CANDEV, designed to represent a Pickering type bundle geometry. The main models and assumptions used in the code will be briefly described. The objective of the experiments is to provide data for the assessment of the physical and mathematical models of the CHAN codes and produce data for code verification under integrated conditions with significant hydrogen production and flow rates similar to the LOCA/LOECI scenario. The issue of whether the Zr/steam reaction is sustainable in a full bundle geometry at elevated temperatures is also examined. A comparison between the predictions of CHAN-II (MOD 6) and the experimental results is discussed. (author). 12 refs., 17 figs.

  8. Post-test simulation and analysis of the second full scale CHAN 28-element experiment (validations of CHAN-II (MOD 6) against experiments)

    International Nuclear Information System (INIS)

    Bayoumi, M.H.; Muir, W.C.

    1995-01-01

    An experimental program, the CHAN Thermal Chemical Experimental Program, has been set up at WNRE under COG/CANDEV to assess and verify the physical and mathematical models of the CHAN codes. The program has been progressing from studying separate effects in single-element experiments to a full integrated mode in a CANDU 28-element bundle geometry. The CHAN-II series codes are used in the licensing analysis of CANDU reactors. The basic code provides an efficient tool to predict the thermal response of a fuel channel during postulated loss-of-coolant accidents (LOCA) with and without a loss of emergency coolant injection (LOECI) in which the transport of heat by convection is greatly reduced. The code models the progression of the event including fuel channel geometry deformation due to severe overheating. It is the main objective of this paper to discuss further verification of the CHAN-II (MOD 6) computer code against the second full scale 28-element experiment performed at WNRE under COG/CANDEV, designed to represent a Pickering type bundle geometry. The main models and assumptions used in the code will be briefly described. The objective of the experiments is to provide data for the assessment of the physical and mathematical models of the CHAN codes and produce data for code verification under integrated conditions with significant hydrogen production and flow rates similar to the LOCA/LOECI scenario. The issue of whether the Zr/steam reaction is sustainable in a full bundle geometry at elevated temperatures is also examined. A comparison between the predictions of CHAN-II (MOD 6) and the experimental results is discussed. (author). 12 refs., 17 figs

  9. Complementarity of integral and differential experiments for reactor physics purposes

    International Nuclear Information System (INIS)

    Tellier, Henry.

    1981-04-01

    In this paper, the following topics are studied: uranium 238 effective integral; thermal range uranium 238 capture cross section; Americium 242 m capture cross section. The mentioned examples show that differential and integral experiments are both useful to the reactor physicists

  10. Reliability program plan for the Kilowatt Isotope Power System (KIPS) technology verification phase

    International Nuclear Information System (INIS)

    1978-01-01

    This document is an integral part of the Kilowatt Isotope Power System (KIPS) Program Plan. This document defines the KIPS Reliability Program Plan for the Technology Verification Phase. This document delineates the reliability assurance tasks that are to be accomplished by Sundstrand and its suppliers during the design, fabrication and testing of the KIPS

  11. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.
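
    The contrast between a boolean correctness verdict and a quantitative fitness measure can be illustrated on a toy request-grant trace. The deadline-based score below is an invented example of the idea, not a measure from the QUAREM project itself:

```python
def boolean_correct(trace):
    """Classical view: the trace is correct iff every request is
    eventually matched by a grant (order-preserving pairing)."""
    pending = 0
    for event in trace:
        if event == "req":
            pending += 1
        elif event == "grant" and pending:
            pending -= 1
    return pending == 0


def fitness(trace, deadline=3):
    """Quantitative view (illustrative): a score in [0, 1] giving the
    fraction of requests granted within `deadline` steps of issue."""
    open_reqs, on_time, total = [], 0, 0
    for i, event in enumerate(trace):
        if event == "req":
            open_reqs.append(i)
            total += 1
        elif event == "grant" and open_reqs:
            issued = open_reqs.pop(0)
            if i - issued <= deadline:
                on_time += 1
    return on_time / total if total else 1.0
```

    A trace in which one grant arrives late is "correct" in the boolean sense yet scores below 1.0 quantitatively, which is precisely the nuance the abstract argues the boolean partition misses.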

  12. Integrated multiscale biomaterials experiment and modelling: a perspective

    Science.gov (United States)

    Buehler, Markus J.; Genin, Guy M.

    2016-01-01

    Advances in multiscale models and computational power have enabled a broad toolset to predict how molecules, cells, tissues and organs behave and develop. A key theme in biological systems is the emergence of macroscale behaviour from collective behaviours across a range of length and timescales, and a key element of these models is therefore hierarchical simulation. However, this predictive capacity has far outstripped our ability to validate predictions experimentally, particularly when multiple hierarchical levels are involved. The state of the art represents careful integration of multiscale experiment and modelling, and yields not only validation, but also insights into deformation and relaxation mechanisms across scales. We present here a sampling of key results that highlight both challenges and opportunities for integrated multiscale experiment and modelling in biological systems. PMID:28981126

  13. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. While testing, the best PBM is first selected for the test utterance in the maximum likelihood (ML) sense... We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types: target-wrong and impostor-wrong, while it maintains comparable TD-SV performance when impostors speak a correct utterance with respect to the conventional system...
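
    The ML selection of the best PBM for a test utterance can be sketched as follows. A single diagonal Gaussian stands in for the adapted background models of the paper, and all names, features, and parameter values are illustrative:

```python
import math

def log_likelihood(features, model):
    """Log-likelihood of a 1-D feature sequence under a single Gaussian
    (mu, var) -- a stand-in for the adapted PBMs described in the paper."""
    mu, var = model
    ll = 0.0
    for x in features:
        ll += -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)
    return ll

def select_pbm(features, pbms):
    """Pick the pass-phrase background model maximizing the likelihood of
    the test utterance, mirroring the ML selection step in the abstract.

    pbms maps a pass-phrase name to its (mu, var) model (illustrative)."""
    return max(pbms, key=lambda name: log_likelihood(features, pbms[name]))
```

    In the full system, the selected PBM then fixes which pass-phrase specific target model scores the utterance, so pass-phrase identification and speaker verification happen in one pass.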

  14. Verification and clarification of patterns of sensory integrative dysfunction.

    Science.gov (United States)

    Mailloux, Zoe; Mulligan, Shelley; Roley, Susanne Smith; Blanche, Erna; Cermak, Sharon; Coleman, Gina Geppert; Bodison, Stefanie; Lane, Christianne Joy

    2011-01-01

    Building on established relationships between the constructs of sensory integration in typical and special needs populations, in this retrospective study we examined patterns of sensory integrative dysfunction in 273 children ages 4-9 who had received occupational therapy evaluations in two private practice settings. Test results on the Sensory Integration and Praxis Tests, portions of the Sensory Processing Measure representing tactile overresponsiveness, and parent report of attention and activity level were included in the analyses. Exploratory factor analysis identified patterns similar to those found in early studies by Ayres (1965, 1966a, 1966b, 1969, 1972b, 1977, & 1989), namely Visuodyspraxia and Somatodyspraxia, Vestibular and Proprioceptive Bilateral Integration and Sequencing, Tactile and Visual Discrimination, and Tactile Defensiveness and Attention. Findings reinforce associations between constructs of sensory integration and assist with understanding sensory integration disorders that may affect childhood occupation. Limitations include the potential for subjective interpretation in factor analysis and inability to adjust measures available in charts in a retrospective research.

  15. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
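
    The dose/distance-to-agreement criterion (e.g., 3%/3 mm) advocated above is commonly evaluated with a gamma index, where a point passes when gamma <= 1. A minimal 1-D sketch follows, assuming both profiles share one grid, global dose normalization, and no interpolation between samples:

```python
import math

def gamma_index(ref, meas, spacing=1.0, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma index for two dose profiles on the same grid.

    ref, meas : dose samples; spacing : grid step in mm.
    dose_tol  : fractional dose criterion (0.03 -> 3% of global max).
    dist_tol  : distance-to-agreement criterion in mm.
    Simplified sketch: global normalization, exhaustive search, no
    sub-sample interpolation."""
    d_max = max(ref)
    gammas = []
    for i, dm in enumerate(meas):
        best = float("inf")
        for j, dr in enumerate(ref):
            dose_term = ((dm - dr) / (dose_tol * d_max)) ** 2
            dist_term = ((i - j) * spacing / dist_tol) ** 2
            best = min(best, math.sqrt(dose_term + dist_term))
        gammas.append(best)
    return gammas
```

    Reporting the fraction of points with gamma <= 1 gives exactly the kind of single quantitative acceptability number the paper recommends pairing with a spatial display of where discrepancies occur.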

  16. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for localisation of LOCAs, and passive RHRS through in-reactor HXs. These features defined the main trends in experimental investigations and verification efforts for the computer codes applied. The paper briefly reviews the performed experimental investigation of the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. The assessment of RELAP5/mod3 applicability for accident analysis in integral reactors is presented.

  17. Fusion Ignition Research Experiment System Integration

    International Nuclear Information System (INIS)

    Brown, T.

    1999-01-01

    The FIRE (Fusion Ignition Research Experiment) configuration has been designed to meet the physics objectives and subsystem requirements in an arrangement that allows remote maintenance of in-vessel components and hands-on maintenance of components outside the TF (toroidal-field) boundary. The general arrangement consists of sixteen wedge-shaped TF coils that surround a free-standing central solenoid (CS), a double-wall vacuum vessel and internal plasma-facing components. A center tie rod is used to help support the vertical magnetic loads and a compression ring is used to maintain wedge pressure in the inboard corners of the TF coils. The magnets are liquid nitrogen cooled and the entire device is surrounded by a thermal enclosure. The double-wall vacuum vessel integrates cooling and shielding in a shape that maximizes shielding of ex-vessel components. The FIRE configuration development and integration process has evolved from an early stage of concept selection to a higher level of machine definition and component details. This paper describes the status of the configuration development and the integration of the major subsystem components

  18. Tracer experiment data sets for the verification of local and meso-scale atmospheric dispersion models including topographic effects

    International Nuclear Information System (INIS)

    Sartori, E.; Schuler, W.

    1992-01-01

    Software and data for nuclear energy applications are acquired, tested and distributed by several information centres; in particular, relevant computer codes are distributed internationally by the OECD/NEA Data Bank (France) and by ESTSC and EPIC/RSIC (United States). This activity is coordinated among the centres and is extended outside the OECD area through an arrangement with the IAEA. This article proposes, more specifically, a scheme for acquiring, storing and distributing atmospheric tracer experiment (ATE) data required for the verification of atmospheric dispersion models, especially the most advanced ones that include topographic effects and are specific to the local and meso-scale. These well-documented data sets will form a valuable complement to the set of atmospheric dispersion computer codes distributed internationally. Modellers will be able to gain confidence in the predictive power of their models or to verify their modelling skills. (au)

  19. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The book concludes with a presentation of runtime verification solutions.

  20. Support of Construction and Verification of Out-of-Pile Fuel Assembly Test Facilities

    International Nuclear Information System (INIS)

    Park, Nam Gyu; Kim, K. T.; Park, J. K.

    2006-12-01

    Fuel assemblies and components must be verified in out-of-pile test facilities before a newly developed fuel can be loaded into a reactor. Although most component-level tests have been performed in domestic facilities, assembly-level tests have depended on overseas facilities because of a lack of domestic capability. KAERI has started to construct assembly-level mechanical/hydraulic test facilities, and KNF, as the end user, is supporting the construction using the technologies developed through its fuel development programs. The work performed is as follows: - Test assembly shipping container design and manufacturing support - Fuel handling tool design: gripper, upper and lower core simulators for the assembly mechanical test facility, internals for the assembly hydraulic test facility - Manufacture of test specimens: a skeleton and an assembly for preliminary functional verification of the assembly mechanical/hydraulic test facilities, two assemblies for the verification of the assembly mechanical/hydraulic test facilities, instrumented rod design and integrity evaluation - Verification of the assembly mechanical/hydraulic test facilities: test data evaluation

  1. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  2. Support of Construction and Verification of Out-of-Pile Fuel Assembly Test Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam Gyu; Kim, K. T.; Park, J. K. [KNF, Daejeon (Korea, Republic of)] (and others)

    2006-12-15

    Fuel assemblies and components must be verified in out-of-pile test facilities before a newly developed fuel can be loaded into a reactor. Although most component-level tests have been performed in domestic facilities, assembly-level tests have depended on overseas facilities because of a lack of domestic capability. KAERI has started to construct assembly-level mechanical/hydraulic test facilities, and KNF, as the end user, is supporting the construction using the technologies developed through its fuel development programs. The work performed is as follows: - Test assembly shipping container design and manufacturing support - Fuel handling tool design: gripper, upper and lower core simulators for the assembly mechanical test facility, internals for the assembly hydraulic test facility - Manufacture of test specimens: a skeleton and an assembly for preliminary functional verification of the assembly mechanical/hydraulic test facilities, two assemblies for the verification of the assembly mechanical/hydraulic test facilities, instrumented rod design and integrity evaluation - Verification of the assembly mechanical/hydraulic test facilities: test data evaluation.

  3. The active phasing experiment: Part I. Concept and objectives

    Science.gov (United States)

    Yaitskova, Natalia; Gonte, Frederic; Derie, Frederic; Noethe, Lothar; Surdej, Isabelle; Karban, Robert; Dohlen, Kjetil; Langlois, Maud; Esposito, Simone; Pinna, Enrico; Reyes, Marcos; Montoya, Lusma; Terrett, David

    2006-06-01

    In the framework of the ELT design study, our group is building the Active Phasing Experiment (APE), whose main goal is to demonstrate the non-adaptive wavefront control scheme and technology for an Extremely Large Telescope (ELT). The experiment includes the verification and testing of different phasing sensors and the integration of a phasing wavefront sensor into the global active-control scheme of a segmented telescope. After a sufficient number of laboratory tests, APE will be mounted and tested on sky at a Nasmyth focus of a VLT unit telescope. The paper presents APE as a demonstrator of particular aspects of an ELT and provides a general understanding of the strategy of segmented-mirror active control.

  4. Pathways for implementing REDD+. Experiences from carbon markets and communities

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, X; Ravnkilde Moeller, L; Lopez, T De; Romero, M Z

    2011-07-01

    This issue of Carbon Market Perspectives on 'Pathways for implementing REDD+: Experience from carbon markets and communities' discusses the role of carbon markets in scaling up investments for REDD+ in developing countries. Nine articles authored by experienced negotiators on REDD+, carbon market actors, project developers and other leading experts share experiences and make suggestions on the key elements of a future international REDD+ regime: Architecture and underlying principles, measuring, reporting and verification (MRV), private-sector involvement, the rights of indigenous people and local communities, biodiversity conservation and environmental integrity. The articles are grouped under three main topics: the lessons of existing REDD+ projects; the future REDD+ regime and the role of carbon markets; and experiences and ideas about the involvement of indigenous people and local communities. (LN)

  5. Analytical and Experimental Verification of a Flight Article for a Mach-8 Boundary-Layer Experiment

    Science.gov (United States)

    Richards, W. Lance; Monaghan, Richard C.

    1996-01-01

    Preparations for a boundary-layer transition experiment to be conducted on a future flight mission of the air-launched Pegasus(TM) rocket are underway. The experiment requires a flight-test article called a glove to be attached to the wing of the Mach-8 first-stage booster. A three-dimensional, nonlinear finite-element analysis has been performed and significant small-scale laboratory testing has been accomplished to ensure the glove design integrity and quality of the experiment. Reliance on both the analysis and experiment activities has been instrumental in the success of the flight-article design. Results obtained from the structural analysis and laboratory testing show that all glove components are well within the allowable thermal stress and deformation requirements to satisfy the experiment objectives.

  6. Evaluation of Face Detection Algorithms for the Bank Client Identity Verification

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2017-06-01

    Results of an investigation of the efficiency of face detection algorithms in a banking client visual verification system are presented. The video recordings were made under real conditions in three bank operating outlets, employing a miniature industrial USB camera. The aim of the experiments was to check the practical usability of face detection methods in a biometric bank client verification system. The main assumption was to keep user interaction with the application as simple as possible. The face detection algorithms applied are described, and the results achieved under real bank environment conditions are presented. Practical limitations of the application, based on the problems encountered, are discussed.
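The abstract does not state the evaluation metric used; a common way to score detectors against annotated frames is a ground-truth match rate based on intersection-over-union (IoU). The sketch below is illustrative only (the 0.5 threshold and the `(x, y, w, h)` box format are assumptions, not taken from the paper):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned (x, y, w, h) boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))  # overlap width
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))  # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def detection_rate(detections, ground_truth, iou_thresh=0.5):
    """Fraction of annotated faces matched by at least one detection."""
    hits = sum(any(iou(d, g) >= iou_thresh for d in detections)
               for g in ground_truth)
    return hits / len(ground_truth) if ground_truth else 0.0
```

A per-outlet comparison of this rate across algorithms would reproduce the kind of efficiency ranking the study reports.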

  7. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument errors.

  8. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    Science.gov (United States)

    Yamaoka, S

    1995-06-01

    Uniqueness theory holds that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness-seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness-seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness-seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subjects factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach and are discussed in terms of the effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  9. You Can't See the Real Me: Attachment Avoidance, Self-Verification, and Self-Concept Clarity.

    Science.gov (United States)

    Emery, Lydia F; Gardner, Wendi L; Carswell, Kathleen L; Finkel, Eli J

    2018-03-01

    Attachment shapes people's experiences in their close relationships and their self-views. Although attachment avoidance and anxiety both undermine relationships, past research has primarily emphasized detrimental effects of anxiety on the self-concept. However, as partners can help people maintain stable self-views, avoidant individuals' negative views of others might place them at risk for self-concept confusion. We hypothesized that avoidance would predict lower self-concept clarity and that less self-verification from partners would mediate this association. Attachment avoidance was associated with lower self-concept clarity (Studies 1-5), an effect that was mediated by low self-verification (Studies 2-3). The association between avoidance and self-verification was mediated by less self-disclosure and less trust in partner feedback (Study 4). Longitudinally, avoidance predicted changes in self-verification, which in turn predicted changes in self-concept clarity (Study 5). Thus, avoidant individuals' reluctance to trust or become too close to others may result in hidden costs to the self-concept.

  10. Dosimetric pre-treatment verification of IMRT using an EPID; clinical experience

    International Nuclear Information System (INIS)

    Zijtveld, Mathilda van; Dirkx, Maarten L.P.; Boer, Hans C.J. de; Heijmen, Ben J.M.

    2006-01-01

    Background and purpose: In our clinic a QA program for IMRT verification, fully based on dosimetric measurements with electronic portal imaging devices (EPID), has been running for over 3 years. The program includes a pre-treatment dosimetric check of all IMRT fields. During a complete treatment simulation at the linac, a portal dose image (PDI) is acquired with the EPID for each patient field and compared with a predicted PDI. In this paper, the results of this pre-treatment procedure are analysed, and intercepted errors are reported. An automated image analysis procedure is proposed to limit the number of fields that need human intervention in PDI comparison. Materials and methods: Most of our analyses are performed using the γ index with 3% local dose difference and 3 mm distance to agreement as reference values. Scalar parameters are derived from the γ values to summarize the agreement between measured and predicted 2D PDIs. Areas with all pixels having γ values larger than one are evaluated, making decisions based on clinically relevant criteria more straightforward. Results: In 270 patients, the pre-treatment checks revealed four clinically relevant errors. Calculation of statistics for a group of 75 patients showed that the patient-averaged mean γ value inside the field was 0.43 ± 0.13 (1 SD) and only 6.1 ± 6.8% of pixels had a γ value larger than one. With the proposed automated image analysis scheme, visual inspection of images can be avoided in 2/3 of the cases. Conclusion: EPIDs may be used for high accuracy and high resolution routine verification of IMRT fields to intercept clinically relevant dosimetric errors prior to the start of treatment. For the majority of fields, PDI comparison can fully rely on an automated procedure, avoiding excessive workload
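The γ-index comparison described (3% local dose difference, 3 mm distance to agreement, areas with γ > 1 flagged) can be sketched in one dimension as a brute-force search over the predicted profile. This is an illustrative implementation, not the authors' EPID analysis software:

```python
import math

def gamma_index(measured, predicted, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """1D gamma index with a local dose-difference criterion.

    measured/predicted: dose samples on a common grid of pitch `spacing_mm`.
    dose_tol: fractional local dose tolerance (3%); dta_mm: distance to agreement.
    """
    gammas = []
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dp in enumerate(predicted):
            if dp == 0:
                continue
            dist = abs(i - j) * spacing_mm
            dose_diff = (dm - dp) / dp  # local dose difference
            best = min(best, (dist / dta_mm) ** 2 + (dose_diff / dose_tol) ** 2)
        gammas.append(math.sqrt(best))
    return gammas
```

Summary statistics such as the mean γ inside the field, or the fraction of points with γ > 1, follow directly from the returned list, mirroring the scalar parameters used in the paper.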

  11. Summarisation of construction and commissioning experience for nuclear power integrated test facility

    International Nuclear Information System (INIS)

    Xiao Zejun; Jia Dounan; Jiang Xulun; Chen Bingde

    2003-01-01

    Since its foundation, the Nuclear Power Institute of China has designed various engineering experimental facilities, constructed a nuclear power experimental research base, and accumulated rich experience in the construction of nuclear power integrated test facilities. The author presents experience in the design, construction and commissioning of nuclear power integrated test facilities.

  12. Wind turbine power performance verification in complex terrain and wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Friis Pedersen, T.; Gjerding, S.; Ingham, P.; Enevoldsen, P.; Kjaer Hansen, J.; Kanstrup Joergensen, H.

    2002-04-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work carried out to support the basis for this standardisation effort. The work drew on experience from several national and international research projects and on contractual and field experience gained within the wind energy community. It was wide-ranging and addressed 'grey' areas of knowledge in existing methodologies, which were then investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guarantees on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark; and anemometry and the influence of inclined flow. (au)

  13. Being known, intimate, and valued: global self-verification and dyadic adjustment in couples and roommates.

    Science.gov (United States)

    Katz, Jennifer; Joiner, Thomas E

    2002-02-01

    We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience greater relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with, and somewhat more committed to, partners when they perceived that those partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.

  14. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems that combine discrete components with continuous components. The continuous components, usually called plants, are subject to disturbances that cause the state variables of the system to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper presents a design methodology for hybrid systems as an example of their specification and verification. The design methodology is based on cooperation between two disciplines, control engineering and computer science, and divides a design into control loops and decision loops. The external behavior of the control loops is specified in a notation understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is also guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of the control loops. The verification of system requirements can be done by computer scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as the formal notation for specifying the control loops and designing the decision loops.

  15. Finite element program ARKAS: verification for IAEA benchmark problem analysis on core-wide mechanical analysis of LMFBR cores

    International Nuclear Information System (INIS)

    Nakagawa, M.; Tsuboi, Y.

    1990-01-01

    ''ARKAS'' code verification, with the problems set in the International Working Group on Fast Reactors (IWGFR) Coordinated Research Programme (CRP) on the inter-comparison between liquid metal cooled fast breeder reactor (LMFBR) Core Mechanics Codes, is discussed. The CRP was co-ordinated by the IWGFR around problems set by Dr. R.G. Anderson (UKAEA) and arose from the IWGFR specialists' meeting on The Predictions and Experience of Core Distortion Behaviour (ref. 2). The problems for the verification (''code against code'') and validation (''code against experiment'') were set and calculated by eleven core mechanics codes from nine countries. All the problems have been completed and were solved with the core structural mechanics code ARKAS. Predictions by ARKAS agreed very well with other solutions for the well-defined verification problems. For the validation problems based on Japanese ex-reactor 2-D thermo-elastic experiments, the agreements between measured and calculated values were fairly good. This paper briefly describes the numerical model of the ARKAS code, and discusses some typical results. (author)

  16. An unattended verification station for UF6 cylinders: Field trial findings

    Science.gov (United States)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  17. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
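The verification approach described, checking code output against independent hand or spreadsheet calculations, can be sketched generically. The tritium decay example and the tolerance below are illustrative assumptions, not taken from the RESRAD-BUILD test cases:

```python
import math

def verify_against_hand_calc(code_result, hand_result, rel_tol=1e-6):
    """Cross-check a code value against an independently derived value."""
    return math.isclose(code_result, hand_result, rel_tol=rel_tol)

# Illustrative check: H-3 activity after one half-life, computed two ways.
half_life_y = 12.32                                # H-3 half-life, years
lam = math.log(2.0) / half_life_y                  # decay constant, 1/years
code_value = 100.0 * math.exp(-lam * half_life_y)  # "code" pathway
hand_value = 100.0 / 2.0                           # hand calculation
assert verify_against_hand_calc(code_value, hand_value)
```

The step-by-step verification in the report amounts to many such comparisons, one per pathway and test case, with discrepancies escalated to external codes such as MCNP.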

  18. Performance Verification for Safety Injection Tank with Fluidic Device

    International Nuclear Information System (INIS)

    Yune, Seok Jeong; Kim, Da Yong

    2014-01-01

    In a LBLOCA, the SITs of a conventional nuclear power plant deliver excessive cooling water to the reactor vessel, causing water to flow into the containment atmosphere. To make injection more efficient, a Fluidic Device (FD) is installed inside each SIT of the Advanced Power Reactor 1400 (APR 1400). The FD, a completely passive controller that requires no actuating power, controls the injection flow rate, which is governed by the flow resistance inside the vortex chamber of the FD. When the SIT Emergency Core Cooling (ECC) water level is above the top of the stand pipe, water enters the vortex chamber through both the top of the stand pipe and the control ports, resulting in injection at a large flow rate. When the water level drops below the top of the stand pipe, water enters the vortex chamber only through the control ports, producing a vortex in the chamber and a relatively small injection flow. Performance verification of the SITs must be carried out because they play an integral role in mitigating accidents. In this paper, a performance verification method for the SIT with FD is presented: equations for the calculation of the flow resistance coefficient (K) are derived to evaluate the on-site performance of the APR 1400 SIT with FD, and these equations are applied to the performance verification, with good results obtained.
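The abstract does not reproduce the paper's exact expression for K; the conventional dimensionless loss-coefficient form, K = ΔP / (½ρv²), can be evaluated as in this sketch (the function name and sample values are illustrative assumptions):

```python
def flow_resistance_coefficient(dp_pa, rho_kg_m3, velocity_m_s):
    """Dimensionless flow resistance coefficient K = dP / (0.5 * rho * v^2).

    Conventional loss-coefficient form; the paper's own derivation of K for
    the vortex chamber is not reproduced here.
    """
    return dp_pa / (0.5 * rho_kg_m3 * velocity_m_s ** 2)

# Illustrative evaluation for one operating point:
k_value = flow_resistance_coefficient(dp_pa=1000.0, rho_kg_m3=1000.0,
                                      velocity_m_s=1.0)
```

In this framing, the vortexing (low-flow) phase corresponds to a much larger K than the high-flow phase, matching the two injection regimes described above.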

  19. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task; therefore, an efficient way to perform verification with less effort in a shorter time is needed. In this work, we propose a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe a register specification in the IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template that is translated into an IP-XACT description, from which register models can be generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use register models without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
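As a sketch of the spreadsheet-to-IP-XACT translation step, the snippet below turns tabular register rows into a minimal register-map XML fragment. The element names follow the IP-XACT register-map vocabulary, but the output is schematic (no namespaces or full schema compliance), and the row data are hypothetical:

```python
import xml.etree.ElementTree as ET

# Hypothetical rows as they might appear in a spreadsheet export:
# (name, address offset, width in bits, access)
ROWS = [
    ("CTRL",   "0x00", 32, "read-write"),
    ("STATUS", "0x04", 32, "read-only"),
]

def rows_to_ipxact(rows):
    """Emit a minimal IP-XACT-style register map from spreadsheet rows."""
    block = ET.Element("addressBlock")
    for name, offset, width, access in rows:
        reg = ET.SubElement(block, "register")
        ET.SubElement(reg, "name").text = name
        ET.SubElement(reg, "addressOffset").text = offset
        ET.SubElement(reg, "size").text = str(width)
        ET.SubElement(reg, "access").text = access
    return ET.tostring(block, encoding="unicode")
```

A commercial register-model generator would then consume the (complete, schema-valid) IP-XACT description to produce the UVM register model, as described in the abstract.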

  20. Beam Extinction Monitoring in the Mu2e Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Prebys, Eric [Fermilab; Bartoszek, Larry [Technicare; Gaponenko, Andrei [Fermilab; Kasper, Peter [Fermilab

    2015-06-01

    The Mu2e Experiment at Fermilab will search for the conversion of a muon to an electron in the field of an atomic nucleus with unprecedented sensitivity. The experiment requires a beam consisting of proton bunches approximately 200ns FW long, separated by 1.7 microseconds, with no out-of-time protons at the 10⁻¹⁰ fractional level. The verification of this level of extinction is very challenging. The proposed technique uses a special purpose spectrometer which will observe particles scattered from the production target of the experiment. The acceptance will be limited such that there will be no saturation effects from the in-time beam. The precise level and profile of the out-of-time beam can then be built up statistically, by integrating over many bunches.
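The statistical build-up of the extinction measurement can be illustrated with a simple Poisson counting argument: if no out-of-time protons are observed while N in-time protons are effectively sampled, the upper limit at confidence level CL on the out-of-time fraction is -ln(1 - CL)/N. The function below is an illustrative sketch, not the experiment's actual analysis:

```python
import math

def extinction_upper_limit(n_protons_sampled, n_out_of_time_observed=0, cl=0.90):
    """Poisson upper limit on the out-of-time fraction (zero-count case).

    For zero observed counts, the CL upper limit on the Poisson mean is
    -ln(1 - cl); dividing by the number of protons sampled gives an upper
    limit on the fractional extinction.
    """
    if n_out_of_time_observed != 0:
        raise NotImplementedError("sketch covers the zero-count case only")
    return -math.log(1.0 - cl) / n_protons_sampled
```

This shows why the measurement must integrate over very many bunches: demonstrating extinction at the 10⁻¹⁰ level requires sampling well over 10¹⁰ protons with no out-of-time observation.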

  1. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    Science.gov (United States)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
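The log-ratio technique mentioned above maps the ratio of opposing pickup-electrode signals to a transverse beam position. A minimal sketch, assuming positive signal amplitudes and a hypothetical sensitivity of 10 mm per decade (the real constant depends on the BPM geometry and electronics calibration):

```python
import math

def bpm_position_log_ratio(v_a, v_b, k_mm_per_decade=10.0):
    """Beam position from the log-ratio of opposing electrode signals.

    x ~ k * log10(V_A / V_B); k is an assumed geometry-dependent
    sensitivity, and v_a, v_b must be positive amplitudes.
    """
    return k_mm_per_decade * math.log10(v_a / v_b)
```

A calibration run sweeps known signal ratios through the processing chain and fits k (and any offset), which is what the automated between-pulse procedure re-verifies without interrupting beam delivery.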

  2. Development and verification of the neutron diffusion solver for the GeN-Foam multi-physics platform

    International Nuclear Information System (INIS)

    Fiorina, Carlo; Kerkar, Nordine; Mikityuk, Konstantin; Rubiolo, Pablo; Pautz, Andreas

    2016-01-01

Highlights: • Development and verification of a neutron diffusion solver based on OpenFOAM. • Integration in the GeN-Foam multi-physics platform. • Implementation and verification of acceleration techniques. • Implementation of isotropic discontinuity factors. • Automatic adjustment of discontinuity factors. - Abstract: The Laboratory for Reactor Physics and Systems Behaviour at the PSI and the EPFL has in recent years been developing a new code system for reactor analysis based on OpenFOAM®. The objective is to supplement available legacy codes with a modern tool featuring state-of-the-art characteristics in terms of scalability, programming approach and flexibility. As part of this project, a new solver has been developed for the eigenvalue and transient solution of multi-group diffusion equations. Several features distinguish the developed solver from other available codes, in particular: object-oriented programming to ease code modification and maintenance; modern parallel computing capabilities; use of general unstructured meshes; possibility of mesh deformation; cell-wise parametrization of cross-sections; and arbitrary energy group structure. In addition, the solver is integrated into the GeN-Foam multi-physics solver. The general features of the solver and its integration with GeN-Foam have already been presented in previous publications. The present paper describes the diffusion solver in more detail and provides an overview of new features recently implemented, including the use of acceleration techniques and discontinuity factors. In addition, a code verification is performed through comparison with Monte Carlo results for both a thermal and a fast reactor system.
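    As a sketch of what such an eigenvalue diffusion solver does at its core, the following solves a one-group, one-dimensional slab problem by power iteration. The cross-sections, slab width and boundary treatment are illustrative assumptions, not values from the GeN-Foam verification cases:

```python
import numpy as np

# One-group, 1-D slab diffusion eigenvalue problem solved by power
# iteration:  -D phi'' + Sigma_a phi = (1/k) nu*Sigma_f phi.
D, sig_a, nu_sig_f = 1.0, 0.07, 0.08   # cm, 1/cm, 1/cm (illustrative)
L, n = 100.0, 200                       # slab width (cm), mesh cells
h = L / n

# Finite-difference loss operator with zero-flux boundaries
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 2 * D / h**2 + sig_a
    if i > 0:
        A[i, i - 1] = -D / h**2
    if i < n - 1:
        A[i, i + 1] = -D / h**2

phi = np.ones(n)
k = 1.0
for _ in range(500):                    # power iteration on A^-1 F
    src = nu_sig_f * phi
    phi_new = np.linalg.solve(A, src / k)
    k *= phi_new.sum() / phi.sum()      # eigenvalue update
    phi = phi_new / np.linalg.norm(phi_new)

print(f"k-effective = {k:.5f}")
```

The converged k sits between the infinite-medium value νΣf/Σa ≈ 1.143 and the leakage-corrected estimate νΣf/(Σa + DB²) with B ≈ π/L, which is the kind of analytic cross-check a code-verification exercise against Monte Carlo results generalizes.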

  3. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    Science.gov (United States)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  4. Ispitivanje pogodnosti za održavanje elektronskih uređaja / Demonstration and verification of the maintainability of electronic devices

    Directory of Open Access Journals (Sweden)

    Dragan Ćosović

    2003-11-01

The demonstration and verification of maintainability during the development of a device is an integral part of the Maintainability Plan and Programme. During demonstration and verification, the quantitative and qualitative parameters of maintainability are checked; the quantitative parameters are verified by applying statistical methods. The paper gives an example of applying statistical methods to demonstrate the suitability of an electronic device for corrective maintenance, and the test results are analyzed.
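    A quantitative maintainability demonstration of the kind described typically tests whether the mean corrective-maintenance time meets its requirement at a stated confidence level. A minimal sketch using a one-sided normal-approximation acceptance rule (the repair-time sample and the 1.0 h requirement below are invented for illustration, not data from the paper):

```python
import math
import statistics

def mttr_demonstration(repair_times_h, required_mttr_h, z=1.645):
    """One-sided test (95% confidence, normal approximation) that the
    true mean time to repair (MTTR) does not exceed the requirement:
    accept if  mean + z * s / sqrt(n)  <=  requirement."""
    n = len(repair_times_h)
    mean = statistics.mean(repair_times_h)
    s = statistics.stdev(repair_times_h)
    upper_bound = mean + z * s / math.sqrt(n)
    return upper_bound, upper_bound <= required_mttr_h

# Hypothetical observed corrective-maintenance times (hours)
times = [0.4, 0.6, 0.5, 0.8, 0.3, 0.7, 0.5, 0.6, 0.4, 0.5]
bound, accepted = mttr_demonstration(times, required_mttr_h=1.0)
print(f"95% upper bound on MTTR: {bound:.2f} h, accept: {accepted}")
```

Classical maintainability demonstration plans follow this shape (observe a sample of repair tasks, compute a confidence bound, compare against the requirement), though the exact statistic depends on the assumed repair-time distribution.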

  5. Update on Monitoring Technologies for International Safeguards and Fissile Material Verification

    International Nuclear Information System (INIS)

    Croessmann, C. Dennis; Glidewell, Don D.; Mangan, Dennis L.; Smathers, Douglas C.

    1999-01-01

Monitoring technologies are playing an increasingly important part in international safeguards and fissile material verification. The developments reduce the time an inspector must spend at a site while assuring continuity of knowledge. Continued development of monitoring technologies has produced new seal systems and advances in integrated video surveillance under consideration for use in the Trilateral Initiative. This paper will present recent developments for monitoring systems at Embalse, Argentina; VNIIEF, Sarov, Russia; and the Savannah River Site, Aiken, South Carolina.

  6. International Students' Experiences of Integrating into the Workforce

    Science.gov (United States)

    Nunes, Sarah; Arthur, Nancy

    2013-01-01

    This study explored the integration experiences of 16 international students entering the Canadian workforce using a semistructured interview and constant comparison method. The international students were pursuing immigration to Canada, despite unmet job prospects. Students recommended that employers refrain from discriminating against students…

  7. Progress in heavy ion driven inertial fusion energy: From scaled experiments to the integrated research experiment

    International Nuclear Information System (INIS)

    Barnard, J.J.; Ahle, L.E.; Baca, D.; Bangerter, R.O.; Bieniosek, F.M.; Celata, C.M.; Chacon-Golcher, E.; Davidson, R.C.; Faltens, A.; Friedman, A.; Franks, R.M.; Grote, D.P.; Haber, I.; Henestroza, E.; Hoon, M.J.L. de; Kaganovich, I.; Karpenko, V.P.; Kishek, R.A.; Kwan, J.W.; Lee, E.P.; Logan, B.G.; Lund, S.M.; Meier, W.R.; Molvik, A.W.; Olson, C.; Prost, L.R.; Qin, H.; Rose, D.; Sabbi, G.-L.; Sangster, T.C.; Seidl, P.A.; Sharp, W.M.; Shuman, D.; Vay, J.-L.; Waldron, W.L.; Welch, D.; Yu, S.S.

    2001-01-01

The promise of inertial fusion energy driven by heavy ion beams requires the development of accelerators that produce ion currents (∼100s of amperes per beam) and ion energies (∼1-10 GeV) that have not been achieved simultaneously in any existing accelerator. The high currents imply high generalized perveances, large tune depressions, and high space charge potentials of the beam center relative to the beam pipe. Many of the scientific issues associated with ion beams of high perveance and large tune depression have been addressed over the last two decades on scaled experiments at Lawrence Berkeley and Lawrence Livermore National Laboratories, the University of Maryland, and elsewhere. The additional requirement of high space charge potential (or equivalently high line charge density) gives rise to effects (particularly the role of electrons in beam transport) which must be understood before proceeding to a large scale accelerator. The first phase of a new series of experiments in the Heavy Ion Fusion Virtual National Laboratory (HIF VNL), the High Current Experiment (HCX), is now being constructed at LBNL. The mission of the HCX will be to transport beams with driver line charge density so as to investigate the physics of this regime, including constraints on the maximum radial filling factor of the beam through the pipe. This factor is important for determining both the cost and reliability of a driver-scale accelerator. The HCX will provide data for the design of the next steps in the sequence of experiments leading to an inertial fusion energy power plant. The focus of the program after the HCX will be on integration of all of the manipulations required for a driver. In the near term following HCX, an Integrated Beam Experiment (IBX) of the same general scale as the HCX is envisioned. The step which bridges the gap between the IBX and an engineering test facility for fusion has been designated the Integrated Research Experiment (IRE). The IRE (like the IBX) will provide an
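    For reference, the "generalized perveance" quoted above is the standard dimensionless measure of space-charge strength in a beam. In the common textbook convention (e.g., Reiser's), for a beam of current I carried by particles of charge q and mass m:

```latex
K = \frac{2I}{I_0\,\beta^3\gamma^3},
\qquad
I_0 = \frac{4\pi\varepsilon_0 m c^3}{q},
```

with β and γ the usual relativistic factors and I₀ the characteristic current of the species. "High generalized perveance" then means space-charge forces comparable to the external focusing, which is exactly the regime the scaled experiments were built to probe.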

  8. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  9. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  10. Establishment of the Integrated Plant Data Warehouse

    International Nuclear Information System (INIS)

    Oota, Yoshimi; Yoshinaga, Toshiaki

    1999-01-01

This paper presents 'The Establishment of the Integrated Plant Data Warehouse and Verification Tests on Inter-corporate Electronic Commerce based on the Data Warehouse (PDWH)', one of the 'Shared Infrastructure for the Electronic Commerce Consolidation Project' efforts promoted by the Ministry of International Trade and Industry (MITI) through the Information-Technology Promotion Agency (IPA), Japan. A study group called Japan Plant EC (PlantEC) was organized to perform the relevant activities. One of the main activities of PlantEC is the construction of the Integrated Plant Data Warehouse, spanning manufacturers, engineering companies, plant construction companies, machinery and parts manufacturers, etc., which is an essential part of the infrastructure needed for a system to share information across the industrial life cycle, from planning/design to operation/maintenance. Another activity is the use of this warehouse to conduct verification tests proving its usefulness. Through these verification tests, PlantEC will endeavor to establish a warehouse with standardized data that can serve as the infrastructure of EC in the process plant industry. (author)

  11. Establishment of the Integrated Plant Data Warehouse

    Energy Technology Data Exchange (ETDEWEB)

Oota, Yoshimi; Yoshinaga, Toshiaki [Hitachi Works, Hitachi Ltd., Hitachi, Ibaraki (Japan)]

    1999-07-01

This paper presents 'The Establishment of the Integrated Plant Data Warehouse and Verification Tests on Inter-corporate Electronic Commerce based on the Data Warehouse (PDWH)', one of the 'Shared Infrastructure for the Electronic Commerce Consolidation Project' efforts promoted by the Ministry of International Trade and Industry (MITI) through the Information-Technology Promotion Agency (IPA), Japan. A study group called Japan Plant EC (PlantEC) was organized to perform the relevant activities. One of the main activities of PlantEC is the construction of the Integrated Plant Data Warehouse, spanning manufacturers, engineering companies, plant construction companies, machinery and parts manufacturers, etc., which is an essential part of the infrastructure needed for a system to share information across the industrial life cycle, from planning/design to operation/maintenance. Another activity is the use of this warehouse to conduct verification tests proving its usefulness. Through these verification tests, PlantEC will endeavor to establish a warehouse with standardized data that can serve as the infrastructure of EC in the process plant industry. (author)

  12. Verification of FPGA-Signal using the test board which is applied to Safety-related controller

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Youn-Hu; Yoo, Kwanwoo; Lee, Myeongkyun; Yun, Donghwa [SOOSAN ENS, Seoul (Korea, Republic of)

    2016-10-15

This article provides a verification method for the BGA-type FPGA of a programmable logic controller (PLC) developed as a safety-class system. The FPGA logic in the safety-class control device implements the overall control logic of the PLC. A safety-related PLC must meet international standards; for this reason, verification and validation (V&V) according to the international standard is applied in order to secure high reliability and safety, and a variety of verification steps are carried out for additional reliability and safety analysis. For efficient verification of test results, this paper proposes a test using a newly modified BGA socket that resolves the problems of the conventional socket. Verification is divided into hardware and firmware verification, carried out in unit testing and integration testing. The proposed test method is simple and reduces cost through batch processing. In addition, it is advantageous for measuring signals from high-speed ICs because of the short pin length and the copper plating around the pins, and it prevents abrasion of the IC balls since they make no direct contact with the PCB. It can therefore be applied in practice to BGA package testing, making it easy to verify the logic as well as to check the operation of the design.

  13. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for approaches to other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing that we will face all of these challenges even if disarmament is not achieved, this paper explores possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  14. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    Science.gov (United States)

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  15. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
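    The verification idea can be illustrated on a toy system: solve a single ideal-gas dissociation equilibrium A2 ⇌ 2A, then independently check elemental mass balance and the mass-action residual. This is the kind of a-posteriori consistency test an external verifier can run on any solver's answer; all numbers here are hypothetical:

```python
# Hypothetical one-reaction ideal-gas system A2 <-> 2A with
# equilibrium constant K at fixed temperature and pressure P.
K, P = 0.5, 1.0          # illustrative values
n0_A2 = 1.0              # start from 1 mol of A2

def residual(x):
    """Law-of-mass-action residual at reaction extent x
    (mol of A2 dissociated): (y_A^2 * P / y_A2) - K."""
    n_A2, n_A = n0_A2 - x, 2 * x
    n_tot = n_A2 + n_A
    return (n_A / n_tot) ** 2 * P / (n_A2 / n_tot) - K

# Bisection for the equilibrium extent (plays the role of the solver)
lo, hi = 1e-12, 1.0 - 1e-12
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if residual(lo) * residual(mid) <= 0:
        hi = mid
    else:
        lo = mid
x = 0.5 * (lo + hi)

# Independent verification checks on the solver's answer
n_A2, n_A = n0_A2 - x, 2 * x
assert abs(2 * n_A2 + n_A - 2 * n0_A2) < 1e-9   # A-atom balance
assert abs(residual(x)) < 1e-6                  # mass-action residual
print(f"equilibrium extent = {x:.4f}, checks passed")
```

For this choice of K and P the extent solves 4x² = K(1 - x²), i.e. x = 1/3. A production verifier must of course scale such elemental-balance and equilibrium-condition residual checks to many elements and phases, which is the challenge the abstract emphasizes.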

  16. 1. Introduction. 2. Laboratory experiments. 3. Field experiments. 4. Integrated field-laboratory experiments. 5. Panel recommendations

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    Some recommendations for the design of laboratory and field studies in marine radioecology are formulated. The difficulties concerning the comparability of various experimental methods used to measure the fluxes of radionuclides through marine organisms and ecosystems, and also the use of laboratory results to make predictions for the natural environment are discussed. Three working groups were established during the panel meeting, to consider laboratory experiments, field studies, and the design and execution of integrated laboratory and field studies respectively. A number of supporting papers dealing with marine radioecological experiments were presented

  17. Induction Accelerator Technology Choices for the Integrated Beam Experiment (IBX)

    International Nuclear Information System (INIS)

    Leitner, M.A.; Celata, C.M.; Lee, E.P.; Logan, B.G.; Sabbi, G.; Waldron, W.L.; Barnard, J.J.

    2003-01-01

Over the next three years the research program of the Heavy Ion Fusion Virtual National Laboratory (HIF-VNL), a collaboration among LBNL, LLNL, and PPPL, is focused on separate scientific experiments in the injection, transport and focusing of intense heavy ion beams at currents from 100 mA to 1 A. As the next major step in the HIF-VNL program, we aim for a complete 'source-to-target' experiment, the Integrated Beam Experiment (IBX). By combining the experience gained in the current separate beam experiments, IBX would allow the integrated scientific study of the evolution of a single heavy ion beam at high current (∼1 A) through all sections of a possible heavy ion fusion accelerator: injection, acceleration, compression, and beam focusing. This paper describes the main parameters and technology choices of the planned IBX experiment. IBX will accelerate singly charged potassium or argon ion beams up to 10 MeV final energy with a longitudinal beam compression ratio of 10, resulting in a beam current at the target of more than 10 amperes. Different accelerator cell design options are described in detail: induction cores incorporating either room-temperature pulsed focusing magnets or superconducting magnets

  18. Latch-up in CMOS integrated circuits

    International Nuclear Information System (INIS)

    Estreich, D.B.; Dutton, R.W.

    1978-04-01

    An analysis is presented of latch-up in CMOS integrated circuits. A latch-up prediction algorithm has been developed and used to evaluate methods to control latch-up. Experimental verification of the algorithm is demonstrated
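    The abstract does not describe the prediction algorithm itself, but the classical starting point for latch-up analysis is the two-transistor model of the parasitic PNPN path in bulk CMOS: regenerative turn-on becomes possible once the loop gain of the coupled parasitic bipolars reaches unity. A sketch of that textbook criterion (not the paper's algorithm):

```python
def latch_up_possible(beta_npn, beta_pnp):
    """Two-transistor model criterion for bulk CMOS latch-up: the
    parasitic PNPN structure can sustain regenerative conduction
    once the bipolar loop gain beta_npn * beta_pnp reaches unity.
    Real prediction also depends on well/substrate resistances and
    holding current, which this sketch ignores."""
    return beta_npn * beta_pnp >= 1.0

print(latch_up_possible(10.0, 0.5))   # True: loop gain 5
print(latch_up_possible(2.0, 0.2))    # False: loop gain 0.4
```

Layout countermeasures (guard rings, well taps, reduced parasitic resistances) work precisely by degrading this loop gain or by shunting the base-emitter junctions so the trigger condition is never met.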

  19. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miller, Karen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garner, James R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Branney, Sean [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webster, Jennifer B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Todd, Lindsay C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nordquist, Heather [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stewart, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-05-31

In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field

  20. Pathways for implementing REDD+. Experiences from carbon markets and communities

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, X.; Ravnkilde Moeller, L.; Lopez, T. De; Romero, M.Z.

    2011-07-01

    This issue of Carbon Market Perspectives on 'Pathways for implementing REDD+: Experience from carbon markets and communities' discusses the role of carbon markets in scaling up investments for REDD+ in developing countries. Nine articles authored by experienced negotiators on REDD+, carbon market actors, project developers and other leading experts share experiences and make suggestions on the key elements of a future international REDD+ regime: Architecture and underlying principles, measuring, reporting and verification (MRV), private-sector involvement, the rights of indigenous people and local communities, biodiversity conservation and environmental integrity. The articles are grouped under three main topics: the lessons of existing REDD+ projects; the future REDD+ regime and the role of carbon markets; and experiences and ideas about the involvement of indigenous people and local communities. (LN)

  1. Teaching with Videogames: How Experience Impacts Classroom Integration

    Science.gov (United States)

    Bell, Amanda; Gresalfi, Melissa

    2017-01-01

    Digital games have demonstrated great potential for supporting students' learning across disciplines. But integrating games into instruction is challenging and requires teachers to shift instructional practices. One factor that contributes to the successful use of games in a classroom is teachers' experience implementing the technologies. But how…

  2. Role of IGRT in patient positioning and verification

    International Nuclear Information System (INIS)

    Mijnheer, Ben

    2008-01-01

Image-guided radiation therapy (IGRT) is 'frequent imaging in the treatment room during a course of radiotherapy to guide the treatment process'. Instrumentation related to IGRT is highlighted. The focus of the lecture was on clinical experience gained at NKI-AVL, such as the use of EPIDs (electronic portal imaging devices) and CBCT (cone-beam computed tomography) and their comparison: good results were obtained for head-and-neck and prostate/bladder patients, for whom portal imaging was replaced by CBCT. After further investigation, convincing results were also obtained for lung patients, for whom portal imaging was likewise replaced by CBCT. Scan protocols were developed for these patient groups. Since February 2004, CBCT-based decision rules have been developed for: head and neck (bony anatomy); prostate (bony anatomy; soft-tissue registration); lung (bony anatomy, soft-tissue registration); brain (bony anatomy); and breast, bladder and liver (in progress). Final remarks are as follows: the introduction of various IGRT techniques allowed 3D verification of the position of target volumes and organs at risk just before or during treatment. Because the information is in 3D, or sometimes even in 4D, these IGRT approaches in principle provide more information than 2D verification methods (e.g. EPIDs). Clinical data are becoming available to assess quantitatively for which treatment techniques IGRT approaches are advantageous compared to conventional verification methods, taking the additional resources (time, money, manpower) into account. (P.A.)

  3. Strategy for assessment of WWER steam generator tube integrity. Report prepared within the framework of the coordinated research project on verification of WWER steam generator tube integrity

    International Nuclear Information System (INIS)

    2007-12-01

Steam generator heat exchanger tube degradation occurs in WWER nuclear power plants (NPPs). The situation varies from country to country and from NPP to NPP. More severe degradation is observed in WWER-1000 NPPs than in WWER-440s. The reasons for these differences could be, among others, differences in heat exchanger tube material (chemical composition, microstructure, residual stresses), in thermal and mechanical loadings, and in water chemistry. However, WWER steam generators were not designed for eddy current testing, which is the usual testing method in the steam generators of western PWRs. Moreover, their supplier provided neither adequate methodology and criteria nor equipment for planning and implementing in-service inspection (ISI). Consequently, the WWER steam generator ISI infrastructure was established with delay. Even today, there are still big differences in eddy current inspection strategy and practice, as well as in the approach to steam generator heat exchanger tube structural integrity assessment (plugging criteria for defective tubes vary from 40 to 90% wall thickness degradation). Recognizing this situation, the WWER operating countries expressed their need for a joint effort to develop a methodology to establish reasonable, commonly accepted integrity assessment criteria for the heat exchanger tubes. The IAEA's programme related to steam generator life management is embedded in the systematic activity of its Technical Working Group on Life Management of Nuclear Power Plants (TWG-LMNPP). Under the advice of the TWG-LMNPP, an IAEA coordinated research project (CRP) on Verification of WWER Steam Generator Tube Integrity was launched in 2001. It was completed in 2005. Thirteen organizations involved in in-service inspection of steam generators in WWER operating countries participated: Croatia, Czech Republic, Finland, France, Hungary, Russian Federation, Slovakia, Spain, Ukraine, and the USA. The overall objective was to

  4. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  5. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  6. Formalization of the Integral Calculus in the PVS Theorem Prover

    Science.gov (United States)

    Butler, Ricky W.

    2004-01-01

    The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. Though fully equipped to support deduction in a very general logical framework, namely higher-order logic, the PVS prover must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht's classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion about why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
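    As a reference point, the culminating theorem can be stated in standard textbook form (this statement paraphrases the usual Rosenlicht-style presentation; it is not quoted from the PVS development itself):

```latex
\textbf{Fundamental Theorem of Calculus.}
Let $f$ be continuous on $[a,b]$ and define
$F(x) = \int_a^x f(t)\,dt$ for $x \in [a,b]$.
Then $F$ is differentiable on $(a,b)$ with $F'(x) = f(x)$,
and for any antiderivative $G$ of $f$ on $[a,b]$,
\[
  \int_a^b f(t)\,dt = G(b) - G(a).
\]
```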

  7. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values. (Authors)

  8. Post test calculation of the experiment 'small break loss-of-coolant test' SBL-22 at the Finnish integral test facility PACTEL with the thermohydraulic code ATHLET

    International Nuclear Information System (INIS)

    Lischke, W.; Vandreier, B.

    1997-01-01

    At the University of Applied Sciences Zittau/Goerlitz (FH), calculations for the verification of the ATHLET code for reactors of the VVER type have been carried out since 1991, sponsored by the German Ministry for Education, Science and Technology (BMBF). Compared with western reactor designs, these reactors are characterized by the routing of the reactor coolant pipes and by horizontal steam generators. Because of these special features, a check of the validity of the ATHLET models is necessary. For further verification of the ATHLET code, a post-test calculation of the experiment SBL-22 (small break loss-of-coolant test), performed at the Finnish facility PACTEL, was carried out. The experiment served to examine the natural circulation behaviour of the loop over a continuous range of primary-side water inventory

  9. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  10. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  11. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference…

  12. Development of an automated testing system for verification and validation of nuclear data

    International Nuclear Information System (INIS)

    Triplett, B. S.; Anghaie, S.; White, M. C.

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory (LANL), in collaboration with the University of Florida, is developing a methodology to automate the process of nuclear data verification and validation. The International Criticality Safety Benchmark Experiment Project (ICSBEP) provides a set of criticality problems that may be used to evaluate nuclear data. This process tests a number of data libraries using cases from the ICSBEP benchmark set to demonstrate how automation of these tasks may reduce errors and increase efficiency. The process is driven by an integrated set of Python scripts. Material and geometry data may be read from an existing code input file to generate a standardized template, or the template may be generated directly by the user. The user specifies the desired precision and other vital problem parameters. The Python scripts generate input decks for multiple transport codes from these templates, run and monitor individual jobs, and parse the relevant output. This output can then be used to generate reports directly or can be stored in a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. (authors)
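    The kind of scripted pipeline the abstract describes could be sketched as below. This is an illustrative assumption, not the actual LANL scripts: the template placeholders, the transport-code command, and the `k-eff` output format are all invented for the sketch.

```python
# Hypothetical sketch of an automated benchmark pipeline: fill a template,
# launch a transport-code job, and parse a k-effective result from its output.
import re
import subprocess
from pathlib import Path

def render_deck(template: str, params: dict) -> str:
    """Substitute {placeholders} in a standardized input template."""
    return template.format(**params)

def run_case(deck_text: str, workdir: Path, command: list) -> str:
    """Write the input deck, run the transport code, return its stdout."""
    workdir.mkdir(parents=True, exist_ok=True)
    deck = workdir / "input.i"
    deck.write_text(deck_text)
    result = subprocess.run(command + [str(deck)], capture_output=True, text=True)
    return result.stdout

def parse_keff(output: str) -> float:
    """Pull k-effective out of the code's text output (assumed format)."""
    match = re.search(r"k-eff\s*=\s*([0-9.]+)", output)
    if match is None:
        raise ValueError("no k-eff found in output")
    return float(match.group(1))
```

    The parsed results could then feed a report generator or a database insert, as the abstract outlines.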

  13. Verification, validation and application of NEPTUNE-CFD to two-phase Pressurized Thermal Shocks

    Energy Technology Data Exchange (ETDEWEB)

    Mérigoux, N., E-mail: nicolas.merigoux@edf.fr [Electricité de France, R&D Division, 6 Quai Watier, 78401 Chatou (France); Laviéville, J.; Mimouni, S.; Guingo, M.; Baudry, C. [Electricité de France, R&D Division, 6 Quai Watier, 78401 Chatou (France); Bellet, S., E-mail: serge.bellet@edf.fr [Electricité de France, Thermal & Nuclear Studies and Projects Division, 12-14 Avenue Dutriévoz, 69628 Villeurbanne (France)

    2017-02-15

    Nuclear Power Plants are subjected to a variety of ageing mechanisms and, at the same time, exposed to potential Pressurized Thermal Shock (PTS), characterized by a rapid cooling of the Reactor Pressure Vessel (RPV) wall. In this context, NEPTUNE-CFD is developed and used to model two-phase PTS in an industrial configuration, providing the temperature and pressure fields required to assess the integrity of the RPV. Furthermore, when using CFD for nuclear safety demonstration purposes, EDF applies a methodology based on physical analysis, verification, validation and application to industrial scale (V&V), to demonstrate the quality of, and the confidence in, the results obtained. By following this methodology, each step must be proved to be consistent with the others, and with the final goal of the calculations. To this effect, a chart demonstrating how far the validation step of NEPTUNE-CFD covers the PTS application will be drawn. A selection of the code verification and validation cases against different experiments will be described. For consistency of results, a single and mature set of models, resulting from the knowledge acquired during the code development over the last decade, has been used. From this development and validation feedback, a methodology has been set up to perform industrial computations. Finally, the guidelines of this methodology based on NEPTUNE-CFD and SYRTHES coupling, to take into account the conjugate heat transfer between liquid and solid, will be presented. A short overview of the engineering approach will be given, starting from the meshing process up to the results post-treatment and analysis.

  14. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions

    International Nuclear Information System (INIS)

    Woodruff, Henry C.; Fuangrod, Todsaporn; Van Uytven, Eric; McCurdy, Boyd M.C.; Beek, Timothy van; Bhatia, Shashank; Greer, Peter B.

    2015-01-01

    Purpose: Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. Methods and Materials: At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. Results: The system reported a mean ± standard deviation real-time χ pass rate of 91.1% ± 11.5% (83.6% ± 13.2%) for cumulative-frame χ analysis with 4%, 4 mm (3%, 3 mm) criteria, global over the integrated image. Conclusions: A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy.
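    A χ-type frame comparison in the spirit of the one described can be sketched as follows. This is an illustrative simplification (a gradient-relaxed dose-difference test with a global-maximum normalization), not the WatchDog implementation; the function name and defaults are invented for the sketch.

```python
# Simplified per-pixel chi comparison between a measured and a
# model-predicted EPID frame: the dose difference is divided by a
# tolerance that is relaxed in high-gradient regions, and the pass
# rate is the fraction of pixels with |chi| <= 1.
import numpy as np

def chi_pass_rate(measured, predicted, dd=0.04, dta=4.0):
    """dd: dose-difference criterion (fraction of global max);
    dta: distance-to-agreement criterion, here in pixels."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    tol = dd * predicted.max()                  # global dose tolerance
    gy, gx = np.gradient(predicted)             # local dose gradient
    grad = np.hypot(gx, gy)
    denom = np.sqrt(tol**2 + (dta * grad)**2)   # gradient-relaxed tolerance
    chi = (measured - predicted) / denom
    return float(np.mean(np.abs(chi) <= 1.0))   # fraction of passing pixels
```

    Applied cumulatively frame by frame, such a pass rate could be plotted live during delivery, which is the reporting mode the abstract describes.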

  15. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, Henry C., E-mail: henry.woodruff@newcastle.edu.au [Faculty of Science and Information Technology, School of Mathematical and Physical Sciences, University of Newcastle, New South Wales (Australia); Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, University of Newcastle, New South Wales (Australia); Van Uytven, Eric; McCurdy, Boyd M.C.; Beek, Timothy van [Division of Medical Physics, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba (Canada); Bhatia, Shashank [Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales (Australia); Greer, Peter B. [Faculty of Science and Information Technology, School of Mathematical and Physical Sciences, University of Newcastle, New South Wales (Australia); Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales (Australia)

    2015-11-01

    Purpose: Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. Methods and Materials: At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. Results: The system reported a mean ± standard deviation real-time χ pass rate of 91.1% ± 11.5% (83.6% ± 13.2%) for cumulative-frame χ analysis with 4%, 4 mm (3%, 3 mm) criteria, global over the integrated image. Conclusions: A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy.

  16. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    Energy Technology Data Exchange (ETDEWEB)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied for arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations
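    The transmission relation underlying such a measurement is the exponential attenuation law T(E) = exp(-N σ(E) t), where N is the number density, σ(E) the energy-dependent cross section, and t the slab thickness; a measured dip at a resonance energy then constrains t. A minimal sketch, with made-up numbers and function names that are not from the paper:

```python
# Exponential attenuation of a neutron beam through a single-material slab,
# and the inverse relation used to estimate thickness from a measured
# transmitted fraction. All inputs here are illustrative.
import math

def transmission(number_density_per_cm3, sigma_barns, thickness_cm):
    """Transmitted fraction T = exp(-N * sigma * t)."""
    sigma_cm2 = sigma_barns * 1e-24          # 1 barn = 1e-24 cm^2
    return math.exp(-number_density_per_cm3 * sigma_cm2 * thickness_cm)

def thickness_from_transmission(number_density_per_cm3, sigma_barns, T):
    """Invert the attenuation law: t = -ln(T) / (N * sigma)."""
    sigma_cm2 = sigma_barns * 1e-24
    return -math.log(T) / (number_density_per_cm3 * sigma_cm2)
```

    In practice σ(E) varies sharply across the resonances named in the abstract, so the ratio of transmitted spectra at on- and off-resonance energies is what carries the composition information.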

  17. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  18. SNF verification requirements imposed through 10 CFR Part 961: Interrelationship with MC&A requirements for the OCRWM safeguards system

    International Nuclear Information System (INIS)

    Slater, N.; Vance, S.

    1994-01-01

    Article VI.B.2 of the Standard Contract for Disposal of Spent Nuclear Fuel and/or High-Level Radioactive Waste (10 CFR Part 961, Standard Contract) provides that spent nuclear fuel shall be subject to verification prior to acceptance by the Department of Energy. As part of the overall process for scheduling deliveries of spent nuclear fuel, the Standard Contract requires contract holders to submit detailed descriptions of the spent nuclear fuel they intend to deliver. Thus, the provision for verification in the Standard Contract allows the Department the opportunity to ensure that the spent nuclear fuel intended for delivery is consistent with the description provided, and that the material is properly loaded, packaged, marked, and labeled. The Department is currently evaluating the verification requirements for spent nuclear fuel that it will establish pursuant to Article VI.B.2 of the Standard Contract. The Department recognizes that there may be significant overlap between these verification requirements and the requirements established for material control and accountability (MC&A). Therefore, the Department may recommend modifications to the process established in the Standard Contract to ensure that the verification and MC&A functions and activities are fully integrated

  19. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  20. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views: views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  1. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on: - compliance with new standards and regulations, - lessons learned from operating experience. This goal can be achieved more effectively on the basis of a valid methodology of analysis and a consistent process of collection, storage and retrieval of the operating data. The general backfitting problem, the verification process, and the utilization of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant are presented as well, using some specially designed software tools. The focus is on management rather than hardware problems. Some general conclusions are then presented as the final results of the whole work

  2. Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?

    Science.gov (United States)

    Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C

    2017-09-01

    Integrated care is the combination of different healthcare services with the goal to provide comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate if quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access. Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not

  3. Six years of experience in the planning and verification of dynamic IMRT with portal dosimetry

    International Nuclear Information System (INIS)

    Molina Lopez, M. Y.; Pardo Perez, E.; Ruiz Maqueda, S.; Castro Novais, J.; Diaz Gavela, A. A.

    2013-01-01

    The objective of this study is to review the method of verification of dynamic IMRT over the six years of operation of the radiophysics and radiation protection service, analyzing the field evaluation parameters for the 718 IMRT verifications performed during this period. (Author)

  4. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  5. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique concerning the verification of people trajectories

  6. Digital system verification a combined formal methods and simulation framework

    CERN Document Server

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and at the time of this writing chips are commonly produced with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5

  7. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  8. Dosimetric accuracy of Kodak EDR2 film for IMRT verifications.

    Science.gov (United States)

    Childress, Nathan L; Salehpour, Mohammad; Dong, Lei; Bloch, Charles; White, R Allen; Rosen, Isaac I

    2005-02-01

    Patient-specific intensity-modulated radiotherapy (IMRT) verifications require an accurate two-dimensional dosimeter that is not labor-intensive. We assessed the precision and reproducibility of film calibrations over time, measured the elemental composition of the film, measured the intermittency effect, and measured the dosimetric accuracy and reproducibility of calibrated Kodak EDR2 film for single-beam verifications in a solid water phantom and for full-plan verifications in a Rexolite phantom. Repeated measurements of the film sensitometric curve in a single experiment yielded overall uncertainties in dose of 2.1% local and 0.8% relative to 300 cGy. 547 film calibrations over an 18-month period, exposed to a range of doses from 0 to a maximum of 240 MU or 360 MU and using 6 MV or 18 MV energies, had optical density (OD) standard deviations that were 7%-15% of their average values. This indicates that daily film calibrations are essential when EDR2 film is used to obtain absolute dose results. An elemental analysis of EDR2 film revealed that it contains 60% as much silver and 20% as much bromine as Kodak XV2 film. EDR2 film also has an unusual 1.69:1 silver:halide molar ratio, compared with the XV2 film's 1.02:1 ratio, which may affect its chemical reactions. To test EDR2's intermittency effect, the OD generated by a single 300 MU exposure was compared to the ODs generated by exposing the film 1 MU, 2 MU, and 4 MU at a time to a total of 300 MU. An ion chamber recorded the relative dose of all intermittency measurements to account for machine output variations. Using small MU bursts to expose the film resulted in delivery times of 4 to 14 minutes and lowered the film's OD by approximately 2% for both 6 and 18 MV beams. This effect may result in EDR2 film underestimating absolute doses for patient verifications that require long delivery times. After using a calibration to convert EDR2 film's OD to dose values, film measurements agreed within 2% relative
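    The OD-to-dose conversion step the abstract relies on can be sketched as a daily calibration fit. The calibration points and polynomial degree below are invented for the sketch; a real EDR2 calibration would use films exposed to known doses on the same day, as the abstract's results make clear.

```python
# Minimal sketch of a film sensitometric calibration: fit dose versus net
# optical density (OD) with a low-order polynomial, then use the fit to
# convert a measured OD value (or map) to dose.
import numpy as np

def fit_sensitometric(od_points, dose_points, degree=3):
    """Return a callable mapping OD -> dose from calibration film data."""
    coeffs = np.polyfit(od_points, dose_points, degree)
    return np.poly1d(coeffs)

# Hypothetical calibration data (net OD vs. dose in cGy):
od = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5])
dose = np.array([0.0, 60.0, 125.0, 195.0, 270.0, 350.0])
od_to_dose = fit_sensitometric(od, dose)
```

    Because the sensitometric response drifts between processing sessions, the fit would be redone from fresh calibration films each day, which is why the abstract stresses that daily calibrations are essential.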

  9. Production of plastic scintillation survey meter for clearance verification measurement

    International Nuclear Information System (INIS)

    Tachibana, Mitsuo; Shiraishi, Kunio; Ishigami, Tsutomu; Tomii, Hiroyuki

    2008-03-01

    In the Nuclear Science Research Institute, the decommissioning of various nuclear facilities is carried out according to the plan for meeting the midterm goal of the Japan Atomic Energy Agency (JAEA). An increase in clearance verification measurements of concrete in buildings and in radiation measurements for releasing controlled areas is expected along with the dismantlement of nuclear facilities in the future. Radiation measurements for releasing controlled areas have been carried out in small-scale nuclear facilities including the JPDR (Japan Power Demonstration Reactor). However, radiation measurement with an existing measuring device was difficult owing to the effects of radiation from radioactive materials remaining in buried piping. On the other hand, the JAEA has no experience in performing clearance verification measurements. The generation of a large amount of clearance objects is expected along with the decommissioning of nuclear facilities in the future. The plastic scintillation survey meter (hereafter, 'PL measuring device') was produced to apply to clearance verification measurements and to radiation measurements for releasing controlled areas. Basic characteristic tests and practical tests were carried out using the PL measuring device. As a result of these tests, it was found that the radioactivity evaluated with the PL measuring device was as accurate as that obtained with the existing measuring device. The PL measuring device offers the capabilities of the existing measuring device while being lightweight and easy to operate. The PL measuring device can also correct for gamma rays. The PL measuring device is effective for clearance verification measurements of concrete in buildings and for radiation measurements for releasing controlled areas. (author)

  10. Writer Identification and Verification from Intra-variable Individual Handwriting

    OpenAIRE

    Adak, Chandranath; Chaudhuri, Bidyut B.; Blumenstein, Michael

    2017-01-01

    The handwriting of an individual may vary excessively with many factors, such as mood, time, space, writing speed, writing medium, utensils, etc. Therefore, it becomes more challenging to perform automated writer verification/identification on a particular set of handwritten patterns (e.g. speedy handwriting) of a person, especially when the system is trained using a different set of writing patterns (e.g. normal/medium speed) of that same person. However, it would be interesting to experiment...

  11. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion focuses on the technical issues raised by warhead arms control. Technical complications arise from several sources; these are discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, identifying limitations and vulnerabilities along the way. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  12. Dosimetry investigation of MOSFET for clinical IMRT dose verification.

    Science.gov (United States)

    Deshpande, Sudesh; Kumar, Rajesh; Ghadi, Yogesh; Neharu, R M; Kannan, V

    2013-06-01

    In IMRT, patient-specific dose verification is performed routinely at each centre, and simple, efficient dosimetry techniques play a very important role in routine clinical dosimetry QA. The MOSFET dosimeter offers several advantages over conventional dosimeters, such as its small detector size, immediate readout, immediate reuse, and multiple-point dose measurements. To use the MOSFET as a routine clinical dosimetry system for pre-treatment dose verification in IMRT, a comprehensive set of experiments was conducted to investigate its linearity, reproducibility, dose rate effect and angular dependence for a 6 MV x-ray beam. The MOSFET shows a linear response with a linearity coefficient of 0.992 for a dose range of 35 cGy to 427 cGy. Reproducibility was measured over ten consecutive irradiations in the dose range of 35 cGy to 427 cGy and was found to be within 4% up to 70 cGy and within 1.4% above 70 cGy. The dose rate effect was investigated over the range 100 MU/min to 600 MU/min; the response of the MOSFET varies from -1.7% to 2.1%. The angular response was measured at 10-degree intervals from 90 to 270 degrees in an anticlockwise direction, normalized at gantry angle zero, and found to be in the range of 0.98 ± 0.014 to 1.01 ± 0.014. The MOSFETs were calibrated in a phantom which was later used for IMRT verification; the measured calibration coefficients were 1 mV/cGy and 2.995 mV/cGy in standard and high sensitivity mode, respectively. The MOSFETs were then used for pre-treatment dose verification in IMRT, with nine dosimeters per patient to measure the dose in different planes. The average variation between calculated and measured dose at any location was within 3%. Dose verification using MOSFETs and an IMRT phantom was found to be quick, efficient and well suited for a busy radiotherapy
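The abstract's calibration coefficients and 3% tolerance lend themselves to a small worked check. The sketch below is illustrative only (the function names, the input values, and the pass/fail handling are assumptions, not from the paper): it converts MOSFET voltage shifts in mV to dose using the reported coefficients and flags points where measured and planned dose differ by more than the tolerance.

```python
# Hypothetical sketch of the MOSFET pre-treatment check described in the
# abstract: readings (mV) are converted to dose (cGy) with the reported
# calibration coefficients, then compared against the planned dose.

CAL_STANDARD = 1.0   # mV/cGy, standard sensitivity mode (from the abstract)
CAL_HIGH = 2.995     # mV/cGy, high sensitivity mode (from the abstract)

def reading_to_dose(reading_mv, high_sensitivity=False):
    """Convert a MOSFET voltage shift to dose in cGy."""
    coeff = CAL_HIGH if high_sensitivity else CAL_STANDARD
    return reading_mv / coeff

def verify_plan(planned_cgy, readings_mv, tolerance=0.03, high_sensitivity=False):
    """Return (passed, deviations) comparing measured vs planned dose per point."""
    deviations = []
    for planned, mv in zip(planned_cgy, readings_mv):
        measured = reading_to_dose(mv, high_sensitivity)
        deviations.append((measured - planned) / planned)
    passed = all(abs(d) <= tolerance for d in deviations)
    return passed, deviations

# Nine dosimeters per patient, as in the abstract (values invented):
planned = [100.0, 120.0, 80.0, 95.0, 110.0, 70.0, 130.0, 85.0, 105.0]
readings = [101.0, 118.5, 80.8, 94.0, 111.5, 69.5, 128.0, 86.2, 104.0]  # mV, standard mode
ok, devs = verify_plan(planned, readings)
```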

  13. Automatic generation and verification of railway interlocking control tables using FSM and NuSMV

    Directory of Open Access Journals (Sweden)

    Mohammad B. YAZDI

    2009-01-01

    Full Text Available Due to their important role in providing safe conditions for train movements, railway interlocking systems are considered safety-critical systems. The reliability, safety and integrity of these systems rely on the reliability and integrity of all stages in their lifecycle, including design, verification, manufacture, test, operation and maintenance. In this paper, the automatic generation and verification of interlocking control tables, one of the most important stages in the interlocking design process, is addressed by the safety-critical research group in the School of Railway Engineering (SRE). Three subsystems are introduced: a graphical signalling layout planner, a control table generator and a control table verifier. Using the NuSMV model checker, the control table verifier analyses the contents of the control table against the conditions for safe train movement and checks for any conflicting settings in the table. This includes settings for conflicting routes, signals and points, as well as settings for route isolation and for single and multiple overlap situations. The last two, route isolation and multiple overlap situations, are new outcomes of this work compared with recent work on the subject.
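A verifier of the kind described checks, among other things, that no two conflicting routes can be set simultaneously. The Python sketch below conveys the flavour of such a pairwise conflict check; the route data and field names are invented, and the paper itself performs this analysis with NuSMV model checking rather than plain enumeration.

```python
# Illustrative pairwise conflict check for an interlocking control table.
# Each route lists the track sections it occupies and the point positions
# it demands; two routes conflict if they share a section or require the
# same point in different positions. (Toy data; the paper uses NuSMV.)

routes = {
    "R1": {"sections": {"T1", "T2"}, "points": {"P1": "normal"}},
    "R2": {"sections": {"T3", "T4"}, "points": {"P1": "reverse"}},
    "R3": {"sections": {"T2", "T5"}, "points": {"P2": "normal"}},
}

def conflicts(a, b):
    """True if routes a and b cannot be set at the same time."""
    if a["sections"] & b["sections"]:
        return True  # shared track section
    shared_points = set(a["points"]) & set(b["points"])
    return any(a["points"][p] != b["points"][p] for p in shared_points)

conflict_pairs = sorted(
    (x, y)
    for x in routes for y in routes
    if x < y and conflicts(routes[x], routes[y])
)
```

Here R1/R2 conflict through point P1 and R1/R3 through section T2, while R2 and R3 are compatible.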

  14. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  15. Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS

    CERN Document Server

    Froidevaux, D

    2011-01-01

    Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS, part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B2: Detectors for Particles and Radiation. Part 2: Systems and Applications'. This document is part of Part 2 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Chapter '5 Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS' with the content: 5 Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS 5.1 Introduction 5.1.1 The context 5.1.2 The main initial physics goals of ATLAS and CMS at the LHC 5.1.3 A snapshot of the current status of the ATLAS and CMS experiments 5.2 Overall detector concept and magnet systems 5.2.1 Overall detector concept 5.2.2 Magnet systems 5.2.2.1 Rad...

  16. Post test calculation of the experiment `small break loss-of-coolant test` SBL-22 at the Finnish integral test facility PACTEL with the thermohydraulic code ATHLET

    Energy Technology Data Exchange (ETDEWEB)

    Lischke, W.; Vandreier, B. [Univ. for Applied Sciences, Zittau/Goerlitz (Germany). Dept. of Nuclear Technology

    1997-12-31

    At the University for Applied Sciences Zittau/Goerlitz (FH), calculations for the verification of the ATHLET code for reactors of type VVER have been carried out since 1991, sponsored by the German Ministry for Education, Science and Technology (BMBF). The special features of these reactors in comparison with western reactors are the routing of the reactor coolant pipes and the horizontal steam generators. Because of these special features, a check of the validity of the ATHLET models is necessary. For further verification of the ATHLET code, a post-test calculation of the experiment SBL-22 (small break loss-of-coolant test), performed at the Finnish facility PACTEL, was carried out. The experiment served to examine the natural circulation behaviour of the loop over a continuous range of primary-side water inventory. 5 refs.

  18. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  19. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with the mathematical correctness of the numerical algorithms in a code, while validation deals with the physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
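A standard quantitative ingredient of such verification benchmarks is the observed order of convergence, estimated from errors against the benchmark solution on successively refined meshes. The sketch below is a generic textbook calculation, not taken from the report; the tolerance in the acceptance criterion is an invented placeholder.

```python
import math

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """Observed order of accuracy p from errors on two mesh levels:
    e_coarse / e_fine = r**p  =>  p = log(e_coarse / e_fine) / log(r)."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

def meets_criterion(p, nominal_order, tol=0.2):
    """Illustrative acceptance criterion: observed order within tol of nominal."""
    return abs(p - nominal_order) <= tol

# Example: halving the mesh spacing reduces the error by ~4x,
# consistent with a nominally second-order scheme.
p = observed_order(1.0e-2, 2.5e-3)
```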

  20. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  1. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification, or to evaluate, using independent software, the performance of specific functions in the code. Here, we describe the global verification process that was used to bring DRAGON to industry standard tool-set (IST) status. We also discuss some of the lessons learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  2. Integral method for the calculation of Hawking radiation in dispersive media. II. Asymmetric asymptotics.

    Science.gov (United States)

    Robertson, Scott

    2014-11-01

    Analog gravity experiments make feasible the realization of black hole space-times in a laboratory setting and the observational verification of Hawking radiation. Since such analog systems are typically dominated by dispersion, efficient techniques for calculating the predicted Hawking spectrum in the presence of strong dispersion are required. In the preceding paper, an integral method in Fourier space is proposed for stationary 1+1-dimensional backgrounds which are asymptotically symmetric. Here, this method is generalized to backgrounds which are different in the asymptotic regions to the left and right of the scattering region.

  3. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  4. Atomistic Galois insertions for flow sensitive integrity

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    2017-01-01

    Several program verification techniques assist in showing that software adheres to the required security policies. Such policies may be sensitive to the flow of execution and the verification may be supported by combinations of type systems and Hoare logics. However, this requires user assistance...... and to obtain full automation we shall explore the over-approximating nature of static analysis. We demonstrate that the use of atomistic Galois insertions constitutes a stable framework in which to obtain sound and fully automatic enforcement of flow sensitive integrity. The framework is illustrated...

  5. The US National Resources Defense Council/Soviet Academy of Sciences Nuclear Test Ban Verification Project

    International Nuclear Information System (INIS)

    Cochran, T.B.

    1989-01-01

    The first week in September 1987 was an extraordinary one for arms control verification. As part of the co-operative Test Ban Verification Project of the Natural Resources Defense Council (NRDC) and the Soviet Academy of Sciences, fourteen American scientists from the Scripps Institution of Oceanography (at the University of California-San Diego), the University of Nevada-Reno and the University of Colorado went to the region of the Soviets' principal nuclear test site near Semipalatinsk. Together with their Soviet counterparts from the Institute of Physics of the Earth (IPE) in Moscow, they fired off three large chemical explosions. The purpose of these explosions was to demonstrate the sensitivity of the three seismic stations surrounding the test site, to study the efficiency with which high-frequency seismic waves propagate in the region, and to study differences between chemical explosions, nuclear explosions and earthquakes in order to establish more firmly the procedures for verification of a nuclear test ban. This paper presents a review of the results of these experiments, an update on the status of the joint project, and a review of the significance of high-frequency seismic data to test ban verification.

  6. GumTree-An integrated scientific experiment environment

    International Nuclear Information System (INIS)

    Lam, Tony; Hauser, Nick; Goetz, Andy; Hathaway, Paul; Franceschini, Fredi; Rayner, Hugh; Zhang, Lidia

    2006-01-01

    GumTree is an open source, multi-platform graphical user interface for performing neutron scattering and X-ray experiments. It handles the complete experiment life cycle, from instrument calibration, data acquisition, and real-time data analysis to results publication. The aim of the GumTree Project is to create a highly Integrated Scientific Experiment Environment (ISEE), allowing interconnectivity and data sharing between different distributed components such as motors, detectors, the user proposal database and the data analysis server. GumTree is being adapted to several instrument control server systems, such as TANGO, EPICS and SICS, providing an easy-to-use front-end for users and a simple-to-extend model for software developers. GumTree is designed to be reusable and configurable for any scientific instrument, and it will be adapted to six neutron beam instruments for the OPAL reactor at ANSTO. Other European institutes, including ESRF, ILL and PSI, have shown interest in using GumTree as their workbench for instrument control and data analysis.

  7. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  8. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  9. Formalization of the Integral Calculus in the PVS Theorem Prover

    Directory of Open Access Journals (Sweden)

    Ricky Wayne Butler

    2009-04-01

    Full Text Available The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. Though fully equipped to support deduction in a very general logical framework, namely higher-order logic, the PVS prover must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science that is used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht's classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion of why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
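The Fundamental Theorem of Calculus that the formalization culminates in can be illustrated numerically: if F(x) is the integral of f from 0 to x, then F'(x) = f(x). The sketch below is plain Python, unrelated to PVS itself; the midpoint Riemann sum and the choice f(t) = t^2 are illustrative.

```python
def riemann_integral(f, a, b, n=100000):
    """Midpoint Riemann sum approximation of the integral of f on [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

def F(x):
    # F(x) = integral of f from 0 to x, with f(t) = t**2 (so F(x) ~ x**3/3)
    return riemann_integral(lambda t: t * t, 0.0, x)

# FTC check: F'(x) should equal f(x) = x**2.  Central difference at x = 1,
# where f(1) = 1:
eps = 1e-4
derivative = (F(1.0 + eps) - F(1.0 - eps)) / (2 * eps)
```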

  10. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  11. Wind turbine power performance verification in complex terrain and wind farms

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels; Gjerding, S.; Enevoldsen, P.

    2002-01-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was made to support the basis for the standards, parts of which have then been investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guaranties on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark...

  12. Nuclear disarmament and the verification role of the IAEA

    International Nuclear Information System (INIS)

    Duarte, Carlos S.

    2008-01-01

    At the height of the cold war, nuclear arsenals reached a peak of some 70,000 weapons. Although these numbers have since come down significantly, some 27,000 weapons remain. The fact that decades go by without nuclear disarmament being realised contributes to a deep sense of concern and disappointment. So do other factors, such as the persistence of nuclear doctrines that admit first use; the lack of binding negative assurances; the ongoing research on nuclear explosives, including subcritical tests; and the maintained readiness to resume full-scale testing. The sense of insufficient, or outright lack of, progress in nuclear disarmament is even more disturbing if measured against existing legal obligations. First and foremost among those is of course Article VI of the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). According to the ICJ's Advisory Opinion, the obligation contained in Article VI is an obligation to achieve results in nuclear disarmament. The Comprehensive Nuclear Test Ban Treaty (CTBT) has yet to be brought into force, and a Fissile Material Cut-Off Treaty (FMCT) has yet to be negotiated. Despite significant unilateral reductions in nuclear arsenals, these have not been made within an international process that includes the commitment to total elimination. The notion that it is morally reprehensible for some countries to pursue weapons of mass destruction yet morally acceptable for others to rely on them for their security is simply unworkable. For achieving nuclear disarmament verification objectives, the IAEA clearly would have a major role to play. Under Article III.A.5 of its Statute, the Agency is allowed to apply, at the request of a State, safeguards to any of that State's nuclear activities. The Agency's capabilities and experience make it the international institution best suited to eventually perform nuclear disarmament verification tasks.
In order to perform nuclear disarmament verification activities, the Agency would of course need to

  13. Simulation of integrated beam experiment designs

    International Nuclear Information System (INIS)

    Grote, D.P.; Sharp, W.M.

    2004-01-01

    Simulations of Integrated Beam Experiment (IBX) class accelerator designs have been carried out. These simulations are an important tool for validating such designs. Issues such as envelope mismatch and emittance growth can be examined in a self-consistent manner, including the details of injection, accelerator transitions, long-term transport, and longitudinal compression. The simulations are three-dimensional and time-dependent, and begin at the source. They continue through the end of the acceleration region, at which point the data is passed on to a separate simulation of the drift compression. Results are presented.

  14. A method for online verification of adapted fields using an independent dose monitor

    International Nuclear Information System (INIS)

    Chang Jina; Norrlinger, Bernhard D.; Heaton, Robert K.; Jaffray, David A.; Cho, Young-Bin; Islam, Mohammad K.; Mahon, Robert

    2013-01-01

    Purpose: Clinical implementation of online adaptive radiotherapy requires generation of modified fields and a method of dosimetric verification in a short time. We present a method of treatment field modification to account for patient setup error, and an online method of verification using an independent monitoring system. Methods: The fields are modified by translating each multileaf collimator (MLC) defined aperture in the direction of the patient setup error, and magnifying to account for distance variation to the marked isocentre. A modified version of a previously reported online beam monitoring system, the integral quality monitoring (IQM) system, was investigated for validation of adapted fields. The system consists of a large-area ion chamber with a spatial gradient in electrode separation to provide a spatially sensitive signal for each beam segment, mounted below the MLC, and a calculation algorithm to predict the signal. IMRT plans of ten prostate patients were modified in response to six randomly chosen setup errors in three orthogonal directions. Results: A total of approximately 49 beams for the modified fields were verified by the IQM system, of which 97% of measured IQM signals agree with the predicted value to within 2%. Conclusions: The modified IQM system was found to be suitable for online verification of adapted treatment fields
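The field-modification rule described, translating each MLC-defined aperture by the setup error and magnifying for the change in distance to the marked isocentre, is simple enough to sketch geometrically. The 2D treatment, the function names and the numbers below are all illustrative assumptions; the actual system operates on full MLC leaf sequences.

```python
# Illustrative 2D sketch of the field adaptation described in the abstract:
# each aperture vertex is shifted by the in-plane setup error and scaled
# by the ratio of source-to-isocentre distances (magnification).

def adapt_aperture(vertices, setup_shift, sid_old, sid_new):
    """Translate aperture vertices by the setup error, then magnify by
    the distance ratio to the moved isocentre.

    vertices    : list of (x, y) points defining the MLC aperture
    setup_shift : (dx, dy) patient setup error in the beam's-eye view
    sid_old/new : source-to-isocentre distance before/after the shift
    """
    mag = sid_new / sid_old
    dx, dy = setup_shift
    return [((x + dx) * mag, (y + dy) * mag) for x, y in vertices]

# 10 cm x 10 cm square aperture, 3 mm / -2 mm in-plane shift,
# isocentre moved 1 cm further from the source:
square = [(-5.0, -5.0), (5.0, -5.0), (5.0, 5.0), (-5.0, 5.0)]
adapted = adapt_aperture(square, setup_shift=(0.3, -0.2), sid_old=100.0, sid_new=101.0)
```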

  15. Verification of criticality safety in on-site spent fuel storage systems

    International Nuclear Information System (INIS)

    Rasmussen, R.W.

    1989-01-01

    On February 15, 1984, Duke Power Company received approval for a two-region, burnup-credit spent fuel storage rack design at both Units 1 and 2 of the McGuire Nuclear Station. Duke also hopes to obtain approval by January 1990 for a dry spent fuel storage system at the Oconee Nuclear Station, which will incorporate the use of burnup credit in the criticality analysis governing the design of the individual storage units. While experience with burnup verification for criticality safety in the dry storage system at Oconee lies in the future, the methods proposed for burnup verification will be similar to those currently used at the McGuire Nuclear Station in the two-region storage racks installed in both pools. In conclusion, the primary benefit of the McGuire rerack effort has obviously been the amount of storage expansion it provided. A total increase of about 2,000 storage cells was realized, 1,000 of which were the result of pursuing the two-region rather than the conventional poison rack design. Less impacting, but equally important, has been the experience gained during the planning, installation, and operation of these storage racks. This experience should prove useful for future rerack efforts likely to occur at Duke's Catawba Nuclear Station as well as for the current dry storage effort underway for the Oconee Nuclear Station.

  16. Verification and Validation of Carbon-Fiber Laminate Low Velocity Impact Simulations.

    Energy Technology Data Exchange (ETDEWEB)

    English, Shawn Allen; Nelson, Stacy Michelle; Briggs, Timothy; Brown, Arthur A.

    2014-10-01

    Presented is a model verification and validation effort using low-velocity impact (LVI) experiments on carbon fiber reinforced polymer laminates. A flat cylindrical indenter impacts the laminate with enough energy to produce delamination, matrix cracks, and fiber breaks. Included in the experimental efforts are ultrasonic scans of the damage for qualitative validation of the models. However, the primary quantitative metrics of validation are the force time history measured through the instrumented indenter and the initial and final velocities. The simulations, which are run on Sandia's Sierra finite element codes, include all physics and material parameters of importance as determined by a sensitivity analysis conducted on the LVI simulation. A novel orthotropic damage and failure constitutive model that is capable of predicting progressive composite damage and failure is described in detail, and material properties are measured, estimated from micromechanics, or optimized through calibration. A thorough verification and calibration to the accompanying experiments are presented, with special emphasis given to the four-point bend experiment. For all simulations of interest, the mesh and material behavior are verified through extensive convergence studies. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output. The result is a quantifiable confidence in material characterization and model physics when simulating this phenomenon in structures of interest.
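
The final step, predicting a response distribution from an ensemble with parameter uncertainties and comparing it to the measurement, can be illustrated with a toy surrogate. This is not Sierra or the authors' model; the surrogate function, parameter distributions, and "measured" value below are invented for illustration.

```python
# Illustrative sketch of the ensemble idea: sample uncertain material
# parameters, evaluate a cheap surrogate "simulation" for each sample,
# and check whether the measured peak force falls inside the predicted
# response distribution.
import random

random.seed(1)

def peak_force(stiffness, strength):
    # stand-in surrogate for the real finite-element LVI simulation
    return 0.5 * stiffness + 2.0 * strength

samples = [peak_force(random.gauss(10.0, 0.5), random.gauss(3.0, 0.2))
           for _ in range(1000)]
samples.sort()
lo, hi = samples[25], samples[974]       # empirical ~95% interval
measured = 11.2                          # hypothetical experimental value
print(lo <= measured <= hi)
```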

  17. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  18. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium the soil cover on top of the rock; the flows considered are hence groundwater flows. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations; the five verification groups thus reflect the main areas of recent development. The present report focuses on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases are reported in Appendix A (verification cases) and Appendix B (validation cases)
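
The verification pattern described, comparing code output against analytical solutions for idealized situations, can be illustrated generically. The sketch below is not DarcyTools code: it solves 1-D steady groundwater flow between two fixed heads with a simple iterative scheme and compares the result against the exact linear head profile.

```python
# Generic illustration of verification against an analytical solution:
# steady 1-D Darcy flow with fixed heads reduces to h'' = 0, whose exact
# solution is a straight line between the boundary heads.

def solve_heads(n, h0, h1):
    """Gauss-Seidel iteration for h'' = 0 on n interior nodes."""
    h = [h0] + [0.0] * n + [h1]
    for _ in range(20000):
        for i in range(1, n + 1):
            h[i] = 0.5 * (h[i - 1] + h[i + 1])
    return h

n = 9
h = solve_heads(n, 10.0, 2.0)
exact = [10.0 + (2.0 - 10.0) * i / (n + 1) for i in range(n + 2)]
err = max(abs(a - b) for a, b in zip(h, exact))
print(err < 1e-6)
```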

  19. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. To that end, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
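
As one concrete example of the verification items listed, carryover is commonly assessed by running a high sample three times and then a low sample three times. The sketch below uses the widely applied high-low protocol; the analyte and sample values are hypothetical, not from the paper.

```python
# Sketch of a common carryover check in analyzer verification: run a
# high sample three times, then a low sample three times, and express
# carryover as (L1 - L3) / (H3 - L3) x 100%.

def carryover_percent(high_runs, low_runs):
    h3 = high_runs[2]
    l1, l3 = low_runs[0], low_runs[2]
    return 100.0 * (l1 - l3) / (h3 - l3)

high = [250.0, 249.0, 251.0]   # e.g. WBC x10^9/L, hypothetical values
low = [2.6, 2.5, 2.5]
print(round(carryover_percent(high, low), 3))
```

A laboratory would compare the resulting percentage against its chosen acceptance limit.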

  20. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium the soil cover on top of the rock; the flows considered are hence groundwater flows. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations; the five verification groups thus reflect the main areas of recent development. The present report focuses on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases are reported in Appendix A (verification cases) and Appendix B (validation cases)

  1. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals, using parity check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm alone. Simulation resul...
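
The core verification rule that such algorithms incorporate can be illustrated in isolation: for a nonnegative signal, a zero-valued measurement verifies every signal entry it touches as zero. The sketch below shows only this single rule, not the full Interval-Passing algorithm; the matrix and signal are toy values.

```python
# Minimal sketch of one "verification" rule used by verification-based
# decoding over a 0/1 measurement matrix: since the signal is
# nonnegative, any measurement equal to zero proves that every signal
# entry connected to it is zero.

def verify_zeros(H, y):
    """H: 0/1 measurement matrix (list of rows), y: measurements.
    Returns the set of signal indices verified to be zero."""
    verified = set()
    for row, meas in zip(H, y):
        if meas == 0:
            verified.update(j for j, h in enumerate(row) if h == 1)
    return verified

H = [[1, 1, 0, 0],
     [0, 1, 1, 0],
     [0, 0, 1, 1]]
x = [0, 0, 3, 0]                 # nonnegative sparse signal
y = [sum(h * xi for h, xi in zip(row, x)) for row in H]   # y = Hx
print(sorted(verify_zeros(H, y)))
```

Verified entries are then removed from the system, which is what lets message-passing recovery make further progress.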

  2. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software must be compiled specifically for the machine on which it is to be used. Therefore, to ease the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output files and the old output files. Any difference between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error
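
The installation-verification idea described, run the sample problems and flag any output file that differs from a stored reference, can be sketched as follows. This is not the MCNP verification script itself; the file names and contents are hypothetical.

```python
# Minimal sketch of install verification by file comparison: each new
# output file is compared byte-for-byte with a stored reference; any
# difference is flagged for manual inspection rather than treated as an
# automatic failure.
import filecmp
import pathlib
import tempfile

def verify_outputs(new_dir, ref_dir, names):
    """Return the list of sample problems whose output differs."""
    return [n for n in names
            if not filecmp.cmp(pathlib.Path(new_dir) / n,
                               pathlib.Path(ref_dir) / n, shallow=False)]

# tiny demonstration with two fabricated "output files"
with tempfile.TemporaryDirectory() as new, tempfile.TemporaryDirectory() as ref:
    for d, text in ((new, "k_eff = 1.0000\n"), (ref, "k_eff = 1.0001\n")):
        (pathlib.Path(d) / "sample01.out").write_text(text)
    print(verify_outputs(new, ref, ["sample01.out"]))
```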

  3. The verification of neutron activation analysis support system (cooperative research)

    Energy Technology Data Exchange (ETDEWEB)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Sawahata, Hiroyuki; Ito, Yasuo [Tokyo Univ. (Japan). Research Center for Nuclear Science and Technology; Onizawa, Kouji [Radiation Application Development Association, Tokai, Ibaraki (Japan)

    2000-12-01

    The neutron activation analysis support system enables even users with little experience in neutron activation analysis to carry out convenient and accurate multi-element analysis of samples. In this verification test, the functions, usability, and the precision and accuracy of the analysis of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer, and analyzer of the JRR-3M PN-3 facility, together with the analysis software KAYZERO/SOLCOI based on the k{sub 0} method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field, and analysis of three kinds of environmental standard samples were carried out. The k{sub 0} method adopted in this system has recently been used primarily in Europe; it is an analysis method that can conveniently and accurately carry out multi-element analysis without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values certified in the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed to an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)

  4. Integrating family planning into HIV care in western Kenya: HIV care providers' perspectives and experiences one year following integration.

    Science.gov (United States)

    Newmann, Sara J; Zakaras, Jennifer M; Tao, Amy R; Onono, Maricianah; Bukusi, Elizabeth A; Cohen, Craig R; Steinfeld, Rachel; Grossman, Daniel

    2016-01-01

    With high rates of unintended pregnancy in sub-Saharan Africa, integration of family planning (FP) into HIV care is being explored as a strategy to reduce unmet need for contraception. The perspectives and experiences of healthcare providers are critical in order to create sustainable models of integrated care. This qualitative study offers insight into how HIV care providers view and experience the benefits and challenges of providing integrated FP/HIV services in Nyanza Province, Kenya. Sixteen individual interviews were conducted among healthcare workers at six public-sector HIV care facilities one year after the implementation of integrated FP and HIV services. Data were transcribed and analyzed qualitatively using grounded theory methods and Atlas.ti. Providers reported a number of benefits of integrated services that they believed increased the uptake and continuation of contraceptive methods. They felt that integrated services enabled them to reach a larger number of female and male patients, and in a way that was more efficient for patients than non-integrated services. The availability of FP services in the same place as HIV care also eliminated the need for most referrals, which many providers saw as a barrier for patients seeking FP. Providers reported many challenges to providing integrated services, including the lack of space, time, and sufficient staff, inadequate training, and commodity shortages. Despite these challenges, the vast majority of providers were supportive of FP/HIV integration and found integrated services to be beneficial to HIV-infected patients. Providers' concerns relating to staffing, infrastructure, and training need to be addressed in order to create sustainable, cost-effective FP/HIV integrated service models.

  5. EOPs at NPP Temelin: Analytical support for EOPs verification

    International Nuclear Information System (INIS)

    Bica, M.

    1999-01-01

    The process of implementing symptom-based emergency operating procedures (EOPs) started at the NPP Temelin in 1993. The process has the following phases: development of symptom-based EOPs; EOPs verification; EOPs validation; operating personnel training; EOPs control and experience feedback. The development of Temelin-specific EOPs was based on technology and know-how transfer using the Emergency Response Guidelines methodology developed by the Westinghouse Owners Group. In this lecture, the implementation of symptom-based EOPs at the NPP Temelin is described

  6. A Preliminary Shielding Study on the Integrated Operation Verification System in the Head-End Hot-Cell of the Pyro-processing

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jinhwam; Kim, Yewon; Park, Se-Hwan; Ahn, Seong-Kyu; Cho, Gyuseong [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    Nuclear power accounts for more than 30 percent of electricity production in Korea, and its significance has increased annually. Spent fuel containing uranium, transuranic elements, and fission products is an unavoidable byproduct of nuclear power production, and finding appropriate sites for its interim storage is recognized as difficult because isolated sites are required. Pyro-processing must be operated in a high-radiation environment inside hot-cell structures. For this reason, no worker is authorized to access the hot-cell areas under any circumstances without acceptable dose verification, and normal operation must be manipulated remotely. For reliable normal operation of pyro-processing, an evaluation of the space dose distribution in the hot-cell environment is necessary in advance, in order to determine which technologies or instruments can be utilized on or near the process by the Integrated Operation Verification System (IOVS). Unlike the electro-reduction and electro-refining hot-cells, the head-end hot-cell is equipped with a Camera Radiation Detector (CRD), with which plutonium is securely measured and monitored for the safeguards of the pyro-processing. Results have been obtained using an F2 surface tally in order to observe the magnitude of the gamma-ray and neutron flux passing through the surface of the process cell. Furthermore, a T-mesh tally has been used to obtain the space dose distribution in the head-end hot-cell; for this tally, the hot-cell was divided into 7,668 cells of 1 x 1 x 1 m each. To determine the positions of the CRD and the surveillance camera, divergent approaches were required: because the purpose of the CRD, which contains a gamma-ray detector and a neutron detector, is to identify the material composition as the process proceeds, a position with detectable exposed flux is required

  7. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  8. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  9. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  10. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    Science.gov (United States)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation, and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels enables iterative hypothesis and verification processes to be captured and managed automatically and efficiently through design operations over design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and the design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives, and a TMS (Truth Maintenance System) mechanism is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem, and concludes with a discussion of future issues.

  11. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  12. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity-modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of the IMRT technique, the deployment of the IMRT planning system CORVUS 6.0 and the MIMIC (multilamellar intensity-modulated collimator) device, and the overall process of verifying the plans created. The aim of verification is in particular good control of the functions of the MIMIC and evaluation of the overall reliability of IMRT planning. (author)

  13. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    In this paper, an automatic verification process is presented, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model-Driven Engineering to automate the verification and validation of on-board satellite software, and it is implemented in the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, the verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
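
The final checking step, feeding on-target evidence into timing constraints, can be sketched as a simple deadline check over an instrumentation trace. The task names, timestamps, and deadlines below are invented for illustration and are not from the Solar Orbiter software.

```python
# Hedged sketch of evidence checking: timestamps collected at
# instrumentation points on the target give each job's measured response
# time, which is compared against the deadline from the scheduling
# analysis model. Any violation invalidates the analysis.

def check_deadlines(trace, deadlines_ms):
    """trace: list of (task, release_ms, finish_ms). Returns violations
    as (task, measured_response_time_ms) pairs."""
    return [(task, finish - release)
            for task, release, finish in trace
            if finish - release > deadlines_ms[task]]

trace = [("acq", 0.0, 1.8), ("proc", 2.0, 9.5), ("tm", 10.0, 11.0)]
deadlines = {"acq": 2.0, "proc": 5.0, "tm": 3.0}
print(check_deadlines(trace, deadlines))
```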

  14. Design and Verification of Application Specific Integrated Circuits in a Network of Online Labs

    Directory of Open Access Journals (Sweden)

    A.Y. Al-Zoubi

    2009-08-01

    Full Text Available A solution to implement a remote laboratory for testing and designing analog application-specific integrated circuits of the type ispPAC10 is presented. The application allows electrical engineering students to access and perform measurements and conduct analog electronics experiments over the internet. PAC-Designer software, running on a Citrix server, is used for the circuit design; the signals are generated and the responses acquired by a data acquisition board controlled by LabVIEW. Three interconnected remote labs located on three different continents implement the proposed system.

  15. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  16. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is a task to check whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not assume that independent and identically distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among the copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.
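
A drastically simplified, classical illustration of the stabilizer-test idea: for a known 2-qubit state vector, check that the expectations of the Bell-state stabilizers X⊗X and Z⊗Z are both +1. The paper's protocol works with single-qubit measurements on an untrusted device; the sketch below merely computes the expectations directly from the state vector.

```python
# Toy illustration (far simpler than the paper's adaptive protocol):
# verify that the Bell state (|00> + |11>)/sqrt(2) has expectation +1
# for both of its stabilizers, X(x)X and Z(x)Z.
import math

I2 = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]

def kron(a, b):
    """Kronecker product of two square matrices given as nested lists."""
    n, m = len(a), len(b)
    return [[a[i // m][j // m] * b[i % m][j % m]
             for j in range(n * m)] for i in range(n * m)]

def expectation(psi, op):
    """<psi| op |psi> for a real-amplitude state vector psi."""
    opsi = [sum(op[i][j] * psi[j] for j in range(len(psi)))
            for i in range(len(psi))]
    return sum(p * q for p, q in zip(psi, opsi))

bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]
print(round(expectation(bell, kron(X, X)), 6),
      round(expectation(bell, kron(Z, Z)), 6))
```

A state passes this toy check only if both expectations equal +1, which is the defining property of the stabilizer group of the Bell state.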

  17. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components, and successful testing leads to the software being ... Formal verification is based on formal methods, which are mathematically based ... scenario under which a similar error could occur. There are various other ...

  18. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Christina [Bundesanstalt fuer Materialforschung und -pruefung (BAM), Berlin (Germany); Oeberg, Tomas [Tomas Oeberg Konsult AB, Lyckeby (Sweden)

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters containing Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which here is considered to consist of the sealing of the canister by welding, followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate the main process factors and their interactions, and response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered; experience from other fields of the materials sciences suggests that both approaches are useful. The initial verification experiments of the two welding technologies are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and to optimise the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a database suitable for a baseline estimate of future performance.
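
The block-maxima approach mentioned can be sketched with a method-of-moments Gumbel fit (the zero-shape case of the generalised extreme value distribution) and extrapolation to a return level. The data, block structure, and return period below are synthetic illustrations, not values from the report.

```python
# Sketch of the block-maxima idea: fit a Gumbel distribution to
# per-block maxima by the method of moments, then extrapolate to the
# level exceeded on average once per 1000 blocks.
import math
import random

random.seed(7)
# 50 synthetic "production blocks": maximum defect size (mm) in each
maxima = [max(random.expovariate(2.0) for _ in range(20)) for _ in range(50)]

mean = sum(maxima) / len(maxima)
var = sum((x - mean) ** 2 for x in maxima) / (len(maxima) - 1)
beta = math.sqrt(6.0 * var) / math.pi          # Gumbel scale (moments fit)
mu = mean - 0.5772 * beta                      # Gumbel location

# return level for return period T = 1000 blocks:
# x_T = mu - beta * log(-log(1 - 1/T))
r1000 = mu - beta * math.log(-math.log(1 - 1.0 / 1000))
print(r1000 > mu)
```

Peaks-over-threshold modelling with the generalised Pareto distribution follows the same fit-then-extrapolate pattern on threshold exceedances instead of block maxima.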

  19. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    International Nuclear Information System (INIS)

    Mueller, Christina; Oeberg, Tomas

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters containing Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which here is considered to consist of the sealing of the canister by welding, followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate the main process factors and their interactions, and response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered; experience from other fields of the materials sciences suggests that both approaches are useful. The initial verification experiments of the two welding technologies are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and to optimise the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a database suitable for a baseline estimate of future performance.

  20. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Christina [Bundesanstalt fuer Materialforschung und -pruefung (BAM), Berlin (Germany); Oeberg, Tomas [Tomas Oeberg Konsult AB, Lyckeby (Sweden)

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters with Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process which here is considered to consist of the sealing of the canister by welding followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology and design of experiments has been successful part of it. Factorial and fractional factorial designs can be used to evaluate main process factors and their interactions. Response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can through Taylor series expansions approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics and the extreme value paradigm give a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered. Experiences from other fields of the materials sciences suggest that both of these approaches are useful. The initial verification experiments of the two welding technologies considered are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and optimisation of the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a data base suitable for a baseline estimate of future performance. 
This estimate can
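
    The block-maxima approach mentioned above can be illustrated with a minimal sketch. Assuming hypothetical yearly maxima of a weld-defect indicator (all data and parameter values invented for illustration), a method-of-moments fit of the Gumbel distribution, i.e. the shape-zero case of the generalised extreme value distribution, yields location/scale parameters and a return-level extrapolation:

```python
import math
import random
import statistics

EULER_GAMMA = 0.5772156649015329

def fit_gumbel_moments(maxima):
    """Method-of-moments fit of the Gumbel (GEV with shape = 0) distribution."""
    mean = statistics.fmean(maxima)
    std = statistics.stdev(maxima)
    beta = std * math.sqrt(6.0) / math.pi      # scale parameter
    mu = mean - EULER_GAMMA * beta             # location parameter
    return mu, beta

def return_level(mu, beta, m):
    """Level exceeded on average once every m blocks (m-block return level).
    Obtained by inverting the Gumbel CDF F(x) = exp(-exp(-(x-mu)/beta))
    at p = 1 - 1/m."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / m))

random.seed(42)
# Hypothetical data: 200 block maxima of a weld-defect size indicator,
# drawn from a Gumbel(5.0, 1.2) via inverse-CDF sampling.
maxima = [mu0 - b0 * math.log(-math.log(random.random()))
          for mu0, b0 in [(5.0, 1.2)] * 200]
mu, beta = fit_gumbel_moments(maxima)
print(mu, beta, return_level(mu, beta, 100))
```

    The fitted parameters recover the generating values approximately, and the 100-block return level extrapolates beyond the observed sample, which is the "rationale for extrapolation" the abstract refers to.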

  1. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Biometric based authentication systems provide solutions to the problems in high security which remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, finger print, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good to be used as biometric parameters but do not provide the guarantee that the person is present and alive. As voice can be copied, a finger print can be picked up from glass onto synthetic skin, and in a face recognition system, due to genetic factors, identical twins or father and son may have the same facial appearance. ECG does not have these problems. It cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG based biometrics verification system which was developed using Laboratory Virtual Instruments Engineering Workbench (LabVIEW) version 7.1 is discussed. Experiments were conducted on the database stored in the laboratory of 20 individuals having 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
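
    The FRR/FAR figures quoted above are computed from the outcomes of genuine and impostor verification attempts at a chosen decision threshold. A minimal sketch, with entirely hypothetical similarity scores (higher = better match):

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """FRR: fraction of genuine attempts rejected (score below threshold).
    FAR: fraction of impostor attempts accepted (score at/above threshold)."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

# Hypothetical match scores from an ECG template comparison.
genuine = [0.91, 0.84, 0.88, 0.67, 0.95, 0.90, 0.73, 0.89]
impostor = [0.32, 0.45, 0.71, 0.28, 0.50, 0.38, 0.61, 0.44]
far, frr = far_frr(genuine, impostor, threshold=0.70)
print(far, frr)
```

    Sweeping the threshold trades the two error rates against each other; the operating point where they are roughly equal corresponds to the commonly reported equal error rate.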

  2. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects of 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and we carried out verification tests of a wet blasting technology for decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a result of the verification tests, the wet blasting decontamination technology showed a decontamination rate of 60-80% for concrete paving, interlocking, and dense-graded asphalt pavement when applied to the decontamination of roads. When it was applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were thought to attach to the fine sludge scraped off from the decontamination object, and the sludge was found to be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less than that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  3. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that the NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE. Previous efforts include the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Also, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned from the previous effort. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.

  4. Review of recent benchmark experiments on integral test for high energy nuclear data evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Nakashima, Hiroshi; Tanaka, Susumu; Konno, Chikara; Fukahori, Tokio; Hayashi, Katsumi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-11-01

    A survey of recent benchmark experiments on integral tests for high energy nuclear data evaluation was carried out as part of the work of the Task Force on JENDL High Energy File Integral Evaluation (JHEFIE). In this paper the results are compiled and the status of recent benchmark experiments is described. (author)

  5. Assessing patients’ experience of integrated care: a survey of patient views in the North West London Integrated Care Pilot

    Directory of Open Access Journals (Sweden)

    Nikolaos Mastellos

    2014-06-01

    Introduction: Despite the importance of continuity of care and patient engagement, few studies have captured patients’ views on integrated care. This study assesses patient experience in the Integrated Care Pilot in North West London with the aim to help clinicians and policy makers understand patients’ acceptability of integrated care and design future initiatives. Methods: A survey was developed, validated and distributed to 2029 randomly selected practice patients identified as having a care plan. Results: A total of 405 questionnaires were included for analysis. Respondents identified a number of benefits associated with the pilot, including increased patient involvement in decision-making, improved patient-provider relationship, better organisation and access to care, and enhanced inter-professional communication. However, only 22.4% were aware of having a care plan, and of these only 37.9% had a copy of the care plan. Knowledge of care plans was significantly associated with a more positive experience. Conclusions: This study reinforces the view that integrated care can improve quality of care and patient experience. However, care planning was a complex and technically challenging process that occurred more slowly than planned with wide variation in quality and time of recruitment to the pilot, making it difficult to assess the sustainability of benefits.

  6. Colleges' Experiences: Integrating Support Services for Military Veterans

    Science.gov (United States)

    Karp, Melinda Mechur; Klempin, Serena

    2017-01-01

    To improve the educational experiences and outcomes of student veterans, the Kisco Foundation developed the Kohlberg Prize in 2015. Two cohorts of colleges were awarded competitive grants to enhance their veterans services. This piece examines the process of creating integrated services for student veterans through the institutionalization of…

  7. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  8. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, defining the degree of visual observation to be performed, and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  9. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  10. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  11. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects for systems of systems: 1) the model of the SoS, which includes stochastic aspects; 2) the formalization of the SoS requirements in the form of contracts; 3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the functional mock-up interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.
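
    The core of statistical model checking is Monte Carlo estimation of the probability that a random execution satisfies a property, with the number of simulations chosen from a statistical bound. A minimal sketch (the stochastic "SoS" below is a toy stand-in, not the PLASMA/DESYRE tool-chain from the paper):

```python
import math
import random

def chernoff_samples(eps, delta):
    """Okamoto/Chernoff-Hoeffding bound: n >= ln(2/delta) / (2*eps^2)
    simulations guarantee |estimate - p| <= eps with confidence 1 - delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps * eps))

def simulate_trace(steps, p_fail=0.02):
    """Hypothetical component model: each step fails independently;
    the bounded property holds iff no step fails within the horizon."""
    return all(random.random() > p_fail for _ in range(steps))

def smc_estimate(eps=0.01, delta=0.05, steps=20):
    """Estimate the satisfaction probability of the bounded property."""
    n = chernoff_samples(eps, delta)
    hits = sum(simulate_trace(steps) for _ in range(n))
    return hits / n

random.seed(7)
p_hat = smc_estimate()
print(p_hat)
```

    The estimate converges to the true satisfaction probability (here 0.98^20, about 0.67) without ever enumerating the state space, which is why the approach scales where exhaustive model checking does not.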

  12. GumTree - An Integrated Scientific Experiment Environment

    International Nuclear Information System (INIS)

    Lam, Tony; Hauser, Nick; Hathaway, Paul; Franceschini, Fredi; Rayner, Hugh; Zhang, Lidia; Goetz, Andy

    2005-01-01

    GumTree is an open source and multi-platform graphical user interface for performing neutron scattering and X-ray experiments. It handles the complete experiment life cycle from instrument calibration, data acquisition, and real time data analysis to results publication. The aim of the GumTree Project is to create a highly Integrated Scientific Experiment Environment (ISEE), allowing interconnectivity and data sharing between different distributed components such as motors, detectors, user proposal database and data analysis server. GumTree is being adapted to several instrument control server systems such as TANGO, EPICS and SICS, providing an easy-to-use front-end for users and simple-to-extend model for software developers. The design of GumTree is aimed to be reusable and configurable for any scientific instrument. GumTree will be adapted to six neutron beam instruments for the OPAL reactor at ANSTO. Other European institutes including ESRF, ILL and PSI have shown interest in using GumTree as their workbench for instrument control and data analysis. (authors)

  13. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary, to supplement the accountancy data. A complete 'design verification' strategy comprises: informing the Agency of any changes in the plant system which are defined as 'safeguards relevant'; and reverification by the Agency, upon receiving notice from the Operator of any changes, of the 'design information'. 13 refs

  14. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

    Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibration/verification on the basis of the data series available, which generates elements of sheer guessing - unless the universality of the model is based on induction, i.e. experience from the sum of all previous investigations. There is a need to deal more explicitly with uncertainty...
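
    The identifiability problem described above can be made concrete with a deliberately over-parameterised toy model (hypothetical, not from the paper): when only the product of two parameters enters the predictions, calibration against data cannot distinguish between parameter sets lying on the same ridge of equal fit.

```python
def model(x, a, b):
    """Over-parameterised model: only the product a*b affects the output,
    so the individual parameters are not identifiable from data."""
    return a * b * x

# Hypothetical calibration data generated by y = 2*x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

def sse(a, b):
    """Sum of squared errors of the model against the calibration data."""
    return sum((y - model(x, a, b)) ** 2 for x, y in data)

# Two very different parameter sets give an identical (perfect) fit:
# calibration/verification alone cannot choose between them.
print(sse(1.0, 2.0), sse(4.0, 0.5))
```

    Any pair with a*b = 2 fits the data equally well, so a "verified" calibration here conveys no information about a or b individually, which is the exaggerated sense of certainty the abstract warns against.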

  15. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  16. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  17. Verification of possible asymmetry of polarization of thermal neutrons reflected by a mirror

    International Nuclear Information System (INIS)

    Okorokov, A.I.; Runov, V.V.; Gukasov, A.G.; Shchebetov, A.F.

    1976-01-01

    Experiments with a polarizing neutron guide do not confirm the neutron polarization asymmetry observed previously by Berndorfer for neutrons traversing a polarizing neutron guide. In connection with the spin-orbit effects, a verification is carried out on single reflection of neutrons by magnetic or nonmagnetic mirrors. With an accuracy of 10^-4 to 10^-3, no polarization asymmetry is observed.

  18. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  19. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  20. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
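
    The forward-chaining idea can be sketched in a few lines: rules in Horn-clause implication form are fired against a single growing fact set as the trace expands, with no tableau branching. The rules and events below are invented for illustration and are not the FLTL encoding from the paper.

```python
# Horn clauses in implication form: (body, head) means body1 ∧ body2 → head.
RULES = [
    ({"request"}, "pending"),
    ({"pending", "grant"}, "served"),
    ({"pending", "deny"}, "violation"),
]

def forward_chain(facts, rules):
    """Saturate the fact set: fire every rule whose body holds. The monitor
    keeps a single state (this set) and a fixed number of rules."""
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

def monitor(trace):
    """Process a finite but expanding trace one event at a time."""
    facts = set()
    for event in trace:
        facts.add(event)
        facts = forward_chain(facts, RULES)
        if "violation" in facts:
            return "violation"
    return "ok so far"

print(monitor(["request", "grant"]))
print(monitor(["request", "deny"]))
```

    Each event triggers at most one saturation pass over the fixed rule set, which is the source of the scalability claimed in the abstract.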

  1. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and the criteria and verification limits applied. A survey in clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
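
    The verification criteria surveyed above (verification limits, delta check) translate naturally into rules that either release a result automatically or hold it for review. A minimal sketch; the limits below are illustrative placeholders, not clinical recommendations:

```python
def autoverify(result, reference_range, previous=None, delta_limit=None):
    """Release a result automatically only if every rule passes.
    reference_range: (low, high) verification limits for the analyte.
    previous/delta_limit: optional delta check against the prior result."""
    low, high = reference_range
    if not (low <= result <= high):
        return "hold: outside verification limits"
    if previous is not None and delta_limit is not None:
        if abs(result - previous) > delta_limit:
            return "hold: delta check failed"
    return "autoverified"

# Hypothetical potassium results in mmol/L with illustrative limits.
print(autoverify(4.2, (3.5, 5.1)))
print(autoverify(7.8, (3.5, 5.1)))
print(autoverify(4.9, (3.5, 5.1), previous=3.6, delta_limit=1.0))
```

    A real laboratory information system would chain many such rules (instrument flags, sample quality, concordance between analytes) and route any "hold" to technical or medical review, as the survey's verification types suggest.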

  2. Accurate Electromagnetic Modeling Methods for Integrated Circuits

    NARCIS (Netherlands)

    Sheng, Z.

    2010-01-01

    The present development of modern integrated circuits (IC’s) is characterized by a number of critical factors that make their design and verification considerably more difficult than before. This dissertation addresses the important questions of modeling all electromagnetic behavior of features on

  3. The iso-response method: measuring neuronal stimulus integration with closed-loop experiments

    Science.gov (United States)

    Gollisch, Tim; Herz, Andreas V. M.

    2012-01-01

    Throughout the nervous system, neurons integrate high-dimensional input streams and transform them into an output of their own. This integration of incoming signals involves filtering processes and complex non-linear operations. The shapes of these filters and non-linearities determine the computational features of single neurons and their functional roles within larger networks. A detailed characterization of signal integration is thus a central ingredient to understanding information processing in neural circuits. Conventional methods for measuring single-neuron response properties, such as reverse correlation, however, are often limited by the implicit assumption that stimulus integration occurs in a linear fashion. Here, we review a conceptual and experimental alternative that is based on exploring the space of those sensory stimuli that result in the same neural output. As demonstrated by recent results in the auditory and visual system, such iso-response stimuli can be used to identify the non-linearities relevant for stimulus integration, disentangle consecutive neural processing steps, and determine their characteristics with unprecedented precision. Automated closed-loop experiments are crucial for this advance, allowing rapid search strategies for identifying iso-response stimuli during experiments. Prime targets for the method are feed-forward neural signaling chains in sensory systems, but the method has also been successfully applied to feedback systems. Depending on the specific question, “iso-response” may refer to a predefined firing rate, single-spike probability, first-spike latency, or other output measures. Examples from different studies show that substantial progress in understanding neural dynamics and coding can be achieved once rapid online data analysis and stimulus generation, adaptive sampling, and computational modeling are tightly integrated into experiments. PMID:23267315
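
    The closed-loop search for iso-response stimuli described above can be sketched with a toy model neuron: for each value of one stimulus component, a bisection search adjusts the other component until the response matches a predefined target. The model and all numbers are hypothetical; real experiments would query the neuron itself instead of a function.

```python
def model_neuron(s1, s2):
    """Toy stimulus-integration model: quadratic pooling of two stimulus
    components followed by a rectifying output non-linearity."""
    drive = (s1 ** 2 + s2 ** 2) ** 0.5
    return max(0.0, drive - 0.2)   # rectified "firing rate"

def find_iso_response(target, s1, lo=0.0, hi=10.0, tol=1e-6):
    """Closed-loop bisection on s2 so that the response matches `target`.
    Relies on the response being monotone in s2, as it is here."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if model_neuron(s1, mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Trace one iso-response curve: (s1, s2) pairs yielding the same output.
target = 1.0
curve = [(s1, find_iso_response(target, s1)) for s1 in (0.0, 0.4, 0.8)]
print(curve)
```

    The shape of the resulting curve in stimulus space reveals the integration rule: for this quadratic-pooling model the iso-response contour is a circular arc, whereas linear summation would produce a straight line.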

  4. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630--670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters the verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.

  5. SIM-DSP: A DSP-Enhanced CAD Platform for Signal Integrity Macromodeling and Simulation

    Directory of Open Access Journals (Sweden)

    Chi-Un Lei

    2014-12-01

    The macromodeling-simulation process for signal integrity verification has become necessary for high speed circuit system design. This paper aims to introduce a “VLSI Signal Integrity Macromodeling and Simulation via Digital Signal Processing Techniques” framework (known as the SIM-DSP framework), which applies digital signal processing techniques to facilitate the SI verification process in the pre-layout design phase. Core identification modules and peripheral (pre-/post-processing) modules have been developed and assembled to form a verification flow. In particular, a single-step discrete cosine transform truncation (DCTT) module has been developed for the modeling-simulation process. In DCTT, the response modeling problem is classified as a signal compression problem, wherein the system response can be represented by a truncated set of non-pole-based DCT bases, and the error can be analyzed through Parseval's theorem. Practical examples are given to show the applicability of our proposed framework.

  6. Project report: Experimental planning and verification of working fluids (WP 5)

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi

    Computer-aided molecular design (CAMD) helps in the reduction of experiments for the selection/design of optimal working fluids. In reducing the number of experiments, solutions obtained by trial and error are replaced by solutions that are based on mixture-process properties. In generating optimal working fluid candidates, a database is required that can be searched simultaneously in order to differentiate and determine whether the generated candidates are existing or novel. Also, the next step upon selection of the candidates is performing experiments in order to test and verify the generated working fluids. If performed properly, the experimental step is solely verification. Experiments can either be performed virtually (in order to further reduce the number of required experiments) and/or physically. Therefore the objective of this work was the development of a database of existing working fluids...

  7. Tablets in K-12 Education: Integrated Experiences and Implications

    Science.gov (United States)

    An, Heejung, Ed.; Alon, Sandra, Ed.; Fuentes, David, Ed.

    2015-01-01

    The inclusion of new and emerging technologies in the education sector has been a topic of interest to researchers, educators, and software developers alike in recent years. Utilizing the proper tools in a classroom setting is a critical factor in student success. "Tablets in K-12 Education: Integrated Experiences and Implications"…

  8. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations, as used in nuclear power plant quality assurance programs in other countries, was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed as at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  9. Role of materials accounting in integrated safeguards systems for reprocessing plants

    International Nuclear Information System (INIS)

    Hakkila, E.A.; Gutmacher, R.G.; Markin, J.T.; Shipley, J.P.; Whitty, W.J.

    1981-01-01

    Integration of materials accounting and containment/surveillance techniques for international safeguards requires careful examination and definition of suitable inspector activities for verification of operator's materials accounting data. The inspector's verification procedures are designed to protect against data falsification and/or the use of measurement uncertainties to conceal missing material. Materials accounting activities are developed to provide an effective international safeguards system when combined with containment/surveillance activities described in a companion paper

  10. Research on integrated simulation of fluid-structure system by computation science techniques

    International Nuclear Information System (INIS)

    Yamaguchi, Akira

    1996-01-01

    In the Power Reactor and Nuclear Fuel Development Corporation, research on the integrated simulation of fluid-structure systems by computational science techniques has been carried out. Through this work, the verification of plant systems, which has depended on large scale experiments, is to be replaced by computational science techniques, with the aim of reducing development costs and optimizing FBR systems. For this purpose, it is necessary to establish the technology for integrally and accurately analyzing complicated phenomena (simulation technology), the technology for applying it to large scale problems (speed increasing technology), and the technology for assuring the reliability of the analysis results when simulation technology is used for the licensing of FBRs (verification technology). The simulation of fluid-structure interaction, the heat flow simulation in spaces with complicated forms, and the related technologies are explained. As applications of computational science techniques, the elucidation of phenomena by numerical experiment and the use of numerical simulation as a substitute for tests are discussed. (K.I.)

  11. Entanglement verification and its applications in quantum communication; Verschraenkungsnachweise mit Anwendungen in der Quantenkommunikation

    Energy Technology Data Exchange (ETDEWEB)

    Haeseler, Hauke

    2010-02-16

    coherent storage of light, we focus on the storage of squeezed light. This situation requires an extension of our verification procedure to sources of mixed input states. We propose such an extension, and give a detailed analysis of its application to squeezed thermal states, displaced thermal states and mixed qubit states. This is supplemented by finding the optimal entanglement-breaking channels for each of these situations, which provides us with an indication of the strength of the extension to our entanglement criterion. The subject of Chapter 6 is also the benchmarking of quantum memory or teleportation experiments. Considering a number of recently published benchmark criteria, we investigate the question of which one is most useful to actual experiments. We first compare the different criteria for typical settings and sort them according to their resilience to excess noise. Then, we introduce a further improvement to the Expectation Value Matrix method, which results in the desired optimal benchmark criterion. Finally, we investigate naturally occurring phase fluctuations and find them to further simplify the implementation of our criterion. Thus, we formulate the first truly useful way of validating experiments for the quantum storage or transmission of light. (orig.)

  12. The design and analysis of integral assembly experiments for CTR neutronics

    International Nuclear Information System (INIS)

    Beynon, T.D.; Curtis, R.H.; Lambert, C.

    1978-01-01

    The use of simple-geometry integral assemblies of lithium metal or lithium compounds for the study of the neutronics of various CTR designs is considered and four recent experiments are analysed. The relatively long mean free path of neutrons in these assemblies produces significantly different design problems from those encountered in similar experiments for fission reactor design. By considering sensitivity profiles for various parameters it is suggested that experiments can be designed to be optimised for data adjustments. (author)

  13. ESTRO ACROP guidelines for positioning, immobilisation and position verification of head and neck patients for radiation therapists

    Directory of Open Access Journals (Sweden)

    Michelle Leech

    2017-03-01

    Full Text Available Background and purpose: Over the last decade, the management of locally advanced head and neck cancers (HNCs) has seen a substantial increase in the use of chemoradiation. These guidelines have been developed to assist Radiation Therapists (RTTs) in positioning, immobilisation and position verification for head and neck cancer patients. Materials and methods: A critical review of the literature was undertaken by the writing committee. Based on the literature review, a survey was developed to ascertain the current positioning, immobilisation and position verification methods for head and neck radiation therapy across Europe. The survey was translated into Italian, German, Greek, Portuguese, Russian, Croatian, French and Spanish. Guidelines were subsequently developed by the writing committee. Results: Results from the survey indicated that a wide variety of treatment practices and treatment verification protocols are currently in operation for head and neck cancer patients across Europe. The guidelines developed are based on the experience and expertise of the writing committee, remaining cognisant of the variations in imaging and immobilisation techniques currently used in Europe. Conclusions: These guidelines have been developed to provide RTTs with guidance on positioning, immobilisation and position verification of HNC patients. The guidelines will also provide RTTs with the means to critically reflect on their own daily clinical practice with this patient group. Keywords: Head and neck, Immobilisation, Positioning, Verification

  14. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
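    The winning strategy of the Red Balloon Challenge mentioned above is commonly described as a recursive-incentive scheme that halves the reward at each step up the referral chain. As an illustration only (the function and names below are hypothetical, and the paper's formal compensation scheme is not reproduced here), such a split can be sketched as:

    ```python
    def split_reward(chain, prize):
        """Split a prize along a referral chain, halving at each step:
        the finder (chain[0]) gets prize/2, their referrer prize/4, etc.
        Hypothetical sketch of the recursive-incentive idea."""
        payouts = {}
        share = prize / 2
        for person in chain:
            payouts[person] = share
            share /= 2
        return payouts

    # For a $4000 prize and a chain of three people:
    print(split_reward(["finder", "ref1", "ref2"], 4000.0))
    # {'finder': 2000.0, 'ref1': 1000.0, 'ref2': 500.0}
    ```

    Note that half the prize is always left undistributed, which bounds the total payout regardless of chain length.
    
    
    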

  15. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  16. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and
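    Two of the verification items named above, precision and carryover, reduce to simple statistics. A minimal sketch, assuming the common three-high/three-low carryover protocol (the record does not specify the exact protocol, so treat the formulas as illustrative):

    ```python
    import statistics

    def cv_percent(values):
        """Within-run imprecision expressed as coefficient of variation (%)."""
        return 100 * statistics.stdev(values) / statistics.mean(values)

    def carryover_percent(high_runs, low_runs):
        """Carryover per the common h1..h3 / l1..l3 protocol:
        (l1 - l3) / (h3 - l3) * 100 (assumed convention)."""
        return 100 * (low_runs[0] - low_runs[-1]) / (high_runs[-1] - low_runs[-1])

    print(cv_percent([4.9, 5.0, 5.1]))                        # imprecision of replicates
    print(carryover_percent([300.0, 301.0, 302.0], [4.0, 2.0, 2.0]))
    ```

    Acceptance limits for both quantities are analyzer- and analyte-specific and must come from the verification plan, not from the code.
    
    
    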

  17. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  18. Experiences with integral microelectronics on smart structures for space

    Science.gov (United States)

    Nye, Ted; Casteel, Scott; Navarro, Sergio A.; Kraml, Bob

    1995-05-01

    One feature of a smart structure implies that some computational and signal processing capability can be performed at a local level, perhaps integral to the controlled structure. This requires electronics with a minimal mechanical influence regarding structural stiffening, heat dissipation, weight, and electrical interface connectivity. The Advanced Controls Technology Experiment II (ACTEX II) space-flight experiments implemented such a local control electronics scheme by utilizing composite smart members with integral processing electronics. These microelectronics, tested to MIL-STD-883B levels, were fabricated with conventional thick film on ceramic multichip module techniques. Kovar housings and aluminum-kapton multilayer insulation were used to protect against harsh space radiation and thermal environments. Development and acceptance testing showed the electronics design was extremely robust, operating in vacuum and across its temperature range with minimal gain variations occurring just above room temperature. Four electronics modules, used for the flight hardware configuration, were connected by a RS-485 2 Mbit per second serial data bus. The data bus was controlled by Actel field programmable gate arrays arranged in a single master, four slave configuration. An Intel 80C196KD microprocessor was chosen as the digital compensator in each controller. It was used to apply a series of selectable biquad filters, implemented via Delta Transforms. Instability in any compensator was expected to appear as large amplitude oscillations in the deployed structure. Thus, over-vibration detection circuitry with automatic output isolation was incorporated into the design. This was not used, however, since during experiment integration and test, intentionally induced compensator instabilities resulted in benign mechanical oscillation symptoms. Not too surprisingly, it was determined that instabilities were most detectable by large temperature increases in the electronics, typically
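    The biquad (second-order IIR) filter sections named above are a standard compensator building block. The flight code used a Delta-Transform realisation on the 80C196KD, which is not reproduced here; purely as an illustration, the textbook direct-form II transposed biquad looks like this (coefficient names b0..a2 are the usual conventions, not taken from the experiment):

    ```python
    class Biquad:
        """Direct-form II transposed biquad section:
        H(z) = (b0 + b1 z^-1 + b2 z^-2) / (1 + a1 z^-1 + a2 z^-2).
        Illustrative z-domain form, not the flight Delta-Transform code."""
        def __init__(self, b0, b1, b2, a1, a2):
            self.b = (b0, b1, b2)
            self.a = (a1, a2)
            self.s1 = self.s2 = 0.0   # filter state (delay registers)

        def step(self, x):
            b0, b1, b2 = self.b
            a1, a2 = self.a
            y = b0 * x + self.s1
            self.s1 = b1 * x - a1 * y + self.s2
            self.s2 = b2 * x - a2 * y
            return y

    # A unity (pass-through) section leaves the signal unchanged:
    ident = Biquad(1.0, 0.0, 0.0, 0.0, 0.0)
    assert [ident.step(v) for v in (1.0, 2.0, 3.0)] == [1.0, 2.0, 3.0]
    ```

    Selectable compensators, as described in the record, amount to switching between precomputed coefficient sets and cascading several such sections.
    
    
    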

  19. Implementation of the structural integrity analysis for PWR primary components and piping

    International Nuclear Information System (INIS)

    Pellissier-Tanon, A.

    1982-01-01

    The trends in the definition, assessment and application of fracture strength evaluation methodology, which have arisen through experience in the design, construction and operation of French 900-MW plants, are reviewed. The main features of the methodology proposed in a draft of Appendix ZG of the RCC-M code of practice for the design verification of fracture strength of primary components are presented. The research programs are surveyed and discussed from four viewpoints: first, implementation of the LEFM analysis; second, implementation of the fatigue crack propagation analysis; third, analysis of vessel integrity during emergency core cooling; and fourth, methodology for tear fracture analysis. (author)

  20. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  1. An integrated technique for developing real-time systems

    NARCIS (Netherlands)

    Hooman, J.J.M.; Vain, J.

    1995-01-01

    The integration of conceptual modeling techniques, formal specification, and compositional verification is considered for real time systems within the knowledge engineering context. We define constructive transformations from a conceptual meta model to a real time specification language and give

  2. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    Science.gov (United States)

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristics coefficients is difficult to guarantee because of the errors of the measurement system itself. A novel dynamic calibration method of the measurement system for dynamic characteristics coefficients is proposed in this paper to eliminate the errors of the measurement system itself. Compared with the calibration method of suspension quality, this novel calibration method is different because the verification device is a spring-mass system, which can simulate the dynamic characteristics of a sliding bearing. The verification device is built, and the calibration experiment is implemented over a wide frequency range, in which the bearing stiffness is simulated by disc springs. The experimental results show that the amplitude errors of this measurement system are small in the frequency range of 10 Hz–100 Hz, and the phase errors increase with increasing frequency. A simulated identification experiment of the dynamic characteristics coefficients in the frequency range of 10 Hz–30 Hz preliminarily verifies that the calibration data in this range can well support dynamic characteristics tests of sliding bearings. Bearing experiments over greater frequency ranges require higher manufacturing and installation precision of the calibration device. In addition, the processes of the calibration experiments should be improved. PMID:27483283
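    What makes a spring-mass device usable as a calibration reference is that its dynamic stiffness is known in closed form. A minimal sketch of that reference curve, for a spring-mass-damper model K(ω) = k − m·ω² + j·ω·c (the numeric values for k, m and c below are illustrative, not taken from the paper):

    ```python
    import math

    def dynamic_stiffness(k, m, c, f):
        """Theoretical complex dynamic stiffness of a spring-mass-damper
        at frequency f (Hz): K(w) = k - m*w^2 + j*w*c, w = 2*pi*f.
        k, m, c are illustrative placeholder values in the usage below."""
        w = 2 * math.pi * f
        return complex(k - m * w**2, c * w)

    # Expected reference magnitude over the calibrated band:
    for f in (10, 30, 100):
        K = dynamic_stiffness(k=2.0e7, m=5.0, c=200.0, f=f)
        print(f"{f:4d} Hz: |K| = {abs(K):.3e} N/m")
    ```

    Comparing the measurement system's identified stiffness against this known curve is what exposes the system's own amplitude and phase errors at each frequency.
    
    
    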

  3. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.

  4. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    International Nuclear Information System (INIS)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code

  5. Verification Failures: What to Do When Things Go Wrong

    Science.gov (United States)

    Bertacco, Valeria

    Every integrated circuit is released with latent bugs. The damage and risk implied by an escaped bug ranges from almost imperceptible to potential tragedy; unfortunately it is impossible to discern within this range before a bug has been exposed and analyzed. While the past few decades have witnessed significant efforts to improve verification methodology for hardware systems, these efforts have been far outstripped by the massive complexity of modern digital designs, leading to product releases in which an ever-smaller fraction of the system's states has been verified. The news of escaped bugs in large market designs and/or safety critical domains is alarming because of safety and cost implications (due to replacements, lawsuits, etc.).

  6. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  7. SASSYS-1 computer code verification with EBR-II test data

    International Nuclear Information System (INIS)

    Warinner, D.K.; Dunn, F.E.

    1985-01-01

    The EBR-II natural circulation experiment, XX08 Test 8A, is simulated with the SASSYS-1 computer code and the results for the latter are compared with published data taken during the transient at selected points in the core. The SASSYS-1 results provide transient temperature and flow responses for all points of interest simultaneously during one run, once such basic parameters as pipe sizes, initial core flows, and elevations are specified. The SASSYS-1 simulation results for the EBR-II experiment XX08 Test 8A, conducted in March 1979, are within the published plant data uncertainties and, thereby, serve as a partial verification/validation of the SASSYS-1 code

  8. Provenance based data integrity checking and verification in cloud environments

    Science.gov (United States)

    Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers respectively. It has been proposed as an effective solution for data outsourcing and on demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms user’s data is moved into remotely located storages such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user’s data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors amongst others. However, these methods use extra storage space by maintaining multiple copies of data or the presence of a third party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if occurred. For this purpose, we utilize a relatively new concept in the Cloud computing called “Data Provenance”. Our scheme is capable to reduce the need of any third party services, additional hardware support and the replication of data items on client side for integrity checking. PMID:28545151
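    Of the existing approaches the abstract mentions, checksumming is the simplest to illustrate. A minimal client-side sketch (this is not the provenance-based scheme the paper proposes, just the baseline it improves on): record a SHA-256 digest before upload, recompute it on retrieval, and compare.

    ```python
    import hashlib

    def digest(data: bytes) -> str:
        """SHA-256 digest of an object, recorded locally at upload time."""
        return hashlib.sha256(data).hexdigest()

    def verify(local_digest: str, retrieved: bytes) -> bool:
        """Integrity check on retrieval: any modification of the stored
        object changes the recomputed digest."""
        return digest(retrieved) == local_digest

    ref = digest(b"payload")            # kept on the client side
    assert verify(ref, b"payload")      # untouched object passes
    assert not verify(ref, b"tampered") # modified object is detected
    ```

    The drawback the paper points out applies here: the client must retain per-object state (or replicas), which is exactly the overhead a provenance-based scheme aims to avoid.
    
    
    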

  9. Provenance based data integrity checking and verification in cloud environments.

    Science.gov (United States)

    Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers respectively. It has been proposed as an effective solution for data outsourcing and on demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms user's data is moved into remotely located storages such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors amongst others. However, these methods use extra storage space by maintaining multiple copies of data or the presence of a third party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if occurred. For this purpose, we utilize a relatively new concept in the Cloud computing called "Data Provenance". Our scheme is capable to reduce the need of any third party services, additional hardware support and the replication of data items on client side for integrity checking.

  10. Provenance based data integrity checking and verification in cloud environments.

    Directory of Open Access Journals (Sweden)

    Muhammad Imran

    Full Text Available Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers respectively. It has been proposed as an effective solution for data outsourcing and on demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms user's data is moved into remotely located storages such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors amongst others. However, these methods use extra storage space by maintaining multiple copies of data or the presence of a third party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if occurred. For this purpose, we utilize a relatively new concept in the Cloud computing called "Data Provenance". Our scheme is capable to reduce the need of any third party services, additional hardware support and the replication of data items on client side for integrity checking.

  11. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features.

  12. Comment for nuclear data from the FNS integral experiments

    International Nuclear Information System (INIS)

    Maekawa, Hiroshi

    1983-01-01

    Among the integral experiments carried out during the last year at FNS, the following three experimental results and their analyses are described: 1) tritium production-rate distribution in a Li2O-C assembly, 2) angle-dependent neutron leakage spectra from Li2O slab assemblies, and 3) induced activity of Type 316 stainless steel. (author)

  13. Experience and Strategy of Biodiversity Data Integration in Taiwan

    Directory of Open Access Journals (Sweden)

    K T Shao

    2013-02-01

    Full Text Available The integration of Taiwan's biodiversity databases started in 2001, the same year that Taiwan joined GBIF as an associate participant. Taiwan, hence, embarked on a decade of integrating biodiversity data. Under the support of NSC and COA, the database and websites of TaiBIF, TaiBNET (TaiCOL), TaiBOL, and TaiEOL have been established separately and collaborate with GBIF, COL, BOL, and EOL respectively. A cross-agency committee was thus established in Academia Sinica in 2008 to formulate policies on data collection and integration as well as the mechanism to make data available to the public. Any commissioned project will hereafter be asked to include these policy requirements in its contract. So far, TaiBIF has gained recognition in Taiwan and abroad for its efforts over the past several years. It can provide its experience and insights for others to reference or replicate.

  14. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  15. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  16. TET-1 - A German Microsatellite for Technology On-Orbit Verification

    Science.gov (United States)

    Föckersperger, S.; Lattner, K.; Kaiser, C.; Eckert, S.; Bärwald, W.; Ritzmann, S.; Mühlbauer, P.; Turk, M.; Willemsen, P.

    2008-08-01

    Due to the high safety standards in the space industry every new product must go through a verification process before qualifying for operation in a space system. Within the verification process the payload undergoes a series of tests which prove that it is in accordance with mission requirements in terms of function, reliability and safety. Important verification components are the qualification for use on the ground as well as the On-Orbit Verification (OOV), i.e. proof that the product is suitable for use under real space conditions (on-orbit). Here it is demonstrated that the product functions under conditions which cannot, or can only partially, be simulated on the ground. The OOV Program of the DLR serves to bridge the gap between the product tested and qualified on the ground and the utilization of the product in space. Due to the regular and short-term availability of flight opportunities, industry and research facilities can verify their latest products under space conditions and demonstrate their reliability and marketability. The Technologie-Erprobungs-Träger TET (Technology Experiments Carrier) comprises the core elements of the OOV Program. A programmatic requirement of the OOV Program is that a satellite bus already verified in orbit be used in the first segment of the program. An analysis of suitable satellite buses showed that a realization of the TET satellite bus based on the BIRD satellite bus fulfilled the programmatic requirements best. Kayser-Threde was selected by DLR as Prime Contractor to perform the project together with its major subcontractors Astro- und Feinwerktechnik, Berlin for the platform development and DLR-GSOC for the ground segment development. TET is now designed to be a modular and flexible micro-satellite for any orbit between 450 and 850 km altitude and inclination between 53° and SSO. With an overall mass of 120 kg TET is able to accommodate experiments of up to 50 kg. A multipurpose payload supply system

  17. Neutron radiography experiments for verification of soluble boron mixing and transport modeling under natural circulation conditions

    International Nuclear Information System (INIS)

    Morlang, M.M.; Feltus, M.A.

    1996-01-01

    The use of neutron radiography for visualization of fluid flow through flow visualization modules has been very successful. Current experiments at the Penn State Breazeale Reactor serve to verify the mixing and transport of soluble boron under natural flow conditions as would be experienced in a pressurized water reactor. Different flow geometries have been modeled including holes, slots, and baffles. Flow modules are constructed of aluminum box material 1 1/2 inches by 4 inches in varying lengths. An experimental flow system was built which pumps fluid to a head tank, and natural circulation flow occurs from the head tank through the flow visualization module to be radiographed. The entire flow system is mounted on a portable assembly to allow placement of the flow visualization module in front of the neutron beam port. A neutron-transparent Fluorinert fluid is used to simulate water at different densities. Boron is modeled by gadolinium oxide powder as a tracer element, which is placed in a mixing assembly and injected into the system by a remotely operated electric valve, once the reactor is at power. The entire sequence is recorded on real-time video. Still photographs are made frame-by-frame from the video tape. Computers are used to digitally enhance the video and still photographs. The data obtained from the enhancement will be used for verification of simple geometry predictions using the TRAC and RELAP thermal-hydraulic codes. A detailed model of a reactor vessel inlet plenum, downcomer region, flow distribution area and core inlet is being constructed to model the AP600 plenum. Successive radiography experiments of each section of the model under identical conditions will provide a complete vessel/core model for comparison with the thermal-hydraulic codes

  18. Neutron radiography experiments for verification of soluble boron mixing and transport modeling under natural circulation conditions

    International Nuclear Information System (INIS)

    Feltus, M.A.; Morlang, G.M.

    1996-01-01

    The use of neutron radiography for visualization of fluid flow through flow visualization modules has been very successful. Current experiments at the Penn State Breazeale Reactor serve to verify the mixing and transport of soluble boron under natural circulation flow conditions as would be experienced in a pressurized water reactor. Different flow geometries have been modeled, including holes, slots, and baffles. Flow modules are constructed of aluminum box material 1 1/2 inches by 4 inches in varying lengths. An experimental flow system was built which pumps fluid to a head tank; natural circulation flow then occurs from the head tank through the flow visualization module to be radiographed. The entire flow system is mounted on a portable assembly to allow placement of the flow visualization module in front of the neutron beam port. A neutron-transparent Fluorinert fluid is used to simulate water at different densities. Boron is modeled by gadolinium oxide powder as a tracer element, which is placed in a mixing assembly and injected into the system by a remotely operated electric valve once the reactor is at power. The entire sequence is recorded on real-time video. Still photographs are made frame by frame from the videotape. Computers are used to digitally enhance the video and still photographs. The data obtained from the enhancement will be used for verification of simple-geometry predictions using the TRAC and RELAP thermal-hydraulic codes. A detailed model of a reactor vessel inlet plenum, downcomer region, flow distribution area, and core inlet is being constructed to model the AP600 plenum. Successive radiography experiments on each section of the model under identical conditions will provide a complete vessel/core model for comparison with the thermal-hydraulic codes.
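
    The frame-by-frame digital enhancement described in the record can be illustrated with a simple linear contrast stretch, a common first step when making a faint tracer plume visible in a radiograph frame. The function and the synthetic 3x3 "frame" below are illustrative assumptions, not the laboratory's actual processing chain; a real pipeline would operate on full video frames with an image library.

    ```python
    def stretch_contrast(frame, lo=0, hi=255):
        """Linearly rescale pixel intensities so the darkest pixel maps to
        `lo` and the brightest to `hi` (simple histogram stretching)."""
        flat = [p for row in frame for p in row]
        p_min, p_max = min(flat), max(flat)
        if p_max == p_min:  # uniform frame: nothing to stretch
            return [[lo for _ in row] for row in frame]
        scale = (hi - lo) / (p_max - p_min)
        return [[round(lo + (p - p_min) * scale) for p in row] for row in frame]

    # A tiny synthetic "radiograph" frame: the gadolinium tracer absorbs
    # neutrons, so tracer-rich regions appear as low-intensity pixels.
    frame = [[120, 125, 118],
             [122,  60, 119],   # 60 = tracer plume
             [121, 124, 123]]
    enhanced = stretch_contrast(frame)
    ```

    After stretching, the plume pixel maps to 0 and the brightest background pixel to 255, maximizing the visible contrast between tracer and fluid.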

  19. Verification of communication protocols in web services: model-checking service compositions

    CERN Document Server

    Tari, Zahir; Mukherjee, Anshuman

    2014-01-01

    Verification of Communication Protocols in Web Services: Model-Checking Service Compositions gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with essential, state-of-the-art information about sensor networking. In the near future, wireless sensor networks will become an integral part of our day-to-day life. To solve the various sensor networking issues, researchers have put a great deal of effort into coming up with innovative ideas. The book introduces current technological trends, particularly in node organization, and provides implementation details of each networking type to help readers set up sensor networks in their related job fields. In addition, it identifies the limitations of current technologies, as well as future research directions.
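
    The model-checking approach named in the title can be sketched in a few lines: a protocol is modelled as a labelled transition system, and the checker exhaustively explores the reachable state space looking for undesirable states such as deadlocks. The request/response protocol below is a hypothetical example, not taken from the book.

    ```python
    from collections import deque

    # Hypothetical request/response protocol between two services,
    # modelled as a labelled transition system (state -> {action: next}).
    TRANSITIONS = {
        "idle":    {"send_request": "waiting"},
        "waiting": {"recv_reply": "idle", "timeout": "idle"},
    }

    def reachable_states(start, transitions):
        """Breadth-first exploration of every state reachable from
        `start` -- the core loop of an explicit-state model checker."""
        seen, queue = {start}, deque([start])
        while queue:
            state = queue.popleft()
            for nxt in transitions.get(state, {}).values():
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    def deadlocks(transitions):
        """Reachable states with no outgoing transitions are deadlocks."""
        return {s for s in reachable_states("idle", transitions)
                if not transitions.get(s)}
    ```

    Here `deadlocks(TRANSITIONS)` returns an empty set because every reachable state has at least one outgoing transition; dropping the `timeout` edge and the `recv_reply` edge would instead flag `waiting` as a deadlock. Industrial checkers such as SPIN apply the same reachability idea with compact state encodings and temporal-logic properties.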

  20. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC): FY10 development and integration

    Energy Technology Data Exchange (ETDEWEB)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.
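
    The "robust verification" requirement mentioned in the record is commonly demonstrated by code verification: comparing a numerical result against a known analytic solution and confirming that the error shrinks at the method's theoretical rate as the grid is refined. The composite trapezoidal rule below is a generic stand-in for such a check, not one of the Waste IPSC codes.

    ```python
    import math

    def trapezoid(f, a, b, n):
        """Composite trapezoidal rule with n sub-intervals."""
        h = (b - a) / n
        total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
        return h * total

    # Code-verification check: the error must shrink at the method's
    # theoretical 2nd-order rate when the step size is halved.
    f = lambda x: math.exp(-x)
    exact = 1.0 - math.exp(-1.0)          # integral of exp(-x) on [0, 1]
    err_coarse = abs(trapezoid(f, 0.0, 1.0, 10) - exact)
    err_fine   = abs(trapezoid(f, 0.0, 1.0, 20) - exact)
    order = math.log2(err_coarse / err_fine)  # observed convergence order
    ```

    An observed order near 2 confirms the implementation matches the discretization theory; a lower order is a classic symptom of a coding bug, which is exactly what a verification plan is designed to catch before validation against experiment.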