WorldWideScience

Sample records for MFTF sensor verification

  1. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document, and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level, and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system
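
    To make the bookkeeping concrete, here is a minimal sketch of the kind of check such a verification program performs: record an as-installed resistance for each sensor and flag any later reading that has drifted beyond a tolerance. The record fields, the 5% tolerance, and the Python form are assumptions for illustration; the original program is not described in detail in the abstract.

```python
from dataclasses import dataclass

# Hypothetical sensor record: the as-installed resistance plus the latest
# periodic reading, in ohms. Field names and the 5% tolerance are
# illustrative assumptions, not the MFTF record format.
@dataclass
class SensorRecord:
    sensor_id: str
    kind: str              # e.g. "thermocouple", "strain gage", "LHe level"
    initial_ohms: float
    latest_ohms: float

def is_healthy(rec: SensorRecord, tol: float = 0.05) -> bool:
    """Flag a sensor whose resistance has drifted more than a fractional
    tolerance from its as-installed value."""
    return abs(rec.latest_ohms - rec.initial_ohms) <= tol * rec.initial_ohms

# Example monthly check over a small inventory.
inventory = [
    SensorRecord("TC-012", "thermocouple", 108.2, 108.9),
    SensorRecord("SG-045", "strain gage", 350.0, 402.7),
]
for rec in inventory:
    print(f"{rec.sensor_id:8s} {'OK' if is_healthy(rec) else 'CHECK'}")
```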

  2. MFTF-progress and promise

    International Nuclear Information System (INIS)

    Thomassen, K.I.

    1980-01-01

    The Mirror Fusion Test Facility (MFTF) has been in construction at Lawrence Livermore National Laboratory (LLNL) for 3 years, and most of the major subsystems are nearing completion. Recently, the scope of this project was expanded to meet new objectives, principally to reach plasma conditions corresponding to energy break-even. To fulfill this promise, the single-cell minimum-B mirror configuration will be replaced with a tandem mirror configuration (MFTF-B). The facility must accordingly be expanded to accommodate the new geometry. This paper briefly discusses the status of the major MFTF subsystems and describes how most of the technological objectives of MFTF will be demonstrated before we install the additional systems necessary to make the tandem. It also summarizes the major features of the expanded facility

  3. MFTF-B plasma-diagnostic system

    International Nuclear Information System (INIS)

    Throop, A.L.; Goerz, D.A.; Thomas, S.R.

    1981-01-01

    This paper describes the current design status of the plasma diagnostic system for MFTF-B. In this paper we describe the system requirement changes which have occurred as a result of the funded rescoping of the original MFTF facility into MFTF-B. We outline the diagnostic instruments which are currently planned, and present an overview of the diagnostic system

  4. Manufacturing the MFTF magnet

    International Nuclear Information System (INIS)

    Dalder, E.N.C.; Hinkle, R.E.; Hodges, A.J.

    1980-01-01

    The Mirror Fusion Test Facility (MFTF) is a large mirror program experiment for magnetic fusion energy. It will combine and extend the near-classical plasma confinement achieved in 2XIIB with advanced neutral-beam and magnet technologies. The product of ion density and confinement time will be improved by more than an order of magnitude, while the superconducting magnet weight will be extrapolated from 15 tons in Baseball II to 375 tons in MFTF. Recent reactor studies show that the MFTF will traverse much of the distance in magnet technology towards the reactor regime

  5. MFTF TOTAL benchmark

    International Nuclear Information System (INIS)

    Choy, J.H.

    1979-06-01

    A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) development and implementation of programs to simulate MFTF usage of the data base
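
    As an illustration of the shape of tasks (3) and (4), the sketch below loads a table and then replays a mix of lookups while timing both phases. It uses SQLite purely as a stand-in for TOTAL (a 1970s DBMS running on an Interdata 8/32), and the table layout, row counts, and query mix are invented.

```python
import sqlite3, time, random

# Stand-in benchmark harness: SQLite instead of TOTAL, invented schema.
def run_benchmark(n_rows: int = 10_000, n_queries: int = 1_000) -> None:
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE shot_data (shot_id INTEGER, sensor TEXT, value REAL)")

    t0 = time.perf_counter()                       # task 3: data base load
    db.executemany(
        "INSERT INTO shot_data VALUES (?, ?, ?)",
        [(i % 100, f"S{i % 50:02d}", random.random()) for i in range(n_rows)],
    )
    db.commit()
    t_load = time.perf_counter() - t0

    t0 = time.perf_counter()                       # task 4: simulated usage
    for _ in range(n_queries):
        shot = random.randrange(100)
        db.execute("SELECT AVG(value) FROM shot_data WHERE shot_id = ?",
                   (shot,)).fetchone()
    t_query = time.perf_counter() - t0

    print(f"load: {t_load:.3f} s for {n_rows} rows; "
          f"queries: {t_query:.3f} s for {n_queries} lookups")

run_benchmark()
```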

  6. Design of the drift pumping system for MFTF-α+T

    International Nuclear Information System (INIS)

    Metlzer, D.H.

    1983-01-01

    Drift pumping in mirrors is a new concept (less than one year old). If it works, compared to charge-exchange pumping, it will simplify the MFTF-α+T interface and possibly reduce the circulating power required. From an engineering standpoint, it has some very demanding requirements in terms of power and bandwidth. This paper describes a design which satisfies these requirements. It also identifies a number of promising alternatives requiring investigation and verification

  7. Evaluating and tuning system response in the MFTF-B control and diagnostics computers

    International Nuclear Information System (INIS)

    Palasek, R.L.; Butner, D.N.; Minor, E.G.

    1983-01-01

    The software system running on the Supervisory Control and Diagnostics System (SCDS) of MFTF-B is, for the most part, an event-driven one. Regular, periodic polling of sensor outputs takes place only at the local level, in the sensors' corresponding local control microcomputers (LCCs). An LCC reports a sensor's value to the supervisory computer only if there has been a significant change. This report is passed as a message, routed among and acted upon by a network of applications and systems tasks within the supervisory computer (SCDS). Commands from the operator's console are similarly routed through a network of tasks, but in the opposite direction, to the experiment's hardware. In a network such as this, response time is partially determined by system traffic. Because the hardware of MFTF-B will not be connected to the computer system for another two years, we are using the local control computers to simulate the event-driven traffic that we expect to see during MFTF-B operation. In this paper we show how we are using the simulator to measure and evaluate response, loading, throughput, and utilization of components within the computer system. Measurement of the system under simulation allows us to identify bottlenecks and verify that they have been relieved. We also use the traffic simulators to evaluate prototypes of different algorithms for selected tasks, comparing their responses under the spectrum of traffic intensities
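
    The report-by-exception scheme described here is easy to sketch: a local controller forwards a sensor value only when it leaves a deadband around the last value it reported. The 2% deadband, the message format, and the sensor name below are assumptions, not the MFTF-B implementation.

```python
# Report-by-exception as described for the LCCs: a sensor value is forwarded
# to the supervisor only when it moves outside a deadband around the last
# reported value. The deadband and message dict are assumptions.
def make_reporter(deadband_frac: float = 0.02):
    last_reported = {}

    def poll(sensor_id: str, value: float):
        prev = last_reported.get(sensor_id)
        if prev is None or abs(value - prev) > deadband_frac * max(abs(prev), 1e-12):
            last_reported[sensor_id] = value
            return {"sensor": sensor_id, "value": value}   # message to SCDS
        return None                                        # suppressed

    return poll

poll = make_reporter()
for v in (4.20, 4.21, 4.23, 4.40, 4.41):
    msg = poll("LHE-LEVEL-3", v)
    print(v, "->", msg if msg else "no report")
```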

  8. Overview of the MFTF electrical systems

    International Nuclear Information System (INIS)

    Lindquist, W.B.; Eckard, R.D.; Holdsworth, T.; Mooney, L.J.; Moyer, D.R.; Peterson, R.L.; Shimer, D.W.; Wyman, R.H.; VanNess, H.W.

    1979-01-01

    The Mirror Fusion Test Facility, scheduled for completion in October 1981, will contain a complex, state-of-the-art array of electrical and electronics equipment valued at over $60 million. Three injector systems will be employed to initiate and sustain the MFTF deuterium plasma. A plasma streaming system and a start-up neutral-beam system will be used to establish a target plasma. A sustaining neutral-beam system will be used to fuel and sustain the MFTF plasma for 0.5 s. Additional power supply systems required on MFTF include two magnet power supplies with quench protection circuitry for powering the superconducting yin-yang magnet pair and eight 10-kHz power supplies for powering the Ti gettering system. Due to the complexity, physical size, and multiple systems of MFTF, a distributed, hierarchical computer control and instrumentation system will be used. Color-graphic, touch-panel control consoles will provide the man-machine interface. The MFTF will have the capability of conducting an experiment every five minutes

  9. MFTF-α + T progress report

    International Nuclear Information System (INIS)

    Nelson, W.D.

    1985-04-01

    Early in FY 1983, several upgrades of the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory (LLNL) were proposed to the fusion community. The one most favorably received was designated MFTF-α+T. The engineering design of this device, guided by LLNL, has been a principal activity of the Fusion Engineering Design Center during FY 1983. This interim progress report represents a snapshot of the device design, which was begun in FY 1983 and will continue for several years. The report is organized as a complete design description. Because it is an interim report, some parts are incomplete; they will be supplied as the design study proceeds. As described in this report, MFTF-α+T uses existing facilities, many MFTF-B components, and a number of innovations to improve on the physics parameters of MFTF-B. It burns deuterium-tritium and has a central-cell Q of 2, a wall loading Γ_n of 2 MW/m² (with a central-cell insert module), and an availability of 10%. The machine is fully shielded, allows hands-on maintenance of components outside the vacuum vessel 24 h after shutdown, and has provisions for repair of all operating components

  10. MFTF exception handling system

    International Nuclear Information System (INIS)

    Nowell, D.M.; Bridgeman, G.D.

    1979-01-01

    In the design of large experimental control systems, a major concern is ensuring that operators are quickly alerted to emergency or other exceptional conditions and that they are provided with sufficient information to respond adequately. This paper describes how the MFTF exception handling system satisfies these requirements. Conceptually, exceptions are divided into one of two classes: those which affect command status by producing an abort or suspend condition, and those which fall into a softer notification category of report-only or required operator acknowledgement. Additionally, an operator may choose to accept an exception condition as operational, or to turn off monitoring for sensors determined to be malfunctioning. Control panels and displays used in operator response to exceptions are described
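
    A compact way to picture the two exception classes is a severity enumeration with a dispatcher that also honors the operator overrides mentioned above (accepting a condition as operational, or masking a malfunctioning sensor). The names and return strings below are illustrative only, not the MFTF software.

```python
from enum import Enum, auto

# The two classes described in the abstract: exceptions that change command
# status (ABORT, SUSPEND) and softer notifications (ACKNOWLEDGE, REPORT).
class Severity(Enum):
    ABORT = auto()
    SUSPEND = auto()
    ACKNOWLEDGE = auto()
    REPORT = auto()

masked_sensors = set()          # sensors the operator has declared bad

def handle_exception(sensor, severity, accepted_as_operational=False):
    if sensor in masked_sensors or accepted_as_operational:
        return "ignored"
    if severity in (Severity.ABORT, Severity.SUSPEND):
        return f"command {severity.name.lower()}ed, operator alerted"
    if severity is Severity.ACKNOWLEDGE:
        return "displayed until operator acknowledges"
    return "logged to report"

masked_sensors.add("TC-099")    # operator turns off a malfunctioning sensor
print(handle_exception("TC-099", Severity.ABORT))     # -> ignored
print(handle_exception("SG-012", Severity.SUSPEND))   # -> command suspended
```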

  11. Fusion blanket testing in MFTF-α + T

    International Nuclear Information System (INIS)

    Kleefeldt, K.

    1985-01-01

    The Mirror Fusion Test Facility-α+T (MFTF-α+T) is an upgraded version of the current MFTF-B test facility at Lawrence Livermore National Laboratory, and is designed for near-term fusion-technology-integrated tests at a neutron flux of 2 MW/m². Currently, the fusion community is screening blanket and related issues to determine which ones can be addressed using MFTF-α+T. In this work, the minimum testing needs to address these issues are identified for the liquid-metal-cooled blanket and the solid-breeder blanket. Based on the testing needs and on the MFTF-α+T capability, a test plan is proposed for three options; each option covers a six- to seven-year testing phase. The options reflect the unresolved question of whether to place the research and development (R and D) emphasis on liquid-metal or solid-breeder blankets. In each case, most of the issues discussed can be addressed to a reasonable extent in MFTF-α+T

  12. MFTF-α+T progress report

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.D. (ed.)

    1985-04-01

    Early in FY 1983, several upgrades of the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory (LLNL) were proposed to the fusion community. The one most favorably received was designated MFTF-α+T. The engineering design of this device, guided by LLNL, has been a principal activity of the Fusion Engineering Design Center during FY 1983. This interim progress report represents a snapshot of the device design, which was begun in FY 1983 and will continue for several years. The report is organized as a complete design description. Because it is an interim report, some parts are incomplete; they will be supplied as the design study proceeds. As described in this report, MFTF-α+T uses existing facilities, many MFTF-B components, and a number of innovations to improve on the physics parameters of MFTF-B. It burns deuterium-tritium and has a central-cell Q of 2, a wall loading Γ_n of 2 MW/m² (with a central-cell insert module), and an availability of 10%. The machine is fully shielded, allows hands-on maintenance of components outside the vacuum vessel 24 h after shutdown, and has provisions for repair of all operating components.

  13. Start-up neutral-beam power supply system for MFTF

    International Nuclear Information System (INIS)

    Mooney, L.J.

    1979-01-01

    This paper describes some of the design features and considerations of the MFTF start-up neutral-beam power supplies. In particular, we emphasize features of the system that will ensure MFTF compatibility and achieve the required reliability/availability for the MFTF to be successful

  14. Thermal performance of the MFTF magnets

    International Nuclear Information System (INIS)

    VanSant, J.H.

    1983-01-01

    A yin-yang pair of liquid-helium (LHe) cooled superconducting magnets was tested last year at the Lawrence Livermore National Laboratory (LLNL) as part of a series of tests of the Mirror Fusion Test Facility (MFTF). These tests were performed to determine the success of the engineering design used in major systems of the MFTF and to provide a technical base for rescoping from a single-mirror facility to the large tandem-mirror configuration (MFTF-B) now under construction. The magnets were cooled, operated at their design current and magnetic field, and warmed back to ambient temperature. In this report, we describe their thermal behavior during these tests

  15. A spheromak ignition experiment reusing Mirror Fusion Test Facility (MFTF) equipment

    International Nuclear Information System (INIS)

    Fowler, T.K.

    1993-01-01

    Based on available experimental results and theory, a scenario is presented to achieve ohmic ignition in a spheromak by slow (∼10 s) helicity injection using power from the Mirror Fusion Test Facility (MFTF) substation. Some of the other parts needed (vacuum vessel, coils, power supplies, pumps, shielded building space) might also be obtained from MFTF or other salvage, as well as some components needed for intermediate experiments for additional verification of the concept (especially confinement scaling). The proposed ignition experiment would serve as a proof-of-principle for the spheromak DT fusion reactor design published by Hagenson and Krakowski, with a nuclear-island cost about ten times lower than that of a tokamak of comparable power. Designs at even higher power density and lower cost might be possible using Christofilos' concept of a liquid lithium blanket. Since all structures would be protected from neutrons by the lithium blanket and the tritium inventory can be reduced by continuous removal from the liquid blanket, environmental and safety characteristics appear to be favorable

  16. Magnetic shielding tests for MFTF-B neutral beamlines

    International Nuclear Information System (INIS)

    Kerns, J.; Fabyan, J.; Wood, R.; Koger, P.

    1983-01-01

    A test program to determine the effectiveness of various magnetic shielding designs for MFTF-B beamlines was established at Lawrence Livermore National Laboratory (LLNL). The proposed one-tenth-scale shielding-design models were tested in a uniform field produced by a Helmholtz coil pair. A similar technique was used for the MFTF source-injector assemblies, and the model test results were confirmed during the Technology Demonstration in 1982. The results of these shielding tests had a direct impact on the MFTF-B beamline design: the iron-core magnet and finger assembly originally proposed were replaced by a simple, air-core, race-track-coil bending magnet. Only the source injector needs to be magnetically shielded from the fields of approximately 400 gauss

  17. Design of the electromagnetic fluctuations diagnostic for MFTF-B

    International Nuclear Information System (INIS)

    House, P.A.; Goerz, D.A.; Martin, R.

    1983-01-01

    The Electromagnetic Fluctuations (EMF) diagnostic will be used to monitor ion fluctuations which could be unstable in MFTF-B. Each probe assembly includes a high-impedance electrostatic probe to measure potential fluctuations, and a group of nested, single-turn loops to measure magnetic fluctuations in three directions. Eventually, more probes and loops will be added to each probe assembly for making more detailed measurements. The sensors must lie physically close to the plasma edge and are radially positionable. Also, probes at separate axial locations can be positioned to connect along the same magnetic field line. These probes are similar in concept to the rf probes used on TMX, but the high thermal load for 30-second shots on MFTF-B requires a water-cooled design along with temperature monitors. Each signal channel has a bandwidth of 0.001 to 150 MHz and is monitored by up to four different data channels which obtain amplitude and frequency information. This paper describes the EMF diagnostic and presents the detailed mechanical and electrical designs

  18. MFTF-α+T end plug magnet design

    International Nuclear Information System (INIS)

    Srivastava, V.C.; O'Toole, J.A.

    1983-01-01

    The conceptual design of the end-plug magnets for MFTF-α+T is described. MFTF-α+T is a near-term upgrade of MFTF-B, which features new end plugs to improve performance. The Fusion Engineering Design Center has performed the engineering design of MFTF-α+T under the overall direction of Lawrence Livermore National Laboratory. Each end plug consists of two yin-yang pairs, each with an approximately 2.5:1 mirror ratio and an approximately 5-T peak field on axis; two transition coils; and a recircularizing solenoid. This paper describes the end-plug magnet system functional requirements and presents a conceptual design that meets them. The peak field at the windings of the end-plug coils is approximately 6 T. These coils are designed using the NbTi MFTF-B conductor and are cooled by a 4.2-K liquid helium bath. All the end-plug magnets are designed to operate in the cryostable mode with adequate quench protection for safety. Shielding requirements are stated and a summary of heat loads is provided. Field and force calculations are discussed. The field on axis is shown to meet the functional requirements. Force resultants are reported in terms of winding running loads, and resultant coil forces are also given. The magnet structural support is described. A trade study to determine the optimum end-cell coil internal nuclear shield thickness and the resulting coil size, based on minimizing the end-cell life-cycle cost, is summarized

  19. Field-reversal experiments in the mirror fusion test facility (MFTF)

    International Nuclear Information System (INIS)

    Shearer, J.W.; Condit, W.C.

    1977-01-01

    Detailed consideration of several aspects of a field-reversal experiment was begun in the Mirror Fusion Test Facility (MFTF): Model calculations have provided some plausible parameters for a field-reversed deuterium plasma in the MFTF, and a buildup calculation indicates that the MFTF neutral-beam system is marginally sufficient to achieve field reversal by neutral injection alone. However, the many uncertainties indicate the need for further research and development on alternate buildup methods. A discussion of experimental objectives is presented and important diagnostics are listed. The range of parameter space accessible with the MFTF magnet design is explored, and we find that with proper aiming of the neutral beams, meaningful experiments can be performed to advance toward these objectives. Finally, it is pointed out that if we achieve enhanced nτ confinement by means of field reversal, then quasi-steady-state operation of MFTF is conceivable

  20. Assessment of stability characteristics of MFTF coils

    International Nuclear Information System (INIS)

    1979-03-01

    Certain aspects of the MFTF (Mirror Fusion Test Facility) conductor performance were investigated. Recovery analysis of the MFTF conductor was studied using GA's stability code. The maximum length of uncooled, unsoldered composite core which can recover from a thermal excursion was determined analytically. A maximum credible mechanical disturbance in terms of energy deposition, conductor motion and length, and time duration, was postulated. 5 references, 4 figures

  1. MFTF-α + T shield design

    International Nuclear Information System (INIS)

    Gohar, Y.

    1985-01-01

    MFTF-α+T is a DT upgrade option of the Tandem Mirror Fusion Test Facility (MFTF-B) intended to study improved plasma performance and to test tritium-breeding blankets in an actual fusion reactor environment. The central-cell insert, designated the DT axicell, has a 2-MW/m² neutron wall loading at the first wall for blanket testing. This upgrade is completely shielded to protect the reactor components, the workers, and the general public from the radiation environment during operation and after shutdown. The shield design for this upgrade is the subject of this paper, including the design criteria and the trade-off studies to reduce the shield cost

  2. Low-level-signal data acquisition for the MFTF superconducting-magnet system

    International Nuclear Information System (INIS)

    Montoya, C.R.

    1981-01-01

    Acquisition of low level signals from sensors mounted on the superconducting yin-yang magnet in the Mirror Fusion Test Facility (MFTF) imposes very strict requirements on the magnet signal conditioning and data acquisition system. Of the various types of sensors required, thermocouples, strain gages, and voltage taps produce very low level outputs. These low level outputs must be accurately measured in the harsh environment of slowly varying magnetic fields, cryogenic temperatures, high vacuum, pulse power and 60 Hz electrical noise, possible neutron radiation, and high common mode voltage resulting from superconducting magnet quench. Successful measurements require careful attention to grounding, shielding, signal handling and processing in the data acquisition system. The magnet instrumentation system provides a means of effectively measuring both low level signals and high level signals from all types of sensors

  3. Axicell MFTF-B superconducting-magnet system

    International Nuclear Information System (INIS)

    Wang, S.T.; Bulmer, R.; Hanson, C.; Hinkle, R.; Kozman, T.; Shimer, D.; Tatro, R.; VanSant, J.; Wohlwend, J.

    1982-01-01

    The Axicell MFTF-B magnet system will provide the field environment necessary for tandem mirror plasma physics investigation with thermal barriers. The performance of the device will simulate DT energy break-even plasma conditions, although operation will be with deuterium only. There will be 24 superconducting coils consisting of 2 sets of yin-yang pairs, 14 central-cell solenoids, 2 sets of axicell mirror-coil pairs, and 2 transition coils between the axicell mirror-coil pairs and the yin-yang coils. This paper describes the progress in the design and construction of the MFTF-B superconducting-magnet system

  4. MFTF test coil construction and performance

    International Nuclear Information System (INIS)

    Cornish, D.N.; Zbasnik, J.P.; Leber, R.L.; Hirzel, D.G.; Johnston, J.E.; Rosdahl, A.R.

    1978-01-01

    A solenoid coil, 105 cm in inside diameter and 167 cm in outside diameter, has been constructed and tested to study the performance of the stabilized Nb-Ti conductor to be used in the Mirror Fusion Test Facility (MFTF) being built at Lawrence Livermore Laboratory. The insulation system of the test coil is identical to that envisioned for MFTF. Cold-weld joints were made in the conductor at the start and finish of each layer; heaters were fitted to some of these joints and also to the conductor at various locations in the winding. This paper gives details of the construction of the coil and the results of the tests carried out to determine its propagation and recovery characteristics

  5. Performance of the MFTF magnet cryogenic power leads

    International Nuclear Information System (INIS)

    VanSant, J.H.

    1983-01-01

    The cryogenic power lead system for the MFTF superconducting magnets has been acceptance tested and operated with the magnets. This system, which includes 5-m-long superconducting buses, 1.5-m-long vapor-cooled transition leads, external warm buses, and a cryostack, can conduct up to 6000 A (dc) and operate adiabatically for long periods. We present both design details and performance data; our MFTF version is an example of a reliable lead system for large superconducting magnets contained in a much larger vacuum vessel

  6. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and the tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near-minimum-probability-of-error decision algorithm
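
    A toy example of score-level fusion in the same spirit: each modality produces a similarity score, and a weighted combination is compared with a threshold. The weights, threshold, and accept/reject rule are invented here; the paper's near-minimum-probability-of-error decision algorithm is not specified in the abstract.

```python
# Toy score-level fusion: each modality returns a similarity in [0, 1] between
# the live sample and the enrolled template; scores are combined as a weighted
# average and compared with a threshold. All numbers are invented.
def fuse_and_verify(scores: dict, weights: dict, threshold: float = 0.6) -> bool:
    num = sum(weights[m] * scores[m] for m in scores)
    den = sum(weights[m] for m in scores)
    return num / den >= threshold

scores = {"hand": 0.82, "face": 0.64, "ear": 0.71, "voice": 0.55}
weights = {"hand": 1.0, "face": 1.5, "ear": 0.8, "voice": 1.2}
print("accept" if fuse_and_verify(scores, weights) else "reject")
```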

  7. 1000 kW ICRH amplifiers for MFTF-B

    International Nuclear Information System (INIS)

    Boksberger, U.

    1986-01-01

    For the startup of MFTF-B, ICRH heating will be applied. Two commercial amplifiers derived from standard broadcast transmitters each provide 1000 kW of RF power into a matching system for any VSWR up to 1.5. Emphasis is placed on the specific environment of magnetic fields and seismic loads, as well as on the particular RF power control requirements and remote operation. Also addressed is the amplifier's performance into a typical load. The load variations due to MFTF-B plasma coupling were calculated by TRW

  8. Design of the MFTF external vacuum system

    International Nuclear Information System (INIS)

    Holl, P.M.

    1979-01-01

    As a result of major experimental success in the LLL mirror program on start-up and stabilization of plasmas in minimum-B magnetic geometry, a Mirror Fusion Test Facility (MFTF) is under construction. Completion is scheduled for September 1981. MFTF will be used to bridge the gap between present-day small mirror experiments and future fusion-reactor activity based on magnetic mirrors. The focal point of the Mirror Fusion Test Facility is the 35-foot-diameter by 60-foot-long vacuum vessel which encloses the superconducting magnets. High-vacuum conditions in the vessel are required to establish and maintain a plasma, and to create and deliver energetic neutral atoms to heat the plasma in the central region

  9. Neutral-beam aiming and calorimetry for MFTF-B

    International Nuclear Information System (INIS)

    Goldner, A.I.; Margolies, D.

    1981-01-01

    The vessel for the Tandem Mirror Fusion Test Facility (MFTF-B) will have up to eleven 0.5-s-duration neutral-beam injectors for the initial heating of the MFTF-B plasma. Knowing the exact alignment of the beams and their total power is critical to the performance of the experiment. Using prototype aiming and calorimetry systems on the High Voltage Test Stand (HVTS) at Lawrence Livermore National Laboratory (LLNL), we hope to prove our ability to obtain an aiming accuracy of ±1 cm at the plasma and a calorimetric accuracy of ±5% of the actual total beam energy

  10. Low level signal data acquisition for the MFTF-B superconducting magnet system

    International Nuclear Information System (INIS)

    Montoya, C.R.

    1984-01-01

    Acquisition of low level signals from sensors mounted on the superconducting magnets in the Tandem Mirror Fusion Test Facility (MFTF-B) imposes very strict requirements on the magnet signal conditioning and data acquisition system. Of the various types of sensors required, thermocouples and strain gages produce very low level outputs. These low level outputs must be accurately measured in the harsh environment of slowly varying magnetic fields, cryogenic temperatures, high vacuum, 80-kV pulse power, electrical noise at 60 Hz, 17 MHz, and 28, 35, and 56 GHz, and possible neutron radiation. Successful measurements require careful attention to grounding, shielding, and signal handling and processing in the data acquisition system. The magnet instrumentation system provides a means of effectively measuring both low level signals and high level signals from all types of sensors. Various methods involved in the design and implementation of the system for signal conditioning and data gathering will be presented

  11. Liquid helium cooling of the MFTF superconducting magnets

    International Nuclear Information System (INIS)

    VanSant, J.H.; Zbasnik, J.P.

    1986-09-01

    During acceptance testing of the Mirror Fusion Test Facility (MFTF), we measured liquid-helium heat loads and flow rates in selected magnets. We used the data from these tests to estimate helium vapor quality in the magnets so that we could determine whether adequate conductor cooling conditions had occurred. We compared the measured quality and flow with estimates from a theoretical model developed for the MFTF magnets. The comparison is reasonably good, considering the influences that can greatly affect these values. This paper describes the methods employed in making the measurements and developing the theoretical estimates. It also describes the helium system that maintained the magnets at the required operating conditions
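
    For orientation, a textbook energy balance gives the flavor of such a vapor-quality estimate: for saturated liquid entering a cooling circuit, the exit quality is roughly the absorbed heat divided by the mass flow times the latent heat (about 20.7 kJ/kg for helium near 4.2 K). This simplification is not the theoretical model referenced in the paper, and the numbers below are illustrative.

```python
# Back-of-envelope exit vapor quality for a bath-cooled magnet circuit:
# saturated liquid helium enters, absorbs a heat load Q, and the quality at
# the outlet is roughly x = Q / (mdot * h_fg). This is a textbook
# simplification, not the LLNL model, and the inputs are invented.
H_FG_HELIUM = 20.7e3      # J/kg, latent heat of LHe near 4.2 K, 1 atm

def exit_quality(heat_load_w: float, mass_flow_kg_s: float) -> float:
    x = heat_load_w / (mass_flow_kg_s * H_FG_HELIUM)
    return min(max(x, 0.0), 1.0)

print(f"x = {exit_quality(heat_load_w=40.0, mass_flow_kg_s=0.010):.2f}")
```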

  12. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains in the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transformation of WVSNs from theoretical research to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module, and a power acquisition module to design a high-performance wireless vision sensor node whose core is an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform for WVSNs called AdvanWorks. Experiments show that AdvanWorks can successfully achieve the functions of image acquisition, coding, and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for follow-up applications of WVSNs.

  13. Control and diagnostic data structures for the MFTF

    International Nuclear Information System (INIS)

    Wade, J.A.; Choy, J.H.

    1979-01-01

    A Data Base Management System (DBMS) is being written as an integral part of the Supervisory Control and Diagnostics System (SCDS) of programs for control of the Mirror Fusion Test Facility (MFTF). The data upon which the DBMS operates consist of control values and evaluative information required for facilities control, along with control values and diagnostic data acquired as a result of each MFTF shot. The user interface to the DBMS essentially consists of two views: a computer program interface called the Program Level Interface (PLI) and a stand-alone interactive program called the Query Level Interface to support terminal-based queries. This paper deals specifically with the data structure capabilities from the viewpoint of the PLI user
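
    The abstract does not list the actual PLI routines, but a toy get/put interface to per-shot records conveys the idea of program-level access; the class, call names, and record layout below are invented for illustration.

```python
# A toy "program level interface": get/put access to per-shot records,
# backed here by an in-memory dict. Names and layout are hypothetical.
class ShotDatabase:
    def __init__(self):
        self._shots = {}        # shot number -> {item name: value}

    def put(self, shot: int, item: str, value) -> None:
        self._shots.setdefault(shot, {})[item] = value

    def get(self, shot: int, item: str):
        return self._shots[shot][item]

db = ShotDatabase()
db.put(1042, "yin_yang_current_A", 5775.0)
db.put(1042, "diamagnetic_loop_mV", [0.0, 1.3, 2.9])
print(db.get(1042, "yin_yang_current_A"))
```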

  14. MFTF magnet cryostability

    International Nuclear Information System (INIS)

    VanSant, J.H.

    1979-01-01

    A pair of large superconducting magnets will be installed in the Mirror Fusion Test Facility (MFTF), which is to begin operation in 1981. To ensure a stable superconducting state for the niobium-titanium (Nb-Ti) conductor, special consideration has been given to certain aspects of the magnet system design. These include the conductor, joints, coil assembly, vapor plenums, liquid-helium (LHe) supply system, and current leads. Heat transfer is the main consideration; i.e., the helium quality and temperature are limited so that the superconductor will perform satisfactorily in the magnet environment

  15. Man-machine interface for the MFTF

    International Nuclear Information System (INIS)

    Speckert, G.C.

    1979-01-01

    In any complex system, the interesting problems occur at the interface of dissimilar subsystems. Control of the Mirror Fusion Test Facility (MFTF) begins with the US Congress, which controls the dollars, which control the people, who control the nine top-level minicomputers, which control the 65 microprocessors, which control the hardware that controls the physics experiment. There are many interesting boundaries across which control must pass, and the one that this paper addresses is the man-machine one. For the MFTF, the man-machine interface consists of a system of seven control consoles, each allowing one operator to communicate with one minicomputer. These consoles are arranged in a hierarchical manner, and both hardware and software were designed in a top-down fashion. This paper describes the requirements and the design of the console system as a whole, as well as the design and operation of the hardware and software of each console, and examines the possible form of a future man-machine interface

  16. Man-machine interface for the MFTF

    Energy Technology Data Exchange (ETDEWEB)

    Speckert, G.C.

    1979-11-09

    In any complex system, the interesting problems occur at the interface of dissimilar subsystems. Control of the Mirror Fusion Test Facility (MFTF) begins with the US Congress, which controls the dollars, which control the people, who control the nine top-level minicomputers, which control the 65 microprocessors, which control the hardware that controls the physics experiment. There are many interesting boundaries across which control must pass, and the one that this paper addresses is the man-machine one. For the MFTF, the man-machine interface consists of a system of seven control consoles, each allowing one operator to communicate with one minicomputer. These consoles are arranged in a hierarchical manner, and both hardware and software were designed in a top-down fashion. This paper describes the requirements and the design of the console system as a whole, as well as the design and operation of the hardware and software of each console, and examines the possible form of a future man-machine interface.

  17. Progress on axicell MFTF-B superconducting magnet systems

    International Nuclear Information System (INIS)

    Wang, S.T.; Kozman, T.A.; Hanson, C.L.; Shimer, D.W.; VanSant, J.H.; Zbasnik, J.

    1983-01-01

    Since the entire Mirror Fusion Test Facility (MFTF-B) magnet system was reconfigured from the original A-cell to an axicell design, much progress has been made on the design, fabrication, and installation planning. The axicell MFTF-B magnet array consists of a total of 26 large superconducting main coils. This paper provides an engineering overview of the progress of these coils. Recent studies on the effects of field errors on the plasma at the recircularizing region (transition coils) show that small field errors will generate large displacements of the field lines. These field errors might enhance radial electron heat transport and deteriorate the plasma confinement. Therefore, 16 superconducting trim coils have been designed to correct the coil misalignments. Progress on the trim coils is also reported

  18. Mirror Fusion Test Facility (MFTF)

    International Nuclear Information System (INIS)

    Thomassen, K.I.

    1978-01-01

    A large, new Mirror Fusion Test Facility is under construction at LLL. Begun in FY78, it will be completed at the end of FY81 at a cost of $94.2M. This facility gives the mirror program the flexibility to explore mirror confinement principles at a significant scale and advances the technology of large reactor-like devices. The role of MFTF in the LLL program is described here

  19. Report on the engineering test of the LBL 30 second neutral beam source for the MFTF-B project

    International Nuclear Information System (INIS)

    Vella, M.C.; Pincosy, P.A.; Hauck, C.A.; Pyle, R.V.

    1984-08-01

    Positive-ion-based neutral beam development in the US has centered on the long pulse Advanced Positive Ion Source (APIS). APIS eventually focused on development of 30-second sources for MFTF-B. The Engineering Test was part of competitive testing of the LBL and ORNL long pulse sources carried out for the MFTF-B Project. The test consisted of 500 beam shots of 80-kV, 30-second deuterium beams, and was carried out on the Neutral Beam Engineering Test Facility (NBETF). This report summarizes the results of LBL testing, in which the LBL APIS demonstrated that it would meet the requirements for the MFTF-B 30-second sources. In part as a result of this test, the LBL design was found to be suitable as the baseline for a Common Long Pulse Source design for MFTF-B, TFTR, and Doublet Upgrade

  20. Physics basis for an axicell design for the end plugs of MFTF-B

    International Nuclear Information System (INIS)

    Baldwin, D.E.; Logan, B.G.

    1982-01-01

    The primary motivation for conversion of MFTF-B to an axicell configuration lies in its engineering promise as a reactor geometry based on circular high-magnetic-field coils. In comparing this configuration to the previous A-cell geometry, we find a number of differences that might significantly affect the physics performance. The purpose of the present document is to examine those features and to assess their impact on the performance of the axicell, as compared to the A-cell configuration, for MFTF-B. In so doing, we address only those issues thought to be affected by the change in geometry and refer to the original report Physics Basis for MFTF-B, for discussion of those issues thought not to be affected. In Sec. 1, we summarize these physics issues. In Sec. 2, we describe operating scenarios in the new configuration. In the Appendices, we discuss those physics issues that require more detailed treatment

  1. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  2. Quench Detection and Magnet Protection Study for MFTF. LLL final review

    International Nuclear Information System (INIS)

    1979-06-01

    The results of a Quench Detection and Magnet Protection Study for MFTF are summarized. The study was directed toward establishing requirements and guidelines for the electronic package used to protect the MFTF superconducting magnets. Two quench detection schemes were analyzed in detail, both of which require a programmable quench detector. Hardware and software recommendations for the quench detector were presented, as well as criteria for dumping the magnet energy in the event of a quench. Overall magnet protection requirements were outlined in a detailed Failure Mode, Effects, and Criticality Analysis (FMECA). Hardware and software packages compatible with the FMECA were recommended, with the hardware consisting of flexible, dedicated, intelligent modules specifically designed for magnet protection
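
    One widely used detection scheme (not necessarily either of the two analyzed in the study) subtracts the expected inductive voltage L*dI/dt from the measured coil voltage and trips the energy dump when the resistive residual persists above a threshold. The sketch below illustrates that idea with invented thresholds, validation time, and coil parameters.

```python
# Generic quench detection: subtract the expected inductive voltage L*dI/dt
# from the measured terminal voltage and trip when the resistive residual
# stays above a threshold for longer than a validation time. All numbers
# here are invented, not the MFTF recommendations.
def quench_detected(samples, inductance_h, v_thresh=0.1, hold_s=0.1, dt=0.01):
    """samples: list of (voltage_V, dI_dt_A_per_s) pairs taken every dt seconds."""
    over = 0.0
    for v_meas, didt in samples:
        v_resistive = v_meas - inductance_h * didt
        over = over + dt if abs(v_resistive) > v_thresh else 0.0
        if over >= hold_s:
            return True          # fire the energy-dump circuit
    return False

# Ramp at 10 A/s through a 2 H coil; a 0.3 V resistive component appears.
ramp = [(20.0, 10.0)] * 30 + [(20.3, 10.0)] * 30
print(quench_detected(ramp, inductance_h=2.0))   # True
```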

  3. Structural design considerations in the Mirror Fusion Test Facility (MFTF-B) vacuum vessel

    International Nuclear Information System (INIS)

    Vepa, K.; Sterbentz, W.H.

    1981-01-01

    In view of favorable results from the Tandem Mirror Experiment (TMX) also at LLNL, the MFTF project is now being rescoped into a large tandem mirror configuration (MFTF-B), which is the mainline approach to a mirror fusion reactor. This paper concerns itself with the structural aspects of the design of the vessel. The vessel and its intended functions are described. The major structural design issues, especially those influenced by the analysis, are described. The objectives of the finite element analysis and their realization are discussed at length

  4. Model approach for simulating the thermodynamic behavior of the MFTF cryogenic cooling systems - a status report

    International Nuclear Information System (INIS)

    Sutton, S.B.; Stein, W.; Reitter, T.A.; Hindmarsh, A.C.

    1983-01-01

    A numerical model for calculating the thermodynamic behavior of the MFTF-B cryogenic cooling system is described. Nine component types are discussed, with their governing equations given. The algorithm for solving the coupled set of algebraic and ordinary differential equations is described. Application of the model to the MFTF-B cryogenic cooling system has not yet been possible due to lack of funding
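
    The coupled algebraic/ODE structure mentioned above can be sketched generically: an ODE advances a lumped state while an algebraic closure is re-solved at every right-hand-side evaluation. The component equations and coefficients below are placeholders, not the MFTF-B model.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Placeholder algebraic closure: solve f(q, T) = 0 for an internal heat
# flow q at the current lumped temperature T (coefficients are invented).
def algebraic_closure(T):
    return brentq(lambda q: q - 5.0 * (300.0 - T) + 0.01 * q**2, 0.0, 1e4)

def rhs(t, y):
    T = y[0]
    q = algebraic_closure(T)           # algebraic equation coupled to the ODE
    C = 2.0e3                          # placeholder heat capacity, J/K
    return [q / C]

# Integrate the lumped temperature for 50 s from an 80 K initial condition.
sol = solve_ivp(rhs, (0.0, 50.0), [80.0], max_step=1.0)
print(f"T(50 s) = {sol.y[0, -1]:.1f} K")
```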

  5. Mirror Fusion Test Facility-B (MFTF-B) axicell configuration: NbTi magnet system. Design and analysis summary. Volume 1

    International Nuclear Information System (INIS)

    Heathman, J.H.; Wohlwend, J.W.

    1985-05-01

    This report summarizes the designs and analyses produced by General Dynamics Convair for the four Axicell magnets (A1 and A2, east and west), the four Transition magnets (T1 and T2, east and west), and the twelve Solenoid magnets (S1 through S6, east and west). Over four million drawings and specifications, in addition to detailed stress analysis, thermal analysis, electrical, instrumentation, and verification test reports, were produced as part of the MFTF-B design effort. Significant aspects of the designs, as well as key analysis results, are summarized in this report. In addition, drawing trees and lists of the detailed analysis and test reports included in this report define the locations of the detailed design and analysis data

  6. Mirror Fusion Test Facility-B (MFTF-B) axicell configuration: NbTi magnet system. Design and analysis summary. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Heathman, J.H.; Wohlwend, J.W.

    1985-05-01

    This report summarizes the designs and analyses produced by General Dynamics Convair for the four Axicell magnets (A1 and A2, east and west), the four Transition magnets (T1 and T2, east and west), and the twelve Solenoid magnets (S1 through S6, east and west). Over four million drawings and specifications, in addition to detailed stress analysis, thermal analysis, electrical, instrumentation, and verification test reports, were produced as part of the MFTF-B design effort. Significant aspects of the designs, as well as key analysis results, are summarized in this report. In addition, drawing trees and lists of the detailed analysis and test reports included in this report define the locations of the detailed design and analysis data.

  7. Data base management system for the MFTF

    International Nuclear Information System (INIS)

    Choy, J.H.; Wade, J.A.

    1979-01-01

    The data base management system (DBMS) for the Mirror Fusion Test Facility (MFTF) is described as relational in nature and distributed across the nine computers of the supervisory control and diagnostics system. This paper deals with a reentrant runtime package of routines that are used to access data items, the data structures to support the runtime package, and some of the utilities in support of the DBMS

  8. Structural analysis interpretation task for the magnet system for Mirror Fusion Test Facility (MFTF)

    International Nuclear Information System (INIS)

    Baldi, R.W.

    1979-11-01

    The primary objective of this study was to develop recommendations to improve and substantiate the structural integrity of the highly stressed small-radius region of the MFTF magnet. The specific approach is outlined: (1) Extract detailed stress/strain data from the General Dynamics Convair finite-element refinement analysis. (2) Diagram the local plate stress distribution and its relationship to the adjacent weldment. (3) Update the parametric fracture mechanics analysis using the most recent MFTF-related data developed by the National Bureau of Standards. (4) Review the sequence and assembly as modified by Chicago Bridge and Iron for adaptability to refinements. (5) Investigate the need for fillet radii on weldments to reduce stress concentrations at critical corners. (6) Review the quality assurance plan for adequacy to ensure structural quality in the small-radius region. (7) Review the instrumentation plan for adequacy of structural diagnostics in the small-radius region. (8) Participate in planning a small-scale fatigue test program of a typical MFTF weldment

  9. MFTF-B PACE tests and final cost report

    International Nuclear Information System (INIS)

    Krause, K.H.; Kozman, T.A.; Smith, J.L.; Horan, R.J.

    1986-10-01

    The Mirror Fusion Test Facility (MFTF-B) construction project was successfully completed in February 1986, with the conclusion of the Plant and Capital Equipment (PACE) Tests. This series of tests, starting in September 1985 and running through February 1986, demonstrated the overall machine capabilities and special facilities accomplishments for the Mirror Fusion Test Facility Project

  10. Mechanical considerations for MFTF-B plasma-diagnostic system

    International Nuclear Information System (INIS)

    Thomas, S.R. Jr.; Wells, C.W.

    1981-01-01

    The reconfiguration of MFTF to a tandem mirror machine with thermal barriers has caused a significant expansion in the physical scope of plasma diagnostics. From a mechanical perspective, it complicates the plasma access, system interfaces, growth and environmental considerations. Conceptual designs characterize the general scope of the design and fabrication which remains to be done

  11. Dynamic testing of MFTF containment-vessel structural system

    International Nuclear Information System (INIS)

    Weaver, H.J.; McCallen, D.B.; Eli, M.W.

    1982-01-01

    Dynamic (modal) testing was performed on the Mirror Fusion Test Facility (MFTF) containment vessel. The seismic design of this vessel was heavily dependent upon the value of structural damping used in the analysis. Typically, for welded steel vessels, a value of 2 to 3% of critical is used. However, due to the large mass of the vessel and the magnet supported inside, we felt that the interaction between the structure and its foundation would be enhanced. This would result in a larger value of damping, because vibrational energy in the structure would be transferred through the foundation into the surrounding soil. The dynamic test performed on this structure (with the magnet in place) confirmed this latter theory and resulted in damping values of approximately 4 to 5% for the whole-body modes. This report presents a brief description of dynamic testing, emphasizing the specific test procedure used on the MFTF-A system. It also presents an interpretation of the damping mechanisms observed (material and geometric) based upon the spatial characteristics of the modal parameters
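
    One standard way such modal damping values are extracted from a measured frequency-response peak is the half-power (3 dB) bandwidth method, zeta = (f2 - f1) / (2 * fn); whether this particular estimator was used on the MFTF vessel is not stated, and the numbers below are illustrative only.

```python
# Half-power (3 dB bandwidth) damping estimate, a standard modal-testing
# formula: zeta ~= (f2 - f1) / (2 * fn), where f1 and f2 are the frequencies
# at which the response falls to 1/sqrt(2) of the resonant peak. This is a
# generic illustration, not necessarily the method used on the MFTF vessel.
def damping_ratio(f_peak_hz: float, f_lo_hz: float, f_hi_hz: float) -> float:
    return (f_hi_hz - f_lo_hz) / (2.0 * f_peak_hz)

# Illustrative numbers: a 6.0 Hz whole-body mode with a 0.54 Hz 3 dB bandwidth.
zeta = damping_ratio(6.0, 5.73, 6.27)
print(f"zeta = {zeta:.3f} ({100 * zeta:.1f}% of critical)")
```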

  12. Review of MFTF yin-yang magnet displacement and magnetic field measurements and calculations

    International Nuclear Information System (INIS)

    Hanson, C.L.; Myall, J.O.; Wohlwend, J.W.

    1983-01-01

    During the recent testing of the MFTF yin-yang magnet, measurements of coil position, structural case strain, and magnetic field were made to verify calculated values. Measurements to detect magnet movement were taken throughout cooldown and during the operation of the magnet. The magnetic field at the mirror points was measured by Hall-effect probes. The magnet position, structural case strain, and magnetic field measurements indicated a reasonably close correlation with calculated values. Information obtained from the yin-yang test has been very useful in setting realistic mechanical alignment values for the new MFTF-B magnet system

  14. D-T axicell magnet system for MFTF-α+T

    International Nuclear Information System (INIS)

    Srivastava, V.C.

    1983-01-01

    The configuration and design of the deuterium-tritium (D-T) axicell superconducting magnets for the Mirror Fusion Test Facility (MFTF-α+T) are described. The MFTF-α+T is an upgrade of MFTF-B, with new end-plug magnets and a neutron-producing central D-T axicell section. The 4-m-long axicell - its length defined by the 12-T peaks in the mirror field - is beam-fueled and heated by two beam lines, each with four neutral-beam injection ports. Two large superconducting coils (mean diameter approximately 3.8 m) located at Z = ±2.40 m, in conjunction with a small copper coil located outside the test volume region, produce the 4.5-T mirror midplane field. This background field is augmented by two copper coils to create the 12-T peak mirror fields at Z = ±2 m. The central region of the axicell accommodates a 1-m-long, replaceable blanket test module. The length (4 m) of the axicell was chosen to provide relatively uniform neutron wall loading over the test module. In many respects, this axicell is less than full scale, but it could be viewed as a short section of a reactor, complete with the support systems and technologies associated with a mirror reactor. The peak field at the superconducting coils is 10.8 T. The coils employ a hybrid superconducting winding - Nb3Sn conductor in the 8- to 12-T region and NbTi in the 0- to 8-T region. The winding is cryostable and is cooled by a 4.2-K liquid helium bath. The conductor design, the winding design, and the performance analyses for these superconducting coils are described

  15. Options for axisymmetric operation of MFTF-B

    International Nuclear Information System (INIS)

    Fenstermacher, M.E.; Devoto, R.S.; Thomassen, K.I.

    1986-01-01

    The flexibility of MFTF-B for axisymmetric experiments has been investigated. Interchanging the axicell coils and increasing their separation results in an axisymmetric plug cell with 12:1 and 6:1 inner and outer mirror ratios, respectively. For axisymmetric operation, the sloshing-ion neutral beams, ECRH gyrotrons, and the pumping system would be moved to the axicell. Stabilization by E-rings could be explored in this configuration. With the addition of octopole magnets, off-axis multipole stabilization could also be tested. Operating points for octopole and E-ring-stabilized configurations with properties similar to those of the quadrupole MFTF-B, namely T_ic = 10-15 keV and n_c ≈ 3 × 10¹³ cm⁻³, have been obtained. Because of the negligible radial transport of central-cell ions, the required neutral-beam power in the central cell has been dramatically reduced. In addition, because MHD stabilization is achieved by off-axis hot electrons in both cases, a much lower barrier beta is possible, which aids in reducing the barrier ECRH power. Total ECRH power in the end cell is projected to be approximately 1 MW. Possible operating points for both octopole and E-ring configurations are described, along with the stability considerations involved

  16. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    Report AFRL-RV-PS-TR-2018-0008, Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Author: Norman Fitz-Coy; contract FA9453-15-1-0315.

  17. Ion trajectories of the MFTF unshielded 80-keV neutral-beam sources

    International Nuclear Information System (INIS)

    Ling, R.C.; Bulmer, R.H.; Cutler, T.A.; Foote, J.H.; Horvath, J.A.

    1978-01-01

    The trajectories of ions from the Mirror Fusion Test Facility (MFTF) 80-keV neutral-beam sources are calculated to obtain a preliminary understanding of the ion-beam paths and the magnitude of the power densities. This information will be needed for locating and designing thermal (kinetic-energy) absorbers for the ions. The calculations are made by employing a number of previously written computer codes. The TIBRO code is used to calculate the trajectories of the ions in the fringe magnetic field of the MFTF machine, which can operate with a center-field intensity of up to 2 T. The SAMPP code gives three-dimensional views of the ion beams for better visualization of the ion-beam paths. Also used are the codes MIG, XPICK, and MERGE, which were all previously written for manipulating data

  18. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes only a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application that overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and on our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle heart-rate changes due to sport, which should be the first approach to address the issue of sport in ECG verification.
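
    The general flavor of mean-interval matching can be sketched as follows: enroll a mean R-R interval per physiological state (e.g., rest and exercise) and accept a live recording whose mean falls within a tolerance of any enrolled state. The tolerance, the two states, and the matching rule are assumptions; they are not the published algorithm.

```python
# Toy version of mean-interval matching: enroll a mean R-R interval per state
# (rest, exercise) and accept if the live mean is within a tolerance of any
# enrolled state. Tolerance, states, and matching rule are assumptions.
def mean_rr(rr_intervals_ms):
    return sum(rr_intervals_ms) / len(rr_intervals_ms)

def verify(live_rr_ms, enrolled_means_ms, tol_ms=40.0) -> bool:
    live = mean_rr(live_rr_ms)
    return any(abs(live - m) <= tol_ms for m in enrolled_means_ms.values())

enrolled = {"rest": 820.0, "exercise": 540.0}       # ms per beat
print(verify([810, 835, 818, 826], enrolled))       # True (matches "rest")
print(verify([655, 662, 649, 660], enrolled))       # False
```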

  19. Testing of the MFTF magnets

    International Nuclear Information System (INIS)

    Kozman, T.A.; Chang, Y.; Dalder, E.N.C.

    1982-01-01

    This paper describes the cooldown and testing of the first yin-yang magnet for the Mirror Fusion Test Facility. The introduction describes the superconducting magnet; the rest of the paper explains the tests prior to and including magnet cooldown and final acceptance testing. The MFTF (originally MX) was proposed in 1976 and the project was funded for construction start in October 1977. Construction of the first large superconducting magnet set was completed in May 1981 and testing started shortly thereafter. The acceptance test procedures were reviewed in May 1981 and the cooldown and final acceptance test were done by the end of February 1982. During this acceptance testing the magnet achieved its full design current and field

  20. Central cell confinement in MFTF-B

    International Nuclear Information System (INIS)

    Jong, R.A.

    1981-01-01

    The point code TANDEM has been used to survey the range of plasma parameters which can be attained in MFTF-B. The code solves for the electron and ion densities and temperatures in the central-cell, yin-yang, barrier, and A-cell regions, as well as the plasma potential in each region. In these studies, the A-cell sloshing-ion beams were held fixed while the neutral beams in the yin-yang and central cell, the gas feed in the central cell, and the applied ECRH power were varied; the resulting β, central-cell ion density and temperature, and confining potential are discussed

  1. Axicell design for the end plugs of MFTF-B

    International Nuclear Information System (INIS)

    Thomassen, K.I.; Karpenko, V.N.

    1982-01-01

    Certain changes in the end-plug design in the Mirror Fusion Test Facility (MFTF-B) are described. The Laboratory (LLNL) proposes to implement these changes as soon as possible in order to construct the machine in an axicell configuration. The present physics and technology goals as well as the project cost and schedule will not be affected by these changes

  2. Supervisory control software for MFTF neutral beams

    International Nuclear Information System (INIS)

    Woodruff, J.P.

    1981-01-01

    We describe the software structures that control the operation of MFTF Sustaining Neutral Beam Power Supplies (SNBPS). These components of the Supervisory Control and Diagnostics System (SCDS) comprise ten distinct tasks that exist in the SCDS system environment. The codes total about 16,000 lines of commented Pascal code and occupy 240 kbytes of memory. The controls have been running since March 1981, and at this writing are being integrated to the Local Control System and to the power supply Pulse Power Module Controller

  3. User interface on networked workstations for MFTF plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Renbarger, V.L.; Balch, T.R.

    1985-01-01

    A network of Sun-2/170 workstations is used to provide an interface to the MFTF-B Plasma Diagnostics System at Lawrence Livermore National Laboratory. The Plasma Diagnostics System (PDS) is responsible for control of MFTF-B plasma diagnostic instrumentation. An EtherNet Local Area Network links the workstations to a central multiprocessing system which furnishes data processing, data storage and control services for PDS. These workstations permit a physicist to command data acquisition, data processing, instrument control, and display of results. The interface is implemented as a metaphorical desktop, which helps the operator form a mental model of how the system works. As on a real desktop, functions are provided by sheets of paper (windows on a CRT screen) called worksheets. The worksheets may be invoked by pop-up menus and may be manipulated with a mouse. These worksheets are actually tasks that communicate with other tasks running in the central computer system. By making entries in the appropriate worksheet, a physicist may specify data acquisition or processing, control a diagnostic, or view a result

  4. Physics conceptual design for the MFTF-B transition coil

    International Nuclear Information System (INIS)

    Baldwin, D.E.; Bulmer, R.H.

    1982-01-01

    The physics constraints relate to finite-β equilibria, β limits due to curvature-driven MHD modes, and ion transport in the central cell. These physics constraints had to be satisfied subject to certain non-physics constraints. Principal among these were the geometric and structural features of the existing MFTF-B magnet set and the required access for neutral beams for pumping. These constraints and their origins are discussed

  5. Directions for possible upgrades of the Mirror Fusion Test Facility (MFTF)

    International Nuclear Information System (INIS)

    Damm, C.C.; Coensgen, F.H.; Devoto, R.S.; Molvik, A.W.; Porter, G.D.; Shearer, J.W.; Stallard, B.W.

    1977-01-01

    The Mirror Fusion Test Facility (MFTF) may be upgraded by extending the time of plasma sustenance in an approach to steady-state operation and/or by increasing the neutral-beam injection energy. Some parameter bounds for these upgrades are discussed as they relate to a definition of the required neutral-beam development

  6. Local area network for the plasma diagnostics system of MFTF-B

    International Nuclear Information System (INIS)

    Lau, N.H.; Minor, E.G.

    1983-01-01

    The MFTF-B Plasma Diagnostics System will be implemented in stages, beginning with a start-up set of diagnostics and evolving toward a basic set. The start-up set contains 12 diagnostics which will acquire a total of about 800 Kbytes of data per machine pulse; the basic set contains 23 diagnostics which will acquire a total of about 8 Mbytes of data per pulse. Each diagnostic is controlled by a Foundation System consisting of a DEC LSI-11/23 microcomputer connected to CAMAC via a 5 Mbits/second serial fiber-optic link and connected to a supervisory computer (Perkin-Elmer 3250) via a 9600 baud RS232 link. The Foundation System is a building block used throughout MFTF-B for control and status monitoring. However, its 9600 baud link to the supervisor presents a bottleneck for the large data transfers required by diagnostics. To overcome this bottleneck the diagnostics Foundation Systems will be connected together with an additional LSI-11/23 called the master to form a Local Area Network (LAN) for data acquisition
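
    A back-of-the-envelope calculation (assuming roughly 10 bits per transmitted byte on an asynchronous RS232 link) shows why the 9600 baud path to the supervisor is a bottleneck for bulk diagnostic data:

      # Rough transfer times over the 9600 baud supervisory link.
      def transfer_time_s(nbytes, baud, bits_per_byte=10):
          return nbytes * bits_per_byte / baud

      for name, size in [("start-up set", 800 * 1024), ("basic set", 8 * 1024 * 1024)]:
          print(f"{name}: {transfer_time_s(size, 9600) / 60:.0f} min at 9600 baud")
      # Roughly 14 min and 146 min per pulse, respectively -- far too slow for a
      # five-minute shot cycle, hence the dedicated diagnostics LAN for data acquisition.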

  7. Changing MFTF vacuum environment

    International Nuclear Information System (INIS)

    Margolies, D.; Valby, L.

    1982-12-01

    The Mirror Fusion Test Facility (MFTF) vacuum vessel will be about 60m long and 10m in diameter at the widest point. The allowable operating densities range from 2 x 10^9 to 5 x 10^10 particles per cc. The maximum leak rate of 10^-6 tl/sec is dominated during operation by the deliberately injected cold gas of 250 tl/sec. This gas is pumped by over 1000 square meters of cryopanels, external sorption pumps and getters. The design and requirements have changed radically over the past several years, and they are still not in final form. The vacuum system design has also changed, but more slowly and less radically. This paper discusses the engineering effort necessary to meet these stringent and changing requirements. Much of the analysis of the internal systems has been carried out using a 3-D Monte Carlo computer code, which can estimate time dependent operational pressures. This code and its use will also be described

  8. Changing MFTF vacuum environment

    International Nuclear Information System (INIS)

    Margolies, D.; Valby, L.

    1982-01-01

    The Mirror Fusion Test Facility (MFTF) vacuum vessel will be about 60m long and 10m in diameter at the widest point. The allowable operating densities range from 2 x 10^9 to 5 x 10^10 particles per cc. The maximum leak rate of 10^-6 tl/sec is dominated during operation by the deliberately injected cold gas of 250 tl/sec. This gas is pumped by over 1000 square meters of cryopanels, external sorption pumps and getters. The design and requirements have changed radically over the past several years, and they are still not in final form. The vacuum system design has also changed, but more slowly and less radically. This paper discusses the engineering effort necessary to meet these stringent and changing requirements. Much of the analysis of the internal systems has been carried out using a 3-D Monte Carlo computer code, which can estimate time dependent operational pressures. This code and its use will also be described

  9. Drift orbits in the TMX and MFTF-B tandem mirrors

    International Nuclear Information System (INIS)

    Byers, J.A.

    1982-01-01

    Drift orbits for the TMX and MFTF-B tandem-mirror designs are followed by using a long-thin expansion of the drift equations. Unexpected asymmetries in the field-line curvatures in the yin-yang end-mirror traps, caused by the transition coils between the solenoid and the yin-yang, result in an elliptical distortion of the drift surface with a/b=1.5 at most, a perhaps tolerable deviation from omnigenity. Yushmanov-trapped particles are no worse than the bulk hot particles. Finite-beta plasma fields, coupled to the asymmetric curvature, produce sizeable banana orbits with widths comparable to the plasma radius, but these orbits are possible for only a few of the particles. Details of the transition through resonance in the solenoid are shown, including the banana shapes of the drift surfaces and the disruption of the surface in the stochastic regime. The orbits in the original design for the A-cell of MFTF-B are the most extreme; in the vacuum fields they all have an extended peanut shape that finally closes only at about 3 m. This shape is strongly non-omnigenous and suggests a hollow plasma-density profile. Finite-beta B×∇B drifts can help to minimize the radial extent of these orbits, but the strength of the vacuum curvatures makes omnigenity only marginally possible. Including B×∇φ drifts makes omnigenity even more unlikely for the ions, for which the B×∇B and B×∇φ drifts are of opposite sign, and conversely helps to omnigenize the drift surfaces of the ECRH 200-keV electrons. It is argued that not every class of particles can have good, i.e. near-omnigenous drifts, regardless of the ability of φ(r) to adjust to limit the radial extent of the orbits. This lack of omnigenity leaves one with no theoretical base for describing the MHD equilibrium in the original designs, but a new magnetic field design for the MFTF-B A-cell has apparently completely restored omnigenous orbits. (author)
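
    The sign argument for the two drift contributions follows from the standard guiding-centre expressions (quoted here as textbook forms for reference, not taken from the paper): the grad-B drift reverses sign with the charge q, while the electrostatic drift does not, so the two add for one species and partially cancel for the other.

      \mathbf{v}_{\nabla B} \;=\; \frac{m v_\perp^{2}}{2 q B^{3}}\,\mathbf{B}\times\nabla B,
      \qquad
      \mathbf{v}_{E} \;=\; \frac{\mathbf{E}\times\mathbf{B}}{B^{2}} \;=\; \frac{\mathbf{B}\times\nabla\phi}{B^{2}}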

  10. Manufacturing and quality assurance for the MFTF superconductor core

    International Nuclear Information System (INIS)

    Scanlan, R.M.; Johnston, J.E.; Waide, P.A.; Zeitlin, B.A.; Smith, G.B.; Nelson, C.T.

    1979-01-01

    A total of 55,000 m of multifilamentary Nb-Ti superconductor in minimum lengths of 380 m are required for the Mirror Fusion Test Facility. This conductor is a large cross-section monolith and, as such, has presented several new manufacturing challenges. In addition, a monolith requires more stringent quality assurance procedures than braids or cables. This paper describes the manufacturing steps and the quality assurance program which have been developed for the MFTF superconductor core

  11. Startup experience with the MFTF-B ECRH 100 kV dc power supply

    International Nuclear Information System (INIS)

    Bishop, S.R.; Goodman, R.A.; Wilson, J.H.

    1983-01-01

    One of the 24 Accel dc Power Supplies (ADCPS) originally intended for the Mirror Fusion Test Facility (MFTF-B) Neutral Beam Power Supply (NBPS) System has been converted to provide negative polarity output at 90 kV with a load current of 64 A dc. The load duty cycle is a pulse of 30-second duration with a pulse repetition period of five minutes. A new control system has been built which will serve as a prototype for the MFTF-B ADCPS controls, and a test setup was built which will be used to test the ADCPS. The Electron Cyclotron Resonance Heating (ECRH) dc Power Supply (DCPS) has been tested under both no-load and dummy-load conditions, under remote control, without notable problems. Test results indicate that the power supply should be reliable and safe to operate, and will meet the load duty requirements

  12. Startup experience with the MFTF-B ECRH 100 kV dc power supply

    International Nuclear Information System (INIS)

    Bishop, S.R.; Goodman, R.A.; Wilson, J.H.

    1983-01-01

    One of the 24 Accel DC Power Supplies (ADCPS) originally intended for the Mirror Fusion Test Facility (MFTF-B) Neutral Beam Power Supply (NBPS) System has been converted to provide negative polarity output at 90 kV with a load current of 64 A dc. The load duty cycle is a pulse of 30-second duration with a pulse repetition period of five minutes. A new control system has been built which will serve as a prototype for the MFTF-B ADCPS controls, and a test setup was built which will be used to test the ADCPS. The Electron Cyclotron Resonance Heating (ECRH) DC Power Supply (DCPS) has been tested under both no-load and dummy-load conditions, under remote control, without notable problems. Test results indicate that the power supply should be reliable and safe to operate, and will meet the load duty requirements

  13. Maintenance and availability considerations for MFTF-B upgrade

    International Nuclear Information System (INIS)

    Spampinato, P.T.

    1983-01-01

    The upgrade of the Mirror Fusion Test Facility (MFTF-B) tandem mirror device incorporates the operation of advanced systems plus the requirement for remote maintenance. To determine if the operating availability goal of this device is achievable, an assessment of component lifetimes was made, along with estimates of device downtime. Key subsystem components were considered from the magnet, heating, impurity control, pumping, and test module systems. Component replacements were grouped into three categories, and a lifetime operating plan, including component replacements, was developed. It was determined that this device could achieve a 10% operating availability

  14. Electrical supply for MFTF-B superconducting magnet system

    International Nuclear Information System (INIS)

    Shimer, D.W.; Owen, E.W.

    1985-01-01

    The MFTF-B magnet system consists of 42 superconducting magnets which must operate continuously for long periods of time. The magnet power supply system is designed to meet the operational requirements of accuracy, flexibility, and reliability. The superconducting magnets require a protection system to protect against critical magnet faults of quench, current lead overtemperature, and overcurrent. The protection system is complex because of the large number of magnets, the strong coupling between magnets, and the high reliability requirement. This paper describes the power circuits and the components used in the design

  15. Display-management system for MFTF

    International Nuclear Information System (INIS)

    Nelson, D.O.

    1981-01-01

    The Mirror Fusion Test Facility (MFTF) is controlled by 65 local control microcomputers which are supervised by a local network of nine 32-bit minicomputers. Associated with seven of the nine computers are state-of-the-art graphics devices, each with extensive local processing capability. These devices provide the means for an operator to interact with the control software running on the minicomputers. It is critical that the information the operator views accurately reflects the current state of the experiment. This information is integrated into dynamically changing pictures called displays. The primary organizational component of the display system is the software-addressable segment. The segments created by the display creation software are managed by display managers associated with each graphics device. Each display manager uses sophisticated storage management mechanisms to keep the proper segments resident in the local graphics device storage
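
    The segment-residency idea can be sketched as follows (class and method names are hypothetical, not the MFTF display software): a display manager keeps the most recently used segments resident in the limited graphics-device storage and evicts the least recently used ones when space runs out.

      from collections import OrderedDict

      class DisplayManager:
          # Keeps software-addressable segments resident in limited device storage (LRU sketch).
          def __init__(self, capacity_bytes):
              self.capacity, self.used = capacity_bytes, 0
              self.resident = OrderedDict()                     # segment_id -> size in bytes

          def touch(self, segment_id, size):
              # Mark a segment as needed by the current display, loading and evicting as required.
              if segment_id in self.resident:
                  self.resident.move_to_end(segment_id)
                  return
              while self.used + size > self.capacity and self.resident:
                  _, freed = self.resident.popitem(last=False)  # evict least recently used segment
                  self.used -= freed
              self.resident[segment_id] = size
              self.used += size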

  16. Seismic analysis of the MFTF facility

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Tiong, L.W.; Mraz, M.J.

    1985-01-01

    Seismic analyses were performed on the Mirror Fusion Test Facility (MFTF-B) located at the Lawrence Livermore National Laboratory, Livermore, CA. The three major structures studied were the vacuum vessel, the concrete shielding vault, and the steel frame enclosure building. The analyses performed on these structures ranged from fixed-base response spectrum analyses to soil-structure interaction analyses including the effects of structure-to-structure interaction and foundation flexibility. The results of these studies showed that the presence of the vault significantly affects the response of the vessel; that modeling the flexibility of the vault footing is important when studying forces near the base of the wall; and that the vault had very little effect on the building response. (orig.)

  17. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique for the verification of people trajectories

  18. Design features of the solenoid magnets for the central cell of the MFTF-B

    International Nuclear Information System (INIS)

    Wohlwend, J.W.; Tatro, R.E.; Ring, D.S.

    1981-01-01

    The 14 superconducting solenoid magnets which form the central cell of the MFTF-B are being designed and fabricated by General Dynamics for the Lawrence Livermore National Laboratory. Each solenoid coil has a mean diameter of five meters and contains 600 turns of a proven conductor type. Structural loading resulting from credible fault events, cooldown and warmup requirements, and manufacturing processes consistent with other MFTF-B magnets have been considered in the selection of 304 LN as the structural material for the magnet. The solenoid magnets are connected by 24 intercoil beams and 20 solid struts which resist the longitudinal seismic and electromagnetic attractive forces and by 24 hanger/side supports which react magnet dead weight and seismic loads. A modular arrangement of two solenoid coils within a vacuum vessel segment allows for sequential checkout and installation

  19. Design of a magnetic field alignment diagnostic for the MFTF-B magnet system

    International Nuclear Information System (INIS)

    Deadrick, F.J.; House, P.A.; Frye, R.W.

    1985-01-01

    Magnet alignment in tandem mirror fusion machines plays a crucial role in achieving and maintaining plasma confinement. Various visual alignment tools have been described by Post et al. to align the Tara magnet system. We have designed and installed a remotely operated magnetic field alignment (MFA) diagnostic system as a part of the Mirror Fusion Test Facility (MFTF-B). It measures critical magnetic field alignment parameters of the MFTF-B coil set while under full-field operating conditions. The MFA diagnostic employs a pair of low-energy, electron beam guns on a remotely positionable probe to trace and map selected magnetic field lines. An array of precision electrical detector paddles locates the position of the electron beam, and thus the magnetic field line, at several critical points. The measurements provide a means to compute proper compensating currents to correct for mechanical misalignments of the magnets with auxiliary trim coils if necessary. This paper describes both the mechanical and electrical design of the MFA diagnostic hardware

  20. Use of spreadsheets for interactive control of MFTF-B plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Goldner, A.L.; Kobayashi, A.

    1986-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory has a variety of highly individualized plasma diagnostic instruments attached to the experiment. These instruments are controlled through graphics workstations networked to a central computer system. A distributed spreadsheet-like program runs in both the graphics workstations and in the central computer system. An interface very similar to a commercial spreadsheet program is presented to the user at a workstation. In a commercial spreadsheet program, the user may attach mathematical calculation functions to spreadsheet cells. At MFTF-B, hardware control functions, hardware monitoring functions, and communications functions, as well as mathematical functions, may be attached to cells. Both the user and feedback from instrument hardware may make entries in spreadsheet cells; any entry in a spreadsheet cell may cause reevaluation of the cell's associated functions. The spreadsheet approach makes the addition of a new instrument a matter of designing one or more spreadsheet tables with associated meta-language-defined control and communication function strings. This paper describes the details of the spreadsheets and the implementation experience
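
    The cell-with-attached-functions mechanism can be sketched as a small observer pattern (names and the digitizer example are hypothetical, not the MFTF-B code): any entry in a cell, whether made by the user or by instrument hardware feedback, triggers reevaluation of the functions attached to that cell.

      class Cell:
          # A spreadsheet cell whose attached functions run whenever its value changes.
          def __init__(self, name):
              self.name, self.value, self.functions = name, None, []

          def attach(self, fn):
              # fn may be a mathematical, hardware-control, monitoring or communication function.
              self.functions.append(fn)

          def set(self, value, source="user"):
              # Entries may come from the user or from instrument hardware feedback.
              self.value = value
              for fn in self.functions:
                  fn(self, source)

      # Hypothetical use: re-arm a digitizer whenever the gate_width cell is edited.
      gate = Cell("gate_width")
      gate.attach(lambda cell, src: print(f"set digitizer gate to {cell.value} us ({src})"))
      gate.set(2.5)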

  1. A user interface on networked workstations for MFTF-B plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Balch, T.R.; Renbarger, V.L.

    1986-01-01

    A network of Sun-2/170 workstations is used to provide an interface to the MFTF-B Plasma Diagnostics System at Lawrence Livermore National Laboratory. The Plasma Diagnostics System (PDS) is responsible for control of MFTF-B plasma diagnostic instrumentation. An EtherNet Local Area Network links the workstations to a central multiprocessing system which furnishes data processing, data storage and control services for PDS. These workstations permit a physicist to command data acquisition, data processing, instrument control, and display of results. The interface is implemented as a metaphorical desktop, which helps the operator form a mental model of how the system works. As on a real desktop, functions are provided by sheets of paper (windows on a CRT screen) called worksheets. The worksheets may be invoked by pop-up menus and may be manipulated with a mouse. These worksheets are actually tasks that communicate with other tasks running in the central computer system. By making entries in the appropriate worksheet, a physicist may specify data acquisition or processing, control a diagnostic, or view a result

  2. Use of spreadsheets for interactive control of MFTF-B plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Goldner, A.; Kobayashi, A.

    1985-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory has a variety of highly individualized plasma diagnostic instruments attached to the experiment. These instruments are controlled through graphics workstations networked to a central computer system. A distributed spreadsheet-like program runs in both the graphics workstations and in the central computer system. An interface very similar to a commercial spreadsheet program is presented to the user at a workstation. In a commercial spreadsheet program, the user may attach mathematical calculation functions to spreadsheet cells. At MFTF-B, hardware control functions, hardware monitoring functions, and communications functions, as well as mathematical functions, may be attached to cells. Both the user and feedback from instrument hardware may make entries in spreadsheet cells; any entry in a spreadsheet cell may cause reevaluation of the cell's associated functions. The spreadsheet approach makes the addition of a new instrument a matter of designing one or more spreadsheet tables with associated meta-language-defined control and communication function strings. We report here details of our spreadsheets and our implementation experience

  3. Alternatives for contaminant control during MFTF plasma buildup

    International Nuclear Information System (INIS)

    Khan, J.M.; Valby, L.E.

    1979-01-01

    The MFTF mirror device considers all low-energy species to be contaminants, since their primary effect is to erode the plasma boundary by charge-exchange reactions. Confinement for species other than hydrogen isotopes is far from complete, and the confinement time is hardly more than the transit time from the source to the end wall. The brevity of the confinement time makes it all the more necessary to prevent any contamination which might further reduce it. At Livermore, the historical solution to contaminant control has been to evaporate titanium onto cold surfaces. An alternative to this approach and its implications are considered

  4. New kind of user interface for controlling MFTF diagnostics

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Saroyan, R.A.; Mead, J.E.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF) at Lawrence Livermore National Laboratory is faced with the problem of controlling a multitude of plasma diagnostics instruments from a central, multiprocessor computer facility. A 16-bit microprocessor-based workstation allows each physicist entree into the central multiprocessor, which consists of nine Perkin-Elmer 32-bit minicomputers. The workstation provides the user interface to the larger system, with display graphics, windowing, and a physics notebook. Controlling a diagnostic is now equivalent to making entries into a traditional physics notebook

  5. MFTF 230 kV pulsed power substation

    International Nuclear Information System (INIS)

    Wilson, J.H.

    1979-01-01

    The Mirror Fusion Test Facility (MFTF) currently under construction at the Lawrence Livermore Laboratory includes a Sustaining Neutral Beam Power Supply System (SNBPSS) consisting of 24 power-supply sets. The System will operate in long pulses (initially 0.5 seconds and eventually 30 seconds) at high power (200 MW), which will necessitate a large source of ac power. To meet this requirement, a new 230-kV substation is also being built at LLL. The constraints of cost, equipment protection, short operating lifetime (10 years), and reliability dictated a unique substation design. Its unusual features include provisions for fast fault detection and tripping, a capability for limiting ground fault current, low impedance, and economical design

  6. Confirmatory analysis and detail design of the magnet system for mirror fusion test facility (MFTF)

    International Nuclear Information System (INIS)

    Tatro, R.E.; Baldi, R.W.

    1978-10-01

    This summary covers the six individual reports delivered to the LLL MFTF program staff. They are: (1) literature survey (helium heat transfer), (2) thermodynamic analysis, (3) structural analysis, (4) manufacturing/producibility study, (5) instrumentation plan and (6) quality assurance report

  7. Design and fabrication of the superconducting-magnet system for the Mirror Fusion Test Facility (MFTF-B)

    International Nuclear Information System (INIS)

    Tatro, R.E.; Wohlwend, J.W.; Kozman, T.A.

    1982-01-01

    The superconducting magnet system for the Mirror Fusion Test Facility (MFTF-B) consists of 24 magnets; i.e. two pairs of C-shaped Yin-Yang coils, four C-shaped transition coils, four solenoidal axicell coils, and a 12-solenoid central cell. General Dynamics Convair Division has designed all the coils and is responsible for fabricating 20 coils. The two Yin-Yang pairs (four coils) are being fabricated by the Lawrence Livermore National Laboratory. Since MFTF-B is not a magnet development program, but rather a major physics experiment critical to the mirror fusion program, the basic philosophy has been to use proven materials and analytical techniques wherever possible. The transition and axicell coils are currently being analyzed and designed, while fabrication is under way on the solenoid magnets

  8. The local area network for the plasma Diagnostics System of MFTF-B

    International Nuclear Information System (INIS)

    Lau, N.H.; Minor, E.G.

    1983-01-01

    The MFTF-B Plasma Diagnostics System will be implemented in stages, beginning with a start-up set of diagnostics and evolving toward a basic set. The start-up set contains 12 diagnostics which will acquire a total of about 800 Kbytes of data per machine pulse; the basic set contains 23 diagnostics which will acquire a total of about 8 Mbytes of data per pulse. Each diagnostic is controlled by a ''Foundation System'' consisting of a DEC LSI-11/23 microcomputer connected to CAMAC via a 5 Mbits/second serial fiber-optic link and connected to a supervisory computer (Perkin-Elmer 3250) via a 9600 baud RS232 link. The Foundation System is a building block used throughout MFTF-B for control and status monitoring. However, its 9600 baud link to the supervisor presents a bottleneck for the large data transfers required by diagnostics. To overcome this bottleneck the diagnostics Foundation Systems will be connected together with an additional LSI-11/23 called the ''master'' to form a Local Area Network (LAN) for data acquisition. The Diagnostics LAN has a ring architecture with token passing arbitration

  9. Computer language evaluation for MFTF SCDS

    International Nuclear Information System (INIS)

    Anderson, R.E.; McGoldrick, P.R.; Wyman, R.H.

    1979-01-01

    The computer languages available for the systems and application implementation on the Supervisory Control and Diagnostics System (SCDS) for the Mirror Fusion Test Facility (MFTF) were surveyed and evaluated. Four language processors, CAL (Common Assembly Language), Extended FORTRAN, CORAL 66, and Sequential Pascal (SPASCAL, a subset of Concurrent Pascal [CPASCAL]), are commercially available for the Interdata 7/32 and 8/32 computers that constitute the SCDS. Of these, the Sequential Pascal available from Kansas State University appears best for the job in terms of minimizing the implementation time, debugging time, and maintenance time. This improvement in programming productivity is due to the availability of a high-level, block-structured language that includes many compile-time and run-time checks to detect errors. In addition, the advanced data types in the language allow easy description of the program variables. 1 table

  10. Report on the experience with the Supervisory Control and Diagnostics System (SCDS) of MFTF-B

    International Nuclear Information System (INIS)

    Wyman, R.H.

    1983-01-01

    The Supervisory Control and Diagnostics System (SCDS) of MFTF is a multiprocessor computer system using graphics oriented displays with touch sensitive panels as the primary operator interface. Late in the calendar year 1981 the system was used to control an integrated test of the vacuum vessel, vacuum system, cryogenics system and the superconducting magnet of MFTF. Since the completion of those tests and starting in early calendar 1983 the system has been used for control of the neutral beam test facility at LLNL. This paper presents a short overview of SCDS for the purpose of orientation and then proceeds to describe the difficulties encountered in these preliminary encounters with reality. The band-aids used to hold things together as disaster threatened as well as the long-term solutions to the problems will be discussed. Finally, we will present some comments on system costs and management philosophy

  11. Computer circuit analysis of induced currents in the MFTF-B magnet system

    International Nuclear Information System (INIS)

    Magnuson, G.D.; Woods, E.L.

    1981-01-01

    An analysis was made of the induced current behavior of the MFTF-B magnet system. Although the magnet system consists of 22 coils, because of its symmetry we considered only 11 coils in the analysis. Various combinations of the coils were dumped either singly or in groups, with the current behavior in all magnets calculated as a function of time after initiation of the dump

  12. A new kind of user interface for controlling MFTF diagnostics

    International Nuclear Information System (INIS)

    Preckshot, G.; Mead, J.; Saroyan, R.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF) at Lawrence Livermore National Laboratory is faced with the problem of controlling a multitude of plasma diagnostics instruments from a central, multiprocessor computer facility. A 16-bit microprocessor-based workstation allows each physicist entree into the central multiprocessor, which consists of nine Perkin-Elmer 32-bit minicomputers. The workstation provides the user interface to the larger system, with display graphics, windowing, and a physics notebook. Controlling a diagnostic is now equivalent to making entries into a traditional physics notebook

  13. ON-LINE MONITORING OF I&C TRANSMITTERS AND SENSORS FOR CALIBRATION VERIFICATION AND RESPONSE TIME TESTING WAS SUCCESSFULLY IMPLEMENTED AT ATR

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent; Hashemian, H. M.

    2017-03-01

    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring techniques at ATR, the calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off; they cover more portions of the system and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources toward equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with the equipment reliability programs of ATR, the integration of OLM will also support the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
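
    One common on-line monitoring technique (offered as a generic illustration, not necessarily the method implemented at ATR) compares redundant channels against their ensemble average and flags any channel that drifts outside an acceptance band; the channel names and band value below are invented.

      def drift_check(readings, band):
          # Flag channels deviating from the redundant-channel average by more than `band`.
          avg = sum(readings.values()) / len(readings)
          return {ch: round(val - avg, 3) for ch, val in readings.items() if abs(val - avg) > band}

      # Hypothetical redundant pressure channels (same units as `band`):
      print(drift_check({"PT-101A": 2.01, "PT-101B": 2.03, "PT-101C": 2.19}, band=0.10))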

  14. Thermal control for the MFTF magnet

    International Nuclear Information System (INIS)

    Vansant, J.H.; Russ, R.M.

    1980-01-01

    The external dimensions of the Yin-Yang magnet of the Mirror Fusion Test Facility will be 7.8 by 8.5 by 8.5 m, and it will weigh approximately 300 tons. More than 8000 liters of circulating liquid helium will be required to maintain the nearly 50 km of superconductor at below 5.0 K while the latter carries almost 6000 A in a magnetic field of up to nearly 7.7 T. This paper describes several features of the thermal control plans for the Yin-Yang: (1) the proposed cooldown and warmup schedules for the MFTF and the procedure for regenerating external cooling surfaces; (2) the design of an external quench resistor based on an estimate of the superconductor's maximum temperature; and (3) the use of a computer model of liquid helium circulation in choosing pipe size for the liquid helium lines

  15. MFTF plasma diagnostics data acquisition system

    International Nuclear Information System (INIS)

    Davis, G.E.; Coffield, F.E.

    1979-01-01

    The initial goal of the Data Acquisition System (DAS) is to control 11 instruments chosen as the startup diagnostic set and to collect, process, and display the data that these instruments produce. These instruments are described in a paper by Stan Thomas, et. al. entitled ''MFTF Plasma Diagnostics System.'' The DAS must be modular and flexible enough to allow upgrades in the quantity of data taken by an instrument, and also to allow new instruments to be added to the system. This is particularly necessary to support a research project where needs and requirements may change rapidly as a result of experimental findings. Typically, the startup configuration of the diagnostic instruments will contain only a fraction of the planned detectors, and produce approximately one half the data that the expanded version is designed to generate. Expansion of the system will occur in fiscal year 1982

  16. Improvement in MFTF data base system response times

    International Nuclear Information System (INIS)

    Lang, N.C.; Nelson, B.C.

    1983-01-01

    The Supervisory Control and Diagnostic System for the Mirror Fusion Test Facility (MFTF) has been designed as an event driven system. To this end we have designed a data base notification facility in which a task can request that it be loaded and started whenever an element in the data base is changed beyond some user defined range. Our initial implementation of the notify facility exhibited marginal response times whenever a data base table with a large number of outstanding notifies was written into. In this paper we discuss the sources of the slow response and describe in detail a new structure for the list of notifies which minimizes search time resulting in significantly faster response
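
    The flavor of the faster structure can be sketched as follows (an illustration of the indexing idea only; the actual SCDS data structure is not described here): by keying outstanding notifies on (table, element) instead of scanning one flat list per table, a write examines only the notifies registered against the element it changed.

      from collections import defaultdict

      class NotifyRegistry:
          # Outstanding notifies indexed by (table, element) so a write checks only its own entry.
          def __init__(self):
              self.by_element = defaultdict(list)      # (table, element) -> [(low, high, task), ...]

          def register(self, table, element, low, high, task):
              self.by_element[(table, element)].append((low, high, task))

          def on_write(self, table, element, value):
              # Return the tasks to load and start because the value left their user-defined range.
              return [task for low, high, task in self.by_element.get((table, element), [])
                      if not (low <= value <= high)]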

  17. Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor.

    Science.gov (United States)

    Zin, Hafiz M; Harris, Emma J; Osmond, John P F; Allinson, Nigel M; Evans, Philip M

    2013-05-21

    This work investigates the feasibility of using a prototype complementary metal oxide semiconductor active pixel sensor (CMOS APS) for real-time verification of volumetric modulated arc therapy (VMAT) treatment. The prototype CMOS APS used region-of-interest readout on the chip to allow fast imaging of up to 403.6 frames per second (f/s). The sensor was made larger (5.4 cm × 5.4 cm) using recent advances in photolithographic technique but retains fast imaging speed with the sensor's regional readout. There is a paradigm shift in radiotherapy treatment verification with the advent of advanced treatment techniques such as VMAT. This work has demonstrated that the APS can track multi-leaf collimator (MLC) leaves moving at 18 mm/s with an automatic edge-tracking algorithm at an accuracy better than 1.0 mm even at the fastest imaging speed. The measured fluence distribution for an example VMAT delivery sampled at 50.4 f/s was shown to agree well with the planned fluence distribution, with an average gamma pass rate of 96% at 3%/3 mm. The MLC leaf motion and linac pulse rate variation throughout the VMAT delivery can also be measured. The results demonstrate the potential of CMOS APS technology as a real-time radiotherapy dosimeter for delivery of complex treatments such as VMAT.
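
    One simple way to locate a leaf edge in an imager profile (a generic sketch, not necessarily the authors' automatic edge-tracking algorithm) is to find the 50%-of-maximum crossing with linear sub-pixel interpolation:

      import numpy as np

      def leaf_edge_position(profile, pixel_pitch_mm):
          # Leaf edge taken as the 50%-of-maximum crossing along a 1-D profile across the leaf.
          profile = np.asarray(profile, dtype=float)
          half = 0.5 * profile.max()
          idx = int(np.argmax(profile >= half))        # first pixel at or above half maximum
          if idx == 0:
              return 0.0
          x0, x1 = profile[idx - 1], profile[idx]
          frac = (half - x0) / (x1 - x0)               # interpolate between the two pixels
          return (idx - 1 + frac) * pixel_pitch_mm

      print(leaf_edge_position([0.0, 0.1, 0.2, 0.6, 1.0, 1.0], pixel_pitch_mm=0.5))  # ~1.4 mm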

  18. Plasma modeling of MFTF-B and the sensitivity to vacuum conditions

    International Nuclear Information System (INIS)

    Porter, G.D.; Rensink, M.

    1984-01-01

    The Mirror Fusion Test Facility (MFTF-B) is a large tandem mirror device currently under construction at Lawrence Livermore National Laboratory. The completed facility will consist of a large variety of components. Specifically, the vacuum vessel that houses the magnetic coils is basically a cylindrical vessel 60 m long and 11 m in diameter. The magnetics system consists of some 28 superconducting coils, each of which is located within the main vacuum vessel. Twenty of these coils are relatively simple solenoidal coils, but the remaining eight are of a more complicated design to provide an octupole component to certain regions of the magnetic field. The vacuum system is composed of a rough vacuum chain, used to evacuate the vessel from atmospheric pressure, and a high vacuum system, used to maintain good vacuum conditions during a plasma shot. High vacuum pumping is accomplished primarily by cryogenic panels cooled to 4.5 K. The MFTF-B coil set is shown together with typical axial profiles of magnetic field (a), electrostatic potential (b), and plasma density (c). The plasma is divided into nine regions axially, as labelled on the coil set in Figure 1. The central cell, which is completely azimuthally symmetric, contains a large volume plasma that is confined by a combination of the magnetic fields and the electrostatic potentials in the yin-yang cell

  19. Overview of the data acquisition and control system for plasma diagnostics on MFTF-B

    International Nuclear Information System (INIS)

    Wyman, R.H.; Deadrick, F.J.; Lau, N.H.; Nelson, B.C.; Preckshot, G.G.; Throop, A.L.

    1983-01-01

    For MFTF-B, the plasma diagnostics system is expected to grow from a collection of 12 types of diagnostic instruments, initially producing about 1 Megabyte of data per shot, to an expanded set of 22 diagnostics producing about 8 Megabytes of data per shot. To control these diagnostics and acquire and process the data, a system design has been developed which uses an architecture similar to the supervisory/local-control computer system which is used to control other MFTF-B subsystems. This paper presents an overview of the hardware and software that will control and acquire data from the plasma diagnostics system. Data flow paths from the instruments, through processing, and into final archived storage will be described. A discussion of anticipated data rates, including anticipated software overhead at various points of the system, is included, along with the identification of possible bottlenecks. A methodology for processing of the data is described, along with the approach to handle the planned growth in the diagnostic system. Motivations are presented for various design choices which have been made

  20. Design features of the A-cell and transition coils of MFTF-B

    International Nuclear Information System (INIS)

    Tatro, R.E.; Wohlwend, J.W.; Ring, D.S.

    1981-01-01

    The MFTF-B transition coil and A-cell magnet designs use variations of the copper-stabilized NbTi conductor developed by LLNL for the MFTF Yin-Yang magnets. This conductor will be wound on the one inch thick (12.7 mm) stainless steel coil forms using a two-axis winding machine similar to the existing LLNL Yin-Yang winding machine. After winding, covers will be placed over the coil and welded to the coil form to form a helium-tight jacket around the conductor. These jacketed coils are then enclosed in thick structural cases that react the large Lorentz forces on the magnets. The space between the coil jacket and case will be filled by a stainless steel bladder that will be injected with urethane. The injection bladder will provide cooling passages during cooldown as well as transmitting the Lorentz forces between the jacket and the case. The large self-equilibrating lobe-spreading forces on the magnets (29 x 10^6 lb, 127.0 MN) for the A-cell are reacted primarily through the thick 304 LN case into the external superstructure. The net Lorentz forces and the inertial forces on the magnet are reacted through support systems into the LLNL vacuum vessel structure

  1. Results of studies performed on the model of the MFTF Supervisory Control and Diagnostics System (SCDS)

    International Nuclear Information System (INIS)

    Wyman, R.H.

    1979-01-01

    The design and implementation of the SCDS is a relatively complex problem involving a nine-computer network coupled with a unique color graphics control console system, 50 local control minicomputers, and the usual array of drives, printers, magnetic tapes, etc. Four million bytes of data are to be collected on each MFTF cycle with a repetition rate of five minutes per shot, and the associated data processing and storing load is a major concern. Crude paper studies were made initially to try to size the various components of the system and various configurations were proposed and analyzed prior to the solicitation for the computer system. However, once the hardware was purchased and a preliminary software design was completed, it became essential and feasible to do an analysis of the system to considerably greater depth in order to identify bottlenecks and other system problems and to verify those parts of the design that met the MFTF requirements

  2. Ion cyclotron resonance heating (ICRH) start-up antenna for the mirror fusion test facility (MFTF-B)

    International Nuclear Information System (INIS)

    McCarville, T.M.; Romesser, T.E.

    1985-01-01

    The purpose of the ICRH start-up antenna on MFTF-B is to heat the plasma and control the ion distribution as the density increases during start-up. The antenna, consisting of two center fed half turn loops phased 180° apart, has been designed for 1 MW of input power, with a goal of coupling 400 kW into the ions. To vary the heating frequency relative to the local ion cyclotron frequency, the antenna is tunable over a range from 7.5 to 12.5 MHz. The thermal requirements common to low duty cycle ICRH antennas are especially severe for the MFTF-B antenna. The stress requirements are also unique, deriving from the possibility of seismic activity or J×B forces if the magnets unexpectedly quench. Considerable attention has been paid to contact control at high current bolt-up joints, and arranging geometries so as to minimize the possibility of voltage breakdown

  3. Estimation of neutral-beam-induced field reversal in MFTF by an approximate scaling law

    International Nuclear Information System (INIS)

    Shearer, J.W.

    1980-01-01

    Scaling rules are derived for field-reversed plasmas whose dimensions are common multiples of the ion gyroradius in the vacuum field. These rules are then applied to the tandem MFTF configuration, and it is shown that field reversal appears to be possible for neutral beam currents of the order of 150 amperes, provided that the electron temperature is at least 500 eV

  4. Data triggered data processing at MFTF-B

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1985-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory we schedule jobs to process experimental data to be collected during a five minute shot cycle. Our data-driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs ''fire'', or execute, as input data becomes available. Similar to UNIX ''pipes'', data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on our networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. We report here on details of diagnostic data processing and our experiences
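
    The fire-when-inputs-arrive behaviour can be sketched with a toy dataflow node (all names and the example template are hypothetical):

      class Node:
          # A processing node that fires once all of its named inputs have arrived.
          def __init__(self, name, inputs, func, downstream=()):
              self.name, self.inputs, self.func = name, set(inputs), func
              self.received, self.downstream = {}, list(downstream)

          def deliver(self, key, data):
              self.received[key] = data
              if self.inputs <= self.received.keys():    # all inputs available -> fire
                  result = self.func(self.received)
                  for node in self.downstream:           # pipe the result to following nodes
                      node.deliver(self.name, result)

      # Hypothetical template: raw interferometer frames -> density result -> plot.
      plot = Node("plot", ["density"], lambda d: print("plotted", d["density"]))
      density = Node("density", ["frames"], lambda d: "n_e from " + d["frames"], downstream=[plot])
      density.deliver("frames", "shot_1234_frames")      # arrival of the data fires both nodes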

  5. Mirror Fusion Test Facility-B (MFTF-B) axicell configuration: NbTi magnet system. Manufacturing/producibility final report. Volume 2

    International Nuclear Information System (INIS)

    Ritschel, A.J.; White, W.L.

    1985-05-01

    This Final MFTF-B Manufacturing/Producibility Report covers facilities, tooling plan, manufacturing sequence, schedule and performance, producibility, and lessons learned for the solenoid, axicell, and transition coils, as well as a deactivation plan, conclusions, references, and appendices

  6. Protection of the MFTF accel power supplies

    International Nuclear Information System (INIS)

    Wilson, J.H.; Wood, J.C.

    1979-01-01

    The MFTF experiment's Sustaining Neutral Beam Power Supply System (SNBPSS) includes twenty-four 95 kV, 80 A accel dc power supplies (ADCPS). Each power supply includes a relatively high-impedance (20 percent) rectifier transformer and a step voltage regulator with a 50-100 percent voltage range. With this combination, the fault current for some postulated faults may be lower than the supply's full load current at maximum voltage. A design has been developed which uses protective relays and current-limiting fuses coordinated to detect phase and ground faults, DC faults, incorrect voltage conditions, rectifier faults, power factor correction capacitor faults, and overloads. This unusual solution ensures fast tripping on potentially destructive high-current faults and long-time delays at lower currents to allow 30 second pulse operation. The ADCPS meets the LLL specification that all major assemblies be self-protecting, that is, able to sustain external faults without damage to minimize damage due to internal faults

  7. MFTF supervisory control and diagnostics system hardware

    International Nuclear Information System (INIS)

    Butner, D.N.

    1979-01-01

    The Supervisory Control and Diagnostics System (SCDS) for the Mirror Fusion Test Facility (MFTF) is a multiprocessor minicomputer system designed so that for most single-point failures, the hardware may be quickly reconfigured to provide continued operation of the experiment. The system is made up of nine Perkin-Elmer computers - a mixture of 8/32's and 7/32's. Each computer has ports on a shared memory system consisting of two independent shared memory modules. Each processor can signal other processors through hardware external to the shared memory. The system communicates with the Local Control and Instrumentation System, which consists of approximately 65 microprocessors. Each of the six system processors has facilities for communicating with a group of microprocessors; the groups consist of from four to 24 microprocessors. There are hardware switches so that if an SCDS processor communicating with a group of microprocessors fails, another SCDS processor takes over the communication

  8. From Wireless Sensor Networks to Wireless Body Area Networks: Formal Modeling and Verification on Security Using PAT

    Directory of Open Access Journals (Sweden)

    Tieming Chen

    2016-01-01

    Full Text Available Model checking has been applied successfully to the verification of security protocols, but the modeling process is always tedious and proficient knowledge of formal methods is needed, even though the final verification can be automatic depending on the specific tool. At the same time, due to the appearance of novel kinds of networks, such as wireless sensor networks (WSN) and wireless body area networks (WBAN), formal modeling and verification for these domain-specific systems are quite challenging. In this paper, a specific and novel formal modeling and verification method is proposed and implemented using an expandable tool called PAT to do WSN-specific security verification. First, an abstract modeling data structure for CSP#, which is built into PAT, is developed to support the node-mobility-related specification for modeling location-based node activity. Then, the traditional Dolev-Yao model is redefined to facilitate modeling of location-specific attack behaviors on the security mechanism. A thorough formal verification application on a location-based security protocol in WSN is described in detail to show the usability and effectiveness of the proposed methodology. Furthermore, a novel location-based authentication security protocol in WBAN can also be successfully modeled and verified directly using our method, which is, to the best of our knowledge, the first effort at employing model checking for automatic analysis of an authentication protocol for WBAN.

  9. Design and fabrication of the MFTF-B magnet system

    International Nuclear Information System (INIS)

    Tatro, R.E.; Kozman, T.A.

    1985-09-01

    The MFTF-B superconducting magnet system consists of 40 NbTi magnets and two Nb3Sn magnets. General Dynamics (GD) designed all magnets except for the small trim coils. GD then fabricated 20 NbTi magnets, while LLNL fabricated 20 NbTi magnets and two Nb3Sn magnets. The design phase was completed in February 1984 and included the competitive procurement of magnet structural fabrication, superconductor, G-10CR insulation, support struts and bearings, vapor-cooled leads, and thermal shields for all magnets. Fabrication of all magnets was completed in March 1985. At GD, dual assembly lines were necessary during fabrication in order to meet the aggressive LLNL schedule. The entire magnet system has been installed and aligned at LLNL, and Tech Demo tests will be performed during September-November 1985

  10. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722
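
    In outline (a sketch under assumed conventions, not the published mathematical model), each ball-bar point measured from an indexed platform position can be mapped into a common frame through that position's homogeneous transform, and a virtual reference distance is then simply the separation of two such mapped points:

      import numpy as np

      def to_global(T_position, p_arm):
          # Express an arm-frame point in the common frame via the homogeneous transform
          # associated with one of the six indexed platform positions (assumed known).
          return (T_position @ np.append(p_arm, 1.0))[:3]

      def virtual_distance(T_i, p_i, T_j, p_j):
          # Virtual reference distance between points captured from platform positions i and j.
          return np.linalg.norm(to_global(T_i, p_i) - to_global(T_j, p_j))

      # With identity transforms this reduces to an ordinary point-to-point distance (500 mm here):
      print(virtual_distance(np.eye(4), np.zeros(3), np.eye(4), np.array([0.0, 300.0, 400.0])))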

  11. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  12. Alternative connections for the large MFTF-B solenoids

    International Nuclear Information System (INIS)

    Owen, E.W.; Shimer, D.W.; Wang, S.T.

    1983-01-01

    The MFTF-B central-cell solenoids are a set of twelve closely coupled, large superconducting magnets with similar but not exactly equal currents. Alternative methods of connecting them to their power supplies and dump resistors are investigated. The circuits are evaluated for operating conditions and fault conditions. The factors considered are the voltage to ground during a dump, short circuits, open circuits, quenches, and failure of the protection system to detect a quench. Of particular interest are the current induced in coils that remain superconducting when one or more coils quench. The alternative connections include separate power supplies, combined power supplies, individual dump resistors, series dump resistors and combinations of these. A new circuit that contains coupling resistors is proposed. The coupling resistors do not affect normal fast dumps but reduce the peak induced currents while also reducing the energy rating of the dump resistors. Another novel circuit, the series circuit with diodes, is discussed in detail
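
    The induced-current concern can be illustrated with a two-coil sketch (all parameter values are invented and are not MFTF-B data): when coil 1 is dumped into a resistor while coil 2 stays superconducting, conservation of the second coil's flux linkage drives its current above the initial value.

      import numpy as np

      L1, L2, M = 10.0, 10.0, 6.0       # self and mutual inductances, henries (illustrative)
      R, dt = 0.5, 0.01                 # dump resistor (ohms) and integration step (s)
      i = np.array([2000.0, 2000.0])    # initial currents (A); coil 2 has zero resistance

      A = np.array([[L1, M], [M, L2]])  # L1*di1/dt + M*di2/dt = -R*i1 ; M*di1/dt + L2*di2/dt = 0
      for _ in range(3000):             # 30 s of simulated dump
          di_dt = np.linalg.solve(A, np.array([-R * i[0], 0.0]))
          i = i + di_dt * dt
      print(f"coil 1: {i[0]:.0f} A, coil 2 (induced): {i[1]:.0f} A")   # coil 2 rises well above 2000 A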

  13. Overview of MFTF supervisory control and diagnostics system software

    International Nuclear Information System (INIS)

    Ng, W.C.

    1979-01-01

    The Mirror Fusion Test Facility (MFTF) at the Lawrence Livermore Laboratory (LLL) is currently the largest mirror fusion research project in the world. Its Control and Diagnostics System is handled by a distributed computer network consisting of nine Interdata minicomputer systems and about 65 microprocessors. One of the design requirements is tolerance of single-point failure. If one of the computer systems becomes inoperative, the experiment can still be carried out, although the system responsiveness to operator command may be degraded. In a normal experiment cycle, the researcher can examine the result of the previous experiment, change any control parameter, fire a shot, collect four million bytes of diagnostics data, perform intershot analysis, and have the result presented - all within five minutes. The software approach adopted for the Supervisory Control and Diagnostics System features chief programmer teams and structured programming. Pascal is the standard programming language in this project

  14. Industrialization and production of neutral beam ion sources for MFTF

    International Nuclear Information System (INIS)

    Lynch, W.S.

    1981-01-01

    The existing LLNL designs of the 20 and 80 kV deuterium-fueled Neutral Beam Ion Source Modules (NBSM) have been industrialized and are being produced successfully for the MFTF. Industrialization includes value engineering, production engineering, cost reduction, fixturing, facilitation and procurement of components. Production assembly, inspection and testing are being performed in a large electronics manufacturing plant. Decades of experience in high-voltage, high-vacuum power tubes are being applied to the procedures and processes. Independent quality and reliability assurance criteria are being utilized. Scheduling of the various engineering, procurement and manufacturing tasks is performed by the use of a Critical Path Method (CPM) computer code. Innovative, computerized grid alignment methods were also designed and installed specifically for this project. New jointing and cleaning techniques were devised for the NBSMs. Traceability and cost control are also utilized

  15. 12-T solenoid-design options for the MFTF-B Upgrade

    International Nuclear Information System (INIS)

    Schultz, J.H.; Diatchenko, N.

    1983-01-01

    The major options for the 12 T magnets examined here are the selection of normal, superconducting or hybrid normal/superconducting magnet systems. The tradeoffs are those between the higher initial cost of a superconducting magnet system, the need for thick shielding of superconducting magnets, the higher recirculating power in the normal magnets, and the poorly characterized reliability of lightly shielded normal magnets. The size and shielding tradeoffs among these options are illustrated. The design concepts presented here are evaluated only for the first design iteration of MFTF-B + T, mentioned above. In particular, all concepts now being considered have made topological improvements in the center cell, so that neutral beam power is no longer a strong function of choke coil size. This dependence strongly favored the use of normal magnets over superconducting magnets, and its absence will be discussed qualitatively in the cost comparisons

  16. Computer control and data acquisition system for the Mirror Fusion Test Facility Ion Cyclotron Resonant Heating System (ICRH)

    International Nuclear Information System (INIS)

    Cheshire, D.L.; Thomas, R.A.

    1985-01-01

    The Lawrence Livermore National Laboratory (LLNL) large Mirror Fusion Test Facility (MFTF-B) will employ an Ion Cyclotron Resonant Heating (ICRH) system for plasma startup. As the MFTF-B Industrial Participant, TRW has responsibility for the ICRH system, including development of the data acquisition and control system. During MFTF-B operation, the ICRH system will be controlled from the MFTF-B Supervisory Control and Diagnostic System (SCDS). For subsystem development and checkout at TRW, and for verification and acceptance testing at LLNL, the system will be run from a stand-alone computer system designed to simulate the functions of SCDS. The ''SCDS Simulator'' was developed originally for the MFTF-B ECRH System; descriptions of the hardware and software are updated in this paper. The computer control and data acquisition functions implemented for ICRH are described, including development status and the test schedules at TRW and at LLNL. The application software is written for the SCDS Simulator, but it is programmed in PASCAL and designed to facilitate conversion for use on the SCDS computers

  17. A Wireless Sensor Network Deployment for Rural and Forest Fire Detection and Verification

    Science.gov (United States)

    Lloret, Jaime; Garcia, Miguel; Bri, Diana; Sendra, Sandra

    2009-01-01

    Forest and rural fires are one of the main causes of environmental degradation in Mediterranean countries. Existing fire detection systems focus only on detection, not on verification of the fire; moreover, almost all of them are just simulations, and very few implementations can be found. Besides, the systems in the literature lack scalability. In this paper we show all the steps followed to perform the design, research and development of a wireless multisensor network which mixes sensors with IP cameras in a wireless network in order to detect and verify fire in rural and forest areas of Spain. We have studied how many cameras, sensors and access points are needed to cover a rural or forest area, and the scalability of the system. We have developed a multisensor that, when it detects a fire, sends an alarm through the wireless network to a central server. Based on a software application, the central server selects the wireless cameras closest to the multisensor, rotates them towards the sensor that raised the alarm, and sends them a message in order to receive real-time images from the zone. The cameras let the fire fighters corroborate the existence of a fire and avoid false alarms. In this paper, we show the test performance given by a test bench formed by four wireless IP cameras in several situations and the energy consumed when they are transmitting. Moreover, we study the energy consumed by each device when the system is set up. The wireless sensor network can be connected to the Internet through a gateway, and the images of the cameras can be seen from any part of the world. PMID:22291533
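
    A sketch of the camera-selection step described above, with hypothetical node names and coordinates (the record does not describe the paper's actual software application):

```python
import math

# Hypothetical node positions (x, y) in metres.
cameras = {"cam1": (0, 0), "cam2": (120, 40), "cam3": (60, 200), "cam4": (250, 90)}

def closest_cameras(sensor_xy, k=2):
    """Return the k wireless IP cameras closest to the multisensor that raised the alarm."""
    ranked = sorted(cameras.items(), key=lambda item: math.dist(sensor_xy, item[1]))
    return [name for name, _ in ranked[:k]]

# A multisensor at (100, 60) raises a fire alarm; the server would rotate these
# cameras towards it and request real-time images for verification.
print(closest_cameras((100, 60)))
```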

  18. A Wireless Sensor Network Deployment for Rural and Forest Fire Detection and Verification

    Directory of Open Access Journals (Sweden)

    Sandra Sendra

    2009-10-01

    Full Text Available Forest and rural fires are one of the main causes of environmental degradation in Mediterranean countries. Existing fire detection systems focus only on detection, not on verification of the fire; moreover, almost all of them are just simulations, and very few implementations can be found. Besides, the systems in the literature lack scalability. In this paper we show all the steps followed to perform the design, research and development of a wireless multisensor network which mixes sensors with IP cameras in a wireless network in order to detect and verify fire in rural and forest areas of Spain. We have studied how many cameras, sensors and access points are needed to cover a rural or forest area, and the scalability of the system. We have developed a multisensor that, when it detects a fire, sends an alarm through the wireless network to a central server. Based on a software application, the central server selects the wireless cameras closest to the multisensor, rotates them towards the sensor that raised the alarm, and sends them a message in order to receive real-time images from the zone. The cameras let the fire fighters corroborate the existence of a fire and avoid false alarms. In this paper, we show the test performance given by a test bench formed by four wireless IP cameras in several situations and the energy consumed when they are transmitting. Moreover, we study the energy consumed by each device when the system is set up. The wireless sensor network can be connected to the Internet through a gateway, and the images of the cameras can be seen from any part of the world.

  19. Integrated operations plan for the MFTF-B Mirror Fusion Test Facility. Volume II. Integrated operations plan

    Energy Technology Data Exchange (ETDEWEB)

    1981-12-01

    This document defines an integrated plan for the operation of the Lawrence Livermore National Laboratory (LLNL) Mirror Fusion Test Facility (MFTF-B). The plan fulfills and further delineates LLNL policies and provides for accomplishing the functions required by the program. This plan specifies the management, operations, maintenance, and engineering support responsibilities. It covers phasing into sustained operations as well as the sustained operations themselves. Administrative and Plant Engineering support, which are now being performed satisfactorily, are not part of this plan unless there are unique needs.

  20. Integrated operations plan for the MFTF-B Mirror Fusion Test Facility. Volume II. Integrated operations plan

    International Nuclear Information System (INIS)

    1981-12-01

    This document defines an integrated plan for the operation of the Lawrence Livermore National Laboratory (LLNL) Mirror Fusion Test Facility (MFTF-B). The plan fulfills and further delineates LLNL policies and provides for accomplishing the functions required by the program. This plan specifies the management, operations, maintenance, and engineering support responsibilities. It covers phasing into sustained operations as well as the sustained operations themselves. Administrative and Plant Engineering support, which are now being performed satisfactorily, are not part of this plan unless there are unique needs

  1. A review of technology for verification of waste removal from Hanford Underground Storage Tanks (WHC Issue 30)

    International Nuclear Information System (INIS)

    Thunborg, S.

    1994-09-01

    Remediation of waste from Underground Storage Tanks (UST) at the Hanford waste storage sites will require removal of all waste to a nearly clean condition. Current requirements are 99% clean. In order to meet remediation legal requirements, a means to remotely verify that the waste has been removed to a sufficient level is needed. This report discusses the requirements for verification and reviews major technologies available for inclusion in a verification system. The report presents two operational scenarios for verification of residual waste volume. Thickness verification technologies reviewed are Ultrasonic Sensors, Capacitance Type Sensors, Inductive Sensors, Ground Penetrating Radar, and Magnetometers. Of these technologies, Inductive Sensors (Metal Detectors) and Ground Penetrating Radar appear to be the most suitable for use as waste thickness sensors

  2. Noise filtering algorithm for the MFTF-B computer based control system

    International Nuclear Information System (INIS)

    Minor, E.G.

    1983-01-01

    An algorithm to reduce the message traffic in the MFTF-B computer based control system is described. The algorithm filters analog inputs to the control system. Its purpose is to distinguish between changes in the inputs due to noise and changes due to significant variations in the quantity being monitored. Noise is rejected while significant changes are reported to the control system data base, thus keeping the data base updated with a minimum number of messages. The algorithm is memory efficient, requiring only four bytes of storage per analog channel, and computationally simple, requiring only subtraction and comparison. Quantitative analysis of the algorithm is presented for the case of additive Gaussian noise. It is shown that the algorithm is stable and tends toward the mean value of the monitored variable over a wide variety of additive noise distributions
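
    The record does not give the algorithm itself, but its description (one stored value per channel, subtraction and comparison only) is consistent with a simple deadband filter, sketched here with hypothetical channel names and thresholds:

```python
# Last reported value per analog channel; one stored sample per channel corresponds
# to the four bytes of state mentioned in the abstract.
last_reported = {}

def filter_sample(channel, value, deadband):
    """Report a value only if it differs from the last reported one by more than
    the deadband; otherwise treat the change as noise and send no message."""
    previous = last_reported.get(channel)
    if previous is None or abs(value - previous) > deadband:
        last_reported[channel] = value
        return value          # significant change: forward to the control data base
    return None               # noise: suppress the message

# The first sample initializes the channel; of the rest, only the third sample
# exceeds the 0.5-unit deadband and would be reported.
for sample in (10.1, 10.3, 11.2, 11.0):
    print(filter_sample("TC-014", sample, deadband=0.5))
```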

  3. Plasma potential formation and measurement in TMX-U and MFTF-B

    International Nuclear Information System (INIS)

    Grubb, D.P.

    1984-01-01

    Tandem mirrors control the axial variation of the plasma potential to create electrostatic plugs that improve the axial confinement of central cell ions and, in a thermal barrier tandem mirror, control the electron axial heat flow. Measurements of the spatial and temporal variations of the plasma potential are, therefore, important to the understanding of confinement in a tandem mirror. In this paper we discuss potential formation in a thermal barrier tandem mirror and examine the diagnostics and data obtained on the TMX-U device, including measurements of the thermal barrier potential profile using a diagnostic neutral beam and charged particle energy-spectroscopy. We then describe the heavy ion beam probe and other new plasma potential diagnostics that are under development for TMX-U and MFTF-B and examine problem areas where additional diagnostic development is desirable

  4. Safety procedures for the MFTF sustaining-neutral-beam power supply

    International Nuclear Information System (INIS)

    Wilson, J.H.

    1981-01-01

    The MFTF SNBPSS comprises a number of sources of potentially hazardous electrical energy in a small physical area. Power is handled at 80 kV dc, 80 A; 70 V dc, 4000 A; 25 V dc, 5500 A; 3 kV dc, 10 A; and 2 kV dc, 10 A. Power for these systems is furnished from two separate 480 V distribution systems and a 13.8 kV distribution system. A defense in depth approach is used; interlocks are provided in the hardware to make it difficult to gain access to an energized circuit, and the operating procedure includes precautions which would protect personnel even if no interlocks were working. The complexity of the system implies a complex operating procedure, and this potential complexity is controlled by presenting the procedure in a modular form using 37 separate checklists for specific operations. The checklists are presented in flowchart form, so contingencies can be handled at the lowest possible level without compromising safety

  5. Engineering study of the neutral beam and rf heating systems for DIII-D, MFTF-B, JET, JT-60 and TFTR

    International Nuclear Information System (INIS)

    Lindquist, W.B.; Staten, S.H.

    1987-01-01

    An engineering study was performed on the rf and neutral beam heating systems implemented for DIII-D, MFTF-B, JET, JT-60 and TFTR. Areas covered include: methodology used to implement the systems, technology, cost, schedule, performance, problems encountered and lessons learned. Systems are compared and contrasted in the areas studied. Summary statements were made on common problems and lessons learned. 3 refs., 6 tabs

  6. MFTF-B quasi-optical ECRH transmission system

    International Nuclear Information System (INIS)

    Yugo, J.J.; Shearer, J.W.; Ziolkowski, R.W.

    1983-01-01

    The microwave transmission system for ECRH on MFTF-B will utilize quasi-optical transmission techniques. The system consists of ten gyrotron oscillators: two gyrotrons at 28 GHz, two at 35 GHz, and six at 56 GHz. The 28 and 35 GHz gyrotrons both heat the electrons in the end plug (potential peak) while the 56 GHz sources heat the minimum-B anchor region (potential minimum). Microwaves are launched into a pair of cylindrical mirrors that form a pseudo-cavity which directs the microwaves through the plasma numerous times before they are lost out of the cavity. The cavity allows the microwave beam to reach the resonance zone over a wide range of plasma densities and temperatures. The fundamental electron cyclotron resonance moves to higher axial positions as a result of beta-depression of the magnetic field, Doppler shifting of the resonance, and relativistic mass corrections for the electrons. With this system the microwave beam will reach the resonance surface at the correct angle of incidence for any density or temperature without active aiming of the antennas. The cavity also allows the beam to make multiple passes through the plasma to increase the heating efficiency at low temperatures and densities when the single pass absorption is low. In addition, neutral beams and diagnostics have an unobstructed view of the plasma

  7. BepiColombo fine sun sensor

    Science.gov (United States)

    Boslooper, Erik; van der Heiden, Nico; Naron, Daniël.; Schmits, Ruud; van der Velde, Jacob Jan; van Wakeren, Jorrit

    2017-11-01

    Design, development and verification of the passive Fine Sun Sensor (FSS) for the BepiColombo spacecraft is described. Major challenge in the design is to keep the detector at acceptable temperature levels while exposed to a solar flux intensity exceeding 10 times what is experienced in Earth orbit. A mesh type Heat Rejection Filter has been developed. The overall sensor design and its performance verification program is described.

  8. Static and dynamic analyses on the MFTF [Mirror Fusion Test Facility]-B Axicell Vacuum Vessel System: Final report

    International Nuclear Information System (INIS)

    Ng, D.S.

    1986-09-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory (LLNL) is a large-scale, tandem-mirror-fusion experiment. MFTF-B comprises many highly interconnected systems, including a magnet array and a vacuum vessel. The vessel, which houses the magnet array, is supported by reinforced concrete piers and steel frames resting on an array of foundations and surrounded by a 7-ft-thick concrete shielding vault. The Pittsburgh-Des Moines (PDM) Corporation, which was awarded the contract to design and construct the vessel, carried out fixed-base static and dynamic analyses of a finite-element model of the axicell vessel and magnet systems, including the simulation of various loading conditions and three postulated earthquake excitations. Meanwhile, LLNL monitored PDM's analyses with modeling studies of its own, and independently evaluated the structural responses of the vessel in order to define design criteria for the interface members and other project equipment. The assumptions underlying the finite-element model and the behavior of the axicell vessel are described in detail in this report, with particular emphasis placed on comparing the LLNL and PDM studies and on analyzing the fixed-base behavior with the soil-structure interaction, which occurs between the vessel and the massive concrete vault wall during a postulated seismic event. The structural members that proved sensitive to the soil effect are also reevaluated

  9. Machine-assisted verification of latent fingerprints: first results for nondestructive contact-less optical acquisition techniques with a CWL sensor

    Science.gov (United States)

    Hildebrandt, Mario; Kiltz, Stefan; Krapyvskyy, Dmytro; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus

    2011-11-01

    A machine-assisted analysis of traces from crime scenes might be possible with the advent of new high-resolution, non-destructive, contact-less acquisition techniques for latent fingerprints. This requires reliable techniques for the automatic extraction of fingerprint features from latent and exemplar fingerprints for matching purposes using pattern recognition approaches. Therefore, we evaluate the NIST Biometric Image Software for the feature extraction and verification of contact-lessly acquired latent fingerprints to determine potential error rates. Our exemplary test setup includes 30 latent fingerprints from 5 people in two test sets that are acquired from different surfaces using a chromatic white light sensor. The first test set includes 20 fingerprints on two different surfaces; it is used to determine the feature extraction performance. The second test set includes one latent fingerprint on 10 different surfaces and an exemplar fingerprint to determine the verification performance. The sensing technique utilized does not require a physical or chemical visibility enhancement of the fingerprint residue, so the original trace remains unaltered for further investigations. No particular feature extraction and verification techniques have yet been applied to such data. Hence, we see the need for appropriate algorithms that are suitable to support forensic investigations.
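
    As a generic illustration of how such error rates can be estimated from matcher scores (this is not the NBIS workflow itself; the scores and thresholds below are invented):

```python
import numpy as np

def error_rates(genuine_scores, impostor_scores, threshold):
    """False non-match and false match rates at one decision threshold,
    assuming higher scores indicate a better match."""
    fnmr = np.mean(np.asarray(genuine_scores) < threshold)    # genuine pairs rejected
    fmr = np.mean(np.asarray(impostor_scores) >= threshold)   # impostor pairs accepted
    return fnmr, fmr

# Invented matcher scores for genuine and impostor comparisons.
genuine = [42, 55, 38, 61, 47, 29, 50]
impostor = [8, 12, 15, 9, 20, 11, 14]
for thr in (20, 30, 40):
    fnmr, fmr = error_rates(genuine, impostor, thr)
    print(f"threshold {thr}: FNMR={fnmr:.2f}, FMR={fmr:.2f}")
```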

  10. Structural analysis of the magnet system for Mirror Fusion Test Facility (MFTF). Addendum I

    International Nuclear Information System (INIS)

    Loss, K.R.; Wohlwend, J.W.

    1979-09-01

    The stress analysis refinement of the MFTF magnet system using GDSAP (General Dynamics Structural Analysis Program) and NASTRAN finite element computer models has been completed. The objective of this analysis was to calculate a more refined case and jacket stress distribution. The GDSAP model was refined in the minor radius area to yield a more detailed prediction of the stress distributions in critical areas identified by the previous analysis. Modifications in the case plate thickness (from 3.0 inches to 3.2 inches) and in the conductor pack load distribution and stiffness were included. The GDSAP model was converted to an identical NASTRAN model to determine the influence on stress results using higher order elements

  11. Sparking protection for MFTF-B Neutral Beam Power Supplies

    International Nuclear Information System (INIS)

    Cummings, D.B.

    1983-01-01

    This paper describes the upgrade of MFTF-B Neutral Beam Power Supplies for sparking protection. High performance ion sources spark repeatedly so ion source power supplies must be insensitive to sparking. The hot deck houses the series tetrode, arc and filament supplies, and controls. Hot deck shielding has been upgraded and a continuous shield around the arc, filament, gradient grid, and control cables now extends from the hot deck, through the core snubber, to the source. The shield carries accelerating current and connects only to the source. Shielded source cables go through an outer duct which now connects to a ground plane under the hot deck. This hybrid transmission line is a low inductance path for sparks discharging the stray capacitance of the hot deck and isolation transformers, reducing coupling to building steel. Parallel DC current return cables inside the duct lower inductance to reduce inductive turn-off transients. MOVs to ground further limit surges in the remote power supply return. Single point grounding is at the source. No control or rectifier components have been damaged nor are there any known malfunctions due to sparking up to 80 kV output

  12. Sparking protection for MFTF-B neutral beam power supplies

    International Nuclear Information System (INIS)

    Cummings, D.B.

    1983-01-01

    This paper describes the upgrade of MFTF-B Neutral Beam Power Supplies for sparking protection. High performance ion sources spark repeatedly so ion source power supplies must be insensitive to sparking. The hot deck houses the series tetrode, arc and filament supplies, and controls. Hot deck shielding has been upgraded and a continuous shield around the arc, filament, gradient grid, and control cables now extends from the hot deck, through the core snubber, to the source. The shield carries accelerating current and connects only to the source. Shielded source cables go through an outer duct which now connects to a ground plane under the hot deck. This hybrid transmission line is a low inductance path for sparks discharging the stray capacitance of the hot deck and isolation transformers, reducing coupling to building steel. Parallel dc current return cables inside the duct lower inductance to reduce inductive turn-off transients. MOVs to ground further limit surges in the remote power supply return. Single point grounding is at the source. No control or rectifier components have been damaged nor are there any known malfunctions due to sparking up to 80 kV output

  13. Currents and voltages in the MFTF coils during the formation of a normal zone

    International Nuclear Information System (INIS)

    Owen, E.W.

    1980-08-01

    Expressions are obtained for the currents and voltages in a pair of inductively coupled superconducting coils under two conditions: formation of a normal zone and a change in the level of the current in one coil. A dump resistor of low resistance and a detector bridge are connected across each coil. Calculated results are given for the MFTF coils. The circuit equations during formation of a normal zone are nonlinear and time-varying; consequently, only a series solution is possible. The conditions during a change in current are more easily found. After the transient has died away, the voltages in the coil associated with the changing source are all self-inductive, while the voltages in the other coil are all mutually inductive
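
    The closing statement can be made concrete with a small calculation for a steady current ramp after the transient has decayed; the inductance values and ramp rate below are illustrative, not the MFTF values:

```python
# Illustrative values for a steady ramp of coil 1 after the switching transient has decayed.
L1 = 10.0        # self-inductance of the ramped coil, H
M = 6.0          # mutual inductance of the pair, H
dI1_dt = 20.0    # ramp rate of the coil 1 current, A/s

v_self = L1 * dI1_dt     # voltage across coil 1: purely self-inductive
v_mutual = M * dI1_dt    # voltage induced in coil 2: purely mutually inductive
print(f"coil 1: {v_self:.0f} V (self-inductive), coil 2: {v_mutual:.0f} V (mutually inductive)")
```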

  14. Design and prototype results of a far-infrared interferometer for MFTF-B

    International Nuclear Information System (INIS)

    Monjes, J.A.; Throop, A.L.; Thomas, S.R.; Peebles, A.; Zu, Qin-Zin.

    1983-01-01

    A Far-Infrared (FIR) Laser Interferometer (FLI), operating at 185 μm wavelength is planned as part of the initial start-up set of plasma diagnostics for the Mirror Fusion Test Facility (MFTF-B). The FLI will consist of a heterodyne, three-chord laser interferometer which will be used initially to measure line-integrated plasma density in the high-density, center cell region of the machine. The conceptual system design and analysis has been completed. There are several unique environmental/physical constraints and performance requirements for this system which have required that technology-evaluation and prototyping experiments be completed to support the design effort and confirm the expected performance parameters. Issues which have been addressed include extensive use of long-path dielectric waveguide, coupling and control of free-space propagation of the beam, and polarization control. The results and conclusions of the design analysis and experimental measurements will be presented

  15. Application of structural mechanics methods to the design of large tandem mirror fusion devices (MFTF-B)

    International Nuclear Information System (INIS)

    Karpenko, V.N.; Ng, D.S.

    1985-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory requires state-of-the-art structural-mechanics methods to deal with access constraints for plasma heating and diagnostics, alignment requirements, and load complexity and variety. Large interactive structures required an integrated analytical approach to achieve a reasonable level of overall system optimization. The Tandem Magnet Generator (TMG) creates a magnet configuration for the EFFI calculation of electromagnetic-field forces that, coupled with other loads, form the input loading to the magnet and vessel finite-element models. The analytical results provide the data base for detailed design of magnet, vessel, foundation, and interaction effects. (orig.)

  16. High field Nb3Sn Axicell insert coils for the Mirror Fusion Test Facility-B (MFTF-B) axicell configuration. Final report

    International Nuclear Information System (INIS)

    Baldi, R.W.; Tatro, R.E.; Scanlan, R.M.

    1984-03-01

    Two 12-tesla superconducting insert coils are being designed by General Dynamics Convair Division for the axicell regions of MFTF-B for Lawrence Livermore National Laboratory. A major challenge of this project is to ensure that combined fabrication and operational strains induced in the conductor are within the stringent limitations of the relatively brittle Nb3Sn superconductor filaments. These coils are located in the axicell region of MFTF-B. They have a clear-bore diameter of 36.195 cm (14.25 inches) and consist of 27 double pancakes (i.e., 54 pancakes per coil) wound on an electrically insulated 304LN stainless steel bobbin/helium vessel. Each pancake has 57 turns separated by G-10CR insulation. The complete winding bundle has 4.6 million ampere-turns and a uniform current density of 2007 A/cm2. In conjunction with the other magnets in the system, they produce a 12-tesla central field and a 12.52-tesla peak field. A multifilamentary Nb3Sn conductor was selected to meet these requirements. The conductor consists of a monolithic insert soldered into a copper stabilizer. Sufficient cross-sectional area and work-hardening of the copper stabilizer has been provided for the conductor to self-react the electromagnetic Lorentz-force-induced hoop stresses with normal operational tensile strains less than 0.07 percent

  17. High field Nb/sub 3/Sn Axicell insert coils for the Mirror Fusion Test Facility-B (MFTF-B) axicell configuration. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Baldi, R.W.; Tatro, R.E.; Scanlan, R.M.; Agarwal, K.L.; Bailey, R.E.; Burgeson, J.E.; Kim, I.K.; Magnuson, G.D.; Mallett, B.D.; Pickering, J.L.

    1984-03-01

    Two 12-tesla superconducting insert coils are being designed by General Dynamics Convair Division for the axicell regions of MFTF-B for Lawrence Livermore National Laboratory. A major challenge of this project is to ensure that combined fabrication and operational strains induced in the conductor are within the stringent limitations of the relatively brittle Nb/sub 3/Sn superconductor filaments. These coils are located in the axicell region of MFTF-B. They have a clear-bore diameter of 36.195 cm (14.25 inches) and consist of 27 double pancakes (i.e., 54 pancakes per coil) wound on an electrically insulated 304LN stainless steel bobbin/helium vessel. Each pancake has 57 turns separated by G-10CR insulation. The complete winding bundle has 4.6 million ampere-turns and a uniform current density of 2007 A/cm/sup 2/. In conjunction with the other magnets in the system, they produce a 12-tesla central field and a 12.52-tesla peak field. A multifilamentary Nb/sub 3/Sn conductor was selected to meet these requirements. The conductor consists of a monolithic insert soldered into a copper stabilizer. Sufficient cross-sectional area and work-hardening of the copper stabilizer has been provided for the conductor to self-react the electromagnetic Lorentz-force-induced hoop stresses with normal operational tensile strains less than 0.07 percent.

  18. Application of structural-mechanics methods to the design of large tandem-mirror fusion devices (MFTF-B). Revision 1

    International Nuclear Information System (INIS)

    Karpenko, V.N.; Ng, D.S.

    1985-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory requires state-of-the-art structural-mechanics methods to deal with access constraints for plasma heating and diagnostics, alignment requirements, and load complexity and variety. Large interactive structures required an integrated analytical approach to achieve a reasonable level of overall system optimization. The Tandem Magnet Generator (TMG) creates a magnet configuration for the EFFI calculation of electromagnetic-field forces that, coupled with other loads, form the input loading to magnet and vessel finite-element models. The analytical results provide the data base for detailed design of magnet, vessel, foundation, and interaction effects. 13 refs

  19. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem

    Directory of Open Access Journals (Sweden)

    Helala AlShehri

    2018-03-01

    Full Text Available The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  20. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    Science.gov (United States)

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  1. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.; Delp, Edward J.; Wong, Ping W.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 x 44 piezoresistive elements is used to measure the grip pattern. An interface has been

  2. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 x 44 piezoresistive elements is used to measure the grip pattern. An interface has been

  3. Tropospheric Airborne Meteorological Data Reporting (TAMDAR) Sensor Validation and Verification on National Oceanographic and Atmospheric Administration (NOAA) Lockheed WP-3D Aircraft

    Science.gov (United States)

    Tsoucalas, George; Daniels, Taumi S.; Zysko, Jan; Anderson, Mark V.; Mulally, Daniel J.

    2010-01-01

    As part of the National Aeronautics and Space Administration's Aviation Safety and Security Program, the Tropospheric Airborne Meteorological Data Reporting project (TAMDAR) developed a low-cost sensor for aircraft flying in the lower troposphere. This activity was a joint effort with support from Federal Aviation Administration, National Oceanic and Atmospheric Administration, and industry. This paper reports the TAMDAR sensor performance validation and verification, as flown on board NOAA Lockheed WP-3D aircraft. These flight tests were conducted to assess the performance of the TAMDAR sensor for measurements of temperature, relative humidity, and wind parameters. The ultimate goal was to develop a small low-cost sensor, collect useful meteorological data, downlink the data in near real time, and use the data to improve weather forecasts. The envisioned system will initially be used on regional and package carrier aircraft. The ultimate users of the data are National Centers for Environmental Prediction forecast modelers. Other users include air traffic controllers, flight service stations, and airline weather centers. NASA worked with an industry partner to develop the sensor. Prototype sensors were subjected to numerous tests in ground and flight facilities. As a result of these earlier tests, many design improvements were made to the sensor. The results of tests on a final version of the sensor are the subject of this report. The sensor is capable of measuring temperature, relative humidity, pressure, and icing. It can compute pressure altitude, indicated air speed, true air speed, ice presence, wind speed and direction, and eddy dissipation rate. Summary results from the flight test are presented along with corroborative data from aircraft instruments.

  4. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  5. A Secure Framework for Location Verification in Pervasive Computing

    Science.gov (United States)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The way people use computing devices has been changed in some way by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at anytime and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and it may impersonate other users by eavesdropping their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  6. Design lessons from using programmable controllers in the MFTF-B personnel safety and interlocks system

    International Nuclear Information System (INIS)

    Branum, J.D.

    1983-01-01

    Applying programmable controllers in critical applications such as personnel safety and interlocks systems requires special considerations in the design of both hardware and software. All modern programmable controller systems feature extensive internal diagnostic capabilities to protect against problems such as program memory errors; however most, if not all present designs lack an intrinsic capability for detecting and countering failures on the field-side of their I/O modules. Many of the most common styles of I/O modules can also introduce potentially dangerous sneak circuits, even without component failure. This paper presents the most significant lessons learned to date in the design of the MFTF-B Personnel Safety and Interlocks System, which utilizes two non-redundant programmable controllers with over 800 I/O points each. Specific problems recognized during the design process as well as those discovered during initial testing and operation are discussed along with their specific solutions in hardware and software

  7. Integrated operations plan for the MFTF-B Mirror Fusion Test Facility. Volume I. Organization plan

    International Nuclear Information System (INIS)

    1981-12-01

    This plan and the accompanying MFTF-B Integrated Operations Plan are submitted in response to UC/LLNL Purchase Order 3883801, dated July 1981. The organization plan also addresses the specific tasks and trade studies directed by the scope of work. The Integrated Operations Plan, which includes a reliability, quality assurance, and safety plan and an integrated logistics plan, comprises the bulk of the report. In the first section of this volume, certain underlying assumptions and observations are discussed, setting the requirements and limits for the organization. Section B presents the recommended structure itself. Section C, Device Availability vs Maintenance and Support Efforts, and Section D, Staffing Levels and Skills, provide backup detail and justification. Section E is a trade study on maintenance and support by LLNL staff versus subcontractors, and Section F is a plan for transitioning from the construction phase into operation. A brief summary of schedules and estimated costs concludes the volume

  8. Computer model of the MFTF-B neutral beam Accel dc power supply

    International Nuclear Information System (INIS)

    Wilson, J.H.

    1983-01-01

    Using the SCEPTRE circuit modeling code, a computer model was developed for the MFTF Neutral Beam Power Supply System (NBPSS) Accel dc Power Supply (ADCPS). The ADCPS provides 90 kV, 88 A, to the Accel Modulator. Because of the complex behavior of the power supply, use of the computer model is necessary to adequately understand the power supply's behavior over a wide range of load conditions and faults. The model developed includes all the circuit components and parameters, and some of the stray values. The model has been well validated for transients with times on the order of milliseconds, and with one exception, for steady-state operation. When using a circuit modeling code for a system with a wide range of time constants, it can become impossible to obtain good solutions for all time ranges at once. The present model concentrates on the millisecond-range transients because the compensating capacitor bank tends to isolate the power supply from the load for faster transients. Attempts to include stray circuit elements with time constants in the microsecond and shorter range have had little success because of huge increases in computing time that result. The model has been successfully extended to include the accel modulator

  9. Verification of communication protocols in web services model-checking service compositions

    CERN Document Server

    Tari, Zahir; Mukherjee, Anshuman

    2014-01-01

    Gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with the essential, state-of-the-art information about sensor networking. In the near future, wireless sensor networks will become an integral part of our day-to-day life. To solve different sensor networking related issues, researchers have put a great deal of effort into coming up with innovative ideas. Verification of Communication Protocols in Web Services: Model-Checking Service Compositions gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with integral information about sensor networking. It introduces current technological trends, particularly in node organization, and provides implementation details of each networking type to help readers set up sensor networks in their related job fields. In addition, it identifies the limitations of current technologies, as well as future research directions.

  10. Design and test of -80 kV snubber core assemblies for MFTF sustaining-neutral-beam power supplies

    International Nuclear Information System (INIS)

    Bishop, S.R.; Mayhall, D.J.; Wilson, J.H.; De Vore, K.R.; Ross, R.I.; Sears, R.G.

    1981-01-01

    Core snubbers, located near the neutral beam source ends of the Mirror Fusion Test Facility (MFTF) Sustaining Neutral Beam Power Supply System (SNBPSS) source cables, protect the neutral beam source extractor grid wires from overheating and sputtering during internal sparkdowns. The snubbers work by producing an induced counter-emf which limits the fault current and by absorbing the capacitive energy stored on the 80 kV source cables and power supplies. A computer program STACAL was used in snubber magnetic design to choose appropriate tape wound cores to provide 400 Ω resistance and 25 J energy absorption. The cores are mounted horizontally in a dielectric structure. The central source cable bundle passes through the snubber and terminates on three copper buses. Multilam receptacles on the buses connect to the source module jumper cables. Corona rings and shields limit electric field stresses to allow close clearances between snubbers
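
    A rough back-of-the-envelope check of the quoted snubber ratings, assuming an effective 400-ohm snubber resistance and an invented stray capacitance for the 80 kV cables and supplies:

```python
# Rough check of the quoted snubber ratings (illustrative only).
V = 80e3            # source cable potential, V
R_SNUBBER = 400.0   # effective resistance presented by the tape wound cores, ohm
C_STRAY = 7e-9      # assumed stray capacitance of cables and supplies, F (invented)

limited_current = V / R_SNUBBER        # spark current limited by the induced counter-emf
stored_energy = 0.5 * C_STRAY * V**2   # capacitive energy the cores must absorb, J
print(f"limited fault current ~{limited_current:.0f} A, stored energy ~{stored_energy:.1f} J")
```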

  11. Sensor and Communication Network Technology for Harsh Environments in the Nuclear Power Plant

    International Nuclear Information System (INIS)

    Cho, Jai Wan; Choi, Young Soo; Lee, Jae Chul; Choi, Yu Rak; Jung, Gwang Il; Jung, Jong Eun; Park, Hee Yoon; Hong, Seok Bong; Koo, In Soo

    2008-02-01

    One of the challenges in harsh-environment qualification and verification for emerging new I and C systems of the nuclear power plant is to define the operational environment of these new I and C sensors and communication networks such that they are tested to the limits of a mission without requiring expensive over-design. To aid this, the report defines, discusses and recommends environmental guidelines and verification requirements for using state-of-the-art RPS sensors, fiber optic communication systems, wireless communication and wireless smart sensors in nuclear harsh environments. This report focuses on advances in sensors (e.g., temperature, pressure, neutron and thermal power sensors) and their potential impact. Discussed are: radiation, thermal, electromagnetic, and electrical environment specifications. Presented are typical performance data (survivability guidelines and experimental data), evaluation procedures and standard test methods for communication devices, state-of-the-art RPS sensors, and communication systems

  12. Sensor and Communication Network Technology for Harsh Environments in the Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Choi, Young Soo; Lee, Jae Chul; Choi, Yu Rak; Jung, Gwang Il; Jung, Jong Eun; Park, Hee Yoon; Hong, Seok Bong; Koo, In Soo

    2008-02-15

    One of the challenges in harsh-environment qualification and verification for emerging new I and C systems of the nuclear power plant is to define the operational environment of these new I and C sensors and communication networks such that they are tested to the limits of a mission without requiring expensive over-design. To aid this, the report defines, discusses and recommends environmental guidelines and verification requirements for using state-of-the-art RPS sensors, fiber optic communication systems, wireless communication and wireless smart sensors in nuclear harsh environments. This report focuses on advances in sensors (e.g., temperature, pressure, neutron and thermal power sensors) and their potential impact. Discussed are: radiation, thermal, electromagnetic, and electrical environment specifications. Presented are typical performance data (survivability guidelines and experimental data), evaluation procedures and standard test methods for communication devices, state-of-the-art RPS sensors, and communication systems.

  13. Open and Crowd-Sourced Data for Treaty Verification

    Science.gov (United States)

    2014-10-01

    cations – from enhancing home security to providing novel marketing tools for commerce – they are widely available and inexpensive. These open...we anticipate increasing the density of public-domain seismic sensor coverage in regions where high population density coincides with seismic hazards...interface and metadata standards emerge through the medical device market, it makes sense to adhere to these standards for any verification-optimized

  14. Location verification algorithm of wearable sensors for wireless body area networks.

    Science.gov (United States)

    Wang, Hua; Wen, Yingyou; Zhao, Dazhe

    2018-01-01

    Knowledge of the location of sensor devices is crucial for many medical applications of wireless body area networks, as wearable sensors are designed to monitor vital signs of a patient while the wearer still has the freedom of movement. However, clinicians or patients can misplace the wearable sensors, thereby causing a mismatch between their physical locations and their correct target positions. An error of more than a few centimeters raises the risk of mistreating patients. The present study aims to develop a scheme to calculate and detect the position of wearable sensors without beacon nodes. A new scheme was proposed to verify the location of wearable sensors mounted on the patient's body by inferring differences in atmospheric air pressure and received signal strength indication measurements from wearable sensors. Extensive two-sample t tests were performed to validate the proposed scheme. The proposed scheme could easily recognize a 30-cm horizontal body range and a 65-cm vertical body range to correctly perform sensor localization and limb identification. All experiments indicate that the scheme is suitable for identifying wearable sensor positions in an indoor environment.
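
    A sketch of how a vertical offset could be inferred from a barometric pressure difference using the hydrostatic approximation; the paper's actual verification scheme also uses received-signal-strength measurements and statistical tests, which are not modeled here, and all readings below are invented:

```python
RHO_AIR = 1.2   # indoor air density, kg/m^3 (assumed)
G = 9.81        # gravitational acceleration, m/s^2

def vertical_offset(p_sensor_pa, p_reference_pa):
    """Height of a wearable sensor relative to a reference node, estimated from the
    barometric pressure difference under the hydrostatic approximation."""
    return (p_reference_pa - p_sensor_pa) / (RHO_AIR * G)

# A chest-mounted reference reads 101325.0 Pa; a wrist sensor reads 101329.5 Pa,
# i.e. it sits lower -- well within the 65 cm vertical range reported in the paper.
print(f"wrist sensor offset: {vertical_offset(101329.5, 101325.0):+.2f} m")
```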

  15. Design and Experimental Verification of a 0.19 V 53 μW 65 nm CMOS Integrated Supply-Sensing Sensor With a Supply-Insensitive Temperature Sensor and an Inductive-Coupling Transmitter for a Self-Powered Bio-sensing System Using a Biofuel Cell.

    Science.gov (United States)

    Kobayashi, Atsuki; Ikeda, Kei; Ogawa, Yudai; Kai, Hiroyuki; Nishizawa, Matsuhiko; Nakazato, Kazuo; Niitsu, Kiichi

    2017-12-01

    In this paper, we present a self-powered bio-sensing system with the capability of proximity inductive-coupling communication for supply sensing and temperature monitoring. The proposed bio-sensing system includes a biofuel cell as a power source and a sensing frontend that is associated with the CMOS integrated supply-sensing sensor. The sensor consists of a digital-based gate leakage timer, a supply-insensitive time-domain temperature sensor, and a current-driven inductive-coupling transmitter and achieves low-voltage operation. The timer converts the output voltage from a biofuel cell to frequency. The temperature sensor provides a pulse width modulation (PWM) output that is not dependent on the supply voltage, and the associated inductive-coupling transmitter enables proximity communication. A test chip was fabricated in 65 nm CMOS technology and consumed 53 μW with a supply voltage of 190 mV. The low-voltage-friendly design satisfied the performance targets of each integrated sensor without any trimming. The chips allowed us to successfully demonstrate proximity communication with an asynchronous receiver, and the measurement results show the potential for self-powered operation using biofuel cells. The analysis and experimental verification of the system confirmed their robustness.

  16. A computer model of the MFTF-B neutral beam accel dc power supply

    International Nuclear Information System (INIS)

    Wilson, J.H.

    1983-01-01

    Using the SCEPTRE circuit modeling code, a computer model was developed for the MFTF Neutral Beam Power Supply System (NBPSS) Accel DC Power Supply (ADCPS). The ADCPS provides 90 kV, 88 A, to the Accel Modulator. Because of the complex behavior of the power supply, use of the computer model is necessary to adequately understand the power supply's behavior over a wide range of load conditions and faults. The model developed includes all the circuit components and parameters, and some of the stray values. The model has been well validated for transients with times on the order of milliseconds, and with one exception, for steady-state operation. When using a circuit modeling code for a system with a wide range of time constants, it can become impossible to obtain good solutions for all time ranges at once. The present model concentrates on the millisecond-range transients because the compensating capacitor bank tends to isolate the power supply from the load for faster transients. Attempts to include stray circuit elements with time constants in the microsecond and shorter range have had little success because of huge increases in computing time that result. The model has been successfully extended to include the accel modulator

  17. Development of novel EMAT-ECT multi-sensor and verification of its feasibility

    International Nuclear Information System (INIS)

    Suzuki, Kenichiro; Uchimoto, Tetsuya; Takagi, Toshiyuki; Sato, Takeshi; Guy, Philippe; Casse, Amelie

    2006-01-01

    In this study, we propose a novel EMAT-ECT multi-sensor aimed at advanced structural health monitoring. For this purpose, a prototype EMAT-ECT multi-sensor was developed and its functions as both an ECT and an EMAT probe were evaluated. Experimental results of pulse ECT using the EMAT-ECT multi-sensor showed that the proposed sensor has the capability of detecting and sizing flaws. Experimental results of the EMAT evaluation showed that an ultrasonic wave was transmitted by the EMAT-ECT multi-sensor and a flaw echo was observed. These results imply that the EMAT-ECT multi-sensor can be used for both pulse ECT and EMAT. (author)

  18. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0
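
    One simple way to combine per-technology detection probabilities into an integrated figure is the independence assumption sketched below; IVSEM's own model is more detailed, and the numbers are invented:

```python
from functools import reduce

def integrated_detection_probability(subsystem_probabilities):
    """Probability that at least one technology detects the event, assuming the
    subsystems detect independently (a simplification of what IVSEM estimates)."""
    p_miss_all = reduce(lambda acc, p: acc * (1.0 - p), subsystem_probabilities, 1.0)
    return 1.0 - p_miss_all

# Invented per-technology detection probabilities for one hypothetical event.
p = {"seismic": 0.80, "infrasound": 0.40, "radionuclide": 0.25, "hydroacoustic": 0.10}
print(f"integrated probability of detection: {integrated_detection_probability(p.values()):.3f}")
```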

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    Science.gov (United States)

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  20. Optical Imaging Sensors and Systems for Homeland Security Applications

    CERN Document Server

    Javidi, Bahram

    2006-01-01

    Optical and photonic systems and devices have significant potential for homeland security. Optical Imaging Sensors and Systems for Homeland Security Applications presents original and significant technical contributions from leaders of industry, government, and academia in the field of optical and photonic sensors, systems and devices for detection, identification, prevention, sensing, security, verification and anti-counterfeiting. The chapters have recent and technically significant results, ample illustrations, figures, and key references. This book is intended for engineers and scientists in the relevant fields, graduate students, industry managers, university professors, government managers, and policy makers. Advanced Sciences and Technologies for Security Applications focuses on research monographs in the areas of -Recognition and identification (including optical imaging, biometrics, authentication, verification, and smart surveillance systems) -Biological and chemical threat detection (including bios...

  1. Qualitative and Quantitative Security Analyses for ZigBee Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender

    methods and techniques in different areas and brings them together to create an efficient verification system. The overall ambition is to provide a wide range of powerful techniques for analyzing models with quantitative and qualitative security information. We state a new approach that first verifies...... applications, home automation, and traffic control. The challenges for research in this area are due to the unique features of wireless sensor devices such as low processing power and associated low energy. On top of this, wireless sensor networks need secure communication as they operate in open fields...... low-level security protocols in a qualitative manner and guarantees absolute security, and then takes these verified protocols as actions of scenarios to be verified in a quantitative manner. Working on the emerging ZigBee wireless sensor networks, we used probabilistic verification that can return...

  2. Biomimetic actuator and sensor for robot hand

    International Nuclear Information System (INIS)

    Kim, Baekchul; Chung, Jinah; Cho, Hanjoung; Shin, Seunghoon; Lee, Hyoungsuk; Moon, Hyungpil; Choi, Hyouk Ryeol; Koo, Jachoon

    2012-01-01

    To manufacture a robot hand that essentially mimics the functions of a human hand, it is necessary to develop flexible actuators and sensors. In this study, we propose the design, manufacture, and performance verification of flexible actuators and sensors based on Electro Active Polymer (EAP). EAP is fabricated as a type of film, and it moves with changes in the voltage because of contraction and expansion in the polymer film. Furthermore, if a force is applied to an EAP film, its thickness and effective area change, and therefore, the capacitance also changes. By using this mechanism, we produce capacitive actuators and sensors. In this study, we propose an EAP based capacitive sensor and evaluate its use as a robot hand sensor
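
    A rough illustration of the capacitive sensing principle described above, treating the EAP film as a parallel-plate capacitor: applied force reduces the film thickness and enlarges the effective area, so the capacitance rises. The permittivity, pad area and thickness values are assumed for illustration, not taken from the paper.

```python
# Illustrative sketch (values assumed): compressed EAP film modeled as a
# parallel-plate capacitor whose capacitance rises under load.
EPS0 = 8.854e-12          # vacuum permittivity, F/m
eps_r = 12.0              # assumed relative permittivity of the EAP film

def capacitance(area_m2: float, thickness_m: float) -> float:
    return EPS0 * eps_r * area_m2 / thickness_m

c_rest = capacitance(1.0e-4, 50e-6)       # 1 cm^2 pad, 50 um film at rest
c_pressed = capacitance(1.05e-4, 45e-6)   # film squeezed ~10% thinner
print(f"dC/C = {(c_pressed - c_rest) / c_rest:.1%}")
```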

  3. Biomimetic actuator and sensor for robot hand

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Baekchul; Chung, Jinah; Cho, Hanjoung; Shin, Seunghoon; Lee, Hyoungsuk; Moon, Hyungpil; Choi, Hyouk Ryeol; Koo, Jachoon [Sungkyunkwan Univ., Seoul (Korea, Republic of)

    2012-12-15

    To manufacture a robot hand that essentially mimics the functions of a human hand, it is necessary to develop flexible actuators and sensors. In this study, we propose the design, manufacture, and performance verification of flexible actuators and sensors based on Electro Active Polymer (EAP). EAP is fabricated as a type of film, and it moves with changes in the voltage because of contraction and expansion in the polymer film. Furthermore, if a force is applied to an EAP film, its thickness and effective area change, and therefore, the capacitance also changes. By using this mechanism, we produce capacitive actuators and sensors. In this study, we propose an EAP based capacitive sensor and evaluate its use as a robot hand sensor.

  4. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

    In the implementation of all arms control agreements, accurate verification is essential. In setting a course for verifying compliance with a given treaty, whether the NPT or the CWC, one must make a technical comparison of existing information-gathering capabilities against the constraints in the agreement. Then it must be decided whether this level of verifiability is good enough. Generally, the policy standard of 'effective verification' includes the ability to detect significant violations, with high confidence, in sufficient time to respond effectively with policy adjustments or other responses, as needed. It is at this juncture that verification approaches have traditionally diverged: nuclear safeguards requirements have taken one path while chemical verification methods have pursued another. However, recent technological advances have brought a number of changes affecting verification, and lately their pace has been accelerating. First, all verification regimes have more and better information as a result of new kinds of sensors, imagery, and other technologies. Second, the verification provisions in agreements have also advanced to include on-site inspections, portal monitoring, data exchanges, and a variety of transparency, confidence-building, and other cooperative measures. Together these developments translate into a technological overlap between institutional verification measures such as the NPT's safeguards requirements, implemented by the IAEA, and the CWC's verification provisions, implemented by the OPCW. Hence, a priority of international treaty-implementing organizations is exploring the development of a synergistic and coordinated approach to WMD policy making that takes into account existing inter-linkages between nuclear, chemical, and biological weapons issues. Specific areas of coordination include harmonizing information systems and information exchanges and the shared application of scientific mechanisms, as well as collaboration on technological developments

  5. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  6. SENSOR++: Simulation of Remote Sensing Systems from Visible to Thermal Infrared

    Science.gov (United States)

    Paproth, C.; Schlüßler, E.; Scherbaum, P.; Börner, A.

    2012-07-01

    During the development process of a remote sensing system, the optimization and the verification of the sensor system are important tasks. To support these tasks, simulation of the sensor and its output is valuable. This enables the developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which calculates what the sensor looks at by using a ray tracing algorithm; this also determines whether the observed part of the scene is shadowed. The second part describes the radiometry and yields the spectral at-sensor radiance from the visible spectrum to the thermal infrared according to the simulated sensor type. In the case of earth remote sensing, it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by using an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, for example a camera simulation for a moon lander. Finally, the verification of SENSOR++ is demonstrated.

  7. The Linearity of Optical Tomography: Sensor Model and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Siti Zarina MOHD. MUJI

    2011-09-01

    Full Text Available The aim of this paper is to show the linearity of an optical sensor. Linearity of the sensor response is essential in optical tomography applications, as it affects the tomogram result. Two types of testing are used: testing with a voltage parameter and testing with a time parameter. In the former, the voltage is measured when an obstacle is placed between the transmitter and the receiver; the obstacle diameters range from 0.5 to 3 mm. The latter uses the same setup but with a larger obstacle, 59.24 mm in diameter, and measures the time the ball takes to cross the sensing area of the circuit. Both results show a linear relation, which proves that the optical sensors are suitable for process tomography applications.
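
    The linearity check itself can be reproduced with a simple least-squares fit; the sketch below uses invented voltage readings to show the idea of fitting sensor output against obstacle diameter and reporting the coefficient of determination.

```python
# Sketch of the linearity check described above, with made-up readings:
# fit sensor output voltage against obstacle diameter and report R^2.
import numpy as np

diameter_mm = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
voltage_v   = np.array([4.71, 4.42, 4.15, 3.86, 3.60, 3.31])  # assumed data

slope, intercept = np.polyfit(diameter_mm, voltage_v, 1)
pred = slope * diameter_mm + intercept
ss_res = np.sum((voltage_v - pred) ** 2)
ss_tot = np.sum((voltage_v - voltage_v.mean()) ** 2)
print(f"slope = {slope:.3f} V/mm, R^2 = {1 - ss_res / ss_tot:.4f}")
```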

  8. Response time verification of in situ hydraulic pressure sensors in a nuclear reactor

    International Nuclear Information System (INIS)

    Foster, C.G.

    1978-01-01

    A method and apparatus for verifying response time in situ of hydraulic pressure and pressure differential sensing instrumentation in a nuclear circuit is disclosed. Hydraulic pressure at a reference sensor and at an in situ process sensor under test is varied according to a linear ramp. Sensor response time is then determined by comparison of the sensor electrical analog output signals. The process sensor is subjected to a relatively slowly changing and a relatively rapidly changing hydraulic pressure ramp signal to determine an upper bound for process sensor response time over the range of all pressure transients to which the sensor is required to respond. Signal linearity is independent of the volumetric displacement of the process sensor. The hydraulic signal generator includes a first pressurizable gas reservoir, a second pressurizable liquid and gas reservoir, a gate for rapidly opening a gas communication path between the two reservoirs, a throttle valve for regulating rate of gas pressure equalization between the two reservoirs, and hydraulic conduit means for simultaneously communicating a ramp of hydraulic pressure change between the liquid/gas reservoir and both a reference and a process sensor. By maintaining a sufficient pressure differential between the reservoirs and by maintaining a sufficient ratio of gas to liquid in the liquid/gas reservoir, excellent linearity and minimal transient effects can be achieved for all pressure ranges, magnitudes, and rates of change of interest
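
    A hedged sketch of the comparison principle, not the patented apparatus: both sensors are exposed to the same pressure ramp, and the process sensor's response time is estimated as the time shift that best aligns its analog output with the reference sensor's output. The ramp shape, lag and sample rate below are assumed.

```python
# Estimate process-sensor lag as the shift minimizing the mismatch between
# the reference and process sensor outputs for the same pressure ramp.
import numpy as np

dt = 1e-3                                   # sample period, s
t = np.arange(0.0, 2.0, dt)
ramp = np.clip(t - 0.2, 0.0, 1.0)           # reference sensor output (a.u.)
tau = 0.05                                  # assumed process-sensor lag, s
lagged = np.clip(t - 0.2 - tau, 0.0, 1.0)   # process sensor output

shifts = np.arange(0, 200)                  # candidate lags: 0..0.2 s
errors = [np.mean((ramp[:len(ramp) - s] - lagged[s:]) ** 2) for s in shifts]
best = shifts[int(np.argmin(errors))]
print(f"estimated response-time lag ~ {best * dt * 1e3:.0f} ms")
```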

  9. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence that verifies head impact events recorded by wearable sensors. To utilize video analysis to verify head impact events recorded by wearable sensors and describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664
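
    The verification logic reduces to filtering sensor-logged impacts at the 20 g threshold and matching each one against time-synchronized video events; the sketch below illustrates this with invented timestamps and a hypothetical tolerance window.

```python
# Illustrative sketch of the verification workflow (all values invented):
# keep sensor-logged impacts >= 20 g and check each against time-synchronized
# video events within a small tolerance window.
sensor_impacts = [  # (time_s, peak_linear_acceleration_g)
    (12.4, 35.0), (87.1, 18.5), (133.0, 52.3), (402.7, 24.9),
]
video_events_s = [12.6, 133.1, 250.0]   # impact times identified on video

TOLERANCE_S = 1.0                       # hypothetical matching window
verified = []
for t_hit, pla in sensor_impacts:
    if pla < 20.0:
        continue                        # below inclusion threshold
    if any(abs(t_hit - t_vid) <= TOLERANCE_S for t_vid in video_events_s):
        verified.append((t_hit, pla))

print(f"{len(verified)} of {sum(p >= 20 for _, p in sensor_impacts)} impacts verified")
```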

  10. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. Developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling way to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species, owing to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements

  11. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  12. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    ; qualitative and quantitative measurements of nuclear material; familiarity and access to sensitive technologies related to detection, unattended verification systems, containment/surveillance and sensors; examination and verification of design information of large and complex facilities; theoretical and practical aspects of technologies relevant to verification objectives; analysis of inspection findings and evaluation of their mutual consistency; negotiations on technical issues with facility operators and State authorities. This experience is reflected in the IAEA Safeguards Manual, which sets out the policies and procedures to be followed in the inspection process, as well as in the Safeguards Criteria, which provide guidance for verification, evaluation and analysis of the inspection findings. The IAEA infrastructure and its experience with verification permitted the organization, in 1991, to respond immediately and successfully to the tasks required by Security Council Resolution 687 (1991) for Iraq, as well as to the tasks related to the verification of the completeness and correctness of the initial declarations in the cases of the DPRK and of S. Africa. In the case of Iraq, the discovery of its undeclared programs was made possible through the existing verification system, enhanced by additional access rights, information and the application of modern detection technology. Such discoveries made it evident that an intensive development effort was needed to strengthen the safeguards system and develop a capability to detect undeclared activities. For this purpose it was recognized that there was a need for additional and extended (a) access to information and (b) access to locations. It was also obvious that access to the Security Council, to bring the IAEA closer to the body responsible for the maintenance of international peace and security, would be a requirement for reporting periodically on non-proliferation and the results of the IAEA's verification activities. While the case

  13. Multi-sensor Cloud Retrieval Simulator and Remote Sensing from Model Parameters. Pt. 1; Synthetic Sensor Radiance Formulation

    Science.gov (United States)

    Wind, G.; DaSilva, A. M.; Norris, P. M.; Platnick, S.

    2013-01-01

    In this paper we describe a general procedure for calculating synthetic sensor radiances from variable output from a global atmospheric forecast model. In order to take proper account of the discrepancies between model resolution and sensor footprint, the algorithm takes explicit account of the model subgrid variability, in particular its description of the probability density function of total water (vapor and cloud condensate). The simulated sensor radiances are then substituted into an operational remote sensing algorithm processing chain to produce a variety of remote sensing products that would normally be produced from actual sensor output. This output can then be used for a wide variety of purposes such as model parameter verification, remote sensing algorithm validation, testing of new retrieval methods and future sensor studies. We show a specific implementation using the GEOS-5 model, the MODIS instrument and the MODIS Adaptive Processing System (MODAPS) Data Collection 5.1 operational remote sensing cloud algorithm processing chain (including the cloud mask, cloud top properties and cloud optical and microphysical properties products). We focus on clouds because they are very important to model development and improvement.

  14. Fast regional readout CMOS Image Sensor for dynamic MLC tracking

    Science.gov (United States)

    Zin, H.; Harris, E.; Osmond, J.; Evans, P.

    2014-03-01

    Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring of the dose rate. This work explores the feasibility of a prototype Complementary Metal-Oxide-Semiconductor Image Sensor (CIS) for tracking these complex treatments by utilising fast, region of interest (ROI) readout functionality. An automatic edge tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged at various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor to verify treatment delivery involving dynamic MLC at up to ~400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPID). CIS provides the basis for an essential real-time verification tool, useful in assessing accurate delivery of complex high energy radiation to the tumour and ultimately in achieving better cure rates for cancer patients.
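
    A minimal sketch of the leaf-edge localization idea (not the authors' algorithm): within one detector row, the MLC leaf edge can be taken as the position of the steepest intensity gradient, refined to sub-pixel precision with a parabolic fit. Pixel pitch, intensities and noise level below are assumed.

```python
# Locate the leaf edge in one detector row as the steepest intensity gradient,
# with a parabolic sub-pixel refinement around the gradient peak.
import numpy as np

pixel_pitch_mm = 0.05
row = np.concatenate([np.full(200, 1000.0), np.full(200, 120.0)])  # open/blocked
row += np.random.default_rng(0).normal(0.0, 5.0, row.size)          # sensor noise

g = np.abs(np.gradient(row))
k = int(np.argmax(g))
denom = g[k - 1] - 2 * g[k] + g[k + 1]
offset = 0.5 * (g[k - 1] - g[k + 1]) / denom if denom != 0 else 0.0
edge_mm = (k + offset) * pixel_pitch_mm
print(f"leaf edge at {edge_mm:.3f} mm from row start")
```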

  15. Fast regional readout CMOS image sensor for dynamic MLC tracking

    International Nuclear Information System (INIS)

    Zin, H; Harris, E; Osmond, J; Evans, P

    2014-01-01

    Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring of the dose rate. This work explores the feasibility of a prototype Complementary Metal-Oxide-Semiconductor Image Sensor (CIS) for tracking these complex treatments by utilising fast, region of interest (ROI) readout functionality. An automatic edge tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged at various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor to verify treatment delivery involving dynamic MLC at up to ∼400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPID). CIS provides the basis for an essential real-time verification tool, useful in assessing accurate delivery of complex high energy radiation to the tumour and ultimately in achieving better cure rates for cancer patients.

  16. Advances in the Processing of VHR Optical Imagery in Support of Safeguards Verification

    International Nuclear Information System (INIS)

    Niemeyer, I.; Listner, C.; Canty, M.

    2015-01-01

    Under the Additional Protocol of the Non-Proliferation Treaty (NPT), complementing the safeguards agreements between States and the International Atomic Energy Agency, commercial satellite imagery, preferably acquired by very high-resolution (VHR) satellite sensors, is an important source of safeguards-relevant information. Satellite imagery can assist in the evaluation of site declarations, design information verification, the detection of undeclared nuclear facilities, and the preparation of inspections or other visits. With the IAEA's Geospatial Exploitation System (GES), satellite imagery and other geospatial information such as site plans of nuclear facilities are available to a broad range of inspectors, analysts and country officers. The demand for spatial information and new tools to analyze this data is growing, together with the rising number of nuclear facilities under safeguards worldwide. Automated, computer-driven processing of satellite imagery could therefore add significant value to the safeguards verification process. Examples are satellite imagery pre-processing algorithms specially developed for new sensors, tools for pixel-based or object-based image analysis, and geoprocessing tools that generate additional safeguards-relevant information. In the last decade, procedures for automated (pre-)processing of satellite imagery have evolved considerably. This paper aims at testing pixel-based and object-based procedures for automated change detection and classification in support of safeguards verification. Taking different nuclear sites as examples, these methods are evaluated and compared with regard to their suitability for (semi-)automatically extracting safeguards-relevant information. (author)

  17. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  18. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  19. Low-Cost Planar PTF Sensors for the Identity Verification of Smartcard Holders

    NARCIS (Netherlands)

    Henderson, N.J.; Papakostas, T.V.; White, N.M.; Hartel, Pieter H.

    The properties of mechanical flexibility, low-cost and planar geometry make polymer thick film (PTF) sensors attractive for embedded smartcard biometrics. PTF piezoelectric and piezoresistive pressure sensors are investigated for their potential to capture spatial human characteristics. However, it

  20. Design of a low-power testbed for Wireless Sensor Networks and verification

    NARCIS (Netherlands)

    van Hoesel, L.F.W.; Dulman, S.O.; Havinga, Paul J.M.; Kip, Harry J.

    In this document the design considerations and component choices of a testbed prototype device for wireless sensor networks will be discussed. These devices must be able to monitor their physical environment, process data and assist other nodes in forwarding sensor readings. For these tasks, five

  1. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    International Nuclear Information System (INIS)

    Brau-Avila, A; Valenzuela-Galvan, M; Herrera-Jimenez, V M; Santolaria, J; Aguilar, J J; Acero, R

    2017-01-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs. (paper)

  2. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    Science.gov (United States)

    Brau-Avila, A.; Santolaria, J.; Acero, R.; Valenzuela-Galvan, M.; Herrera-Jimenez, V. M.; Aguilar, J. J.

    2017-03-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs.

  3. Measurement and Verification of Energy Savings and Performance from Advanced Lighting Controls

    Energy Technology Data Exchange (ETDEWEB)

    PNNL

    2016-02-21

    This document provides a framework for measurement and verification (M&V) of energy savings, performance, and user satisfaction from lighting retrofit projects involving occupancy-sensor-based, daylighting, and/or other types of automatic lighting. It was developed to provide site owners, contractors, and other involved organizations with the essential elements of a robust M&V plan for retrofit projects and to assist in developing specific project M&V plans.

  4. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems which intermix two kinds of components: discrete components and continuous components. The continuous components, usually called plants, are subject to disturbances which cause the state variables of the system to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper gives a design methodology for hybrid systems as an example of the specification and verification of hybrid systems. The design methodology is based on cooperation between two disciplines, control engineering and computer science, and divides the design into control loops and decision loops. The external behavior of the control loops is specified in a notation which is understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is also guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of the control loops. The verification of system requirements can be done by computing scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as a formal notation for specifying the control loops and designing the decision loops

  5. SENSORS FAULT DIAGNOSIS ALGORITHM DESIGN OF A HYDRAULIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Matej ORAVEC

    2017-06-01

    Full Text Available This article presents the design of a sensor fault diagnosis system for a hydraulic system, based on a group of three fault estimation filters. These filters are used to estimate the system states and the sensor fault magnitudes. The article also briefly states the state control design of the hydraulic system with an integrator, which is an important assumption for the fault diagnosis system design. The sensor fault diagnosis system is implemented in the Matlab/Simulink environment and verified using a simulation model of the controlled hydraulic system. Verification of the designed fault diagnosis system is realized by a series of experiments that simulate sensor faults. The results of the experiments are briefly presented in the last part of this article.
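
    A minimal sketch of residual-based sensor fault detection under assumed values (a scalar plant and observer gain invented for illustration, not the paper's hydraulic model or its fault estimation filters): the measurement residual of a state observer is monitored and a fault is flagged when it exceeds a threshold.

```python
# Residual-based sensor fault detection on an invented scalar plant.
import numpy as np

A, B, C = 0.95, 0.05, 1.0          # assumed discrete-time plant
L = 0.6                            # assumed observer gain
THRESHOLD = 0.5                    # assumed residual alarm threshold

rng = np.random.default_rng(2)
x = x_hat = 0.0
for k in range(300):
    u = 1.0                                    # constant demand
    x = A * x + B * u + rng.normal(0, 0.01)    # true state with process noise
    y = C * x + rng.normal(0, 0.02)            # sensor reading
    if k >= 150:
        y += 2.0                               # injected sensor bias fault
    residual = y - C * x_hat
    x_hat = A * x_hat + B * u + L * residual   # observer update
    if abs(residual) > THRESHOLD:
        print(f"sensor fault flagged at step {k}")
        break
```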

  6. Power Performance Verification of a Wind Farm Using the Friedman's Test.

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L

    2016-06-03

    In this paper, a method for verifying the power performance of a wind farm is presented. The method is based on the Friedman's test, a nonparametric statistical inference technique, and it uses the information collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is treated as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method indicates whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
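
    The statistical core of the method can be sketched with SciPy's implementation of the Friedman test; the power values below are invented, with each row a wind-speed bin and each column a turbine, the guaranteed power curve acting as one additional "turbine".

```python
# Friedman test across turbines (columns) over wind-speed bins (rows);
# all power values are invented for illustration.
from scipy.stats import friedmanchisquare

guaranteed = [310, 620, 1030, 1490, 1900]   # guaranteed power curve, kW
turbine_a  = [305, 612, 1021, 1478, 1885]
turbine_b  = [298, 601, 1005, 1451, 1850]
turbine_c  = [311, 625, 1032, 1488, 1902]

stat, p_value = friedmanchisquare(guaranteed, turbine_a, turbine_b, turbine_c)
print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.3f}")
```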

  7. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is generally painstakingly slow, but from time to time 'windows of opportunity' - moments when ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  8. Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor

    Science.gov (United States)

    Gafurov, Davrondzhon; Bours, Patrick

    In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5%, and the identification rate at rank 1 was 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over the previous study using the same data set.
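
    A simplified sketch of the two analysis steps named above, with assumed parameters and synthetic data: gait cycles are detected as peaks roughly one step apart in the hip acceleration signal, each cycle is resampled to a fixed length, and a probe is scored against an enrolled template by the average cycle-to-cycle distance (a threshold on this score would set the EER operating point). This is not the authors' matching algorithm, only the general idea.

```python
# Gait-cycle detection and template matching on a synthetic hip signal.
import numpy as np
from scipy.signal import find_peaks

FS = 100                                  # assumed sample rate, Hz

def detect_cycles(acc, min_cycle_s=0.8):
    peaks, _ = find_peaks(acc, distance=int(min_cycle_s * FS))
    cycles = []
    for a, b in zip(peaks[:-1], peaks[1:]):
        seg = acc[a:b]
        # Resample every cycle to a fixed length of 100 samples.
        cycles.append(np.interp(np.linspace(0, len(seg) - 1, 100),
                                np.arange(len(seg)), seg))
    return cycles

def match_score(cycles_probe, cycles_template):
    d = [np.linalg.norm(c - t) for c in cycles_probe for t in cycles_template]
    return float(np.mean(d))              # lower = more similar

t = np.arange(0, 10, 1 / FS)
walk = np.sin(2 * np.pi * 1.0 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)
enrolled = detect_cycles(walk)
print(f"{len(enrolled)} cycles, self-match score = {match_score(enrolled, enrolled):.2f}")
```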

  9. Improved detection limits of bacterial endotoxins using new type of planar interdigital sensors

    KAUST Repository

    Syaifudin, A. R Mohd

    2012-10-01

    New types of planar interdigital sensors were fabricated by photolithography and etching techniques on a silicon/silicon dioxide (Si/SiO2) wafer (single side polished). The sensors were then coated with APTES (3-aminopropyltriethoxysilane), a cross-linker used to bind Polymyxin B (PmB) molecules to the electrode surface. PmB is an antimicrobial peptide produced by the Gram-positive bacterium Bacillus which has specific binding properties to lipopolysaccharide (LPS). This paper will discuss the fabrication process, the coating and immobilization procedures, and an analysis of the sensors' performance based on the impedance spectroscopy method. The sensor sensitivity was compared to the standard ToxinSensor Chromogenic LAL Endotoxin Assay Kit for verification. © 2012 IEEE.

  10. Continuous and recurrent testing of acoustic emission sensors

    International Nuclear Information System (INIS)

    Sause, Markus G.R.; Schmitt, Stefan; Potstada, Philipp

    2017-01-01

    In many fields of application of acoustic emission, the testing can lead to a lasting change in the sensor characteristics. This can be caused by mechanical damage, thermal stress or use under aggressive environmental conditions. Irrespective of visually detectable damage to the sensors, a shift in the spectral sensitivity, a reduction in the absolute sensitivity or a reduction in the signal-to-noise ratio can occur. This requires a means of periodically checking the sensors during testing, including the coupling aids used. For recurring testing, recommendations are given in Directive SE 02 ''Verification of acoustic emission sensors and their coupling in the laboratory''. This paper discusses possibilities for continuous monitoring of the sensors during the test and presents an application example for the partly automated recurring testing of acoustic emission sensors using Directive SE 02. For this purpose, a test stand for the supply of the sensors to be tested was constructed, and the signal recording and data reduction were implemented in freely available software programs. The operating principle is demonstrated using selected case studies.

  11. An Embedded Sensor Node Microcontroller with Crypto-Processors.

    Science.gov (United States)

    Panić, Goran; Stecklina, Oliver; Stamenković, Zoran

    2016-04-27

    Wireless sensor network applications range from industrial automation and control, agricultural and environmental protection, to surveillance and medicine. In most applications, data are highly sensitive and must be protected from any type of attack and abuse. Security challenges in wireless sensor networks are mainly defined by the power and computing resources of sensor devices, memory size, quality of radio channels and susceptibility to physical capture. In this article, an embedded sensor node microcontroller designed to support sensor network applications with severe security demands is presented. It features a low power 16-bit processor core supported by a number of hardware accelerators designed to perform complex operations required by advanced crypto algorithms. The microcontroller integrates an embedded Flash and an 8-channel 12-bit analog-to-digital converter making it a good solution for low-power sensor nodes. The article discusses the most important security topics in wireless sensor networks and presents the architecture of the proposed hardware solution. Furthermore, it gives details on the chip implementation, verification and hardware evaluation. Finally, the chip power dissipation and performance figures are estimated and analyzed.

  12. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  13. Computer code determination of tolerable accel current and voltage limits during startup of an 80 kV MFTF sustaining neutral beam source

    International Nuclear Information System (INIS)

    Mayhall, D.J.; Eckard, R.D.

    1979-01-01

    We have used a Lawrence Livermore Laboratory (LLL) version of the WOLF ion source extractor design computer code to determine tolerable accel current and voltage limits during startup of a prototype 80 kV Mirror Fusion Test Facility (MFTF) sustaining neutral beam source. Arc current limits are also estimated. The source extractor has gaps of 0.236, 0.721, and 0.155 cm. The effective ion mass is 2.77 AMU. The measured optimum accel current density is 0.266 A/cm². The gradient grid electrode runs at 5/6 V_a (accel voltage). The suppressor electrode voltage is zero for V_a < 3 kV and -3 kV for V_a ≥ 3 kV. The accel current density for optimum beam divergence is obtained for 1 kV ≤ V_a ≤ 80 kV, as are the beam divergence and emittance
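
    The WOLF extractor code itself is not reproduced here, but a textbook Child-Langmuir estimate gives a rough upper bound on the accel current density that the first gap can support. The sketch below assumes the first gap holds about V_a/6 (with the gradient grid at 5/6 V_a) and uses the 2.77 AMU effective ion mass and 0.236 cm gap quoted above; it is only a sanity check, not the extractor optics calculation.

```python
# Child-Langmuir space-charge limit for a planar gap:
# J = (4*eps0/9) * sqrt(2*q/m) * V^(3/2) / d^2
import math

EPS0 = 8.854e-12   # F/m
Q = 1.602e-19      # C
AMU = 1.661e-27    # kg

def child_langmuir_j(voltage_v: float, gap_m: float, mass_amu: float) -> float:
    """Space-charge-limited current density in A/m^2."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * Q / (mass_amu * AMU)) \
        * voltage_v ** 1.5 / gap_m ** 2

v_first_gap = 80e3 / 6.0          # assumed: gradient grid at 5/6 Va leaves ~Va/6 on gap 1
j_limit = child_langmuir_j(v_first_gap, 0.236e-2, 2.77)
print(f"space-charge limit ~ {j_limit / 1e4:.2f} A/cm^2 "
      f"(vs 0.266 A/cm^2 measured optimum)")
```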

  14. A Probabilistic Mass Estimation Algorithm for a Novel 7- Channel Capacitive Sample Verification Sensor

    Science.gov (United States)

    Wolf, Michael

    2012-01-01

    A document describes an algorithm created to estimate the mass placed on a sample verification sensor (SVS) designed for lunar or planetary robotic sample return missions. A novel SVS measures the capacitance between a rigid bottom plate and an elastic top membrane in seven locations. As additional sample material (soil and/or small rocks) is placed on the top membrane, the deformation of the membrane increases the capacitance. The mass estimation algorithm addresses both the calibration of each SVS channel and how to combine the capacitances read from each of the seven channels into a single mass estimate. The probabilistic approach combines the channels according to the variance observed during the training phase, and provides not only the mass estimate but also a value for the certainty of the estimate. SVS capacitance data is collected for known masses under a wide variety of possible loading scenarios, though in all cases the distribution of sample within the canister is expected to be approximately uniform. A capacitance-vs-mass curve is fitted to this data and is subsequently used to determine the mass estimate from a single channel's capacitance reading during the measurement phase. This results in seven different mass estimates, one for each SVS channel. Moreover, the variance of the calibration data is used to place a Gaussian probability distribution function (pdf) around this mass estimate. To blend these seven estimates, the seven pdfs are combined into a single Gaussian distribution function, providing the final mean and variance of the estimate. This blending technique essentially takes the final estimate as an average of the estimates of the seven channels, weighted by the inverse of each channel's variance.
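
    The blending step described above amounts to inverse-variance weighting of the seven per-channel Gaussian estimates; the sketch below uses invented channel readings and variances.

```python
# Fuse seven Gaussian mass estimates by inverse-variance weighting;
# channel values and variances are invented for illustration.
channel_estimates_g = [101.0, 98.5, 103.2, 99.8, 100.4, 97.9, 102.1]
channel_variances   = [4.0,   2.5,  6.0,   3.0,  2.0,   5.5,  3.5]   # g^2

weights = [1.0 / v for v in channel_variances]
fused_mass = sum(w * m for w, m in zip(weights, channel_estimates_g)) / sum(weights)
fused_variance = 1.0 / sum(weights)

print(f"mass = {fused_mass:.2f} g, sigma = {fused_variance ** 0.5:.2f} g")
```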

  15. Verification of Remote Inspection Techniques for Reactor Internal Structures of Liquid Metal Reactor

    International Nuclear Information System (INIS)

    Joo, Young Sang; Lee, Jae Han

    2007-02-01

    The reactor internal structures and components of a liquid metal reactor (LMR) are submerged in hot sodium of reactor vessel. The division 3 of ASME code section XI specifies the visual inspection as major in-service inspection (ISI) methods of reactor internal structures and components. Reactor internals of LMR can not be visually examined due to opaque liquid sodium. The under-sodium viewing techniques using an ultrasonic wave should be applied for the visual inspection of reactor internals. Recently, an ultrasonic waveguide sensor with a strip plate has been developed for an application to the under-sodium inspection. In this study, visualization technique, ranging technique and monitoring technique have been suggested for the remote inspection of reactor internals by using the waveguide sensor. The feasibility of these remote inspection techniques using ultrasonic waveguide sensor has been evaluated by an experimental verification

  16. Verification of Remote Inspection Techniques for Reactor Internal Structures of Liquid Metal Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Young Sang; Lee, Jae Han

    2007-02-15

    The reactor internal structures and components of a liquid metal reactor (LMR) are submerged in hot sodium of reactor vessel. The division 3 of ASME code section XI specifies the visual inspection as major in-service inspection (ISI) methods of reactor internal structures and components. Reactor internals of LMR can not be visually examined due to opaque liquid sodium. The under-sodium viewing techniques using an ultrasonic wave should be applied for the visual inspection of reactor internals. Recently, an ultrasonic waveguide sensor with a strip plate has been developed for an application to the under-sodium inspection. In this study, visualization technique, ranging technique and monitoring technique have been suggested for the remote inspection of reactor internals by using the waveguide sensor. The feasibility of these remote inspection techniques using ultrasonic waveguide sensor has been evaluated by an experimental verification.

  17. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The need for verification of software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  18. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  19. Development of a Meso-Scale Fiberoptic Rotation Sensor for a Torsion Actuator.

    Science.gov (United States)

    Sheng, Jun; Desai, Jaydev P

    2018-01-01

    This paper presents the development of a meso-scale fiberoptic rotation sensor for a shape memory alloy (SMA) torsion actuator for neurosurgical applications. Within the sensor, a rotary head with a reflecting surface is capable of modulating the light intensity collected by optical fibers when the rotary head is coupled to the torsion actuator. The mechanism of light intensity modulation is modeled, followed by experimental model verification. Meanwhile, working performances for different rotary head designs, optical fibers, and fabrication materials are compared. After the calibration of the fiberoptic rotation sensor, the sensor is capable of precisely measuring rotary motion and controlling the SMA torsion actuator with feedback control.

  20. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  1. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  2. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  3. Threats of Password Pattern Leakage Using Smartwatch Motion Recognition Sensors

    Directory of Open Access Journals (Sweden)

    Jihun Kim

    2017-06-01

    Full Text Available Thanks to the development of Internet of Things (IoT) technologies, wearable markets have been growing rapidly. Smartwatches can be said to be the most representative product in wearable markets, and involve various hardware technologies in order to overcome the limitations of small hardware. Motion recognition sensors are a representative example of those hardware technologies. However, smartwatches and motion recognition sensors that can be worn by users may pose security threats of password pattern leakage. In the present paper, password patterns entered by users are inferred through experiments using motion recognition sensors, and the results and their accuracy are verified.

  4. Resonance-induced sensitivity enhancement method for conductivity sensors

    Science.gov (United States)

    Tai, Yu-Chong (Inventor); Shih, Chi-yuan (Inventor); Li, Wei (Inventor); Zheng, Siyang (Inventor)

    2009-01-01

    Methods and systems for improving the sensitivity of a variety of conductivity sensing devices, in particular capacitively-coupled contactless conductivity detectors. A parallel inductor is added to the conductivity sensor. The sensor with the parallel inductor is operated at a resonant frequency of the equivalent circuit model. At the resonant frequency, parasitic capacitances that are either in series or in parallel with the conductance (and possibly a series resistance) are substantially removed from the equivalent circuit, leaving a purely resistive impedance. An appreciably higher sensor sensitivity results. Experimental verification shows that sensitivity improvements of the order of 10,000-fold are possible. Examples of detecting particulates with high precision by application of the apparatus and methods of operation are described.
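
    The enhancement hinges on operating at the LC resonance formed by the added inductor and the parasitic capacitance. A minimal Python sketch of that frequency choice is given below; the component values are hypothetical and serve only to illustrate the relation f = 1/(2*pi*sqrt(L*C)).

    ```python
    import math

    # Choose the operating frequency of a conductivity cell with a parallel
    # inductor so that the inductor resonates with the parasitic (wall)
    # capacitance, leaving a mainly resistive impedance. Values are illustrative.
    def resonant_frequency_hz(inductance_h, capacitance_f):
        """Resonant frequency of an LC pair: f = 1 / (2*pi*sqrt(L*C))."""
        return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

    if __name__ == "__main__":
        L = 100e-6   # 100 uH parallel inductor (hypothetical)
        C = 10e-12   # 10 pF parasitic coupling capacitance (hypothetical)
        print(f"Operate the detector near {resonant_frequency_hz(L, C)/1e6:.2f} MHz")
    ```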

  5. Equivalent Sensor Radiance Generation and Remote Sensing from Model Parameters. Part 1; Equivalent Sensor Radiance Formulation

    Science.gov (United States)

    Wind, Galina; DaSilva, Arlindo M.; Norris, Peter M.; Platnick, Steven E.

    2013-01-01

    In this paper we describe a general procedure for calculating equivalent sensor radiances from variables output from a global atmospheric forecast model. In order to take proper account of the discrepancies between model resolution and sensor footprint, the algorithm takes explicit account of the model subgrid variability, in particular its description of the probability density function of total water (vapor and cloud condensate). The equivalent sensor radiances are then substituted into an operational remote sensing algorithm processing chain to produce a variety of remote sensing products that would normally be produced from actual sensor output. This output can then be used for a wide variety of purposes such as model parameter verification, remote sensing algorithm validation, testing of new retrieval methods and future sensor studies. We show a specific implementation using the GEOS-5 model, the MODIS instrument and the MODIS Adaptive Processing System (MODAPS) Data Collection 5.1 operational remote sensing cloud algorithm processing chain (including the cloud mask, cloud top properties and cloud optical and microphysical properties products). We focus on clouds and cloud/aerosol interactions, because they are very important to model development and improvement.

  6. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  7. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-01-01

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method indicates whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628

  8. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2016-06-01

    Full Text Available In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method indicates whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
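
    As an illustration of the statistical core of the method described in the two records above, the Python sketch below applies Friedman's test to synthetic binned power data, with the guaranteed power curve treated as one more turbine; the data and bins are hypothetical, not those of the assessed wind farm.

    ```python
    import numpy as np
    from scipy.stats import friedmanchisquare

    # Synthetic binned power data (kW) for three turbines plus the guaranteed
    # power curve, treated as one more "turbine" of the farm under assessment.
    rng = np.random.default_rng(0)
    wind_speed_bins = np.arange(4, 16)                     # m/s bins (illustrative)
    guaranteed = 50.0 * (wind_speed_bins - 3.0)            # toy guaranteed power curve
    turbine_a = guaranteed + rng.normal(0, 5, guaranteed.size)
    turbine_b = guaranteed + rng.normal(0, 5, guaranteed.size)
    turbine_c = guaranteed - 15.0 + rng.normal(0, 5, guaranteed.size)  # underperformer

    stat, p_value = friedmanchisquare(guaranteed, turbine_a, turbine_b, turbine_c)
    print(f"Friedman statistic = {stat:.2f}, p = {p_value:.4f}")
    # A small p-value indicates that at least one turbine's power performance
    # differs; a multiple-comparison step would then locate the differing pairs.
    ```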

  9. Radiometric, geometric, and image quality assessment of ALOS AVNIR-2 and PRISM sensors

    Science.gov (United States)

    Saunier, S.; Goryl, P.; Chander, G.; Santer, R.; Bouvet, M.; Collet, B.; Mambimba, A.; Kocaman, Aksakal S.

    2010-01-01

    The Advanced Land Observing Satellite (ALOS) was launched on January 24, 2006, by a Japan Aerospace Exploration Agency (JAXA) H-IIA launcher. It carries three remote-sensing sensors: 1) the Advanced Visible and Near-Infrared Radiometer type 2 (AVNIR-2); 2) the Panchromatic Remote-Sensing Instrument for Stereo Mapping (PRISM); and 3) the Phased-Array type L-band Synthetic Aperture Radar (PALSAR). Within the framework of ALOS Data European Node, as part of the European Space Agency (ESA), the European Space Research Institute worked alongside JAXA to provide contributions to the ALOS commissioning phase plan. This paper summarizes the strategy that was adopted by ESA to define and implement a data verification plan for missions operated by external agencies; these missions are classified by the ESA as third-party missions. The ESA was supported in the design and execution of this plan by GAEL Consultant. The verification of ALOS optical data from PRISM and AVNIR-2 sensors was initiated 4 months after satellite launch, and a team of principal investigators assembled to provide technical expertise. This paper includes a description of the verification plan and summarizes the methodologies that were used for radiometric, geometric, and image quality assessment. The successful completion of the commissioning phase has led to the sensors being declared fit for operations. The consolidated measurements indicate that the radiometric calibration of the AVNIR-2 sensor is stable and agrees with the Landsat-7 Enhanced Thematic Mapper Plus and the Envisat MEdium-Resolution Imaging Spectrometer calibration. The geometrical accuracy of PRISM and AVNIR-2 products improved significantly and remains under control. The PRISM modulation transfer function is monitored for improved characterization.

  10. Pioneer Venus Star Sensor. [active despin control application

    Science.gov (United States)

    Gutshall, R. L.; Thomas, G.

    1979-01-01

    The design predictions and orbital performance verification of the solid state star scanner used in the Onboard Attitude Control of the Pioneer Venus Orbiter and Multiprobe are presented. The star sensor extended the scanner use to active despin control, and it differs from previous sensors in solid state detection, redundant electronics for reliability, larger field of view, and large dynamic spin range. The star scanner hardware and design performance based on the ability to predict all noise sources, signal magnitudes, and expected detection probability are discussed. In-flight data collected to verify sensor ground calibration are tabulated and plotted in predicted accuracy curves. It is concluded that the Pioneer Venus Star Sensor has demonstrated predictable star calibration in the range of 0.1 magnitude uncertainties and usable star catalogs of 100 stars with very high probabilities of detection, which were much better than expected and well within the mission requirements.

  11. Remote sensing and geoinformation technologies in support of nuclear non-proliferation and arms control verification regimes

    Energy Technology Data Exchange (ETDEWEB)

    Niemeyer, Irmgard [Forschungszentrum Juelich GmbH, Institut fuer Energie- und Klimaforschung, IEK-6: Nukleare Entsorgung und Reaktorsicherheit (Germany)

    2013-07-01

    A number of international agreements and export control regimes have been concluded in order to reduce the risk and proliferation of weapons of mass destruction. In order to provide confidence that Member States are complying with the agreed commitments, most of the treaties and agreements include verification provisions. Different types of verification measures exist, e.g. cooperative measures; national technical means; technical monitoring or measurement devices placed at or near sites; on-site inspections; intelligence information; open-source information, such as commercial internet data and satellite imagery. The study reviews the technical progress in the field of satellite imaging sensors and explores the recent advances in satellite imagery processing and geoinformation technologies with respect to the extraction of significant observables and signatures. Moreover, it discusses how satellite data and geoinformation technologies could be used in a complementary way for confirming information gathered from other systems or sources. The study also aims at presenting the legal and political aspects and the cost benefits of using imagery from both national and commercial satellites in the verification procedure. The study concludes that satellite imagery and geoinformation technologies are expected to enhance the verification efficiency and effectiveness.

  12. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of FMCT verification provision. This paper will explore the general concerns on FMCT verification; and demonstrate what verification measures might be applied to those reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non- explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently being applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of the FMCT verifications will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  13. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  14. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  15. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed

  16. Temperature-independent fiber-Bragg-grating-based atmospheric pressure sensor

    Science.gov (United States)

    Zhang, Zhiguo; Shen, Chunyan; Li, Luming

    2018-03-01

    Atmospheric pressure is an important parameter for altitude measurement in modern aircraft; moreover, it is also an indispensable parameter in meteorological telemetry systems. With the development of society, people are increasingly concerned about the weather, and accurate and convenient atmospheric pressure measurements can provide strong support for meteorological analysis. However, the electronic atmospheric pressure sensors currently in use suffer from several shortcomings. After analysis and discussion, we propose an innovative structural design in which a vacuum membrane box and a temperature-independent strain sensor, based on an equal-strength cantilever beam structure and fiber Bragg grating (FBG) sensors, are used. We provide experimental verification that the atmospheric pressure sensor has the characteristics of a simple structure, no need for an external power supply, automatic temperature compensation, and high sensitivity. The sensor system has good repeatability and a sensitivity of up to 100 nm/MPa. In addition, the device exhibits the desired hysteresis behavior.
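
    A minimal sketch of the readout arithmetic implied by the reported sensitivity is given below; the temperature-compensation step (subtracting the shift of a strain-free reference grating) and the numerical readings are assumptions for illustration, not the paper's exact scheme.

    ```python
    # Convert an FBG wavelength shift into pressure using the reported
    # ~100 nm/MPa sensitivity; readings and the reference-grating correction
    # are hypothetical.
    SENSITIVITY_NM_PER_MPA = 100.0  # from the abstract

    def pressure_mpa(signal_shift_nm, reference_shift_nm=0.0):
        """Pressure from the temperature-corrected Bragg wavelength shift."""
        corrected_shift = signal_shift_nm - reference_shift_nm
        return corrected_shift / SENSITIVITY_NM_PER_MPA

    if __name__ == "__main__":
        # Hypothetical readings: 1.2 nm total shift, 0.2 nm of it thermal.
        print(f"Estimated pressure: {pressure_mpa(1.2, 0.2) * 1000:.1f} kPa")
    ```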

  17. TomoTherapy MLC verification using exit detector data

    Energy Technology Data Exchange (ETDEWEB)

    Chen Quan; Westerly, David; Fang Zhenyu; Sheng, Ke; Chen Yu [TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States); Department of Radiation Oncology, University of Colorado School of Medicine, Aurora, Colorado 80045 (United States); Xinghua Cancer Hospital, Xinghua, Jiangsu 225700 (China); Department of Radiation Oncology, University of California-Los Angeles, Los Angeles, California 90095 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States)

    2012-01-15

    Purpose: Treatment delivery verification (DV) is important in the field of intensity modulated radiation therapy (IMRT). While IMRT and image guided radiation therapy (IGRT) allow us to create more conformal plans and enable the use of tighter margins, an erroneously executed plan can have detrimental effects on the treatment outcome. The purpose of this study is to develop a DV technique to verify TomoTherapy's multileaf collimator (MLC) using the onboard mega-voltage CT detectors. Methods: The proposed DV method uses temporal changes in the MVCT detector signal to predict actual leaf open times delivered on the treatment machine. Penumbra and scattered radiation effects may produce confounding results when determining leaf open times from the raw detector data. To reduce the impact of the effects, an iterative, Richardson-Lucy (R-L) deconvolution algorithm is applied. Optical sensors installed on each MLC leaf are used to verify the accuracy of the DV technique. The robustness of the DV technique is examined by introducing different attenuation materials in the beam. Additionally, the DV technique has been used to investigate several clinical plans which failed to pass delivery quality assurance (DQA) and was successful in identifying MLC timing discrepancies as the root cause. Results: The leaf open time extracted from the exit detector showed good agreement with the optical sensors under a variety of conditions. Detector-measured leaf open times agreed with optical sensor data to within 0.2 ms, and 99% of the results agreed within 8.5 ms. These results changed little when attenuation was added in the beam. For the clinical plans failing DQA, the dose calculated from reconstructed leaf open times played an instrumental role in discovering the root-cause of the problem. Throughout the retrospective study, it is found that the reconstructed dose always agrees with measured doses to within 1%. Conclusions: The exit detectors in the TomoTherapy treatment

  18. TomoTherapy MLC verification using exit detector data

    International Nuclear Information System (INIS)

    Chen Quan; Westerly, David; Fang Zhenyu; Sheng, Ke; Chen Yu

    2012-01-01

    Purpose: Treatment delivery verification (DV) is important in the field of intensity modulated radiation therapy (IMRT). While IMRT and image guided radiation therapy (IGRT) allow us to create more conformal plans and enable the use of tighter margins, an erroneously executed plan can have detrimental effects on the treatment outcome. The purpose of this study is to develop a DV technique to verify TomoTherapy's multileaf collimator (MLC) using the onboard mega-voltage CT detectors. Methods: The proposed DV method uses temporal changes in the MVCT detector signal to predict actual leaf open times delivered on the treatment machine. Penumbra and scattered radiation effects may produce confounding results when determining leaf open times from the raw detector data. To reduce the impact of the effects, an iterative, Richardson-Lucy (R-L) deconvolution algorithm is applied. Optical sensors installed on each MLC leaf are used to verify the accuracy of the DV technique. The robustness of the DV technique is examined by introducing different attenuation materials in the beam. Additionally, the DV technique has been used to investigate several clinical plans which failed to pass delivery quality assurance (DQA) and was successful in identifying MLC timing discrepancies as the root cause. Results: The leaf open time extracted from the exit detector showed good agreement with the optical sensors under a variety of conditions. Detector-measured leaf open times agreed with optical sensor data to within 0.2 ms, and 99% of the results agreed within 8.5 ms. These results changed little when attenuation was added in the beam. For the clinical plans failing DQA, the dose calculated from reconstructed leaf open times played an instrumental role in discovering the root-cause of the problem. Throughout the retrospective study, it is found that the reconstructed dose always agrees with measured doses to within 1%. Conclusions: The exit detectors in the TomoTherapy treatment systems
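
    The key signal-processing step in the two records above is the iterative Richardson-Lucy deconvolution of the blurred exit-detector signal. The Python sketch below shows a generic 1-D Richardson-Lucy iteration on synthetic data; the kernel, data and fixed iteration count are illustrative, not the clinical implementation.

    ```python
    import numpy as np

    def richardson_lucy_1d(observed, psf, iterations=50):
        """Iteratively estimate the unblurred signal given a known blur kernel."""
        estimate = np.full_like(observed, observed.mean(), dtype=float)
        psf_mirror = psf[::-1]
        for _ in range(iterations):
            blurred = np.convolve(estimate, psf, mode="same")
            ratio = observed / np.maximum(blurred, 1e-12)
            estimate *= np.convolve(ratio, psf_mirror, mode="same")
        return estimate

    if __name__ == "__main__":
        truth = np.zeros(100); truth[40:60] = 1.0          # ideal "leaf open" pulse
        psf = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2); psf /= psf.sum()
        observed = np.convolve(truth, psf, mode="same")     # penumbra-like blurring
        recovered = richardson_lucy_1d(observed, psf)
        print("Recovered open width ~", int((recovered > 0.5).sum()), "samples")
    ```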

  19. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  20. Key Management in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ismail Mansour

    2015-09-01

    Full Text Available Wireless sensor networks are a challenging field of research when it comes to security issues. Using low-cost sensor nodes with limited resources makes it difficult for cryptographic algorithms to function without impacting energy consumption and latency. In this paper, we focus on key management issues in multi-hop wireless sensor networks. These networks are easy to attack due to the open nature of the wireless medium. Intruders could try to penetrate the network, capture nodes or take control over particular nodes. In this context, it is important to revoke and renew keys that might be learned by malicious nodes. We propose several secure protocols for key revocation and key renewal based on symmetric encryption and elliptic curve cryptography. All protocols are secure, but have different security levels. Each proposed protocol is formally proven and analyzed using Scyther, an automatic verification tool for cryptographic protocols. For the sake of efficiency comparison, we implemented all protocols on real testbeds using TelosB motes and discussed their performance.

  1. Concept of Operations for Nuclear Warhead Embedded Sensors

    Energy Technology Data Exchange (ETDEWEB)

    Rockett, P D; Koncher, T R

    2012-05-16

    Embedded arms-control sensors provide a powerful new paradigm for managing compliance with future nuclear weapons treaties, where deployed warhead numbers will be reduced to 1000 or less. The CONOPS (Concept of Operations) for use with these sensors is a practical tool with which one may help define design parameters, including size, power, resolution, communications, and physical structure. How frequently must data be acquired and must a human be present? Will such data be acquired for only stored weapons or will it be required of deployed weapons as well? Will tactical weapons be subject to such monitoring or will only strategic weapons apply? Which data will be most crucial? Will OSIs be a component of embedded sensor data management or will these sensors stand alone in their data extraction processes? The problem space is massive, but can be constrained by extrapolating to a reasonable future treaty regime and examining the bounded options this scenario poses. Arms control verification sensors, embedded within the warhead case or aeroshell, must provide sufficient but not excessively detailed data, confirming that the item is a nuclear warhead and that it is a particular warhead without revealing sensitive information. Geolocation will be provided by an intermediate transceiver used to acquire the data and to forward the data to a central processing location. Past Chain-of-Custody projects have included such devices and will be primarily responsible for adding such indicators in the future. For the purposes of a treaty regime a TLI will be verified as a nuclear warhead by knowledge of (a) the presence and mass of SNM, (b) the presence of HE, and (c) the reporting of a unique tag ID. All of these parameters can be obtained via neutron correlation measurements, Raman spectroscopy, and fiber optic grating fabrication, respectively. Data from these sensors will be pushed out monthly and acquired nearly daily, providing one of several verification layers in

  2. Real-time sensor failure detection by dynamic modelling of a PWR plant

    International Nuclear Information System (INIS)

    Turkcan, E.; Ciftcioglu, O.

    1992-06-01

    Signal validation and sensor failure detection is an important problem in real-time nuclear power plant (NPP) surveillance. Although conventional sensor redundancy is, in a way, a solution, identification of the faulty sensor is necessary for further preventive actions to be taken. A comprehensive solution is presented in which any sensor reading is verified against its model-based estimated counterpart in real time. Such a realization is accomplished by means of dynamic state estimation using the Kalman filter modelling technique. The method is investigated by means of real-time data from the steam generator of the Borssele nuclear power plant and has proved to be satisfactory for real-time sensor failure detection as well as model validation and verification. (author). 5 refs.; 6 figs.; 1 tab
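
    A minimal sketch of the underlying idea, flagging a sensor reading whose innovation (measurement minus model-based prediction) is statistically too large, is given below using a scalar random-walk Kalman filter; the process model, noise levels and injected fault are hypothetical, not the Borssele steam-generator model.

    ```python
    import numpy as np

    def detect_failures(measurements, q=1e-4, r=0.01, threshold_sigma=4.0):
        """Flag readings whose innovation exceeds threshold_sigma * sqrt(innovation variance)."""
        x, p = measurements[0], 1.0             # state estimate and its variance
        flags = []
        for z in measurements[1:]:
            p += q                              # predict (random-walk process model)
            s = p + r                           # innovation variance
            innovation = z - x
            flags.append(abs(innovation) > threshold_sigma * np.sqrt(s))
            k = p / s                           # Kalman gain and update
            x += k * innovation
            p *= (1.0 - k)
        return flags

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        signal = 5.0 + rng.normal(0, 0.1, 200)  # healthy sensor readings
        signal[120] += 2.0                      # injected sensor fault (spike)
        flags = detect_failures(signal)
        print("Fault flagged at samples:", [i + 1 for i, f in enumerate(flags) if f])
    ```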

  3. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  4. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
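
    A minimal sketch of likelihood-ratio-based verification for fixed-length feature vectors is given below; the independent Gaussian user and background models, the feature dimension and the threshold are assumptions for illustration rather than the matchers evaluated in these papers.

    ```python
    import numpy as np
    from scipy.stats import norm

    def log_likelihood_ratio(x, user_mean, user_std, bg_mean, bg_std):
        """Sum of per-dimension log p(x|user) - log p(x|background) under Gaussians."""
        return (norm.logpdf(x, user_mean, user_std).sum()
                - norm.logpdf(x, bg_mean, bg_std).sum())

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        dim = 16
        user_mean = rng.normal(0.5, 0.2, dim); user_std = np.full(dim, 0.1)
        bg_mean = np.zeros(dim);               bg_std = np.full(dim, 1.0)

        genuine = user_mean + rng.normal(0, 0.1, dim)    # sample from the claimed user
        impostor = rng.normal(0, 1.0, dim)               # sample from the background

        threshold = 0.0
        for name, x in [("genuine", genuine), ("impostor", impostor)]:
            llr = log_likelihood_ratio(x, user_mean, user_std, bg_mean, bg_std)
            print(f"{name}: log-LR = {llr:.1f} ->", "accept" if llr > threshold else "reject")
    ```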

  5. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues.

  6. Currency verification by a 2D infrared barcode

    International Nuclear Information System (INIS)

    Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla

    2010-01-01

    Nowadays all the National Central Banks are continuously studying innovative anti-counterfeiting systems for banknotes. In this note, an innovative solution is proposed, which combines the potentiality of a hylemetric approach (methodology conceptually similar to biometry), based on notes' intrinsic characteristics, with a well-known and consolidated 2D barcode identification system. In particular, in this note we propose to extract from the banknotes a univocal binary control sequence (template) and insert an encrypted version of it in a barcode printed on the same banknote. For a more acceptable look and feel of a banknote, the superposed barcode can be stamped using IR ink that is visible to near-IR image sensors. This makes the banknote verification simpler. (technical design note)

  7. Sensor Interoperability and Fusion in Fingerprint Verification: A Case Study using Minutiae-and Ridge-Based Matchers

    NARCIS (Netherlands)

    Alonso-Fernandez, F.; Veldhuis, Raymond N.J.; Bazen, A.M.; Fierrez-Aguilar, J.; Ortega-Garcia, J.

    2006-01-01

    Information fusion in fingerprint recognition has been studied in several papers. However, only a few papers have been focused on sensor interoperability and sensor fusion. In this paper, these two topics are studied using a multisensor database acquired with three different fingerprint sensors.

  8. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  9. Frequency selective surface based passive wireless sensor for structural health monitoring

    International Nuclear Information System (INIS)

    Jang, Sang-Dong; Kang, Byung-Woo; Kim, Jaehwan

    2013-01-01

    Wireless sensor networks or ubiquitous sensor networks are a promising technology giving useful information to people. In particular, the chipless passive wireless sensor is one of the most important developments in wireless sensor technology because it is compact and does not need a battery or chip for the sensor operation. It therefore has many possibilities for use in various types of sensor system with economic efficiency and robustness in harsh environmental conditions. This sensor uses an electromagnetic resonance frequency or phase angle shift associated with a geometrical change of the sensor tag or an impedance change of the sensor. In this paper, a chipless passive wireless structural health monitoring (SHM) sensor is made using a frequency selective surface (FSS). The cross type FSS is introduced, and its SHM principle is explained. The electromagnetic characteristics of the FSS are simulated in terms of transmission and reflection coefficients using simulation software, and an experimental verification is conducted. The electromagnetic characteristic change of the FSS in the presence of mechanical strain or a structural crack is investigated by means of simulation and experiment. Since large-area structures can be covered by deploying FSS, it is possible to detect the location of any cracks. (paper)

  10. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  11. Application of Wireless Sensor Networks to Automobiles

    Science.gov (United States)

    Tavares, Jorge; Velez, Fernando J.; Ferro, João M.

    2008-01-01

    Some applications of Wireless Sensor Networks (WSNs) to the automobile are identified, and the use of Crossbow MICAz motes operating at 2.4 GHz is considered together with TinyOS support. These WSNs are conceived in order to measure, process and supply the user with diverse types of information during an automobile journey. Examples are acceleration and fuel consumption, identification of incorrect tire pressure, verification of illumination, and evaluation of the vital signals of the driver. A brief survey of WSN concepts is presented, as well as the way the wireless sensor network itself was developed. Calibration curves were produced which allowed for obtaining luminous intensity and temperature values in the appropriate units. Aspects of the definition of the architecture and the choice/implementation of the protocols are identified. Security aspects are also addressed.

  12. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by both forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  13. Flood simulation and verification with IoT sensors

    Science.gov (United States)

    Chang, Che-Hao; Hsu, Chih-Tsung; Wu, Shiang-Jen; Huang, Sue-Wei

    2017-04-01

    2D dynamic flood simulation is a vivid tool for demonstrating the areas likely to be exposed to the impact of a high rise in water level. Along with progress in high-resolution digital terrain models, the simulation results are quite convincing, yet they are not proven to be close to what really happens. Owing to the dynamic and uncertain nature of flooding, the exposed area usually cannot be well defined during a flood event. Recent developments in IoT sensors bring low-power, long-distance communication, which helps us collect real-time flood depths. With these time series of flood depths at different locations, we are capable of verifying the simulation results corresponding to the flood event. Sixteen flood gauges with IoT specifications, as well as two flood events in the Annan district of Tainan City, Taiwan, are examined in this study. During the event of 11 June 2016, 12 flood gauges worked well and 8 of them provided observations matching the simulation.
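
    A minimal sketch of the verification step, comparing simulated flood depths with gauge observations, is given below; the depth values are synthetic, not the Annan-district measurements.

    ```python
    import numpy as np

    def verify_depths(observed_m, simulated_m):
        """Bias, RMSE and a simple hit count between gauge and simulated depths."""
        observed_m, simulated_m = np.asarray(observed_m), np.asarray(simulated_m)
        error = simulated_m - observed_m
        return {"bias_m": error.mean(),
                "rmse_m": np.sqrt((error ** 2).mean()),
                "within_10cm": int((np.abs(error) <= 0.10).sum())}

    if __name__ == "__main__":
        observed = [0.35, 0.50, 0.20, 0.00, 0.65, 0.10, 0.45, 0.30]   # 8 working gauges
        simulated = [0.40, 0.45, 0.25, 0.05, 0.80, 0.10, 0.40, 0.20]
        stats = verify_depths(observed, simulated)
        print(f"bias = {stats['bias_m']:+.2f} m, RMSE = {stats['rmse_m']:.2f} m, "
              f"{stats['within_10cm']}/8 gauges within 10 cm")
    ```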

  14. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  15. Monitoring system of hydraulic lifting device based on the fiber optic sensors

    Science.gov (United States)

    Fajkus, Marcel; Nedoma, Jan; Novak, Martin; Martinek, Radek; Vanus, Jan; Mec, Pavel; Vasinek, Vladimir

    2017-10-01

    This article describes a monitoring system for a hydraulic lifting device based on fiber-optic sensors. To minimize the financial costs of the proposed monitoring system, power evaluation of the measured signal was chosen. The solution is based on an evaluation of the signal obtained using single-point fiber-optic sensors with overlapping reflective spectra. Polydimethylsiloxane (PDMS) polymer was used to encapsulate the sensors. Loading information is obtained from the deformation of the lifting device acting on a pair of single-point fiber-optic sensors mounted on the lifting device of the tested car. According to the proposed algorithm, the pressure is determined with an accuracy of +/- 5%. Verification of the proposed system was carried out on various types of tested cars with different loadings. The original contribution of the paper is the verification of a new low-cost system for monitoring a hydraulic lifting device based on fiber-optic sensors.

  16. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  17. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  18. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  19. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  20. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  1. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    International Nuclear Information System (INIS)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-01-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. Odometry is widely used as the method based on internal sensors, but it suffers from cumulative errors. In the method using the laser range sensor (LRS), which is a kind of external sensor, the estimation accuracy is affected by the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization, integrating the LRS measurement data and the odometry information, with their relative weightings adapted to the number of available LRS measurement data. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10 scale vehicle. The verification is conducted in situations where the vehicle position cannot be uniquely localized in a certain direction using the LRS measurement data only. We achieve accurate localization even in such a situation by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF). (paper)
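
    A minimal 1-D sketch of the moving-horizon idea, jointly fitting odometry increments and intermittent range-sensor fixes over a sliding window, is given below; the weights, data and scipy-based solver are illustrative simplifications of the paper's 2-D formulation with adaptive weighting.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def mhe_window(x0_prior, odometry, lrs_fixes, w_odo=1.0, w_lrs=5.0, w_prior=2.0):
        """Estimate positions x_0..x_N on the horizon (N = len(odometry))."""
        n = len(odometry) + 1

        def residuals(x):
            res = [w_prior * (x[0] - x0_prior)]                           # arrival cost
            res += [w_odo * (x[k + 1] - x[k] - odometry[k]) for k in range(n - 1)]
            res += [w_lrs * (x[k] - z) for k, z in lrs_fixes.items()]     # sensor fixes
            return res

        guess = x0_prior + np.concatenate(([0.0], np.cumsum(odometry)))   # dead reckoning
        return least_squares(residuals, guess).x

    if __name__ == "__main__":
        true_path = np.linspace(0.0, 1.0, 11)
        odometry = np.diff(true_path) + 0.02          # biased odometry increments
        lrs_fixes = {0: 0.0, 5: 0.5, 10: 1.0}         # sparse external position fixes
        estimate = mhe_window(0.0, odometry, lrs_fixes)
        print("final position estimate:", round(estimate[-1], 3), "(truth 1.0)")
    ```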

  2. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism ...

  3. Modeling a Consistent Behavior of PLC-Sensors

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2014-01-01

    Full Text Available The article extends the cycle of papers dedicated to programming and verification of PLC programs by LTL specification. This approach provides the possibility of correctness analysis of PLC programs by the model checking method. The model checking method needs to construct a finite model of a PLC program. For successful verification of required properties, it is important to take into consideration that not all combinations of input signals from the sensors can occur while the PLC works with a control object. This fact requires more attention to the construction of the PLC-program model. In this paper we propose to describe a consistent behavior of sensors by three groups of LTL formulas. They will affect the program model, approximating it to the actual behavior of the PLC program. The idea of the LTL requirements is shown by an example. A PLC program is a description of reactions to input signals from sensors, switches and buttons. In constructing a PLC-program model, the approach to modeling a consistent behavior of PLC sensors allows one to focus on modeling precisely these reactions without extending the program model with additional structures to realize realistic sensor behavior. The consistent behavior of sensors is taken into account only at the stage of checking the conformity of the program model to the required properties, i.e. a property satisfaction proof for the constructed model occurs under the condition that the model contains only such executions of the program that comply with the consistent behavior of sensors.
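
    A minimal sketch of what a "consistent sensor behavior" constraint can look like is given below: two illustrative conditions for a drive with lower and upper limit switches are checked over a finite trace. In the paper such conditions are LTL formulas handed to a model checker (e.g. G !(lower & upper)); the variable names and the trace here are hypothetical.

    ```python
    def consistent(trace):
        """True if the finite trace obeys two example sensor-consistency constraints."""
        # 1. Mutual exclusion: both end-position switches never fire together.
        if any(step["lower"] and step["upper"] for step in trace):
            return False
        # 2. A limit switch may change state only while the drive output is on.
        for prev, cur in zip(trace, trace[1:]):
            if ((prev["lower"] != cur["lower"] or prev["upper"] != cur["upper"])
                    and not prev["drive_on"]):
                return False
        return True

    if __name__ == "__main__":
        trace = [
            {"lower": True,  "upper": False, "drive_on": True},
            {"lower": False, "upper": False, "drive_on": True},
            {"lower": False, "upper": True,  "drive_on": False},
        ]
        print("trace consistent with sensor model:", consistent(trace))
    ```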

  4. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  5. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies-the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW)-share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States-India, Pakistan and Israel-from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  6. Optimization and Verification of the TR-MAC Protocol for Wireless Sensor Networks

    NARCIS (Netherlands)

    Morshed, S.; Heijenk, Geert

    2015-01-01

    Energy-efficiency is an important requirement in the design of communication protocols for wireless sensor networks (WSN). TR-MAC is an energy-efficient medium access control (MAC) layer protocol for low power WSN that exploits transmitted-reference (TR) modulation in the physical layer. The

  7. Experimental study on performance verification tests for coordinate measuring systems with optical distance sensors

    Science.gov (United States)

    Carmignato, Simone

    2009-01-01

    Optical sensors are increasingly used for dimensional and geometrical metrology. However, the lack of international standards for testing optical coordinate measuring systems is currently limiting the traceability of measurements and the easy comparison of different optical systems. This paper presents an experimental investigation on artefacts and procedures for testing coordinate measuring systems equipped with optical distance sensors. The work is aimed at contributing to the standardization of testing methods. The VDI/VDE 2617-6.2:2005 guideline, which is probably the most complete document available at the state of the art for testing systems with optical distance sensors, is examined with specific experiments. Results from the experiments are discussed, with particular reference to the tests used for determining the following characteristics: error of indication for size measurement, probing error and structural resolution. Particular attention is given to the use of artefacts alternative to gauge blocks for determining the error of indication for size measurement.
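
    A minimal sketch of evaluating the acceptance criterion behind the "error of indication for size measurement" test is given below; the MPE parameters and the calibrated/measured length pairs are hypothetical, not values from the guideline or from a particular instrument.

    ```python
    # Compare each measured length with the calibrated artefact length and check
    # the error against a maximum permissible error of the common form
    # MPE_E = A + L/K (A in micrometres, L in millimetres, K dimensionless).
    def within_mpe(calibrated_mm, measured_mm, a_um=5.0, k=100.0):
        """Return (error in um, MPE in um, pass/fail) for one length measurement."""
        error_um = (measured_mm - calibrated_mm) * 1000.0
        mpe_um = a_um + calibrated_mm / k
        return error_um, mpe_um, abs(error_um) <= mpe_um

    if __name__ == "__main__":
        # (calibrated length, measured length) in mm for a few artefact positions.
        for cal, meas in [(30.0, 30.004), (60.0, 60.012), (100.0, 99.995)]:
            e, mpe, ok = within_mpe(cal, meas)
            print(f"L = {cal:6.1f} mm: E = {e:+6.1f} um, MPE = {mpe:4.1f} um ->",
                  "pass" if ok else "fail")
    ```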

  8. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  9. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  10. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  11. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  12. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (''AI'') concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take corrective actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  13. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  14. Actuation stability test of the LISA pathfinder inertial sensor front-end electronics

    Science.gov (United States)

    Mance, Davor; Gan, Li; Weber, Bill; Weber, Franz; Zweifel, Peter

    In order to limit the residual stray forces on the inertial sensor test mass in LISA Pathfinder, it is required that the fluctuation of the test mass actuation voltage is within 2 ppm/√Hz. The actuation voltage stability test on the flight hardware of the inertial sensor front-end electronics (IS FEE) is presented in this paper. This test was completed during the inertial sensor integration at EADS Astrium Friedrichshafen, Germany. The standard measurement method using a voltmeter is not sufficient for verification, since the instrument's low-frequency fluctuation is higher than the 2 ppm/√Hz requirement. In this test, by using the differential measurement method and a lock-in amplifier, the actuation stability performance is verified and the quality of the IS FEE hardware is confirmed by the test results.
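
    A requirement quoted in ppm/√Hz can be checked against a recorded time series by estimating the amplitude spectral density of the relative voltage fluctuation. The minimal sketch below is not the authors' procedure; the sampling rate, nominal voltage and synthetic data are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.signal import welch

    def asd_ppm_per_rthz(voltage: np.ndarray, fs: float, nominal: float):
        """Amplitude spectral density of relative voltage fluctuations in ppm/sqrt(Hz)."""
        rel_ppm = (voltage - nominal) / nominal * 1e6       # fluctuation in ppm
        f, psd = welch(rel_ppm, fs=fs, nperseg=4096)        # PSD in ppm**2/Hz
        return f, np.sqrt(psd)                              # ASD in ppm/sqrt(Hz)

    # Hypothetical 10 Hz record of a nominally 100 V actuation voltage.
    fs, nominal = 10.0, 100.0
    v = nominal * (1.0 + 1e-6 * np.random.default_rng(0).standard_normal(100_000))
    f, asd = asd_ppm_per_rthz(v, fs, nominal)
    band = (f > 1e-3) & (f < 1e-1)
    print(f"mean ASD in band: {asd[band].mean():.2f} ppm/sqrt(Hz)")
    ```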

  15. CSRQ: Communication-Efficient Secure Range Queries in Two-Tiered Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hua Dai

    2016-02-01

    Full Text Available In recent years, we have seen many applications of secure queries in two-tiered wireless sensor networks. Storage nodes are responsible for storing data from nearby sensor nodes and answering queries from the Sink. It is critical to protect data security from a compromised storage node. In this paper, the Communication-efficient Secure Range Query (CSRQ)—a privacy- and integrity-preserving range query protocol—is proposed to prevent attackers from gaining information of both data collected by sensor nodes and queries issued by the Sink. To preserve privacy and integrity, in addition to employing the encoding mechanisms, a novel data structure called the encrypted constraint chain is proposed, which embeds the information needed for integrity verification. The Sink can use this encrypted constraint chain to verify the query result. The performance evaluation shows that CSRQ has lower communication cost than the current range query protocols.
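
    The encrypted constraint chain itself is specific to the paper, but the underlying idea of chaining neighbouring values with keyed digests, so that an omitted item breaks verification, can be sketched as follows; the key, encoding and chain layout here are simplified assumptions, not the CSRQ construction.

    ```python
    import hmac, hashlib

    KEY = b"shared-key-between-sensor-and-sink"  # hypothetical shared key

    def mac(msg: bytes) -> bytes:
        return hmac.new(KEY, msg, hashlib.sha256).digest()

    def build_chain(sorted_values):
        """Chain each value to its neighbour so a missing item breaks verification."""
        return [mac(f"{lo}|{hi}".encode()) for lo, hi in zip(sorted_values, sorted_values[1:])]

    def verify_range_result(result, chain_segment):
        """Sink-side check that consecutive returned values match the stored chain."""
        expected = [mac(f"{lo}|{hi}".encode()) for lo, hi in zip(result, result[1:])]
        return expected == chain_segment

    data = sorted([17, 23, 42, 58, 61])
    chain = build_chain(data)
    print(verify_range_result(data[1:4], chain[1:3]))  # True for an untampered answer
    ```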

  16. A Survey on Security and Privacy in Emerging Sensor Networks: From Viewpoint of Close-Loop

    Science.gov (United States)

    Zhang, Lifu; Zhang, Heng

    2016-01-01

    Nowadays, as the next generation of sensor networks, Cyber-Physical Systems (CPSs) refer to the complex networked systems that have both physical subsystems and cyber components, where the information flow between different subsystems and components crosses a communication network, which forms a closed loop. New generation sensor networks are found in a growing number of applications and have received increasing attention from many disciplines. Opportunities and challenges in the design, analysis, verification and validation of sensor networks co-exist, among which security and privacy are two important ingredients. This paper presents a survey on some recent results in the security and privacy aspects of emerging sensor networks from the viewpoint of the closed loop. This paper also discusses several future research directions under these two umbrellas. PMID:27023559

  17. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means a user matching a fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: Behavioral (signature verification, keystroke dynamics, etc.) and Physiological

  18. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core used in traditional physics testing programs. This program also eliminates the need for special rod maneuvers which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  19. TECHNICAL DESIGN NOTE: Currency verification by a 2D infrared barcode

    Science.gov (United States)

    Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla

    2010-10-01

    Nowadays all the National Central Banks are continuously studying innovative anti-counterfeiting systems for banknotes. In this note, an innovative solution is proposed, which combines the potentiality of a hylemetric approach (methodology conceptually similar to biometry), based on notes' intrinsic characteristics, with a well-known and consolidated 2D barcode identification system. In particular, in this note we propose to extract from the banknotes a univocal binary control sequence (template) and insert an encrypted version of it in a barcode printed on the same banknote. For a more acceptable look and feel of a banknote, the superposed barcode can be stamped using IR ink that is visible to near-IR image sensors. This makes the banknote verification simpler.

  20. Smartphone User Identity Verification Using Gait Characteristics

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2016-09-01

    Full Text Available Smartphone-based biometrics offers a wide range of possible solutions, which could be used to authenticate users and thus provide an extra level of security and theft prevention. We propose a method for positive identification of a smartphone user's identity using the user's gait characteristics captured by embedded smartphone sensors (gyroscopes, accelerometers). The method is based on the application of the Random Projections method for feature dimensionality reduction to just two dimensions. Then, a probability distribution function (PDF) of the derived features is calculated, which is compared against the known user PDF. The Jaccard distance is used to evaluate the distance between the two distributions, and the decision is taken based on thresholding. The results for subject recognition are at an acceptable level: we have achieved a grand mean Equal Error Rate (EER) for subject identification of 5.7% (using the USC-HAD dataset). Our findings represent a step towards improving the performance of gait-based user identity verification technologies.
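
    A minimal sketch of the verification pipeline described above (random projection to two dimensions, histogram-based PDF estimation, a Jaccard-style distance, and a threshold decision); the feature dimension, histogram range and threshold are assumed values for illustration, not those of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    R = rng.normal(size=(30, 2)) / np.sqrt(2)   # one fixed random projection: 30-D features -> 2-D

    def pdf_2d(feature_windows: np.ndarray, bins: int = 16) -> np.ndarray:
        """Project feature windows to 2-D and estimate a discretised PDF by histogram."""
        pts = feature_windows @ R
        h, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=bins, range=[[-15, 15], [-15, 15]])
        return h / h.sum()

    def jaccard_distance(p: np.ndarray, q: np.ndarray) -> float:
        """Weighted Jaccard distance between two discretised PDFs."""
        return 1.0 - np.minimum(p, q).sum() / np.maximum(p, q).sum()

    # Hypothetical accelerometer/gyroscope feature windows: enrolment data and a new session.
    enrolled_pdf = pdf_2d(rng.normal(size=(500, 30)))
    probe_pdf    = pdf_2d(rng.normal(size=(200, 30)))
    THRESHOLD = 0.6   # decision threshold, tuned on training data (e.g. at the EER point)
    print("accepted" if jaccard_distance(enrolled_pdf, probe_pdf) < THRESHOLD else "rejected")
    ```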

  1. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria and the presence of one minor criterion predict a high and a low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
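
    The reported decision rule can be written directly as code. The sketch below encodes the major/minor criteria exactly as stated in the abstract; the function name and the qualitative labels are ours.

    ```python
    def fingerprint_failure_risk(dystrophy_area_pct: float,
                                 long_horizontal_lines: bool,
                                 long_vertical_lines: bool) -> str:
        """Qualitative risk of fingerprint verification failure in hand dermatitis.

        Major criterion: dystrophy area >= 25% of the fingerprint.
        Minor criteria: long horizontal lines, long vertical lines.
        """
        if dystrophy_area_pct >= 25.0:
            return "almost always fails verification"      # major criterion present
        minors = int(long_horizontal_lines) + int(long_vertical_lines)
        if minors == 2:
            return "high risk of verification failure"
        if minors == 1:
            return "low risk of verification failure"
        return "almost always passes verification"

    print(fingerprint_failure_risk(10.0, True, False))
    ```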

  2. Precision temperature monitoring (PTM) and Humidity monitoring (HM) sensors of the CMS electromagnetic calorimeter

    CERN Multimedia

    2006-01-01

    A major aspect of the ECAL detector control is the monitoring of the system temperature and the verification that the required temperature stability of the crystal volume and the APDs, expected to be (18 ± 0.05) °C, is achieved. The PTM is designed to read out thermistors, placed on both the front and back of the crystals, with a relative precision better than 0.01 °C. In total there are ten sensors per supermodule. The humidity level in the electronics compartment is monitored by the HM system, which consists of one humidity sensor per module.

  3. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  4. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  5. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
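
    A dose-difference/distance-to-agreement criterion such as 3%/3 mm is commonly evaluated as a gamma pass rate. The simplified one-dimensional sketch below is not the authors' implementation; the profiles and tolerances are hypothetical.

    ```python
    import numpy as np

    def gamma_pass_rate(ref, meas, positions_mm, dose_tol=0.03, dta_mm=3.0):
        """Fraction of measured points passing a 1-D gamma test (e.g. 3%/3 mm).

        ref, meas: dose profiles on the same 1-D grid; dose_tol is relative to max(ref).
        """
        dmax = ref.max()
        passed = 0
        for x_i, d_i in zip(positions_mm, meas):
            # gamma for this point: minimum combined dose/distance metric over the reference profile
            gamma2 = ((ref - d_i) / (dose_tol * dmax)) ** 2 + ((positions_mm - x_i) / dta_mm) ** 2
            if np.sqrt(gamma2.min()) <= 1.0:
                passed += 1
        return passed / len(meas)

    x = np.linspace(-50, 50, 201)                  # mm
    reference = np.exp(-(x / 30.0) ** 2)           # hypothetical planned profile
    measured  = reference * 1.02                   # 2% systematic difference
    print(f"gamma pass rate: {gamma_pass_rate(reference, measured, x):.1%}")
    ```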

  6. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  7. SENSOR: a tool for the simulation of hyperspectral remote sensing systems

    Science.gov (United States)

    Börner, Anko; Wiest, Lorenz; Keller, Peter; Reulke, Ralf; Richter, Rolf; Schaepman, Michael; Schläpfer, Daniel

    The consistent end-to-end simulation of airborne and spaceborne earth remote sensing systems is an important task, and sometimes the only way for the adaptation and optimisation of a sensor and its observation conditions, the choice and test of algorithms for data processing, error estimation and the evaluation of the capabilities of the whole sensor system. The presented software simulator SENSOR (Software Environment for the Simulation of Optical Remote sensing systems) includes a full model of the sensor hardware, the observed scene, and the atmosphere in between. The simulator consists of three parts. The first part describes the geometrical relations between scene, sun, and the remote sensing system using a ray-tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup-table taking the atmospheric influence on the radiation into account. The third part consists of an optical and an electronic sensor model for the generation of digital images. Using SENSOR for an optimisation requires the additional application of task-specific data processing algorithms. The principle of the end-to-end-simulation approach is explained, all relevant concepts of SENSOR are discussed, and first examples of its use are given. The verification of SENSOR is demonstrated. This work is closely related to the Airborne PRISM Experiment (APEX), an airborne imaging spectrometer funded by the European Space Agency.
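
    As a rough illustration of the third simulator stage (the optical and electronic sensor model), the toy model below converts at-sensor radiance into digital numbers through photon collection, quantum efficiency, noise, saturation and quantisation; every constant is invented for illustration and is not an APEX or SENSOR parameter.

    ```python
    import numpy as np

    def radiance_to_dn(at_sensor_radiance, integration_time_s=5e-3, gain_dn_per_e=0.05,
                       quantum_efficiency=0.6, full_well_e=120_000, read_noise_e=30.0,
                       rng=np.random.default_rng(1)):
        """Toy optical/electronic sensor model: radiance -> photoelectrons -> digital numbers."""
        # Hypothetical linear optics: photons collected per (radiance unit * second) at the detector.
        photons = at_sensor_radiance * 2.0e5 * integration_time_s
        electrons = quantum_efficiency * photons + rng.normal(0.0, read_noise_e, photons.shape)
        electrons = np.clip(electrons, 0, full_well_e)           # saturation
        return np.round(electrons * gain_dn_per_e).astype(int)   # quantisation to DN

    radiance = np.array([10.0, 40.0, 80.0])   # hypothetical at-sensor radiance per band
    print(radiance_to_dn(radiance))
    ```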

  8. Design and Evaluation of a Wireless Sensor Network Based Aircraft Strength Testing System

    Science.gov (United States)

    Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang

    2009-01-01

    The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structure strength design and evaluation. However, the current overall ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities. The centralized data processing makes test programs lack efficiency and intelligence. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents a wireless sensor network based aircraft strength testing (AST) system design and its evaluation on a real aircraft specimen. In this paper, a miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and application layer interface are designed in detail. To verify the functionality of the designed wireless sensor network for strength testing capability, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and monitor results under different static test loads. This paper shows the efficiency of the wireless sensor network based AST system, compared to a conventional AST system. PMID:22408521

  9. Design and evaluation of a wireless sensor network based aircraft strength testing system.

    Science.gov (United States)

    Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang

    2009-01-01

    The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structure strength design and evaluation. However, the current overall ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities. The centralized data processing makes test programs lack efficiency and intelligence. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents a wireless sensor network based aircraft strength testing (AST) system design and its evaluation on a real aircraft specimen. In this paper, a miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and application layer interface are designed in detail. To verify the functionality of the designed wireless sensor network for strength testing capability, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and monitor results under different static test loads. This paper shows the efficiency of the wireless sensor network based AST system, compared to a conventional AST system.

  10. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
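
    Verification of this kind reduces, for each pathway and time step, to comparing the code output with an independent hand or spreadsheet calculation against a tolerance. A minimal sketch of such a check, with hypothetical values and tolerance:

    ```python
    def check_agreement(code_value: float, hand_value: float, tol: float = 1e-3) -> bool:
        """Flag a pathway result whose relative difference from the independent
        (spreadsheet-style) calculation exceeds the verification tolerance."""
        rel_diff = abs(code_value - hand_value) / abs(hand_value)
        return rel_diff <= tol

    # Hypothetical pathway dose: code output vs. independent hand calculation.
    print(check_agreement(code_value=4.72e-2, hand_value=4.71e-2, tol=5e-3))
    ```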

  11. Analysis of using PDMS polymer as the sensors of the pressure or weight

    Science.gov (United States)

    Jargus, Jan; Nedoma, Jan; Fajkus, Marcel; Novak, Martin; Mec, Pavel; Cvejn, Daniel; Bujdos, David; Vasinek, Vladimir

    2017-10-01

    Polydimethylsiloxane (PDMS) can be used for its optical properties, and its composition offers the possibility of use in diverse environments (industry, photonics, medical applications, security devices, etc.). The authors of this article have therefore focused on working with this material in more detail. The material could be used for sensory applications such as a pressure or weight sensor, which may also find use in the field of security and defense. The article describes the process of making a prototype of the sensor and its verification based on laboratory results. The measurement methodology is based on determining the change of optical power at the output of the sensor prototype depending on the change in pressure or weight. On the basis of the laboratory results, we estimate the maximum load of the sensor to be on the order of tons. Using a calibration measurement, the amount of pressure or weight can be determined with an accuracy of +/- 2 %.
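
    The calibration step described above can be illustrated by a least-squares fit of measured optical power against applied weight, inverted to estimate an unknown load; the calibration points below are invented for illustration only.

    ```python
    import numpy as np

    # Hypothetical calibration points: applied weight (kg) vs. measured optical power (uW).
    weight_kg = np.array([0, 50, 100, 200, 400, 800])
    power_uW  = np.array([102.0, 99.1, 96.3, 90.8, 79.9, 58.4])

    # Least-squares linear calibration P = a*W + b, inverted to estimate weight from power.
    a, b = np.polyfit(weight_kg, power_uW, 1)

    def estimate_weight(measured_power_uW: float) -> float:
        return (measured_power_uW - b) / a

    print(f"estimated weight: {estimate_weight(85.0):.0f} kg")
    ```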

  12. Integrated fiber optic sensors for hot spot detection and temperature field reconstruction in satellites

    International Nuclear Information System (INIS)

    Rapp, S; Baier, H

    2010-01-01

    Large satellites are often equipped with more than 1000 temperature sensors during the test campaign. Hundreds of them are still used for monitoring during launch and operation in space. This means an additional mass and especially high effort in assembly, integration and verification on a system level. So the use of fiber Bragg grating temperature sensors is investigated as they offer several advantages. They are lightweight, small in size and electromagnetically immune, which fits well in space applications. Their multiplexing capability offers the possibility to build extensive sensor networks including dozens of sensors of different types, such as strain sensors, accelerometers and temperature sensors. The latter allow the detection of hot spots and the reconstruction of temperature fields via proper algorithms, which is shown in this paper. A temperature sensor transducer was developed, which can be integrated into satellite sandwich panels with negligible mechanical influence. Mechanical and thermal vacuum tests were performed to verify the space compatibility of the developed sensor system. Proper reconstruction algorithms were developed to estimate the temperature field and detect thermal hot spots on the panel surface. A representative hardware demonstrator has been built and tested, which shows the capability of using an integrated fiber Bragg grating temperature sensor network for temperature field reconstruction and hot spot detection in satellite structures
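
    One simple way to reconstruct a panel temperature field from sparse FBG readings is a least-squares fit of a smooth surface, followed by thresholding for hot spots. The sketch below is such a generic reconstruction, not the authors' algorithm; the positions, readings and hot-spot limit are assumed.

    ```python
    import numpy as np

    def fit_temperature_field(xy: np.ndarray, temps: np.ndarray):
        """Least-squares quadratic surface T(x, y) fitted to sparse FBG readings."""
        x, y = xy[:, 0], xy[:, 1]
        A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
        coeff, *_ = np.linalg.lstsq(A, temps, rcond=None)

        def T(xq, yq):
            return (coeff[0] + coeff[1] * xq + coeff[2] * yq +
                    coeff[3] * xq * yq + coeff[4] * xq**2 + coeff[5] * yq**2)
        return T

    # Hypothetical sensor positions (m) on a panel and their readings (deg C).
    xy = np.array([[0.1, 0.1], [0.9, 0.1], [0.5, 0.5], [0.1, 0.9],
                   [0.9, 0.9], [0.5, 0.1], [0.5, 0.9]])
    temps = np.array([21.0, 22.5, 27.8, 21.3, 22.1, 24.0, 23.1])
    T = fit_temperature_field(xy, temps)

    # Evaluate on a grid and flag hot spots above an assumed limit temperature.
    gx, gy = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21))
    hot = T(gx, gy) > 26.0
    print("hot-spot cells:", int(hot.sum()))
    ```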

  13. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities for the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with gamma spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal, e.g. criteria connected to the selection of the best place to perform the verification measurements. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage. The other option is the encapsulation plant. Crucial viewpoints are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  14. Specification and Verification of Distributed Embedded Systems: A Traffic Intersection Product Family

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Distributed embedded systems (DESs) are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model checking features.

  15. Development of an In-Situ Decommissioning Sensor Network Test Bed for Structural Condition Monitoring - 12156

    Energy Technology Data Exchange (ETDEWEB)

    Zeigler, Kristine E.; Ferguson, Blythe A. [Savannah River National Laboratory, Aiken, South Carolina 29808 (United States)

    2012-07-01

    The Savannah River National Laboratory (SRNL) has established an In Situ Decommissioning (ISD) Sensor Network Test Bed, a unique, small-scale, configurable environment for the assessment of prospective sensors on actual ISD system material at minimal cost. The Department of Energy (DOE) is presently implementing permanent entombment of contaminated, large nuclear structures via ISD. The ISD end state consists of a grout-filled concrete civil structure within the concrete frame of the original building. Validation of ISD system performance models and verification of actual system conditions can be achieved through the development of a system of sensors to monitor the materials and condition of the structure. The ISD Sensor Network Test Bed has been designed and deployed to address the DOE Environmental Management technology need to develop a remote monitoring system to determine and verify ISD system performance. Commercial off-the-shelf (COTS) sensors have been installed on concrete blocks taken from walls of the P Reactor Building at the Savannah River Site. Deployment of this low-cost structural monitoring system provides hands-on experience with sensor networks. The initial sensor system consists of groutable thermistors for temperature and moisture monitoring, strain gauges for crack growth monitoring, tilt-meters for settlement monitoring, and a communication system for data collection. Baseline data and lessons learned from system design, installation and initial field testing will be utilized for future ISD sensor network development and deployment. The Sensor Network Test Bed at SRNL uses COTS sensors on concrete blocks from the outer wall of the P Reactor Building to measure conditions expected to occur in ISD structures. Knowledge and lessons learned from installation, testing, and monitoring of the equipment will be applied to sensor installation in a meso-scale test bed at FIU and in future ISD structures. The initial data collected from the sensors

  16. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, the software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown Systems No. 1 and 2 (SDS1, 2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  17. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  18. First investigations on the safety evaluation of smart sensors

    Energy Technology Data Exchange (ETDEWEB)

    Bousquet, S.; Elsensohn, O. [CEA Fontenay aux Roses, 92 (France). Inst. de Protection et de Surete Nucleaire; Benoit, G. [CEA Saclay, Dir. de la Recherche Technologique DRT, 91 - Gif sur Yvette (France)

    2001-10-01

    IPSN (Institute for Protection and Nuclear Safety) is the technical support for the French nuclear safety authority and thus involved in the safety evaluation of new I and C technologies and particularly of smart sensors. Smart sensors are characterized by the use of a microprocessor that converts the process variable into digital signals and exchanges other information with I and C control systems. There are two types of smart sensors: HART (Highway Addressable Remote Transducer) sensors, which provide both analogue (4 to 20 mA) and digital signals, and network sensors, which provide only digital signals. The expected benefits for operators are improved accuracy and reliability and cost savings in installation, commissioning, testing and maintenance. Safety evaluation of these smart sensors raises new issues: How does the sensor react to unknown commands? How to avoid unexpected changes in configuration? What is its sensitivity to electromagnetic interferences (EMI), to radiations...? In order to evaluate whether these sensors can be qualified for a safety application and to define the qualification tests to be done, IPSN has planned some functional and hardware tests (EMI, radiations) on 'HART' and field bus sensors. During the functional tests, we were not able to disrupt the HART tested sensors by invalid commands. However, these results cannot be extended to other sensors, because of the use of different technology, of different versions of hardware and software and of constructors' specific commands. Furthermore, easy modifications of configuration parameters can cause additional failures. Environmental tests are in progress on HART sensors and will be followed by experiments on field bus sensors. These preliminary investigations and the latest incident initiated by an incorrect computing algorithm of digital switchgear at Ringhals NPP, clearly illustrate that testing and verification programmes for smart equipment must be meticulously designed

  19. First investigations on the safety evaluation of smart sensors

    International Nuclear Information System (INIS)

    Bousquet, S.; Elsensohn, O.

    2001-10-01

    IPSN (Institute for Protection and Nuclear Safety) is the technical support for the French nuclear safety authority and thus involved in the safety evaluation of new I and C technologies and particularly of smart sensors. Smart sensors are characterized by the use of a microprocessor that converts the process variable into digital signals and exchanges other information with I and C control systems. There are two types of smart sensors: HART (Highway Addressable Remote Transducer) sensors, which provide both analogue (4 to 20 mA) and digital signals, and network sensors, which provide only digital signals. The expected benefits for operators are improved accuracy and reliability and cost savings in installation, commissioning, testing and maintenance. Safety evaluation of these smart sensors raises new issues: How does the sensor react to unknown commands? How to avoid unexpected changes in configuration? What is its sensitivity to electromagnetic interferences (EMI), to radiations...? In order to evaluate whether these sensors can be qualified for a safety application and to define the qualification tests to be done, IPSN has planned some functional and hardware tests (EMI, radiations) on 'HART' and field bus sensors. During the functional tests, we were not able to disrupt the HART tested sensors by invalid commands. However, these results cannot be extended to other sensors, because of the use of different technology, of different versions of hardware and software and of constructors' specific commands. Furthermore, easy modifications of configuration parameters can cause additional failures. Environmental tests are in progress on HART sensors and will be followed by experiments on field bus sensors. These preliminary investigations and the latest incident initiated by an incorrect computing algorithm of digital switchgear at Ringhals NPP, clearly illustrate that testing and verification programmes for smart equipment must be meticulously designed and reviewed

  20. Autonomous Deployment and Restoration of Sensor Network using Mobile Robots

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Suzuki

    2010-09-01

    Full Text Available This paper describes the autonomous deployment and restoration of a Wireless Sensor Network (WSN) using mobile robots. The authors have been developing an information-gathering system using mobile robots and WSNs in underground spaces in post-disaster environments. In our system, mobile robots carry wireless sensor nodes (SNs) and deploy them into the environment while measuring Received Signal Strength Indication (RSSI) values to ensure communication, thereby enabling the WSN to be deployed and restored autonomously. If the WSN is disrupted, mobile robots restore the communication route by deploying additional or alternate SNs to suitable positions. Utilizing the proposed method, a mobile robot can deploy a WSN and gather environmental information via the WSN. Experimental results using a verification system equipped with an SN deployment and retrieval mechanism are presented.
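
    The deployment rule described above (drop a new node before the link to the last deployed node degrades too far) can be sketched as a simple RSSI threshold test; the threshold and trace values are assumptions, not figures from the paper.

    ```python
    RSSI_DEPLOY_THRESHOLD_DBM = -80.0   # hypothetical margin that still guarantees a usable link

    def should_deploy_node(rssi_to_last_node_dbm: float, nodes_remaining: int) -> bool:
        """Robot-side rule: drop a new sensor node before the link to the last one is lost."""
        return nodes_remaining > 0 and rssi_to_last_node_dbm <= RSSI_DEPLOY_THRESHOLD_DBM

    # Simulated RSSI samples as the robot moves away from the last deployed node.
    trace = [-52.0, -61.5, -70.2, -78.9, -81.3, -85.0]
    for rssi in trace:
        if should_deploy_node(rssi, nodes_remaining=3):
            print(f"deploy node at RSSI {rssi} dBm")
            break
    ```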

  1. Autonomous Deployment and Restoration of Sensor Network using Mobile Robots

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Suzuki

    2010-06-01

    Full Text Available This paper describes the autonomous deployment and restoration of a Wireless Sensor Network (WSN) using mobile robots. The authors have been developing an information-gathering system using mobile robots and WSNs in underground spaces in post-disaster environments. In our system, mobile robots carry wireless sensor nodes (SNs) and deploy them into the environment while measuring Received Signal Strength Indication (RSSI) values to ensure communication, thereby enabling the WSN to be deployed and restored autonomously. If the WSN is disrupted, mobile robots restore the communication route by deploying additional or alternate SNs to suitable positions. Utilizing the proposed method, a mobile robot can deploy a WSN and gather environmental information via the WSN. Experimental results using a verification system equipped with an SN deployment and retrieval mechanism are presented.

  2. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables

  3. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
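
    As a loose illustration of allocating examination effort by marginal verification risk, the sketch below greedily examines the subsystem with the highest marginal risk until a stop criterion or budget is reached; the risk values, the halving assumption and the stop criterion are invented and do not reproduce the paper's verification risk function.

    ```python
    def allocate_examinations(marginal_risk, budget, stop_risk):
        """Greedy allocation: repeatedly examine the subsystem with the highest
        marginal verification risk, assuming each examination halves that risk."""
        risk = dict(marginal_risk)
        plan = []
        while budget > 0 and sum(risk.values()) > stop_risk:
            worst = max(risk, key=risk.get)
            plan.append(worst)
            risk[worst] *= 0.5          # assumed effect of one examination
            budget -= 1
        return plan, sum(risk.values())

    # Hypothetical marginal verification risks per subsystem.
    plan, residual = allocate_examinations(
        {"power management": 0.20, "dynamic positioning": 0.35, "ballast control": 0.10},
        budget=6, stop_risk=0.25)
    print(plan, f"residual risk ~ {residual:.2f}")
    ```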

  4. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  5. Fabrication of a printed capacitive air-gap touch sensor

    Science.gov (United States)

    Lee, Sang Hoon; Seo, Hwiwon; Lee, Sangyoon

    2018-05-01

    Unlike lithography-based processes, printed electronics does not require etching, which makes it difficult to fabricate electronic devices with an air gap. In this study, we propose a method to fabricate capacitive air-gap touch sensors via printing and coating. First, the bottom electrode was fabricated on a flexible poly(ethylene terephthalate) (PET) substrate using roll-to-roll gravure printing with silver ink. Then poly(dimethylsiloxane) (PDMS) was spin coated to form a sacrificial layer. The top electrode was fabricated on the sacrificial layer by spin coating with a stretchable silver ink. The sensor samples were then put in a tetrabutylammonium fluoride (TBAF) bath to generate the air gap by removing the sacrificial layer. The capacitance of the samples was measured for verification, and the results show that the capacitance increases in proportion to the applied force from 0 to 2.5 N.

  6. System-Level Modelling and Simulation of MEMS-Based Sensors

    DEFF Research Database (Denmark)

    Virk, Kashif M.; Madsen, Jan; Shafique, Mohammad

    2005-01-01

    The growing complexity of MEMS devices and their increased use in embedded systems (e.g., wireless integrated sensor networks) demands a disciplined approach to MEMS design as well as the development of techniques for system-level modeling of these devices, so that a seamless integration with the existing embedded system design methodologies is possible. In this paper, we present a MEMS design methodology that uses a VHDL-AMS based system-level model of a MEMS device as a starting point and combines the top-down and bottom-up design approaches for design, verification, and optimization

  7. Sustainable Load-Balancing Scheme for Inter-Sensor Convergence Processing of Routing Cooperation Topology

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2016-05-01

    Full Text Available Recent advancements in Information Technology (IT) have sparked the creation of numerous and diverse types of devices and services. Manual data collection measurement methods have been automated through the use of various wireless or wired sensors. Single sensor devices are included in smart devices such as smartphones. Data transmission is critical for big data collected from sensor nodes, whether Mobile Sensor Nodes (MSNs), where sensors move dynamically according to sensor mobility, or Fixed Sensor Nodes (FSNs), where sensor locations are decided by the users. Faulty data transfer processing of big data results in topology lifespan reduction and data transfer delays. Hence, a variety of simulators and diverse load-balancing algorithms have been developed as protocol verification tools for topology lifespan maximization and effective data transfer processing. However, those previously developed simulators have limited functions, such as an event function for a specific sensor or a battery consumption rate test for sensor deployment. Moreover, since the previous load-balancing algorithms consider only the general traffic distribution and the number of connected nodes, without considering the current topology condition, a sustainable load-balancing technique that takes into account the battery consumption rate of the dispersed sensor nodes is required. Therefore, this paper proposes the Sustainable Load-balancing Scheme (SLS), which maximizes the overall topology lifespan through effective and sustainable load-balancing of data transfers among the sensors. SLS is capable of maintaining an effective topology as it considers both the battery consumption rate of the sensors and the data transfer delay.
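
    The battery-aware balancing idea can be illustrated with a next-hop selection that weighs residual battery against current traffic load; the weights and the neighbour table below are hypothetical and far simpler than SLS itself.

    ```python
    def pick_next_hop(candidates, w_battery=0.7, w_load=0.3):
        """Choose the forwarding node with the best combined battery/traffic score.

        candidates: list of dicts with 'id', 'battery' (0..1 residual) and 'load' (0..1 queue use).
        """
        def score(node):
            return w_battery * node["battery"] - w_load * node["load"]
        return max(candidates, key=score)["id"]

    neighbours = [
        {"id": "FSN-3", "battery": 0.82, "load": 0.40},
        {"id": "MSN-7", "battery": 0.55, "load": 0.10},
        {"id": "FSN-9", "battery": 0.90, "load": 0.75},
    ]
    print(pick_next_hop(neighbours))
    ```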

  8. High capacity fiber optic sensor networks using hybrid multiplexing techniques and their applications

    Science.gov (United States)

    Sun, Qizhen; Li, Xiaolei; Zhang, Manliang; Liu, Qi; Liu, Hai; Liu, Deming

    2013-12-01

    Fiber optic sensor networks are the development trend of fiber sensor technologies and industries. In this paper, I will discuss recent research progress on high capacity fiber sensor networks with hybrid multiplexing techniques and their applications in the fields of security monitoring, environment monitoring, Smart eHome, etc. Firstly, I will present the architecture of the hybrid multiplexing sensor passive optical network (HSPON) and the key technologies for integrated access and intelligent management of massive fiber sensor units. Two typical hybrid WDM/TDM fiber sensor networks for perimeter intrusion monitoring and cultural relics security are introduced. Secondly, we propose the concept of the "Microstructure-Optical X Domain Reflector (M-OXDR)" for fiber sensor network expansion. By fabricating smart micro-structures with multidimensional encoding capability and low insertion loss along the fiber, a fiber sensor network of simple structure and a capacity of more than one thousand sensors could be achieved. Assisted by the WDM/TDM and WDM/FDM decoding methods respectively, we built verification systems for long-haul and real-time temperature sensing. Finally, I will show a high capacity and flexible fiber sensor network with IPv6 protocol based hybrid fiber/wireless access. By developing fiber optic sensors with an embedded IPv6 protocol conversion module and an IPv6 router, huge numbers of fiber optic sensor nodes can be uniquely addressed. Meanwhile, various kinds of sensing information can be integrated and accessed through the Next Generation Internet.

  9. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  10. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  11. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  12. Non-Invasive Fiber-Optic Biomedical Sensor for Basic Vital Sign Monitoring

    Directory of Open Access Journals (Sweden)

    Jan Nedoma

    2017-01-01

    Full Text Available This article focuses on the functionality verification of a novel non-invasive fiber-optic sensor monitoring basic vital signs such as Respiratory Rate (RR), Heart Rate (HR) and Body Temperature (BT). The integration of three sensors in one unit is a unique solution patented by our research team. The integrated sensor is based on two Fiber Bragg Gratings (FBGs) encapsulated inside an inert polymer (non-reactive to human skin) called PolyDiMethylSiloxane (PDMS). The PDMS is beginning to find widespread applications in the biomedical field due to its desirable properties, especially its immunity to ElectroMagnetic Interference (EMI). The integrated sensor's functionality was verified by carrying out a series of laboratory experiments on 10 volunteer subjects after obtaining their written informed consent. The Bland-Altman statistical analysis produced satisfactory accuracy for the respiratory and heart rate measurements and their respective reference signals in all test subjects. A total relative error of 0.31% was determined for body temperature measurements. The main contribution of this article is a proof-of-concept of a novel non-invasive fiber-optic sensor which could be used for basic vital sign monitoring. This sensor offers the potential to enhance and improve the comfort level of patients in hospitals and clinics and can even be considered for use in Magnetic Resonance Imaging (MRI) environments.
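
    The Bland-Altman agreement analysis mentioned above can be reproduced in a few lines; the sketch below uses made-up sample values, not the study data, and follows the usual bias and 95% limits-of-agreement definitions.

      # Generic Bland-Altman agreement analysis (illustrative data, not from the study).
      import numpy as np

      sensor = np.array([16.2, 15.8, 17.1, 16.5, 15.9])     # e.g. respiratory rate from the FBG sensor
      reference = np.array([16.0, 16.0, 17.0, 16.4, 16.1])  # reference monitor

      diff = sensor - reference
      bias = diff.mean()
      sd = diff.std(ddof=1)
      loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd  # 95% limits of agreement

      print(f"bias={bias:.3f}, limits of agreement=({loa_low:.3f}, {loa_high:.3f})")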

  13. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  14. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  15. Sensor/signal monitoring and plant maintenance

    International Nuclear Information System (INIS)

    Ciftcioglu, Oe.; Tuerkcan, E.

    1994-02-01

    Nuclear Power Plant (NPP) availability is determined by the intended functionality of safety-related systems and components. Therefore, maintenance is an important issue in a power plant, closely connected to the plant's reliability and safety. Traditional maintenance policies have proved to be rather costly and do not effectively address NPP requirements. In response to these drawbacks, reliability-centered maintenance (RCM) has gained substantial interest in the nuclear field over the last decade due to its merits. In the formal implementation of RCM, predictive maintenance is apparently not considered. However, with the impact of modern real-time and on-line surveillance and monitoring methodologies, predictive maintenance procedures such as sensor/signal verification and validation should be included in RCM. (orig.)

  16. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference.

  17. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  18. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  19. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  20. A Dynamic Precision Evaluation Method for the Star Sensor in the Stellar-Inertial Navigation System.

    Science.gov (United States)

    Lu, Jiazhen; Lei, Chaohua; Yang, Yanqiang

    2017-06-28

    Integrating the advantages of INS (inertial navigation system) and the star sensor, the stellar-inertial navigation system has been used for a wide variety of applications. The star sensor is a high-precision attitude measurement instrument; therefore, determining how to validate its accuracy is critical in guaranteeing its practical precision. The dynamic precision evaluation of the star sensor is more difficult than a static precision evaluation because of dynamic reference values and other impacts. This paper proposes a dynamic precision verification method for the star sensor with the aid of an inertial navigation device to realize real-time attitude accuracy measurement. Based on the gold-standard reference generated by the star simulator, the altitude and azimuth angle errors of the star sensor are calculated as evaluation criteria. To diminish the impact of factors such as sensor drift and device imperfections, the innovative aspect of this method is to employ the static accuracy for comparison. If the dynamic results are as good as the static results, whose accuracy is comparable to the single star sensor's precision, the practical precision of the star sensor is sufficiently high to meet the requirements of the system specification. The experiments demonstrate the feasibility and effectiveness of the proposed method.
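
    A minimal sketch of the comparison criterion described above: compute the attitude angle errors against a reference, then check that the dynamic error statistics are comparable to the static ones. The sample values, the RMS metric and the tolerance factor are simplifying assumptions, not the authors' exact procedure.

      # Simplified sketch of comparing dynamic vs. static star-sensor attitude errors.
      import numpy as np

      def angle_errors(measured_deg, reference_deg):
          """Per-sample altitude/azimuth errors in arcseconds (inputs in degrees)."""
          return (np.asarray(measured_deg) - np.asarray(reference_deg)) * 3600.0

      def rms(err):
          return float(np.sqrt(np.mean(np.square(err))))

      # Illustrative numbers only.
      static_err = angle_errors([10.0002, 10.0001, 9.9999], [10.0, 10.0, 10.0])
      dynamic_err = angle_errors([20.0003, 19.9998, 20.0002], [20.0, 20.0, 20.0])

      static_rms, dynamic_rms = rms(static_err), rms(dynamic_err)
      # Pass if dynamic accuracy is comparable to static accuracy (tolerance factor is an assumption).
      print(dynamic_rms <= 1.2 * static_rms, static_rms, dynamic_rms)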

  1. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  2. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views, that is, views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  3. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: Reason(s) why a posteriori verification is to be performed; Scope and objectives for the level of verification selected; Development products to be used for the review; Availability and use of user experience; and Actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  4. Fiber-optic evanescent-field sensor for attitude measurement

    Science.gov (United States)

    Liu, Yun; Chen, Shimeng; Liu, Zigeng; Guang, Jianye; Peng, Wei

    2017-11-01

    We proposed a new approach to attitude measurement based on an evanescent-field optical fiber sensing device and demonstrated a liquid pendulum. The device consisted of three fiber-optic evanescent-field sensors, which were fabricated from tapered single-mode fibers and immersed in liquid. Three fiber Bragg gratings were used to measure the changes in the evanescent field, and their reflection peaks were monitored in real time as measurement signals. Because every set of reflection responses corresponded to a unique attitude, the attitude of the device could be measured by the three fiber-optic evanescent-field sensors. After theoretical analysis, computerized simulation and experimental verification, regular responses were obtained using this device for attitude measurement. The measurement ranges of the dihedral angle and direction angle were 0°-50° and 0°-360°. The device is based on a cost-effective power-referenced scheme and can be used in electromagnetic or nuclear radiation environments.

  5. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

    A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated, along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. An adequate amount of data was collected and analyzed to satisfy the research team that the HTFOS operated properly while the engine was running. Temperature measurements made by the HTFOS while the engine was running agreed with those anticipated.

  6. Practical experience with a local verification system for containment and surveillance sensors

    International Nuclear Information System (INIS)

    Lauppe, W.D.; Richter, B.; Stein, G.

    1984-01-01

    With the growing number of nuclear facilities and a number of large commercial bulk-handling facilities steadily coming into operation, the International Atomic Energy Agency is faced with increasing requirements to reduce its inspection effort. One means of meeting these requirements will be to deploy facility-based remote interrogation methods for its containment and surveillance instrumentation. Such a technical concept of remote interrogation was realized through the so-called LOVER system development, a local verification system for electronic safeguards seal systems. In the present investigations the application was extended to radiation monitoring by introducing an electronic interface between the electronic safeguards seal and the neutron detector electronics of a waste monitoring system. The paper discusses the safeguards motivation and background, the experimental setup of the safeguards system and the performance characteristics of this LOVER system. First conclusions can be drawn from the performance results with respect to applicability in international safeguards. This comprises in particular the definition of design specifications for an integrated remote interrogation system for various types of containment and surveillance instruments and the specification of safeguards applications employing such a system.

  7. A Micro-Force Sensor with Beam-Membrane Structure for Measurement of Friction Torque in Rotating MEMS Machines

    Directory of Open Access Journals (Sweden)

    Huan Liu

    2017-10-01

    Full Text Available In this paper, a beam-membrane (BM) sensor for measuring friction torque in micro-electro-mechanical system (MEMS) gas bearings is presented. The proposed sensor measures the force-arm-transformed force using a detecting probe and the piezoresistive effect. This solution incorporates a membrane into a conventional four-beam structure to meet the range requirements for the measurement of both the maximum static friction torque and the kinetic friction torque in rotating MEMS machines, as well as to eliminate the low sensitivity of a plain membrane structure. A glass wafer is bonded onto the bottom of the sensor chip with a certain gap to protect the sensor when overloaded. Comparisons between the performance of the beam-based sensor, the membrane-based sensor and the BM sensor are conducted by the finite element method (FEM), and the final sensor dimensions are also determined. Calibration of the fabricated and packaged device is performed experimentally. Practical verification is also reported in the paper, estimating the friction torque in micro gas bearings by assembling the proposed sensor into a rotary-table-based measurement system. The results demonstrate that the proposed force sensor has potential application in measuring micro friction or force in MEMS machines.
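
    As a simple illustration of the force-arm principle used by such a sensor, the sketch below converts a piezoresistive bridge output into a friction torque; the calibration factor and arm length are placeholders, not values from the paper.

      # Hypothetical conversion from bridge output to friction torque (placeholder calibration).
      BRIDGE_SENSITIVITY_MV_PER_MN = 0.85   # assumed calibration: mV of bridge output per mN of probe force
      ARM_LENGTH_M = 2.0e-3                 # assumed force arm: 2 mm from bearing axis to probe contact

      def friction_torque_nm(bridge_output_mv):
          """Convert measured bridge voltage to friction torque (N*m)."""
          force_n = (bridge_output_mv / BRIDGE_SENSITIVITY_MV_PER_MN) * 1e-3  # mN -> N
          return force_n * ARM_LENGTH_M

      print(f"{friction_torque_nm(4.2):.3e} N*m")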

  8. A Vehicular Mobile Standard Instrument for Field Verification of Traffic Speed Meters Based on Dual-Antenna Doppler Radar Sensor.

    Science.gov (United States)

    Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue

    2018-04-05

    Traffic speed meters are important legal measuring instruments specially used for traffic speed enforcement and must be tested and verified in the field every year using a vehicular mobile standard speed-measuring instrument to ensure their speed-measuring performance. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor is rapidly reduced if the number of received satellites is insufficient, which often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation; it has no specific requirements for its mounting distance, has no limitation on usage regions, and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a high-accuracy GPS speed sensor showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument.
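
    The automatic compensation of an inclined installation angle can be illustrated with the standard two-antenna Doppler geometry: if the two beams are offset from the antenna boresight by a known half-angle, the unknown mounting angle can be solved from the two Doppler shifts and then eliminated. The model, wavelength and angles below are assumptions for illustration, not the instrument's actual parameters.

      # Illustrative dual-antenna Doppler speed solution with unknown mounting angle alpha.
      # Assumed model: f1 = (2*v/lam)*cos(alpha - delta), f2 = (2*v/lam)*cos(alpha + delta).
      import math

      def speed_from_dual_doppler(f1_hz, f2_hz, lam_m, delta_rad):
          """Solve for vehicle speed v and mounting angle alpha from two Doppler shifts."""
          alpha = math.atan(((f1_hz - f2_hz) / (f1_hz + f2_hz)) / math.tan(delta_rad))
          v = lam_m * f1_hz / (2.0 * math.cos(alpha - delta_rad))
          return v, math.degrees(alpha)

      # Synthetic check: 30 m/s, 24 GHz radar (lam ~ 0.0125 m), beams +/- 20 deg, mount tilted 5 deg.
      lam, delta, alpha_true, v_true = 0.0125, math.radians(20.0), math.radians(5.0), 30.0
      f1 = 2 * v_true / lam * math.cos(alpha_true - delta)
      f2 = 2 * v_true / lam * math.cos(alpha_true + delta)
      print(speed_from_dual_doppler(f1, f2, lam, delta))  # recovers approximately (30.0, 5.0)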

  9. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  10. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as the necessary numerical simulations are carried out to support the proposed verification method. (paper)
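
    The authors' dual-correlation algorithm is not reproduced here; as background, the sketch below computes a conventional phase-only correlation between a stored lock pattern and a presented key and thresholds the correlation peak, which is the basic decision step such verification systems build on. The array sizes and threshold are arbitrary.

      # Generic phase-only correlation (background only; not the paper's dual-POC scheme).
      import numpy as np

      def phase_only_correlation(lock, key):
          """Cross-correlate two 2-D patterns using only their Fourier phases."""
          F, G = np.fft.fft2(lock), np.fft.fft2(key)
          cross = F * np.conj(G)
          cross /= np.abs(cross) + 1e-12          # keep phase, discard magnitude
          return np.abs(np.fft.ifft2(cross))

      rng = np.random.default_rng(0)
      lock = rng.random((64, 64))
      genuine, forged = lock.copy(), rng.random((64, 64))

      THRESHOLD = 0.5  # arbitrary decision threshold for the normalized peak
      print(phase_only_correlation(lock, genuine).max() > THRESHOLD)  # True  (sharp peak near 1.0)
      print(phase_only_correlation(lock, forged).max() > THRESHOLD)   # False (no distinct peak)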

  11. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  12. An approach to calculating metal particle detection in lubrication oil based on a micro inductive sensor

    Science.gov (United States)

    Wu, Yu; Zhang, Hongpeng

    2017-12-01

    A new microfluidic chip is presented to enhance the sensitivity of a micro inductive sensor, and an approach to coil inductance change calculation is introduced for metal particle detection in lubrication oil. Electromagnetic knowledge is used to establish a mathematical model of an inductive sensor for metal particle detection, and the analytic expression of coil inductance change is obtained by a magnetic vector potential. Experimental verification is carried out. The results show that copper particles 50-52 µm in diameter have been detected; the relative errors between the theoretical and experimental values are 7.68% and 10.02% at particle diameters of 108-110 µm and 50-52 µm, respectively. The approach presented here can provide a theoretical basis for an inductive sensor in metal particle detection in oil and other areas of application.

  13. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  14. Container Verification Using Optically Stimulated Luminescence

    International Nuclear Information System (INIS)

    Tanner, Jennifer E.; Miller, Steven D.; Conrady, Matthew M.; Simmons, Kevin L.; Tinker, Michael R.

    2008-01-01

    Containment verification is a high priority for safeguards containment and surveillance. Nuclear material containers, safeguards equipment cabinets, camera housings, and detector cable conduit are all vulnerable to tampering. Even with a high security seal on a lid or door, custom-built hinges and interfaces, and special colors and types of finishes, the surfaces of enclosures can be tampered with and any penetrations repaired and covered over. With today's technology, these repairs would not be detected during a simple visual inspection. Several suggested solutions have been to develop complicated networks of wires, fiber-optic cables, lasers or other sensors that line the inside of a container and alarm when the network is disturbed. This results in an active system with real time evidence of tampering but is probably not practical for most safeguards applications. A more practical solution would be to use a passive approach where an additional security feature was added to surfaces which would consist of a special coating or paint applied to the container or enclosure. One type of coating would incorporate optically stimulated luminescent (OSL) material. OSL materials are phosphors that luminesce in proportion to the ionizing radiation dose when stimulated with the appropriate optical wavelengths. The OSL fluoresces at a very specific wavelength when illuminated at another, very specific wavelength. The presence of the pre-irradiated OSL material in the coating is confirmed using a device that interrogates the surface of the enclosure using the appropriate optical wavelength and then reads the resulting luminescence. The presence of the OSL indicates that the integrity of the surface is intact. The coating itself could be transparent which would allow the appearance of the container to remain unchanged or the OSL material could be incorporated into certain paints or epoxies used on various types of containers. The coating could be applied during manufacturing

  15. Optimal placement of excitations and sensors for verification of large dynamical systems

    Science.gov (United States)

    Salama, M.; Rose, T.; Garba, J.

    1987-01-01

    The computationally difficult problem of the optimal placement of excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic algorithm. Results of numerical experiments including a square plate and a 960 degrees-of-freedom Control of Flexible Structure (COFS) truss structure, are presented. Though the algorithm produces suboptimal solutions, its generality and simplicity allow the treatment of complex dynamical systems which would otherwise be difficult to handle.
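
    A compact sketch of the simulated-annealing heuristic described above, applied to choosing sensor locations: candidate locations are swapped at random and moves are accepted with the usual Metropolis rule. The objective used here (a precomputed per-location observability score) is only a stand-in for the measurement-maximization criterion of the paper.

      # Simulated-annealing sketch for picking sensor locations (objective is a stand-in).
      import math, random

      def anneal_placement(gain, n_sensors, steps=5000, t0=1.0, cooling=0.999, seed=1):
          """gain[i] = observability score of candidate location i (assumed precomputed)."""
          rng = random.Random(seed)
          all_locs = list(range(len(gain)))
          current = set(rng.sample(all_locs, n_sensors))
          score = lambda s: sum(gain[i] for i in s)
          best, best_score, temp = set(current), score(current), t0
          for _ in range(steps):
              out_loc = rng.choice(list(current))
              in_loc = rng.choice([i for i in all_locs if i not in current])
              candidate = (current - {out_loc}) | {in_loc}
              delta = score(candidate) - score(current)
              if delta >= 0 or rng.random() < math.exp(delta / temp):  # Metropolis acceptance
                  current = candidate
                  if score(current) > best_score:
                      best, best_score = set(current), score(current)
              temp *= cooling
          return sorted(best), best_score

      rng_demo = random.Random(0)
      gains = [rng_demo.random() for _ in range(40)]   # illustrative observability scores
      print(anneal_placement(gains, n_sensors=5))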

  16. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of such systems ... on a number of case studies, tackled using a prototypical implementation.

  17. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  18. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code be evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  19. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  20. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  1. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification or to evaluate, using independent software, the performance of specific functions in the code. Here, we will describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We will also discuss some of the lessons we learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  2. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  3. A High-Temperature Piezoresistive Pressure Sensor with an Integrated Signal-Conditioning Circuit

    Directory of Open Access Journals (Sweden)

    Zong Yao

    2016-06-01

    Full Text Available This paper focuses on the design and fabrication of a high-temperature piezoresistive pressure sensor with an integrated signal-conditioning circuit, which consists of an encapsulated pressure-sensitive chip, a temperature compensation circuit and a signal-conditioning circuit. A silicon-on-insulator (SOI) material and a standard MEMS process are used in the pressure-sensitive chip fabrication, and high-temperature electronic components are adopted in the temperature-compensation and signal-conditioning circuits. The entire pressure sensor achieves a hermetic seal and can be operated long-term in the range of −50 °C to 220 °C. Unlike traditional pressure sensor output voltage ranges (in the dozens to hundreds of millivolts), the output voltage of this sensor is from 0 V to 5 V, which can significantly improve the signal-to-noise ratio and measurement accuracy in practical applications involving long-term transmission, as confirmed by experimental verification. Furthermore, because this flexible sensor's output voltage is adjustable, general follow-up pressure transmitter devices for voltage conversion need not be used, which greatly reduces the cost of the test system. Thus, the proposed high-temperature piezoresistive pressure sensor with an integrated signal-conditioning circuit is expected to be highly applicable to pressure measurements in harsh environments.
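
    To illustrate the role of the signal-conditioning stage, the sketch below amplifies a millivolt-level bridge output to the 0-5 V range and applies a simple temperature correction; the gain, full-scale value and correction coefficients are placeholders, not parameters of the sensor described above.

      # Illustrative signal conditioning: bridge millivolts -> temperature-corrected 0..5 V output.
      FULL_SCALE_MV = 100.0          # assumed bridge output at full-scale pressure
      GAIN = 5.0 / FULL_SCALE_MV     # amplifier gain so full scale maps to 5 V
      TEMP_COEFFS = (1.0, -2.0e-4)   # assumed linear span correction vs. temperature offset from 25 C

      def conditioned_output(bridge_mv, temp_c):
          """Apply temperature compensation, then scale to the 0-5 V output range."""
          span_correction = TEMP_COEFFS[0] + TEMP_COEFFS[1] * (temp_c - 25.0)
          v_out = bridge_mv * span_correction * GAIN
          return min(max(v_out, 0.0), 5.0)   # clamp to the conditioned output range

      print(conditioned_output(bridge_mv=62.0, temp_c=180.0))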

  4. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  5. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  6. Very fast road database verification using textured 3D city models obtained from airborne imagery

    Science.gov (United States)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV-images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of roads. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on the time issue and availability of a geo-database for buildings, the urban terrain reconstruction procedure has semantic models
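
    The abstract does not spell out how the two per-method distributions are fused; one plausible Dempster-Shafer style mapping, which treats 'model not applicable' as mass assigned to the whole frame (read as 'unknown'), is sketched below purely as an assumption-labeled illustration.

      # Hypothetical Dempster-Shafer style fusion of (road state) and (model applicability).
      # Frame of discernment: {correct, incorrect}; mass on the whole frame reads as "unknown".
      def fuse(p_correct, p_applicable):
          """Return masses for the three reported states: correct, incorrect, unknown."""
          m_correct = p_applicable * p_correct
          m_incorrect = p_applicable * (1.0 - p_correct)
          m_unknown = 1.0 - p_applicable      # model not applicable -> no commitment
          return {"correct": m_correct, "incorrect": m_incorrect, "unknown": m_unknown}

      # A confident method on an applicable road model vs. a method whose model does not fit.
      print(fuse(p_correct=0.9, p_applicable=0.8))   # mostly "correct"
      print(fuse(p_correct=0.9, p_applicable=0.1))   # mostly "unknown"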

  7. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  8. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  9. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  10. Lightweight Sensor Authentication Scheme for Energy Efficiency in Ubiquitous Computing Environments.

    Science.gov (United States)

    Lee, Jaeseung; Sung, Yunsick; Park, Jong Hyuk

    2016-12-01

    The Internet of Things (IoT) comprises the intelligent technologies and services that mutually communicate information between humans and devices or between Internet-based devices. In IoT environments, various device information is collected from the user for the intelligent technologies and services that control the devices. Recently, wireless sensor networks based on IoT environments have been used in sectors as diverse as medicine, the military, and commerce. Specifically, sensor techniques that collect relevant area data via mini-sensors after distributing smart dust in inaccessible areas like forests or military zones have been embraced as the future of information technology. IoT environments that utilize smart dust are composed of sensor nodes that detect data using wireless sensors and transmit the detected data to middle nodes. Since the sensors used in these environments are built from mini-hardware, they have limited memory, processing power, and energy, and a variety of research that aims to make the best use of these limited resources is in progress. This paper proposes a method to utilize these resources while considering energy efficiency, and suggests lightweight mutual verification and key exchange methods based on a hash function that has no restrictions on operation quantity, speed, and storage space. This study verifies the security and energy efficiency of the method through security analysis and functional evaluation, comparing it with existing approaches. The proposed method has great value in its applicability as a lightweight security technology for IoT environments.
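
    The exact protocol is not given in the abstract; the sketch below shows a generic hash-based mutual verification and session-key derivation of the kind referred to, built only from a pre-shared secret, nonces and a keyed hash. The message layout and key-derivation labels are assumptions, not the authors' scheme.

      # Generic hash-based mutual verification sketch (not the paper's exact protocol).
      import hashlib, hmac, os

      def h(key, *parts):
          return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

      shared_key = os.urandom(32)            # pre-shared secret between sensor and gateway

      # 1. Gateway challenges the sensor with a nonce; sensor answers with a keyed hash.
      gw_nonce = os.urandom(16)
      sensor_nonce = os.urandom(16)
      sensor_proof = h(shared_key, b"sensor", gw_nonce, sensor_nonce)

      # 2. Gateway verifies the sensor, then proves its own knowledge of the key.
      assert hmac.compare_digest(sensor_proof, h(shared_key, b"sensor", gw_nonce, sensor_nonce))
      gw_proof = h(shared_key, b"gateway", sensor_nonce, gw_nonce)

      # 3. Sensor verifies the gateway; both sides derive the same lightweight session key.
      assert hmac.compare_digest(gw_proof, h(shared_key, b"gateway", sensor_nonce, gw_nonce))
      session_key = h(shared_key, b"session", gw_nonce, sensor_nonce)
      print(session_key.hex()[:16])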

  11. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  12. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)

  13. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods on quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
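
    Two of the verification items listed above, within-run precision and carryover, reduce to short calculations on replicate measurements; the sketch below uses illustrative values, and the acceptance limit and carryover formula shown are common practice rather than limits prescribed by the authors.

      # Illustrative precision (CV%) and carryover checks for an analyzer verification plan.
      import statistics

      def cv_percent(replicates):
          """Within-run precision as coefficient of variation."""
          return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

      def carryover_percent(high_runs, low_runs):
          """Assumed scheme: run a high sample three times, then a low sample three times."""
          return 100.0 * (low_runs[0] - low_runs[-1]) / (high_runs[-1] - low_runs[-1])

      wbc_replicates = [6.1, 6.2, 6.0, 6.1, 6.2]   # 10^9/L, illustrative values
      print(f"CV = {cv_percent(wbc_replicates):.2f}% (compare against the chosen limit, e.g. < 3%)")
      print(f"carryover = {carryover_percent([98.0, 97.5, 98.2], [3.1, 3.0, 3.0]):.2f}%")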

  14. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)

  15. Development Of A Sensor Network Test Bed For ISD Materials And Structural Condition Monitoring

    International Nuclear Information System (INIS)

    Zeigler, K.; Ferguson, B.; Karapatakis, D.; Herbst, C.; Stripling, C.

    2011-01-01

    The P Reactor at the Savannah River Site is one of the first reactor facilities in the US DOE complex that has been placed in its end state through in situ decommissioning (ISD). The ISD end state consists of a grout-filled concrete civil structure within the concrete frame of the original building. To evaluate the feasibility and utility of remote sensors to provide verification of ISD system conditions and performance characteristics, an ISD Sensor Network Test Bed has been designed and deployed at the Savannah River National Laboratory. The test bed addresses the DOE-EM Technology Need to develop a remote monitoring system to determine and verify ISD system performance. Commercial off-the-shelf sensors have been installed on concrete blocks taken from walls of the P Reactor Building. Deployment of this low-cost structural monitoring system provides hands-on experience with sensor networks. The initial sensor system consists of: (1) Groutable thermistors for temperature and moisture monitoring; (2) Strain gauges for crack growth monitoring; (3) Tiltmeters for settlement monitoring; and (4) A communication system for data collection. Preliminary baseline data and lessons learned from system design and installation and initial field testing will be utilized for future ISD sensor network development and deployment.

  16. DEVELOPMENT OF A SENSOR NETWORK TEST BED FOR ISD MATERIALS AND STRUCTURAL CONDITION MONITORING

    Energy Technology Data Exchange (ETDEWEB)

    Zeigler, K.; Ferguson, B.; Karapatakis, D.; Herbst, C.; Stripling, C.

    2011-07-06

    The P Reactor at the Savannah River Site is one of the first reactor facilities in the US DOE complex that has been placed in its end state through in situ decommissioning (ISD). The ISD end state consists of a grout-filled concrete civil structure within the concrete frame of the original building. To evaluate the feasibility and utility of remote sensors to provide verification of ISD system conditions and performance characteristics, an ISD Sensor Network Test Bed has been designed and deployed at the Savannah River National Laboratory. The test bed addresses the DOE-EM Technology Need to develop a remote monitoring system to determine and verify ISD system performance. Commercial off-the-shelf sensors have been installed on concrete blocks taken from walls of the P Reactor Building. Deployment of this low-cost structural monitoring system provides hands-on experience with sensor networks. The initial sensor system consists of: (1) Groutable thermistors for temperature and moisture monitoring; (2) Strain gauges for crack growth monitoring; (3) Tiltmeters for settlement monitoring; and (4) A communication system for data collection. Preliminary baseline data and lessons learned from system design and installation and initial field testing will be utilized for future ISD sensor network development and deployment.

  17. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals, using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...
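
    The Interval-Passing algorithm itself is more involved; the sketch below implements only the plain verification rules it builds on for noiseless, nonnegative signals measured with a 0/1 LDPC-style matrix: a zero residual verifies all neighbors of a check as zero, and a check with a single unverified neighbor determines that neighbor directly. This is background for the abstract, not the authors' improved algorithm.

      # Basic verification decoding for nonnegative sparse recovery (background, not the improved IP).
      import numpy as np

      def verification_decode(A, y, max_iters=50):
          """A: 0/1 measurement matrix, y = A @ x for a noiseless, nonnegative sparse x."""
          m, n = A.shape
          x = np.zeros(n)
          verified = np.zeros(n, dtype=bool)
          for _ in range(max_iters):
              progress = False
              for i in range(m):
                  unverified = [j for j in range(n) if A[i, j] and not verified[j]]
                  if not unverified:
                      continue
                  residual = y[i] - A[i] @ x            # contribution still unexplained at check i
                  if np.isclose(residual, 0.0):         # zero residual -> all neighbors must be zero
                      verified[unverified] = True
                      progress = True
                  elif len(unverified) == 1:            # single unknown neighbor -> it equals the residual
                      j = unverified[0]
                      x[j], verified[j], progress = residual, True, True
              if not progress:
                  break
          return x, bool(verified.all())

      A = np.array([[1, 1, 0, 0], [0, 1, 1, 0], [0, 0, 1, 1], [1, 0, 0, 1]])
      x_true = np.array([0.0, 2.5, 0.0, 0.0])
      print(verification_decode(A, A @ x_true))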

  18. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore to facilitate ease in the verification process the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error

  19. Development and Commissioning Results of the Hybrid Sensor Bus Engineering Qualification Model

    Science.gov (United States)

    Hurni, Andreas; Putzer, Phillipp; Roner, Markus; Gurster, Markus; Hulsemeyer, Christian; Lemke, Norbert M. K.

    2016-08-01

    In order to reduce mass, AIT effort and overall costs of classical point-to-point wired temperature sensor harness on-board spacecraft OHB System AGhas introduced the Hybrid Sensor Bus (HSB) system which interrogates sensors connected in a bus architecture. To use the advantages of electrical as wellas of fiber-optical sensing technologies, HSB is designed as a modular measurement system interrogating digital sensors connected on electricalsensor buses based on I2C and fiber-optical sensor buses based on fiber Bragg grating (FBG) sensors inscribed in optical fibers. Fiber-optical sensor bus networks on-board satellites are well suited for temperature measurement due to low mass, electro-magnetic insensitivity and the capability to embed them inside structure parts. The lightweight FBG sensors inscribed in radiation tolerant fibers can reach every part of the satellite. HSB has been developed in the frame of the ESA ARTES program with European and German co- funding and will be verified as flight demonstrator on- board the German Heinrich Hertz satellite (H2Sat).In this paper the Engineering Qualification Model (EQM) development of HSB and first commissioning results are presented. For the HSB development requirements applicable for telecommunication satellite platforms have been considered. This includes an operation of at least 15 years in a geostationary orbit.In Q3/2016 the qualification test campaign is planned to be carried out. The HSB EQM undergoes a full qualification according to ECSS. The paper concludes with an outlook regarding this HSB flight demonstrator development and its in-orbit verification (IOV) on board H2Sat.

  20. Privacy in wireless sensor networks using ring signature

    Directory of Open Access Journals (Sweden)

    Ashmita Debnath

    2014-07-01

    Full Text Available The veracity of a message from a sensor node must be verified in order to avoid a false reaction by the sink. This verification requires the authentication of the source node. The authentication process must also preserve the privacy such that the node and the sensed object are not endangered. In this work, a ring signature was proposed to authenticate the source node while preserving its spatial privacy. However, other nodes as signers and their numbers must be chosen to preclude the possibility of a traffic analysis attack by an adversary. The spatial uncertainty increases with the number of signers but requires larger memory size and communication overhead. This requirement can breach the privacy of the sensed object. To determine the effectiveness of the proposed scheme, the location estimate of a sensor node by an adversary and enhancement in the location uncertainty with a ring signature was evaluated. Using simulation studies, the ring signature was estimated to require approximately four members from the same neighbor region of the source node to sustain the privacy of the node. Furthermore, the ring signature was also determined to have a small overhead and not to adversely affect the performance of the sensor network.

  1. Leak testing and repair of fusion devices

    International Nuclear Information System (INIS)

    Kozman, T.A.

    1983-01-01

    The leak testing, reporting and vacuum leak repair techniques of the MFTF yin-yang number one magnet system, the world's largest superconducting magnet system, are discussed. Based on this experience, techniques will be developed for testing and repairing leaks on the 42 MFTF-B magnets. The leak-hunting techniques for the yin-yang magnet systems were applied to two helium circuits (the coil bundle and guard vacuum; both require helium flow for magnet cooldown), their associated piping, liquid nitrogen radiation shields, and piping. Additionally, during MFTF-B operation there will be warm water plasma shields and piping that require leak checking

  2. Radiological safety design considerations for fusion research experiments

    International Nuclear Information System (INIS)

    Crase, K.W.; Singh, M.S.

    1979-01-01

    A wide variety of fusion research experiments are in the planning or construction stages. Two such experiments, the Nova Laser Fusion Facility and the Mirror Fusion Test Facility (MFTF), are currently under construction at Lawrence Livermore Laboratory. Although the plasma chamber vault for MFTF and the Nova target room will have thick concrete walls and roofs, the radiation safety problems are made complex by the numerous requirements for shield wall penetrations. This paper addresses radiation safety considerations for the MFTF and Nova experiments, and the need for integrated safety considerations and safety technology development during the planning stages of fusion experiments

  3. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  4. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  5. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  6. Verification test for three WindCube WLS7 LiDARs at the Høvsøre test site

    DEFF Research Database (Denmark)

    Gottschall, Julia; Courtney, Michael

    The report describes the procedure of testing ground-based WindCube lidars (manufactured by the French company Leosphere) at the Høvsøre test site in comparison to reference sensors mounted at a meteorological mast. Results are presented for three tested units – in detail for unit WLS7-0062, and ......-0062, and in a summary for units WLS7-0064 and WLS7-0066. The verification test covers the evaluation of measured mean wind speeds, wind directions and wind speed standard deviations. The data analysis is basically performed in terms of different kinds of regression analyses.......The report describes the procedure of testing ground-based WindCube lidars (manufactured by the French company Leosphere) at the Høvsøre test site in comparison to reference sensors mounted at a meteorological mast. Results are presented for three tested units – in detail for unit WLS7...

  7. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  8. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (Intensity modulated radiation therapy) plan, which is used in the Oncological Institute of St. Elisabeth in Bratislava. It contains basic description of IMRT technology and developing a deployment plan for IMRT planning system CORVUS 6.0, the device Mimic (Multilammelar intensity modulated collimator) and the overall process of verifying the schedule created. The aim of verification is particularly good control of the functions of MIMIC and evaluate the overall reliability of IMRT planning. (author)

  9. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    In this paper is presented an automatic process of verification. We focus in the verification of scheduling analysis parameter. This proposal is part of process based on Model Driven Engineering to automate a Verification and Validation process of the software on board of satellites. This process is implemented in a software control unit of the energy particle detector which is payload of Solar Orbiter mission. From the design model is generated a scheduling analysis model and its verification model. The verification as defined as constraints in way of Finite Timed Automatas. When the system is deployed on target the verification evidence is extracted as instrumented points. The constraints are fed with the evidence, if any of the constraints is not satisfied for the on target evidence the scheduling analysis is not valid.

  10. Instrument surveillance and calibration verification through plant wide monitoring using autoassociative neural networks

    International Nuclear Information System (INIS)

    Wrest, D.J.; Hines, J.W.; Uhrig, R.E.

    1996-01-01

    The approach to instrument surveillance and calibration verification (ISCV) through plant wide monitoring proposed in this paper is an autoassociative neural network (AANN) which will utilize digitized data presently available in the Safety Parameter Display computer system from Florida Power Corporations Crystal River number 3 nuclear power plant. An autoassociative neural network is one in which the outputs are trained to emulate the inputs over an appropriate dynamic range. The relationships between the different variables are embedded in the weights by the training process. As a result, the output can be a correct version of an input pattern that has been distorted by noise, missing data, or non-linearities. Plant variables that have some degree of coherence with each other constitute the inputs to the network. Once the network has been trained with normal operational data it has been shown to successfully monitor the selected plant variables to detect sensor drift or failure by simply comparing the network inputs with the outputs. The AANN method of monitoring many variables not only indicates that there is a sensor failure, it clearly indicates the signal channel in which the signal error has occurred. (author). 11 refs, 8 figs, 2 tabs

  11. Instrument surveillance and calibration verification through plant wide monitoring using autoassociative neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Wrest, D J; Hines, J W; Uhrig, R E [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering

    1997-12-31

    The approach to instrument surveillance and calibration verification (ISCV) through plant wide monitoring proposed in this paper is an autoassociative neural network (AANN) which will utilize digitized data presently available in the Safety Parameter Display computer system from Florida Power Corporations Crystal River number 3 nuclear power plant. An autoassociative neural network is one in which the outputs are trained to emulate the inputs over an appropriate dynamic range. The relationships between the different variables are embedded in the weights by the training process. As a result, the output can be a correct version of an input pattern that has been distorted by noise, missing data, or non-linearities. Plant variables that have some degree of coherence with each other constitute the inputs to the network. Once the network has been trained with normal operational data it has been shown to successfully monitor the selected plant variables to detect sensor drift or failure by simply comparing the network inputs with the outputs. The AANN method of monitoring many variables not only indicates that there is a sensor failure, it clearly indicates the signal channel in which the signal error has occurred. (author). 11 refs, 8 figs, 2 tabs.

  12. Estimation of Curvature Changes for Steel-Concrete Composite Bridge Using Fiber Bragg Grating Sensors

    Directory of Open Access Journals (Sweden)

    Donghoon Kang

    2013-01-01

    Full Text Available This study is focused on the verification of the key idea of a newly developed steel-concrete composite bridge. The key idea of the proposed bridge is to reduce the design moment by applying vertical prestressing force to steel girders, so that a moment distribution of a continuous span bridge is formed in a simple span bridge. For the verification of the key technology, curvature changes of the bridge should be monitored sequentially at every construction stage. A pair of multiplexed FBG sensor arrays is proposed in order to measure curvature changes in this study. They are embedded in a full-scale test bridge and measured local strains, which are finally converted to curvatures. From the result of curvature changes, it is successfully ensured that the key idea of the proposed bridge, expected theoretically, is viable.

  13. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  14. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is a task to check whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP circuits. Importantly, we do not make any assumption that the identically and independently distributed copies of the same states are given: Our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.

  15. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  16. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects of 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office.' And we tested verification to use a wet blasting technology for decontamination of rubble and roads contaminated by the accident of Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a results of the verification test, the wet blasting decontamination technology showed that a decontamination rate became 60-80% for concrete paving, interlocking, dense-grated asphalt pavement when applied to the decontamination of the road. When it was applied to rubble decontamination, a decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. It was thought that Cs-134 and Cs-137 attached to the fine sludge scraped off from a decontamination object and the sludge was found to be separated from abrasives by wet cyclene classification: the activity concentration of the abrasives is 1/30 or less than the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  17. The Performance Analysis of AN Indoor Mobile Mapping System with Rgb-D Sensor

    Science.gov (United States)

    Tsai, G. J.; Chiang, K. W.; Chu, C. H.; Chen, Y. L.; El-Sheimy, N.; Habib, A.

    2015-08-01

    Over the years, Mobile Mapping Systems (MMSs) have been widely applied to urban mapping, path management and monitoring and cyber city, etc. The key concept of mobile mapping is based on positioning technology and photogrammetry. In order to achieve the integration, multi-sensor integrated mapping technology has clearly established. In recent years, the robotic technology has been rapidly developed. The other mapping technology that is on the basis of low-cost sensor has generally used in robotic system, it is known as the Simultaneous Localization and Mapping (SLAM). The objective of this study is developed a prototype of indoor MMS for mobile mapping applications, especially to reduce the costs and enhance the efficiency of data collection and validation of direct georeferenced (DG) performance. The proposed indoor MMS is composed of a tactical grade Inertial Measurement Unit (IMU), the Kinect RGB-D sensor and light detection, ranging (LIDAR) and robot. In summary, this paper designs the payload for indoor MMS to generate the floor plan. In first session, it concentrates on comparing the different positioning algorithms in the indoor environment. Next, the indoor plans are generated by two sensors, Kinect RGB-D sensor LIDAR on robot. Moreover, the generated floor plan will compare with the known plan for both validation and verification.

  18. THE PERFORMANCE ANALYSIS OF AN INDOOR MOBILE MAPPING SYSTEM WITH RGB-D SENSOR

    Directory of Open Access Journals (Sweden)

    G. J. Tsai

    2015-08-01

    Full Text Available Over the years, Mobile Mapping Systems (MMSs have been widely applied to urban mapping, path management and monitoring and cyber city, etc. The key concept of mobile mapping is based on positioning technology and photogrammetry. In order to achieve the integration, multi-sensor integrated mapping technology has clearly established. In recent years, the robotic technology has been rapidly developed. The other mapping technology that is on the basis of low-cost sensor has generally used in robotic system, it is known as the Simultaneous Localization and Mapping (SLAM. The objective of this study is developed a prototype of indoor MMS for mobile mapping applications, especially to reduce the costs and enhance the efficiency of data collection and validation of direct georeferenced (DG performance. The proposed indoor MMS is composed of a tactical grade Inertial Measurement Unit (IMU, the Kinect RGB-D sensor and light detection, ranging (LIDAR and robot. In summary, this paper designs the payload for indoor MMS to generate the floor plan. In first session, it concentrates on comparing the different positioning algorithms in the indoor environment. Next, the indoor plans are generated by two sensors, Kinect RGB-D sensor LIDAR on robot. Moreover, the generated floor plan will compare with the known plan for both validation and verification.

  19. Experimental determination of temperatures of the inner wall of a boiler combustion chamber for the purpose of verification of a CFD model

    Directory of Open Access Journals (Sweden)

    Petr Trávníček

    2011-01-01

    Full Text Available The paper focuses on the non-destructive method of determination of temperatures in the boiler combustion chamber. This method proves to be significant mainly as regards CFD (Computational Fluid Dynamics simulations of combustion processes, in case of which it is subsequently advisable to verify the data calculated using CFD software application with the actually measured data. Verification of the method was based on usage of reference combustion equipment (130 kW which performs combustion of a mixture of waste sawdust and shavings originating in the course of production of wooden furniture. Measuring of temperatures inside the combustion chamber is – considering mainly the high temperature values – highly demanding and requires a special type of temperature sensors. Furthermore, as regards standard operation, it is not possible to install such sensors without performing structural alterations of the boiler. Therefore, for the purpose of determination of these temperatures a special experimental device was constructed while exploiting a thermal imaging system used for monitoring of the surface temperature of outer wall of the reference boiler. Temperatures on the wall of the boiler combustion chamber were determined on the basis of data measured using the experimental device as well as data from the thermal imaging system. These values might serve for verification of the respective CFD model of combustion equipment.

  20. Field portable petroleum analysis for validation of the site characterization and analysis penetrometer system petroleum, oil and lubricant sensor

    International Nuclear Information System (INIS)

    Davis, W.M.; Jones, P.; Porter, B.

    1995-01-01

    A petroleum, oil and lubricant (POL) sensor for the Site Characterization and Analysis Penetrometer System (SCAPS) has been developed by the Tri-Services (e.g. Army, Navy and Air Force) to characterize the distribution of POL contaminants on military sites. The sensor is based on the detection of POL contaminants using a laser induced fluorescence (LIF) spectrometer. The SCAPS POL sensor has been shown to be a valuable tool for the rapid screening of POL contamination in the subsurface. However, many factors can affect the LIF response of a particular fuel at a particular site. These include fuel type, age of spill (e.g. weathering) and soil type. The LIF sensor also detects fluorescence from any naturally occurring fluorophores, including humic substances and fluorescent minerals. These factors lead to the development of an independent procedure for the verification of the POL sensor response. This paper describes a field portable total recoverable petroleum hydrocarbon (TRPH) method based on EPA Method 418.1 and its application to on site validation of the SCAPS POL sensor response at a number of contaminated sites

  1. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  2. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level, at which the visual verification was performed and documented

  3. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  4. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification,and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchas Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  5. Detecting and Preventing Sybil Attacks in Wireless Sensor Networks Using Message Authentication and Passing Method.

    Science.gov (United States)

    Dhamodharan, Udaya Suriya Raj Kumar; Vayanaperumal, Rajamani

    2015-01-01

    Wireless sensor networks are highly indispensable for securing network protection. Highly critical attacks of various kinds have been documented in wireless sensor network till now by many researchers. The Sybil attack is a massive destructive attack against the sensor network where numerous genuine identities with forged identities are used for getting an illegal entry into a network. Discerning the Sybil attack, sinkhole, and wormhole attack while multicasting is a tremendous job in wireless sensor network. Basically a Sybil attack means a node which pretends its identity to other nodes. Communication to an illegal node results in data loss and becomes dangerous in the network. The existing method Random Password Comparison has only a scheme which just verifies the node identities by analyzing the neighbors. A survey was done on a Sybil attack with the objective of resolving this problem. The survey has proposed a combined CAM-PVM (compare and match-position verification method) with MAP (message authentication and passing) for detecting, eliminating, and eventually preventing the entry of Sybil nodes in the network. We propose a scheme of assuring security for wireless sensor network, to deal with attacks of these kinds in unicasting and multicasting.

  6. Detecting and Preventing Sybil Attacks in Wireless Sensor Networks Using Message Authentication and Passing Method

    Directory of Open Access Journals (Sweden)

    Udaya Suriya Raj Kumar Dhamodharan

    2015-01-01

    Full Text Available Wireless sensor networks are highly indispensable for securing network protection. Highly critical attacks of various kinds have been documented in wireless sensor network till now by many researchers. The Sybil attack is a massive destructive attack against the sensor network where numerous genuine identities with forged identities are used for getting an illegal entry into a network. Discerning the Sybil attack, sinkhole, and wormhole attack while multicasting is a tremendous job in wireless sensor network. Basically a Sybil attack means a node which pretends its identity to other nodes. Communication to an illegal node results in data loss and becomes dangerous in the network. The existing method Random Password Comparison has only a scheme which just verifies the node identities by analyzing the neighbors. A survey was done on a Sybil attack with the objective of resolving this problem. The survey has proposed a combined CAM-PVM (compare and match-position verification method with MAP (message authentication and passing for detecting, eliminating, and eventually preventing the entry of Sybil nodes in the network. We propose a scheme of assuring security for wireless sensor network, to deal with attacks of these kinds in unicasting and multicasting.

  7. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation: If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focusses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete ''design verification'' strategy comprehends: informing the Agency of any changes in the plant system which are defined as ''safeguards relevant''; ''reverifying by the Agency upon receiving notice from the Operator on any changes, on ''design information''. 13 refs

  8. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MTVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  9. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.......In this paper we sketch a method for specification and automatic verification of real-time software properties....

  10. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.......In this paper we sketch a method for specification and automatic verification of real-time software properties....

  11. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  12. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  13. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.

  14. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization mainly in the algorithms used and the criteria and verification limits applied. A survey in clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but amount of use (percentage of test autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of them, with autoverification ability. Criteria and rules for seven routine biochemical tests were obtained.

  15. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630--670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240 Pu to 239 Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters the a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime

  16. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available Design process of computing systems gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure that encompasses definition of the specification language and denotation and execution of conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  17. Mirror Fusion Test Facility magnet

    International Nuclear Information System (INIS)

    Henning, C.H.; Hodges, A.J.; Van Sant, J.H.; Hinkle, R.E.; Horvath, J.A.; Hintz, R.E.; Dalder, E.; Baldi, R.; Tatro, R.

    1979-01-01

    The Mirror Fusion Test Facility (MFTF) is the largest of the mirror program experiments for magnetic fusion energy. It seeks to combine and extend the near-classical plasma confinement achieved in 2XIIB with the most advanced neutral-beam and magnet technologies. The product of ion density and confinement time will be improved more than an order of magnitude, while the superconducting magnet weight will be extrapolated from the 15 tons in Baseball II to 375 tons in MFTF. Recent reactor studies show that the MFTF will traverse much of the distance in magnet technology towards the reactor regime. Design specifics of the magnet are given

  18. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations used in nuclear power plant quality assurance programs in other countries, was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed than at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  19. Overcoming urban GPS navigation challenges through the use of MEMS inertial sensors and proper verification of navigation system performance

    Science.gov (United States)

    Vinande, Eric T.

    This research proposes several means to overcome challenges in the urban environment to ground vehicle global positioning system (GPS) receiver navigation performance through the integration of external sensor information. The effects of narrowband radio frequency interference and signal attenuation, both common in the urban environment, are examined with respect to receiver signal tracking processes. Low-cost microelectromechanical systems (MEMS) inertial sensors, suitable for the consumer market, are the focus of receiver augmentation as they provide an independent measure of motion and are independent of vehicle systems. A method for estimating the mounting angles of an inertial sensor cluster utilizing typical urban driving maneuvers is developed and is able to provide angular measurements within two degrees of truth. The integration of GPS and MEMS inertial sensors is developed utilizing a full state navigation filter. Appropriate statistical methods are developed to evaluate the urban environment navigation improvement due to the addition of MEMS inertial sensors. A receiver evaluation metric that combines accuracy, availability, and maximum error measurements is presented and evaluated over several drive tests. Following a description of proper drive test techniques, record and playback systems are evaluated as the optimal way of testing multiple receivers and/or integrated navigation systems in the urban environment as they simplify vehicle testing requirements.

  20. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  1. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  2. Towards a model-based development approach for wireless sensor-actuator network protocols

    DEFF Research Database (Denmark)

    Kumar S., A. Ajith; Simonsen, Kent Inge

    2014-01-01

    Model-Driven Software Engineering (MDSE) is a promising approach for the development of applications, and has been well adopted in the embedded applications domain in recent years. Wireless Sensor Actuator Networks consisting of resource constrained hardware and platformspecific operating system...... induced due to manual translations. With the use of formal semantics in the modeling approach, we can further ensure the correctness of the source model by means of verification. Also, with the use of network simulators and formal modeling tools, we obtain a verified and validated model to be used...

  3. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  4. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  5. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  6. Technology of mirror machines: LLL facilities for magnetic mirror fusion experiments

    International Nuclear Information System (INIS)

    Batzer, T.H.

    1977-01-01

    Significant progress in plasma confinement and temperature has been achieved in the 2XIIB facility at Livermore. These encouraging results, and their theoretical corroboration, have provided a firm basis for the design of a new generation of magnetic mirror experiments, adding support to the mirror concept of a fusion reactor. Two new mirror experiments have been proposed to succeed the currently operating 2XIIB facility. The first of these called TMX (Tandem Mirror Experiment) has been approved and is currently under construction. TMX is designed to utilize the intrinsic positive plasma potential of two strong, and relatively small, minimum B mirror cells to enhance the confinement of a much larger, magnetically weaker, centrally-located mirror cell. The second facility, MFTF (Mirror Fusion Test Facility), is currently in preliminary design with line item approval anticipated for FY 78. MFTF is designed primarily to exploit the experimental and theoretical results derived from 2XIIB. Beyond that, MFTF will develop the technology for the transition from the present small mirror experiments to large steady-state devices such as the mirror FERF/FTR. The sheer magnitude of the plasma volume, magnetic field, neutral beam power, and vacuum pumping capacity, particularly in the case of MFTF, has placed new and exciting demands on engineering technology. An engineering overview of MFTF, TMX, and associated MFE activities at Livermore will be presented

  7. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
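
    As a concrete illustration of the solution-verification step mentioned above, the sketch below estimates the observed order of accuracy and a Richardson-extrapolated value from a quantity computed on three systematically refined grids. It is an illustrative sketch only, not code from GBS or the paper; the function names and numbers are invented.

      import math

      def observed_order(f_coarse, f_medium, f_fine, r=2.0):
          """Observed convergence order p from solutions on three grids refined by factor r."""
          return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

      def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
          """Error-corrected estimate of the exact value from the two finest solutions."""
          return f_fine + (f_fine - f_medium) / (r**p - 1.0)

      # Example: a quantity converging at second order toward 1.0
      p = observed_order(1.04, 1.01, 1.0025)            # -> 2.0
      exact = richardson_extrapolate(1.01, 1.0025, p)   # -> 1.0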

  8. Integrated calibration of a 3D attitude sensor in large-scale metrology

    International Nuclear Information System (INIS)

    Gao, Yang; Lin, Jiarui; Yang, Linghui; Zhu, Jigui; Muelaner, Jody; Keogh, Patrick

    2017-01-01

    A novel calibration method is presented for a multi-sensor fusion system in large-scale metrology, which improves the calibration efficiency and reliability. The attitude sensor is composed of a pinhole prism, a converging lens, an area-array camera and a biaxial inclinometer. A mathematical model is established to determine its 3D attitude relative to a cooperative total station by using two vector observations from the imaging system and the inclinometer. There are two sets of unknown parameters in the measurement model that must be calibrated: the intrinsic parameters of the imaging model, and the transformation matrix between the camera and the inclinometer. An integrated calibration method using a three-axis rotary table and a total station is proposed. A single mounting position of the attitude sensor on the rotary table is sufficient to solve for all parameters of the measurement model. A correction technique for the reference laser beam of the total station is also presented to remove the need for accurate positioning of the sensor on the rotary table. Experimental verification has proved the practicality and accuracy of this calibration method. Results show that the mean deviations of attitude angles using the proposed method are less than 0.01°. (paper)
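
    The abstract determines attitude from two vector observations (imaging system and inclinometer). As a hedged, illustrative sketch only, and not the authors' measurement model, the classical TRIAD construction below recovers a rotation matrix from two such vector pairs; all names and values are assumptions.

      import numpy as np

      def triad(v1_body, v2_body, v1_ref, v2_ref):
          """Rotation matrix R with v_ref ~= R @ v_body, built from two vector pairs."""
          def frame(a, b):
              t1 = a / np.linalg.norm(a)
              t2 = np.cross(a, b); t2 = t2 / np.linalg.norm(t2)
              return np.column_stack((t1, t2, np.cross(t1, t2)))
          return frame(v1_ref, v2_ref) @ frame(v1_body, v2_body).T

      # Example: a known 30-degree rotation about z is recovered from two observations
      c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
      R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
      v1, v2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])
      R_est = triad(v1, v2, R_true @ v1, R_true @ v2)   # recovers R_true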

  9. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  10. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi...

  11. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  12. Collusion-Aware Privacy-Preserving Range Query in Tiered Wireless Sensor Networks†

    Science.gov (United States)

    Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping

    2014-01-01

    Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals. PMID:25615731

  13. Collusion-aware privacy-preserving range query in tiered wireless sensor networks.

    Science.gov (United States)

    Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping

    2014-12-11

    Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals.

  14. Collusion-Aware Privacy-Preserving Range Query in Tiered Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiaoying Zhang

    2014-12-01

    Full Text Available Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals.

  15. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  16. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  17. Image intelligence online consulting: A flexible and remote access to strategic information applied to verification of declaration

    International Nuclear Information System (INIS)

    Chassy, A.F. de; Denizot, L.

    2001-01-01

    Commercial satellite imagery is giving International Institutions' specialized Information Departments access to a great source of valuable intelligence. High resolution and multiple sensors have also led to a growing complexity of interpretation that calls for a greater need for consulting, verification and training in the field in order to make it eligible as an operational source of verification. Responding to this need, Fleximage is extending its Image Intelligence (IMINT) training program to include a fully operational and flexible online consulting and training program. The result is the Image Intelligence (IMINT) Online Program, a new approach to acquiring IMINT expertise, supported by Internet technologies and managed by a professional team of experts and technical staff. Fleximage has developed a virtual learning environment on the Internet for acquiring IMINT expertise. Called the IMINT Online Program, this dynamic learning environment provides complete flexibility and personalization of the process for acquiring expertise. The IMINT Online Program includes two services: Online Consulting and Online Training. The Online Consulting service is designed for the technical staff of an organization who are already operational in the field of image intelligence. Online Consulting enables these staff members to acquire pertinent expertise online that can be directly applied to their professional activity, such as IAEA verification tasks. The Online Training service is designed for the technical staff of an organization who are relatively new to the field of image intelligence. These staff members need to build expertise within a formal training program. Online Training is a flexible and structured program for acquiring IMINT expertise online.

  18. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision I. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files.

  19. New optical sensor systems for high-resolution satellite, airborne and terrestrial imaging systems

    Science.gov (United States)

    Eckardt, Andreas; Börner, Anko; Lehmann, Frank

    2007-10-01

    The department of Optical Information Systems (OS) at the Institute of Robotics and Mechatronics of the German Aerospace Center (DLR) has more than 25 years of experience with high-resolution imaging technology. Changes in detector technology, together with significant improvements in manufacturing accuracy and ongoing engineering research, define the next generation of spaceborne sensor systems focused on Earth observation and remote sensing. The combination of large TDI lines, intelligent synchronization control, fast-readable sensors and new focal-plane concepts opens the door to new remote-sensing instruments. This class of instruments is feasible for high-resolution sensor systems in terms of geometry and radiometry and their data products, such as 3D virtual reality. Systemic approaches are essential for the design of such complex sensor systems for dedicated tasks. System theory applied to the instrument inside a simulated environment is the beginning of the optimization process for the optical, mechanical and electrical designs. Single modules and the entire system have to be calibrated and verified. Suitable procedures must be defined at the component, module and system levels for the assembly, test and verification process. This kind of development strategy allows for hardware-in-the-loop design. The paper gives an overview of the current activities at DLR in the field of innovative sensor systems for photogrammetric and remote sensing purposes.

  20. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  1. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  2. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  3. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project 'Verification of the thermal design of electronic equipment' studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the 'Cool Electronics' research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  4. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability and compatibility. These methods vary both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis it discusses and describes the deductive method and model-checking methods. The pros and cons of each particular method are emphasized, and a classification of test techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or while working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in the code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied to the software when using dynamic analysis methods. Based on this work a conclusion is drawn describing the most relevant problems of the analysis techniques, methods of their solution and ...

  5. Hybrid piezoresistive-optical tactile sensor for simultaneous measurement of tissue stiffness and detection of tissue discontinuity in robot-assisted minimally invasive surgery

    Science.gov (United States)

    Bandari, Naghmeh M.; Ahmadi, Roozbeh; Hooshiar, Amir; Dargahi, Javad; Packirisamy, Muthukumaran

    2017-07-01

    To compensate for the lack of touch during minimally invasive and robotic surgeries, tactile sensors are integrated with surgical instruments. Surgical tools with tactile sensors have been used mainly for distinguishing among different tissues and detecting malignant tissues or tumors. Studies have revealed that malignant tissue is most likely stiffer than normal tissue, which would lead to the formation of a sharp discontinuity in tissue mechanical properties. A hybrid piezoresistive-optical-fiber sensor is proposed and investigated for its capabilities in tissue distinction and detection of a sharp discontinuity. The dynamic interaction of the sensor and tissue is studied using the finite element method. The tissue is modeled as a two-term Mooney-Rivlin hyperelastic material. For experimental verification, the sensor was microfabricated and tested under the same conditions as the simulations. The simulation and experimental results are in fair agreement. The sensor exhibits acceptable linearity, repeatability, and sensitivity in characterizing the stiffness of different tissue phantoms. It is also capable of locating the position of a sharp discontinuity in the tissue. Due to the simplicity of its sensing principle, the proposed hybrid sensor could also be used for industrial applications.
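
    For reference, the two-term Mooney-Rivlin model named in the abstract takes the standard (incompressible) strain-energy form below; the coefficients are left as generic symbols, since the paper's fitted values are not reproduced here:

      W = C_{10} (I_1 - 3) + C_{01} (I_2 - 3)

    where I_1 and I_2 are the first and second invariants of the left Cauchy-Green deformation tensor and C_{10}, C_{01} are material constants fitted to the tissue or phantom being modeled.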

  6. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  7. The Mirror Fusion Test Facility cryogenic system: Performance, management approach, and present equipment status

    International Nuclear Information System (INIS)

    Slack, D.S.; Chronis, W.C.

    1987-01-01

    The cryogenic system for the Mirror Fusion Test Facility (MFTF) is a 14-kW, 4.35-K helium refrigeration system that proved to be highly successful and cost-effective. All operating objectives were met, while remaining within a few percent of initial cost and schedule plans. The management approach used in MFTF allowed decisions to be made quickly and effectively, and it helped keep costs down. Manpower levels, extent and type of industrial participation, key aspects of subcontractor specifications, and subcontractor interactions are reviewed, as well as highlights of the system tests, operation, and present equipment status. Organizations planning large, high-technology systems may benefit from this experience with the MFTF cryogenic system

  8. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION...

  9. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new molecular antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and practice associated with it is known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  10. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  11. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  12. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Science.gov (United States)

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  13. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  14. Wearable motion sensors to continuously measure real-world physical activities.

    Science.gov (United States)

    Dobkin, Bruce H

    2013-12-01

    Rehabilitation for sensorimotor impairments aims to improve daily activities, walking, exercise, and motor skills. Monitoring of practice and measuring outcomes, however, is usually restricted to laboratory-based procedures and self-reports. Mobile health devices may reverse these confounders of daily care and research trials. Wearable, wireless motion sensor data, analyzed by activity pattern-recognition algorithms, can describe the type, quantity, and quality of mobility-related activities in the community. Data transmission from the sensors to a cell phone and the Internet enable continuous monitoring. Remote access to laboratory quality data about walking speed, duration and distance, gait asymmetry and smoothness of movements, as well as cycling, exercise, and skills practice, opens new opportunities to engage patients in progressive, personalized therapies with feedback about the performance. Clinical trial designs will be able to include remote verification of the integrity of complex physical interventions and compliance with practice, as well as capture repeated, ecologically sound, ratio scale outcome measures. Given the progressively falling cost of miniaturized wearable gyroscopes, accelerometers, and other physiologic sensors, as well as inexpensive data transmission, sensing systems may become as ubiquitous as cell phones for healthcare. Neurorehabilitation can develop these mobile health platforms for daily care and clinical trials to improve exercise and fitness, skills learning, and physical functioning.

  15. Acoustic emission detection with fiber optical sensors for dry cask storage health monitoring

    Science.gov (United States)

    Lin, Bin; Bao, Jingjing; Yu, Lingyu; Giurgiutiu, Victor

    2016-04-01

    The increasing number, size, and complexity of nuclear facilities deployed worldwide are increasing the need to maintain readiness and develop innovative sensing materials to monitor important-to-safety structures (ITS). In the past two decades, extensive sensor technology development has taken place for structural health monitoring (SHM). Technologies for the diagnosis and prognosis of a nuclear system, such as a dry cask storage system (DCSS), can improve verification of the health of the structure, which can eventually reduce the likelihood of inadvertent failure of a component. Fiber optical sensors have emerged as one of the major SHM technologies, developed particularly for temperature and strain measurements. This paper presents the development of optical equipment that is suitable for ultrasonic guided wave detection for active SHM in the MHz range. An experimental study using fiber Bragg gratings (FBGs) as acoustic emission (AE) sensors was performed on steel blocks. FBGs have the advantage of being durable, lightweight, and easily embeddable into composite structures, as well as being immune to electromagnetic interference and readily optically multiplexed. The temperature effect on the FBG sensors was also studied. A multi-channel FBG system was developed and compared with a piezoelectric-based AE system. The paper ends with conclusions and suggestions for further work.

  16. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve the verification effort by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  17. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and lower sensitivity to contamination. In this paper, we provide a convenient method of design verification of SOVs to design engineers who otherwise depend on their experience and on experiment during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Secondly, we suggest a verification method for this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor on the design attraction force. Lastly, experimental performance tests using several prototypes manufactured based on this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe that this verification process is novel and useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.
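
    As an illustration of the kind of theoretical relationship such a check can rest on, the sketch below uses the textbook first-order attraction force of a gapped DC solenoid (not the paper's model; core reluctance and fringing are neglected, and all parameter values are invented) to compute a safety factor against a required load.

      import math

      MU_0 = 4.0e-7 * math.pi        # vacuum permeability [H/m]

      def attraction_force(n_turns, current_a, pole_area_m2, air_gap_m):
          """First-order magnetic attraction force [N] of a gapped solenoid actuator."""
          mmf = n_turns * current_a                                  # magnetomotive force [A-turns]
          return MU_0 * pole_area_m2 * mmf**2 / (2.0 * air_gap_m**2)

      required_force = 40.0                                          # spring plus pressure load [N]
      f = attraction_force(n_turns=1200, current_a=0.8,
                           pole_area_m2=1.2e-4, air_gap_m=1.0e-3)    # ~70 N
      safety_factor = f / required_force                             # ~1.7 design margin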

  18. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  19. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  20. The operator interface for the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    Lang, N.C.

    1986-12-01

    The uncertain and most likely changing nature of a large experimental facility like MFTF, as well as its large number of control and monitor points, ruled against the traditional hardware approach involving walls of knobs, dials, oscilloscopes, and strip chart recorders. Rather, from the beginning, project management specified computer control of all systems, and operation of the complete MFTF under an integrated computer control system became a major engineering goal. The Integrated Controls and Diagnostics (ICADS) group was charged with the design and implementation of this control system. We designed a control system with an extremely flexible operator interface which uses computer generated CRT displays for output and pointing devices such as touch sensitive CRT overlays, mice, and joysticks for input. Construction of MFTF was completed at the end of 1985 within the project budget of $241.6M and was followed immediately by a 5 month long acceptance test. During this period (known as PACE test) operators, engineers, and physicists successfully used our computer control system daily to test MFTF. Much of their willingness to forsake the traditional hands-on hardware approach to testing was a result of the powerful and flexible operator interface to the MFTF control system. In this paper, we describe the operator interface with emphasis on the displays, the touch screens, and the mouse. We also report the experiences of users and, in particular, stress those aspects of the user interface they strongly liked and disliked

  1. A verification regime for the spatial discretization of the SN transport equations

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, S.; Azmy, Y. [North Carolina State Univ., Dept. of Nuclear Engineering, 2500 Stinson Drive, Raleigh, NC 27695 (United States)

    2012-07-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure including the order-of-accuracy test on a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified SN code and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type, we record the stage of the verification procedure where the error is detected and report the frequency with which the errors are correctly identified at various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine the reason why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes but rarely (1.44% of the time), and under certain circumstances, can fail. (authors)

  2. Working Group 2: Future Directions for Safeguards and Verification, Technology, Research and Development

    International Nuclear Information System (INIS)

    Zykov, S.; Blair, D.

    2013-01-01

    For traditional safeguards it was recognized that the hardware presently available is, in general, addressing adequately fundamental IAEA needs, and that further developments should therefore focus mainly on improving efficiencies (i.e. increasing cost economies, reliability, maintainability and user-friendliness, keeping abreast of continual advancements in technologies and of the evolution of verification approaches). Specific technology areas that could benefit from further development include: -) Non-destructive measurement systems (NDA), in particular, gamma-spectroscopy and neutron counting techniques; -) Containment and surveillance tools, such as tamper indicating seals, video-surveillance, surface identification methods, etc.; -) Geophysical methods for design information verification (DIV) and safeguarding of geological repositories; and -) New tools and methods for real-time monitoring. Furthermore, the Working Group acknowledged that a 'building block' (or modular) approach should be adopted towards technology development, enabling equipment to be upgraded efficiently as technologies advance. Concerning non-traditional safeguards, in the area of satellite-based sensors, increased spatial resolution and broadened spectral range were identified as priorities. In the area of wide area surveillance, the development of LIDAR-like tools for atmospheric sensing was discussed from the perspective of both potential benefits and certain limitations. Recognizing the limitations imposed by the human brain in terms of information assessment and analysis, technologies are needed that will enable the more effective utilization of all information, regardless of its format and origin. The paper is followed by the slides of the presentation. (A.C.)

  3. Image intelligence online consulting: A flexible and remote access to strategic information applied to verification of declaration

    International Nuclear Information System (INIS)

    Chassy, A.F. de; Denizot, L.

    2001-01-01

    Commercial satellite imagery is giving International Institutions' specialized Information Departments access to a great source of valuable intelligence. High resolution and multiple sensors have also led to a growing complexity of interpretation that calls for a greater need for consulting, verification and training in the field in order to make it eligible as an operational source of verification. Responding to this need, Fleximage is extending its Image Intelligence (IMINT) training program to include a fully operational and flexible online consulting and training program. The result is the Image Intelligence (IMINT) Online Program, a new approach to acquiring IMINT expertise, supported by Internet technologies and managed by a professional team of experts and technical staff. Fleximage has developed a virtual learning environment on the Internet for acquiring IMINT expertise. Called the IMINT Online Program, this dynamic learning environment provides complete flexibility and personalization of the process for acquiring expertise. The IMINT Online Program includes two services: Online Consulting and Online Training. The Online Consulting service is designed for the technical staff of an organization who are already operational in the field of image intelligence. Online Consulting enables these staff members to acquire pertinent expertise online that can be directly applied to their professional activity, such as IAEA verification tasks. The IMINT virtual Consulting and Training services indicated above are made possible thanks to the latest in Internet-based technologies, including multimedia CD-ROM, Internet technologies, rich media content (audio, video and Flash), application sharing, platform maintenance tools, secured connections and authentication, and knowledge database technologies. The IMINT Online Program operates thanks to specialized experts in fields relating to IMINT. These experts carry out the tasks of consultants, coaches, occasional speakers, and course content designers.

  4. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  5. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  6. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics appears as one of the solutions to the problems that arise from password-based data access; for example, passwords can be forgotten, and recalling many different passwords is difficult. With biometrics, physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data or not. Facial biometrics is chosen for its low-cost implementation and its reasonably accurate results for user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce the dimensionality of, and encrypt, the facial test image by representing the image as sparse signals. The encrypted data can be reconstructed using sparse coding algorithms. Two sparse coding algorithms, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are used for comparison in this face verification system research. The reconstructed sparse signals are then compared, via the Euclidean norm, with the sparse signal of the user previously saved in the system to determine the validity of the facial test image. With a non-optimized sensing matrix, the system accuracy obtained in this research is 99% for IRLS (face verification time of 4.917 seconds) and 96.33% for OMP (0.4046 seconds); with the optimized sensing matrix, the accuracy is 99% for IRLS (13.4791 seconds) and 98.33% for OMP (3.1571 seconds).
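
    For orientation, the sketch below gives a minimal Orthogonal Matching Pursuit recovery of a sparse signal from compressive measurements. It is an illustrative sketch only, not the authors' implementation; the sensing matrix, sizes and sparsity level are invented.

      import numpy as np

      def omp(A, y, k):
          """Greedy recovery of a k-sparse coefficient vector x such that y ~= A @ x."""
          n = A.shape[1]
          residual, support, x = y.copy(), [], np.zeros(n)
          for _ in range(k):
              # pick the column most correlated with the current residual
              support.append(int(np.argmax(np.abs(A.T @ residual))))
              # least-squares fit on the selected support, then update the residual
              coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
              residual = y - A[:, support] @ coef
          x[support] = coef
          return x

      rng = np.random.default_rng(0)
      A = rng.standard_normal((50, 200)) / np.sqrt(50)      # random sensing matrix
      x_true = np.zeros(200); x_true[[3, 70, 150]] = [1.0, -2.0, 0.5]
      x_hat = omp(A, A @ x_true, k=3)
      recovery_error = np.linalg.norm(x_hat - x_true)       # ~1e-15 when the support is found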

  7. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  8. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  9. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  10. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  11. Interpolant tree automata and their application in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2016-01-01

    This paper investigates the combination of abstract interpretation over the domain of convex polyhedra with interpolant tree automata, in an abstraction-refinement scheme for Horn clause verification. These techniques have been previously applied separately, but are combined in a new way in this paper. Evaluation on Horn clause verification problems indicates that the combination of interpolant tree automata with abstract interpretation gives some increase in the power of the verification tool, while sometimes incurring a performance overhead.

  12. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  13. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

    Once FPGA (Field-Programmable Gate Array) designers have written Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on the correctness of the HDL design. Nuclear regulation authorities, however, require a more thorough demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality') to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can be assured that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV', which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to demonstrate the efficiency of the proposed formal verification technique and the implemented translator. It ...

  14. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Once FPGA (Field-Programmable Gate Array) designers have written Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on the correctness of the HDL design. Nuclear regulation authorities, however, require a more thorough demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality') to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can be assured that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV', which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to demonstrate the efficiency of the proposed formal verification technique and the implemented translator. It ...

  15. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for sequence diagrams of power plants has been developed. The system's main function is to verify the correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (Interlock Block Diagrams), in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatches between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)
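
    The core of such a check is deciding whether two renderings of the same interlock condition compute the same Boolean function. A minimal sketch, with invented signal names and formulas standing in for the sequence-diagram logic and the IBD logic, is shown below; the real system works symbolically and also localizes the responsible sub-circuit, whereas exhaustive enumeration only scales to small cones of logic.

    from itertools import product

    # Hypothetical interlock condition as realized by a sequence diagram and by
    # the corresponding IBD; signal names and formulas are illustrative only.
    def seq_diagram_logic(start_cmd, perm_a, perm_b, trip):
        return start_cmd and (perm_a and perm_b) and not trip

    def ibd_logic(start_cmd, perm_a, perm_b, trip):
        return start_cmd and perm_a and perm_b and not trip

    def find_mismatches(f, g, n_inputs):
        """Exhaustively compare two Boolean functions; return differing inputs."""
        return [bits for bits in product([False, True], repeat=n_inputs)
                if f(*bits) != g(*bits)]

    if __name__ == "__main__":
        diffs = find_mismatches(seq_diagram_logic, ibd_logic, 4)
        print("equivalent" if not diffs else "mismatching inputs: %r" % diffs)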

  16. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
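
    The customized Cone of Influence reduction mentioned above keeps only the variables that can (transitively) affect the property being checked. A minimal sketch of the idea is given below, over a made-up variable dependency map rather than a real PLC program or the authors' NuSMV models.

    # Minimal cone-of-influence (COI) sketch: starting from the variables that
    # appear in a temporal-logic property, keep only the variables that influence
    # them. The dependency map is an illustrative stand-in for what would be
    # extracted from a PLC program.

    DEPENDS_ON = {
        "alarm":   {"level_high", "pump_on"},
        "pump_on": {"start_cmd", "level_low"},
        "lamp":    {"mode"},           # irrelevant to the property below
        "level_high": set(), "level_low": set(), "start_cmd": set(), "mode": set(),
    }

    def cone_of_influence(property_vars, depends_on):
        keep, frontier = set(), set(property_vars)
        while frontier:
            v = frontier.pop()
            if v not in keep:
                keep.add(v)
                frontier |= depends_on.get(v, set())
        return keep

    if __name__ == "__main__":
        # A property such as AG(level_high -> AF alarm) mentions these variables:
        print(sorted(cone_of_influence({"alarm", "level_high"}, DEPENDS_ON)))
        # 'lamp' and 'mode' are dropped before model checking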

  17. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for an automatic verification of the proposed benchmarks. The verification has shown that literary sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
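
    The "single description" idea, one definition reused to obtain the point value, the gradient and an interval enclosure, can be illustrated compactly in Python (the authors' library is in C++ and its operator coverage is far richer); the example function, the dual-number gradient and the naive interval arithmetic below are assumptions for demonstration only.

    # Minimal sketch: one description of a benchmark function f is reused to get
    # (a) its value, (b) its gradient via dual numbers, and (c) an enclosure over
    # a box via naive interval arithmetic. Operator coverage is tiny (+, *).

    class Dual:                                   # value + first derivative
        def __init__(self, v, d=0.0): self.v, self.d = v, d
        def __add__(self, o): o = to_dual(o); return Dual(self.v + o.v, self.d + o.d)
        __radd__ = __add__
        def __mul__(self, o): o = to_dual(o); return Dual(self.v * o.v, self.v * o.d + self.d * o.v)
        __rmul__ = __mul__

    def to_dual(x): return x if isinstance(x, Dual) else Dual(float(x))

    class Interval:                               # [lo, hi] with naive arithmetic
        def __init__(self, lo, hi): self.lo, self.hi = lo, hi
        def __add__(self, o): o = to_ival(o); return Interval(self.lo + o.lo, self.hi + o.hi)
        __radd__ = __add__
        def __mul__(self, o):
            o = to_ival(o)
            ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
            return Interval(min(ps), max(ps))
        __rmul__ = __mul__
        def __repr__(self): return "[%g, %g]" % (self.lo, self.hi)

    def to_ival(x): return x if isinstance(x, Interval) else Interval(float(x), float(x))

    def f(x, y):                                  # single description of the benchmark
        return 3 * x * x + x * y + 2 * y

    if __name__ == "__main__":
        print("value :", f(1.0, 2.0))
        print("df/dx :", f(Dual(1.0, 1.0), Dual(2.0, 0.0)).d)
        print("df/dy :", f(Dual(1.0, 0.0), Dual(2.0, 1.0)).d)
        print("enclosure over box [0,1]x[0,2]:", f(Interval(0, 1), Interval(0, 2)))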

  18. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautic and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification were described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  19. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    International audience; Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  20. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  1. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems that are often used in applications that have requirements for safety. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice of the existing open-source model verification engines, model verification produces inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  3. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  4. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  5. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50... Four verification games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  6. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geo-located Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning, instrument and product calibration, data quality support, monitoring, and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  7. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun, Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design against the regulations related to a product is necessary. For this, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  8. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design against the regulations related to a product is necessary. For this, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  9. Compromises produced by the dialectic between self-verification and self-enhancement.

    Science.gov (United States)

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: enhancement and verification were established by calibrating evaluative feedback against self-appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  10. Calibration and verification of surface contamination meters --- Procedures and techniques

    International Nuclear Information System (INIS)

    Schuler, C; Butterweck, G.; Wernli, C.; Bochud, F.; Valley, J.-F.

    2007-03-01

    A standardised measurement procedure for surface contamination meters (SCM) is presented. The procedure aims at rendering surface contamination measurements simply and safely interpretable. Essential for the approach is the introduction and common use of the radionuclide-specific quantity 'guideline value', specified in the Swiss Radiation Protection Ordinance, as the unit for the measurement of surface activity. The corresponding radionuclide-specific 'guideline value count rate' can be summarized as a verification reference value for a group of radionuclides ('basis guideline value count rate'). The concept can be generalized for SCM of the same type or for SCM of different types using the same principle of detection. An SCM multi-source calibration technique is applied for the determination of the instrument efficiency. Four different electron radiation energy regions, four different photon radiation energy regions and an alpha radiation energy region are represented by a set of calibration sources built according to ISO standard 8769-2. A guideline value count rate, representing the activity per unit area of a surface contamination of one guideline value, can be calculated for any radionuclide using instrument efficiency, radionuclide decay data, contamination source efficiency, guideline value averaging area (100 cm²), and the radionuclide-specific guideline value. In this way, instrument responses for the evaluation of surface contaminations are obtained for radionuclides without available calibration sources as well as for short-lived radionuclides, for which the continuous replacement of certified calibration sources can lead to unreasonable costs. SCM verification is based on surface emission rates of reference sources with an active area of 100 cm². The verification for a given list of radionuclides is based on the radionuclide-specific quantity guideline value count rate. Guideline value count rates for groups of radionuclides can be represented within the maximum
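
    As an illustration of how the quantities listed above might combine into a guideline value count rate, the sketch below uses one plausible form of the relation with invented numbers; neither the formula layout nor the values are taken from the report.

    # Illustrative sketch only: one plausible way to combine the quantities named
    # in the abstract into a "guideline value count rate" (GVCR). All numbers and
    # the exact factorization are assumptions for demonstration.

    def guideline_value_count_rate(guideline_value_bq_per_cm2,
                                   averaging_area_cm2,
                                   emission_probability,     # particles per decay (decay data)
                                   source_efficiency,        # fraction emitted from the surface
                                   instrument_efficiency):   # counts per emitted particle
        """Count rate (s^-1) expected from a surface contaminated at one guideline value."""
        activity_bq = guideline_value_bq_per_cm2 * averaging_area_cm2
        particle_rate = activity_bq * emission_probability * source_efficiency
        return particle_rate * instrument_efficiency

    if __name__ == "__main__":
        # Hypothetical beta emitter: guideline value 3 Bq/cm2 over the 100 cm2 averaging area
        gvcr = guideline_value_count_rate(3.0, 100.0, 0.9, 0.5, 0.3)
        print("guideline value count rate: %.1f counts per second" % gvcr)  # -> 40.5 cps here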

  11. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently.

  12. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.

  13. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and the additional User-Defined meteorological option for accident dispersion.

  14. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  15. Development of a wireless nonlinear wave modulation spectroscopy (NWMS) sensor node for fatigue crack detection

    Science.gov (United States)

    Liu, Peipei; Yang, Suyoung; Lim, Hyung Jin; Park, Hyung Chul; Ko, In Chang; Sohn, Hoon

    2014-03-01

    Fatigue crack is one of the main culprits for the failure of metallic structures. Recently, it has been shown that nonlinear wave modulation spectroscopy (NWMS) is effective in detecting nonlinear mechanisms produced by fatigue crack. In this study, an active wireless sensor node for fatigue crack detection is developed based on NWMS. Using PZT transducers attached to a target structure, ultrasonic waves at two distinctive frequencies are generated, and their modulation due to fatigue crack formation is detected using another PZT transducer. Furthermore, a reference-free NWMS algorithm is developed so that fatigue crack can be detected without relying on history data of the structure with minimal parameter adjustment by the end users. The algorithm is embedded into FPGA, and the diagnosis is transmitted to a base station using a commercial wireless communication system. The whole design of the sensor node is fulfilled in a low power working strategy. Finally, an experimental verification has been performed using aluminum plate specimens to show the feasibility of the developed active wireless NWMS sensor node.
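
    The signal-processing core of NWMS, two excitation tones whose crack-induced intermodulation shows up as sidebands, can be sketched as below; the sampling rate, tone frequencies, quadratic "crack" model and damage index are assumptions for illustration, not the embedded FPGA implementation from the paper.

    import numpy as np

    # Sketch of the NWMS idea: two tones f1 and f2 are applied; a breathing crack
    # acts roughly like a small multiplicative nonlinearity, creating sidebands at
    # f2 +/- f1. All parameters and the damage index below are assumptions.

    fs, dur = 1_000_000, 0.02                      # 1 MHz sampling, 20 ms record
    t = np.arange(0, dur, 1 / fs)
    f1, f2 = 20_000, 200_000                       # low- and high-frequency tones (Hz)
    excitation = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    def response(x, crack=0.0):
        """Linear path plus a small quadratic term standing in for a breathing crack."""
        return x + crack * x**2

    def sideband_index(signal):
        spec = np.abs(np.fft.rfft(signal * np.hanning(signal.size)))
        freqs = np.fft.rfftfreq(signal.size, 1 / fs)
        def amp(f):                                # spectral amplitude nearest to f
            return spec[np.argmin(np.abs(freqs - f))]
        return (amp(f2 - f1) + amp(f2 + f1)) / amp(f2)

    if __name__ == "__main__":
        print("pristine :", round(sideband_index(response(excitation, crack=0.0)), 4))
        print("cracked  :", round(sideband_index(response(excitation, crack=0.05)), 4))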

  16. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks.

    Science.gov (United States)

    Zhu, Hongfei; Tan, Yu-An; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang

    2018-05-22

    With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives in areas such as e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on the number theorem research unit lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, this scheme does not depend on complex public key infrastructure and can resist quantum computer attack. Then we design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model, and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms the other traditional existing identity-based blind signature schemes in signing speed and verification speed, and outperforms the other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.

  17. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hongfei Zhu

    2018-05-01

    Full Text Available With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people’s lives in areas such as e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on the number theorem research unit lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, this scheme does not depend on complex public key infrastructure and can resist quantum computer attack. Then we design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model, and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms the other traditional existing identity-based blind signature schemes in signing speed and verification speed, and outperforms the other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.

  18. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

    Full Text Available The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for an automatic verification of the proposed benchmarks. The verification has shown that literary sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.

  19. SIVEH: Numerical Computing Simulation of Wireless Energy-Harvesting Sensor Nodes

    Directory of Open Access Journals (Sweden)

    Pedro Yuste

    2013-09-01

    Full Text Available The paper presents a numerical energy-harvesting model for sensor nodes, SIVEH (Simulator I–V for EH), based on I–V hardware tracking. I–V tracking is demonstrated to be more accurate than traditional energy modeling techniques when some of the components present different power dissipation at different operating voltages or drawn currents. SIVEH numerical computing allows fast simulation of long periods of time—days, weeks, months or years—using real solar radiation curves. Moreover, SIVEH modeling has been enhanced with dynamic adjustment of the sleep time rate, while seeking energy-neutral operation. This paper presents the model description, a functional verification and a critical comparison with the classic energy approach.
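
    A toy version of the energy-neutral sleep-rate adjustment idea is sketched below; the power levels, battery size, solar profile and control gain are invented, and SIVEH itself additionally models each component through its I–V curve rather than a fixed power figure.

    import math

    # Toy simulation (not SIVEH): a node adjusts its duty cycle once per hour so
    # that average consumption tracks average harvested energy, seeking
    # energy-neutral operation. All parameter values are invented.

    BATTERY_J, TARGET_J = 2000.0, 1000.0      # capacity and desired stored energy (J)
    P_ACTIVE, P_SLEEP = 0.060, 0.0005         # node power draw in each state (W)

    def harvested_power(hour):
        """Crude bell-shaped solar profile peaking at 80 mW around noon."""
        return max(0.0, 0.080 * math.sin(math.pi * (hour - 6) / 12)) if 6 <= hour <= 18 else 0.0

    def simulate(days=3, gain=0.0005):
        stored, duty = TARGET_J, 0.05          # duty = fraction of time spent active
        for hour in range(24 * days):
            p_in = harvested_power(hour % 24)
            p_out = duty * P_ACTIVE + (1 - duty) * P_SLEEP
            stored = min(BATTERY_J, max(0.0, stored + (p_in - p_out) * 3600))
            duty = min(1.0, max(0.01, duty + gain * (stored - TARGET_J) / TARGET_J))
            if hour % 24 == 23:
                print("day %d: stored %6.0f J, duty cycle %.3f" % (hour // 24 + 1, stored, duty))

    if __name__ == "__main__":
        simulate()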

  20. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
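
    The flavor of such an abstract interpretation can be sketched with a toy straight-line stack machine that tracks the abstract type of every operand-stack slot; the mini instruction set below only loosely resembles JVM bytecode and is an assumption for illustration, not the verifier described above.

    # Toy abstract interpreter: track the abstract type of each operand-stack slot
    # for a straight-line, simplified bytecode sequence. The instruction set is a
    # simplified, illustrative stand-in, not real JVM bytecode.

    EFFECT = {                     # mnemonic -> (types popped, types pushed)
        "iconst": ([], ["int"]),
        "iadd":   (["int", "int"], ["int"]),
        "aload":  ([], ["ref"]),
        "arraylength": (["ref"], ["int"]),
    }

    def verify(code):
        stack = []
        for pc, op in enumerate(code):
            pops, pushes = EFFECT[op]
            for expected in reversed(pops):
                if not stack or stack[-1] != expected:
                    raise TypeError("pc %d (%s): expected %s on stack, found %s"
                                    % (pc, op, expected, stack[-1:] or "empty"))
                stack.pop()
            stack.extend(pushes)
            print("pc %d %-12s stack=%s" % (pc, op, stack))
        return stack

    if __name__ == "__main__":
        verify(["aload", "arraylength", "iconst", "iadd"])   # well-typed
        try:
            verify(["iconst", "arraylength"])                 # int where ref expected
        except TypeError as e:
            print("rejected:", e)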

  1. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost effective environmental data collection process should produce analytical data which meets regulatory and program specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested. The verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consists of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations
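
    The kind of quantitative comparison radiochemical data permit can be sketched as follows: two results, each reported with a 1-sigma uncertainty, are compared through their combined uncertainty and a coverage factor. The values and the k=2 criterion below are illustrative assumptions, not a prescribed validation procedure.

    import math

    # Sketch: decide whether two radiochemical results (value +/- 1-sigma) are
    # statistically consistent. All numbers are invented.

    def agree(value_a, sigma_a, value_b, sigma_b, k=2.0):
        """True if the difference is within k combined standard uncertainties."""
        combined = math.sqrt(sigma_a**2 + sigma_b**2)
        return abs(value_a - value_b) <= k * combined, combined

    if __name__ == "__main__":
        # e.g. lab result vs. verification measurement, in Bq/kg (made-up numbers)
        ok, u_c = agree(12.4, 0.8, 10.9, 0.9)
        print("combined 1-sigma uncertainty: %.2f" % u_c)
        print("consistent at k=2" if ok else "significant difference at k=2")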

  2. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  3. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  4. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand the commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three different types of variation that commonly arise in SMO, namely pattern preparation and selection, availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.
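
    The two verification metrics named above are commonly defined as MEEF, the wafer CD change per unit (wafer-scale) mask CD change, and DOF, the focus range over which CD stays within a tolerance of target. The sketch below uses these textbook-style definitions with invented numbers; an actual SMO flow computes them from full lithographic simulation.

    # Sketch of the DOF and MEEF verification metrics with common definitions and
    # invented data; the tolerance, magnification and CD values are assumptions.

    def meef(cd_wafer_nominal, cd_wafer_perturbed, delta_mask_cd, magnification=4.0):
        """Mask Error Enhancement Factor: wafer CD change per wafer-scale mask CD change."""
        return (cd_wafer_perturbed - cd_wafer_nominal) / (delta_mask_cd / magnification)

    def depth_of_focus(focus_cd_pairs, target_cd, tolerance=0.10):
        """Width of the focus range over which CD stays within +/- tolerance of target."""
        in_spec = [f for f, cd in focus_cd_pairs if abs(cd - target_cd) <= tolerance * target_cd]
        return max(in_spec) - min(in_spec) if in_spec else 0.0

    if __name__ == "__main__":
        # Hypothetical: a 1 nm mask CD bias (mask scale, 4x magnification) moved wafer CD by 0.9 nm
        print("MEEF:", round(meef(45.0, 45.9, 1.0), 2))                 # -> 3.6
        sim = [(-0.10, 40.1), (-0.05, 43.8), (0.0, 45.0), (0.05, 44.1), (0.10, 41.0)]  # (focus um, CD nm)
        print("DOF (um):", round(depth_of_focus(sim, target_cd=45.0), 3))  # -> 0.1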

  5. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  6. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
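
    For the deterministic CME arrival forecasts, skill derived from a 2x2 contingency table can be summarized with the usual categorical scores; the counts below are invented for illustration and are not MOSWOC results.

    # Sketch of 2x2 contingency-table verification for event forecasts such as CME
    # arrivals; counts are invented, scores are the standard POD, FAR and CSI.

    def contingency_scores(hits, misses, false_alarms, correct_negatives):
        pod = hits / (hits + misses)                       # probability of detection
        far = false_alarms / (hits + false_alarms)         # false alarm ratio
        csi = hits / (hits + misses + false_alarms)        # critical success index
        return {"POD": round(pod, 3), "FAR": round(far, 3), "CSI": round(csi, 3)}

    if __name__ == "__main__":
        print(contingency_scores(hits=18, misses=7, false_alarms=5, correct_negatives=120))
        # -> {'POD': 0.72, 'FAR': 0.217, 'CSI': 0.6}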

  7. Mirror fusion test facility

    International Nuclear Information System (INIS)

    Post, R.F.

    1978-01-01

    The MFTF is a large new mirror facility under construction at Livermore for completion in 1981-82. It represents a scaleup, by a factor of 50 in plasma volume, a factor of 5 or more in ion energy, and a factor of 4 in magnetic field intensity, over the Livermore 2XIIB experiment. Its magnet, employing superconducting NbTi windings, is of Yin-Yang form and will weigh 200 tons. MFTF will be driven by neutral beams at two levels of current and energy: 1000 amperes of 20 keV (accelerating potential) pulsed beams for plasma startup; 750 amperes of 80 keV beams of 0.5 second duration for temperature buildup and plasma sustainment. Two operating modes for MFTF are envisaged: The first is operation as a conventional mirror cell with nτ ≈ 10¹² cm⁻³·s, W_i = 50 keV, where the emphasis will be on studying the physics of mirror cells, particularly the issues of improved techniques of stabilization against ion cyclotron modes and of maximization of the electron temperature. The second possible mode is the further study of the Field Reversed Mirror idea, using high-current neutral beams to sustain the field-reversed state. Anticipating success in the coming Livermore Tandem Mirror Experiment (TMX), MFTF has been oriented so that it could comprise one end cell of a scaled-up TM experiment. Also, if MFTF were to succeed in achieving an FR state it could serve as an essentially full-sized physics prototype of one cell of a FRM fusion power plant

  8. The role of the United Nations in the field of verification

    International Nuclear Information System (INIS)

    1991-01-01

    By resolution 43/81 B of 7 December 1988, the General Assembly requested the Secretary General to undertake, with the assistance of a group of qualified governmental experts, an in-depth study of the role of the United Nations in the field of verification. In August 1990, the Secretary-General transmitted to the General Assembly the unanimously approved report of the experts. The report is structured in six chapters and contains a bibliographic appendix on technical aspects of verification. The Introduction provides a brief historical background on the development of the question of verification in the United Nations context, culminating with the adoption by the General Assembly of resolution 43/81 B, which requested the study. Chapters II and III address the definition and functions of verification and the various approaches, methods, procedures and techniques used in the process of verification. Chapters IV and V examine the existing activities of the United Nations in the field of verification, possibilities for improvements in those activities as well as possible additional activities, while addressing the organizational, technical, legal, operational and financial implications of each of the possibilities discussed. Chapter VI presents the conclusions and recommendations of the Group

  9. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  10. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
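
    A back-of-envelope sketch of the module-distribution trade-off analyzed above is given below: each verification step is assigned either to the card or to the reader, and the execution time is estimated from instruction counts and processor speeds. All step costs and MIPS figures are invented.

    # Sketch of the card/reader partitioning trade-off; instruction counts (in
    # millions of instructions) and processor speeds are invented assumptions.

    STEP_MIPS_COST = {
        "image_enhancement": 150.0,
        "feature_extraction": 300.0,
        "matching": 30.0,
        "crypto_protect_template": 10.0,   # encrypt/MAC data exchanged with the reader
    }
    CARD_MIPS, READER_MIPS = 25.0, 2000.0

    def scenario_time(steps_on_card):
        t = 0.0
        for step, cost in STEP_MIPS_COST.items():
            t += cost / (CARD_MIPS if step in steps_on_card else READER_MIPS)
        return t

    if __name__ == "__main__":
        scenarios = {
            "store-on-card (all work on reader)": set(),
            "match-on-card": {"matching", "crypto_protect_template"},
            "all-on-card": set(STEP_MIPS_COST),
        }
        for name, on_card in scenarios.items():
            print("%-36s %.2f s" % (name, scenario_time(on_card)))

    With these invented numbers, the match-on-card split keeps the template and the matching step on the card at a cost of roughly 1.8 s, while doing everything on the card is far slower; the paper performs this comparison with measured instruction counts and adds the cryptographic cost of protecting any fingerprint data sent off the card.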

  11. Operation of the cryogenic system for the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    Chronis, W.C.; Slack, D.S.

    1987-01-01

    The cryogenic system for the Mirror Fusion Test Facility (MFTF) at Lawrence Livermore National Laboratory (LLNL) was designed to cool the entire MFTF-B system from ambient to operating temperature in less than 10 days. The system was successfully operated in the recent plant and capital equipment (PACE) acceptance tests, and results from these tests helped us correct problem areas and improve the system

  12. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    45 Public Welfare 4 (2010-10-01) ... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  13. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    Science.gov (United States)

    1996-09-01

    Computer Science. A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199. ... implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  14. Development of a thinned back-illuminated CMOS active pixel sensor for extreme ultraviolet spectroscopy and imaging in space science

    International Nuclear Information System (INIS)

    Waltham, N.R.; Prydderch, M.; Mapson-Menard, H.; Pool, P.; Harris, A.

    2007-01-01

    We describe our programme to develop a large-format, science-grade, monolithic CMOS active pixel sensor for future space science missions, and in particular an extreme ultraviolet (EUV) spectrograph for solar physics studies on ESA's Solar Orbiter. Our route to EUV sensitivity relies on adapting the back-thinning and rear-illumination techniques first developed for CCD sensors. Our first large-format sensor consists of 4kx3k 5 μm pixels fabricated on a 0.25 μm CMOS imager process. Wafer samples of these sensors have been thinned by e2v technologies with the aim of obtaining good sensitivity at EUV wavelengths. We present results from both front- and back-illuminated versions of this sensor. We also present our plans to develop a new sensor of 2kx2k 10 μm pixels, which will be fabricated on a 0.35 μm CMOS process. In progress towards this goal, we have designed a test-structure consisting of six arrays of 512x512 10 μm pixels. Each of the arrays has been given a different pixel design to allow verification of our models, and our progress towards optimizing a design for minimal system readout noise and maximum dynamic range. These sensors will also be back-thinned for characterization at EUV wavelengths

  15. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density has been used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  16. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  17. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  18. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  19. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine if protocols were followed and that the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed as are aspects for data management and analyses methods required for the large amount of data collected during these surveys. Recommendations were made for implementation of future surveys and reporting the data from those surveys in order to ensure that remediation was complete. (authors)

  20. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident

  1. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  2. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame
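
    The dose-comparison step described above can be sketched with synthetic dose grids, comparing the mean dose in the target volume and the mean and near-maximum dose D2 in the non-target volume receiving at least 10 cGy; the synthetic grids and the 3% alert threshold are assumptions, and the 3D EPID dose reconstruction itself (the computationally hard part) is not shown.

    import numpy as np

    # Sketch of the dose-comparison step only; grids, masks and the alert
    # threshold are invented for illustration.

    def d2(dose_values):
        """Near-maximum dose: the dose received by the hottest 2% of the volume."""
        return float(np.percentile(dose_values, 98))

    def verify(planned, reconstructed, target_mask, alert_fraction=0.03):
        nontarget = (~target_mask) & (planned >= 0.10)          # >= 10 cGy = 0.10 Gy
        checks = {
            "target mean":     (planned[target_mask].mean(), reconstructed[target_mask].mean()),
            "non-target mean": (planned[nontarget].mean(),   reconstructed[nontarget].mean()),
            "non-target D2":   (d2(planned[nontarget]),       d2(reconstructed[nontarget])),
        }
        for name, (plan, recon) in checks.items():
            dev = abs(recon - plan) / plan
            print("%-15s plan %5.2f Gy  recon %5.2f Gy  dev %4.1f%% %s"
                  % (name, plan, recon, 100 * dev, "ALERT" if dev > alert_fraction else "ok"))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        shape = (40, 40, 40)
        target_mask = np.zeros(shape, dtype=bool); target_mask[15:25, 15:25, 15:25] = True
        planned = np.where(target_mask, 2.0, 0.3) + rng.normal(0, 0.01, shape)
        reconstructed = planned * 1.02                          # simulate a 2% overdosage
        verify(planned, reconstructed, target_mask)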

  3. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  4. Online 3D EPID-based dose verification: Proof of concept.

    Science.gov (United States)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took

  5. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2000-01-01

    Full Text Available The methodology of system requirements verification presented in this paper is a practical procedure for reducing some deficiencies in the specification of requirements. The main problem considered is how to create a complete description of the system requirements free of such deficiencies. Verification of the initially defined requirements is based on coloured Petri nets, which are useful for testing properties of system requirements such as completeness, consistency and optimality. An example of a lift controller is presented.
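
    To give a flavour of the kind of state-space check such a requirements model supports, the toy sketch below explores the reachable markings of a small place/transition net (ordinary rather than coloured) for a hypothetical lift-controller fragment and checks one safety property; the net is invented for illustration and is not taken from the paper.

```python
# Toy reachability exploration of a 1-safe place/transition net, standing in
# for the coloured-Petri-net analysis of requirements described above.
from collections import deque

# Transitions: name -> (places consumed, places produced). Hypothetical model.
TRANSITIONS = {
    "close_doors": ({"doors_open"}, {"doors_closed"}),
    "open_doors":  ({"doors_closed", "idle"}, {"doors_open", "idle"}),
    "start_move":  ({"doors_closed", "idle"}, {"doors_closed", "moving"}),
    "stop":        ({"moving"}, {"idle"}),
}

def reachable_markings(initial):
    seen, frontier = {frozenset(initial)}, deque([frozenset(initial)])
    while frontier:
        marking = frontier.popleft()
        for pre, post in TRANSITIONS.values():
            if pre <= marking:                        # transition enabled
                nxt = frozenset((marking - pre) | post)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

if __name__ == "__main__":
    markings = reachable_markings({"doors_open", "idle"})
    # Safety (consistency) check: the cab must never move while the doors are open.
    bad = [sorted(m) for m in markings if {"moving", "doors_open"} <= m]
    print("unsafe markings:", bad or "none")
```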

  6. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks of their own and try to solve them, aiming at improving the semantics tree of the program being transformed. This means that some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers how the finite-countermodel method for safety verification might be used in Turchin's supercompilation method. We identify a number of supercompilation sub-algorithms that attempt to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of these problems.

  7. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation.

  8. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation. Refs, figs, tabs.

  9. Learning a Genetic Measure for Kinship Verification Using Facial Images

    Directory of Open Access Journals (Sweden)

    Lu Kou

    2015-01-01

    Full Text Available Motivated by the key observation that children generally resemble their parents more than other persons with respect to facial appearance, distance metric (similarity) learning has been the dominant choice for state-of-the-art kinship verification via facial images in the wild. Most existing learning-based approaches to kinship verification, however, are focused on learning a genetic similarity measure in a batch learning manner, limiting scalability for practical applications with an ever-growing amount of data. To address this, we propose a new kinship verification approach that learns a sparse similarity measure in an online fashion. Experimental results on kinship datasets show that our approach is highly competitive with state-of-the-art alternatives in terms of verification accuracy, yet superior in terms of scalability for practical applications.
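
    A minimal sketch of online similarity learning for pair verification, in the spirit of the approach above, is given below using a hinge-style update of a bilinear similarity; the feature dimension, margin, learning rate, and synthetic data are assumptions, and the sparsity-inducing step of the published method is omitted.

```python
# Online bilinear similarity learning for kin / non-kin pair verification.
# All parameters and the toy data stream are illustrative assumptions.
import numpy as np

class OnlineSimilarity:
    """Bilinear similarity s(x, y) = x^T W y, updated one labelled pair at a time."""

    def __init__(self, dim, lr=0.1, margin=1.0):
        self.W = np.eye(dim)
        self.lr, self.margin = lr, margin

    def score(self, x, y):
        return float(x @ self.W @ y)

    def update(self, x, y, is_kin):
        # Hinge-style step: push kin pairs above +margin and non-kin pairs below -margin.
        target = 1.0 if is_kin else -1.0
        if self.margin - target * self.score(x, y) > 0.0:
            self.W += self.lr * target * np.outer(x, y)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    model = OnlineSimilarity(dim=8)
    for _ in range(200):                                  # toy stream of labelled feature pairs
        parent = rng.normal(size=8)
        child = parent + 0.1 * rng.normal(size=8)         # kin pair: similar features
        stranger = rng.normal(size=8)                     # non-kin pair
        model.update(parent, child, is_kin=True)
        model.update(parent, stranger, is_kin=False)
    print("kin score:", model.score(parent, child))
    print("non-kin score:", model.score(parent, stranger))
```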

  10. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...
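
    To illustrate the first of the two SAT-based encodings mentioned above, the toy sketch below encodes a small model with an OCL-like multiplicity constraint ("every A object is linked to exactly one B object") as propositional clauses over Boolean link variables and checks satisfiability by brute-force enumeration, standing in for a real SAT solver; the model, bounds, and clause shapes are hypothetical and not taken from the USE tool.

```python
# Propositional encoding of a tiny bounded UML/OCL-style consistency check.
from itertools import combinations, product

A_OBJECTS, B_OBJECTS = ["a1", "a2"], ["b1", "b2"]
VARS = [(a, b) for a in A_OBJECTS for b in B_OBJECTS]     # Boolean link(a, b) variables

def clauses_exactly_one_b():
    """Clauses stating that every A object is linked to exactly one B object."""
    cls = []
    for a in A_OBJECTS:
        cls.append([((a, b), True) for b in B_OBJECTS])           # at least one link
        for bi, bj in combinations(B_OBJECTS, 2):                 # at most one link
            cls.append([((a, bi), False), ((a, bj), False)])
    return cls

def satisfiable(clauses):
    """Brute-force SAT check: return a satisfying assignment or None."""
    for bits in product([False, True], repeat=len(VARS)):
        assign = dict(zip(VARS, bits))
        if all(any(assign[v] == wanted for v, wanted in clause) for clause in clauses):
            return assign
    return None

if __name__ == "__main__":
    model = satisfiable(clauses_exactly_one_b())
    links = sorted(v for v, b in model.items() if b) if model else None
    print("consistent, e.g. links:" if model else "inconsistent (UNSAT)", links or "")
```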

  11. Engineering a static verification tool for GPU kernels

    OpenAIRE

    Bardsley, E; Betts, A; Chong, N; Collingbourne, P; Deligiannis, P; Donaldson, AF; Ketema, J; Liew, D; Qadeer, S

    2014-01-01

    We report on practical experiences over the last 2.5 years related to the engineering of GPUVerify, a static verification tool for OpenCL and CUDA GPU kernels, plotting the progress of GPUVerify from a prototype to a fully functional and relatively efficient analysis tool. Our hope is that this experience report will serve the verification community by helping to inform future tooling efforts. © 2014 Springer International Publishing.

  12. Viral vectors for gene modification of plants as chem/bio sensors.

    Energy Technology Data Exchange (ETDEWEB)

    Manginell, Monica; Harper, Jason C.; Arango, Dulce C.; Brozik, Susan Marie; Dolan, Patricia L.

    2006-11-01

    Chemical or biological sensors that are specific, sensitive, and robust, allowing intelligence gathering for verification of nuclear non-proliferation treaty compliance and deterring production of weapons of mass destruction, are sorely needed. Although much progress has been made in the area of biosensors, improvements in sensor lifetime, robustness, and device packaging are required before these devices become widely used. Current chemical and biological detection and identification techniques require less-than-covert sample collection followed by transport to a laboratory for analysis. In addition to being expensive and time consuming, such analyses can yield inconclusive results due to compromised sample integrity during collection and transport. We report here a demonstration of a plant-based sensor technology that utilizes mature and seedling plants as chemical sensors. One can envision genetically modifying native plants at a site of interest so that they report the presence of specific toxins or chemicals. In this one-year project we used a previously developed inducible expression system to show the feasibility of plant sensors. The vector was designed as a safe, non-infectious vector that could be used to invade, replicate, and introduce foreign genes into mature host plants, allowing the plant to sense chem/bio agents. The genes introduced through the vector included a reporter gene encoding green fluorescent protein (GFP) and a gene encoding a mammalian receptor that recognizes a chemical agent. Specifically, GFP was induced by the presence of 17-β-estradiol (estrogen). Detection of fluorescence indicated the presence of the target chemical agent. Since the sensor is a plant, costly device packaging development and sensor manufacturing were not required. Additionally, the biological recognition and reporting elements are maintained in a living, natural environment and therefore do not suffer from lifetime disadvantages typical of most biosensing

  13. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses. Addendum

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; and the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the Additional Protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; a panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguards system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; and the IAEA and the future of nuclear verification and security.

  14. Formal verification of reactor process control software using assertion checking environment

    International Nuclear Information System (INIS)

    Sharma, Babita; Balaji, Sowmya; John, Ajith K.; Bhattacharjee, A.K.; Dhodapkar, S.D.

    2005-01-01

    Assertion Checking Environment (ACE) was developed in-house for carrying out formal (rigorous/mathematical) functional verification of embedded software written in MISRA C. MISRA C is an industrially sponsored safe subset of the C programming language and is well accepted in the automotive and aerospace industries. ACE uses a static assertion checking technique for verification of MISRA C programs. First, functional specifications of the program are derived in the form of pre- and post-conditions for each C function. These pre- and post-conditions are then introduced as assertions (formal comments) in the program code. The annotated C code is then formally verified using ACE. In this paper we present our experience of using ACE for the formal verification of process control software of a nuclear reactor. The Software Requirements Document (SRD) contained textual specifications of the process control software. The SRD was used by the designers to draw logic diagrams, which were given as input to a code generator. The verification of the generated C code was done at two levels: (i) verification against specifications derived from the logic diagrams, and (ii) verification against specifications derived from the SRD. In this work we checked approximately 600 functional specifications of software comprising roughly 15,000 lines of code. (author)
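
    The annotation style described above, pre- and post-conditions attached to each function, can be illustrated with the sketch below, which uses runtime assertions in Python rather than ACE's static checking of MISRA C annotations; the interlock function and its conditions are hypothetical and are not drawn from the reactor software discussed in the paper.

```python
# Pre-/post-condition annotation illustrated with runtime assertions.
# The rod-withdrawal interlock and its limits are hypothetical.
def rod_withdrawal_permitted(power_fraction: float, period_s: float) -> bool:
    # Pre-conditions: inputs must lie in their physically meaningful ranges.
    assert 0.0 <= power_fraction <= 1.2, "power_fraction out of range"
    assert period_s != 0.0, "reactor period must be non-zero"

    # Specification: withdrawal is blocked above 90% power or on a short
    # positive period (< 20 s).
    permitted = power_fraction <= 0.90 and not (0.0 < period_s < 20.0)

    # Post-condition: withdrawal is never permitted above the power limit.
    assert not (permitted and power_fraction > 0.90)
    return permitted

if __name__ == "__main__":
    print(rod_withdrawal_permitted(0.50, 100.0))   # True: low power, long period
    print(rod_withdrawal_permitted(0.95, 100.0))   # False: power interlock trips
```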

  15. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

    This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs and system-on-chip (SoC) buses. Readers will also benefit from the author's practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. The book discusses the entire life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection; provides an in-depth introduction to Verilog from both implementation and verification points of view; demonstrates how to use IP in applications such as memory controllers and SoC buses; and describes a new ver...

  16. A Structural Scale for the Factors of Waste Sensors and Transducers Recycling Based on Consumer Satisfaction

    Directory of Open Access Journals (Sweden)

    Ming Ke

    2014-01-01

    Full Text Available This article first reviews the research results of both domestic and foreign scholars on the factors of waste sensor and transducer recycling and, in consideration of the four main bodies involved in such recycling, extracts 14 influencing indicators of waste sensor and transducer recycling. The paper then designs a questionnaire according to the 15 indicators of waste home appliance recycling and puts it on a research website. After verifying the reliability and validity of the questionnaire, the paper analyzes the influencing factors of waste sensor and transducer recycling using SPSS 13.0. Finally, the factor analysis method is used to identify the representative factors. Two factors are obtained. Factor 1 mainly represents laws and regulations of government, governmental subsidy, governmental technology support, governmental market guidance, governmental monitoring and control, recycling knowledge publicity by government, social responsibilities of producers and recyclers, the technical disposition ability of producers and recyclers, and recyclers' service; it can therefore be summarized as government and enterprise disposition capability. Factor 2 mainly represents consumers' benefit from recycling, convenience of consumers' recycling, mental satisfaction of consumers from recycling, consumers' recycling knowledge, and the social recycling environment; it can therefore be summarized as the consumer incentive factor. This paper provides some references for the analysis and research of the influencing factors of waste sensor and transducer recycling.
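
    A minimal sketch of this style of questionnaire factor analysis, using scikit-learn in place of SPSS 13.0, is shown below; the synthetic response matrix, the item groupings, and the two-factor structure are illustrative assumptions rather than the paper's survey data.

```python
# Two-factor analysis of a synthetic 14-item questionnaire.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)
n_respondents, n_items = 300, 14

# Synthetic Likert-style responses driven by two latent factors
# (government/enterprise capability and consumer incentive).
latent = rng.normal(size=(n_respondents, 2))
loadings = np.zeros((2, n_items))
loadings[0, :9] = rng.uniform(0.6, 0.9, 9)    # items 1-9 load on factor 1
loadings[1, 9:] = rng.uniform(0.6, 0.9, 5)    # items 10-14 load on factor 2
responses = latent @ loadings + 0.5 * rng.normal(size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=2, rotation="varimax")   # rotation requires scikit-learn >= 0.24
fa.fit(responses)
for i, row in enumerate(fa.components_, start=1):
    top_items = (np.argsort(-np.abs(row))[:3] + 1).tolist()
    print(f"factor {i}: strongest items {top_items}")
```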

  17. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    International Nuclear Information System (INIS)

    Betsill, J.D.; Gruebel, R.D.

    1995-09-01

    The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a systems approach, reinforcing the need for compatible components that together provide user-friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search covered present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums.

  18. Impact of sensor-scene interaction on the design of an IR security surveillance system

    International Nuclear Information System (INIS)

    Claassen, J.P.; Phipps, G.S.

    1982-01-01

    Recent encouraging developments in infrared staring arrays with CCD readouts and in real-time image processors working on and off the focal plane have suggested that technologies suitable for infrared security surveillance may be available in a two-to-five year time frame. In anticipation of these emerging technologies, an investigation has been undertaken to establish the design potential of a passive IR perimeter security system incorporating both detection and verification capabilities. To establish the design potential, it is necessary to characterize the interactions between the scene and the sensor. To this end, theoretical and experimental findings were employed to document (1) the emission properties of scenes, including an intruder, (2) the propagation and emission characteristics of the intervening atmosphere, and (3) the reception properties of the imaging sensor. The impact of these findings is summarized in light of the application constraints. Optimal wavelengths, intruder and background emission characteristics, weather limitations, and basic sensor design considerations are treated. Although many system design features have been identified to date, continued efforts are required to complete a detailed system design, including identification of the processing requirements. A program to accomplish these objectives is presented.
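
    The scene-emission side of this trade-off can be made concrete with Planck's law: the spectral radiance contrast between a warm intruder and a cooler background across the 8 to 14 micrometre long-wave infrared window. The sketch below computes that contrast for assumed temperatures of 310 K and 290 K; atmospheric transmission and the sensor's spectral response are deliberately omitted.

```python
# Planck spectral radiance contrast of a ~310 K intruder against a ~290 K
# background in the 8-14 um band. Temperatures are illustrative assumptions.
import numpy as np

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def spectral_radiance(wavelength_m, temp_k):
    """Planck spectral radiance, W / (m^2 * sr * m)."""
    return (2.0 * H * C**2 / wavelength_m**5) / np.expm1(H * C / (wavelength_m * KB * temp_k))

wavelengths = np.linspace(8e-6, 14e-6, 7)
intruder = spectral_radiance(wavelengths, 310.0)
background = spectral_radiance(wavelengths, 290.0)
for wl, li, lb in zip(wavelengths, intruder, background):
    print(f"{wl * 1e6:5.1f} um   radiance contrast {(li - lb) / lb * 100:5.1f} %")
```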

  19. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    International Nuclear Information System (INIS)

    Luke, S.J.

    2011-01-01

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., the UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing possible sensitive or proprietary information contained in the genetic materials being declared under such a regime. This study indicates that a verification regime could be constructed using a small number of pathogens that span the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogues of attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the

  20. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    Energy Technology Data Exchange (ETDEWEB)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., the UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing possible sensitive or proprietary information contained in the genetic materials being declared under such a regime. This study indicates that a verification regime could be constructed using a small number of pathogens that span the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogues of attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the