WorldWideScience

Sample records for mftf sensor verification

  1. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained of their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of individual sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system
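
    As a rough illustration of the kind of baseline check described above, the sketch below compares measured sensor resistances against stored installation values and flags drift. The sensor names, tolerance, and record layout are illustrative assumptions, not details of the MFTF software.

```python
# Hypothetical sketch of a baseline-resistance health check.
# Sensor names, the 5% tolerance, and the record layout are assumptions,
# not taken from the MFTF sensor verification program itself.

BASELINE_OHMS = {"TC-001": 108.7, "SG-014": 350.2, "LHE-03": 56.4}
TOLERANCE = 0.05  # flag sensors that drift more than 5% from the install value

def check_sensor_health(readings: dict[str, float]) -> list[str]:
    """Return the names of sensors whose resistance has drifted out of band."""
    suspect = []
    for name, baseline in BASELINE_OHMS.items():
        measured = readings.get(name)
        if measured is None:
            suspect.append(f"{name}: no reading")
        elif abs(measured - baseline) / baseline > TOLERANCE:
            suspect.append(f"{name}: {measured:.1f} ohm vs baseline {baseline:.1f} ohm")
    return suspect

print(check_sensor_health({"TC-001": 109.1, "SG-014": 401.0}))
```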

  2. MFTF exception handling system

    International Nuclear Information System (INIS)

    Nowell, D.M.; Bridgeman, G.D.

    1979-01-01

    In the design of large experimental control systems, a major concern is ensuring that operators are quickly alerted to emergency or other exceptional conditions and that they are provided with sufficient information to respond adequately. This paper describes how the MFTF exception handling system satisfies these requirements. Conceptually, exceptions are divided into two classes: those which affect command status by producing an abort or suspend condition, and those which fall into a softer notification category of report-only or required operator acknowledgement. Additionally, an operator may choose to accept an exception condition as operational, or turn off monitoring for sensors determined to be malfunctioning. Control panels and displays used in operator response to exceptions are described

  3. Manufacturing the MFTF magnet

    International Nuclear Information System (INIS)

    Dalder, E.N.C.; Hinkle, R.E.; Hodges, A.J.

    1980-01-01

    The Mirror Fusion Test Facility (MFTF) is a large mirror program experiment for magnetic fusion energy. It will combine and extend the near-classical plasma confinement achieved in 2XIIB with advanced neutral-beam and magnet technologies. The product of ion density and confinement time will be improved by more than an order of magnitude, while the superconducting magnet weight will be extrapolated from 15 tons in Baseball II to 375 tons in MFTF. Recent reactor studies show that the MFTF will traverse much of the distance in magnet technology towards the reactor regime

  4. MFTF TOTAL benchmark

    International Nuclear Information System (INIS)

    Choy, J.H.

    1979-06-01

    A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) development and implementation of programs to simulate MFTF usage of the data base

  5. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm
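
    As a loose illustration of fusing coarse biometric features into one decision, the sketch below combines per-feature match scores with a weighted log-odds sum. The feature set, weights, and threshold are invented for illustration; the cited work uses its own minimal-probability-of-error formulation.

```python
# Illustrative score-level fusion of coarse biometric features.
# Scores, weights, and the accept threshold are assumptions for illustration,
# not the decision algorithm of the cited system.
import math

def fuse_scores(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Combine per-feature match scores (0..1) into one fused log-odds score."""
    fused = 0.0
    for feature, s in scores.items():
        s = min(max(s, 1e-6), 1 - 1e-6)          # avoid log(0)
        fused += weights.get(feature, 1.0) * math.log(s / (1 - s))
    return fused

scores = {"hand": 0.82, "face": 0.74, "ear": 0.66, "voice": 0.91}
weights = {"hand": 1.0, "face": 1.2, "ear": 0.8, "voice": 1.1}
print("accept" if fuse_scores(scores, weights) > 0.0 else "reject")
```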

  6. MFTF magnet cryostability

    International Nuclear Information System (INIS)

    VanSant, J.H.

    1979-01-01

    A pair of large superconducting magnets will be installed in the Mirror Fusion Test Facility (MFTF), which is to begin operation in 1981. To ensure a stable superconducting state for the niobium-titanium (Nb-Ti) conductor, special consideration has been given to certain aspects of the magnet system design. These include the conductor, joints, coil assembly, vapor plenums, liquid-helium (LHe) supply system, and current leads. Heat transfer is the main consideration; i.e., the helium quality and temperature are limited so that the superconductor will perform satisfactorily in the magnet environment

  7. MFTF-progress and promise

    International Nuclear Information System (INIS)

    Thomassen, K.I.

    1980-01-01

    The Mirror Fusion Test Facility (MFTF) has been in construction at Lawrence Livermore National Laboratory (LLNL) for 3 years, and most of the major subsystems are nearing completion. Recently, the scope of this project was expanded to meet new objectives, principally to reach plasma conditions corresponding to energy break-even. To fulfill this promise, the single-cell minimum-B mirror configuration will be replaced with a tandem mirror configuration (MFTF-B). The facility must accordingly be expanded to accommodate the new geometry. This paper briefly discusses the status of the major MFTF subsystems and describes how most of the technological objectives of MFTF will be demonstrated before we install the additional systems necessary to make the tandem. It also summarizes the major features of the expanded facility

  8. MFTF-B plasma-diagnostic system

    International Nuclear Information System (INIS)

    Throop, A.L.; Goerz, D.A.; Thomas, S.R.

    1981-01-01

    This paper describes the current design status of the plasma diagnostic system for MFTF-B. In this paper we describe the system requirement changes which have occurred as a result of the funded rescoping of the original MFTF facility into MFTF-B. We outline the diagnostic instruments which are currently planned, and present an overview of the diagnostic system

  9. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains at the software simulation stage, and very few WVSN verification platforms are available for use. This situation seriously restricts the transition from theoretical research on WVSNs to practical application. It is therefore necessary to study the construction of a WVSN verification platform. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, and selects AODV as the routing protocol, to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully perform image acquisition, coding and wireless transmission and can obtain the effective distance parameters between nodes, which lays a good foundation for the follow-up application of WVSNs.

  10. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  11. Changing MFTF vacuum environment

    International Nuclear Information System (INIS)

    Margolies, D.; Valby, L.

    1982-12-01

    The Mirror Fusion Test Facility (MFTF) vacuum vessel will be about 60 m long and 10 m in diameter at the widest point. The allowable operating densities range from 2 × 10⁹ to 5 × 10¹⁰ particles per cc. The maximum leak rate of 10⁻⁶ tl/sec is dominated during operation by the deliberately injected cold gas of 250 tl/sec. This gas is pumped by over 1000 square meters of cryopanels, external sorption pumps and getters. The design and requirements have changed radically over the past several years, and they are still not in final form. The vacuum system design has also changed, but more slowly and less radically. This paper discusses the engineering effort necessary to meet these stringent and changing requirements. Much of the analysis of the internal systems has been carried out using a 3-D Monte Carlo computer code, which can estimate time dependent operational pressures. This code and its use will also be described
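
    For orientation only, and not a calculation from the paper, the steady-state balance P = Q/S links the quoted gas throughput and density limit to the effective pumping speed the cryopanels must supply. The room-temperature conversion used below is an assumption, since the injected gas is actually cold, so the result is only an order-of-magnitude bound.

```python
# Back-of-envelope pressure balance P = Q / S for the quoted MFTF numbers.
# The ideal-gas conversion assumes T = 300 K, which is only an assumption
# for illustration; the injected deuterium gas is cold.
K_B = 1.380649e-23            # J/K
T = 300.0                     # K, assumed for the unit conversion
n_max = 5e10 * 1e6            # particles/m^3 (5e10 per cc, from the abstract)
p_max_pa = n_max * K_B * T    # Pa
p_max_torr = p_max_pa / 133.322

Q = 250.0                     # torr·L/s of deliberately injected gas (abstract)
S_required = Q / p_max_torr   # effective pumping speed, L/s
print(f"P_max ≈ {p_max_torr:.2e} torr, required S ≈ {S_required:.1e} L/s")
# ~1e8 L/s, which hints at why more than 1000 m^2 of cryopanels are needed.
```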

  12. Changing MFTF vacuum environment

    International Nuclear Information System (INIS)

    Margolies, D.; Valby, L.

    1982-01-01

    The Mirror Fusion Test Facility (MFTF) vacuum vessel will be about 60 m long and 10 m in diameter at the widest point. The allowable operating densities range from 2 × 10⁹ to 5 × 10¹⁰ particles per cc. The maximum leak rate of 10⁻⁶ tl/sec is dominated during operation by the deliberately injected cold gas of 250 tl/sec. This gas is pumped by over 1000 square meters of cryopanels, external sorption pumps and getters. The design and requirements have changed radically over the past several years, and they are still not in final form. The vacuum system design has also changed, but more slowly and less radically. This paper discusses the engineering effort necessary to meet these stringent and changing requirements. Much of the analysis of the internal systems has been carried out using a 3-D Monte Carlo computer code, which can estimate time dependent operational pressures. This code and its use will also be described

  13. Mirror Fusion Test Facility (MFTF)

    International Nuclear Information System (INIS)

    Thomassen, K.I.

    1978-01-01

    A large, new Mirror Fusion Test Facility is under construction at LLL. Begun in FY78, it will be completed at the end of FY81 at a cost of $94.2M. This facility gives the mirror program the flexibility to explore mirror confinement principles at a significant scale and advances the technology of large reactor-like devices. The role of MFTF in the LLL program is described here

  14. Thermal performance of the MFTF magnets

    International Nuclear Information System (INIS)

    VanSant, J.H.

    1983-01-01

    A yin-yang pair of liquid-helium (LHe) cooled superconducting magnets was tested last year at the Lawrence Livermore National Laboratory (LLNL) as part of a series of tests with the Mirror Fusion Test Facility (MFTF). These tests were performed to determine the success of engineering design used in major systems of the MFTF and to provide a technical base for rescoping from a single-mirror facility to the large tandem-mirror configuration (MFTF-B) now under construction. The magnets were cooled, operated at their design current and magnetic field, and warmed to atmospheric temperature. In this report, we describe their thermal behavior during these tests

  15. Assessment of stability characteristics of MFTF coils

    International Nuclear Information System (INIS)

    1979-03-01

    Certain aspects of the MFTF (Mirror Fusion Test Facility) conductor performance were investigated. Recovery analysis of the MFTF conductor was studied using GA's stability code. The maximum length of uncooled, unsoldered composite core which can recover from a thermal excursion was determined analytically. A maximum credible mechanical disturbance in terms of energy deposition, conductor motion and length, and time duration, was postulated. 5 references, 4 figures

  16. Design of the drift pumping system for MFTF-α+T

    International Nuclear Information System (INIS)

    Metlzer, D.H.

    1983-01-01

    Drift pumping in mirrors is a new concept (less than one year old). If it works, compared to charge-exchange pumping, it will simplify the MFTF-α+T interface and possibly reduce the circulating power required. From an engineering standpoint, it has some very demanding requirements in terms of power and bandwidth. This paper describes a design which satisfies these requirements. It also identifies a number of promising alternatives requiring investigation and verification

  17. Testing of the MFTF magnets

    International Nuclear Information System (INIS)

    Kozman, T.A.; Chang, Y.; Dalder, E.N.C.

    1982-01-01

    This paper describes the cooldown and testing of the first yin-yang magnet for the Mirror Fusion Test Facility. The introduction describes the superconducting magnet; the rest of the paper explains the tests prior to and including magnet cooldown and final acceptance testing. The MFTF (originally MX) was proposed in 1976 and the project was funded for construction start in October 1977. Construction of the first large superconducting magnet set was completed in May 1981 and testing started shortly thereafter. The acceptance test procedures were reviewed in May 1981 and the cooldown and final acceptance test were done by the end of February 1982. During this acceptance testing the magnet achieved its full design current and field

  18. MFTF-α+T progress report

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.D. (ed.)

    1985-04-01

    Early in FY 1983, several upgrades of the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory (LLNL) were proposed to the fusion community. The one most favorably received was designated MFTF-α+T. The engineering design of this device, guided by LLNL, has been a principal activity of the Fusion Engineering Design Center during FY 1983. This interim progress report represents a snapshot of the device design, which was begun in FY 1983 and will continue for several years. The report is organized as a complete design description. Because it is an interim report, some parts are incomplete; they will be supplied as the design study proceeds. As described in this report, MFTF-α+T uses existing facilities, many MFTF-B components, and a number of innovations to improve on the physics parameters of MFTF-B. It burns deuterium-tritium and has a central-cell Q of 2, a wall loading Γₙ of 2 MW/m² (with a central-cell insert module), and an availability of 10%. The machine is fully shielded, allows hands-on maintenance of components outside the vacuum vessel 24 h after shutdown, and has provisions for repair of all operating components.

  19. Overview of the MFTF electrical systems

    International Nuclear Information System (INIS)

    Lindquist, W.B.; Eckard, R.D.; Holdsworth, T.; Mooney, L.J.; Moyer, D.R.; Peterson, R.L.; Shimer, D.W.; Wyman, R.H.; VanNess, H.W.

    1979-01-01

    The Mirror Fusion Test Facility, scheduled for completion in October 1981, will contain a complex, state-of-the-art array of electrical and electronics equipment valued at over $60M. Three injector systems will be employed to initiate and sustain the MFTF deuterium plasma. A plasma streaming system and a startup neutral-beam system will be used to establish a target plasma. A sustaining neutral beam system will be used to fuel and sustain the MFTF plasma for 0.5 s. Additional power supply systems required on MFTF include two magnet power supplies with quench protection circuitry for powering the superconducting yin-yang magnet pair and eight 10-kHz power supplies for powering the Ti gettering system. Due to the complexity, physical size, and multiple systems of MFTF, a distributed, hierarchical computer control and instrumentation system will be used. Color-graphic, touch-panel control consoles will provide the man-machine interface. The MFTF will have the capability of conducting an experiment every five minutes

  20. MFTF-α + T progress report

    International Nuclear Information System (INIS)

    Nelson, W.D.

    1985-04-01

    Early in FY 1983, several upgrades of the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory (LLNL) were proposed to the fusion community. The one most favorably received was designated MFTF-α+T. The engineering design of this device, guided by LLNL, has been a principal activity of the Fusion Engineering Design Center during FY 1983. This interim progress report represents a snapshot of the device design, which was begun in FY 1983 and will continue for several years. The report is organized as a complete design description. Because it is an interim report, some parts are incomplete; they will be supplied as the design study proceeds. As described in this report, MFTF-α+T uses existing facilities, many MFTF-B components, and a number of innovations to improve on the physics parameters of MFTF-B. It burns deuterium-tritium and has a central-cell Q of 2, a wall loading Γₙ of 2 MW/m² (with a central-cell insert module), and an availability of 10%. The machine is fully shielded, allows hands-on maintenance of components outside the vacuum vessel 24 h after shutdown, and has provisions for repair of all operating components

  1. Design of the electromagnetic fluctuations diagnostic for MFTF-B

    International Nuclear Information System (INIS)

    House, P.A.; Goerz, D.A.; Martin, R.

    1983-01-01

    The Electromagnetic Fluctuations (EMF) diagnostic will be used to monitor ion fluctuations which could be unstable in MFTF-B. Each probe assembly includes a high impedance electrostatic probe to measure potential fluctuations, and a group of nested, single turn loops to measure magnetic fluctuations in three directions. Eventually, more probes and loops will be added to each probe assembly for making more detailed measurements. The sensors must lie physically close to the plasma edge and are radially positionable. Also, probes at separate axial locations can be positioned to connect along the same magnetic field line. These probes are similar in concept to the rf probes used on TMX, but the high thermal load for 30-second shots on MFTF-B requires a water-cooled design along with temperature monitors. Each signal channel has a bandwidth of 0.001 to 150 MHz and is monitored by up to four different data channels which obtain amplitude and frequency information. This paper describes the EMF diagnostic and presents the detailed mechanical and electrical designs

  2. MFTF-α + T shield design

    International Nuclear Information System (INIS)

    Gohar, Y.

    1985-01-01

    MFTF-α+T is a DT upgrade option of the Tandem Mirror Fusion Test Facility (MFTF-B) to study better plasma performance, and test tritium breeding blankets in an actual fusion reactor environment. The central cell insert, designated DT axicell, has a 2-MW/m² neutron wall loading at the first wall for blanket testing. This upgrade is completely shielded to protect the reactor components, the workers, and the general public from the radiation environment during operation and after shutdown. The shield design for this upgrade is the subject of this paper including the design criteria and the tradeoff studies to reduce the shield cost

  3. Evaluating and tuning system response in the MFTF-B control and diagnostics computers

    International Nuclear Information System (INIS)

    Palasek, R.L.; Butner, D.N.; Minor, E.G.

    1983-01-01

    The software system running on the Supervisory Control and Diagnostics System (SCDS) of MFTF-B is, for the major part, an event driven one. Regular, periodic polling of sensors' outputs takes place only at the local level, in the sensors' corresponding local control microcomputers (LCC's). An LCC reports a sensor's value to the supervisory computer only if there was a significant change. This report is passed as a message, routed among and acted upon by a network of applications and systems tasks within the supervisory computer (SCDS). Commands from the operator's console are similarly routed through a network of tasks, but in the opposite direction to the experiment's hardware. In a network such as this, response time is partially determined by system traffic. Because the hardware of MFTF-B will not be connected to the computer system for another two years, we are using the local control computers to simulate the event driven traffic that we expect to see during MFTF-B operation. In this paper we show how we are using the simulator to measure and evaluate response, loading, throughput, and utilization of components within the computer system. Measurement of the system under simulation allows us to identify bottlenecks and verify that they have been removed. We also use the traffic simulators to evaluate prototypes of different algorithms for selected tasks, comparing their responses under the spectrum of traffic intensities
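
    As a toy illustration of the kind of measurement described above, the sketch below drives a single message-handling task with randomly arriving "sensor change" events and records how mean response time grows with traffic intensity. The arrival rates and service time are invented; this is not the SCDS simulator itself.

```python
# Toy event-driven traffic experiment: Poisson message arrivals into a single
# server with a fixed processing cost, reporting mean response time per load.
# Rates and service time are illustrative assumptions, not MFTF-B figures.
import random
import statistics

def simulate(arrival_rate: float, service_time: float, n_events: int = 20000) -> float:
    """Return the mean response time of a simple single-server message queue."""
    clock = 0.0
    server_free_at = 0.0
    responses = []
    for _ in range(n_events):
        clock += random.expovariate(arrival_rate)   # next message arrives
        start = max(clock, server_free_at)          # wait if the task is busy
        server_free_at = start + service_time       # fixed processing cost
        responses.append(server_free_at - clock)    # queueing delay + service
    return statistics.mean(responses)

for rate in (10, 50, 90):                           # messages per second
    print(rate, round(simulate(rate, service_time=0.01), 4))
```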

  4. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    AFRL-RV-PS-TR-2018-0008: Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Author: Norman Fitz-Coy. Contract FA9453-15-1-0315.

  5. MFTF test coil construction and performance

    International Nuclear Information System (INIS)

    Cornish, D.N.; Zbasnik, J.P.; Leber, R.L.; Hirzel, D.G.; Johnston, J.E.; Rosdahl, A.R.

    1978-01-01

    A solenoid coil, 105 cm inside diameter by 167 cm outside diameter, has been constructed and tested to study the performance of the stabilized Nb-Ti conductor to be used in the Mirror Fusion Test Facility (MFTF) being built at Lawrence Livermore Laboratory. The insulation system of the test coil is identical to that envisioned for MFTF. Cold-weld joints were made in the conductor at the start and finish of each layer; heaters were fitted to some of these joints and also to the conductor at various locations in the winding. This paper gives details of the construction of the coil and the results of the tests carried out to determine its propagation and recovery characteristics

  6. Design of the MFTF external vacuum system

    International Nuclear Information System (INIS)

    Holl, P.M.

    1979-01-01

    As a result of major experiment success in the LLL mirror program on start-up and stabilization of plasmas in minimum-B magnetic geometry, a Mirror Fusion Test Facility (MFTF) is under construction. Completion is scheduled for September, 1981. MFTF will be used to bridge the gap between present day small mirror experiments and future fusion-reactor activity based on magnetic mirrors. The focal point of the Mirror Fusion Test Facility is the 35 foot diameter by 60 foot long vacuum vessel which encloses the superconducting magnets. High vacuum conditions in the vessel are required to establish and maintain a plasma, and to create and deliver energetic neutral atoms to heat the plasma at the central region

  7. Data base management system for the MFTF

    International Nuclear Information System (INIS)

    Choy, J.H.; Wade, J.A.

    1979-01-01

    The data base management system (DBMS) for the Mirror Fusion Test Facility (MFTF) is described as relational in nature and distributed across the nine computers of the supervisory control and diagnostics system. This paper deals with a reentrant runtime package of routines that are used to access data items, the data structures to support the runtime package, and some of the utilities in support of the DBMS

  8. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    With the development of biometric verification, we proposed a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application and overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and on our circuit system; they confirm that the proposed scheme provides excellent accuracy and low complexity. Moreover, we also proposed a multiple-state solution to handle the heart rate changes that occur during sports. It should be the first work to address the issue of sports in ECG verification.
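
    The sketch below is a hedged, template-matching style interpretation of ECG verification, loosely inspired by the mean-interval idea described above; the paper's actual algorithm may differ. Beats are assumed to be pre-segmented, equal-length arrays, and the correlation threshold is an invented parameter.

```python
# Hedged sketch of template-based ECG verification: enroll by averaging beats,
# verify by correlating new beats with the stored template.  Not the cited
# mean-interval algorithm; segmentation and threshold are assumptions.
import numpy as np

def enroll(beats: np.ndarray) -> np.ndarray:
    """Average the enrolled beats into a single mean template."""
    return beats.mean(axis=0)

def verify(template: np.ndarray, beats: np.ndarray, threshold: float = 0.9) -> bool:
    """Accept if the mean correlation of new beats with the template is high."""
    corrs = [np.corrcoef(template, b)[0, 1] for b in beats]
    return float(np.mean(corrs)) >= threshold

rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, 2 * np.pi, 200))          # stand-in for a beat shape
enrolled = base + 0.05 * rng.standard_normal((8, 200))
probe = base + 0.05 * rng.standard_normal((4, 200))
print(verify(enroll(enrolled), probe))
```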

  9. The Linearity of Optical Tomography: Sensor Model and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Siti Zarina MOHD. MUJI

    2011-09-01

    The aim of this paper is to show the linearity of the optical sensor response. Linearity of the sensor response is essential in optical tomography applications because it affects the tomogram result. Two types of testing are used: one based on a voltage parameter and one based on a time-unit parameter. In the former, the voltage is measured when an obstacle with a diameter between 0.5 and 3 mm is placed between the transmitter and receiver. The latter uses the same setup, but with a larger obstacle, a 59.24 mm ball, and measures the time the ball takes to cross the sensing area of the circuit. Both results show a linear relation, indicating that the optical sensors are suitable for process tomography applications.
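
    The linearity check described above amounts to fitting a straight line to (obstacle size, sensor voltage) pairs and inspecting the fit quality. The sketch below does exactly that; the sample numbers are invented, not the paper's measurements.

```python
# Illustrative linearity check: least-squares line fit plus R^2.
# The diameter/voltage values below are invented placeholders.
import numpy as np

diameter_mm = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
voltage_v   = np.array([4.61, 4.23, 3.86, 3.47, 3.10, 2.71])

slope, intercept = np.polyfit(diameter_mm, voltage_v, 1)
pred = slope * diameter_mm + intercept
r2 = 1 - np.sum((voltage_v - pred) ** 2) / np.sum((voltage_v - voltage_v.mean()) ** 2)
print(f"V ≈ {slope:.3f}·d + {intercept:.3f},  R² = {r2:.4f}")
```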

  10. Supervisory control software for MFTF neutral beams

    International Nuclear Information System (INIS)

    Woodruff, J.P.

    1981-01-01

    We describe the software structures that control the operation of MFTF Sustaining Neutral Beam Power Supplies (SNBPS). These components of the Supervisory Control and Diagnostics System (SCDS) comprise ten distinct tasks that exist in the SCDS system environment. The codes total about 16,000 lines of commented Pascal code and occupy 240 kbytes of memory. The controls have been running since March 1981, and at this writing are being integrated with the Local Control System and with the power supply Pulse Power Module Controller

  11. Central cell confinement in MFTF-B

    International Nuclear Information System (INIS)

    Jong, R.A.

    1981-01-01

    The point code TANDEM has been used to survey the range of plasma parameters which can be attained in MFTF-B. The code solves for the electron and ion densities and temperatures in the central cell, yin-yang, barrier, and A-cell regions, as well as the plasma potential in each region. In these studies, the A-cell sloshing-ion beams were fixed while the neutral beams in the yin-yang and central cell, the gas feed in the central cell, and the applied ECRH power were varied; the resulting β, central-cell ion density and temperature, and confining potential are discussed

  12. Man-machine interface for the MFTF

    International Nuclear Information System (INIS)

    Speckert, G.C.

    1979-01-01

    In any complex system, the interesting problems occur at the interface of dissimilar subsystems. Control of the Mirror Fusion Test Facility (MFTF) begins with the US Congress, which controls the dollars, which control the people, who control the nine top-level minicomputers, which control the 65 microprocessors, which control the hardware that controls the physics experiment. There are many interesting boundaries across which control must pass, and the one that this paper addresses is the man-machine one. For the MFTF, the man-machine interface consists of a system of seven control consoles, each allowing one operator to communicate with one minicomputer. These consoles are arranged in a hierarchical manner, and both hardware and software were designed in a top-down fashion. This paper describes the requirements and the design of the console system as a whole, as well as the design and operation of the hardware and software of each console, and examines the possible form of a future man-machine interface

  13. Man-machine interface for the MFTF

    Energy Technology Data Exchange (ETDEWEB)

    Speckert, G.C.

    1979-11-09

    In any complex system, the interesting problems occur at the interface of dissimilar subsystems. Control of the Mirror Fusion Test Facility (MFTF) begins with the US Congress, which controls the dollars, which control the people, who control the nine top-level minicomputers, which control the 65 microprocessors, which control the hardware that controls the physics experiment. There are many interesting boundaries across which control must pass, and the one that this paper addresses is the man-machine one. For the MFTF, the man-machine interface consists of a system of seven control consoles, each allowing one operator to communicate with one minicomputer. These consoles are arranged in a hierarchical manner, and both hardware and software were designed in a top-down fashion. This paper describes the requirements and the design of the console system as a whole, as well as the design and operation of the hardware and software of each console, and examines the possible form of a future man-machine interface.

  14. Magnetic shielding tests for MFTF-B neutral beamlines

    International Nuclear Information System (INIS)

    Kerns, J.; Fabyan, J.; Wood, R.; Koger, P.

    1983-01-01

    A test program to determine the effectiveness of various magnetic shielding designs for MFTF-B beamlines was established at Lawrence Livermore National Laboratory (LLNL). The proposed one-tenth-scale shielding-design models were tested in a uniform field produced by a Helmholtz coil pair. A similar technique was used for the MFTF source-injector assemblies, and the model test results were confirmed during the Technology Demonstration in 1982. The results of these tests on shielding designs for MFTF-B had an impact on the beamline design for MFTF-B. The iron-core magnet and finger assembly originally proposed were replaced by a simple, air-core, race-track-coil, bending magnet. Only the source injector needs to be magnetically shielded from the fields of approximately 400 gauss

  15. Flood simulation and verification with IoT sensors

    Science.gov (United States)

    Chang, Che-Hao; Hsu, Chih-Tsung; Wu, Shiang-Jen; Huang, Sue-Wei

    2017-04-01

    2D dynamic flood simulation is a vivid tool for demonstrating the area likely to be exposed to the impact of high water levels. With progress in high-resolution digital terrain models, the simulation results look quite convincing, yet they have not been proven to be close to what really happened. Because of the dynamic and uncertain nature of floods, the exposed area usually cannot be well defined during a flood event. Recent developments in IoT sensors bring low-power, long-distance communication that helps us collect real-time flood depths. With these time series of flood depths at different locations, we are able to verify the simulation results corresponding to the flood event. Sixteen flood gauges with IoT specifications, together with two flood events in the Annan district of Tainan City, Taiwan, are examined in this study. During the event of 11 June 2016, 12 flood gauges worked well and 8 of them provided observations that matched the simulation.
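
    The verification step described above boils down to comparing gauge-observed flood depths with simulated depths at the same locations. The sketch below, with invented numbers and an assumed 10 cm agreement criterion, shows one simple way to score that comparison.

```python
# Sketch (invented numbers) of comparing IoT flood-gauge depths against
# simulated depths: RMSE plus a count of gauges within an assumed tolerance.
import math

observed_cm  = {"gauge_03": 18.0, "gauge_07": 42.5, "gauge_11": 0.0}
simulated_cm = {"gauge_03": 15.5, "gauge_07": 47.0, "gauge_11": 6.0}

def rmse(obs: dict, sim: dict) -> float:
    errs = [(sim[k] - obs[k]) ** 2 for k in obs]
    return math.sqrt(sum(errs) / len(errs))

matches = sum(abs(simulated_cm[k] - v) <= 10.0 for k, v in observed_cm.items())
print(f"RMSE = {rmse(observed_cm, simulated_cm):.1f} cm, "
      f"{matches}/{len(observed_cm)} gauges within 10 cm of simulation")
```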

  16. A spheromak ignition experiment reusing Mirror Fusion Test Facility (MFTF) equipment

    International Nuclear Information System (INIS)

    Fowler, T.K.

    1993-01-01

    Based on available experimental results and theory, a scenario is presented to achieve ohmic ignition in a spheromak by slow (∼ 10 sec.) helicity injection using power from the Mirror Fusion Test Facility (MFTF) substation. Some of the other parts needed (vacuum vessel, coils, power supplies, pumps, shielded building space) might also be obtained from MFTF or other salvage, as well as some components needed for intermediate experiments for additional verification of the concept (especially confinement scaling). The proposed ignition experiment would serve as proof-of-principle for the spheromak DT fusion reactor design published by Hagenson and Krakowski, with a nuclear island cost about ten times less than a tokamak of comparable power. Designs at even higher power density and lower cost might be possible using Christofilos' concept of a liquid lithium blanket. Since all structures would be protected from neutrons by the lithium blanket and the tritium inventory can be reduced by continuous removal from the liquid blanket, environmental and safety characteristics appear to be favorable

  17. Start-up neutral-beam power supply system for MFTF

    International Nuclear Information System (INIS)

    Mooney, L.J.

    1979-01-01

    This paper describes some of the design features and considerations of the MFTF start-up neutral-beam power supplies. In particular, we emphasize features of the system that will ensure MFTF compatibility and achieve the required reliability/availability for the MFTF to be successful

  18. Computer language evaluation for MFTF SCDS

    International Nuclear Information System (INIS)

    Anderson, R.E.; McGoldrick, P.R.; Wyman, R.H.

    1979-01-01

    The computer languages available for the systems and application implementation on the Supervisory Control and Diagnostics System (SCDS) for the Mirror Fusion Test Facility (MFTF) were surveyed and evaluated. Four language processors, CAL (Common Assembly Language), Extended FORTRAN, CORAL 66, and Sequential Pascal (SPASCAL, a subset of Concurrent Pascal [CPASCAL]) are commercially available for the Interdata 7/32 and 8/32 computers that constitute the SCDS. Of these, the Sequential Pascal available from Kansas State University appears best for the job in terms of minimizing the implementation time, debugging time, and maintenance time. This improvement in programming productivity is due to the availability of a high-level, block-structured language that includes many compile-time and run-time checks to detect errors. In addition, the advanced data types in the language allow easy description of the program variables. 1 table

  19. Seismic analysis of the MFTF facility

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Tiong, L.W.; Mraz, M.J.

    1985-01-01

    Seismic analyses were performed on the Mirror Fusion Test Facility (MFTF-B) located at the Lawrence Livermore National Laboratory, Livermore, CA. The three major structures studied were the vacuum vessel, the concrete shielding vault, and the steel frame enclosure building. The analyses performed on these structures ranged from fixed-base response spectrum analyses to soil-structure interaction analyses including the effects of structure-to-structure interaction and foundation flexibility. The results of these studies showed that the presence of the vault significantly affects the response of the vessel; that modeling the flexibility of the vault footing is important when studying forces near the base of the wall; and that the vault had very little effect on the building response. (orig.)

  20. MFTF plasma diagnostics data acquisition system

    International Nuclear Information System (INIS)

    Davis, G.E.; Coffield, F.E.

    1979-01-01

    The initial goal of the Data Acquisition System (DAS) is to control 11 instruments chosen as the startup diagnostic set and to collect, process, and display the data that these instruments produce. These instruments are described in a paper by Stan Thomas et al. entitled "MFTF Plasma Diagnostics System." The DAS must be modular and flexible enough to allow upgrades in the quantity of data taken by an instrument, and also to allow new instruments to be added to the system. This is particularly necessary to support a research project where needs and requirements may change rapidly as a result of experimental findings. Typically, the startup configuration of the diagnostic instruments will contain only a fraction of the planned detectors, and produce approximately one half the data that the expanded version is designed to generate. Expansion of the system will occur in fiscal year 1982

  1. Thermal control for the MFTF magnet

    International Nuclear Information System (INIS)

    Vansant, J.H.; Russ, R.M.

    1980-01-01

    The external dimensions of the Yin-Yang magnet of the Mirror Fusion Test Facility will be 7.8 by 8.5 by 8.5 m, and it will weigh approximately 300 tons. More than 8000 liters of circulating liquid helium will be required to maintain the nearly 50 km of superconductor at below 5.0 K while the latter carries almost 6000 A in a magnetic field of up to nearly 7.7 T. This paper describes several features of the thermal control plans for the Yin-Yang: (1) the proposed cooldown and warmup schedules for the MFTF and the procedure for regenerating external cooling surfaces; (2) the design of an external quench resistor based on an estimate of the superconductor's maximum temperature; and (3) the use of a computer model of liquid helium circulation in choosing pipe size for the liquid helium lines

  2. Display-management system for MFTF

    International Nuclear Information System (INIS)

    Nelson, D.O.

    1981-01-01

    The Mirror Fusion Test Facility (MFTF) is controlled by 65 local control microcomputers which are supervised by a local network of nine 32-bit minicomputers. Associated with seven of the nine computers are state-of-the-art graphics devices, each with extensive local processing capability. These devices provide the means for an operator to interact with the control software running on the minicomputers. It is critical that the information the operator views accurately reflects the current state of the experiment. This information is integrated into dynamically changing pictures called displays. The primary organizational component of the display system is the software-addressable segment. The segments created by the display creation software are managed by display managers associated with each graphics device. Each display manager uses sophisticated storage management mechanisms to keep the proper segments resident in the local graphics device storage

  3. MFTF supervisory control and diagnostics system hardware

    International Nuclear Information System (INIS)

    Butner, D.N.

    1979-01-01

    The Supervisory Control and Diagnostics System (SCDS) for the Mirror Fusion Test Facility (MFTF) is a multiprocessor minicomputer system designed so that for most single-point failures, the hardware may be quickly reconfigured to provide continued operation of the experiment. The system is made up of nine Perkin-Elmer computers - a mixture of 8/32's and 7/32's. Each computer has ports on a shared memory system consisting of two independent shared memory modules. Each processor can signal other processors through hardware external to the shared memory. The system communicates with the Local Control and Instrumentation System, which consists of approximately 65 microprocessors. Each of the six system processors has facilities for communicating with a group of microprocessors; the groups consist of from four to 24 microprocessors. There are hardware switches so that if an SCDS processor communicating with a group of microprocessors fails, another SCDS processor takes over the communication

  4. Low-level-signal data acquisition for the MFTF superconducting-magnet system

    International Nuclear Information System (INIS)

    Montoya, C.R.

    1981-01-01

    Acquisition of low level signals from sensors mounted on the superconducting yin-yang magnet in the Mirror Fusion Test Facility (MFTF) imposes very strict requirements on the magnet signal conditioning and data acquisition system. Of the various types of sensors required, thermocouples, strain gages, and voltage taps produce very low level outputs. These low level outputs must be accurately measured in the harsh environment of slowly varying magnetic fields, cryogenic temperatures, high vacuum, pulse power and 60 Hz electrical noise, possible neutron radiation, and high common mode voltage resulting from superconducting magnet quench. Successful measurements require careful attention to grounding, shielding, signal handling and processing in the data acquisition system. The magnet instrumentation system provides a means of effectively measuring both low level signals and high level signals from all types of sensors

  5. Low level signal data acquisition for the MFTF-B superconducting magnet system

    International Nuclear Information System (INIS)

    Montoya, C.R.

    1984-01-01

    Acquisition of low level signals from sensors mounted on the superconducting magnets in the Tandem Mirror Fusion Test Facility (MFTF-B) imposes very strict requirements on the magnet signal conditioning and data acquisition system. Of the various types of sensors required, thermocouples and strain gages produce very low level outputs. These low level outputs must be accurately measured in the harsh environment of slowly varying magnetic fields, cryogenic temperatures, high vacuum, 80 kV pulse power, 60 Hz, 17 MHz and 28, 35, and 56 GHz electrical noise, and possible neutron radiation. Successful measurements require careful attention to grounding, shielding, signal handling and processing in the data acquisition system. The magnet instrumentation system provides a means of effectively measuring both low level signals and high level signals from all types of sensors. Various methods involved in the design and implementation of the system for signal conditioning and data gathering will be presented

  6. Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor.

    Science.gov (United States)

    Zin, Hafiz M; Harris, Emma J; Osmond, John P F; Allinson, Nigel M; Evans, Philip M

    2013-05-21

    This work investigates the feasibility of using a prototype complementary metal oxide semiconductor active pixel sensor (CMOS APS) for real-time verification of volumetric modulated arc therapy (VMAT) treatment. The prototype CMOS APS used region-of-interest readout on the chip to allow fast imaging of up to 403.6 frames per second (f/s). The sensor was made larger (5.4 cm × 5.4 cm) using recent advances in photolithographic technique but retains fast imaging speed through the sensor's regional readout. There is a paradigm shift in radiotherapy treatment verification with the advent of advanced treatment techniques such as VMAT. This work has demonstrated that the APS can track multi-leaf collimator (MLC) leaves moving at 18 mm s⁻¹ with an automatic edge tracking algorithm at an accuracy better than 1.0 mm even at the fastest imaging speed. Evaluation of the measured fluence distribution for an example VMAT delivery sampled at 50.4 f/s was shown to agree well with the planned fluence distribution, with an average gamma pass rate of 96% at 3%/3 mm. The MLC leaf motion and linac pulse rate variation delivered throughout the VMAT treatment can also be measured. The results demonstrate the potential of CMOS APS technology as a real-time radiotherapy dosimeter for delivery of complex treatments such as VMAT.
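
    To illustrate the pass-rate metric quoted above, the sketch below performs a simplified one-dimensional global gamma evaluation at 3%/3 mm on a toy profile. The published analysis is two-dimensional and uses the authors' own tools, so this is only an illustration of the metric, not their implementation.

```python
# Simplified 1-D global gamma evaluation (3% dose, 3 mm distance-to-agreement).
# The profiles below are invented; they stand in for measured/planned fluence.
import numpy as np

def gamma_pass_rate(measured, planned, positions, dose_tol=0.03, dist_tol_mm=3.0):
    """Fraction of measured points with gamma <= 1 against the planned profile."""
    d_norm = dose_tol * planned.max()           # global dose normalisation
    passed = 0
    for x_m, d_m in zip(positions, measured):
        dist = (positions - x_m) / dist_tol_mm
        dose = (planned - d_m) / d_norm
        gamma = np.sqrt(dist ** 2 + dose ** 2).min()
        passed += gamma <= 1.0
    return passed / len(measured)

x = np.linspace(-50, 50, 201)                   # mm
planned = np.exp(-(x / 20.0) ** 2)              # toy fluence profile
measured = planned * 1.02 + 0.005               # small systematic offset
print(f"pass rate = {gamma_pass_rate(measured, planned, x):.1%}")
```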

  7. Fusion blanket testing in MFTF-α + T

    International Nuclear Information System (INIS)

    Kleefeldt, K.

    1985-01-01

    The Mirror Fusion Test Facility-α + T (MFTF-α + T) is an upgraded version of the current MFTF-B test facility at Lawrence Livermore National Laboratory, and is designed for near-term fusion-technology-integrated tests at a neutron flux of 2 MW/m². Currently, the fusion community is screening blanket and related issues to determine which ones can be addressed using MFTF-α + T. In this work, the minimum testing needs to address these issues are identified for the liquid-metal-cooled blanket and the solid-breeder blanket. Based on the testing needs and on the MFTF-α + T capability, a test plan is proposed for three options; each option covers a six to seven year testing phase. The options reflect the unresolved question of whether to place the research and development (R and D) emphasis on liquid-metal or solid-breeder blankets. In each case, most of the issues discussed can be addressed to a reasonable extent in MFTF-α+T

  8. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  9. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.
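
    As a hedged sketch of the virtual-distance idea described above, sphere centres measured in one platform position can be mapped through the platform's rotation to its other indexed positions, and distances between the mapped points then serve as virtual gauges. The transform and coordinates below are placeholders, not the IMP's actual mathematical model.

```python
# Hedged sketch of generating virtual reference distances from one measured
# ball-bar position.  The pure z-axis rotation and the coordinates are
# assumptions for illustration; the real IMP model is more elaborate.
import numpy as np

def rot_z(deg: float) -> np.ndarray:
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

ball_a = np.array([150.0, 20.0, 300.0])   # measured sphere centres, mm (invented)
ball_b = np.array([450.0, 25.0, 305.0])

virtual_points = [rot_z(k * 60.0) @ ball_a for k in range(6)]   # six indexed positions
virtual_distances = [np.linalg.norm(p - ball_b) for p in virtual_points]
print([round(d, 3) for d in virtual_distances])
```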

  10. Neutral-beam aiming and calorimetry for MFTF-B

    International Nuclear Information System (INIS)

    Goldner, A.I.; Margolies, D.

    1981-01-01

    The vessel for the Tandem Mirror Fusion Test Facility (MFTF-B) will have up to eleven 0.5-s-duration neutral-beam injectors for the initial heating of the MFTF-B plasma. Knowing the exact alignment of the beams and their total power is critical to the performance of the experiment. Using prototype aiming and calorimetry systems on the High Voltage Test Stand (HVTS) at Lawrence Livermore National Laboratory (LLNL), we hope to prove our ability to obtain an aiming accuracy of ±1 cm at the plasma and a calorimetric accuracy of ±5% of the actual total beam energy

  11. Liquid helium cooling of the MFTF superconducting magnets

    International Nuclear Information System (INIS)

    VanSant, J.H.; Zbasnik, J.P.

    1986-09-01

    During acceptance testing of the Mirror Fusion Test Facility (MFTF), we measured liquid helium heat loads and flow rates in selected magnets. We used the data from these tests to estimate helium vapor quality in the magnets so that we could determine whether adequate conductor cooling conditions had occurred. We compared the measured quality and flow with estimates from a theoretical model developed for the MFTF magnets. The comparison is reasonably good, considering influences that can greatly affect these values. This paper describes the methods employed in making the measurements and developing the theoretical estimates. It also describes the helium system that maintained the magnets at required operating conditions
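
    For context on the vapor-quality estimate mentioned above, a back-of-envelope sketch (not taken from the report) applies the simple energy balance x = Q / (ṁ · h_fg). The heat load and flow rate below are invented values; the latent heat of helium at 4.2 K (about 20.9 kJ/kg) is a textbook figure.

```python
# Rough illustration of estimating exit vapor quality from a measured heat
# load and liquid-helium flow rate: x = Q / (m_dot * h_fg).
# The heat load and flow rate are assumed, not MFTF measurements.
H_FG = 20.9e3            # J/kg, latent heat of vaporization of LHe at ~4.2 K
heat_load_w = 60.0       # W absorbed in the magnet (assumed value)
mdot_g_per_s = 8.0       # g/s of circulating liquid helium (assumed value)

quality = heat_load_w / ((mdot_g_per_s / 1000.0) * H_FG)
print(f"exit vapor quality ≈ {quality:.2f}")   # fraction of the flow vaporized
```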

  12. Axicell MFTF-B superconducting-magnet system

    International Nuclear Information System (INIS)

    Wang, S.T.; Bulmer, R.; Hanson, C.; Hinkle, R.; Kozman, T.; Shimer, D.; Tatro, R.; VanSant, J.; Wohlwend, J.

    1982-01-01

    The Axicell MFTF-B magnet system will provide the field environment necessary for tandem mirror plasma physics investigation with thermal barriers. The performance of the device will simulate DT energy break-even plasma conditions. Operation will be with deuterium only. There will be 24 superconducting coils consisting of 2 sets of yin-yang pairs, 14 central-cell solenoids, 2 sets of axicell mirror-coil pairs, and 2 transition coils between the axicell mirror coil-pairs and the yin-yang coils. This paper describes the progress in the design and construction of the MFTF-B Superconducting-Magnet System

  13. Performance of the MFTF magnet cryogenic power leads

    International Nuclear Information System (INIS)

    VanSant, J.H.

    1983-01-01

    The cryogenic power lead system for the MFTF superconducting magnets has been acceptance tested and operated with the magnets. This system, which includes 5-m-long superconducting buses, 1.5-m-long vapor-cooled transition leads, external warm buses, and a cryostack, can conduct up to 6000 A (dc) and operate adiabatically for long periods. We present both design details and performance data; our MFTF version is an example of a reliable lead system for large superconducting magnets contained in a much larger vacuum vessel

  14. 1000 kW ICRH amplifiers for MFTF-B

    International Nuclear Information System (INIS)

    Boksberger, U.

    1986-01-01

    For the startup of MFTF-B, ICRH heating will be applied. Two commercial amplifiers derived from standard broadcast transmitters each provide 1000 kW of RF power into a matching system for any VSWR as high as 1.5. Emphasis is put on the specific environment of magnetic fields and seismic loads as well as on the particular RF power control requirements and remote operation. Also addressed is the amplifier's performance into a typical load. The load variations due to the MFTF-B plasma coupling were calculated by TRW

  15. Protection of the MFTF accel power supplies

    International Nuclear Information System (INIS)

    Wilson, J.H.; Wood, J.C.

    1979-01-01

    The MFTF experiment's Sustaining Neutral Beam Power Supply System (SNBPSS) includes twenty-four 95 kV, 80 A accel dc power supplies (ADCPS). Each power supply includes a relatively high-impedance (20 percent) rectifier transformer and a step voltage regulator with a 50-100 percent voltage range. With this combination, the fault current for some postulated faults may be lower than the supply's full load current at maximum voltage. A design has been developed which uses protective relays and current-limiting fuses coordinated to detect phase and ground faults, DC faults, incorrect voltage conditions, rectifier faults, power factor correction capacitor faults, and overloads. This unusual solution ensures fast tripping on potentially destructive high-current faults and long-time delays at lower currents to allow 30 second pulse operation. The ADCPS meets the LLL specification that all major assemblies be self-protecting, that is, able to sustain external faults without damage to minimize damage due to internal faults

  16. A Wireless Sensor Network Deployment for Rural and Forest Fire Detection and Verification

    Science.gov (United States)

    Lloret, Jaime; Garcia, Miguel; Bri, Diana; Sendra, Sandra

    2009-01-01

    Forest and rural fires are one of the main causes of environmental degradation in Mediterranean countries. Existing fire detection systems only focus on detection, but not on the verification of the fire. However, almost all of them are just simulations, and very few implementations can be found. Besides, the systems in the literature lack scalability. In this paper we show all the steps followed to perform the design, research and development of a wireless multisensor network which mixes sensors with IP cameras in a wireless network in order to detect and verify fire in rural and forest areas of Spain. We have studied how many cameras, sensors and access points are needed to cover a rural or forest area, and the scalability of the system. We have developed a multisensor, and when it detects a fire, it sends a sensor alarm through the wireless network to a central server. The central server selects the closest wireless cameras to the multisensor, based on a software application, which are rotated to the sensor that raised the alarm, and sends them a message in order to receive real-time images from the zone. The camera lets the fire fighters corroborate the existence of a fire and avoid false alarms. In this paper, we show the test performance given by a test bench formed by four wireless IP cameras in several situations and the energy consumed when they are transmitting. Moreover, we study the energy consumed by each device when the system is set up. The wireless sensor network could be connected to the Internet through a gateway and the images of the cameras could be seen from any part of the world. PMID:22291533

  17. A Wireless Sensor Network Deployment for Rural and Forest Fire Detection and Verification

    Directory of Open Access Journals (Sweden)

    Sandra Sendra

    2009-10-01

    Forest and rural fires are one of the main causes of environmental degradation in Mediterranean countries. Existing fire detection systems only focus on detection, but not on the verification of the fire. However, almost all of them are just simulations, and very few implementations can be found. Besides, the systems in the literature lack scalability. In this paper we show all the steps followed to perform the design, research and development of a wireless multisensor network which mixes sensors with IP cameras in a wireless network in order to detect and verify fire in rural and forest areas of Spain. We have studied how many cameras, sensors and access points are needed to cover a rural or forest area, and the scalability of the system. We have developed a multisensor, and when it detects a fire, it sends a sensor alarm through the wireless network to a central server. The central server selects the closest wireless cameras to the multisensor, based on a software application, which are rotated to the sensor that raised the alarm, and sends them a message in order to receive real-time images from the zone. The camera lets the fire fighters corroborate the existence of a fire and avoid false alarms. In this paper, we show the test performance given by a test bench formed by four wireless IP cameras in several situations and the energy consumed when they are transmitting. Moreover, we study the energy consumed by each device when the system is set up. The wireless sensor network could be connected to the Internet through a gateway and the images of the cameras could be seen from any part of the world.
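
    The server-side step described above, selecting the cameras closest to the multisensor that raised the alarm, can be illustrated with a simple nearest-neighbour pick. The coordinates and device names below are invented; the deployed system uses its own software application for this.

```python
# Toy sketch of choosing which IP cameras to rotate toward an alarm:
# pick the cameras nearest to the sensor that triggered.  All values invented.
cameras = {"cam1": (10.0, 40.0), "cam2": (220.0, 35.0), "cam3": (120.0, 180.0)}
sensors = {"node17": (200.0, 60.0)}

def closest_cameras(sensor_xy, n=2):
    """Return the n camera names nearest to the given sensor position."""
    def dist(xy):
        return ((xy[0] - sensor_xy[0]) ** 2 + (xy[1] - sensor_xy[1]) ** 2) ** 0.5
    return sorted(cameras, key=lambda name: dist(cameras[name]))[:n]

print(closest_cameras(sensors["node17"]))   # cameras to rotate toward the alarm
```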

  18. MFTF-B PACE tests and final cost report

    International Nuclear Information System (INIS)

    Krause, K.H.; Kozman, T.A.; Smith, J.L.; Horan, R.J.

    1986-10-01

    The Mirror Fusion Test Facility (MFTF-B) construction project was successfully completed in February 1986, with the conclusion of the Plant and Capital Equipment (PACE) Tests. This series of tests, starting in September 1985 and running through February 1986, demonstrated the overall machine capabilities and special facilities accomplishments for the Mirror Fusion Test Facility Project

  19. Axicell design for the end plugs of MFTF-B

    International Nuclear Information System (INIS)

    Thomassen, K.I.; Karpenko, V.N.

    1982-01-01

    Certain changes in the end-plug design in the Mirror Fusion Test Facility (MFTF-B) are described. The Laboratory (LLNL) proposes to implement these changes as soon as possible in order to construct the machine in an axicell configuration. The present physics and technology goals as well as the project cost and schedule will not be affected by these changes

  20. Mechanical considerations for MFTF-B plasma-diagnostic system

    International Nuclear Information System (INIS)

    Thomas, S.R. Jr.; Wells, C.W.

    1981-01-01

    The reconfiguration of MFTF to a tandem mirror machine with thermal barriers has caused a significant expansion in the physical scope of plasma diagnostics. From a mechanical perspective, it complicates the plasma access, system interfaces, growth and environmental considerations. Conceptual designs characterize the general scope of the design and fabrication which remains to be done

  1. MFTF-α+T end plug magnet design

    International Nuclear Information System (INIS)

    Srivastava, V.C.; O'Toole, J.A.

    1983-01-01

    The conceptual design of the end-plug magnets for MFTF-α+T is described. MFTF-α+T is a near-term upgrade of MFTF-B, which features new end plugs to improve performance. The Fusion Engineering Design Center has performed the engineering design of MFTF-α+T under the overall direction of Lawrence Livermore National Laboratory. Each end plug consists of two Yin-Yang pairs, each with an approximately 2.5:1 mirror ratio and an approximately 5-T peak field on axis; two transition coils; and a recircularizing solenoid. This paper describes the end-plug magnet system functional requirements and presents a conceptual design that meets them. The peak field at the windings of the end-plug coils is approximately 6 T. These coils are designed using the NbTi MFTF-B conductor and cooled by a 4.2-K liquid helium bath. All the end-plug magnets are designed to operate in the cryostable mode with adequate quench protection for safety. Shielding requirements are stated and a summary of heat loads is provided. Field and force calculations are discussed. The field on axis is shown to meet the functional requirements. Force resultants are reported in terms of winding running loads, and resultant coil forces are also given. The magnet structural support is described. A trade study to determine the optimum end-cell coil internal nuclear shield thickness and the resulting coil size, based on minimizing the end-cell life-cycle cost, is summarized

  2. From Wireless Sensor Networks to Wireless Body Area Networks: Formal Modeling and Verification on Security Using PAT

    Directory of Open Access Journals (Sweden)

    Tieming Chen

    2016-01-01

    Model checking has been applied successfully to the verification of security protocols, but the modeling process is always tedious and proficient knowledge of formal methods is needed, although the final verification can be automatic depending on the specific tools. At the same time, due to the appearance of novel kinds of networks, such as wireless sensor networks (WSN) and wireless body area networks (WBAN), formal modeling and verification for these domain-specific systems are quite challenging. In this paper, a specific and novel formal modeling and verification method is proposed and implemented using an expandable tool called PAT to perform WSN-specific security verification. First, an abstract modeling data structure for CSP#, which is built into PAT, is developed to support node-mobility-related specification for modeling location-based node activity. Then, the traditional Dolev-Yao model is redefined to facilitate modeling of location-specific attack behaviors on the security mechanism. A thorough formal verification application on a location-based security protocol in WSN is described in detail to show the usability and effectiveness of the proposed methodology. Furthermore, a novel location-based authentication security protocol in WBAN can also be successfully modeled and verified directly using our method; to the best of our knowledge, this is the first effort to employ model checking for automatic analysis of an authentication protocol for WBAN.

  3. Control and diagnostic data structures for the MFTF

    International Nuclear Information System (INIS)

    Wade, J.A.; Choy, J.H.

    1979-01-01

    A Data Base Management System (DBMS) is being written as an integral part of the Supervisory Control and Diagnostics System (SCDS) of programs for control of the Mirror Fusion Test Facility (MFTF). The data upon which the DBMS operates consist of control values and evaluative information required for facilities control, along with control values and diagnostic data acquired as a result of each MFTF shot. The user interface to the DBMS essentially consists of two views: a computer program interface called the Program Level Interface (PLI) and a stand-alone interactive program called the Query Level Interface to support terminal-based queries. This paper deals specifically with the data structure capabilities from the viewpoint of the PLI user

  4. Progress on axicell MFTF-B superconducting magnet systems

    International Nuclear Information System (INIS)

    Wang, S.T.; Kozman, T.A.; Hanson, C.L.; Shimer, D.W.; VanSant, J.H.; Zbasnik, J.

    1983-01-01

    Since the entire Mirror Fusion Test Facility (MFTF-B) Magnet System was reconfigured from the original A-cell to an axicell design, much progress has been made on the design, fabrication, and installation planning. The axicell MFTF-B magnet array consists of a total of 26 large superconducting main coils. This paper provides an engineering overview of the progress of these coils. Recent studies on the effects of field errors on the plasma at the recircularizing region (transition coils) show that small field errors will generate large displacements of the field lines. These field errors might enhance radial electron heat transport and deteriorate the plasma confinement. Therefore, 16 superconducting trim coils have been designed to correct the coil misalignments. Progress on the trim coils is also reported

  5. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding the integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique concerning the verification of people trajectories

  6. Physics conceptual design for the MFTF-B transition coil

    International Nuclear Information System (INIS)

    Baldwin, D.E.; Bulmer, R.H.

    1982-01-01

    The physics constraints relate to finite-β equilibria, β limits due to curvature-driven MHD modes, and ion transport in the central cell. These physics constraints had to be satisfied subject to certain non-physics constraints. Principal among these were the geometric and structural features of the existing MFTF-B magnet set and the required access for neutral beams for pumping. These constraints and their origins are discussed

  7. Manufacturing and quality assurance for the MFTF superconductor core

    International Nuclear Information System (INIS)

    Scanlan, R.M.; Johnston, J.E.; Waide, P.A.; Zeitlin, B.A.; Smith, G.B.; Nelson, C.T.

    1979-01-01

    A total of 55,000 m of multifilamentary Nb-Ti superconductor in minimum lengths of 380 m are required for the Mirror Fusion Test Facility. This conductor is a large cross-section monolith and, as such, has presented several new manufacturing challenges. In addition, a monolith requires more stringent quality assurance procedures than braids or cables. This paper describes the manufacturing steps and the quality assurance program which have been developed for the MFTF superconductor core

  8. Model approach for simulating the thermodynamic behavior of the MFTF cryogenic cooling systems - a status report

    International Nuclear Information System (INIS)

    Sutton, S.B.; Stein, W.; Reitter, T.A.; Hindmarsh, A.C.

    1983-01-01

    A numerical model for calculating the thermodynamic behavior of the MFTF-B cryogenic cooling system is described. Nine component types are discussed, with governing equations given. The algorithm for solving the coupled set of algebraic and ordinary differential equations is described. Application of the model to the MFTF-B cryogenic cooling system has not been possible due to a lack of funding
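
    The report's model itself is not reproduced here, but the kind of computation it performs, integrating a coupled set of ordinary differential (and algebraic) equations for lumped thermal components, can be sketched with a toy two-node example. Every parameter below, and the use of SciPy's solve_ivp, is an assumption made only for illustration, not the report's formulation.

```python
# Toy illustration, not the MFTF-B model: two lumped cryogenic nodes, one with
# an external heat leak and one cooled toward 4.5 K, integrated in time.
import numpy as np
from scipy.integrate import solve_ivp

C1, C2 = 5.0e3, 2.0e3      # J/K, assumed lumped heat capacities
G12 = 40.0                 # W/K, assumed conductance between the nodes
Q_LEAK = 25.0              # W, assumed external heat load on node 1
G_SINK = 60.0              # W/K, assumed refrigeration on node 2 toward 4.5 K

def rhs(t, T):
    T1, T2 = T
    dT1 = (Q_LEAK - G12 * (T1 - T2)) / C1
    dT2 = (G12 * (T1 - T2) - G_SINK * (T2 - 4.5)) / C2
    return [dT1, dT2]

sol = solve_ivp(rhs, (0.0, 600.0), [6.0, 5.0], max_step=1.0)
print(f"T1 = {sol.y[0, -1]:.2f} K, T2 = {sol.y[1, -1]:.2f} K after 600 s")
```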

  9. Field-reversal experiments in the mirror fusion test facility (MFTF)

    International Nuclear Information System (INIS)

    Shearer, J.W.; Condit, W.C.

    1977-01-01

    Detailed consideration of several aspects of a field-reversal experiment was begun in the Mirror Fusion Test Facility (MFTF): Model calculations have provided some plausible parameters for a field-reversed deuterium plasma in the MFTF, and a buildup calculation indicates that the MFTF neutral-beam system is marginally sufficient to achieve field reversal by neutral injection alone. However, the many uncertainties indicate the need for further research and development on alternate buildup methods. A discussion of experimental objectives is presented and important diagnostics are listed. The range of parameter space accessible with the MFTF magnet design is explored, and we find that with proper aiming of the neutral beams, meaningful experiments can be performed to advance toward these objectives. Finally, it is pointed out that if we achieve enhanced nτ confinement by means of field reversal, then quasi-steady-state operation of MFTF is conceivable

  10. A Probabilistic Mass Estimation Algorithm for a Novel 7- Channel Capacitive Sample Verification Sensor

    Science.gov (United States)

    Wolf, Michael

    2012-01-01

    A document describes an algorithm created to estimate the mass placed on a sample verification sensor (SVS) designed for lunar or planetary robotic sample return missions. A novel SVS measures the capacitance between a rigid bottom plate and an elastic top membrane in seven locations. As additional sample material (soil and/or small rocks) is placed on the top membrane, the deformation of the membrane increases the capacitance. The mass estimation algorithm addresses both the calibration of each SVS channel, and also addresses how to combine the capacitances read from each of the seven channels into a single mass estimate. The probabilistic approach combines the channels according to the variance observed during the training phase, and provides not only the mass estimate, but also a value for the certainty of the estimate. SVS capacitance data is collected for known masses under a wide variety of possible loading scenarios, though in all cases, the distribution of sample within the canister is expected to be approximately uniform. A capacitance-vs-mass curve is fitted to this data, and is subsequently used to determine the mass estimate for the single channel's capacitance reading during the measurement phase. This results in seven different mass estimates, one for each SVS channel. Moreover, the variance of the calibration data is used to place a Gaussian probability distribution function (pdf) around this mass estimate. To blend these seven estimates, the seven pdfs are combined into a single Gaussian distribution function, providing the final mean and variance of the estimate. This blending technique essentially takes the final estimate as an average of the estimates of the seven channels, weighted by the inverse of the channel's variance.
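
    The blending step described above amounts to an inverse-variance weighted combination of Gaussian estimates. The sketch below shows that combination under the assumption that each of the seven channels has already produced a mass estimate and a variance from its calibration curve; the numbers are illustrative only, not SVS data.

```python
# Minimal sketch of blending seven per-channel Gaussian mass estimates into a
# single estimate, weighting each channel by the inverse of its variance.
# All values are invented for illustration.
import numpy as np

def fuse_channel_estimates(means, variances):
    """Return the fused (mean, variance) of independent Gaussian estimates."""
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused_var = 1.0 / weights.sum()
    fused_mean = fused_var * (weights * means).sum()
    return fused_mean, fused_var

mass, var = fuse_channel_estimates(
    means=[101.0, 98.5, 103.2, 99.0, 100.4, 97.8, 102.1],      # grams
    variances=[4.0, 2.5, 6.0, 3.0, 2.0, 5.5, 4.5])
print(f"mass ~ {mass:.1f} g, sigma ~ {var ** 0.5:.1f} g")
```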

  11. Maintenance and availability considerations for MFTF-B upgrade

    International Nuclear Information System (INIS)

    Spampinato, P.T.

    1983-01-01

    The upgrade of the Mirror Fusion Test Facility (MFTF-B) tandem mirror device incorporates the operation of advanced systems plus the requirement for remote maintenance. To determine if the operating availability goal of this device is achievable, an assessment of component lifetimes was made, along with estimates of device downtime. Key subsystem components were considered from the magnet, heating, impurity control, pumping, and test module systems. Component replacements were grouped into three categories, and a lifetime operating plan, including component replacements, was developed. It was determined that this device could achieve a 10% operating availability

  12. Electrical supply for MFTF-B superconducting magnet system

    International Nuclear Information System (INIS)

    Shimer, D.W.; Owen, E.W.

    1985-01-01

    The MFTF-B magnet system consists of 42 superconducting magnets which must operate continuously for long periods of time. The magnet power supply system is designed to meet the operational requirements of accuracy, flexibility, and reliability. The superconducting magnets require a protection system to protect against critical magnet faults of quench, current lead overtemperature, and overcurrent. The protection system is complex because of the large number of magnets, the strong coupling between magnets, and the high reliability requirement. This paper describes the power circuits and the components used in the design

  13. New kind of user interface for controlling MFTF diagnostics

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Saroyan, R.A.; Mead, J.E.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF) at Lawrence Livermore National Laboratory is faced with the problem of controlling a multitude of plasma diagnostics instruments from a central, multiprocessor computer facility. A 16-bit microprocessor-based workstation allows each physicist entree into the central multiprocessor, which consists of nine Perkin-Elmer 32-bit minicomputers. The workstation provides the user interface to the larger system, with display graphics, windowing, and a physics notebook. Controlling a diagnostic is now equivalent to making entries into a traditional physics notebook

  14. A new kind of user interface for controlling MFTF diagnostics

    International Nuclear Information System (INIS)

    Preckshot, G.; Mead, J.; Saroyan, R.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF) at Lawrence Livermore National Laboratory is faced with the problem of controlling a multitude of plasma diagnostics instruments from a central, multiprocessor computer facility. A 16-bit microprocessor-based workstation allows each physicist entree into the central multiprocessor, which consists of nine Perkin-Elmer 32-bit minicomputers. The workstation provides the user interface to the larger system, with display graphics, windowing, and a physics notebook. Controlling a diagnostic is now equivalent to making entries into a traditional physics notebook

  15. Alternatives for contaminant control during MFTF plasma buildup

    International Nuclear Information System (INIS)

    Khan, J.M.; Valby, L.E.

    1979-01-01

    The MFTF mirror device considers all low-energy species to be contaminants, since their primary effect is to erode the plasma boundary by charge-exchange reactions. Confinement for other than hydrogen isotopes is far from complete and confinement time is hardly more than transit time from the source to the end wall. The brevity of the confinement time makes it all the more necessary to prevent any contamination which might further reduce it. At Livermore, the historical solution to contaminant control has been to evaporate titanium onto cold surfaces. An alternative to this approach and its implications are considered

  16. Dynamic testing of MFTF containment-vessel structural system

    International Nuclear Information System (INIS)

    Weaver, H.J.; McCallen, D.B.; Eli, M.W.

    1982-01-01

    Dynamic (modal) testing was performed on the Mirror Fusion Test Facility (MFTF) containment vessel. The seismic design of this vessel was heavily dependent upon the value of structural damping used in the analysis. Typically for welded steel vessels, a value of 2 to 3% of critical is used. However, due to the large mass of the vessel and magnet supported inside, we felt that the interaction between the structure and its foundation would be enhanced. This would result in a larger value of damping because vibrational energy in the structure would be transferred through the foundation into the surrounding soil. The dynamic test performed on this structure (with the magnet in place) confirmed this latter theory and resulted in damping values of approximately 4 to 5% for the whole body modes. This report presents a brief description of dynamic testing emphasizing the specific test procedure used on the MFTF-A system. It also presents an interpretation of the damping mechanisms observed (material and geometric) based upon the spatial characteristics of the modal parameters

  17. User interface on networked workstations for MFTF plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Renbarger, V.L.; Balch, T.R.

    1985-01-01

    A network of Sun-2/170 workstations is used to provide an interface to the MFTF-B Plasma Diagnostics System at Lawrence Livermore National Laboratory. The Plasma Diagnostics System (PDS) is responsible for control of MFTF-B plasma diagnostic instrumentation. An EtherNet Local Area Network links the workstations to a central multiprocessing system which furnishes data processing, data storage and control services for PDS. These workstations permit a physicist to command data acquisition, data processing, instrument control, and display of results. The interface is implemented as a metaphorical desktop, which helps the operator form a mental model of how the system works. As on a real desktop, functions are provided by sheets of paper (windows on a CRT screen) called worksheets. The worksheets may be invoked by pop-up menus and may be manipulated with a mouse. These worksheets are actually tasks that communicate with other tasks running in the central computer system. By making entries in the appropriate worksheet, a physicist may specify data acquisition or processing, control a diagnostic, or view a result

  18. Options for axisymmetric operation of MFTF-B

    International Nuclear Information System (INIS)

    Fenstermacher, M.E.; Devoto, R.S.; Thomassen, K.I.

    1986-01-01

    The flexibility of MFTF-B for axisymmetric experiments has been investigated. Interchanging the axicell coils and increasing their separation results in an axisymmetric plug cell with 12:1 and 6:1 inner and outer mirror ratios, respectively. For axisymmetric operation, the sloshing-ion neutral beams, ECRH gyrotrons, and the pumping system would be moved to the axicell. Stabilization by E-rings could be explored in this configuration. With the addition of octopole magnets, off-axis multipole stabilization could also be tested. Operating points for octopole and E-ring-stabilized configurations with properties similar to those of the quadrupole MFTF-B, namely T_ic = 10-15 keV and n_c ≈ 3 × 10¹³ cm⁻³, have been obtained. Because of the negligible radial transport of central-cell ions, the required neutral-beam power in the central cell has been dramatically reduced. In addition, because MHD stabilization is achieved by off-axis hot electrons in both cases, much lower barrier beta is possible, which aids in reducing the barrier ECRH power. Total ECRH power in the end cell is projected to be approximately 1 MW. Possible operating points for both octopole and E-ring configurations are described along with the stability considerations involved

  19. Low-Cost Planar PTF Sensors for the Identity Verification of Smartcard Holders

    NARCIS (Netherlands)

    Henderson, N.J.; Papakostas, T.V.; White, N.M.; Hartel, Pieter H.

    The properties of mechanical flexibility, low cost and planar geometry make polymer thick film (PTF) sensors attractive for embedded smartcard biometrics. PTF piezoelectric and piezoresistive pressure sensors are investigated for their potential to capture spatial human characteristics. However, it

  20. Development of novel EMAT-ECT multi-sensor and verification of its feasibility

    International Nuclear Information System (INIS)

    Suzuki, Kenichiro; Uchimoto, Tetsuya; Takagi, Toshiyuki; Sato, Takeshi; Guy, Philippe; Casse, Amelie

    2006-01-01

    In this study, we propose a novel EMAT-ECT multi-sensor aimed at advanced structural health monitoring. For this purpose, a prototype EMAT-ECT multi-sensor was developed and its functions as both an ECT and an EMAT probe were evaluated. Experimental results of pulsed ECT using the EMAT-ECT multi-sensor showed that the proposed sensor is capable of detecting and sizing flaws. Experimental results of EMAT evaluation using the EMAT-ECT multi-sensor showed that an ultrasonic wave was transmitted by the sensor and a flaw echo was observed. These results imply that the EMAT-ECT multi-sensor can be used for both pulsed ECT and EMAT. (author)

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    Science.gov (United States)

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  2. Response time verification of in situ hydraulic pressure sensors in a nuclear reactor

    International Nuclear Information System (INIS)

    Foster, C.G.

    1978-01-01

    A method and apparatus for verifying the in situ response time of hydraulic pressure and pressure-differential sensing instrumentation in a nuclear circuit is disclosed. Hydraulic pressure at a reference sensor and at an in situ process sensor under test is varied according to a linear ramp. Sensor response time is then determined by comparison of the sensor electrical analog output signals. The process sensor is subjected to a relatively slowly changing and a relatively rapidly changing hydraulic pressure ramp signal to determine an upper bound for process sensor response time over the range of all pressure transients to which the sensor is required to respond. Signal linearity is independent of the volumetric displacement of the process sensor. The hydraulic signal generator includes a first pressurizable gas reservoir, a second pressurizable liquid and gas reservoir, a gate for rapidly opening a gas communication path between the two reservoirs, a throttle valve for regulating the rate of gas pressure equalization between the two reservoirs, and hydraulic conduit means for simultaneously communicating a ramp of hydraulic pressure change between the liquid/gas reservoir and both a reference and a process sensor. By maintaining a sufficient pressure differential between the reservoirs and by maintaining a sufficient ratio of gas to liquid in the liquid/gas reservoir, excellent linearity and minimal transient effects can be achieved for all pressure ranges, magnitudes, and rates of change of interest
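
    The sketch below illustrates the basic comparison the abstract describes, estimating response time as the time lag between a fast reference sensor and the process sensor while both see the same linear pressure ramp. The signals are synthetic, the process sensor is assumed to be first-order, and the numbers are invented; this is not the patented apparatus.

```python
# Hedged sketch: estimate a process sensor's ramp response time as the time by
# which it lags a reference sensor in reaching the same pressure level.
import numpy as np

def ramp_time_lag(t, reference, process, level):
    """Time by which the process signal crosses `level` after the reference."""
    t_ref = np.interp(level, reference, t)    # invert the (monotonic) signals
    t_proc = np.interp(level, process, t)
    return t_proc - t_ref

t = np.linspace(0.0, 10.0, 2001)              # s
ramp_rate = 50.0                              # kPa/s, assumed ramp
tau = 0.4                                     # s, assumed first-order lag
reference = ramp_rate * t                     # ideal reference sensor
process = ramp_rate * (t - tau * (1.0 - np.exp(-t / tau)))
print(f"estimated response time: {ramp_time_lag(t, reference, process, 300.0):.2f} s")
```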

  3. Design and fabrication of the MFTF-B magnet system

    International Nuclear Information System (INIS)

    Tatro, R.E.; Kozman, T.A.

    1985-09-01

    The MFTF-B superconducting magnet system consists of 40 NbTi magnets and two Nb3Sn magnets. General Dynamics (GD) designed all magnets except for the small trim coils. GD then fabricated 20 NbTi magnets, while LLNL fabricated 20 NbTi magnets and two Nb3Sn magnets. The design phase was completed in February 1984 and included the competitive procurement of magnet structural fabrication, superconductor, G-10CR insulation, support struts and bearings, vapor-cooled leads, and thermal shields for all magnets. Fabrication of all magnets was completed in March 1985. At GD, dual assembly lines were necessary during fabrication in order to meet the aggressive LLNL schedule. The entire magnet system has been installed and aligned at LLNL, and Tech Demo tests will be performed during September-November 1985

  4. Industrialization and production of neutral beam ion sources for MFTF

    International Nuclear Information System (INIS)

    Lynch, W.S.

    1981-01-01

    The existing LLNL designs of the 20 and 80 kV deuterium fueled Neutral Beam Ion Source Modules (NBSM) have been industrialized and are being produced successfully for the MFTF. Industrialization includes value engineering, production engineering, cost reduction, fixturing, facilitation and procurement of components. Production assembly, inspection and testing are being performed in a large electronics manufacturing plant. Decades of experience in high voltage, high vacuum power tubes is being applied to the procedures and processes. Independent quality and reliability assurance criteria are being utilized. Scheduling of the various engineering, procurement and manufacturing tasks is performed using a Critical Path Method (CPM) computer code. Innovative, computerized grid alignment methods were also designed and installed specifically for this project. New jointing and cleaning techniques were devised for the NBSMs. Traceability and cost control are also utilized

  5. Data triggered data processing at MFTF-B

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1985-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory we schedule jobs to process experimental data to be collected during a five minute shot cycle. Our data-driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs "fire", or execute, as input data becomes available. Similar to UNIX "pipes", data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on our networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. We report here on details of diagnostic data processing and our experiences.
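
    A rough sketch of the firing rule described above, jobs declared ahead of the shot with their input dependencies and executed as those inputs appear, is given below; the job and data names are hypothetical and this is not the MFTF-B scheduler.

```python
# Minimal data-driven scheduling sketch: each job "fires" once all of the data
# items it depends on exist, and its output can in turn feed downstream jobs.
jobs = {
    "calibrate": {"needs": {"raw_diag_A"}, "fired": False},
    "spectrum":  {"needs": {"raw_diag_B"}, "fired": False},
    "combine":   {"needs": {"calibrate", "spectrum"}, "fired": False},
}
available = set()

def publish(item):
    """Mark a data item as available and fire any jobs that become ready."""
    available.add(item)
    for name, job in jobs.items():
        if not job["fired"] and job["needs"] <= available:
            job["fired"] = True
            print(f"firing {name}")
            publish(name)          # the job's output may satisfy later jobs

publish("raw_diag_A")              # fires "calibrate"
publish("raw_diag_B")              # fires "spectrum", then "combine"
```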

  6. Alternative connections for the large MFTF-B solenoids

    International Nuclear Information System (INIS)

    Owen, E.W.; Shimer, D.W.; Wang, S.T.

    1983-01-01

    The MFTF-B central-cell solenoids are a set of twelve closely coupled, large superconducting magnets with similar but not exactly equal currents. Alternative methods of connecting them to their power supplies and dump resistors are investigated. The circuits are evaluated for operating conditions and fault conditions. The factors considered are the voltage to ground during a dump, short circuits, open circuits, quenches, and failure of the protection system to detect a quench. Of particular interest are the currents induced in coils that remain superconducting when one or more coils quench. The alternative connections include separate power supplies, combined power supplies, individual dump resistors, series dump resistors and combinations of these. A new circuit that contains coupling resistors is proposed. The coupling resistors do not affect normal fast dumps but reduce the peak induced currents while also reducing the energy rating of the dump resistors. Another novel circuit, the series circuit with diodes, is discussed in detail

  7. Overview of MFTF supervisory control and diagnostics system software

    International Nuclear Information System (INIS)

    Ng, W.C.

    1979-01-01

    The Mirror Fusion Test Facility (MFTF) at the Lawrence Livermore Laboratory (LLL) is currently the largest mirror fusion research project in the world. Its Control and Diagnostics System is handled by a distributed computer network consisting of nine Interdata minicomputer systems and about 65 microprocessors. One of the design requirements is tolerance of single-point failure. If one of the computer systems becomes inoperative, the experiment can still be carried out, although the system responsiveness to operator command may be degraded. In a normal experiment cycle, the researcher can examine the result of the previous experiment, change any control parameter, fire a shot, collect four million bytes of diagnostics data, perform intershot analysis, and have the result presented - all within five minutes. The software approach adopted for the Supervisory Control and Diagnostics System features chief programmer teams and structured programming. Pascal is the standard programming language in this project

  8. Improvement in MFTF data base system response times

    International Nuclear Information System (INIS)

    Lang, N.C.; Nelson, B.C.

    1983-01-01

    The Supervisory Control and Diagnostic System for the Mirror Fusion Test Facility (MFTF) has been designed as an event driven system. To this end we have designed a data base notification facility in which a task can request that it be loaded and started whenever an element in the data base is changed beyond some user defined range. Our initial implementation of the notify facility exhibited marginal response times whenever a data base table with a large number of outstanding notifies was written into. In this paper we discuss the sources of the slow response and describe in detail a new structure for the list of notifies which minimizes search time resulting in significantly faster response
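
    The paper's new list structure is not given in the abstract, but the general idea of avoiding a scan over every outstanding notify can be illustrated by keying notify requests on the data base element they watch, so that a write only examines the notifies registered for that element. The element and task names below are invented; this is not the SCDS code.

```python
# Illustrative sketch: notify requests indexed by element, so a write checks
# only the requests attached to the element being written.
from collections import defaultdict

notifies = defaultdict(list)       # element name -> [(low, high, task), ...]

def request_notify(element, low, high, task):
    notifies[element].append((low, high, task))

def write_element(element, value):
    """Store a new value and start any task whose allowed range it violates."""
    for low, high, task in notifies[element]:
        if not (low <= value <= high):
            print(f"{element} = {value} outside [{low}, {high}]: starting {task}")

request_notify("magnet_current", 0.0, 5800.0, "quench_monitor")
write_element("magnet_current", 6100.0)   # triggers quench_monitor
write_element("magnet_current", 5500.0)   # in range, no notification
```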

  9. MFTF 230 kV pulsed power substation

    International Nuclear Information System (INIS)

    Wilson, J.H.

    1979-01-01

    The Mirror Fusion Test Facility (MFTF) currently under construction at the Lawrence Livermore Laboratory includes a Sustaining Neutral Beam Power Supply System (SNBPSS) consisting of 24 power-supply sets. The System will operate in long pulses (initially 0.5 seconds and eventually 30 seconds) at high power (200 MW), which will necessitate a large source of ac power. To meet this requirement, a new 230-kV substation is also being built at LLL. The constraints of cost, equipment protection, short operating lifetime (10 years), and reliability dictated a unique substation design. Its unusual features include provisions for fast fault detection and tripping, a capability for limiting ground fault current, low impedance, and economical design

  10. Location verification algorithm of wearable sensors for wireless body area networks.

    Science.gov (United States)

    Wang, Hua; Wen, Yingyou; Zhao, Dazhe

    2018-01-01

    Knowledge of the location of sensor devices is crucial for many medical applications of wireless body area networks, as wearable sensors are designed to monitor vital signs of a patient while the wearer still has the freedom of movement. However, clinicians or patients can misplace the wearable sensors, thereby causing a mismatch between their physical locations and their correct target positions. An error of more than a few centimeters raises the risk of mistreating patients. The present study aims to develop a scheme to calculate and detect the position of wearable sensors without beacon nodes. A new scheme was proposed to verify the location of wearable sensors mounted on the patient's body by inferring differences in atmospheric air pressure and received signal strength indication measurements from wearable sensors. Extensive two-sample t tests were performed to validate the proposed scheme. The proposed scheme could easily recognize a 30-cm horizontal body range and a 65-cm vertical body range to correctly perform sensor localization and limb identification. All experiments indicate that the scheme is suitable for identifying wearable sensor positions in an indoor environment.
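
    A hedged sketch of the kind of statistical check the scheme relies on is shown below: a two-sample t test on barometric pressure samples from two wearable sensors, used to decide whether they sit at clearly different heights. The sample values, noise level and 0.05 threshold are assumptions for the example, not figures from the paper.

```python
# Toy example: decide whether two body-worn sensors are at different heights by
# comparing their barometric pressure samples with a two-sample t test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
chest_hpa = rng.normal(1013.10, 0.02, size=50)   # sensor near the chest
ankle_hpa = rng.normal(1013.18, 0.02, size=50)   # sensor roughly 65 cm lower

t_stat, p_value = stats.ttest_ind(chest_hpa, ankle_hpa, equal_var=False)
if p_value < 0.05:
    print(f"pressures differ (p = {p_value:.3g}): sensors are at different heights")
else:
    print(f"no significant difference (p = {p_value:.3g})")
```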

  11. Design of a low-power testbed for Wireless Sensor Networks and verification

    NARCIS (Netherlands)

    van Hoesel, L.F.W.; Dulman, S.O.; Havinga, Paul J.M.; Kip, Harry J.

    In this document the design considerations and component choices of a testbed prototype device for wireless sensor networks will be discussed. These devices must be able to monitor their physical environment, process data and assist other nodes in forwarding sensor readings. For these tasks, five

  12. Experimental study on performance verification tests for coordinate measuring systems with optical distance sensors

    Science.gov (United States)

    Carmignato, Simone

    2009-01-01

    Optical sensors are increasingly used for dimensional and geometrical metrology. However, the lack of international standards for testing optical coordinate measuring systems is currently limiting the traceability of measurements and the easy comparison of different optical systems. This paper presents an experimental investigation on artefacts and procedures for testing coordinate measuring systems equipped with optical distance sensors. The work is aimed at contributing to the standardization of testing methods. The VDI/VDE 2617-6.2:2005 guideline, which is probably the most complete document available at the state of the art for testing systems with optical distance sensors, is examined with specific experiments. Results from the experiments are discussed, with particular reference to the tests used for determining the following characteristics: error of indication for size measurement, probing error and structural resolution. Particular attention is given to the use of artefacts alternative to gauge blocks for determining the error of indication for size measurement.

  13. Mirror Fusion Test Facility-B (MFTF-B) axicell configuration: NbTi magnet system. Design and analysis summary. Volume 1

    International Nuclear Information System (INIS)

    Heathman, J.H.; Wohlwend, J.W.

    1985-05-01

    This report summarizes the designs and analyses produced by General Dynamics Convair for the four Axicell magnets (A1 and A20, east and west), the four Transition magnets (T1 and T2, east and west), and the twelve Solenoid magnets (S1 through S6, east and west). Over four million drawings and specifications, in addition to detailed stress analysis, thermal analysis, electrical, instrumentation, and verification test reports were produced as part of the MFTF-B design effort. Significant aspects of the designs, as well as key analysis results, are summarized in this report. In addition, drawing trees and lists of detailed analysis and test reports included in this report define the locations of the detailed design and analysis data

  14. Mirror Fusion Test Facility-B (MFTF-B) axicell configuration: NbTi magnet system. Design and analysis summary. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Heathman, J.H.; Wohlwend, J.W.

    1985-05-01

    This report summarizes the designs and analyses produced by General Dynamics Convair for the four Axicell magnets (A1 and A20, east and west), the four Transition magnets (T1 and T2, east and west), and the twelve Solenoid magnets (S1 through S6, east and west). Over four million drawings and specifications, in addition to detailed stress analysis, thermal analysis, electrical, instrumentation, and verification test reports were produced as part of the MFTF-B design effort. Significant aspects of the designs, as well as key analysis results, are summarized in this report. In addition, drawing trees and lists of detailed analysis and test reports included in this report define the locations of the detailed design and analysis data.

  15. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports, but there is a paucity of evidence that verifies the head impact events they record. The purpose of this study was to utilize video analysis to verify head impact events recorded by wearable sensors and to describe their frequency and magnitude. The study design was a cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664
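
    The four verification criteria listed above translate directly into a simple filter; the sketch below uses assumed field names and invented records, not the study's data set.

```python
# Minimal sketch of the verification filter: keep a sensor-recorded impact only
# if it satisfies all four criteria used in the study.
def is_verified(impact):
    return (impact["linear_acceleration_g"] >= 20.0
            and impact["player_identified"]
            and impact["in_camera_view"]
            and impact["mechanism_clear"])

impacts = [
    {"linear_acceleration_g": 46.0, "player_identified": True,
     "in_camera_view": True, "mechanism_clear": True},
    {"linear_acceleration_g": 25.0, "player_identified": True,
     "in_camera_view": False, "mechanism_clear": False},
]
verified = [i for i in impacts if is_verified(i)]
print(f"{len(verified)} of {len(impacts)} impacts verified")
```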

  16. Optimization and Verification of the TR-MAC Protocol for Wireless Sensor Networks

    NARCIS (Netherlands)

    Morshed, S.; Heijenk, Geert

    2015-01-01

    Energy-efficiency is an important requirement in the design of communication protocols for wireless sensor networks (WSN). TR-MAC is an energy-efficient medium access control (MAC) layer protocol for low power WSN that exploits transmitted-reference (TR) modulation in the physical layer. The

  17. ON-LINE MONITORING OF I&C TRANSMITTERS AND SENSORS FOR CALIBRATION VERIFICATION AND RESPONSE TIME TESTING WAS SUCCESSFULLY IMPLEMENTED AT ATR

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent; Hashemian, H. M.

    2017-03-01

    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring techniques at ATR, calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off; they cover more portions of the system and can be performed at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources toward equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with the equipment reliability programs of ATR, the integration of OLM will also help with the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
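
    One common on-line monitoring idea, comparing each redundant channel with the average of its peers and flagging any channel that deviates beyond an acceptance band, is sketched below. This is a generic illustration with invented readings and band, not the OLM software deployed at ATR.

```python
# Generic redundant-channel drift check: flag channels whose deviation from the
# channel average exceeds an acceptance band. Values are illustrative only.
import numpy as np

def flag_drift(readings, band):
    """readings: dict of channel -> samples taken at the same times."""
    stacked = np.vstack(list(readings.values()))
    average = stacked.mean(axis=0)
    return {name: float(np.max(np.abs(values - average)))
            for name, values in readings.items()
            if np.max(np.abs(values - average)) > band}

readings = {
    "PT-1": np.array([10.01, 10.02, 10.00, 10.01]),
    "PT-2": np.array([10.00, 10.01, 10.02, 10.00]),
    "PT-3": np.array([10.25, 10.26, 10.27, 10.28]),   # drifting channel
}
print(flag_drift(readings, band=0.10))                # flags only PT-3
```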

  18. Structural analysis interpretation task for the magnet system for Mirror Fusion Test Facility (MFTF)

    International Nuclear Information System (INIS)

    Baldi, R.W.

    1979-11-01

    The primary objective of this study was to develop recommendations to improve and substantiate the structural integrity of the highly stressed small-radius region of the MFTF magnet. The specific approach is outlined: (1) Extract detailed stress/strain data from the General Dynamics Convair Finite-Element Refinement Analysis. (2) Diagram local plate stress distribution and its relationship to the adjacent weldment. (3) Update the parametric fracture mechanics analysis using the most recent MFTF-related data developed by the National Bureau of Standards. (4) Review sequence and assembly as modified by Chicago Bridge and Iron for adaptability to refinements. (5) Investigate the need for fillet radii weldments to reduce stress concentrations at critical corners. (6) Review the quality assurance plan for adequacy to ensure structural quality in the small-radius region. (7) Review the instrumentation plan for adequacy of structural diagnostics in the small-radius region. (8) Participate in planning a small-scale fatigue test program of a typical MFTF weldment

  19. Physics basis for an axicell design for the end plugs of MFTF-B

    International Nuclear Information System (INIS)

    Baldwin, D.E.; Logan, B.G.

    1982-01-01

    The primary motivation for conversion of MFTF-B to an axicell configuration lies in its engineering promise as a reactor geometry based on circular high-magnetic-field coils. In comparing this configuration to the previous A-cell geometry, we find a number of differences that might significantly affect the physics performance. The purpose of the present document is to examine those features and to assess their impact on the performance of the axicell, as compared to the A-cell configuration, for MFTF-B. In so doing, we address only those issues thought to be affected by the change in geometry and refer to the original report, Physics Basis for MFTF-B, for discussion of those issues thought not to be affected. In Sec. 1, we summarize these physics issues. In Sec. 2, we describe operating scenarios in the new configuration. In the Appendices, we discuss those physics issues that require more detailed treatment

  20. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

    A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. An adequate amount of data was collected and analyzed to satisfy the research team that the HTFOS operated properly while the engine was running. Temperature measurements made by the HTFOS while the engine was running agreed with those anticipated.

  1. Structural design considerations in the Mirror Fusion Test Facility (MFTF-B) vacuum vessel

    International Nuclear Information System (INIS)

    Vepa, K.; Sterbentz, W.H.

    1981-01-01

    In view of favorable results from the Tandem Mirror Experiment (TMX) also at LLNL, the MFTF project is now being rescoped into a large tandem mirror configuration (MFTF-B), which is the mainline approach to a mirror fusion reactor. This paper concerns itself with the structural aspects of the design of the vessel. The vessel and its intended functions are described. The major structural design issues, especially those influenced by the analysis, are described. The objectives of the finite element analysis and their realization are discussed at length

  2. Review of MFTF yin-yang magnet displacement and magnetic field measurements and calculations

    International Nuclear Information System (INIS)

    Hanson, C.L.; Myall, J.O.; Wohlwend, J.W.

    1983-01-01

    During the recent testing of the MFTF yin-yang magnet, measurements of coil position, structural case strain, and magnetic field were made to verify calculated values. Measurements to detect magnet movement were taken throughout cooldown and during the operation of the magnet. The magnetic field at the mirror points was measured by Hall-effect probes. The magnet position, structural case strain, and magnetic field measurements indicated a reasonably close correlation with calculated values. Information obtained from the yin-yang test has been very useful in setting realistic mechanical alignment values for the new MFTF-B magnet system

  3. Review of MFTF yin-yang magnet displacement and magnetic field measurements and calculations

    International Nuclear Information System (INIS)

    Hanson, C.L.; Myall, J.O.; Wohlwend, J.W.

    1983-01-01

    During the recent testing of the MFTF yin-yang magnet, measurements of coil position, structural case strain, and magnetic field were made to verify calculated values. Measurements to detect magnet movement were taken throughout cooldown and during the operation of the magnet. The magnetic field at the mirror points was measured by Hall-effect probes. The magnet position, structural case strain, and magnetic field measurements indicated a reasonably close correlation with calculated values. Information obtained from the yin-yang test has been very useful in setting realistic mechanical alignment values for the new MFTF-B magnet system

  4. Sparking protection for MFTF-B Neutral Beam Power Supplies

    International Nuclear Information System (INIS)

    Cummings, D.B.

    1983-01-01

    This paper describes the upgrade of MFTF-B Neutral Beam Power Supplies for sparking protection. High performance ion sources spark repeatedly so ion source power supplies must be insensitive to sparking. The hot deck houses the series tetrode, arc and filament supplies, and controls. Hot deck shielding has been upgraded and a continuous shield around the arc, filament, gradient grid, and control cables now extends from the hot deck, through the core snubber, to the source. The shield carries accelerating current and connects only to the source. Shielded source cables go through an outer duct which now connects to a ground plane under the hot deck. This hybrid transmission line is a low inductance path for sparks discharging the stray capacitance of the hot deck and isolation transformers, reducing coupling to building steel. Parallel DC current return cables inside the duct lower inductance to reduce inductive turn-off transients. MOVs to ground further limit surges in the remote power supply return. Single point grounding is at the source. No control or rectifier components have been damaged nor are there any known malfunctions due to sparking up to 80 kV output

  5. MFTF-B quasi-optical ECRH transmission system

    International Nuclear Information System (INIS)

    Yugo, J.J.; Shearer, J.W.; Ziolkowski, R.W.

    1983-01-01

    The microwave transmission system for ECRH on MFTF-B will utilize quasi-optical transmission techniques. The system consists of ten gyrotron oscillators: two gyrotrons at 28 GHz, two at 35 GHz, and six at 56 GHz. The 28 and 35 GHz gyrotrons both heat the electrons in the end plug (potential peak) while the 56 GHz sources heat the minimum-B anchor region (potential minimum). Microwaves are launched into a pair of cylindrical mirrors that form a pseudo-cavity which directs the microwaves through the plasma numerous times before they are lost out of the cavity. The cavity allows the microwave beam to reach the resonance zone over a wide range of plasma densities and temperatures. The fundamental electron cyclotron resonance moves to higher axial positions as a result of beta-depression of the magnetic field, doppler shifting of the resonance, and relativistic mass corrections for the electrons. With this system the microwave beam will reach the resonance surface at the correct angle of incidence for any density or temperature without active aiming of the antennas. The cavity also allows the beam to make multiple passes through the plasma to increase the heating efficiency at low temperatures and densities when the single pass absorption is low. In addition, neutral beams and diagnostics have an unobstructed view of the plasma

  6. Sparking protection for MFTF-B neutral beam power supplies

    International Nuclear Information System (INIS)

    Cummings, D.B.

    1983-01-01

    This paper describes the upgrade of MFTF-B Neutral Beam Power Supplies for sparking protection. High performance ion sources spark repeatedly so ion source power supplies must be insensitive to sparking. The hot deck houses the series tetrode, arc and filament supplies, and controls. Hot deck shielding has been upgraded and a continuous shield around the arc, filament, gradient grid, and control cables now extends from the hot deck, through the core snubber, to the source. The shield carries accelerating current and connects only to the source. Shielded source cables go through an outer duct which now connects to a ground plane under the hot deck. This hybrid transmission line is a low inductance path for sparks discharging the stray capacitance of the hot deck and isolation transformers, reducing coupling to building steel. Parallel dc current return cables inside the duct lower inductance to reduce inductive turn-off transients. MOVs to ground further limit surges in the remote power supply return. Single point grounding is at the source. No control or rectifier components have been damaged nor are there any known malfunctions due to sparking up to 80 kV output

  7. Practical experience with a local verification system for containment and surveillance sensors

    International Nuclear Information System (INIS)

    Lauppe, W.D.; Richter, B.; Stein, G.

    1984-01-01

    With the growing number of nuclear facilities and a number of large commercial bulk handling facilities steadily coming into operation, the International Atomic Energy Agency is faced with increasing requirements to reduce its inspection effort. One means of meeting these requirements will be to deploy facility-based remote interrogation methods for its containment and surveillance instrumentation. Such a technical concept of remote interrogation was realized through the development of the so-called LOVER system, a local verification system for electronic safeguards seal systems. In the present investigations the application was extended to radiation monitoring by introducing an electronic interface between the electronic safeguards seal and the neutron detector electronics of a waste monitoring system. The paper discusses the safeguards motivation and background, the experimental setup of the safeguards system and the performance characteristics of this LOVER system. First conclusions can be drawn from the performance results with respect to applicability in international safeguards. This comprises, in particular, the definition of design specifications for an integrated remote interrogation system for various types of containment and surveillance instruments and the specification of safeguards applications employing such a system

  8. Optimal placement of excitations and sensors for verification of large dynamical systems

    Science.gov (United States)

    Salama, M.; Rose, T.; Garba, J.

    1987-01-01

    The computationally difficult problem of the optimal placement of excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic algorithm. Results of numerical experiments, including a square plate and a 960-degree-of-freedom Control of Flexible Structure (COFS) truss structure, are presented. Though the algorithm produces suboptimal solutions, its generality and simplicity allow the treatment of complex dynamical systems which would otherwise be difficult to handle.
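
    A toy version of the combinatorial search described above is sketched below: simulated annealing over which degrees of freedom receive sensors, swapping one selected location at a time. The objective (sum of squared modal amplitudes at the chosen locations), the random mode shapes and the cooling schedule are stand-ins, not the paper's formulation.

```python
# Toy simulated-annealing placement sketch with an invented objective function.
import math, random

random.seed(1)
N_DOFS, N_SENSORS = 40, 5
mode_shapes = [[random.random() for _ in range(N_DOFS)] for _ in range(3)]

def objective(selection):
    # Stand-in measure of "observability": squared modal amplitude at sensors.
    return sum(phi[i] ** 2 for phi in mode_shapes for i in selection)

current = set(random.sample(range(N_DOFS), N_SENSORS))
best, best_val, temp = set(current), objective(current), 1.0
for _ in range(2000):
    candidate = set(current)
    candidate.remove(random.choice(list(candidate)))              # drop one DOF
    candidate.add(random.choice([i for i in range(N_DOFS) if i not in candidate]))
    delta = objective(candidate) - objective(current)
    if delta > 0 or random.random() < math.exp(delta / temp):     # Metropolis rule
        current = candidate
        if objective(current) > best_val:
            best, best_val = set(current), objective(current)
    temp *= 0.995                                                 # geometric cooling
print(sorted(best), round(best_val, 3))
```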

  9. Directions for possible upgrades of the Mirror Fusion Test Facility (MFTF)

    International Nuclear Information System (INIS)

    Damm, C.C.; Coensgen, F.H.; Devoto, R.S.; Molvik, A.W.; Porter, G.D.; Shearer, J.W.; Stallard, B.W.

    1977-01-01

    The Mirror Fusion Test Facility (MFTF) may be upgraded by extending the time of plasma sustenance in an approach to steady-state operation and/or by increasing the neutral-beam injection energy. Some parameter bounds for these upgrades are discussed as they relate to a definition of the required neutral-beam development

  10. Confirmatory analysis and detail design of the magnet system for mirror fusion test facility (MFTF)

    International Nuclear Information System (INIS)

    Tatro, R.E.; Baldi, R.W.

    1978-10-01

    This summary covers the six individual reports delivered to the LLL MFTF program staff. They are: (1) literature survey (helium heat transfer), (2) thermodynamic analysis, (3) structural analysis, (4) manufacturing/producibility study, (5) instrumentation plan and (6) quality assurance report

  11. Report on the engineering test of the LBL 30 second neutral beam source for the MFTF-B project

    International Nuclear Information System (INIS)

    Vella, M.C.; Pincosy, P.A.; Hauck, C.A.; Pyle, R.V.

    1984-08-01

    Positive-ion-based neutral beam development in the US has centered on the long-pulse Advanced Positive Ion Source (APIS). APIS eventually focused on development of 30-second sources for MFTF-B. The Engineering Test was part of competitive testing of the LBL and ORNL long pulse sources carried out for the MFTF-B Project. The test consisted of 500 beam shots of 80-kV, 30-second deuterium beams and was carried out on the Neutral Beam Engineering Test Facility (NBETF). This report summarizes the results of LBL testing, in which the LBL APIS demonstrated that it would meet the requirements for MFTF-B 30-second sources. In part as a result of this test, the LBL design was found to be suitable as the baseline for a Common Long Pulse Source design for MFTF-B, TFTR, and Doublet Upgrade

  12. Tropospheric Airborne Meteorological Data Reporting (TAMDAR) Sensor Validation and Verification on National Oceanographic and Atmospheric Administration (NOAA) Lockheed WP-3D Aircraft

    Science.gov (United States)

    Tsoucalas, George; Daniels, Taumi S.; Zysko, Jan; Anderson, Mark V.; Mulally, Daniel J.

    2010-01-01

    As part of the National Aeronautics and Space Administration's Aviation Safety and Security Program, the Tropospheric Airborne Meteorological Data Reporting project (TAMDAR) developed a low-cost sensor for aircraft flying in the lower troposphere. This activity was a joint effort with support from Federal Aviation Administration, National Oceanic and Atmospheric Administration, and industry. This paper reports the TAMDAR sensor performance validation and verification, as flown on board NOAA Lockheed WP-3D aircraft. These flight tests were conducted to assess the performance of the TAMDAR sensor for measurements of temperature, relative humidity, and wind parameters. The ultimate goal was to develop a small low-cost sensor, collect useful meteorological data, downlink the data in near real time, and use the data to improve weather forecasts. The envisioned system will initially be used on regional and package carrier aircraft. The ultimate users of the data are National Centers for Environmental Prediction forecast modelers. Other users include air traffic controllers, flight service stations, and airline weather centers. NASA worked with an industry partner to develop the sensor. Prototype sensors were subjected to numerous tests in ground and flight facilities. As a result of these earlier tests, many design improvements were made to the sensor. The results of tests on a final version of the sensor are the subject of this report. The sensor is capable of measuring temperature, relative humidity, pressure, and icing. It can compute pressure altitude, indicated air speed, true air speed, ice presence, wind speed and direction, and eddy dissipation rate. Summary results from the flight test are presented along with corroborative data from aircraft instruments.

  13. Results of studies performed on the model of the MFTF Supervisory Control and Diagnostics System (SCDS)

    International Nuclear Information System (INIS)

    Wyman, R.H.

    1979-01-01

    The design and implementation of the SCDS is a relatively complex problem involving a nine-computer network coupled with a unique color graphics control console system, 50 local control minicomputers, and the usual array of drives, printers, magnetic tapes, etc. Four million bytes of data are to be collected on each MFTF cycle with a repetition rate of five minutes per shot, and the associated data processing and storing load is a major concern. Crude paper studies were made initially to try to size the various components of the system and various configurations were proposed and analyzed prior to the solicitation for the computer system. However, once the hardware was purchased and a preliminary software design was completed, it became essential and feasible to do an analysis of the system to considerably greater depth in order to identify bottlenecks and other system problems and to verify those parts of the design that met the MFTF requirements

  14. Quench Detection and Magnet Protection Study for MFTF. LLL final review

    International Nuclear Information System (INIS)

    1979-06-01

    The results of a Quench Detection and Magnet Protection Study for MFTF are summarized. The study was directed toward establishing requirements and guidelines for the electronic package used to protect the MFTF superconducting magnets. Two quench detection schemes were analyzed in detail, both of which require a programmable quench detector. Hardware and software recommendations for the quench detector were presented, as well as criteria for dumping the magnet energy in the event of a quench. Overall magnet protection requirements were outlined in a detailed Failure Modes, Effects, and Criticality Analysis (FMECA). Hardware and software packages compatible with the FMECA were recommended, with the hardware consisting of flexible, dedicated intelligent modules specifically designed for magnet protection
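
    The two detection schemes themselves are not reproduced in this record. As a hedged sketch of one common programmable approach (thresholds, names, and sampling interval are assumptions, not the study's recommendations), a quench can be declared when the resistive part of the coil voltage, estimated by subtracting the inductive component, stays above a threshold for a validation time:

      def quench_detector(samples, L_coil, v_thresh=0.1, hold_time=0.05, dt=0.001):
          """samples: sequence of (coil_voltage, coil_current) pairs taken every dt seconds.
          Returns the time (s) at which a quench would be declared, or None."""
          over_since, prev_i = None, None
          for k, (v, i) in enumerate(samples):
              if prev_i is not None:
                  v_resistive = v - L_coil * (i - prev_i) / dt   # remove the inductive part
                  if abs(v_resistive) > v_thresh:
                      over_since = k * dt if over_since is None else over_since
                      if k * dt - over_since >= hold_time:
                          return k * dt                          # sustained resistive voltage
                  else:
                      over_since = None                          # spike only: keep waiting
              prev_i = i
          return None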

  15. Report on the experience with the Supervisory Control and Diagnostics System (SCDS) of MFTF-B

    International Nuclear Information System (INIS)

    Wyman, R.H.

    1983-01-01

    The Supervisory Control and Diagnostics System (SCDS) of MFTF is a multiprocessor computer system using graphics-oriented displays with touch-sensitive panels as the primary operator interface. Late in calendar year 1981, the system was used to control an integrated test of the vacuum vessel, vacuum system, cryogenics system, and the superconducting magnet of MFTF. Since the completion of those tests, and starting in early calendar 1983, the system has been used for control of the neutral beam test facility at LLNL. This paper presents a short overview of SCDS for the purpose of orientation and then proceeds to describe the difficulties encountered in these preliminary encounters with reality. The band-aids used to hold things together as disaster threatened, as well as the long-term solutions to the problems, will be discussed. Finally, we will present some comments on system costs and management philosophy

  16. Design features of the solenoid magnets for the central cell of the MFTF-B

    International Nuclear Information System (INIS)

    Wohlwend, J.W.; Tatro, R.E.; Ring, D.S.

    1981-01-01

    The 14 superconducting solenoid magnets which form the central cell of the MFTF-B are being designed and fabricated by General Dynamics for the Lawrence Livermore National Laboratory. Each solenoid coil has a mean diameter of five meters and contains 600 turns of a proven conductor type. Structural loading resulting from credible fault events, cooldown and warmup requirements, and manufacturing processes consistent with other MFTF-B magnets have been considered in the selection of 304 LN as the structural material for the magnet. The solenoid magnets are connected by 24 intercoil beams and 20 solid struts which resist the longitudinal seismic and electromagnetic attractive forces and by 24 hanger/side supports which react magnet dead weight and seismic loads. A modular arrangement of two solenoid coils within a vacuum vessel segment allows for sequential checkout and installation

  17. Startup experience with the MFTF-B ECRH 100 kV dc power supply

    International Nuclear Information System (INIS)

    Bishop, S.R.; Goodman, R.A.; Wilson, J.H.

    1983-01-01

    One of the 24 Accel dc Power Supplies (ADCPS) originally intended for the Mirror Fusion Test Facility (MFTF-B) Neutral Beam Power Supply (NBPS) System has been converted to provide negative polarity output at 90 kV with a load current of 64 A dc. The load duty cycle is a pulse of 30-second duration with a pulse repetition period of five minutes. A new control system has been built which will serve as a prototype for the MFTF-B ADCPS controls, and a test setup was built which will be used to test the ADCPS. The Electron Cyclotron Resonance Heating (ECRH) dc Power Supply (DCPS) has been tested under both no-load and dummy-load conditions, under remote control, without notable problems. Test results indicate that the power supply should be reliable and safe to operate, and will meet the load duty requirements

  19. Ion trajectories of the MFTF unshielded 80-keV neutral-beam sources

    International Nuclear Information System (INIS)

    Ling, R.C.; Bulmer, R.H.; Cutler, T.A.; Foote, J.H.; Horvath, J.A.

    1978-01-01

    The trajectories of ions from the Mirror Fusion Test Facility (MFTF) 80-keV neutral-beam sources are calculated to obtain a preliminary understanding of the ion-beam paths and the magnitude of the power densities. This information will be needed for locating and designing thermal (kinetic-energy) absorbers for the ions. The calculations are made by employing a number of previously written computer codes. The TIBRO code is used to calculate the trajectories of the ions in the fringe magnetic field of the MFTF machine, which can operate with a center-field intensity of up to 2 T. The SAMPP code gives three-dimensional views of the ion beams for better visualization of the ion-beam paths. Also used are the codes MIG, XPICK, and MERGE, which were all previously written for manipulating data

  20. Machine-assisted verification of latent fingerprints: first results for nondestructive contact-less optical acquisition techniques with a CWL sensor

    Science.gov (United States)

    Hildebrandt, Mario; Kiltz, Stefan; Krapyvskyy, Dmytro; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus

    2011-11-01

    A machine-assisted analysis of traces from crime scenes might be possible with the advent of new high-resolution non-destructive contact-less acquisition techniques for latent fingerprints. This requires reliable techniques for the automatic extraction of fingerprint features from latent and exemplar fingerprints for matching purposes using pattern recognition approaches. Therefore, we evaluate the NIST Biometric Image Software for the feature extraction and verification of contact-lessly acquired latent fingerprints to determine potential error rates. Our exemplary test setup includes 30 latent fingerprints from 5 people in two test sets that are acquired from different surfaces using a chromatic white light sensor. The first test set includes 20 fingerprints on two different surfaces. It is used to determine the feature extraction performance. The second test set includes one latent fingerprint on 10 different surfaces and an exemplar fingerprint to determine the verification performance. This sensing technique does not require physical or chemical visibility enhancement of the fingerprint residue; thus, the original trace remains unaltered for further investigations. No dedicated feature extraction and verification techniques have yet been applied to such data. Hence, we see the need for appropriate algorithms that are suitable to support forensic investigations.
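
    The NIST Biometric Image Software referred to above is a set of command-line programs, so a matching experiment of the kind described can be scripted around it. A minimal sketch, assuming the NBIS mindtct (minutiae extraction) and bozorth3 (matcher) executables are installed and that the images are in a format the local NBIS build accepts; the decision threshold is an assumption, not a value from the paper:

      import os, subprocess, tempfile

      def match_score(probe_image, gallery_image, workdir=None):
          """Minutiae-based comparison of two fingerprint images using the NBIS tools
          mindtct (feature extraction) and bozorth3 (matcher); returns the match score."""
          workdir = workdir or tempfile.mkdtemp()
          xyt_files = []
          for name, image in (("probe", probe_image), ("gallery", gallery_image)):
              root = os.path.join(workdir, name)
              subprocess.run(["mindtct", image, root], check=True)   # writes <root>.xyt etc.
              xyt_files.append(root + ".xyt")
          out = subprocess.run(["bozorth3", xyt_files[0], xyt_files[1]],
                               capture_output=True, text=True, check=True)
          return int(out.stdout.strip())

      # Example decision rule (threshold is an assumption):
      # accepted = match_score("latent.png", "exemplar.png") >= 40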

  1. Estimation of neutral-beam-induced field reversal in MFTF by an approximate scaling law

    International Nuclear Information System (INIS)

    Shearer, J.W.

    1980-01-01

    Scaling rules are derived for field-reversed plasmas whose dimensions are common multiples of the ion gyroradius in the vacuum field. These rules are then applied to the tandem MFTF configuration, and it is shown that field reversal appears to be possible for neutral beam currents of the order of 150 amperes, provided that the electron temperature is at least 500 eV

  2. Computer circuit analysis of induced currents in the MFTF-B magnet system

    International Nuclear Information System (INIS)

    Magnuson, G.D.; Woods, E.L.

    1981-01-01

    An analysis was made of the induced current behavior of the MFTF-B magnet system. Although the magnet system consists of 22 coils, because of its symmetry we considered only 11 coils in the analysis. Various combinations of the coils were dumped either singly or in groups, with the current behavior in all magnets calculated as a function of time after initiation of the dump

  3. Sensor Interoperability and Fusion in Fingerprint Verification: A Case Study using Minutiae-and Ridge-Based Matchers

    NARCIS (Netherlands)

    Alonso-Fernandez, F.; Veldhuis, Raymond N.J.; Bazen, A.M.; Fierrez-Aguilar, J.; Ortega-Garcia, J.

    2006-01-01

    Information fusion in fingerprint recognition has been studied in several papers. However, only a few papers have been focused on sensor interoperability and sensor fusion. In this paper, these two topics are studied using a multisensor database acquired with three different fingerprint sensors.

  4. A Vehicular Mobile Standard Instrument for Field Verification of Traffic Speed Meters Based on Dual-Antenna Doppler Radar Sensor.

    Science.gov (United States)

    Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue

    2018-04-05

    Traffic speed meters are important legal measuring instruments specially used for traffic speed enforcement and must be tested and verified in the field every year using a vehicular mobile standard speed-measuring instrument to ensure speed-measuring performance. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor is rapidly reduced if the number of received satellites is insufficient, which often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation; it has no specified requirements for its mounting distance, no limitation on usage regions, and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a high-accuracy GPS speed sensor showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument.
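
    The record does not give the instrument's actual processing, but the benefit of a fore/aft (Janus-style) dual-antenna arrangement can be sketched geometrically: an unknown installation tilt scales the two Doppler shifts in opposite senses, so their sum yields the speed with the tilt largely cancelled, and their difference estimates the tilt itself. All symbols below belong to this illustrative geometry, not to the paper:

      import math

      def speed_from_dual_doppler(fd_fore, fd_aft, f0, alpha, c=3.0e8):
          """Speed from the Doppler shifts of a fore/aft antenna pair depressed at +/-alpha.
          With an installation tilt delta: fd_fore = (2 v f0 / c) cos(alpha - delta) and
          fd_aft = (2 v f0 / c) cos(alpha + delta), so the sum is (4 v f0 / c) cos(alpha) cos(delta)
          and the ratio difference/sum equals tan(alpha) tan(delta)."""
          s, d = fd_fore + fd_aft, fd_fore - fd_aft
          delta = math.atan2(d, s * math.tan(alpha))        # estimated installation tilt
          speed = s * c / (4.0 * f0 * math.cos(alpha) * math.cos(delta))
          return speed, delta

      # Self-check with a 24.125 GHz radar, 45-degree beams, 30 m/s vehicle, 2-degree tilt:
      v, f0, a, tilt = 30.0, 24.125e9, math.radians(45), math.radians(2)
      fd1 = 2 * v * f0 / 3.0e8 * math.cos(a - tilt)
      fd2 = 2 * v * f0 / 3.0e8 * math.cos(a + tilt)
      print(speed_from_dual_doppler(fd1, fd2, f0, a))       # recovers ~30 m/s and ~0.035 rad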

  5. D-T axicell magnet system for MFTF-α+T

    International Nuclear Information System (INIS)

    Srivastava, V.C.

    1983-01-01

    The configuration and design of the deuterium-tritium (D-T) axicell superconducting magnets for the Mirror Fusion Test Facility (MFTF-α+T) are described. The MFTF-α+T is an upgrade of the MFTF-B, with new end-plug magnets and a neutron-producing central D-T axicell section. The 4-m long axicell - its length defined by the 12-T peaks in the mirror field - is beam fueled and heated by two beam lines, each with four neutral beam injection ports. Two large superconducting coils (mean diameter approx. 3.8 m) located at Z = ±2.40 m, in conjunction with a small copper coil located outside the test volume region, produce the 4.5-T mirror midplane field. This background field is augmented by two copper coils to create the 12-T peak mirror fields at Z = ±2 m. The central region of the axicell accommodates a 1-m-long, replaceable blanket test module. The length (4 m) of the axicell was chosen to provide relatively uniform neutron wall loading over the test module. In many respects, this axicell is less than full scale, but it could be viewed as a short section of a reactor, complete with the support systems and technologies associated with a mirror reactor. The peak field at the superconducting coils is 10.8 T. The coils employ a hybrid superconducting winding - Nb3Sn conductor in the 8- to 12-T region and NbTi in the 0- to 8-T region. The winding is cryostable and is cooled by a 4.2 K liquid helium bath. The conductor design, the winding design, and the performance analyses for these superconducting coils are described

  6. Overcoming urban GPS navigation challenges through the use of MEMS inertial sensors and proper verification of navigation system performance

    Science.gov (United States)

    Vinande, Eric T.

    This research proposes several means to overcome challenges in the urban environment to ground vehicle global positioning system (GPS) receiver navigation performance through the integration of external sensor information. The effects of narrowband radio frequency interference and signal attenuation, both common in the urban environment, are examined with respect to receiver signal tracking processes. Low-cost microelectromechanical systems (MEMS) inertial sensors, suitable for the consumer market, are the focus of receiver augmentation as they provide an independent measure of motion and are independent of vehicle systems. A method for estimating the mounting angles of an inertial sensor cluster utilizing typical urban driving maneuvers is developed and is able to provide angular measurements within two degrees of truth. The integration of GPS and MEMS inertial sensors is developed utilizing a full state navigation filter. Appropriate statistical methods are developed to evaluate the urban environment navigation improvement due to the addition of MEMS inertial sensors. A receiver evaluation metric that combines accuracy, availability, and maximum error measurements is presented and evaluated over several drive tests. Following a description of proper drive test techniques, record and playback systems are evaluated as the optimal way of testing multiple receivers and/or integrated navigation systems in the urban environment as they simplify vehicle testing requirements.
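
    The mounting-angle estimation mentioned above is not detailed in this record; a generic version of the idea uses gravity at rest to fix the vehicle down axis and a straight-line acceleration to fix the forward axis. The sketch below is illustrative only (axis conventions and names are assumptions, not the dissertation's method); comparing its output against a surveyed reference would give an accuracy figure of the kind quoted:

      import numpy as np

      def mounting_axes(f_static, f_drive):
          """Estimate the vehicle forward/right/down axes in IMU coordinates.
          f_static: Nx3 accelerometer samples at rest (specific force = -gravity);
          f_drive:  Mx3 samples during a straight-line forward acceleration.
          Returns a 3x3 matrix whose columns are the forward, right, and down axes."""
          down = -np.mean(f_static, axis=0)
          down /= np.linalg.norm(down)                   # gravity defines vehicle "down"
          fwd = np.mean(f_drive, axis=0) - np.mean(f_static, axis=0)
          fwd -= np.dot(fwd, down) * down                # keep only the horizontal part
          fwd /= np.linalg.norm(fwd)                     # straight acceleration defines "forward"
          return np.column_stack((fwd, np.cross(down, fwd), down))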

  7. The local area network for the plasma Diagnostics System of MFTF-B

    International Nuclear Information System (INIS)

    Lau, N.H.; Minor, E.G.

    1983-01-01

    The MFTF-B Plasma Diagnostics System will be implemented in stages, beginning with a start-up set of diagnostics and evolving toward a basic set. The start-up set contains 12 diagnostics which will acquire a total of about 800 Kbytes of data per machine pulse; the basic set contains 23 diagnostics which will acquire a total of about 8 Mbytes of data per pulse. Each diagnostic is controlled by a "Foundation System" consisting of a DEC LSI-11/23 microcomputer connected to CAMAC via a 5 Mbits/second serial fiber-optic link and connected to a supervisory computer (Perkin-Elmer 3250) via a 9600 baud RS232 link. The Foundation System is a building block used throughout MFTF-B for control and status monitoring. However, its 9600 baud link to the supervisor presents a bottleneck for the large data transfers required by diagnostics. To overcome this bottleneck the diagnostics Foundation Systems will be connected together with an additional LSI-11/23 called the "master" to form a Local Area Network (LAN) for data acquisition. The Diagnostics LAN has a ring architecture with token-passing arbitration
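
    The bottleneck is easy to quantify from the figures quoted above: at 9600 baud (roughly 960 bytes/s, assuming 10 bits per character), the per-diagnostic share of a basic-set pulse would take on the order of the machine's five-minute shot cycle to move over the RS232 link, which is why a separate LAN is needed. A rough check:

      # Rough transfer-time check for the 9600-baud RS232 links (10 bits per character assumed)
      link_bytes_per_s = 9600 / 10
      for label, total_bytes, n_diag in [("start-up set", 800e3, 12), ("basic set", 8e6, 23)]:
          per_link = total_bytes / n_diag          # average share of one diagnostic
          print(label, round(per_link / link_bytes_per_s / 60, 1), "minutes per diagnostic per pulse")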

  8. Overview of the data acquisition and control system for plasma diagnostics on MFTF-B

    International Nuclear Information System (INIS)

    Wyman, R.H.; Deadrick, F.J.; Lau, N.H.; Nelson, B.C.; Preckshot, G.G.; Throop, A.L.

    1983-01-01

    For MFTF-B, the plasma diagnostics system is expected to grow from a collection of 12 types of diagnostic instruments, initially producing about 1 Megabyte of data per shot, to an expanded set of 22 diagnostics producing about 8 Megabytes of data per shot. To control these diagnostics and acquire and process the data, a system design has been developed which uses an architecture similar to the supervisory/local-control computer system which is used to control other MFTF-B subsystems. This paper presents an overview of the hardware and software that will control and acquire data from the plasma diagnostics system. Data flow paths from the instruments, through processing, and into final archived storage will be described. A discussion of anticipated data rates, including anticipated software overhead at various points of the system, is included, along with the identification of possible bottlenecks. A methodology for processing of the data is described, along with the approach to handle the planned growth in the diagnostic system. Motivations are presented for various design choices which have been made

  9. A user interface on networked workstations for MFTF-B plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Balch, T.R.; Renbarger, V.L.

    1986-01-01

    A network of Sun-2/170 workstations is used to provide an interface to the MFTF-B Plasma Diagnostics System at Lawrence Livermore National Laboratory. The Plasma Diagnostics System (PDS) is responsible for control of MFTF-B plasma diagnostic instrumentation. An EtherNet Local Area Network links the workstations to a central multiprocessing system which furnishes data processing, data storage and control services for PDS. These workstations permit a physicist to command data acquisition, data processing, instrument control, and display of results. The interface is implemented as a metaphorical desktop, which helps the operator form a mental model of how the system works. As on a real desktop, functions are provided by sheets of paper (windows on a CRT screen) called worksheets. The worksheets may be invoked by pop-up menus and may be manipulated with a mouse. These worksheets are actually tasks that communicate with other tasks running in the central computer system. By making entries in the appropriate worksheet, a physicist may specify data acquisition or processing, control a diagnostic, or view a result

  10. Use of spreadsheets for interactive control of MFTF-B plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Goldner, A.L.; Kobayashi, A.

    1986-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory has a variety of highly individualized plasma diagnostic instruments attached to the experiment. These instruments are controlled through graphics workstations networked to a central computer system. A distributed spreadsheet-like program runs in both the graphics workstations and in the central computer system. An interface very similar to a commercial spreadsheet program is presented to the user at a workstation. In a commercial spreadsheet program, the user may attach mathematical calculation functions to spreadsheet cells. At MFTF-B, hardware control functions, hardware monitoring functions, and communications functions, as well as mathematical functions, may be attached to cells. Both the user and feedback from instrument hardware may make entries in spreadsheet cells; any entry in a spreadsheet cell may cause reevaluation of the cell's associated functions. The spreadsheet approach makes the addition of a new instrument a matter of designing one or more spreadsheet tables with associated meta-language-defined control and communication function strings. This paper describes the details of the spreadsheets and the implementation experience
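
    The cell-with-attached-functions idea described above can be pictured with a minimal sketch; it is illustrative only, and the class and function names are assumptions rather than the MFTF-B meta-language:

      class Cell:
          """Minimal spreadsheet-style cell: setting a value re-evaluates attached functions."""
          def __init__(self, name):
              self.name, self.value, self.functions = name, None, []

          def attach(self, fn):
              self.functions.append(fn)           # math, hardware-control, or comms function

          def set(self, value, source="operator"):
              self.value = value
              for fn in self.functions:           # any entry causes re-evaluation
                  fn(self, source)

      # Example: a cell that clamps a gain request and "sends" it to an instrument.
      def send_gain(cell, source):
          gain = max(0, min(100, cell.value))
          print(f"[{source}] {cell.name} -> set instrument gain {gain}")

      gain_cell = Cell("detector_gain")
      gain_cell.attach(send_gain)
      gain_cell.set(150)                          # operator entry
      gain_cell.set(42, source="hardware")        # feedback from the instrument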

  11. Local area network for the plasma diagnostics system of MFTF-B

    International Nuclear Information System (INIS)

    Lau, N.H.; Minor, E.G.

    1983-01-01

    The MFTF-B Plasma Diagnostics System will be implemented in stages, beginning with a start-up set of diagnostics and evolving toward a basic set. The start-up set contains 12 diagnostics which will acquire a total of about 800 Kbytes of data per machine pulse; the basic set contains 23 diagnostics which will acquire a total of about 8 Mbytes of data per pulse. Each diagnostic is controlled by a Foundation System consisting of a DEC LSI-11/23 microcomputer connected to CAMAC via a 5 Mbits/second serial fiber-optic link and connected to a supervisory computer (Perkin-Elmer 3250) via a 9600 baud RS232 link. The Foundation System is a building block used throughout MFTF-B for control and status monitoring. However, its 9600 baud link to the supervisor presents a bottleneck for the large data transfers required by diagnostics. To overcome this bottleneck the diagnostics Foundation Systems will be connected together with an additional LSI-11/23 called the master to form a Local Area Network (LAN) for data acquisition

  12. Design features of the A-cell and transition coils of MFTF-B

    International Nuclear Information System (INIS)

    Tatro, R.E.; Wohlwend, J.W.; Ring, D.S.

    1981-01-01

    The MFTF-B transition coil and A-cell magnet designs use variations of the copper-stabilized NbTi conductor developed by LLNL for the MFTF Yin-Yang magnets. This conductor will be wound on the one inch thick (12.7 mm) stainless steel coil forms using a two-axis winding machine similar to the existing LLNL Yin-Yang winding machine. After winding, covers will be placed over the coil and welded to the coil form to form a helium-tight jacket around the conductor. These jacketed coils are then enclosed in thick structural cases that react the large Lorentz forces on the magnets. The space between the coil jacket and case will be filled by a stainless steel bladder that will be injected with urethane. The injection bladder will provide cooling passages during cooldown as well as transmitting the Lorentz forces between the jacket and the case. The large self-equilibrating lobe-spreading forces on the magnets (29.10 6 lb, 127.0 MN) for the A-cell are reacted primarily through the thick 304 LN case into the external superstructure. The net Lorentz forces and the inertial forces on the magnet are reacted through support systems into the LLNL vacuum vessel structure

  13. Design of a magnetic field alignment diagnostic for the MFTF-B magnet system

    International Nuclear Information System (INIS)

    Deadrick, F.J.; House, P.A.; Frye, R.W.

    1985-01-01

    Magnet alignment in tandem mirror fusion machines plays a crucial role in achieving and maintaining plasma confinement. Various visual alignment tools have been described by Post et al. to align the Tara magnet system. We have designed and installed a remotely operated magnetic field alignment (MFA) diagnostic system as a part of the Mirror Fusion Test Facility (MFTF-B). It measures critical magnetic field alignment parameters of the MFTF-B coil set while under full-field operating conditions. The MFA diagnostic employs a pair of low-energy electron-beam guns on a remotely positionable probe to trace and map selected magnetic field lines. An array of precision electrical detector paddles locates the position of the electron beam, and thus the magnetic field line, at several critical points. The measurements provide a means to compute proper compensating currents to correct for mechanical misalignments of the magnets with auxiliary trim coils, if necessary. This paper describes both the mechanical and electrical design of the MFA diagnostic hardware
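
    Numerically, the field-line mapping that the electron beam performs physically corresponds to integrating dx/ds = B(x)/|B(x)| from the gun position to the detector plane. The sketch below uses a made-up field function and a simple Euler step; it is illustrative, not the diagnostic's reduction code:

      import numpy as np

      def trace_field_line(b_field, x0, ds=0.01, z_stop=5.0, max_steps=100000):
          """Follow a magnetic field line from x0 (3-vector) by stepping along the unit
          field direction until the z coordinate reaches z_stop (the detector plane)."""
          x = np.array(x0, dtype=float)
          for _ in range(max_steps):
              b = np.asarray(b_field(x), dtype=float)
              x = x + ds * b / np.linalg.norm(b)        # Euler step along the unit field vector
              if x[2] >= z_stop:
                  return x                              # where the electron beam would land
          raise RuntimeError("field line did not reach the detector plane")

      # Example with a purely illustrative field: uniform Bz plus a small transverse error component.
      b_err = lambda x: np.array([0.002, 0.0, 1.0])     # tesla (only the direction matters here)
      print(trace_field_line(b_err, [0.1, 0.0, 0.0]))   # lands displaced in x by about 0.01 m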

  14. Plasma modeling of MFTF-B and the sensitivity to vacuum conditions

    International Nuclear Information System (INIS)

    Porter, G.D.; Rensink, M.

    1984-01-01

    The Mirror Fusion Test Facility (MFTF-B) is a large tandem mirror device currently under construction at Lawrence Livermore National Laboratory. The completed facility will consist of a large variety of components. Specifically, the vacuum vessel that houses the magnetic coils is basically a cylindrical vessel 60 m long and 11 m in diameter. The magnetics system consists of some 28 superconducting coils, each of which is located within the main vacuum vessel. Twenty of these coils are relatively simple solenoidal coils, but the remaining eight are of a more complicated design to provide an octupole component to certain regions of the magnetic field. The vacuum system is composed of a rough vacuum chain, used to evacuate the vessel from atmospheric pressure, and a high vacuum system, used to maintain good vacuum conditions during a plasma shot. High vacuum pumping is accomplished primarily by cryogenic panels cooled to 4.5 K. The MFTF-B coil set is shown together with typical axial profiles of magnetic field (a), electrostatic potential (b), and plasma density (c). The plasma is divided into nine regions axially, as labelled on the coil set in Figure 1. The central cell, which is completely azimuthally symmetric, contains a large volume plasma that is confined by a combination of the magnetic fields and the electrostatic potentials in the yin-yang cell

  15. Use of spreadsheets for interactive control of MFTF-B plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Goldner, A.; Kobayashi, A.

    1985-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory has a variety of highly individualized plasma diagnostic instruments attached to the experiment. These instruments are controlled through graphics workstations networked to a central computer system. A distributed spreadsheet-like program runs in both the graphics workstations and in the central computer system. An interface very similar to a commercial spreadsheet program is presented to the user at a workstation. In a commercial spreadsheet program, the user may attach mathematical calculation functions to spreadsheet cells. At MFTF-B, hardware control functions, hardware monitoring functions, and communications functions, as well as mathematical functions, may be attached to cells. Both the user and feedback from instrument hardware may make entries in spreadsheet cells; any entry in a spreadsheet cell may cause reevaluation of the cell's associated functions. The spreadsheet approach makes the addition of a new instrument a matter of designing one or more spreadsheet tables with associated meta-language-defined control and communication function strings. We report here details of our spreadsheets and our implementation experience

  16. Mirror Fusion Test Facility-B (MFTF-B) axicell configuration: NbTi magnet system. Manufacturing/producibility final report. Volume 2

    International Nuclear Information System (INIS)

    Ritschel, A.J.; White, W.L.

    1985-05-01

    This Final MFTF-B Manufacturing/Producibility Report covers facilities, tooling plan, manufacturing sequence, schedule and performance, producibility, and lessons learned for the solenoid, axicell, and transition coils, as well as a deactivation plan, conclusions, references, and appendices

  17. Drift orbits in the TMX and MFTF-B tandem mirrors

    International Nuclear Information System (INIS)

    Byers, J.A.

    1982-01-01

    Drift orbits for the TMX and MFTF-B tandem-mirror designs are followed by using a long-thin expansion of the drift equations. Unexpected asymmetries in the field-line curvatures in the yin-yang end-mirror traps, caused by the transition coils between the solenoid and the yin-yang, result in an elliptical distortion of the drift surface with a/b=1.5 at most, a perhaps tolerable deviation from omnigenity. Yushmanov-trapped particles are no worse than the bulk hot particles. Finite-beta plasma fields, coupled to the asymmetric curvature, produce sizeable banana orbits with widths comparable to the plasma radius, but these orbits are possible for only a few of the particles. Details of the transition through resonance in the solenoid are shown, including the banana shapes of the drift surfaces and the disruption of the surface in the stochastic regime. The orbits in the original design for the A-cell of MFTF-B are the most extreme; in the vacuum fields they all have an extended peanut shape that finally closes only at about 3 m. This shape is strongly non-omnigenous and suggests a hollow plasma-density profile. Finite-beta B×∇B drifts can help to minimize the radial extent of these orbits, but the strength of the vacuum curvatures makes omnigenity only marginally possible. Including B×∇φ drifts makes omnigenity even more unlikely for the ions, for which the B×∇B and B×∇φ drifts are of opposite sign, and conversely helps to omnigenize the drift surfaces of the ECRH 200-keV electrons. It is argued that not every class of particles can have good, i.e. near-omnigenous drifts, regardless of the ability of φ(r) to adjust to limit the radial extent of the orbits. This lack of omnigenity leaves one with no theoretical base for describing the MHD equilibrium in the original designs, but a new magnetic field design for the MFTF-B A-cell has apparently completely restored omnigenous orbits. (author)

  18. Integrated operations plan for the MFTF-B Mirror Fusion Test Facility. Volume I. Organization plan

    International Nuclear Information System (INIS)

    1981-12-01

    This plan and the accompanying MFTF-B Integrated Operations Plan are submitted in response to UC/LLNL Purchase Order 3883801, dated July 1981. The organization plan also addresses the specific tasks and trade studies directed by the scope of work. The Integrated Operations Plan, which includes a reliability, quality assurance, and safety plan and an integrated logistics plan, comprises the burden of the report. In the first section of this volume, certain underlying assumptions and observations are discussed setting the requirements and limits for organization. Section B presents the recommended structure itself. Section C Device Availability vs Maintenance and Support Efforts and Section D Staffing Levels and Skills provide backup detail and justification. Section E is a trade study on maintenance and support by LLNL staff vs subcontract and Section F is a plan for transitioning from the construction phase into operation. A brief summary of schedules and estimated costs concludes the volume

  19. Currents and voltages in the MFTF coils during the formation of a normal zone

    International Nuclear Information System (INIS)

    Owen, E.W.

    1980-08-01

    Expressions are obtained for the currents and voltages in a pair of inductively coupled superconducting coils under two conditions: formation of a normal zone and during a change in the level of the current in one coil. A dump resistor of low resistance and a detector bridge are connected across each coil. Calculated results are given for the MFTF coils. The circuit equations during formation of a normal zone are nonlinear and time-varying; consequently, only a series solution is possible. The conditions during a change in current are more easily found. After the transient has died away, the voltages in the coil associated with the changing source are all self-inductive, while the voltages in the other coil are all mutually inductive
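
    The circuit problem can also be explored numerically. A hedged sketch, with placeholder parameters rather than MFTF coil data: two coupled coils, each discharging through a dump resistance, with a growing normal-zone resistance r(t) in coil 1, integrated from L1 di1/dt + M di2/dt = -(R1 + r(t)) i1 and M di1/dt + L2 di2/dt = -R2 i2:

      import numpy as np

      def coupled_dump(L1, L2, M, R1, R2, r_of_t, i0, t_end=5.0, dt=1e-4):
          """Integrate two coupled coils with dump resistors; a normal zone r_of_t(t)
          grows in coil 1.  Returns time, i1, and i2 arrays."""
          Lmat = np.array([[L1, M], [M, L2]])
          t = np.arange(0.0, t_end, dt)
          i = np.zeros((len(t), 2))
          i[0] = i0
          for k in range(1, len(t)):
              i1, i2 = i[k - 1]
              v = -np.array([(R1 + r_of_t(t[k - 1])) * i1, R2 * i2])   # resistive voltage drops
              i[k] = i[k - 1] + dt * np.linalg.solve(Lmat, v)
          return t, i[:, 0], i[:, 1]

      # Placeholder values: 1 H coils, 0.3 H mutual, 0.1-ohm dumps, linearly growing normal zone.
      t, i1, i2 = coupled_dump(1.0, 1.0, 0.3, 0.1, 0.1, lambda t: 0.05 * t, [1000.0, 1000.0])
      print(round(i1[-1], 1), round(i2[-1], 1))   # coil 1 decays faster; coil 2 is perturbed via M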

  20. Noise filtering algorithm for the MFTF-B computer based control system

    International Nuclear Information System (INIS)

    Minor, E.G.

    1983-01-01

    An algorithm to reduce the message traffic in the MFTF-B computer based control system is described. The algorithm filters analog inputs to the control system. Its purpose is to distinguish between changes in the inputs due to noise and changes due to significant variations in the quantity being monitored. Noise is rejected while significant changes are reported to the control system data base, thus keeping the data base updated with a minimum number of messages. The algorithm is memory efficient, requiring only four bytes of storage per analog channel, and computationally simple, requiring only subtraction and comparison. Quantitative analysis of the algorithm is presented for the case of additive Gaussian noise. It is shown that the algorithm is stable and tends toward the mean value of the monitored variable over a wide variety of additive noise distributions
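
    The algorithm itself is not listed in this record, but its stated properties (four bytes of state per channel, only subtraction and comparison) match a simple reporting deadband: remember the last reported value and forward a new one only when the input differs from it by more than a threshold. The sketch below is a reconstruction in that spirit, not the MFTF-B code:

      class DeadbandFilter:
          """Per-channel reporting filter: only changes larger than `threshold` counts are
          forwarded to the data base.  State is one stored sample per channel (four bytes
          per channel if kept as a 32-bit value, as in the paper)."""
          def __init__(self, n_channels, threshold):
              self.threshold = threshold
              self.last_reported = [None] * n_channels

          def update(self, channel, raw_value):
              last = self.last_reported[channel]
              if last is None or abs(raw_value - last) > self.threshold:   # subtract and compare
                  self.last_reported[channel] = raw_value
                  return raw_value                  # significant change: report it
              return None                           # treated as noise: message suppressed

      f = DeadbandFilter(n_channels=8, threshold=3)
      print([f.update(0, v) for v in (100, 101, 99, 104, 103, 110)])  # [100, None, None, 104, None, 110]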

  1. Design lessons from using programmable controllers in the MFTF-B personnel safety and interlocks system

    International Nuclear Information System (INIS)

    Branum, J.D.

    1983-01-01

    Applying programmable controllers in critical applications such as personnel safety and interlocks systems requires special considerations in the design of both hardware and software. All modern programmable controller systems feature extensive internal diagnostic capabilities to protect against problems such as program memory errors; however most, if not all present designs lack an intrinsic capability for detecting and countering failures on the field-side of their I/O modules. Many of the most common styles of I/O modules can also introduce potentially dangerous sneak circuits, even without component failure. This paper presents the most significant lessons learned to date in the design of the MFTF-B Personnel Safety and Interlocks System, which utilizes two non-redundant programmable controllers with over 800 I/O points each. Specific problems recognized during the design process as well as those discovered during initial testing and operation are discussed along with their specific solutions in hardware and software

  2. Design and prototype results of a far-infrared interferometer for MFTF-B

    International Nuclear Information System (INIS)

    Monjes, J.A.; Throop, A.L.; Thomas, S.R.; Peebles, A.; Zu, Qin-Zin.

    1983-01-01

    A Far-Infrared (FIR) Laser Interferometer (FLI), operating at 185 μm wavelength is planned as part of the initial start-up set of plasma diagnostics for the Mirror Fusion Test Facility (MFTF-B). The FLI will consist of a heterodyne, three-chord laser interferometer which will be used initially to measure line-integrated plasma density in the high-density, center cell region of the machine. The conceptual system design and analysis has been completed. There are several unique environmental/physical constraints and performance requirements for this system which have required that technology-evaluation and prototyping experiments be completed to support the design effort and confirm the expected performance parameters. Issues which have been addressed include extensive use of long-path dielectric waveguide, coupling and control of free-space propagation of the beam, and polarization control. The results and conclusions of the design analysis and experimental measurements will be presented
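
    For orientation, the quantity such an interferometer measures follows from the standard plasma-interferometry relation, phase shift = r_e * lambda * (line-integrated electron density), with r_e the classical electron radius; the relation holds when the probing frequency is well above the plasma frequency. The numbers below are illustrative, not MFTF-B design values:

      R_E = 2.8179403e-15        # classical electron radius, m
      LAMBDA = 185e-6            # FIR laser wavelength quoted in the abstract, m

      def phase_shift(line_density):
          """Interferometer phase shift (rad) for a line-integrated density in m^-2."""
          return R_E * LAMBDA * line_density

      def line_density(phase_rad):
          """Invert the measurement: line-integrated density (m^-2) from phase (rad)."""
          return phase_rad / (R_E * LAMBDA)

      print(round(phase_shift(1e19), 1), "rad for 1e19 m^-2 along the chord")   # about 5.2 rad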

  3. 12-T solenoid-design options for the MFTF-B Upgrade

    International Nuclear Information System (INIS)

    Schultz, J.H.; Diatchenko, N.

    1983-01-01

    The major options for the 12 T magnets examined here are the selection of normal, superconducting, or hybrid normal/superconducting magnet systems. The tradeoffs are between the higher initial cost of a superconducting magnet system, the need for thick shielding of superconducting magnets, the higher recirculating power of normal magnets, and the poorly characterized reliability of lightly shielded normal magnets. The size and shielding tradeoffs among these options are illustrated. The design concepts presented here are evaluated only for the first design iteration of MFTF-B + T, mentioned above. In particular, all concepts now being considered have made topological improvements in the center cell, so that neutral beam power is no longer a strong function of choke coil size. This dependence strongly favored the use of normal magnets over superconducting magnets, and its absence will be discussed qualitatively in the cost comparisons

  4. Plasma potential formation and measurement in TMX-U and MFTF-B

    International Nuclear Information System (INIS)

    Grubb, D.P.

    1984-01-01

    Tandem mirrors control the axial variation of the plasma potential to create electrostatic plugs that improve the axial confinement of central cell ions and, in a thermal barrier tandem mirror, control the electron axial heat flow. Measurements of the spatial and temporal variations of the plasma potential are, therefore, important to the understanding of confinement in a tandem mirror. In this paper we discuss potential formation in a thermal barrier tandem mirror and examine the diagnostics and data obtained on the TMX-U device, including measurements of the thermal barrier potential profile using a diagnostic neutral beam and charged particle energy-spectroscopy. We then describe the heavy ion beam probe and other new plasma potential diagnostics that are under development for TMX-U and MFTF-B and examine problem areas where additional diagnostic development is desirable

  5. Structural analysis of the magnet system for Mirror Fusion Test Facility (MFTF). Addendum I

    International Nuclear Information System (INIS)

    Loss, K.R.; Wohlwend, J.W.

    1979-09-01

    The stress analysis refinement of the MFTF magnet system using GDSAP (General Dynamics Structural Analysis Program) and NASTRAN finite element computer models has been completed. The objective of this analysis was to calculate a more refined case and jacket stress distribution. The GDSAP model was refined in the minor radius area to yield a more detailed prediction of the stress distributions in critical areas identified by the previous analysis. Modifications in the case plate thickness (from 3.0 inches to 3.2 inches) and in the conductor pack load distribution and stiffness were included. The GDSAP model was converted to an identical NASTRAN model to determine the influence on stress results using higher order elements

  6. Sensors

    CERN Document Server

    Pigorsch, Enrico

    1997-01-01

    This is the 5th edition of the Metra Martech Directory "EUROPEAN CENTRES OF EXPERTISE - SENSORS." The entries represent a survey of European sensors development. The new edition contains 425 detailed profiles of companies and research institutions in 22 countries. This is reflected in the diversity of sensors development programmes described, from sensors for physical parameters to biosensors and intelligent sensor systems. We do not claim that all European organisations developing sensors are included, but this is a good cross section from an invited list of participants. If you see gaps or omissions, or would like your organisation to be included, please send details. The data base invites the formation of effective joint ventures by identifying and providing access to specific areas in which organisations offer collaboration. This issue is recognised to be of great importance and most entrants include details of collaboration offered and sought. We hope the directory on Sensors will help you to find the ri...

  7. Sensors

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, H. [PBI-Dansensor A/S (Denmark)]; Toft Soerensen, O. [Risoe National Lab., Materials Research Dept. (Denmark)]

    1999-10-01

    A new type of ceramic oxygen sensor based on semiconducting oxides was developed in this project. The advantage of these sensors compared to standard ZrO2 sensors is that they do not require a reference gas and that they can be produced in small sizes. The sensor design and the techniques developed for production of these sensors are judged suitable by the participating industry for niche production of a new generation of oxygen sensors. Materials research on new oxygen-ion conductors, both for applications in oxygen sensors and in fuel cells, was also performed in this project, and finally a new process was developed for fabrication of ceramic tubes by dip-coating. (EHS)

  8. Design and fabrication of the superconducting-magnet system for the Mirror Fusion Test Facility (MFTF-B)

    International Nuclear Information System (INIS)

    Tatro, R.E.; Wohlwend, J.W.; Kozman, T.A.

    1982-01-01

    The superconducting magnet system for the Mirror Fusion Test Facility (MFTF-B) consists of 24 magnets; i.e. two pairs of C-shaped Yin-Yang coils, four C-shaped transition coils, four solenoidal axicell coils, and a 12-solenoid central cell. General Dynamics Convair Division has designed all the coils and is responsible for fabricating 20 coils. The two Yin-Yang pairs (four coils) are being fabricated by the Lawrence Livermore National Laboratory. Since MFTF-B is not a magnet development program, but rather a major physics experiment critical to the mirror fusion program, the basic philosophy has been to use proven materials and analytical techniques wherever possible. The transition and axicell coils are currently being analyzed and designed, while fabrication is under way on the solenoid magnets

  9. Ion cyclotron resonance heating (ICRH) start-up antenna for the mirror fusion test facility (MFTF-B)

    International Nuclear Information System (INIS)

    McCarville, T.M.; Romesser, T.E.

    1985-01-01

    The purpose of the ICRH start-up antenna on MFTF-B is to heat the plasma and control the ion distribution as the density increases during start-up. The antenna, consisting of two center-fed, half-turn loops phased 180° apart, has been designed for 1 MW of input power, with a goal of coupling 400 kW into the ions. To vary the heating frequency relative to the local ion cyclotron frequency, the antenna is tunable over a range from 7.5 to 12.5 MHz. The thermal requirements common to low duty cycle ICRH antennas are especially severe for the MFTF-B antenna. The stress requirements are also unique, deriving from the possibility of seismic activity or J×B forces if the magnets unexpectedly quench. Considerable attention has been paid to contact control at high-current bolt-up joints, and arranging geometries so as to minimize the possibility of voltage breakdown

  10. A computer model of the MFTF-B neutral beam accel dc power supply

    International Nuclear Information System (INIS)

    Wilson, J.H.

    1983-01-01

    Using the SCEPTRE circuit modeling code, a computer model was developed for the MFTF Neutral Beam Power Supply System (NBPSS) Accel DC Power Supply (ADCPS). The ADCPS provides 90 kV, 88 A, to the Accel Modulator. Because of the complex behavior of the power supply, use of the computer model is necessary to adequately understand the power supply's behavior over a wide range of load conditions and faults. The model developed includes all the circuit components and parameters, and some of the stray values. The model has been well validated for transients with times on the order of milliseconds, and with one exception, for steady-state operation. When using a circuit modeling code for a system with a wide range of time constants, it can become impossible to obtain good solutions for all time ranges at once. The present model concentrates on the millisecond-range transients because the compensating capacitor bank tends to isolate the power supply from the load for faster transients. Attempts to include stray circuit elements with time constants in the microsecond and shorter range have had little success because of huge increases in computing time that result. The model has been successfully extended to include the accel modulator

  11. Safety procedures for the MFTF sustaining-neutral-beam power supply

    International Nuclear Information System (INIS)

    Wilson, J.H.

    1981-01-01

    The MFTF SNBPSS comprises a number of sources of potentially hazardous electrical energy in a small physical area. Power is handled at 80 kV dc, 80 A; 70 V dc, 4000 A; 25 V dc, 5500 A; 3 kV dc, 10 A; and 2 kV dc, 10 A. Power for these systems is furnished from two separate 480 V distribution systems and a 13.8 kV distribution system. A defense in depth approach is used; interlocks are provided in the hardware to make it difficult to gain access to an energized circuit, and the operating procedure includes precautions which would protect personnel even if no interlocks were working. The complexity of the system implies a complex operating procedure, and this potential complexity is controlled by presenting the procedure in a modular form using 37 separate checklists for specific operations. The checklists are presented in flowchart form, so contingencies can be handled at the lowest possible level without compromising safety

  13. Integrated operations plan for the MFTF-B Mirror Fusion Test Facility. Volume II. Integrated operations plan

    Energy Technology Data Exchange (ETDEWEB)

    1981-12-01

    This document defines an integrated plan for the operation of the Lawrence Livermore National Laboratory (LLNL) Mirror Fusion Test Facility (MFTF-B). The plan fulfills and further delineates LLNL policies and provides for accomplishing the functions required by the program. This plan specifies the management, operations, maintenance, and engineering support responsibilities. It covers phasing into sustained operations as well as the sustained operations themselves. Administrative and Plant Engineering support, which are now being performed satisfactorily, are not part of this plan unless there are unique needs.

  14. Application of structural mechanics methods to the design of large tandem mirror fusion devices (MFTF-B)

    International Nuclear Information System (INIS)

    Karpenko, V.N.; Ng, D.S.

    1985-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory requires state-of-the-art structural-mechanics methods to deal with access constraints for plasma heating and diagnostics, alignment requirements, and load complexity and variety. Large interactive structures required an integrated analytical approach to achieve a reasonable level of overall system optimization. The Tandem Magnet Generator (TMG) creates a magnet configuration for the EFFI calculation of electromagnetic-field forces that, coupled with other loads, form the input loading to magnetic and vessel finite-element models. The analytical results provide the data base for detailed design of magnet, vessel, foundation, and interaction effects. (orig.)

  15. Integrated operations plan for the MFTF-B Mirror Fusion Test Facility. Volume II. Integrated operations plan

    International Nuclear Information System (INIS)

    1981-12-01

    This document defines an integrated plan for the operation of the Lawrence Livermore National Laboratory (LLNL) Mirror Fusion Test Facility (MFTF-B). The plan fulfills and further delineates LLNL policies and provides for accomplishing the functions required by the program. This plan specifies the management, operations, maintenance, and engineering support responsibilities. It covers phasing into sustained operations as well as the sustained operations themselves. Administrative and Plant Engineering support, which are now being performed satisfactorily, are not part of this plan unless there are unique needs

  16. Sensor

    OpenAIRE

    Gleeson, Helen; Dierking, Ingo; Grieve, Bruce; Woodyatt, Christopher; Brimicombe, Paul

    2015-01-01

    An electrical temperature sensor (10) comprises a liquid crystalline material (12). First and second electrically conductive contacts (14), (16), having a spaced relationship there between, contact the liquid crystalline material (12). An electric property measuring device is electrically connected to the first and second contacts (14), (16) and is arranged to measure an electric property of the liquid crystalline material (12). The liquid crystalline material (12) has a transition temperatur...

  17. Static and dynamic analyses on the MFTF [Mirror Fusion Test Facility]-B Axicell Vacuum Vessel System: Final report

    International Nuclear Information System (INIS)

    Ng, D.S.

    1986-09-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory (LLNL) is a large-scale, tandem-mirror-fusion experiment. MFTF-B comprises many highly interconnected systems, including a magnet array and a vacuum vessel. The vessel, which houses the magnet array, is supported by reinforced concrete piers and steel frames resting on an array of foundations and surrounded by a 7-ft-thick concrete shielding vault. The Pittsburgh-Des Moines (PDM) Corporation, which was awarded the contract to design and construct the vessel, carried out fixed-base static and dynamic analyses of a finite-element model of the axicell vessel and magnet systems, including the simulation of various loading conditions and three postulated earthquake excitations. Meanwhile, LLNL monitored PDM's analyses with modeling studies of its own, and independently evaluated the structural responses of the vessel in order to define design criteria for the interface members and other project equipment. The assumptions underlying the finite-element model and the behavior of the axicell vessel are described in detail in this report, with particular emphasis placed on comparing the LLNL and PDM studies and on analyzing the fixed-base behavior with the soil-structure interaction, which occurs between the vessel and the massive concrete vault wall during a postulated seismic event. The structural members that proved sensitive to the soil effect are also reevaluated

  18. Design and Experimental Verification of a 0.19 V 53 μW 65 nm CMOS Integrated Supply-Sensing Sensor With a Supply-Insensitive Temperature Sensor and an Inductive-Coupling Transmitter for a Self-Powered Bio-sensing System Using a Biofuel Cell.

    Science.gov (United States)

    Kobayashi, Atsuki; Ikeda, Kei; Ogawa, Yudai; Kai, Hiroyuki; Nishizawa, Matsuhiko; Nakazato, Kazuo; Niitsu, Kiichi

    2017-12-01

    In this paper, we present a self-powered bio-sensing system with the capability of proximity inductive-coupling communication for supply sensing and temperature monitoring. The proposed bio-sensing system includes a biofuel cell as a power source and a sensing frontend that is associated with the CMOS integrated supply-sensing sensor. The sensor consists of a digital-based gate leakage timer, a supply-insensitive time-domain temperature sensor, and a current-driven inductive-coupling transmitter and achieves low-voltage operation. The timer converts the output voltage from a biofuel cell to frequency. The temperature sensor provides a pulse width modulation (PWM) output that is not dependent on the supply voltage, and the associated inductive-coupling transmitter enables proximity communication. A test chip was fabricated in 65 nm CMOS technology and consumed 53 μW with a supply voltage of 190 mV. The low-voltage-friendly design satisfied the performance targets of each integrated sensor without any trimming. The chips allowed us to successfully demonstrate proximity communication with an asynchronous receiver, and the measurement results show the potential for self-powered operation using biofuel cells. The analysis and experimental verification of the system confirmed their robustness.

  19. Design and test of-80 kV snubber core assemblies for MFTF sustaining-neutral-beam power supplies

    International Nuclear Information System (INIS)

    Bishop, S.R.; Mayhall, D.J.; Wilson, J.H.; De Vore, K.R.; Ross, R.I.; Sears, R.G.

    1981-01-01

    Core snubbers, located near the neutral beam source ends of the Mirror Fusion Test Facility (MFTF) Sustaining Neutral Beam Power Supply System (SNBPSS) source cables, protect the neutral beam source extractor grid wires from overheating and sputtering during internal sparkdowns. The snubbers work by producing an induced counter-emf which limits the fault current and by absorbing the capacitive energy stored on the 80 kV source cables and power supplies. A computer program STACAL was used in snubber magnetic design to choose appropriate tape wound cores to provide 400 Ω resistance and 25 J energy absorption. The cores are mounted horizontally in a dielectric structure. The central source cable bundle passes through the snubber and terminates on three copper buses. Multilam receptacles on the buses connect to the source module jumper cables. Corona rings and shields limit electric field stresses to allow close clearances between snubbers
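
    The 25 J absorption figure ties directly to the stored cable energy through E = (1/2) C V^2: at 80 kV, 25 J corresponds to roughly 8 nF of effective source-side capacitance. The check below uses only the figures quoted in the abstract:

      V = 80e3            # source cable voltage, volts
      E = 25.0            # snubber energy-absorption rating, joules
      C = 2 * E / V**2    # implied capacitance discharged into the snubber
      print(f"{C*1e9:.1f} nF of cable/supply capacitance at 80 kV stores {0.5*C*V**2:.0f} J")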

  20. Engineering study of the neutral beam and rf heating systems for DIII-D, MFTF-B, JET, JT-60 and TFTR

    International Nuclear Information System (INIS)

    Lindquist, W.B.; Staten, S.H.

    1987-01-01

    An engineering study was performed on the rf and neutral beam heating systems implemented for DIII-D, MFTF-B, JET, JT-60 and TFTR. Areas covered include: methodology used to implement the systems, technology, cost, schedule, performance, problems encountered and lessons learned. Systems are compared and contrasted in the areas studied. Summary statements were made on common problems and lessons learned. 3 refs., 6 tabs

  1. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
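
    Operationally, swarm verification amounts to launching many small, differently configured verification runs in parallel and stopping as soon as any of them reports a counterexample. The sketch below shows only that control loop; the command line and its options are placeholders, since the actual tool configuration is not given in this record:

      import subprocess
      import concurrent.futures as cf

      def run_one(cmd):
          """Run one verifier configuration; return (cmd, whether it reported an error)."""
          out = subprocess.run(cmd, capture_output=True, text=True)
          return cmd, "error" in out.stdout.lower()      # placeholder success test

      def swarm(configs, workers=8):
          """Fan the configurations out over a worker pool; return the first that finds a bug."""
          with cf.ThreadPoolExecutor(max_workers=workers) as pool:
              futures = [pool.submit(run_one, c) for c in configs]
              for fut in cf.as_completed(futures):
                  cmd, found = fut.result()
                  if found:
                      return cmd                         # first counterexample wins
          return None

      # Hypothetical configurations differing only in search order / random seed:
      configs = [["./verifier", "--seed", str(seed)] for seed in range(32)]
      # print(swarm(configs))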

  2. Influence of surface position along the working range of conoscopic holography sensors on dimensional verification of AISI 316 wire EDM machined surfaces.

    Science.gov (United States)

    Fernández, Pedro; Blanco, David; Rico, Carlos; Valiño, Gonzalo; Mateos, Sabino

    2014-03-06

    Conoscopic holography (CH) is a non-contact interferometric technique used for surface digitization which presents several advantages over other optical techniques such as laser triangulation. Among others, the ability for the reconstruction of high-sloped surfaces stands out, and so does its lower dependence on surface optical properties. Nevertheless, similarly to other optical systems, adjustment of CH sensors requires an adequate selection of configuration parameters for ensuring a high quality surface digitizing. This should be done on a surface located as close as possible to the stand-off distance by tuning frequency (F) and power (P) until the quality indicators Signal-to-Noise Ratio (SNR) and signal envelope (Total) meet proper values. However, not all the points of an actual surface are located at the stand-off distance, but they could be located throughout the whole working range (WR). Thus, the quality of a digitized surface may not be uniform. The present work analyses how the quality of a reconstructed surface is affected by its relative position within the WR under different combinations of the parameters F and P. Experiments have been conducted on AISI 316 wire EDM machined flat surfaces. The number of high-quality points digitized as well as distance measurements between different surfaces throughout the WR allowed for comparing the metrological behaviour of the CH sensor with respect to a touch probe (TP) on a CMM.
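
    A minimal sketch (hypothetical field names and thresholds, not the authors' processing chain) of the quality screening described above: retain only digitized points whose SNR and signal-envelope (Total) indicators exceed chosen limits before any dimensional evaluation.

```python
# Hypothetical quality filter for conoscopic-holography point data.
# Thresholds and the record layout (z, snr, total) are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Point:
    z: float       # measured height, mm
    snr: float     # signal-to-noise ratio reported by the sensor, %
    total: float   # signal envelope ("Total") reported by the sensor

def high_quality(points: List[Point], snr_min: float = 50.0,
                 total_min: float = 1000.0) -> List[Point]:
    """Keep only points whose quality indicators exceed the thresholds."""
    return [p for p in points if p.snr >= snr_min and p.total >= total_min]

if __name__ == "__main__":
    raw = [Point(1.002, 72.0, 1500.0), Point(0.998, 35.0, 600.0)]
    good = high_quality(raw)
    print(len(good), "of", len(raw), "points kept")   # -> 1 of 2 points kept
```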

  3. Influence of Surface Position along the Working Range of Conoscopic Holography Sensors on Dimensional Verification of AISI 316 Wire EDM Machined Surfaces

    Directory of Open Access Journals (Sweden)

    Pedro Fernández

    2014-03-01

    Full Text Available Conoscopic holography (CH) is a non-contact interferometric technique used for surface digitization which presents several advantages over other optical techniques such as laser triangulation. Among others, the ability for the reconstruction of high-sloped surfaces stands out, and so does its lower dependence on surface optical properties. Nevertheless, similarly to other optical systems, adjustment of CH sensors requires an adequate selection of configuration parameters for ensuring a high quality surface digitizing. This should be done on a surface located as close as possible to the stand-off distance by tuning frequency (F) and power (P) until the quality indicators Signal-to-Noise Ratio (SNR) and signal envelope (Total) meet proper values. However, not all the points of an actual surface are located at the stand-off distance, but they could be located throughout the whole working range (WR). Thus, the quality of a digitized surface may not be uniform. The present work analyses how the quality of a reconstructed surface is affected by its relative position within the WR under different combinations of the parameters F and P. Experiments have been conducted on AISI 316 wire EDM machined flat surfaces. The number of high-quality points digitized as well as distance measurements between different surfaces throughout the WR allowed for comparing the metrological behaviour of the CH sensor with respect to a touch probe (TP) on a CMM.

  4. A high sensitivity heterodyne interferometer as a possible optical readout for the LISA gravitational reference sensor and its application to technology verification

    Energy Technology Data Exchange (ETDEWEB)

    Gohlke, Martin; Weise, Dennis; Johann, Ulrich; Braxmaier, Claus [EADS Astrium, Claude-Dornier-Strasse, 88039 Friedrichshafen (Germany); Schuldt, Thilo; Peters, Achim, E-mail: martin.gohlke@astrium.eads.ne [Humboldt-Universitaet zu Berlin, Hausvogteiplatz 5-7, 10117 Berlin (Germany)

    2009-03-01

    The space-based gravitational wave detector LISA (Laser Interferometer Space Antenna) utilizes a high performance position sensor in order to measure the translation and tilt of the free flying proof mass with respect to the optical bench. Depending on the LISA optical bench design, this position sensor must have up to pm/√Hz sensitivity for the translation measurement and up to nrad/√Hz sensitivity for the tilt measurement. We developed a heterodyne interferometer, combined with differential wavefront sensing, for the tilt measurement. The interferometer design exhibits maximum symmetry where measurement and reference arm have the same frequency and polarization and the same optical path-lengths. The interferometer can be set up free of polarizing optical components, preventing possible problems with thermal dependencies not suitable for the space environment. We developed a mechanically highly stable and compact setup which is located in a vacuum chamber. We measured initial noise levels below 10 pm/√Hz (longitudinal measurement) for frequencies above 10 mHz and below 20 nrad/√Hz (tilt measurement) for frequencies above 1 mHz. This setup can also be used for other applications, for example the measurement of the coefficient of thermal expansion (CTE) of structural materials, such as carbon fiber reinforced plastic (CFRP).
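
    The quoted noise levels are amplitude spectral densities; a minimal sketch (with an assumed sampling rate and a synthetic noise record, not the authors' data pipeline) of how such a pm/√Hz figure is typically estimated from a recorded displacement time series:

```python
# Estimate an amplitude spectral density (ASD) in m/sqrt(Hz) from a
# displacement time series, then express it in pm/sqrt(Hz).
# The synthetic signal and sampling rate below are illustrative assumptions.
import numpy as np
from scipy.signal import welch

fs = 10.0                               # assumed sampling rate, Hz
t = np.arange(0, 10_000) / fs           # ~1000 s of data
x = 5e-12 * np.random.randn(t.size)     # white displacement noise, metres

f, psd = welch(x, fs=fs, nperseg=4096)  # power spectral density, m^2/Hz
asd_pm = np.sqrt(psd) * 1e12            # amplitude spectral density, pm/sqrt(Hz)

band = (f >= 0.01) & (f <= 1.0)         # report the 10 mHz - 1 Hz band
print(f"median ASD in band: {np.median(asd_pm[band]):.1f} pm/sqrt(Hz)")
```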

  5. Verification of small-scale water vapor features in VAS imagery using high resolution MAMS imagery. [VISSR Atmospheric Sounder - Multispectral Atmospheric Mapping Sensor

    Science.gov (United States)

    Menzel, Paul W.; Jedlovec, Gary; Wilson, Gregory

    1986-01-01

    The Multispectral Atmospheric Mapping Sensor (MAMS), a modification of NASA's Airborne Thematic Mapper, is described, and radiances from the MAMS and the VISSR Atmospheric Sounder (VAS) are compared which were collected simultaneously on May 18, 1985. Thermal emission from the earth-atmosphere system in eight visible and three infrared spectral bands (12.3, 11.2 and 6.5 microns) is measured by the MAMS at up to 50 m horizontal resolution, and the infrared bands are similar to three of the VAS infrared bands. Similar radiometric performance was found for the two systems, though the MAMS showed somewhat less attenuation from water vapor than VAS because its spectral bands are shifted to shorter wavelengths away from the absorption band center.

  6. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  7. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  8. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  9. Application of structural-mechanics methods to the design of large tandem-mirror fusion devices (MFTF-B). Revision 1

    International Nuclear Information System (INIS)

    Karpenko, V.N.; Ng, D.S.

    1985-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory requires state-of-the-art structural-mechanics methods to deal with access constraints for plasma heating and diagnostics, alignment requirements, and load complexity and variety. Large interactive structures required an integrated analytical approach to achieve a reasonable level of overall system optimization. The Tandem Magnet Generator (TMG) creates a magnet configuration for the EFFI calculation of electromagnetic-field forces that, coupled with other loads, form the input loading to magnet and vessel finite-element models. The analytical results provide the data base for detailed design of magnet, vessel, foundation, and interaction effects. 13 refs

  10. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    ; qualitative and quantitative measurements of nuclear material; familiarity and access to sensitive technologies related to detection, unattended verification systems, containment/surveillance and sensors; examination and verification of design information of large and complex facilities; theoretical and practical aspects of technologies relevant to verification objectives; analysis of inspection findings and evaluation of their mutual consistency; negotiations on technical issues with facility operators and State authorities. This experience is reflected in the IAEA Safeguards Manual which sets out the policies and procedures to be followed in the inspection process as well as in the Safeguards Criteria which provide guidance for verification, evaluation and analysis of the inspection findings. The IAEA infrastructure and its experience with verification permitted in 1991 the organization to respond immediately and successfully to the tasks required by the Security Council Resolution 687(1991) for Iraq as well as to the tasks related to the verification of completeness and correctness of the initial declarations in the cases of the DPRK. and of S. Africa. In the case of Iraq the discovery of its undeclared programs was made possible through the existing verification system enhanced by additional access rights, information and application of modern detection technology. Such discoveries made it evident that there was a need for an intensive development effort to strengthen the safeguards system to develop a capability to detect undeclared activities. For this purpose it was recognized that there was need for additional and extended a) access to information, b) access to locations. It was also obvious that access to the Security Council, to bring the IAEA closer to the body responsible for maintenance of international peace and security, would be a requirement for reporting periodically on non-proliferation and the results of the IAEA's verification activities. While the case

  11. High field Nb/sub 3/Sn Axicell insert coils for the Mirror Fusion Test Facility-B (MFTF-B) axicell configuration. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Baldi, R.W.; Tatro, R.E.; Scanlan, R.M.; Agarwal, K.L.; Bailey, R.E.; Burgeson, J.E.; Kim, I.K.; Magnuson, G.D.; Mallett, B.D.; Pickering, J.L.

    1984-03-01

    Two 12-tesla superconducting insert coils are being designed by General Dynamics Convair Division for the axicell regions of MFTF-B for Lawrence Livermore National Laboratory. A major challenge of this project is to ensure that combined fabrication and operational strains induced in the conductor are within the stringent limitations of the relatively brittle Nb3Sn superconductor filaments. These coils are located in the axicell region of MFTF-B. They have a clear-bore diameter of 36.195 cm (14.25 inches) and consist of 27 double pancakes (i.e., 54 pancakes per coil) wound on an electrically insulated 304LN stainless steel bobbin/helium vessel. Each pancake has 57 turns separated by G-10CR insulation. The complete winding bundle has 4.6 million ampere-turns and a uniform current density of 2007 A/cm². In conjunction with the other magnets in the system, they produce a 12-tesla central field and a 12.52-tesla peak field. A multifilamentary Nb3Sn conductor was selected to meet these requirements. The conductor consists of a monolithic insert soldered into a copper stabilizer. Sufficient cross-sectional area and work-hardening of the copper stabilizer have been provided for the conductor to self-react the electromagnetic Lorentz-force-induced hoop stresses with normal operational tensile strains less than 0.07 percent.

  12. High field Nb3Sn Axicell insert coils for the Mirror Fusion Test Facility-B (MFTF-B) axicell configuration. Final report

    International Nuclear Information System (INIS)

    Baldi, R.W.; Tatro, R.E.; Scanlan, R.M.

    1984-03-01

    Two 12-tesla superconducting insert coils are being designed by General Dynamics Convair Division for the axicell regions of MFTF-B for Lawrence Livermore National Laboratory. A major challenge of this project is to ensure that combined fabrication and operational strains induced in the conductor are within the stringent limitations of the relatively brittle Nb3Sn superconductor filaments. These coils are located in the axicell region of MFTF-B. They have a clear-bore diameter of 36.195 cm (14.25 inches) and consist of 27 double pancakes (i.e., 54 pancakes per coil) wound on an electrically insulated 304LN stainless steel bobbin/helium vessel. Each pancake has 57 turns separated by G-10CR insulation. The complete winding bundle has 4.6 million ampere-turns and a uniform current density of 2007 A/cm². In conjunction with the other magnets in the system, they produce a 12-tesla central field and a 12.52-tesla peak field. A multifilamentary Nb3Sn conductor was selected to meet these requirements. The conductor consists of a monolithic insert soldered into a copper stabilizer. Sufficient cross-sectional area and work-hardening of the copper stabilizer have been provided for the conductor to self-react the electromagnetic Lorentz-force-induced hoop stresses with normal operational tensile strains less than 0.07 percent
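
    As an illustrative consistency check (not stated in the reports), the quoted winding geometry implies an operating current of roughly

    \[ N = 27 \times 2 \times 57 = 3078\ \text{turns per coil}, \qquad I \approx \frac{4.6\times10^{6}\ \text{A·turns}}{3078\ \text{turns}} \approx 1.5\ \text{kA}. \]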

  13. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
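
    A minimal sketch of one standard ingredient of the solution verification described above: estimating the observed order of convergence from results on three systematically refined grids. The sample values below are made up for illustration.

```python
# Observed order of convergence from three solutions f1 (coarse), f2, f3 (fine)
# obtained on grids refined by a constant factor r. Sample values are made up.
import math

def observed_order(f_coarse: float, f_medium: float, f_fine: float,
                   refinement_ratio: float) -> float:
    """p = ln(|f_coarse - f_medium| / |f_medium - f_fine|) / ln(r)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) \
           / math.log(refinement_ratio)

if __name__ == "__main__":
    f1, f2, f3, r = 0.9710, 0.9921, 0.9978, 2.0   # illustrative grid results
    p = observed_order(f1, f2, f3, r)
    print(f"observed order of accuracy: {p:.2f}")  # close to 2 for these values
```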

  14. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  15. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  16. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of an FMCT verification provision. This paper will explore the general concerns about FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  17. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  18. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  19. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  20. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  1. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  2. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  3. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  4. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is often painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  5. BepiColombo fine sun sensor

    Science.gov (United States)

    Boslooper, Erik; van der Heiden, Nico; Naron, Daniël.; Schmits, Ruud; van der Velde, Jacob Jan; van Wakeren, Jorrit

    2017-11-01

    The design, development and verification of the passive Fine Sun Sensor (FSS) for the BepiColombo spacecraft are described. A major challenge in the design is to keep the detector at acceptable temperature levels while exposed to a solar flux intensity exceeding 10 times what is experienced in Earth orbit. A mesh-type Heat Rejection Filter has been developed. The overall sensor design and its performance verification program are described.

  6. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  7. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.; Delp, Edward J.; Wong, Ping W.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 x 44 piezoresistive elements is used to measure the grip pattern. An interface has been

  8. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 x 44 piezoresistive elements is used to measure the grip pattern. An interface has been
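
    A minimal sketch (hypothetical scoring, not the authors' classifier) of how a 44 x 44 pressure image could be verified against an enrolled template, using normalized cross-correlation as the similarity score.

```python
# Hypothetical grip-pattern verification score: normalized cross-correlation
# between a live 44x44 pressure image and an enrolled template.
import numpy as np

def grip_score(template: np.ndarray, probe: np.ndarray) -> float:
    """Return a similarity score in [-1, 1]; higher means a better match."""
    t = (template - template.mean()) / (template.std() + 1e-9)
    p = (probe - probe.mean()) / (probe.std() + 1e-9)
    return float((t * p).mean())

def verify(template: np.ndarray, probe: np.ndarray,
           threshold: float = 0.7) -> bool:
    return grip_score(template, probe) >= threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    enrolled = rng.random((44, 44))
    genuine = enrolled + 0.05 * rng.standard_normal((44, 44))  # same user, noisy
    impostor = rng.random((44, 44))                            # different user
    print(verify(enrolled, genuine), verify(enrolled, impostor))  # True False
```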

  9. MFTF magnet cryogenics

    International Nuclear Information System (INIS)

    VanSant, J.H.

    1981-07-01

    The prime requirement of the magnet cryogenic system is to ensure that the magnet coils remain in a superconducting state, a large task considering their enormous size. The following presentation addresses the principal topics that have been considered in this cryogenic design

  10. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition. The operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  11. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  12. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
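
    One element of the statistical transfer from elementary data to a full-scale structure mentioned above is Weibull size scaling; a commonly used two-parameter form (given as general background, not quoted from the guideline itself) is

    \[ P_f(\sigma) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right], \]

    where m is the Weibull modulus and σ0, V0 are the characteristic strength and reference volume obtained from the elementary specimen tests.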

  13. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
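
    The manufactured-solution approach to code verification mentioned above can be sketched in a few lines; the operator and manufactured field below are illustrative choices, not taken from the paper. Pick a smooth u, apply the continuous operator symbolically to obtain a source term, run the code with that source, and compare against the known u while refining the grid.

```python
# Method of manufactured solutions for L(u) = -d/dx(k du/dx) = s(x).
# The manufactured field u(x) and conductivity k are illustrative choices.
import sympy as sp

x = sp.symbols("x")
k = 2.0                                   # assumed constant conductivity
u_manufactured = sp.sin(sp.pi * x) + 0.5 * x**2

# Apply the continuous operator symbolically to get the required source term.
source = sp.simplify(-sp.diff(k * sp.diff(u_manufactured, x), x))
print("s(x) =", source)                   # 2*pi^2*sin(pi*x) - 2

# These callables would be handed to the code under test: the source drives
# the solver, the exact solution is used only to evaluate the error norm.
u_exact = sp.lambdify(x, u_manufactured)
s_exact = sp.lambdify(x, source)
print(u_exact(0.5), s_exact(0.5))
```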

  14. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  15. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  16. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. An introduction of two-level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  17. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
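
    For background (standard material-balance accountancy rather than anything specific to this paper), the accounting quantity underlying such test statistics is the material unaccounted for over a balance period,

    \[ \mathrm{MUF} = \mathrm{BI} + R - S - \mathrm{EI}, \]

    where BI and EI are the beginning and ending physical inventories and R and S are the receipts and shipments; physical flow verification refers to the inspector's independent confirmation of the R and S terms.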

  18. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  19. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    ... additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50 ...). ... were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify ... [The remainder of this record is table-of-contents residue listing report sections on Paradox, MyClass, and Results.]

  20. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces briefly UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
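
    A conceptual sketch of the coverage-driven loop described above, written in Python rather than SystemVerilog/UVM for brevity: constrained-random stimuli drive a model of the DUT, a scoreboard compares each response with a golden reference, and coverage bins record which scenarios have been exercised. The toy saturating adder and the bin names are illustrative.

```python
# Conceptual coverage-driven verification loop (Python stand-in for a UVM
# testbench). The DUT here is a toy saturating adder; names are illustrative.
import random
from collections import Counter

def dut_saturating_add(a: int, b: int, width: int = 8) -> int:
    return min(a + b, (1 << width) - 1)          # device under test (model)

def reference_add(a: int, b: int, width: int = 8) -> int:
    return min(a + b, (1 << width) - 1)          # golden reference model

coverage = Counter()                              # functional coverage bins
failures = []                                     # scoreboard mismatches

for _ in range(10_000):                           # constrained-random stimulus
    a, b = random.randrange(256), random.randrange(256)
    got, exp = dut_saturating_add(a, b), reference_add(a, b)
    if got != exp:
        failures.append((a, b, got, exp))         # scoreboard check
    coverage["saturated" if a + b > 255 else "normal"] += 1
    if a == 0 or b == 0:
        coverage["zero_operand"] += 1

print("failures:", len(failures))
print("coverage bins hit:", dict(coverage))
```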

  1. Computer code determination of tolerable accel current and voltage limits during startup of an 80 kV MFTF sustaining neutral beam source

    International Nuclear Information System (INIS)

    Mayhall, D.J.; Eckard, R.D.

    1979-01-01

    We have used a Lawrence Livermore Laboratory (LLL) version of the WOLF ion source extractor design computer code to determine tolerable accel current and voltage limits during startup of a prototype 80 kV Mirror Fusion Test Facility (MFTF) sustaining neutral beam source. Arc current limits are also estimated. The source extractor has gaps of 0.236, 0.721, and 0.155 cm. The effective ion mass is 2.77 AMU. The measured optimum accel current density is 0.266 A/cm². The gradient grid electrode runs at 5/6 V_a (accel voltage). The suppressor electrode voltage is zero for V_a < 3 kV and -3 kV for V_a ≥ 3 kV. The accel current density for optimum beam divergence is obtained for 1 ≤ V_a ≤ 80 kV, as are the beam divergence and emittance
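
    For background on why accel current and voltage must be ramped together (a textbook space-charge relation, not the WOLF model itself), the Child-Langmuir law for a planar extraction gap of width d gives

    \[ j = \frac{4\epsilon_{0}}{9}\sqrt{\frac{2q}{m}}\;\frac{V_a^{3/2}}{d^{2}}, \]

    so holding the perveance j/V_a^{3/2} approximately constant as V_a is raised keeps the beam optics near the optimum during startup.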

  2. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  3. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  4. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0
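
    One simple way to express the synergy among the sensor subsystems (an illustrative combination rule, not necessarily the one implemented in IVSEM) is the detection probability of n independent technologies observing the same event,

    \[ P_{\text{det}} = 1 - \prod_{i=1}^{n}\left(1 - p_{i}\right), \]

    where p_i is the detection probability of the i-th technology for that event.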

  5. A review of technology for verification of waste removal from Hanford Underground Storage Tanks (WHC Issue 30)

    International Nuclear Information System (INIS)

    Thunborg, S.

    1994-09-01

    Remediation of waste from Underground Storage Tanks (UST) at the Hanford Waste storage sites will require removal of all waste to a nearly clean condition. Current requirements are 99% clean. In order to meet remediation legal requirements, a means to remotely verify that the waste has been removed to sufficient level is needed. This report discusses the requirements for verification and reviews major technologies available for inclusion in a verification system. The report presents two operational scenarios for verification of residual waste volume. Thickness verification technologies reviewed are Ultrasonic Sensors, Capacitance Type Sensors, Inductive Sensors, Ground Penetrating Radar, and Magnetometers. Of these technologies Inductive (Metal Detectors) and Ground Penetrating Radar appear to be the most suitable for use as waste thickness sensors

  6. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density has been used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  7. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  8. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability and compatibility. The methods differ both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, it discusses and describes the deductive method and model-checking methods, emphasizing the pros and cons of each, and gives a classification of testing techniques for each method. The paper also presents and analyzes the characteristics and mechanisms of static dependence analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or from working with multiple object values. Dependences connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied to the software when using dynamic analysis. Based on this work, a conclusion is drawn which describes the most relevant problems of the analysis techniques, methods of their solution and

  9. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent-fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  10. A Secure Framework for Location Verification in Pervasive Computing

    Science.gov (United States)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The way people use computing devices has been changed in some way by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at anytime and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and it may impersonate other users by eavesdropping their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  11. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  12. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  13. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  14. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by both forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.
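
    The talk abstract does not list the specific verification measures used at SWPC; the short sketch below simply computes a few contingency-table scores commonly used for objective forecast verification (probability of detection, false alarm ratio, critical success index, Heidke skill score), with invented counts.

      # Common contingency-table forecast verification scores; the counts are
      # illustrative, not SWPC data.

      def contingency_scores(hits, false_alarms, misses, correct_negatives):
          pod = hits / (hits + misses)                  # probability of detection
          far = false_alarms / (hits + false_alarms)    # false alarm ratio
          csi = hits / (hits + misses + false_alarms)   # critical success index
          n = hits + false_alarms + misses + correct_negatives
          expected = ((hits + misses) * (hits + false_alarms)
                      + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
          hss = (hits + correct_negatives - expected) / (n - expected)  # Heidke skill score
          return {"POD": pod, "FAR": far, "CSI": csi, "HSS": hss}

      print(contingency_scores(hits=42, false_alarms=18, misses=8, correct_negatives=932))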

  15. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  16. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" means that a user's fingerprint is matched against the single enrolled fingerprint associated with the identity the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  17. Open and Crowd-Sourced Data for Treaty Verification

    Science.gov (United States)

    2014-10-01

    ...cations – from enhancing home security to providing novel marketing tools for commerce – they are widely available and inexpensive. These open ... we anticipate increasing the density of public-domain seismic sensor coverage in regions where high population density coincides with seismic hazards ... interface and metadata standards emerge through the medical device market, it makes sense to adhere to these standards for any verification-optimized

  18. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.
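
    The sketch below is not the HOL translation flow described in the paper; it only illustrates, on a toy half adder, the underlying idea that a structural (gate-level) description can be checked against a behavioral specification, here by brute-force enumeration rather than by proof.

      # Toy illustration (not the HOL workflow from the paper): check by exhaustive
      # enumeration that a half-adder netlist implements its behavioral specification.
      # Formal tools establish this by proof, which is what lets them scale.

      from itertools import product

      def half_adder_structure(a, b):
          s = a ^ b            # XOR gate
          c = a & b            # AND gate
          return s, c

      def half_adder_spec(a, b):
          total = a + b        # behavioral model: binary addition
          return total % 2, total // 2

      assert all(half_adder_structure(a, b) == half_adder_spec(a, b)
                 for a, b in product((0, 1), repeat=2))
      print("structure matches specification on all inputs")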

  19. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with development international standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  20. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems which intermix two kinds of components, discrete components and continuous components. The continuous components, usually called plants, are subject to disturbances which cause the state variables of the systems to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper presents a design methodology for hybrid systems as an example of their specification and verification. The design methodology is based on cooperation between two disciplines, control engineering and computer science, and divides the design into control loops and decision loops. The external behavior of the control loops is specified in a notation which is understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is also guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of the control loops. The verification of system requirements can be done by computing scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as a formal notation for specifying the control loops and designing the decision loops.
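
    As an informal companion to the inverted-pendulum example, the following sketch simulates a linearized pendulum with a continuous control loop and a discrete decision loop; the gains, time step, and shutdown condition are illustrative assumptions and have nothing to do with the Mean Value Calculus formalization used in the paper.

      # Minimal sketch of the control-loop / decision-loop split on the inverted
      # pendulum example (linearized model, invented gains).

      g, L, dt = 9.81, 1.0, 0.01
      theta, omega = 0.15, 0.0          # initial angle (rad) and angular velocity

      def control_loop(theta, omega):
          """Continuous component: PD control effort computed from the physical state."""
          return -40.0 * theta - 10.0 * omega

      def decision_loop(theta):
          """Discrete component: supervise the controller, e.g. declare failure."""
          return "SHUTDOWN" if abs(theta) > 0.5 else "RUN"

      for step in range(500):
          if decision_loop(theta) == "SHUTDOWN":
              print("decision loop aborted at step", step)
              break
          u = control_loop(theta, omega)
          alpha = (g / L) * theta + u    # linearized inverted-pendulum dynamics
          omega += alpha * dt
          theta += omega * dt
      print("final angle:", round(theta, 4))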

  1. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements

  2. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.
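
    The exact comparison measures used in the Shift validation report are not reproduced here; the fragment below merely sketches two measures of the kind mentioned in the abstract, an eigenvalue difference expressed in pcm and an RMS relative pin-power difference, using placeholder numbers.

      # Illustrative comparison measures only; the inputs are placeholder values,
      # not results from the Shift validation suite.

      import numpy as np

      def keff_difference_pcm(k_calc, k_ref):
          return (k_calc - k_ref) * 1.0e5                 # 1 pcm = 1e-5 in k-eff

      def pin_power_rms(p_calc, p_ref):
          p_calc = np.asarray(p_calc) / np.mean(p_calc)   # normalize both distributions
          p_ref = np.asarray(p_ref) / np.mean(p_ref)
          return np.sqrt(np.mean(((p_calc - p_ref) / p_ref) ** 2))

      print(keff_difference_pcm(1.00123, 1.00085), "pcm")
      print(pin_power_rms([1.02, 0.98, 1.05, 0.95], [1.01, 0.99, 1.04, 0.96]))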

  3. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint based identification is becoming more popular. The most widely used fingerprint representation is the minutiae based representation. The main drawback with this representation is that it does not utilize a significant component of the rich discriminatory information available in the fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter bank based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
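
    The following is a hedged sketch in the spirit of filter-bank (FingerCode-style) representations rather than the exact system evaluated in the study: the image is filtered with Gabor kernels at several orientations and the dispersion of the strided filter responses is collected into a fixed-length feature vector. Kernel parameters, block size, and the random test image are illustrative assumptions.

      # Sketch of a filter-bank style fixed-length representation; parameters and the
      # test image are invented, and the feature choice is deliberately simplified.

      import numpy as np

      def gabor_kernel(theta, freq=0.1, sigma=4.0, size=17):
          half = size // 2
          y, x = np.mgrid[-half:half + 1, -half:half + 1]
          xr = x * np.cos(theta) + y * np.sin(theta)
          yr = -x * np.sin(theta) + y * np.cos(theta)
          return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

      def filterbank_features(image, orientations=8, stride=16):
          feats = []
          H, W = image.shape
          for k in range(orientations):
              kern = gabor_kernel(np.pi * k / orientations)
              s = kern.shape[0]
              # strided filter responses computed by brute force to stay dependency-free
              resp = np.array([[np.sum(image[i:i + s, j:j + s] * kern)
                                for j in range(0, W - s, stride)]
                               for i in range(0, H - s, stride)])
              feats.append(np.std(resp))   # dispersion of each orientation's responses
          return np.array(feats)

      rng = np.random.default_rng(0)
      img = rng.random((128, 128))         # stand-in for a normalized fingerprint image
      print(filterbank_features(img))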

  4. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF 6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  5. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF{sub 6} cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  6. Taste sensor; Mikaku sensor

    Energy Technology Data Exchange (ETDEWEB)

    Toko, K. [Kyushu University, Fukuoka (Japan)

    1998-03-05

    This paper introduces a taste sensor with a lipid/polymer membrane that works as a receptor for taste substances. The paper describes the following: the sensor uses a hollow polyvinyl chloride rod filled with aqueous KCl solution, into which silver and silver chloride wires are placed; a lipid/polymer membrane affixed to its cross section serves as a lipid membrane electrode, and taste is identified from seven or eight kinds of electric-potential response patterns output by the membrane. Measurements of different substances presenting acidic, salty, bitter and sweet tastes and flavor clearly identified each taste (a similar response is shown to a similar taste even when the substances differ); different responses are obtained for different brands of beer; results from measuring a wide variety of mineral waters suggest that this taste sensor could be used as a water quality monitoring sensor; and the sensor is also expected to find application as a maturation control sensor in the manufacture of Japanese sake (wine) and miso (bean paste). 2 figs., 1 tab.

  7. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  8. Verification of communication protocols in web services model-checking service compositions

    CERN Document Server

    Tari, Zahir; Mukherjee, Anshuman

    2014-01-01

    Gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with the essential, state-of-the-art information about sensor networking. In the near future, wireless sensor networks will become an integral part of our day-to-day life. To solve different sensor networking related issues, researchers have put a great deal of effort into coming up with innovative ideas. Verification of Communication Protocols in Web Services: Model-Checking Service Compositions gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with integral information about sensor networking. It introduces current technological trends, particularly in node organization, and provides implementation details of each networking type to help readers set up sensor networks in their related job fields. In addition, it identifies the limitations of current technologies, as well as future research directions.

  9. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. The authors wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, they wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at Idaho National Engineering Laboratory, this system guards personnel. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur
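
    The EIVSystem's image-processing pipeline is not described in detail in the abstract; the sketch below only illustrates the basic idea of camera-driven change detection by thresholding the pixel difference between a stored reference image and the current frame, with invented thresholds and synthetic images.

      # Hedged illustration of camera-based change detection, not the EIVSystem code.

      import numpy as np

      def change_detected(reference, current, pixel_thresh=25, fraction_thresh=0.01):
          diff = np.abs(current.astype(int) - reference.astype(int))
          changed_fraction = np.mean(diff > pixel_thresh)
          return changed_fraction > fraction_thresh

      rng = np.random.default_rng(1)
      reference = rng.integers(0, 256, size=(240, 320), dtype=np.uint8)
      unchanged = np.clip(reference + rng.integers(-3, 4, reference.shape), 0, 255).astype(np.uint8)
      moved = reference.copy()
      moved[100:140, 150:200] = 255                    # a region of the scene has changed

      print(change_detected(reference, unchanged))     # False: only sensor noise
      print(change_detected(reference, moved))         # True: a region changed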

  10. Woodward Effect Experimental Verifications

    Science.gov (United States)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT as well as gravitational / inertia like Wheeler-Feynman radiation reaction forces hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations or ``Woodward effect'' (W-E). Later, in collaboration with his ex-graduate student T. Mahood, they also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  11. Verification of hypergraph states

    Science.gov (United States)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.
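
    As a small numerical companion (the verification protocol itself is not reproduced here), the following sketch builds a three-qubit hypergraph state by applying a generalized controlled-Z gate on the hyperedge {0, 1, 2} to the state |+>^3; only the sign of the |111> amplitude is flipped.

      # Numerical sketch of a hypergraph state; the verification protocol from the
      # paper is not implemented here.

      import numpy as np

      n = 3
      state = np.ones(2**n) / np.sqrt(2**n)        # |+>^n in the computational basis

      def apply_generalized_cz(state, hyperedge, n):
          """Flip the sign of every basis state in which all qubits of the hyperedge are 1."""
          out = state.copy()
          for idx in range(2**n):
              bits = [(idx >> (n - 1 - q)) & 1 for q in range(n)]
              if all(bits[q] == 1 for q in hyperedge):
                  out[idx] = -out[idx]
          return out

      hypergraph_state = apply_generalized_cz(state, hyperedge=(0, 1, 2), n=n)
      print(np.round(hypergraph_state * np.sqrt(2**n), 3))   # all +1 except |111>, which is -1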

  12. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
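
    IVSEM's actual performance model is more detailed than the abstract can convey; the fragment below shows only one simple, assumed way to combine per-technology detection probabilities into an integrated probability of detection under an independence assumption, with made-up probabilities.

      # Hedged sketch: P(at least one technology detects), assuming independent
      # detections. The probabilities are invented, not IVSEM outputs.

      def combined_detection_probability(p_by_technology):
          p_miss = 1.0
          for p in p_by_technology.values():
              p_miss *= (1.0 - p)
          return 1.0 - p_miss

      p = {"seismic": 0.80, "infrasound": 0.40, "radionuclide": 0.55, "hydroacoustic": 0.10}
      print(round(combined_detection_probability(p), 4))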

  13. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies - the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW) - share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States - India, Pakistan and Israel - from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  14. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  15. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  16. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  17. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  18. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  19. Ambient Sensors

    NARCIS (Netherlands)

    Börner, Dirk; Specht, Marcus

    2014-01-01

    This software sketches comprise two custom-built ambient sensors, i.e. a noise and a movement sensor. Both sensors measure an ambient value and process the values to a color gradient (green > yellow > red). The sensors were built using the Processing 1.5.1 development environment. Available under

  20. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic ... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due

  1. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  2. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of an IMRT (intensity modulated radiation therapy) plan, which is used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, of the deployment of the IMRT planning system CORVUS 6.0 and the MIMIC (multilamellar intensity modulated collimator) device, and of the overall process of verifying the created plan. The aim of the verification is, in particular, to check the correct functioning of the MIMIC and to evaluate the overall reliability of IMRT planning. (author)

  3. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  4. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
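
    The paper's FLTL-to-rule translation is not reproduced in the abstract; the sketch below is only a minimal illustration of forward chaining as a monitoring mechanism: facts are the events observed so far in the expanding trace, Horn-style rules in implication form derive new facts, and a designated fact signals a property violation. The rules and events are invented examples.

      # Minimal forward-chaining monitor sketch (illustrative, not the paper's FLTL
      # encoding).

      def forward_chain(facts, rules):
          """rules: list of (body, head) pairs; derive heads until a fixpoint."""
          derived = set(facts)
          changed = True
          while changed:
              changed = False
              for body, head in rules:
                  if head not in derived and all(b in derived for b in body):
                      derived.add(head)
                      changed = True
          return derived

      # Invented property: resources A and B must never both be granted in one trace.
      rules = [({"request_A", "approve_A"}, "grant_A"),
               ({"request_B", "approve_B"}, "grant_B"),
               ({"grant_A", "grant_B"}, "violation")]

      trace = {"request_A", "approve_A", "request_B", "approve_B"}
      print("violation" in forward_chain(trace, rules))   # True

    In a runtime setting the derived set would be updated incrementally as each new event of the expanding trace arrives, rather than recomputed from scratch as in this toy version.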

  5. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  6. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  7. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
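
    A hedged sketch of the likelihood-ratio idea with fixed-length feature vectors follows; the Gaussian models, diagonal covariances, feature values, and threshold are illustrative assumptions rather than the models used in the abstracts above.

      # Likelihood-ratio verification sketch with diagonal-covariance Gaussians;
      # all model parameters and feature vectors are invented.

      import numpy as np

      def log_gaussian(x, mean, var):
          return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

      def verify(x, user_mean, user_var, bg_mean, bg_var, threshold=0.0):
          llr = log_gaussian(x, user_mean, user_var) - log_gaussian(x, bg_mean, bg_var)
          return llr, llr > threshold                 # accept when the ratio is high

      user_mean, user_var = np.array([1.0, 2.0, 0.5]), np.array([0.1, 0.2, 0.1])
      bg_mean, bg_var = np.zeros(3), np.ones(3)       # background (impostor) model

      genuine = np.array([1.1, 1.9, 0.4])
      impostor = np.array([-0.3, 0.2, 1.5])
      print(verify(genuine, user_mean, user_var, bg_mean, bg_var))
      print(verify(impostor, user_mean, user_var, bg_mean, bg_var))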

  8. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  9. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  10. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  11. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  12. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  13. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  14. Versatile timing system for MFTF

    International Nuclear Information System (INIS)

    Lau, N.H.C.

    1981-01-01

    This System consists of the Master Timing Transmitter and the Local Timing Receivers. The Master Timing Transmitter located in the control room initiates timing messages, abort messages and precise delay messages. A sync message is sent when one of the other three is not being sent. The Local Timing Receiver, located in the equipment area, decodes the incoming messages and generates 6 MHz, 3MHz and 1 MHz continuous clocks. A 250 KHz sync clock is derived from the sync messages, to which all pulse outputs are synchronized. The Local Timing Receiver also provides two ON-OFF delay counters of 64 bits each, and one OFF delay counter of 32 bits. Detection of abort messages and an out-of-sync signal will automatically disable all outputs

  15. Fiber-optic laser sensor for mine detection and verification

    International Nuclear Information System (INIS)

    Bohling, Christian; Scheel, Dirk; Hohmann, Konrad; Schade, Wolfgang; Reuter, Matthias; Holl, Gerhard

    2006-01-01

    A new (to our knowledge) optical approach for the identification of mines and explosives is developed that analyzes surface materials rather than only the bulk. A conventional manually operated mine prodder is upgraded by laser-induced breakdown spectroscopy (LIBS). In situ, real-time information about the materials in front of the prodder is obtained during the demining process in order to optimize the security aspects and the speed of demining. A Cr4+:Nd3+:YAG microchip laser is used as a seed laser for an ytterbium-fiber amplifier to generate high-power laser pulses at 1064 nm with pulse energies up to Ep=1 mJ, a repetition rate of frep.=2-20 kHz and a pulse duration of tp=620 ps. The recorded LIBS signals are analyzed by applying neural networks for the data analysis

  16. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  17. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  18. Smartphone User Identity Verification Using Gait Characteristics

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2016-09-01

    Full Text Available Smartphone-based biometrics offers a wide range of possible solutions, which could be used to authenticate users and thus to provide an extra level of security and theft prevention. We propose a method for positive identification of a smartphone user's identity using the user's gait characteristics captured by embedded smartphone sensors (gyroscopes, accelerometers). The method is based on the application of the Random Projections method for feature dimensionality reduction to just two dimensions. Then, a probability distribution function (PDF) of the derived features is calculated, which is compared against the known user PDF. The Jaccard distance is used to evaluate the distance between the two distributions, and the decision is taken based on thresholding. The results for subject recognition are at an acceptable level: we have achieved a grand mean Equal Error Rate (EER) for subject identification of 5.7% (using the USC-HAD dataset). Our findings represent a step towards improving the performance of gait-based user identity verification technologies.
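
    The following Python sketch mirrors the pipeline described above at a very coarse level: features are randomly projected to two dimensions, a histogram PDF is estimated, and a Jaccard-style distance between the probe and the enrolled PDF is thresholded. The feature dimensionality, histogram settings, the particular generalization of the Jaccard distance, and the threshold are assumptions for illustration, not the paper's parameters.

      # Coarse, hedged sketch of the random-projection / PDF / Jaccard-distance
      # pipeline; all data and settings are synthetic.

      import numpy as np

      rng = np.random.default_rng(42)

      def features_to_pdf(features, projection, bins=8, lim=10.0):
          proj = features @ projection                       # random projection to 2-D
          hist, _, _ = np.histogram2d(proj[:, 0], proj[:, 1],
                                      bins=bins, range=[[-lim, lim], [-lim, lim]])
          return hist.ravel() / hist.sum()                   # discrete 2-D PDF estimate

      def jaccard_distance(p, q):
          # one common generalization of the Jaccard distance to distributions
          return 1.0 - np.minimum(p, q).sum() / np.maximum(p, q).sum()

      d = 12                                                 # per-window feature dimension (assumed)
      projection = rng.normal(size=(d, 2)) / np.sqrt(2)      # fixed random projection matrix

      enrolled = features_to_pdf(rng.normal(0.0, 1.0, (2000, d)), projection)
      same_user = features_to_pdf(rng.normal(0.0, 1.0, (2000, d)), projection)
      other_user = features_to_pdf(rng.normal(0.0, 2.5, (2000, d)), projection)

      threshold = 0.5                                        # assumed operating point
      for label, probe in [("same user", same_user), ("other user", other_user)]:
          dist = jaccard_distance(enrolled, probe)
          print(label, round(dist, 3), "accept" if dist < threshold else "reject")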

  19. Biomimetic actuator and sensor for robot hand

    International Nuclear Information System (INIS)

    Kim, Baekchul; Chung, Jinah; Cho, Hanjoung; Shin, Seunghoon; Lee, Hyoungsuk; Moon, Hyungpil; Choi, Hyouk Ryeol; Koo, Jachoon

    2012-01-01

    To manufacture a robot hand that essentially mimics the functions of a human hand, it is necessary to develop flexible actuators and sensors. In this study, we propose the design, manufacture, and performance verification of flexible actuators and sensors based on Electro Active Polymer (EAP). EAP is fabricated as a type of film, and it moves with changes in the voltage because of contraction and expansion in the polymer film. Furthermore, if a force is applied to an EAP film, its thickness and effective area change, and therefore, the capacitance also changes. By using this mechanism, we produce capacitive actuators and sensors. In this study, we propose an EAP based capacitive sensor and evaluate its use as a robot hand sensor

  20. Biomimetic actuator and sensor for robot hand

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Baekchul; Chung, Jinah; Cho, Hanjoung; Shin, Seunghoon; Lee, Hyoungsuk; Moon, Hyungpil; Choi, Hyouk Ryeol; Koo, Jachoon [Sungkyunkwan Univ., Seoul (Korea, Republic of)

    2012-12-15

    To manufacture a robot hand that essentially mimics the functions of a human hand, it is necessary to develop flexible actuators and sensors. In this study, we propose the design, manufacture, and performance verification of flexible actuators and sensors based on Electro Active Polymer (EAP). EAP is fabricated as a type of film, and it moves with changes in the voltage because of contraction and expansion in the polymer film. Furthermore, if a force is applied to an EAP film, its thickness and effective area change, and therefore, the capacitance also changes. By using this mechanism, we produce capacitive actuators and sensors. In this study, we propose an EAP based capacitive sensor and evaluate its use as a robot hand sensor.
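
    The sketch below illustrates only the capacitive sensing principle described above, using a parallel-plate approximation; the permittivity, geometry, and stiffness values are placeholder assumptions, not measured properties of the EAP film in the paper.

      # Parallel-plate approximation of the EAP capacitive sensing principle;
      # every constant here is an assumed placeholder.

      EPS0 = 8.854e-12          # vacuum permittivity, F/m
      EPS_R = 4.0               # assumed relative permittivity of the polymer film
      AREA = 1.0e-4             # electrode area, m^2 (1 cm^2, assumed)
      T0 = 50e-6                # unloaded film thickness, m (assumed)
      STIFFNESS = 2.0e6         # assumed effective spring constant of the film, N/m

      def capacitance(thickness):
          return EPS0 * EPS_R * AREA / thickness

      def estimate_force(measured_capacitance):
          thickness = EPS0 * EPS_R * AREA / measured_capacitance
          return STIFFNESS * (T0 - thickness)       # Hooke's-law style calibration

      c0 = capacitance(T0)
      c_loaded = capacitance(45e-6)                 # film compressed by 5 micrometres
      print("unloaded C:", c0, "F  loaded C:", c_loaded, "F")
      print("estimated force:", estimate_force(c_loaded), "N")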

  1. Attention Sensor

    NARCIS (Netherlands)

    Börner, Dirk; Kalz, Marco; Specht, Marcus

    2014-01-01

    This software sketch was used in the context of an experiment for the PhD project “Ambient Learning Displays”. The sketch comprises a custom-built attention sensor. The sensor measured (during the experiment) whether a participant looked at and thus attended a public display. The sensor was built

  2. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later. Land Information System Verification Toolkit (LVT) NOSA.

  3. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
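
    ASMOV combines lexical, structural, and extensional matchers with semantic verification; the tiny fragment below illustrates only a lexical component, scoring concept pairs by token overlap of their labels and keeping the best match above a cutoff. The ontologies and the cutoff are invented examples.

      # Lexical-matching illustration only; not the full ASMOV algorithm.

      def token_similarity(a, b):
          ta, tb = set(a.lower().split("_")), set(b.lower().split("_"))
          return len(ta & tb) / len(ta | tb)

      def lexical_alignment(onto1, onto2, cutoff=0.4):
          alignment = {}
          for c1 in onto1:
              best = max(onto2, key=lambda c2: token_similarity(c1, c2))
              score = token_similarity(c1, best)
              if score >= cutoff:
                  alignment[c1] = (best, round(score, 2))
          return alignment

      onto_a = ["blood_pressure_measurement", "heart_rate", "patient_record"]
      onto_b = ["measurement_of_blood_pressure", "cardiac_rate", "patient_file"]
      print(lexical_alignment(onto_a, onto_b))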

  4. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  5. Container Verification Using Optically Stimulated Luminescence

    International Nuclear Information System (INIS)

    Tanner, Jennifer E.; Miller, Steven D.; Conrady, Matthew M.; Simmons, Kevin L.; Tinker, Michael R.

    2008-01-01

    Containment verification is a high priority for safeguards containment and surveillance. Nuclear material containers, safeguards equipment cabinets, camera housings, and detector cable conduit are all vulnerable to tampering. Even with a high security seal on a lid or door, custom-built hinges and interfaces, and special colors and types of finishes, the surfaces of enclosures can be tampered with and any penetrations repaired and covered over. With today's technology, these repairs would not be detected during a simple visual inspection. Several suggested solutions have been to develop complicated networks of wires, fiber-optic cables, lasers or other sensors that line the inside of a container and alarm when the network is disturbed. This results in an active system with real time evidence of tampering but is probably not practical for most safeguards applications. A more practical solution would be to use a passive approach where an additional security feature was added to surfaces which would consist of a special coating or paint applied to the container or enclosure. One type of coating would incorporate optically stimulated luminescent (OSL) material. OSL materials are phosphors that luminesce in proportion to the ionizing radiation dose when stimulated with the appropriate optical wavelengths. The OSL fluoresces at a very specific wavelength when illuminated at another, very specific wavelength. The presence of the pre-irradiated OSL material in the coating is confirmed using a device that interrogates the surface of the enclosure using the appropriate optical wavelength and then reads the resulting luminescence. The presence of the OSL indicates that the integrity of the surface is intact. The coating itself could be transparent which would allow the appearance of the container to remain unchanged or the OSL material could be incorporated into certain paints or epoxies used on various types of containers. The coating could be applied during manufacturing

  6. Sensors, Volume 4, Thermal Sensors

    Science.gov (United States)

    Scholz, Jorg; Ricolfi, Teresio

    1996-12-01

    'Sensors' is the first self-contained series to deal with the whole area of sensors. It describes general aspects, technical and physical fundamentals, construction, function, applications and developments of the various types of sensors. This volume describes the construction and application aspects of thermal sensors while presenting a rigorous treatment of the underlying physical principles. It provides a unique overview of the various categories of sensors as well as of specific groups, e.g. temperature sensors (resistance thermometers, thermocouples, and radiation thermometers), noise and acoustic thermometers, heat-flow and mass-flow sensors. Specific facets of applications are presented by specialists from different fields including process control, automotive technology and cryogenics. This volume is an indispensable reference work and textbook for both specialists and newcomers, researchers and developers.

  7. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem

    Directory of Open Access Journals (Sweden)

    Helala AlShehri

    2018-03-01

    Full Text Available The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.
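
    As context for how such cross-sensor experiments are typically scored (a minimal sketch, not code from the study), the equal error rate (EER) can be computed from genuine and impostor match scores for each enrollment-sensor/verification-sensor pairing; all names and the synthetic scores below are illustrative.

        # Illustrative EER computation for a cross-sensor matching experiment.
        import numpy as np

        def equal_error_rate(genuine, impostor):
            """Approximate EER: sweep thresholds and find where FAR and FRR cross."""
            thresholds = np.unique(np.concatenate([genuine, impostor]))
            far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
            frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects
            i = np.argmin(np.abs(far - frr))
            return (far[i] + frr[i]) / 2.0, thresholds[i]

        # Hypothetical scores from matching enrollment images (sensor A)
        # against verification images (sensor B).
        rng = np.random.default_rng(0)
        genuine = rng.normal(0.7, 0.1, 500)     # same-finger, cross-sensor scores
        impostor = rng.normal(0.4, 0.1, 5000)   # different-finger scores
        eer, threshold = equal_error_rate(genuine, impostor)
        print(f"cross-sensor EER ~ {eer:.3f} at threshold {threshold:.3f}")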

  8. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    Science.gov (United States)

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  9. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has proven to be one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology, which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs as it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends based on SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models avoid the huge time cost of 3D EM simulations and can be easily included in any electronic design automation (EDA) flow, as the equation parameters can be easily extracted from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The results from the measurements of the fabricated devices have been compared to the derived models and show very good agreement. The matching can reach 100% by calibrating certain parameters in the model.
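
    The paper's own closed-form expressions are not reproduced in the abstract; the sketch below only illustrates how a measured-versus-model comparison of this kind can be automated, using a generic coupled-mode expression for the cross-port power of a directional coupler as a stand-in model. The coupler lengths, measured values, and fitted parameters are all invented for the example.

        # Fitting a generic directional-coupler model to hypothetical measurements.
        import numpy as np
        from scipy.optimize import curve_fit

        def cross_coupling(length_um, coupling_length_um, offset_um):
            """Generic coupled-mode cross-port power ratio (illustrative stand-in
            for the paper's closed-form SOI model)."""
            return np.sin(np.pi * (length_um + offset_um) / (2.0 * coupling_length_um)) ** 2

        lengths = np.array([5.0, 10.0, 15.0, 20.0, 25.0])    # coupler lengths (um)
        measured = np.array([0.12, 0.35, 0.63, 0.86, 0.97])  # hypothetical measurements

        params, _ = curve_fit(cross_coupling, lengths, measured, p0=[30.0, 2.0])
        predicted = cross_coupling(lengths, *params)
        rms_error = np.sqrt(np.mean((predicted - measured) ** 2))
        print(f"fitted coupling length ~ {params[0]:.1f} um, rms mismatch ~ {rms_error:.3f}")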

  10. Verification and nuclear material security

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2001-01-01

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  11. SHIELD verification and validation report

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation

  12. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error

  13. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive

  14. Hot-cell verification facility

    International Nuclear Information System (INIS)

    Eschenbaum, R.A.

    1981-01-01

    The Hot Cell Verification Facility (HCVF) was established as the test facility for the Fuels and Materials Examination Facility (FMEF) examination equipment. HCVF provides a prototypic hot cell environment to check the equipment for functional and remote operation. It also provides actual hands-on training for future FMEF Operators. In its two years of operation, HCVF has already provided data to make significant changes in items prior to final fabrication. It will also shorten the startup time in FMEF since the examination equipment will have been debugged and operated in HCVF

  15. Gas Sensor

    KAUST Repository

    Luebke, Ryan

    2015-01-22

    A gas sensor using a metal organic framework material can be fully integrated with related circuitry on a single substrate. In an on-chip application, the gas sensor can result in an area-efficient fully integrated gas sensor solution. In one aspect, a gas sensor can include a first gas sensing region including a first pair of electrodes, and a first gas sensitive material proximate to the first pair of electrodes, wherein the first gas sensitive material includes a first metal organic framework material.

  16. Gas Sensor

    KAUST Repository

    Luebke, Ryan; Eddaoudi, Mohamed; Omran, Hesham; Belmabkhout, Youssef; Shekhah, Osama; Salama, Khaled N.

    2015-01-01

    A gas sensor using a metal organic framework material can be fully integrated with related circuitry on a single substrate. In an on-chip application, the gas sensor can result in an area-efficient fully integrated gas sensor solution. In one aspect, a gas sensor can include a first gas sensing region including a first pair of electrodes, and a first gas sensitive material proximate to the first pair of electrodes, wherein the first gas sensitive material includes a first metal organic framework material.

  17. Sensor web

    Science.gov (United States)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web is formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first performing a coarse synchronization, which takes less power, and subsequently a fine synchronization of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responding, and then listen only during time slots corresponding to those pods that respond.
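
    The abstract describes a two-stage clock synchronization followed by neighbor discovery and slotted listening; the sketch below is a schematic rendering of that sequence, not the patented implementation, and all timing constants and the duty-cycle model are placeholders.

        # Schematic sensor-pod behavior (assumed names and constants).
        import random

        class SensorPod:
            def __init__(self, pod_id, clock_offset_s):
                self.pod_id = pod_id
                self.clock_offset_s = clock_offset_s   # error relative to the master clock
                self.listen_slots = set()

            def coarse_sync(self):
                # Low-power first pass: pull the clock to within roughly a second.
                self.clock_offset_s -= round(self.clock_offset_s)

            def fine_sync(self):
                # Costlier second pass: trim the residual offset (placeholder model).
                self.clock_offset_s *= 0.01

            def is_awake(self):
                return random.random() > 0.1           # placeholder duty-cycle model

            def discover_neighbors(self, neighbors):
                # Ping neighbors; remember responders and listen only in their slots.
                responders = [n for n in neighbors if n.is_awake()]
                self.listen_slots = {n.pod_id for n in responders}
                return responders

        pods = [SensorPod(i, random.uniform(-5.0, 5.0)) for i in range(4)]
        for p in pods:
            p.coarse_sync()
            p.fine_sync()
        pods[0].discover_neighbors(pods[1:])
        print(pods[0].listen_slots, [round(p.clock_offset_s, 4) for p in pods])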

  18. Measurement and Verification of Energy Savings and Performance from Advanced Lighting Controls

    Energy Technology Data Exchange (ETDEWEB)

    PNNL

    2016-02-21

    This document provides a framework for measurement and verification (M&V) of energy savings, performance, and user satisfaction from lighting retrofit projects involving occupancy-sensor-based, daylighting, and/or other types of automatic lighting. It was developed to provide site owners, contractors, and other involved organizations with the essential elements of a robust M&V plan for retrofit projects and to assist in developing specific project M&V plans.
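
    As a minimal illustration of the M&V arithmetic such a plan organizes (not PNNL's protocol; the wattages, hours, and control factors below are placeholders that a real project would measure), avoided lighting energy can be estimated as the baseline use adjusted to reporting-period conditions minus the metered post-retrofit use.

        # Illustrative lighting-retrofit savings calculation (assumed inputs).
        def lighting_savings_kwh(baseline_w, retrofit_w, reporting_hours,
                                 baseline_control_factor=1.0, retrofit_control_factor=0.7):
            """Avoided lighting energy (kWh) for one space over the reporting period.
            Control factors scale full-on energy for occupancy-sensor or daylighting
            dimming; the defaults are placeholders, not measured values."""
            adjusted_baseline = baseline_w * reporting_hours * baseline_control_factor / 1000.0
            reporting_use = retrofit_w * reporting_hours * retrofit_control_factor / 1000.0
            return adjusted_baseline - reporting_use

        # Hypothetical space: 1,200 W of fixtures retrofitted to 700 W with sensors.
        print(f"{lighting_savings_kwh(1200, 700, reporting_hours=3800):.0f} kWh saved")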

  19. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors that influence verification quality are established. Software optimality verification is analyzed and some metrics are defined for the verification process.

  20. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

    In the implementation of all arms control agreements, accurate verification is essential. In setting a course for verifying compliance with a given treaty, whether the NPT or the CWC, one must make a technical comparison of existing information-gathering capabilities against the constraints in an agreement. Then it must be decided whether this level of verifiability is good enough. Generally, the policy standard of 'effective verification' includes the ability to detect significant violations, with high confidence, in sufficient time to respond effectively with policy adjustments or other responses, as needed. It is at this juncture that verification approaches have traditionally diverged. Nuclear safeguards requirements have taken one path while chemical verification methods have pursued another. However, recent technological advances have brought a number of changes affecting verification, and lately their pace has been accelerating. First, all verification regimes have more and better information as a result of new kinds of sensors, imagery, and other technologies. Second, the verification provisions in agreements have also advanced, to include on-site inspections, portal monitoring, data exchanges, and a variety of transparency, confidence-building, and other cooperative measures. Together, these developments translate into a technological overlap between certain institutional verification measures, such as the NPT safeguards requirements overseen by the IAEA and the CWC verification provisions overseen by the OPCW. Hence, a priority of international treaty-implementing organizations is exploring the development of a synergistic and coordinated approach to WMD policy making that takes into account existing inter-linkages between nuclear, chemical, and biological weapons issues. Specific areas of coordination include harmonizing information systems and information exchanges and the shared application of scientific mechanisms, as well as collaboration on technological developments.

  1. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  2. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were also compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models

  3. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were also compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  4. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  5. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  6. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  7. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  8. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  9. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  10. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  11. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  12. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria and the presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
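
    The published criteria translate directly into a small decision rule. The sketch below encodes the major criterion (dystrophy area of 25% or more) and the two minor criteria from the abstract; the risk labels are paraphrased, and the function is only an illustration, not the validated scoring tool.

        # Illustrative decision rule following the criteria described in the abstract.
        def fingerprint_verification_risk(dystrophy_area_pct, long_horizontal_lines, long_vertical_lines):
            """Predicted risk of fingerprint verification failure in hand dermatitis
            (labels are paraphrased from the abstract)."""
            if dystrophy_area_pct >= 25:
                return "almost always fails verification"   # major criterion met
            minor = int(long_horizontal_lines) + int(long_vertical_lines)
            if minor == 2:
                return "high risk of verification failure"  # both minor criteria
            if minor == 1:
                return "low risk of verification failure"   # one minor criterion
            return "almost always passes verification"      # no criteria met

        print(fingerprint_verification_risk(10, True, False))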

  13. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  14. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  15. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  16. Chemical sensors

    International Nuclear Information System (INIS)

    Hubbard, C.W.; Gordon, R.L.

    1987-05-01

    The revolution in analytical chemistry promised by recent developments in the field of chemical sensors has potential for significant positive impact on both research and production activities conducted by and for the Department of Energy. Analyses which were, in the past, performed only with a roomful of expensive equipment can now be performed with miniature solid-state electronic devices or small optical probes. Progress in the development of chemical sensors has been rapid, and the field is currently growing at a great rate. Accordingly, Pacific Northwest Laboratory initiated a survey of recent literature so that contributors to active programs in research on analytical methods could be made aware of principles and applications of this new technology. This report presents the results of that survey. The sensors discussed here are divided into three types: micro solid-state devices, optical sensors, and piezoelectric crystal devices. The report is divided into three corresponding sections. The first section, ''Micro Solid-State Devices,'' discusses the design, operation, and application of electronic sensors that are produced in much the same way as standard solid-state electronic devices. The second section, ''Optrodes,'' covers the design and operation of chemical sensors that use fiber optics to detect chemically induced changes in optical properties. The final section, ''Piezoelectric Crystal Detectors,'' discusses two types of chemical sensors that depend on the changes in the properties of an oscillating piezoelectric crystal to detect the presence of certain materials. Advantages and disadvantages of each type of sensor are summarized in each section

  17. Sensor and Communication Network Technology for Harsh Environments in the Nuclear Power Plant

    International Nuclear Information System (INIS)

    Cho, Jai Wan; Choi, Young Soo; Lee, Jae Chul; Choi, Yu Rak; Jung, Gwang Il; Jung, Jong Eun; Park, Hee Yoon; Hong, Seok Bong; Koo, In Soo

    2008-02-01

    One of the challenges in harsh-environment qualification and verification of emerging I and C systems for nuclear power plants is to define the operational environment of these new I and C sensors and communication networks such that they are tested to the limits of a mission without requiring expensive overdesign. To aid this, the report defines, discusses and recommends environmental guidelines and verification requirements for using state-of-the-art RPS sensors, fiber optic communication systems, wireless communication and wireless smart sensors in nuclear harsh environments. This report focuses on advances in sensors (e.g., temperature, pressure, neutron and thermal power sensors) and their potential impact. Discussed are: radiation, thermal, electromagnetic, and electrical environment specifications. Presented are the typical performance data (survivability guidelines and experimental data), evaluation procedures and standard test methods for communication devices, state-of-the-art RPS sensors, and communication systems

  18. Sensor and Communication Network Technology for Harsh Environments in the Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Choi, Young Soo; Lee, Jae Chul; Choi, Yu Rak; Jung, Gwang Il; Jung, Jong Eun; Park, Hee Yoon; Hong, Seok Bong; Koo, In Soo

    2008-02-15

    One of the challenges in harsh-environment qualification and verification of emerging I and C systems for nuclear power plants is to define the operational environment of these new I and C sensors and communication networks such that they are tested to the limits of a mission without requiring expensive overdesign. To aid this, the report defines, discusses and recommends environmental guidelines and verification requirements for using state-of-the-art RPS sensors, fiber optic communication systems, wireless communication and wireless smart sensors in nuclear harsh environments. This report focuses on advances in sensors (e.g., temperature, pressure, neutron and thermal power sensors) and their potential impact. Discussed are: radiation, thermal, electromagnetic, and electrical environment specifications. Presented are the typical performance data (survivability guidelines and experimental data), evaluation procedures and standard test methods for communication devices, state-of-the-art RPS sensors, and communication systems.

  19. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
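
    The abstract does not spell out the verification method, so the sketch below only illustrates the kind of independent check such a verifier might apply: confirming that the computed equilibrium species amounts reproduce the imposed elemental abundances to within a tolerance. The stoichiometry matrix and mole values are invented for the example.

        # Illustrative mass-balance check on an equilibrium chemistry solution.
        import numpy as np

        def verify_mass_balance(stoich, species_moles, element_moles, rel_tol=1e-8):
            """Check that computed species amounts reproduce the imposed elemental
            abundances; stoich[i, j] = atoms of element j in species i."""
            reconstructed = stoich.T @ species_moles
            residual = np.abs(reconstructed - element_moles)
            scale = np.maximum(np.abs(element_moles), 1.0)
            return bool(np.all(residual <= rel_tol * scale)), residual

        # Illustrative system: species H2, O2, H2O over elements (H, O).
        stoich = np.array([[2, 0],    # H2
                           [0, 2],    # O2
                           [2, 1]])   # H2O
        species_moles = np.array([0.1, 0.05, 0.9])
        element_moles = stoich.T @ species_moles   # consistent by construction here
        ok, residual = verify_mass_balance(stoich, species_moles, element_moles)
        print(ok, residual)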

  20. Seismic verification of underground explosions

    International Nuclear Information System (INIS)

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 250 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper

  1. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point of sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
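
    A common baseline for verifying signatures from pen-position streams (a sketch under that assumption, not IBM's production matcher) is to compare a candidate trajectory against enrolled references with dynamic time warping and threshold the averaged distance; the threshold value is a tuning parameter.

        # Minimal DTW-based signature verification baseline (assumed inputs).
        import numpy as np

        def dtw_distance(a, b):
            """Dynamic time warping distance between two pen trajectories,
            given as numpy arrays of shape [n, 2] holding (x, y) samples."""
            n, m = len(a), len(b)
            cost = np.full((n + 1, m + 1), np.inf)
            cost[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    d = np.linalg.norm(a[i - 1] - b[j - 1])
                    cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
            return cost[n, m] / (n + m)   # length-normalized

        def verify(sample, references, threshold):
            """Accept if the mean distance to enrolled reference signatures is small."""
            score = np.mean([dtw_distance(sample, r) for r in references])
            return score <= threshold, score

        # Hypothetical usage with resampled (x, y) traces:
        # accepted, score = verify(candidate_xy, enrolled_xy_list, threshold=0.12)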

  2. The verification of ethnographic data.

    Science.gov (United States)

    Pool, Robert

    2017-09-01

    Anthropologists are increasingly required to account for the data on which they base their interpretations and to make it available for public scrutiny and re-analysis. While this may seem straightforward (why not place our data in online repositories?), it is not. Ethnographic 'data' may consist of everything from verbatim transcripts ('hard data') to memories and impressions ('soft data'). Hard data can be archived and re-analysed; soft data cannot. The focus on hard 'objective' data contributes to the delegitimizing of the soft data that are essential for ethnographic understanding, and without which hard data cannot be properly interpreted. However, the credibility of ethnographic interpretation requires the possibility of verification. This could be achieved by obligatory, standardised forms of personal storage with the option for audit if required, and by being more explicit in publications about the nature and status of the data and the process of interpretation.

  3. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  4. Currency verification by a 2D infrared barcode

    International Nuclear Information System (INIS)

    Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla

    2010-01-01

    Nowadays all the National Central Banks are continuously studying innovative anti-counterfeiting systems for banknotes. In this note, an innovative solution is proposed, which combines the potential of a hylemetric approach (a methodology conceptually similar to biometrics), based on the notes' intrinsic characteristics, with a well-known and consolidated 2D barcode identification system. In particular, in this note we propose to extract from the banknotes a unique binary control sequence (template) and insert an encrypted version of it in a barcode printed on the same banknote. For a more acceptable look and feel of a banknote, the superimposed barcode can be printed using IR ink that is visible to near-IR image sensors. This makes the banknote verification simpler. (technical design note)
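
    A schematic rendering of the proposed flow, not the authors' algorithm: derive a binary template from the note's intrinsic characteristics, protect it, print it in the 2D IR barcode, and at verification time compare it against a freshly extracted template. The feature extraction, the keyed tag (an HMAC stands in for the encryption described in the note), and the Hamming tolerance below are all placeholders.

        # Sketch of template-in-barcode banknote verification (assumed details).
        import hmac, hashlib
        import numpy as np

        SECRET_KEY = b"issuer-secret"   # placeholder for the issuer's key material

        def extract_template(note_image):
            """Placeholder for the hylemetric feature extraction: reduce the note's
            intrinsic characteristics to a fixed-length binary template."""
            return (note_image.mean(axis=0) > note_image.mean()).astype(np.uint8)

        def encode_barcode(template):
            """Data to print in the 2D IR barcode: template plus a keyed tag."""
            payload = template.tobytes()
            tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
            return payload, tag

        def verify_note(note_image, payload, tag, max_hamming=8):
            expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
            if not hmac.compare_digest(expected, tag):
                return False                                    # barcode content altered
            stored = np.frombuffer(payload, dtype=np.uint8)
            fresh = extract_template(note_image)
            return int(np.sum(stored != fresh)) <= max_hamming  # tolerate sensor noise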

  5. Automotive sensors

    Science.gov (United States)

    Marek, Jiri; Illing, Matthias

    2003-01-01

    Sensors are an essential component of most electronic systems in the car. They deliver input parameters for comfort features, engine and emission control as well as for the active and passive safety systems. New technologies such as silicon micromachining play an important role for the introduction of these sensors in all vehicle classes. The importance and use of these sensor technologies in today's automotive applications will be shown in this article. Finally an outlook on important current developments and new functions in the car will be given.

  6. Piezoceramic Sensors

    CERN Document Server

    Sharapov, Valeriy

    2011-01-01

    This book presents the latest and most complete information about various types of piezosensors. A sensor is a converter of a measured physical quantity into an electric signal. Piezoelectric transducers and sensors are based on piezoelectric effects. They have proven to be versatile tools for the measurement of various processes. They are used for quality assurance, process control and for research and development in many different industries. In each area of application, specific requirements for the parameters of transducers and sensors are developed. This book presents the fundamentals, technical des

  7. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  8. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  9. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  10. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  11. Optical Sensor (Optischer Sensor)

    OpenAIRE

    Brandenburg, A.; Hutter, F.; Edelhaeuser, R.

    1992-01-01

    WO 2010040565 A1: The integrated optical sensor comprises a first waveguide (4), a second waveguide (5) optically coupled to the first waveguide via a directional coupler, a substrate, which carries the first and the second waveguides, a single waveguide coupled with a light source, and an output waveguide coupled with a light-sensitive element. The sensor has a functional surface in the region of the directional coupler for deposition of the substance to...

  12. Wireless sensor

    Science.gov (United States)

    Lamberti, Vincent E.; Howell, JR, Layton N.; Mee, David K.; Sepaniak, Michael J.

    2016-02-09

    Disclosed is a sensor for detecting a target material. The sensor includes a ferromagnetic metal and a molecular recognition reagent coupled to the ferromagnetic metal. The molecular recognition reagent is operable to expand upon exposure to vapor or liquid from the target material such that the molecular recognition reagent changes a tensile stress upon the ferromagnetic metal. The target material is detected based on changes in the magnetic switching characteristics of the ferromagnetic metal caused by the changes in the tensile stress.

  13. Threats of Password Pattern Leakage Using Smartwatch Motion Recognition Sensors

    Directory of Open Access Journals (Sweden)

    Jihun Kim

    2017-06-01

    Full Text Available Thanks to the development of Internet of Things (IoT) technologies, wearable markets have been growing rapidly. Smartwatches can be said to be the most representative product in wearable markets, and they involve various hardware technologies in order to overcome the limitations of small hardware. Motion recognition sensors are a representative example of those hardware technologies. However, smartwatches and motion recognition sensors that can be worn by users may pose security threats of password pattern leakage. In the present paper, password patterns entered by users are inferred through experiments using motion recognition sensor data, and verification of the results and of their accuracy is shown.
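
    The paper's experimental pipeline is not detailed in the abstract; the sketch below only illustrates the general shape of such an attack: segment the wrist-motion stream into per-keystroke windows, compute simple features, and match each window against per-key templates learned beforehand. The feature set and the nearest-centroid classifier are assumptions.

        # Illustrative keystroke inference from smartwatch accelerometer windows.
        import numpy as np

        def window_features(accel_window):
            """Simple features from one keystroke's accelerometer window [n, 3]."""
            return np.concatenate([accel_window.mean(axis=0),
                                   accel_window.std(axis=0),
                                   [np.linalg.norm(accel_window, axis=1).max()]])

        class KeystrokeInferrer:
            """Nearest-centroid guesser for which key a wrist motion corresponds to."""
            def fit(self, windows, labels):
                feats = np.array([window_features(w) for w in windows])
                labels = np.array(labels)
                self.keys = sorted(set(labels))
                self.centroids = np.array([feats[labels == k].mean(axis=0)
                                           for k in self.keys])
                return self

            def predict(self, window):
                f = window_features(window)
                return self.keys[int(np.argmin(np.linalg.norm(self.centroids - f, axis=1)))]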

  14. SENSOR++: Simulation of Remote Sensing Systems from Visible to Thermal Infrared

    Science.gov (United States)

    Paproth, C.; Schlüßler, E.; Scherbaum, P.; Börner, A.

    2012-07-01

    During the development process of a remote sensing system, the optimization and the verification of the sensor system are important tasks. To support these tasks, the simulation of the sensor and its output is valuable. This enables the developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which calculates where the sensor is looking by using a ray-tracing algorithm. This also determines whether the observed part of the scene is shadowed or not. The second part describes the radiometry and results in the spectral at-sensor radiance from the visible spectrum to the thermal infrared according to the simulated sensor type. In the case of earth remote sensing, it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by using an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, such as a camera simulation for a moon lander. Finally, the verification of SENSOR++ is demonstrated.
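
    The three parts map naturally onto a staged pipeline. The sketch below shows only that data flow, with every physical model replaced by a placeholder; none of the function names or constants come from SENSOR++ itself.

        # Schematic geometric -> radiometric -> sensor pipeline (all physics stubbed).
        import numpy as np

        def geometric_stage(pixel, scene):
            """Ray tracing: decide what the pixel sees and whether it is shadowed (placeholder)."""
            return {"hit": scene["target"], "shadowed": False}

        def radiometric_stage(hit, bands, atmosphere=None):
            """Spectral at-sensor radiance per band, optionally through an atmosphere model (placeholder)."""
            return np.full(len(bands), 1.0 if not hit["shadowed"] else 0.2)

        def sensor_stage(radiance, gain=100.0, noise_sigma=1.0, bits=12):
            """Optical plus electronic sensor model: radiance to a digital number."""
            dn = gain * radiance.sum() + np.random.normal(0.0, noise_sigma)
            return int(np.clip(round(dn), 0, 2 ** bits - 1))

        scene = {"target": "lander test chart"}
        bands = ["vis", "nir", "tir"]
        hit = geometric_stage((0, 0), scene)
        print(sensor_stage(radiometric_stage(hit, bands)))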

  15. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  16. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  17. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  18. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  19. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  20. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  1. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far, however, IMC verification has not considered compositionality properties, but has instead considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  2. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  3. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  4. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  5. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria, and the items most relevant to waste disposal models were identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  6. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, the verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
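
    The abstract defines the verification risk function and the marginal risks without giving their formulas, so the sketch below is only a loose illustration of the allocation idea: repeatedly examine the subsystem whose next examination yields the largest estimated risk reduction, until the stop criterion is met. The risk model, reduction factors, and example numbers are all assumptions.

        # Greedy, marginal-risk-driven examination allocation (illustrative model).
        def allocate_examinations(subsystems, stop_risk):
            """subsystems: dict name -> {"loss": consequence of an undetected failure,
                                         "p_undetected": prob. a defect slips through,
                                         "reduction": factor by which one examination
                                                      reduces p_undetected}"""
            plan = []
            def total_risk():
                return sum(s["loss"] * s["p_undetected"] for s in subsystems.values())
            while total_risk() > stop_risk:
                # Marginal benefit of one more examination of each subsystem.
                gains = {name: s["loss"] * s["p_undetected"] * (1 - s["reduction"])
                         for name, s in subsystems.items()}
                best = max(gains, key=gains.get)
                if gains[best] <= 0:
                    break
                subsystems[best]["p_undetected"] *= subsystems[best]["reduction"]
                plan.append(best)
            return plan, total_risk()

        subsystems = {"ballast control": {"loss": 10.0, "p_undetected": 0.2, "reduction": 0.5},
                      "power mgmt":      {"loss": 4.0,  "p_undetected": 0.3, "reduction": 0.6}}
        print(allocate_examinations(subsystems, stop_risk=0.5))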

  7. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
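
    For the statistical-sampling component, a standard zero-defect attribute-sampling calculation (generic, not taken from this report) gives the probability of detecting at least one falsified item and the sample size needed for a detection goal.

        from math import comb

        def detection_probability(N, d, n):
            """Probability that a random sample of n items from a population of N
            catches at least one of d falsified items (hypergeometric, zero-defect plan)."""
            if n > N - d:
                return 1.0
            return 1.0 - comb(N - d, n) / comb(N, n)

        def sample_size(N, d, target=0.95):
            """Smallest sample size giving at least the target detection probability."""
            for n in range(1, N + 1):
                if detection_probability(N, d, n) >= target:
                    return n
            return N

        # Illustrative numbers only: 400 fuel elements, 20 assumed falsified, 95% goal.
        print(sample_size(N=400, d=20, target=0.95))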

  8. Tomotherapy: IMRT and tomographic verification

    International Nuclear Information System (INIS)

    Mackie, T.R.

    2000-01-01

    include MLC's and many clinics use them to replace 90% or more of the field-shaping requirements of conventional radiotherapy. Now, several academic centers are treating patients with IMRT using conventional MLC's to modulate the field. IMRT using conventional MLC's has the advantage that the patient is stationary during the treatment and the MLC's can be used in conventional practice. Nevertheless, tomotherapy using the Peacock system delivers the most conformal dose distributions of any commercial system to date. The biggest limitation of both the NOMOS Peacock tomotherapy system and conventional MLC's for IMRT delivery is the lack of treatment verification. In conventional few-field radiotherapy one relied on portal images to determine whether the patient was set up correctly and the beams were correctly positioned. With IMRT the image contrast is superimposed on the beam intensity variation. Conventional practice allowed for monitor unit calculation checks and point dosimeters placed on the patient's surface to verify that the treatment was properly delivered. With IMRT it is impossible to perform hand calculations of monitor units, and dosimeters placed on the patient's surface are prone to error due to high gradients in the beam intensity. NOMOS has developed a verification phantom that allows multiple sheets of film to be placed in a light-tight box that is irradiated with the same beam pattern that is used to treat the patient. The optical density of the films is adjusted, normalized, and calibrated and then quantitatively compared with the dose calculated for the phantom delivery. However, this process is too laborious to be used for patient-specific QA. If IMRT becomes ubiquitous and it can be shown that IMRT is useful for most treatment sites, then there is a need to design treatment units dedicated to IMRT delivery and verification. Helical tomotherapy is such a redesign. Helical tomotherapy is the delivery of a rotational fan beam while the patient is

  9. Verification of excess defense material

    International Nuclear Information System (INIS)

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-01-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials

  10. Dosimetric verification of IMRT plans

    International Nuclear Information System (INIS)

    Bulski, W.; Cheimicski, K.; Rostkowska, J.

    2012-01-01

    Intensity modulated radiotherapy (IMRT) is a complex procedure requiring proper dosimetric verification. IMRT dose distributions are characterized by steep dose gradients which make it possible to spare organs at risk and allow for an escalation of the dose to the tumor. They require a large number of radiation beams (sometimes over 10). Fluence measurements for individual beams are not sufficient to evaluate the total dose distribution and to assure patient safety. The methods used at the Centre of Oncology in Warsaw are presented. In order to measure dose distributions in various cross-sections, film dosimeters were used (radiographic Kodak EDR2 films and radiochromic Gafchromic EBT films). The film characteristics were carefully examined. Several types of tissue equivalent phantoms were developed. A methodology for comparing measured dose distributions against the distributions calculated by treatment planning systems (TPS) was developed and tested. The tolerance level for this comparison was set at a 3% difference in dose and 3 mm in distance to agreement. The so-called gamma formalism was used. The results of these comparisons for a group of over 600 patients are presented. Agreement was found in 87% of cases. This film dosimetry methodology was used as a benchmark to test and validate the performance of commercially available 2D and 3D matrices of detectors (ionization chambers or diodes). The results of these validations are also presented. (authors)
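
    The gamma formalism mentioned above is standard (dose difference and distance-to-agreement combined); a simplified one-dimensional sketch with a global 3%/3 mm criterion and invented dose profiles is shown below.

        import numpy as np

        def gamma_1d(x, d_ref, d_eval, dose_tol=0.03, dist_tol=3.0):
            """Simplified global 1-D gamma index.
            x: positions in mm; d_ref/d_eval: reference (measured) and evaluated
            (calculated) doses sampled on the same grid."""
            d_norm = dose_tol * d_ref.max()          # global dose criterion
            gamma = np.empty_like(d_ref)
            for i, (xi, di) in enumerate(zip(x, d_ref)):
                capital_gamma = np.sqrt(((x - xi) / dist_tol) ** 2 +
                                        ((d_eval - di) / d_norm) ** 2)
                gamma[i] = capital_gamma.min()
            return gamma

        # Hypothetical profiles: measured vs. calculated dose along one film axis.
        x = np.arange(0.0, 100.0, 1.0)                       # mm
        d_meas = np.exp(-((x - 50.0) / 20.0) ** 2)           # arbitrary units
        d_calc = 1.02 * np.exp(-((x - 51.0) / 20.0) ** 2)    # ~2% and 1 mm off
        g = gamma_1d(x, d_meas, d_calc)
        print(f"gamma pass rate (gamma <= 1): {100 * np.mean(g <= 1.0):.1f} %")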

  11. Advances in the Processing of VHR Optical Imagery in Support of Safeguards Verification

    International Nuclear Information System (INIS)

    Niemeyer, I.; Listner, C.; Canty, M.

    2015-01-01

    Under the Additional Protocol of the Non-Proliferation Treaty (NPT) complementing the safeguards agreements between States and the International Atomic Energy Agency, commercial satellite imagery, preferably acquired by very high-resolution (VHR) satellite sensors, is an important source of safeguards-relevant information. Satellite imagery can assist in the evaluation of site declarations, design information verification, the detection of undeclared nuclear facilities, and the preparation of inspections or other visits. With the IAEA's Geospatial Exploitation System (GES), satellite imagery and other geospatial information such as site plans of nuclear facilities are available for a broad range of inspectors, analysts and country officers. The demand for spatial information and new tools to analyze this data is growing, together with the rising number of nuclear facilities under safeguards worldwide. Automated, computer-driven processing of satellite imagery could therefore add considerable value to the safeguards verification process. Examples include satellite imagery pre-processing algorithms specially developed for new sensors, tools for pixel- or object-based image analysis, and geoprocessing tools that generate additional safeguards-relevant information. In the last decade, procedures for the automated (pre-)processing of satellite imagery have evolved considerably. This paper aims at testing pixel-based and object-based procedures for automated change detection and classification in support of safeguards verification. Taking different nuclear sites as examples, these methods are evaluated and compared with regard to their suitability for (semi-)automatically extracting safeguards-relevant information. (author)
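
    The specific algorithms evaluated are not reproduced here; as a minimal illustration of pixel-based change detection between two co-registered acquisitions, the sketch below thresholds a difference image, with synthetic data standing in for real scenes. Operational safeguards workflows would use considerably more robust methods and object-based post-processing.

        import numpy as np

        def change_mask(img_t1, img_t2, k=2.0):
            """Pixel-based change detection between two co-registered, radiometrically
            normalized single-band images: flag pixels whose difference deviates from
            the mean difference by more than k standard deviations."""
            diff = img_t2.astype(float) - img_t1.astype(float)
            return np.abs(diff - diff.mean()) > k * diff.std()

        # Hypothetical 100x100 scenes: a bright "new structure" appears in the second one.
        rng = np.random.default_rng(0)
        t1 = rng.normal(100.0, 5.0, (100, 100))
        t2 = t1 + rng.normal(0.0, 5.0, (100, 100))
        t2[40:50, 60:70] += 60.0                     # simulated new structure
        mask = change_mask(t1, t2)
        print(f"changed pixels: {mask.sum()} of {mask.size}")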

  12. Verification of Remote Inspection Techniques for Reactor Internal Structures of Liquid Metal Reactor

    International Nuclear Information System (INIS)

    Joo, Young Sang; Lee, Jae Han

    2007-02-01

    The reactor internal structures and components of a liquid metal reactor (LMR) are submerged in the hot sodium of the reactor vessel. Division 3 of ASME Code Section XI specifies visual inspection as the major in-service inspection (ISI) method for reactor internal structures and components. Reactor internals of an LMR cannot be visually examined because the liquid sodium is opaque. Under-sodium viewing techniques using ultrasonic waves should therefore be applied for the visual inspection of reactor internals. Recently, an ultrasonic waveguide sensor with a strip plate has been developed for application to under-sodium inspection. In this study, a visualization technique, a ranging technique and a monitoring technique have been suggested for the remote inspection of reactor internals using the waveguide sensor. The feasibility of these remote inspection techniques using the ultrasonic waveguide sensor has been evaluated by experimental verification
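
    The ranging technique is not detailed in the abstract; a generic pulse-echo time-of-flight estimate is sketched below. The assumed sound speed (about 2400 m/s for hot liquid sodium) is only a rough placeholder.

        import numpy as np

        def pulse_echo_range(signal, fs, c=2400.0, threshold=0.5):
            """Estimate target distance from a pulse-echo record.
            signal: received waveform (pulse fired at sample 0), fs: sampling rate (Hz),
            c: assumed sound speed in the medium (m/s); 2400 m/s is only a rough
            placeholder for hot liquid sodium."""
            env = np.abs(signal)
            first_echo = np.argmax(env > threshold * env.max())
            t_flight = first_echo / fs
            return c * t_flight / 2.0                # two-way travel

        # Hypothetical record: echo arriving 0.5 ms after transmission, sampled at 10 MHz.
        fs = 10e6
        sig = np.zeros(20000)
        sig[5000] = 1.0                              # echo at t = 0.5 ms
        print(f"estimated range: {pulse_echo_range(sig, fs):.3f} m")   # ~0.6 m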

  13. Verification of Remote Inspection Techniques for Reactor Internal Structures of Liquid Metal Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Young Sang; Lee, Jae Han

    2007-02-15

    The reactor internal structures and components of a liquid metal reactor (LMR) are submerged in the hot sodium of the reactor vessel. Division 3 of ASME Code Section XI specifies visual inspection as the major in-service inspection (ISI) method for reactor internal structures and components. Reactor internals of an LMR cannot be visually examined because the liquid sodium is opaque. Under-sodium viewing techniques using ultrasonic waves should therefore be applied for the visual inspection of reactor internals. Recently, an ultrasonic waveguide sensor with a strip plate has been developed for application to under-sodium inspection. In this study, a visualization technique, a ranging technique and a monitoring technique have been suggested for the remote inspection of reactor internals using the waveguide sensor. The feasibility of these remote inspection techniques using the ultrasonic waveguide sensor has been evaluated by experimental verification.

  14. Radiation sensor

    International Nuclear Information System (INIS)

    Brown, W.L.; Geronime, R.L.

    1977-01-01

    Radiation sensor and thermocouple, respectively, which can be used for reactor in-core instrumentation. The radiation sensor consists of an Inconel conductor wire and a rhodium emitter wire, the thermocouple of two intertwined alumel or chromel wires. Both are arranged in the center of a metal tube from which they are separated by an insulator made of SiO2 fibers. This insulator is first introduced as a loose fabric between the radiation sensor or thermocouple, respectively, and the metal tube, and is then compacted to a density of 35-73% of pure SiO2 by drawing the tube. There is no need for soldering or welding. The insulation resistivity at room temperature is between 10^14 and 10^15 ohms. (ORU)

  15. Water Sensors

    Science.gov (United States)

    1992-01-01

    Mike Morris, former Associate Director of STAC, formed pHish Doctor, Inc. to develop and sell a pH monitor for home aquariums. The monitor, or pHish Doctor, consists of a sensor strip and color chart that continually measures pH levels in an aquarium. This is important because when the level gets too high, ammonia excreted by fish is highly toxic; at low pH, bacteria that normally break down waste products stop functioning. Sales have run into the tens of thousands of dollars. A NASA Tech Brief Technical Support Package later led to a salt water version of the system and a DoE Small Business Innovation Research (SBIR) grant for development of a sensor for sea buoys. The company, now known as Ocean Optics, Inc., is currently studying the effects of carbon dioxide buildup as well as exploring other commercial applications for the fiber optic sensor.

  16. TomoTherapy MLC verification using exit detector data

    Energy Technology Data Exchange (ETDEWEB)

    Chen Quan; Westerly, David; Fang Zhenyu; Sheng, Ke; Chen Yu [TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States); Department of Radiation Oncology, University of Colorado School of Medicine, Aurora, Colorado 80045 (United States); Xinghua Cancer Hospital, Xinghua, Jiangsu 225700 (China); Department of Radiation Oncology, University of California-Los Angeles, Los Angeles, California 90095 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States)

    2012-01-15

    Purpose: Treatment delivery verification (DV) is important in the field of intensity modulated radiation therapy (IMRT). While IMRT and image guided radiation therapy (IGRT) allow us to create more conformal plans and enable the use of tighter margins, an erroneously executed plan can have detrimental effects on the treatment outcome. The purpose of this study is to develop a DV technique to verify TomoTherapy's multileaf collimator (MLC) using the onboard mega-voltage CT detectors. Methods: The proposed DV method uses temporal changes in the MVCT detector signal to predict actual leaf open times delivered on the treatment machine. Penumbra and scattered radiation effects may produce confounding results when determining leaf open times from the raw detector data. To reduce the impact of these effects, an iterative Richardson-Lucy (R-L) deconvolution algorithm is applied. Optical sensors installed on each MLC leaf are used to verify the accuracy of the DV technique. The robustness of the DV technique is examined by introducing different attenuation materials in the beam. Additionally, the DV technique has been used to investigate several clinical plans which failed to pass delivery quality assurance (DQA) and was successful in identifying MLC timing discrepancies as the root cause. Results: The leaf open time extracted from the exit detector showed good agreement with the optical sensors under a variety of conditions. Detector-measured leaf open times agreed with optical sensor data to within 0.2 ms, and 99% of the results agreed within 8.5 ms. These results changed little when attenuation was added in the beam. For the clinical plans failing DQA, the dose calculated from reconstructed leaf open times played an instrumental role in discovering the root cause of the problem. Throughout the retrospective study, it was found that the reconstructed dose always agrees with measured doses to within 1%. Conclusions: The exit detectors in the TomoTherapy treatment
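
    The Richardson-Lucy algorithm referenced above is standard; a minimal one-dimensional sketch with an invented detector signal and response kernel (not TomoTherapy data or the authors' implementation) is given below.

        import numpy as np

        def richardson_lucy(observed, psf, iterations=50):
            """1-D Richardson-Lucy deconvolution: iteratively estimates the latent
            signal (here, a leaf-open profile) blurred by a known response kernel."""
            psf_mirror = psf[::-1]
            estimate = np.full_like(observed, observed.mean())
            for _ in range(iterations):
                blurred = np.convolve(estimate, psf, mode="same")
                ratio = observed / np.maximum(blurred, 1e-12)
                estimate *= np.convolve(ratio, psf_mirror, mode="same")
            return estimate

        # Hypothetical example: a rectangular "leaf open" pulse blurred by a smooth kernel.
        truth = np.zeros(200); truth[80:120] = 1.0
        psf = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2); psf /= psf.sum()
        observed = np.convolve(truth, psf, mode="same")
        recovered = richardson_lucy(observed, psf, iterations=100)
        print(f"max abs error after deconvolution: {np.abs(recovered - truth).max():.3f}")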

  17. TomoTherapy MLC verification using exit detector data

    International Nuclear Information System (INIS)

    Chen Quan; Westerly, David; Fang Zhenyu; Sheng, Ke; Chen Yu

    2012-01-01

    Purpose: Treatment delivery verification (DV) is important in the field of intensity modulated radiation therapy (IMRT). While IMRT and image guided radiation therapy (IGRT) allow us to create more conformal plans and enable the use of tighter margins, an erroneously executed plan can have detrimental effects on the treatment outcome. The purpose of this study is to develop a DV technique to verify TomoTherapy's multileaf collimator (MLC) using the onboard mega-voltage CT detectors. Methods: The proposed DV method uses temporal changes in the MVCT detector signal to predict actual leaf open times delivered on the treatment machine. Penumbra and scattered radiation effects may produce confounding results when determining leaf open times from the raw detector data. To reduce the impact of these effects, an iterative Richardson-Lucy (R-L) deconvolution algorithm is applied. Optical sensors installed on each MLC leaf are used to verify the accuracy of the DV technique. The robustness of the DV technique is examined by introducing different attenuation materials in the beam. Additionally, the DV technique has been used to investigate several clinical plans which failed to pass delivery quality assurance (DQA) and was successful in identifying MLC timing discrepancies as the root cause. Results: The leaf open time extracted from the exit detector showed good agreement with the optical sensors under a variety of conditions. Detector-measured leaf open times agreed with optical sensor data to within 0.2 ms, and 99% of the results agreed within 8.5 ms. These results changed little when attenuation was added in the beam. For the clinical plans failing DQA, the dose calculated from reconstructed leaf open times played an instrumental role in discovering the root cause of the problem. Throughout the retrospective study, it was found that the reconstructed dose always agrees with measured doses to within 1%. Conclusions: The exit detectors in the TomoTherapy treatment systems

  18. Practical Use Technique of Sensor

    International Nuclear Information System (INIS)

    Hwang, Gyu Seop

    1985-11-01

    This book covers practical sensor technology, introducing recent trends in sensors for the electronics industry: IC temperature sensors, radiation temperature sensors based on surface acoustic waves, optical fiber temperature sensors, polyelectrolyte film humidity sensors, semiconductor pressure sensors for industrial instrumentation, silicon integrated pressure sensors, thick-film humidity sensors and their applications, reflection-type photo sensors, and color sensors. It also deals with sensors for factory automation (FA), robots, and the chemical industry.

  19. Practical Use Technique of Sensor

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Gyu Seop

    1985-11-15

    This book covers practical sensor technology, introducing recent trends in sensors for the electronics industry: IC temperature sensors, radiation temperature sensors based on surface acoustic waves, optical fiber temperature sensors, polyelectrolyte film humidity sensors, semiconductor pressure sensors for industrial instrumentation, silicon integrated pressure sensors, thick-film humidity sensors and their applications, reflection-type photo sensors, and color sensors. It also deals with sensors for factory automation (FA), robots, and the chemical industry.

  20. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  1. Chemical sensor

    Science.gov (United States)

    Rauh, R. David (Inventor)

    1990-01-01

    A sensor for detecting a chemical substance includes an insertion element having a structure which enables insertion of the chemical substance with a resulting change in the bulk electrical characteristics of the insertion element under conditions sufficient to permit effective insertion; the change in the bulk electrical characteristics of the insertion element is detected as an indication of the presence of the chemical substance.

  2. Load sensor

    NARCIS (Netherlands)

    Van den Ende, D.; Almeida, P.M.R.; Dingemans, T.J.; Van der Zwaag, S.

    2007-01-01

    The invention relates to a load sensor comprising a polymer matrix and a piezo-ceramic material such as PZT, embedded in the polymer matrix, which together form a composite, wherein the polymer matrix is a liquid crystalline resin, and wherein the piezo-ceramic material is a PZT powder

  3. Gas sensor

    Science.gov (United States)

    Schmid, Andreas K.; Mascaraque, Arantzazu; Santos, Benito; de la Figuera, Juan

    2014-09-09

    A gas sensor is described which incorporates a sensor stack comprising a first film layer of a ferromagnetic material, a spacer layer, and a second film layer of the ferromagnetic material. The first film layer is fabricated so that it exhibits a dependence of its magnetic anisotropy direction on the presence of a gas. That is, the orientation of the easy axis of magnetization will flip from out-of-plane to in-plane when the gas to be detected is present in sufficient concentration. By monitoring the change in resistance of the sensor stack when the orientation of the first layer's magnetization changes, and correlating that change with temperature, one can determine both the identity and relative concentration of the detected gas. In one embodiment the sensor stack comprises a top ferromagnetic layer of cobalt, two monolayers thick, deposited upon a spacer layer of ruthenium, which in turn has a second layer of cobalt disposed on its other side, with this second cobalt layer in contact with a programmable heater chip.

  4. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans...

  5. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630--670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime

  6. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  7. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrive at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to insure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to insure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  8. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  9. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
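
    The final fusion step can be illustrated generically; the sketch below is not the KVRL-fcDBN pipeline, only a product-of-likelihood-ratios fusion of hypothetical face and kinship scores under assumed Gaussian score models.

        import math

        def gauss_pdf(x, mean, std):
            """Gaussian probability density, used as a simple score model."""
            return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

        def likelihood_ratio(score, genuine, impostor):
            """LR = p(score | genuine) / p(score | impostor)."""
            return gauss_pdf(score, *genuine) / gauss_pdf(score, *impostor)

        # Hypothetical (mean, std) score distributions learned on a training set.
        face_genuine, face_impostor = (0.8, 0.10), (0.3, 0.15)
        kin_genuine,  kin_impostor  = (0.7, 0.15), (0.4, 0.15)

        def fused_decision(face_score, kin_score, threshold=1.0):
            """Fuse the two modalities by multiplying their likelihood ratios."""
            lr = (likelihood_ratio(face_score, face_genuine, face_impostor) *
                  likelihood_ratio(kin_score,  kin_genuine,  kin_impostor))
            return lr, lr > threshold

        print(fused_decision(face_score=0.75, kin_score=0.65))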

  10. Semiconductor sensors

    International Nuclear Information System (INIS)

    Hartmann, Frank

    2011-01-01

    Semiconductor sensors have been around since the 1950s and today, every high energy physics experiment has one in its repertoire. In Lepton as well as Hadron colliders, silicon vertex and tracking detectors led to the most amazing physics and will continue doing so in the future. This contribution tries to depict the history of these devices exemplarily without being able to honor all important developments and installations. The current understanding of radiation damage mechanisms and recent R and D topics demonstrating the future challenges and possible technical solutions for the SLHC detectors are presented. Consequently semiconductor sensor candidates for an LHC upgrade and a future linear collider are also briefly introduced. The work presented here is a collage of the work of many individual silicon experts spread over several collaborations across the world.

  11. Load sensor

    OpenAIRE

    Van den Ende, D.; Almeida, P.M.R.; Dingemans, T.J.; Van der Zwaag, S.

    2007-01-01

    The invention relates to a load sensor comprising a polymer matrix and a piezo-ceramic material such as PZT, embedded in the polymer matrix, which together form a composite, wherein the polymer matrix is a liquid crystalline resin, and wherein the piezo-ceramic material is a PZT powder forming 30-60% by volume of the composite, and wherein the PZT powder forms 40-50% by volume of the composite.

  12. Image Sensor

    OpenAIRE

    Jerram, Paul; Stefanov, Konstantin

    2017-01-01

    An image sensor of the type for providing charge multiplication by impact ionisation has a plurality of multiplication elements. Each element is arranged to receive charge from photosensitive elements of an image area and each element comprises a sequence of electrodes to move charge along a transport path. Each of the electrodes has an edge defining a boundary with a first electrode, a maximum width across the charge transport path and a leading edge that defines a boundary with a second elect...

  13. Optical Sensor

    OpenAIRE

    Brandenburg, A.; Fischer, A.

    1995-01-01

    An optical sensor (1) comprising an integrated optical arrangement has a waveguide (4) and at least one diffraction grating (5) arranged in this waveguide. Light can be launched into the waveguide via the diffraction grating. In the reflection area of the diffraction grating, part of the light is dispersed through the waveguide at the beam angle for which the launch conditions, and thus the diffraction in the waveguide, are fulfilled, so that, at this angle, a dark line (14) occurs whose position is evalu...

  14. Gas sensor

    International Nuclear Information System (INIS)

    Dorogan, V.; Korotchenkov, Gh.; Vieru, T.; Prodan, I.

    2003-01-01

    The invention relates to gas sensors based on metal-oxide films (SnO, InO), which may be used for environmental control, in fire-extinguishing systems, etc. The gas sensor includes an insulating substrate, an active layer, and a resistive layer with ohmic contacts. The resistive layer has two or more regions with different resistances, and on the active layer there are two or more pairs of ohmic contacts

  15. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  16. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  17. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  18. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  19. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  20. Core power capability verification for PWR NPP

    International Nuclear Information System (INIS)

    Xian Chunyu; Liu Changwen; Zhang Hong; Liang Wei

    2002-01-01

    The principle and methodology of core power capability verification for reload of a pressurized water reactor nuclear power plant are introduced. The radial and axial power distributions for normal operation (category I or condition I) and abnormal operation (category II or condition II) are simulated using a neutronics calculation code. The linear power density margin and DNBR margin for both categories, which reflect core safety, are analyzed from the point of view of reactor physics and thermal-hydraulics (T/H), and thus the category I operating domain and the category II protection set points are verified. In addition, the verification results for a reference NPP are also given

  1. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and
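
    Carryover is commonly estimated from a sequence of high-concentration runs followed by low-concentration runs; one widely used convention (not necessarily the authors' protocol) is sketched below with invented counts.

        def carryover_percent(high, low):
            """Carryover estimate from three consecutive high-sample runs followed by
            three low-sample runs: 100 * (L1 - L3) / (H3 - L3). One common convention;
            acceptance limits are analyzer- and analyte-specific."""
            h1, h2, h3 = high
            l1, l2, l3 = low
            return 100.0 * (l1 - l3) / (h3 - l3)

        # Hypothetical WBC counts (10^9/L) for a high and a low control sample.
        print(f"carryover = {carryover_percent((95.1, 94.8, 95.0), (2.61, 2.55, 2.54)):.2f} %")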

  2. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  3. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  4. Development of a Meso-Scale Fiberoptic Rotation Sensor for a Torsion Actuator.

    Science.gov (United States)

    Sheng, Jun; Desai, Jaydev P

    2018-01-01

    This paper presents the development of a meso-scale fiberoptic rotation sensor for a shape memory alloy (SMA) torsion actuator for neurosurgical applications. Within the sensor, a rotary head with a reflecting surface modulates the light intensity collected by optical fibers when the rotary head is coupled to the torsion actuator. The mechanism of light intensity modulation is modeled, followed by experimental model verification. The working performance of different rotary head designs, optical fibers, and fabrication materials is also compared. After calibration of the fiberoptic rotation sensor, the sensor is capable of precisely measuring rotary motion and controlling the SMA torsion actuator with feedback control.
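
    The calibration step can be illustrated generically: fit a smooth curve mapping measured light intensity to rotation angle from known reference positions and invert it during operation. The data points and polynomial order below are hypothetical.

        import numpy as np

        # Hypothetical calibration data: known rotation angles (deg) of the rotary head
        # versus normalized light intensity collected by the receiving fiber.
        angle_ref = np.array([0., 15., 30., 45., 60., 75., 90.])
        intensity = np.array([1.00, 0.91, 0.78, 0.62, 0.45, 0.28, 0.12])

        # Fit angle as a function of intensity (cubic is an arbitrary choice here).
        coeffs = np.polyfit(intensity, angle_ref, deg=3)
        intensity_to_angle = np.poly1d(coeffs)

        # During operation: convert a measured intensity into a rotation estimate.
        measured = 0.55
        print(f"estimated rotation: {intensity_to_angle(measured):.1f} deg")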

  5. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  6. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  7. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  8. Optical Imaging Sensors and Systems for Homeland Security Applications

    CERN Document Server

    Javidi, Bahram

    2006-01-01

    Optical and photonic systems and devices have significant potential for homeland security. Optical Imaging Sensors and Systems for Homeland Security Applications presents original and significant technical contributions from leaders of industry, government, and academia in the field of optical and photonic sensors, systems and devices for detection, identification, prevention, sensing, security, verification and anti-counterfeiting. The chapters have recent and technically significant results, ample illustrations, figures, and key references. This book is intended for engineers and scientists in the relevant fields, graduate students, industry managers, university professors, government managers, and policy makers. Advanced Sciences and Technologies for Security Applications focuses on research monographs in the areas of -Recognition and identification (including optical imaging, biometrics, authentication, verification, and smart surveillance systems) -Biological and chemical threat detection (including bios...

  9. First investigations on the safety evaluation of smart sensors

    International Nuclear Information System (INIS)

    Bousquet, S.; Elsensohn, O.

    2001-10-01

    IPSN (Institute for Protection and Nuclear Safety) is the technical support for the French nuclear safety authority and thus involved in the safety evaluation of new I and C technologies and particularly of smart sensors. Smart sensors are characterized by the use of a microprocessor that converts the process variable into digital signals and exchanges other information with I and C control systems. There are two types of smart sensors: HART (Highway Addressable Remote Transducer) sensors, which provide both analogue (4 to 20 mA) and digital signals, and network sensors, which provide only digital signals. The expected benefits for operators are improved accuracy and reliability and cost savings in installation, commissioning, testing and maintenance. Safety evaluation of these smart sensors raises new issues: How does the sensor react to unknown commands? How to avoid unexpected changes in configuration? What is its sensitivity to electromagnetic interferences (EMI), to radiations...? In order to evaluate whether these sensors can be qualified for a safety application and to define the qualification tests to be done, IPSN has planned some functional and hardware tests (EMI, radiations) on 'HART' and field bus sensors. During the functional tests, we were not able to disrupt the HART tested sensors by invalid commands. However, these results cannot be extended to other sensors, because of the use of different technology, of different versions of hardware and software and of constructors' specific commands. Furthermore, easy modifications of configuration parameters can cause additional failures. Environmental tests are in progress on HART sensors and will be followed by experiments on field bus sensors. These preliminary investigations and the latest incident initiated by an incorrect computing algorithm of digital switchgear at Ringhals NPP, clearly illustrate that testing and verification programmes for smart equipment must be meticulously designed and reviewed

  10. First investigations on the safety evaluation of smart sensors

    Energy Technology Data Exchange (ETDEWEB)

    Bousquet, S.; Elsensohn, O. [CEA Fontenay aux Roses, 92 (France). Inst. de Protection et de Surete Nucleaire; Benoit, G. [CEA Saclay, Dir. de la Recherche Technologique DRT, 91 - Gif sur Yvette (France)

    2001-10-01

    IPSN (Institute for Protection and Nuclear Safety) is the technical support for the French nuclear safety authority and thus involved in the safety evaluation of new I and C technologies and particularly of smart sensors. Smart sensors are characterized by the use of a microprocessor that converts the process variable into digital signals and exchanges other information with I and C control systems. There are two types of smart sensors: HART (Highway Addressable Remote Transducer) sensors, which provide both analogue (4 to 20 mA) and digital signals, and network sensors, which provide only digital signals. The expected benefits for operators are improved accuracy and reliability and cost savings in installation, commissioning, testing and maintenance. Safety evaluation of these smart sensors raises new issues: How does the sensor react to unknown commands? How to avoid unexpected changes in configuration? What is its sensitivity to electromagnetic interferences (EMI), to radiations...? In order to evaluate whether these sensors can be qualified for a safety application and to define the qualification tests to be done, IPSN has planned some functional and hardware tests (EMI, radiations) on 'HART' and field bus sensors. During the functional tests, we were not able to disrupt the HART tested sensors by invalid commands. However, these results cannot be extended to other sensors, because of the use of different technology, of different versions of hardware and software and of constructors' specific commands. Furthermore, easy modifications of configuration parameters can cause additional failures. Environmental tests are in progress on HART sensors and will be followed by experiments on field bus sensors. These preliminary investigations and the latest incident initiated by an incorrect computing algorithm of digital switchgear at Ringhals NPP, clearly illustrate that testing and verification programmes for smart equipment must be meticulously designed

  11. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack (www.github.com/lanl/exactpack).

  12. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  13. Timed verification with µCRL

    NARCIS (Netherlands)

    Blom, S.C.C.; Ioustinova, N.; Sidorova, N.; Broy, M.; Zamulin, A.V.

    2003-01-01

    µCRL is a process algebraic language for the specification and verification of distributed systems. µCRL makes it possible to describe temporal properties of distributed systems but it has no explicit reference to time. In this work we propose a way of introducing discrete time without extending the language.

  14. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to

  15. Verification of Software Components: Addressing Unbounded Parallelism

    Czech Academy of Sciences Publication Activity Database

    Adámek, Jiří

    2007-01-01

    Roč. 8, č. 2 (2007), s. 300-309 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * formal verification * unbounded parallelism Subject RIV: JC - Computer Hardware ; Software

  16. A Comparison of Modular Verification Techniques

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Staunstrup, Jørgen; Maretti, Niels

    1997-01-01

    This paper presents and compares three techniques for mechanized verification of state-oriented design descriptions. One is a traditional forward generation of a fixed point characterizing the reachable states. The two others can utilize a modular structure provided by the designer. One requires...
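
    The first technique, forward generation of the reachable-state fixed point, is generic enough to sketch for an explicit-state transition system; the example below is illustrative only.

        def reachable_states(initial, successors):
            """Forward fixed point: iterate until no new states are added."""
            reached = set(initial)
            frontier = set(initial)
            while frontier:
                new = {t for s in frontier for t in successors(s)} - reached
                reached |= new
                frontier = new
            return reached

        # Toy example: a 3-bit counter that wraps around.
        succ = lambda s: {(s + 1) % 8}
        print(sorted(reachable_states({0}, succ)))      # all 8 states are reachable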

  17. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    The problem of validation and verification of correctness of present day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today typically 50% to 70% of the design cycle time is spent in verifying correctness. While simulation remains a predominant form ...

  18. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp 667-681.

  19. Formal Verification of Quasi-Synchronous Systems

    Science.gov (United States)

    2015-07-01

  20. Behaviour Protocols Verification: Fighting State Explosion

    Czech Academy of Sciences Publication Activity Database

    Mach, M.; Plášil, František; Kofroň, Jan

    2005-01-01

    Roč. 6, č. 2 (2005), s. 22-30 ISSN 1525-9293 R&D Projects: GA ČR(CZ) GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : formal verification * software components * state explosion * behavior protocols * parse trees Subject RIV: JC - Computer Hardware ; Software

  1. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...

  2. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  3. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  4. A Typical Verification Challenge for the GRID

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Bal, H. E.; Brim, L.; Leucker, M.

    2008-01-01

    A typical verification challenge for the GRID community is presented. The concrete challenge is to implement a simple recursive algorithm for finding the strongly connected components in a graph. The graph is typically stored in the collective memory of a number of computers, so a distributed

  5. Zero leakage quantization scheme for biometric verification

    NARCIS (Netherlands)

    Groot, de J.A.; Linnartz, J.P.M.G.

    2011-01-01

    Biometrics gain increasing interest as a solution for many security issues, but privacy risks exist in case we do not protect the stored templates well. This paper presents a new verification scheme, which protects the secrets of the enrolled users. We will show that zero leakage is achieved if

  6. Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium

    Science.gov (United States)

    Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.

    2017-01-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on a sheet of paper, which closes from below a cylinder completely or partially filled with water, and the hydrostatic pressure of the water column acting against the atmospheric pressure. First of all, this paper solves a theoretical…

  7. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and we carried out verification tests of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a result of the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking blocks and dense-graded asphalt pavement when applied to the decontamination of roads. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off the decontaminated objects, and the sludge was separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  8. Using timing information in speaker verification

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2005-11-01

    Full Text Available This paper presents an analysis of temporal information as a feature for use in speaker verification systems. The relevance of temporal information in a speaker’s utterances is investigated, both with regard to improving the robustness of modern...

  9. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for the verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models, and a combination of the two models is also taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.) [de

  10. Intrusion detection sensors

    International Nuclear Information System (INIS)

    Williams, J.D.

    1978-07-01

    Intrusion detection sensors are an integral part of most physical security systems. Under the sponsorship of the U.S. Department of Energy, Office of Safeguards and Security, Sandia Laboratories has conducted a survey of available intrusion detection sensors and has tested a number of different sensors. An overview of these sensors is provided. This overview includes (1) the operating principles of each type of sensor, (2) unique sensor characteristics, (3) desired sensor improvements which must be considered in planning an intrusion detection system, and (4) the site characteristics which affect the performance of both exterior and interior sensors. Techniques which have been developed to evaluate various intrusion detection sensors are also discussed

  11. Hydrogen sensor

    Science.gov (United States)

    Duan, Yixiang; Jia, Quanxi; Cao, Wenqing

    2010-11-23

    A hydrogen sensor for detecting/quantitating hydrogen and hydrogen isotopes includes a sampling line and a microplasma generator that excites hydrogen from a gas sample and produces light emission from excited hydrogen. A power supply provides power to the microplasma generator, and a spectrometer generates an emission spectrum from the light emission. A programmable computer is adapted for determining whether or not the gas sample includes hydrogen, and for quantitating the amount of hydrogen and/or hydrogen isotopes present in the gas sample.
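
    The decision step might resemble the following sketch, which looks for an emission peak near the hydrogen Balmer-alpha line (656.28 nm) in a synthetic spectrometer trace; the spectrum, line choice and 5-sigma threshold are assumptions, since the record does not specify the detection logic.

```python
# Hypothetical sketch: detect a hydrogen emission line in a spectrometer trace.
# The synthetic spectrum and the threshold are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(4)
wavelength = np.linspace(400.0, 800.0, 2000)                    # nm
baseline = 50.0 + rng.normal(0.0, 2.0, wavelength.size)         # background counts
h_alpha = 300.0 * np.exp(-((wavelength - 656.28) / 0.5) ** 2)   # injected H line
spectrum = baseline + h_alpha

window = (wavelength > 655.0) & (wavelength < 657.5)            # region around H-alpha
peak = spectrum[window].max()
background = np.median(spectrum[~window])
noise = spectrum[~window].std()

if peak > background + 5.0 * noise:
    print(f"Hydrogen detected (peak SNR = {(peak - background) / noise:.1f})")
else:
    print("No hydrogen emission found above the detection threshold.")
```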

  12. Application of Wireless Sensor Networks to Automobiles

    Science.gov (United States)

    Tavares, Jorge; Velez, Fernando J.; Ferro, João M.

    2008-01-01

    Some applications of Wireless Sensor Networks (WSNs) to the automobile are identified, and the use of Crossbow MICAz motes operating at 2.4 GHz is considered together with TinyOS support. These WSNs are conceived in order to measure, process and supply to the user diverse types of information during an automobile journey. Examples are acceleration and fuel consumption, identification of incorrect tire pressure, verification of illumination, and evaluation of the vital signals of the driver. A brief survey of WSN concepts is presented, as well as the way the wireless sensor network itself was developed. Calibration curves were produced that allow luminous intensity and temperature values to be obtained in the appropriate units. Aspects of the definition of the architecture and the choice/implementation of the protocols are identified. Security aspects are also addressed.
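
    Applying such calibration curves could look like the following sketch; the linear gains and offsets are invented for illustration and are not the calibration coefficients obtained in the work.

```python
# Hypothetical example of applying calibration curves to raw mote readings.
# The coefficients below are assumed, e.g. from a least-squares fit of ADC
# counts against a reference instrument.
import numpy as np

LIGHT_GAIN, LIGHT_OFFSET = 0.85, -12.0     # lux per count, lux (assumed)
TEMP_GAIN, TEMP_OFFSET = 0.049, -39.6      # deg C per count, deg C (assumed)

def counts_to_lux(adc_counts):
    return LIGHT_GAIN * np.asarray(adc_counts) + LIGHT_OFFSET

def counts_to_celsius(adc_counts):
    return TEMP_GAIN * np.asarray(adc_counts) + TEMP_OFFSET

raw_light = [310, 355, 702]
raw_temp = [1210, 1225, 1302]
print("illuminance [lux]:", counts_to_lux(raw_light))
print("temperature [C]:  ", counts_to_celsius(raw_temp))
```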

  13. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  14. SENSORS FAULT DIAGNOSIS ALGORITHM DESIGN OF A HYDRAULIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Matej ORAVEC

    2017-06-01

    Full Text Available This article presents the design of a sensor fault diagnosis system for a hydraulic system, based on a group of three fault estimation filters. These filters are used to estimate the system states and the sensor fault magnitudes. The article also briefly states the hydraulic system state control design with an integrator, which is an important assumption for the fault diagnosis system design. The sensor fault diagnosis system is implemented in the Matlab/Simulink environment and is verified using the controlled hydraulic system simulation model. Verification of the designed fault diagnosis system is realized by a series of experiments that simulate sensor faults. The results of the experiments are briefly presented in the last part of this article.
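
    A heavily simplified, hypothetical sketch of the underlying idea is shown below: a model run in parallel with the plant generates a residual that tracks an additive sensor fault. The plant model, fault profile and filter constant are assumptions, and the sketch does not reproduce the three-filter design of the article.

```python
# Simplified residual-based sensor fault estimation for an assumed first-order
# plant x' = A x + B u, y = C x. A parallel model generates the residual.
import numpy as np

rng = np.random.default_rng(3)
DT, STEPS = 0.01, 1000
A, B, C = -1.0, 1.0, 1.0

x, x_model = 0.0, 0.0
fault_est = 0.0

for k in range(STEPS):
    u = 1.0
    x += DT * (A * x + B * u)                       # true plant
    x_model += DT * (A * x_model + B * u)           # parallel model (residual generator)
    fault = 0.5 if k > 600 else 0.0                 # additive sensor bias after t = 6 s
    y = C * x + fault + rng.normal(0.0, 0.01)       # faulty, noisy measurement
    residual = y - C * x_model
    fault_est = 0.99 * fault_est + 0.01 * residual  # low-pass filtered fault magnitude

print(f"estimated sensor fault: {fault_est:.2f} (injected 0.50)")
```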

  15. Pioneer Venus Star Sensor. [active despin control application

    Science.gov (United States)

    Gutshall, R. L.; Thomas, G.

    1979-01-01

    The design predictions and orbital performance verification of the solid state star scanner used in the Onboard Attitude Control of the Pioneer Venus Orbiter and Multiprobe are presented. The star sensor extended scanner use to active despin control, and it differs from previous sensors in solid state detection, redundant electronics for reliability, a larger field of view, and a large dynamic spin range. The star scanner hardware and design performance based on the ability to predict all noise sources, signal magnitudes, and expected detection probability are discussed. In-flight data collected to verify sensor ground calibration are tabulated and plotted in predicted accuracy curves. It is concluded that the Pioneer Venus star sensor has demonstrated predictable star calibration with uncertainties in the range of 0.1 magnitude and usable star catalogs of 100 stars with very high probabilities of detection, results much better than expected and well within the mission requirements.

  16. Power Performance Verification of a Wind Farm Using the Friedman's Test.

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L

    2016-06-03

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
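
    As an illustration of the statistical machinery, the following Python sketch applies Friedman's test (via scipy.stats.friedmanchisquare) to synthetic turbine power readings binned by wind speed, with the guaranteed power curve treated as one more unit; the data and the 0.05 significance level are assumptions for the example only.

```python
# Hypothetical sketch: Friedman's test applied to wind-turbine power data.
# The turbine readings are synthetic; only the statistical step mirrors the
# approach described in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Rows: wind-speed bins (the blocks); columns would be the units compared.
guaranteed = np.array([150.0, 480.0, 1100.0, 1900.0, 2500.0])   # guaranteed power [kW]
turbine_a = guaranteed * rng.normal(1.00, 0.02, size=5)
turbine_b = guaranteed * rng.normal(0.97, 0.02, size=5)
turbine_c = guaranteed * rng.normal(1.01, 0.02, size=5)

stat, p_value = stats.friedmanchisquare(guaranteed, turbine_a, turbine_b, turbine_c)
print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.3f}")

# A small p-value would suggest that at least one unit differs in power
# performance, to be followed up with a multiple-comparison procedure.
if p_value < 0.05:
    print("Power performance differs significantly among the units.")
else:
    print("No significant difference detected.")
```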

  17. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-01-01

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628

  18. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2016-06-01

    Full Text Available In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.

  19. Sensors for Entertainment.

    Science.gov (United States)

    Lamberti, Fabrizio; Sanna, Andrea; Rokne, Jon

    2016-07-15

    Sensors are becoming ubiquitous in all areas of science, technology, and society. In this Special Issue on "Sensors for Entertainment", developments in progress and the current state of application scenarios for sensors in the field of entertainment are explored.

  20. Sensors for Entertainment

    OpenAIRE

    Fabrizio Lamberti; Andrea Sanna; Jon Rokne

    2016-01-01

    Sensors are becoming ubiquitous in all areas of science, technology, and society. In this Special Issue on "Sensors for Entertainment", developments in progress and the current state of application scenarios for sensors in the field of entertainment are explored.

  1. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    Science.gov (United States)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
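
    As a rough illustration of the region-of-interest comparison idea described above, the following Python sketch flags an injected source by comparing ROI counts in a measured spectrum against a background estimate; the ROI boundaries, count rates and 3-sigma threshold are invented for the example and do not reproduce the actual NSCRAD algorithm or its MDA estimate.

```python
# Simplified, hypothetical ROI-based spectral anomaly check against a
# background estimate (not the NSCRAD algorithm itself).
import numpy as np

rng = np.random.default_rng(1)

N_CHANNELS = 1024
ROIS = [(0, 300), (200, 500), (400, 700), (600, 1024)]   # broad, overlapping ROIs (assumed)

background_rate = np.full(N_CHANNELS, 5.0)                # mean counts/channel (assumed)
background = rng.poisson(background_rate)

spectrum = rng.poisson(background_rate)
spectrum[450:470] += rng.poisson(40.0, size=20)           # inject a weak source peak

def roi_counts(spec):
    return np.array([spec[a:b].sum() for a, b in ROIS], dtype=float)

meas, bkg = roi_counts(spectrum), roi_counts(background)
ratio = meas / bkg
sigma = ratio * np.sqrt(1.0 / meas + 1.0 / bkg)           # approximate Poisson error
z = (ratio - 1.0) / sigma

for (a, b), zi in zip(ROIS, z):
    flag = "ANOMALY" if zi > 3.0 else "ok"
    print(f"ROI {a:4d}-{b:4d}: z = {zi:5.2f}  {flag}")
```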

  2. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different

  3. Wireless sensor platform

    Science.gov (United States)

    Joshi, Pooran C.; Killough, Stephen M.; Kuruganti, Phani Teja

    2017-08-08

    A wireless sensor platform and methods of manufacture are provided. The platform involves providing a plurality of wireless sensors, where each of the sensors is fabricated on flexible substrates using printing techniques and low temperature curing. Each of the sensors can include planar sensor elements and planar antennas defined using the printing and curing. Further, each of the sensors can include a communications system configured to encode the data from the sensors into a spread spectrum code sequence that is transmitted to a central computer(s) for use in monitoring an area associated with the sensors.
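
    As a loose illustration of encoding sensor data into a spread-spectrum code sequence, the sketch below spreads a short bit stream with an arbitrary pseudo-noise code and recovers it by correlation; the code length and chip values are assumptions, not the platform's actual scheme.

```python
# Hypothetical direct-sequence spreading of sensor bits with a PN chip sequence.
import numpy as np

PN_CODE = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)   # 8 chips per bit (assumed)

def spread(bits):
    """XOR every data bit with the PN code to produce the chip sequence."""
    bits = np.asarray(bits, dtype=np.uint8)
    return np.bitwise_xor(bits[:, None], PN_CODE).ravel()

def despread(chips):
    """Correlate chips against the PN code to recover the original bits."""
    chips = np.asarray(chips, dtype=np.uint8).reshape(-1, PN_CODE.size)
    agreements = (chips == PN_CODE).sum(axis=1)
    return (agreements < PN_CODE.size // 2).astype(np.uint8)    # a 1-bit flips all chips

sensor_bits = np.array([1, 0, 1, 1, 0], dtype=np.uint8)
chips = spread(sensor_bits)
print("recovered:", despread(chips), "original:", sensor_bits)
```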

  4. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is complete

  5. Focussed approach to verification under FMCT

    International Nuclear Information System (INIS)

    Bragin, V.; Carlson, J.; Bardsley, J.; Hill, J.

    1998-01-01

    FMCT will have different impacts on individual states due to the enormous variance in their nuclear fuel cycles and the associated fissile material inventories. The problem is how to negotiate a treaty that would achieve results favourable for all participants, given that interests and priorities vary so much. We believe that focussed verification, confined to safeguarding of enrichment and reprocessing facilities in NWS and TS, coupled with verification of unirradiated direct-use material produced after entry-into-force of a FMCT and supported with measures to detect possible undeclared enrichment and reprocessing activities, is technically adequate for the FMCT. Eventually this would become the appropriate model for all states party to the NPT

  6. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  7. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  8. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  9. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling...... and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized...... search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...
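
    The cooperation idea, sharing costs of intermediate solutions to prune the search, can be illustrated with a toy branch-and-bound sketch; the graph, costs and the sequential stand-in for parallel workers below are invented and do not represent the Uppaal-based implementation.

```python
# Toy illustration of shared-bound pruning: several randomized depth-first
# searches over an assumed weighted graph share the best goal cost found so
# far and prune paths that already exceed it.
import random

GRAPH = {                      # node -> [(successor, transition cost), ...]
    "s0": [("s1", 2), ("s2", 5)],
    "s1": [("s3", 4), ("goal", 9)],
    "s2": [("s3", 1), ("goal", 3)],
    "s3": [("goal", 2)],
    "goal": [],
}

def randomized_dfs(seed, shared_best):
    rng = random.Random(seed)
    stack = [("s0", 0)]
    while stack:
        node, cost = stack.pop()
        if cost >= shared_best[0]:        # prune using the shared bound
            continue
        if node == "goal":
            shared_best[0] = cost
            continue
        successors = GRAPH[node][:]
        rng.shuffle(successors)           # each worker explores in a different order
        stack.extend((nxt, cost + w) for nxt, w in successors)

best = [float("inf")]                     # shared best cost (single-process stand-in)
for worker_seed in range(4):              # four "swarm" workers, run sequentially here
    randomized_dfs(worker_seed, best)
print("minimal accumulated cost to goal:", best[0])
```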

  10. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  11. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code are an integral part of this process. This document identifies the work performed and documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete

  12. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models, which represent the system, and the formalization of requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) to different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  13. Face Verification using MLP and SVM

    OpenAIRE

    Cardinaux, Fabien; Marcel, Sébastien

    2002-01-01

    The performance of machine learning algorithms has steadily improved over the past few years, with methods such as MLPs or, more recently, SVMs. In this paper, we compare two successful discriminant machine learning algorithms applied to the problem of face verification: MLP and SVM. These two algorithms are tested on a benchmark database, namely XM2VTS. Results show that an MLP is better than an SVM on this particular task.
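
    A comparison of this kind can be sketched with scikit-learn as below; the synthetic features stand in for the XM2VTS face-verification data, which is not reproduced here, so the numbers say nothing about the paper's actual results.

```python
# Illustrative sketch (not the paper's experiment): comparing an MLP and an
# SVM on a synthetic two-class verification task with scikit-learn.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X_train, y_train)
svm = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)

print(f"MLP accuracy: {mlp.score(X_test, y_test):.3f}")
print(f"SVM accuracy: {svm.score(X_test, y_test):.3f}")
```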

  14. Verification tests for CANDU advanced fuel

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1997-07-01

    For the development of a CANDU advanced fuel, the CANFLEX-NU fuel bundles were tested under reactor operating conditions at the CANDU-Hot test loop. This report describes test results and test methods in the performance verification tests for the CANFLEX-NU bundle design. The main items described in the report are as follows. - Fuel bundle cross-flow test - Endurance fretting/vibration test - Freon CHF test - Production of technical document. (author). 25 refs., 45 tabs., 46 figs

  15. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on the system, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user friendly system based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify the accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for drawings being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by "smart" drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user data base application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  16. Verification of the SLC wake potentials

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-01-01

    The accurate knowledge of the monopole, dipole, and quadrupole wake potentials is essential for SLC. These wake potentials were previously computed by the modal method. The time domain code TBCI allows independent verification of these results. This comparison shows that the two methods agree to within 10% for bunch lengths down to 1 mm. TBCI results also indicate that rounding the irises gives at least a 10% reduction in the wake potentials

  17. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Roč. 18, č. 6 (2012), s. 572-587 ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords : model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  18. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered as the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, for stamps little work has been done to automatically verify their authenticity. In this paper, an approach for verification of color stamps ...... and copied stamps. Sensitivity and specificity of up to 95% could be obtained on a data set that is publicly available....

  19. Component Verification and Certification in NASA Missions

    Science.gov (United States)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  20. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  1. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
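
    The likelihood-ratio decision rule described above can be sketched as follows; the Gaussian score models, score values and acceptance threshold are invented for illustration and are not the fitted joint distribution from the study.

```python
# Hypothetical likelihood-ratio decision for verification, assuming (naively)
# independent Gaussian similarity-score models for genuine and imposter cases.
import math

def gaussian_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2.0 * math.pi))

def likelihood_ratio(scores, genuine_params, imposter_params):
    """Product of per-score likelihood ratios (independence assumption)."""
    lr = 1.0
    for s, (mg, sg), (mi, si) in zip(scores, genuine_params, imposter_params):
        lr *= gaussian_pdf(s, mg, sg) / gaussian_pdf(s, mi, si)
    return lr

# Similarity scores from, e.g., two fingerprints acquired at verification time.
scores = [0.82, 0.77]
genuine_params = [(0.8, 0.1), (0.8, 0.1)]      # assumed genuine score distribution
imposter_params = [(0.3, 0.15), (0.3, 0.15)]   # assumed imposter score distribution

lr = likelihood_ratio(scores, genuine_params, imposter_params)
threshold = 100.0  # assumed, chosen to meet a target false-accept rate
print(f"LR = {lr:.1f} ->",
      "accept as genuine" if lr > threshold else "reject / acquire more biometrics")
```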

  2. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  3. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating point computations integrated within the core, designed to reduce cost and complexity. The designed 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single precision floating point multiplier and a floating point adder/subtractor for floating point operations, and a 32 x 32 Booth's multiplier added to the integer core of ARM7. The binary representati...

  4. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
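
    A dose/distance-to-agreement check of this kind is often implemented as a gamma-index analysis; the following sketch applies a 3%/3 mm criterion to synthetic one-dimensional dose profiles and is only an illustration of such an acceptance test, not the quantitative analysis parameter used in the paper.

```python
# Hypothetical 1-D gamma-index check with a 3%/3 mm criterion on synthetic
# planned and measured dose profiles.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.03, dist_tol=3.0):
    """Return the gamma value at each reference point (pass if gamma <= 1)."""
    d_max = d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dose_term = (d_eval - dr) / (dose_tol * d_max)   # normalised dose difference
        dist_term = (x_eval - xr) / dist_tol             # normalised spatial distance
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

x = np.linspace(0.0, 50.0, 101)                  # position [mm]
planned = 2.0 * np.exp(-((x - 25.0) / 10.0)**2)  # planned dose profile [Gy]
measured = planned * 1.02 + 0.01                 # measured profile with a small offset

gamma = gamma_1d(x, planned, x, measured)
print(f"Gamma pass rate: {100.0 * np.mean(gamma <= 1.0):.1f}%")
```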

  5. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDR's) generated by sensors on the S-NPP into calibrated geo-located Sensor Data Records (SDR's) and generates Mission Unique Products (MUPS). It also facilitates algorithm investigation, integration, checkouts and tuning, instrument and product calibration and data quality support, monitoring and data/products distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCT's) and Look-Up-Tables (LUT's) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGE's) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  6. Initial Verification and Validation Assessment for VERA

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States); Athe, Paridhi [North Carolina State Univ., Raleigh, NC (United States); Jones, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetzler, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-04-01

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  7. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  8. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems that are often used in applications that have requirements for safety. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely coupled in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice of the existing open-source model verification engines, the toolset supports model verification by producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; the familiarity with tools, the ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project

  9. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%, however the spiral phantom QA technique provides a more complete dosimetric verification while being less time consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy for a TLD located in a high dose gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan

  10. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada, and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine whether protocols were followed and whether the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site, and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed, as are aspects of data management and the analysis methods required for the large amount of data collected during these surveys. Recommendations were made for the implementation of future surveys and for reporting the data from those surveys in order to ensure that remediation was complete. (authors)

  11. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is a task to check whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not make any assumption that identically and independently distributed copies of the same state are given: Our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.

  12. A modular optical sensor

    Science.gov (United States)

    Conklin, John Albert

    This dissertation presents the design of a modular, fiber-optic sensor and the results obtained from testing the modular sensor. The modular fiber-optic sensor is constructed in such a manner that the sensor diaphragm can be replaced with different configurations to detect numerous physical phenomena. Additionally, different fiber-optic detection systems can be attached to the sensor. Initially, the modular sensor was developed to be used by university students to investigate realistic optical sensors and detection systems in preparation for advanced studies of micro-optical mechanical systems (MOMS). The design accomplishes this by doing two things. First, the design significantly lowers the costs associated with studying optical sensors by modularizing the sensor design. Second, the sensor broadens the number of physical phenomena that students can apply optical sensing techniques to in a fiber-optic sensor course. The dissertation is divided into seven chapters covering the historical development of fiber-optic sensors, a theoretical overview of fiber-optic sensors, and the design, fabrication, and testing of the modular sensor developed in the course of this work. Chapter 1 discusses, in detail, how this dissertation is organized and states the purpose of the dissertation. Chapter 2 presents a historical overview of the development of optical fibers, optical pressure sensors, and optical microphones. Chapter 3 reviews the theory of multi-fiber optic detection systems, optical microphones, and pressure sensors. Chapter 4 presents the design details of the modular, optical sensor. Chapter 5 delves into how the modular sensor is fabricated and how the detection systems are constructed. Chapter 6 presents the data collected from the microphone and pressure sensor configurations of the modular sensor. Finally, Chapter 7 discusses the data collected and draws conclusions about the design based on the data collected. Chapter 7 also

  13. Remote sensing and geoinformation technologies in support of nuclear non-proliferation and arms control verification regimes

    Energy Technology Data Exchange (ETDEWEB)

    Niemeyer, Irmgard [Forschungszentrum Juelich GmbH, Institut fuer Energie- und Klimaforschung, IEK-6: Nukleare Entsorgung und Reaktorsicherheit (Germany)

    2013-07-01

    A number of international agreements and export control regimes have been concluded in order to reduce the risk and proliferation of weapons of mass destruction. In order to provide confidence that Member States are complying with the agreed commitments, most of the treaties and agreements include verification provisions. Different types of verification measures exist, e.g. cooperative measures; national technical means; technical monitoring or measurement devices placed at or near sites; on-site inspections; intelligence information; open-source information, such as commercial internet data and satellite imagery. The study reviews the technical progress in the field of satellite imaging sensors and explores the recent advances in satellite imagery processing and geoinformation technologies as to the extraction of significant observables and signatures. Moreover, it discusses how satellite data and geoinformation technologies could be used complementary for confirming information gathered from other systems or sources. The study also aims at presenting the legal and political aspects and the cost benefits of using imagery from both national and commercial satellites in the verification procedure. The study concludes that satellite imagery and geoinformation technologies are expected to enhance the verification efficiency and effectiveness.

  14. Equivalent Sensor Radiance Generation and Remote Sensing from Model Parameters. Part 1; Equivalent Sensor Radiance Formulation

    Science.gov (United States)

    Wind, Galina; DaSilva, Arlindo M.; Norris, Peter M.; Platnick, Steven E.

    2013-01-01

    In this paper we describe a general procedure for calculating equivalent sensor radiances from variables output from a global atmospheric forecast model. In order to take proper account of the discrepancies between model resolution and sensor footprint, the algorithm takes explicit account of the model subgrid variability, in particular its description of the probability density function of total water (vapor and cloud condensate). The equivalent sensor radiances are then substituted into an operational remote sensing algorithm processing chain to produce a variety of remote sensing products that would normally be produced from actual sensor output. This output can then be used for a wide variety of purposes such as model parameter verification, remote sensing algorithm validation, testing of new retrieval methods and future sensor studies. We show a specific implementation using the GEOS-5 model, the MODIS instrument and the MODIS Adaptive Processing System (MODAPS) Data Collection 5.1 operational remote sensing cloud algorithm processing chain (including the cloud mask, cloud top properties and cloud optical and microphysical properties products). We focus on clouds and cloud/aerosol interactions, because they are very important to model development and improvement.

  15. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    International Nuclear Information System (INIS)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-01-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. The odometry is widely used as the method based on internal sensors, but it suffers from cumulative errors. In the method using the laser range sensor (LRS), which is a kind of external sensor, the estimation accuracy is affected by the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization, integrating the LRS measurement data and the odometry information such that their relative weightings are balanced adaptively according to the number of available LRS measurement data. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10 scale vehicle. The verification is conducted in situations where the vehicle position cannot be localized uniquely along a certain direction using the LRS measurement data only. We achieve accurate localization even in such a situation by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF). (paper)
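
    The adaptive fusion idea described above can be illustrated with a small sketch. The following Python fragment is a hedged illustration rather than the authors' code: the state dimensions, weights and the 100-point scaling are invented. It minimizes a moving-horizon least-squares cost in which odometry increments and LRS position fixes are combined, with the LRS weight growing with the number of available scan points.

      import numpy as np
      from scipy.optimize import least_squares

      def mhe_residuals(x_flat, x_prior, odom, lrs, n_points, w_odom=1.0, w_lrs_max=5.0):
          # x_flat: horizon states [(x, y), ...] flattened; odom: per-step (dx, dy);
          # lrs: per-step position fix or None; n_points: LRS returns per step.
          X = x_flat.reshape(-1, 2)
          res = [np.sqrt(w_odom) * (X[0] - x_prior)]                          # arrival-cost term
          for k in range(1, len(X)):
              res.append(np.sqrt(w_odom) * (X[k] - X[k - 1] - odom[k - 1]))   # odometry model
              if lrs[k] is not None:
                  # Adaptive weighting: trust the LRS more when many scan points are available.
                  w = w_lrs_max * min(1.0, n_points[k] / 100.0)
                  res.append(np.sqrt(w) * (X[k] - lrs[k]))
          return np.concatenate(res)

      # Toy horizon of 5 steps: straight motion with a single noisy LRS fix at step 2.
      odom = [np.array([0.1, 0.0])] * 4
      lrs = [None, None, np.array([0.21, 0.02]), None, None]
      n_points = [0, 0, 80, 0, 0]
      sol = least_squares(mhe_residuals, np.zeros(10),
                          args=(np.array([0.0, 0.0]), odom, lrs, n_points))
      print(sol.x.reshape(-1, 2))      # estimated horizon trajectory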

  16. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  17. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  18. Physics basis for MFTF-B

    International Nuclear Information System (INIS)

    Baldwin, D.E.; Logan, B.G.; Simonen, T.C.

    1980-01-01

    The physics overview covers the following topics: (1) classical energetics, (2) single-particle adiabaticity and drift effects, (3) equilibrium considerations, (4) low-frequency stability, (5) microstability of plugs and barriers, and (6) electron axial thermal conduction. Operation scenarios and summaries of initial TMX results are also described

  19. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate the verification and validation of software on board satellites. The process is implemented in the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on the target, the verification evidence is extracted at instrumented points. The constraints are then fed with the evidence; if any of the constraints is not satisfied by the on-target evidence, the scheduling analysis is not valid.
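
    The core check described above, feeding instrumented evidence into timing constraints, can be sketched very simply. The fragment below is a hedged illustration: the event format, task names and deadlines are invented, and the paper expresses its constraints as timed automata rather than this flat check. It flags any task whose measured response time exceeds its deadline.

      from dataclasses import dataclass

      @dataclass
      class TaskEvidence:
          name: str
          release: float     # instrumented release time (ms)
          finish: float      # instrumented completion time (ms)

      def check_deadlines(evidence, deadlines_ms):
          # Return every violated constraint as (task, measured response, deadline).
          violations = []
          for ev in evidence:
              response = ev.finish - ev.release
              if response > deadlines_ms[ev.name]:
                  violations.append((ev.name, response, deadlines_ms[ev.name]))
          return violations

      trace = [TaskEvidence("acquire", 0.0, 3.7), TaskEvidence("process", 5.0, 15.0)]
      print(check_deadlines(trace, {"acquire": 5.0, "process": 8.0}))
      # [('process', 10.0, 8.0)]: the deadline constraint fails, so the analysis is flagged invalid.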

  20. Integrated cryogenic sensors

    International Nuclear Information System (INIS)

    Juanarena, D.B.; Rao, M.G.

    1991-01-01

    Integrated cryogenic pressure-temperature, level-temperature, and flow-temperature sensors have several advantages over the conventional single parameter sensors. Such integrated sensors were not available until recently. Pressure Systems, Inc. (PSI) of Hampton, Virginia, has introduced precalibrated precision cryogenic pressure sensors at the Los Angeles Cryogenic Engineering Conference in 1989. Recently, PSI has successfully completed the development of integrated pressure-temperature and level-temperature sensors for use in the temperature range 1.5-375K. In this paper, performance characteristics of these integrated sensors are presented. Further, the effects of irradiation and magnetic fields on these integrated sensors are also reviewed

  1. EDITORIAL: Humidity sensors Humidity sensors

    Science.gov (United States)

    Regtien, Paul P. L.

    2012-01-01

    produced at relatively low cost. Therefore, they find wide use in lots of applications. However, the method requires a material that possesses some conflicting properties: stable and reproducible relations between air humidity, moisture uptake and a specific property (for instance the length of a hair, the electrical impedance of the material), fast absorption and desorption of the water vapour (to obtain a short response time), small hysteresis, wide range of relative humidity (RH) and temperature-independent output (only responsive to RH). For these reasons, much research is done and is still going on to find suitable materials that combine high performance and low price. In this special feature, three of the four papers report on absorption sensors, all with different focus. Aziz et al describe experiments with newly developed materials. The surface structure is extensively studied, in view of its ability to rapidly absorb water vapour and exhibit a reproducible change in the resistance and capacitance of the device. Sanchez et al employ optical fibres coated with a thin moisture-absorbing layer as a sensitive humidity sensor. They have studied various coating materials and investigated the possibility of using changes in optical properties of the fibre (here the lossy mode resonance) due to a change in humidity of the surrounding air. The third paper, by Weremczuk et al, focuses on a cheap fabrication method for absorption-based humidity sensors. The inkjet technology appears to be suitable for mass fabrication of such sensors, which is demonstrated by extensive measurements of the electrical properties (resistance and capacitance) of the absorbing layers. Moreover, they have developed a model that describes the relation between humidity and the electrical parameters of the moisture-sensitive layer. Despite intensive research, absorption sensors still do not meet the requirements for high accuracy applications. The dew-point temperature method is more appropriate

  2. Continuous and recurrent testing of acoustic emission sensors

    International Nuclear Information System (INIS)

    Sause, Markus G.R.; Schmitt, Stefan; Potstada, Philipp

    2017-01-01

    In many fields of application of acoustic emission, the testing can lead to a lasting change in the sensor characteristics. This can be caused by mechanical damage, thermal stress or use under aggressive environmental conditions. Irrespective of visually detectable damage to the sensors, a shift in the spectral sensitivity, a reduction in the absolute sensitivity or a reduction in the signal-to-noise ratio can occur. During the test, this requires a possibility to periodically check the sensors, including the coupling aids used. For recurring testing, recommendations are given in Directive SE 02 ''Verification of acoustic emission sensors and their coupling in the laboratory''. This paper discusses possibilities for continuous monitoring of the sensors during the test and presents an application example for the partly automated recurring testing of acoustic emission sensors using Directive SE 02. For this purpose, a test stand for the supply of the sensors to be tested was constructed, and the signal recording and data reduction were implemented in freely available software programs. The operating principle is demonstrated using selected case studies. [de]

  3. An Embedded Sensor Node Microcontroller with Crypto-Processors.

    Science.gov (United States)

    Panić, Goran; Stecklina, Oliver; Stamenković, Zoran

    2016-04-27

    Wireless sensor network applications range from industrial automation and control, agricultural and environmental protection, to surveillance and medicine. In most applications, data are highly sensitive and must be protected from any type of attack and abuse. Security challenges in wireless sensor networks are mainly defined by the power and computing resources of sensor devices, memory size, quality of radio channels and susceptibility to physical capture. In this article, an embedded sensor node microcontroller designed to support sensor network applications with severe security demands is presented. It features a low power 16-bit processor core supported by a number of hardware accelerators designed to perform complex operations required by advanced crypto algorithms. The microcontroller integrates an embedded Flash and an 8-channel 12-bit analog-to-digital converter making it a good solution for low-power sensor nodes. The article discusses the most important security topics in wireless sensor networks and presents the architecture of the proposed hardware solution. Furthermore, it gives details on the chip implementation, verification and hardware evaluation. Finally, the chip power dissipation and performance figures are estimated and analyzed.

  4. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  5. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  6. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provides assistance to the subject...

  7. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure involves many inherent misconceptions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses definition of the specification language and denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  8. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions led to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi...

  9. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
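
    A minimal sketch of the kind of rule logic surveyed above may help; the limits below are invented for illustration and are not values reported by the survey. It combines a verification-limit check with a delta check against the previous result, two of the criteria most laboratories reported using.

      def autoverify(value, previous, low, high, delta_limit):
          # Return (release_automatically, reasons): both rules must pass for auto-release.
          reasons = []
          if not (low <= value <= high):
              reasons.append("outside verification limits")
          if previous is not None and abs(value - previous) > delta_limit:
              reasons.append("delta check failed")
          return (len(reasons) == 0, reasons)

      # Example: potassium in mmol/L with a hypothetical 2.5-6.5 verification range
      # and a hypothetical 1.0 mmol/L delta limit.
      print(autoverify(4.1, 4.3, 2.5, 6.5, 1.0))   # (True, []) -> released automatically
      print(autoverify(6.9, 4.3, 2.5, 6.5, 1.0))   # (False, [...]) -> held for manual review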

  10. Qualitative and Quantitative Security Analyses for ZigBee Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender

    ... applications, home automation, and traffic control. The challenges for research in this area are due to the unique features of wireless sensor devices such as low processing power and associated low energy. On top of this, wireless sensor networks need secure communication as they operate in open fields ... The work draws on methods and techniques from different areas and brings them together to create an efficient verification system. The overall ambition is to provide a wide range of powerful techniques for analyzing models with quantitative and qualitative security information. We stated a new approach that first verifies low level security protocols in a qualitative manner and guarantees absolute security, and then takes these verified protocols as actions of scenarios to be verified in a quantitative manner. Working on the emerging ZigBee wireless sensor networks, we used probabilistic verification that can return...

  11. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  12. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
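
    The ETAS-style smoothing described above can be sketched as follows. This is a hedged illustration, not the authors' implementation: the decay exponent, smoothing distance and grid are placeholders, and the real method ties the kernel to the ETAS spatial power law.

      import numpy as np

      def powerlaw_rate_map(epicenters, grid_x, grid_y, q=1.5, d0=5.0):
          # epicenters: (N, 2) array of simulated (x, y) positions in km.
          # q and d0 are a placeholder decay exponent and smoothing distance.
          gx, gy = np.meshgrid(grid_x, grid_y)
          rate = np.zeros_like(gx, dtype=float)
          for ex, ey in epicenters:
              d = np.hypot(gx - ex, gy - ey)
              kernel = (d + d0) ** (-q)       # contribution decays with epicentral distance
              rate += kernel / kernel.sum()   # each simulated event contributes unit rate
          return rate

      grid = np.linspace(0.0, 100.0, 101)
      events = np.array([[20.0, 30.0], [60.0, 75.0]])
      rate_map = powerlaw_rate_map(events, grid, grid)
      print(rate_map.shape, round(float(rate_map.sum()), 3))   # (101, 101) with total rate ~2.0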

  13. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self powered detector design to perform core design verification after a core reload before power operation. A Vanadium self powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core for traditional physics testing programs. This program also eliminates the need for special rod maneuvers which are infrequently performed by plant operators during typical core design verification testing and allows for safer startup activities. (authors)

  14. MODELS CONCERNING PREVENTIVE VERIFICATION OF TECHNICAL EQUIPMENT

    Directory of Open Access Journals (Sweden)

    CÂRLAN M.

    2016-12-01

    Full Text Available The paper presents three operative models whose purpose is to improve the practice of preventive maintenance for a wide range of technical installations. Although the calculation criteria are different, the goal is the same: to determine the optimum time between two consecutive preventive interventions. The optimum criteria of these models are: the maximum share of technical entity operating probabilities, in the case of the Ackoff-Sasieni [1] method; the optimum time interval for preventive verification depending on the preventive-corrective maintenance costs imposed by the deciding factor, for the Asturio-Baldin [2] model; and the minimum number of renewals, i.e. preventive and/or corrective maintenance operations [3
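
    Although the three models differ, their shared goal, an optimum interval between preventive interventions, can be illustrated with a generic age-replacement cost-rate calculation. The sketch below is not one of the paper's models; the Weibull parameters and costs are invented for illustration.

      import numpy as np

      def cost_rate(T, beta, eta, c_prev, c_fail, n=2000):
          # Expected cost per unit time when preventive maintenance is done at age T,
          # assuming Weibull(beta, eta) times to failure.
          t = np.linspace(0.0, T, n)
          R = np.exp(-(t / eta) ** beta)                       # reliability R(t)
          cycle = np.sum((R[:-1] + R[1:]) * np.diff(t)) / 2.0  # mean cycle length (trapezoid rule)
          F_T = 1.0 - R[-1]                                    # probability of failing before T
          return (c_prev * (1.0 - F_T) + c_fail * F_T) / cycle

      candidates = np.linspace(10.0, 400.0, 391)
      costs = [cost_rate(T, beta=2.5, eta=200.0, c_prev=1.0, c_fail=10.0) for T in candidates]
      print(f"optimum preventive interval ~ {candidates[int(np.argmin(costs))]:.0f} time units")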

  15. Verification report for SIMREP 1.1

    International Nuclear Information System (INIS)

    Tarapore, P.S.

    1987-06-01

    SIMREP 1.1 is a discrete event computer simulation of repository operations in the surface waste-handling facility. The logic for this model is provided by Fluor Technology, Inc., the Architect/Engineer of the salt repository. The verification methods included a line-by-line review of the code, a detailed examination of a generated trace of all simulated events over a given period of operations, and a comparison of the simulation output results with expected values. SIMREP 1.1 performs in the required manner under the given range of input conditions

  16. Turf Conversion Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M and V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings as a result of water conservation measures (WCMs) in energy performance contracts associated with converting turfgrass or other water-intensive plantings to water-wise and sustainable landscapes. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M and V plan, and details the procedures to use to determine water savings.

  17. Outdoor Irrigation Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M&V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with outdoor irrigation efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M&V plan, and details the procedures to use to determine water savings.

  18. Verification of product quality from process control

    International Nuclear Information System (INIS)

    Drobot, A.; Bunnell, L.R.; Freeborn, W.P.; Macedo, P.B.; Mellinger, G.B.; Pegg, I.L.; Piepel, G.F.; Reimus, M.A.H.; Routt, K.R.; Saad, E.

    1989-01-01

    Process models were developed to characterize the waste vitrification at West Valley in terms of process operating constraints and achievable glass compositions. The need for verification of compliance with the proposed Waste Acceptance Preliminary Specification criteria led to development of product models, the most critical one being a glass durability model. Both process and product models were used in developing a target composition for the waste glass. This target composition is designed to ensure that glasses made to this target will be of acceptable durability after all process variations have been accounted for. 4 refs., 11 figs., 5 tabs

  19. FEFTRA {sup TM} verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)

    2013-12-15

    FEFTRA is a finite element program package developed at VTT for the analyses of groundwater flow in Posiva's site evaluation programme that seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of the verification of the code. In 2006 a project was launched in which the objective was to reorganise all the material related to the existing verification cases and place it into the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LATEX document preparation system and added into the testing system in such a way that the whole test documentation (this report) could easily be generated in postscript or pdf format. The current report is the updated version of the verification report published in 2007. At the moment the report includes mainly the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. the simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to the analytical, semianalytical and/or other numerical solutions proves the capability of FEFTRA to simulate such problems
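
    The automated testing idea described above (recalculate each case, compare with the approved results, summarize) can be sketched generically. The fragment below is an assumption: the quantity names, tolerance and data are illustrative and do not come from FEFTRA.

      import numpy as np

      def compare_case(new, reference, rel_tol=1e-3):
          # new, reference: dicts mapping a quantity name to an array of nodal values.
          report = {}
          for name, ref in reference.items():
              err = np.max(np.abs(new[name] - ref) / np.maximum(np.abs(ref), 1e-30))
              report[name] = ("PASS" if err <= rel_tol else "FAIL", float(err))
          return report

      reference = {"hydraulic_head": np.array([10.0, 9.5, 9.0])}
      new_run = {"hydraulic_head": np.array([10.0, 9.5004, 9.0])}
      for quantity, (status, err) in compare_case(new_run, reference).items():
          print(f"{quantity}: {status} (max relative error {err:.2e})")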

  20. CIT photoheliograph functional verification unit test program

    Science.gov (United States)

    1973-01-01

    Tests of the 2/3-meter photoheliograph functional verification unit (FVU) were performed with the FVU installed in its Big Bear Solar Observatory vacuum chamber. Interferometric tests were run both in Newtonian (f/3.85) and Gregorian (f/50) configurations. Tests were run in both configurations with optical axis horizontal, vertical, and at 45 deg to attempt to determine any gravity effects on the system. Gravity effects, if present, were masked by scatter in the data associated with the system wavefront error of 0.16 lambda rms (lambda = 6328 A), apparently due to problems in the primary mirror. Tests showed that the redesigned secondary mirror assembly works well.

  1. SCALE criticality safety verification and validation package

    International Nuclear Information System (INIS)

    Bowman, S.M.; Emmett, M.B.; Jordan, W.C.

    1998-01-01

    Verification and validation (V and V) are essential elements of software quality assurance (QA) for computer codes that are used for performing scientific calculations. V and V provides a means to ensure the reliability and accuracy of such software. As part of the SCALE QA and V and V plans, a general V and V package for the SCALE criticality safety codes has been assembled, tested and documented. The SCALE criticality safety V and V package is being made available to SCALE users through the Radiation Safety Information Computational Center (RSICC) to assist them in performing adequate V and V for their SCALE applications

  2. Accelerating functional verification of an integrated circuit

    Science.gov (United States)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform bit shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.

  3. Burnup verification using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used to verify reactor site records for burnup and cooling time for many years. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. This report deals with the application of the FORK system to burnup credit operations based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company

  4. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on compliance with new standards and regulations, and on lessons learned from operating experience. This goal can be more effectively achieved on the basis of a valid methodology of analysis and a consistent process of collection, storage and retrieval of the operating data. The general backfitting problem, the verification process and the utilization of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant are presented as well, using some specially designed software tools. Management rather than hardware problems are the focus. Some general conclusions are then presented as final results of the whole work

  5. Application Of FA Sensor 2

    International Nuclear Information System (INIS)

    Park, Seon Ho

    1993-03-01

    This book introduces FA sensors, from basics to system building, covering light sensors such as photodiodes and phototransistors, photoelectric sensors, CCD-type image sensors, MOS-type image sensors, color sensors, CdS cells, and optical fiber scopes. It also deals with position sensors such as proximity switches, differential motion sensors, photoelectric linear scales, and magnetic scales, as well as rotary sensors (with a summary of rotary encoders, their types and applications), flow sensors, and sensing technology.

  6. Resonance-induced sensitivity enhancement method for conductivity sensors

    Science.gov (United States)

    Tai, Yu-Chong (Inventor); Shih, Chi-yuan (Inventor); Li, Wei (Inventor); Zheng, Siyang (Inventor)

    2009-01-01

    Methods and systems for improving the sensitivity of a variety of conductivity sensing devices, in particular capacitively-coupled contactless conductivity detectors. A parallel inductor is added to the conductivity sensor. The sensor with the parallel inductor is operated at a resonant frequency of the equivalent circuit model. At the resonant frequency, parasitic capacitances that are either in series or in parallel with the conductance (and possibly a series resistance) is substantially removed from the equivalent circuit, leaving a purely resistive impedance. An appreciably higher sensor sensitivity results. Experimental verification shows that sensitivity improvements of the order of 10,000-fold are possible. Examples of detecting particulates with high precision by application of the apparatus and methods of operation are described.
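
    The equivalent-circuit argument can be checked numerically with a short sketch. The component values below are illustrative assumptions, not figures from the patent: a conductance G in parallel with a parasitic capacitance C and an added inductance L is purely resistive at the resonant frequency, where the inductive and capacitive susceptances cancel.

      import numpy as np

      C = 10e-12    # parasitic capacitance in farads (assumed)
      G = 1e-6      # solution conductance in siemens (assumed)
      L = 1e-3      # added parallel inductance in henries (assumed)

      f0 = 1.0 / (2.0 * np.pi * np.sqrt(L * C))    # resonant frequency

      def impedance(f):
          w = 2.0 * np.pi * f
          y = G + 1j * w * C + 1.0 / (1j * w * L)  # admittances of parallel branches add
          return 1.0 / y

      for f in (0.5 * f0, f0, 2.0 * f0):
          z = impedance(f)
          print(f"f = {f / 1e6:8.3f} MHz   |Z| = {abs(z):12.1f} ohm   "
                f"phase = {np.degrees(np.angle(z)):6.1f} deg")
      # At f0 the susceptances cancel, |Z| = 1/G, and the cell conductance alone
      # sets the measured impedance, which is the claimed sensitivity benefit.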

  7. Sensors an introductory course

    CERN Document Server

    Kalantar-zadeh, Kourosh

    2013-01-01

    Sensors: An Introductory Course provides an essential reference on the fundamentals of sensors. The book is designed to help readers in developing skills and the understanding required in order to implement a wide range of sensors that are commonly used in our daily lives. This book covers the basic concepts in the sensors field, including definitions and terminologies. The physical sensing effects are described, and devices which utilize these effects are presented. The most frequently used organic and inorganic sensors are introduced and the techniques for implementing them are discussed. This book: Provides a comprehensive representation of the most common sensors and can be used as a reference in relevant fields Presents learning materials in a concise and easy to understand manner Includes examples of how sensors are incorporated in real life measurements Contains detailed figures and schematics to assist in understanding the sensor performance Sensors: An Introductory Course is ideal for university stu...

  8. Coupled wave sensor technology

    International Nuclear Information System (INIS)

    Maki, M.C.

    1988-01-01

    Buried line guided radar sensors have been used successfully for a number of years to provide perimeter security for high value resources. This paper introduces a new complementary sensor advancement at Computing Devices termed 'coupled wave device technology' (CWD). It provides many of the inherent advantages of leaky cable sensors, such as terrain-following and the ability to discriminate between humans and small animals. It also is able to provide a high or wide detection zone, and allows the sensor to be mounted aerially and adjacent to a wall or fence. Several alternative sensors have been developed, which include a single-line sensor, a dual-line hybrid sensor that combines the elements of ported coax and CWD technology, and a rapid-deployment portable sensor for temporary or mobile applications. A description of the technology, the sensors, and their characteristics is provided

  9. Smart Optoelectronic Sensors and Intelligent Sensor Systems

    Directory of Open Access Journals (Sweden)

    Sergey Y. YURISH

    2012-03-01

    Full Text Available Light-to-frequency converters are widely used in various optoelectronic sensor systems. However, a further frequency-to-digital conversion is a bottleneck in such systems due to the broad frequency range of the light-to-frequency converters' outputs. This paper describes an effective OEM design approach, which can be used for the design of smart and intelligent sensor systems. The design is based on a novel, multifunctional integrated circuit, the Universal Sensors & Transducers Interface, especially designed for such sensor applications. Experimental results have confirmed the efficiency of this approach and its high metrological performance.

  10. Very fast road database verification using textured 3D city models obtained from airborne imagery

    Science.gov (United States)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV-images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of roads. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on the time issue and availability of a geo-database for buildings, the urban terrain reconstruction procedure has semantic models
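
    The Dempster-Shafer step mentioned above can be sketched as follows. The mapping from the two per-method distributions to masses on {correct}, {incorrect} and the full frame is a plausible reading of the abstract, not the paper's exact formulation, and the numbers are invented.

      def to_mass(p_correct, p_applicable):
          # Discount the road-state distribution by the model-applicability distribution.
          return {"C": p_correct * p_applicable,
                  "I": (1.0 - p_correct) * p_applicable,
                  "U": 1.0 - p_applicable}      # mass assigned to the whole frame ("unknown")

      def combine(m1, m2):
          # Dempster's rule of combination on the frame {correct, incorrect}.
          conflict = m1["C"] * m2["I"] + m1["I"] * m2["C"]
          norm = 1.0 - conflict
          return {"C": (m1["C"] * m2["C"] + m1["C"] * m2["U"] + m1["U"] * m2["C"]) / norm,
                  "I": (m1["I"] * m2["I"] + m1["I"] * m2["U"] + m1["U"] * m2["I"]) / norm,
                  "U": (m1["U"] * m2["U"]) / norm}

      m_a = to_mass(p_correct=0.9, p_applicable=0.8)   # method well suited to this road type
      m_b = to_mass(p_correct=0.6, p_applicable=0.3)   # method whose road model barely applies
      print(combine(m_a, m_b))   # most mass on "C" (correct), some mass left on "U" (unknown)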

  11. Towards Sensor Database Systems

    DEFF Research Database (Denmark)

    Bonnet, Philippe; Gehrke, Johannes; Seshadri, Praveen

    2001-01-01

    . These systems lack flexibility because data is extracted in a predefined way; also, they do not scale to a large number of devices because large volumes of raw data are transferred regardless of the queries that are submitted. In our new concept of sensor database system, queries dictate which data is extracted from the sensors. In this paper, we define the concept of sensor databases mixing stored data represented as relations and sensor data represented as time series. Each long-running query formulated over a sensor database defines a persistent view, which is maintained during a given time interval. We also describe the design and implementation of the COUGAR sensor database system.

  12. Change Detection with Polarimetric SAR Imagery for Nuclear Verification

    International Nuclear Information System (INIS)

    Canty, M.

    2015-01-01

    This paper investigates the application of multivariate statistical change detection with high-resolution polarimetric SAR imagery acquired from commercial satellite platforms for observation and verification of nuclear activities. A prototype software tool comprising a processing chain starting from single look complex (SLC) multitemporal data through to change detection maps is presented. Multivariate change detection algorithms applied to polarimetric SAR data are not common. This is because, up until recently, not many researchers or practitioners have had access to polarimetric data. However with the advent of several spaceborne polarimetric SAR instruments such as the Japanese ALOS, the Canadian Radarsat-2, the German TerraSAR-X, the Italian COSMO-SkyMed missions and the European Sentinal SAR platform, the situation has greatly improved. There is now a rich source of weather-independent satellite radar data which can be exploited for Nuclear Safeguards purposes. The method will also work for univariate data, that is, it is also applicable to scalar or single polarimetric SAR data. The change detection procedure investigated here exploits the complex Wishart distribution of dual and quad polarimetric imagery in look-averaged covariance matrix format in order to define a per-pixel change/no-change hypothesis test. It includes approximations for the probability distribution of the test statistic, and so permits quantitative significance levels to be quoted for change pixels. The method has been demonstrated previously with polarimetric images from the airborne EMISAR sensor, but is applied here for the first time to satellite platforms. In addition, an improved multivariate method is used to estimate the so-called equivalent number of looks (ENL), which is a critical parameter of the hypothesis test. (author)
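
    A simplified per-pixel version of the complex-Wishart likelihood-ratio test can be sketched as below. This omits the small-sample corrections used in the full method, so the chi-square threshold is only approximate; the covariance matrices and look number are illustrative assumptions.

      import numpy as np
      from scipy.stats import chi2

      def wishart_change(C1, C2, n_looks, significance=0.01):
          # C1, C2: look-averaged p x p polarimetric covariance matrices of the two dates.
          p = C1.shape[0]
          X, Y = n_looks * C1, n_looks * C2                 # back to Wishart sum matrices
          lnQ = n_looks * (2 * p * np.log(2.0)
                           + np.log(np.linalg.det(X).real)
                           + np.log(np.linalg.det(Y).real)
                           - 2 * np.log(np.linalg.det(X + Y).real))
          test_stat = -2.0 * lnQ                            # roughly chi-square with p*p dof
          return test_stat > chi2.ppf(1.0 - significance, p * p)

      # Dual-pol toy example: identical matrices versus a strongly brightened one.
      C = np.array([[1.0, 0.3 + 0.1j], [0.3 - 0.1j, 0.5]])
      print(wishart_change(C, C, n_looks=13))         # False: no change detected
      print(wishart_change(C, 4.0 * C, n_looks=13))   # True: change detected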

  13. Automated Verification of Spatial Resolution in Remotely Sensed Imagery

    Science.gov (United States)

    Davis, Bruce; Ryan, Robert; Holekamp, Kara; Vaughn, Ronald

    2011-01-01

    Image spatial resolution characteristics can vary widely among sources. In the case of aerial-based imaging systems, the image spatial resolution characteristics can even vary between acquisitions. In these systems, aircraft altitude, speed, and sensor look angle all affect image spatial resolution. Image spatial resolution needs to be verified with estimators that include the ground sample distance (GSD), the modulation transfer function (MTF), and the relative edge response (RER), all of which are key components of image quality, along with signal-to-noise ratio (SNR) and dynamic range. Knowledge of spatial resolution parameters is important to determine if features of interest are distinguishable in imagery or associated products, and to develop image restoration algorithms. An automated Spatial Resolution Verification Tool (SRVT) was developed to rapidly determine the spatial resolution characteristics of remotely sensed aerial and satellite imagery. Most current methods for assessing spatial resolution characteristics of imagery rely on pre-deployed engineered targets and are performed only at selected times within preselected scenes. The SRVT addresses these insufficiencies by finding uniform, high-contrast edges from urban scenes and then using these edges to determine standard estimators of spatial resolution, such as the MTF and the RER. The SRVT was developed using the MATLAB programming language and environment. This automated software algorithm assesses every image in an acquired data set, using edges found within each image, and in many cases eliminating the need for dedicated edge targets. The SRVT automatically identifies high-contrast, uniform edges and calculates the MTF and RER of each image, and when possible, within sections of an image, so that the variation of spatial resolution characteristics across the image can be analyzed. The automated algorithm is capable of quickly verifying the spatial resolution quality of all images within a data
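
    The edge-based estimators named above (RER and MTF derived from a high-contrast edge) can be sketched roughly as follows. This is not the SRVT code: the RER here is taken across one pixel on either side of the edge sample rather than the usual +/-0.5 pixel interpolation, and the edge profile is synthetic.

      import numpy as np

      def edge_metrics(esf, pixel_pitch=1.0):
          # esf: sampled edge spread function, one sample per pixel across the edge.
          esf = np.asarray(esf, dtype=float)
          esf = (esf - esf.min()) / (esf.max() - esf.min())     # normalize to 0..1
          centre = int(np.argmin(np.abs(esf - 0.5)))            # pixel nearest the edge
          rer = esf[centre + 1] - esf[centre - 1]               # response across one pixel each side
          lsf = np.gradient(esf, pixel_pitch)                   # line spread function
          mtf = np.abs(np.fft.rfft(lsf))
          mtf /= mtf[0]                                         # unit modulation at zero frequency
          freqs = np.fft.rfftfreq(len(lsf), d=pixel_pitch)      # cycles per pixel
          return rer, freqs, mtf

      profile = [10, 10, 11, 14, 30, 70, 96, 99, 100, 100]      # synthetic edge profile
      rer, freqs, mtf = edge_metrics(profile)
      print(f"RER ~ {rer:.2f}, MTF at the Nyquist frequency ~ {mtf[freqs >= 0.5][0]:.2f}")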

  14. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines.The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  15. Advanced verification methods for OVI security ink

    Science.gov (United States)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine-readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high speed modules were fabricated and tested in a state of the art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.

  16. Nonintrusive verification attributes for excess fissile materials

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Eccleston, G.W.; Fearey, B.L.

    1997-10-01

    Under US initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The US has expressed the intent to place these materials under international inspections as soon as practicable. To support these commitments, members of the US technical community are examining a variety of nonintrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of the work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.), especially those related to classified materials and items. The authors are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status

  17. Advanced Technologies for Design Information Verification

    International Nuclear Information System (INIS)

    Watkins, Michael L.; Sheen, David M.; Rose, Joseph L.; Cumblidge, Stephen E.

    2009-01-01

    This paper discusses several technologies that have the potential to enhance facilities design verification. These approaches have shown promise in addressing the challenges associated with the verification of sub-component geometry and material composition for structures that are not directly accessible for physical inspection. A simple example is a pipe that extends into or through a wall or foundation. Both advanced electromagnetic and acoustic modalities will be discussed. These include advanced radar imaging, transient thermographic imaging, and guided acoustic wave imaging. Examples of current applications are provided. The basic principles and mechanisms of these inspection techniques are presented along with the salient practical features, advantages, and disadvantages of each technique. Other important considerations, such as component geometries, materials, and degree of access are also treated. The importance of, and strategies for, developing valid inspection models are also discussed. Beyond these basic technology adaptation and evaluation issues, important user interface considerations are outlined, along with approaches to quantify the overall performance reliability of the various inspection methods.

  18. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and that it is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
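
    The report abstract does not reproduce the model equations; the following is a minimal sketch of the basic hydrostatic-column relation such a tool rests on, in which wellhead pressure is the pressure at a reference depth minus the cumulative hydrostatic head of the fluid segments stacked above it. The fluid stack, densities, and pressures are assumed for illustration and are not taken from the SPR software.

      # Minimal hydrostatic-column sketch (illustrative only; not the SPR HCM code).
      # Wellhead pressure = pressure at a reference depth minus the hydrostatic
      # head of the fluid segments stacked above that depth.

      G = 9.81  # gravitational acceleration, m/s^2

      def wellhead_pressure(p_ref_pa, segments):
          """segments: list of (density_kg_m3, height_m) tuples from the wellhead
          down to the reference depth. Returns wellhead pressure in Pa."""
          head = sum(rho * G * h for rho, h in segments)
          return p_ref_pa - head

      # Example with an assumed nitrogen / crude oil / brine stack:
      segments = [
          (180.0, 300.0),   # compressed nitrogen blanket (assumed mean density)
          (850.0, 450.0),   # crude oil column
          (1200.0, 150.0),  # brine below the oil/brine interface
      ]
      p_ref = 12.0e6  # assumed pressure at the reference depth, Pa
      print(f"wellhead pressure: {wellhead_pressure(p_ref, segments) / 1e6:.2f} MPa")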

  19. Automatic quality verification of the TV sets

    Science.gov (United States)

    Marijan, Dusica; Zlokolica, Vladimir; Teslic, Nikola; Pekovic, Vukota; Temerinac, Miodrag

    2010-01-01

    In this paper we propose a methodology for TV set verification, intended for detecting picture quality degradation and functional failures within a TV set. In the proposed approach we compare the TV picture captured from a TV set under investigation with the reference image for the corresponding TV set in order to assess the captured picture quality and therefore the acceptability of TV set quality. The methodology framework comprises a logic block for designing the verification process flow, a block for TV set quality estimation (based on image quality assessment), and a block for generating the defect tracking database. The quality assessment algorithm is a full-reference intra-frame approach which aims at detecting various TV-set-specific picture degradations caused by TV system hardware and software failures and by erroneous operational modes and settings in TV sets. The proposed algorithm is a block-based scheme which incorporates the mean square error and a local variance between the reference and the tested image. The artifact detection algorithm is shown to be highly robust against brightness and contrast changes in TV sets. The algorithm is evaluated by performance comparison with other state-of-the-art image quality assessment metrics in terms of detecting TV picture degradations, such as illumination and contrast change, compression artifacts, picture misalignment, aliasing, blurring, and other types of degradations that are due to defects within the TV set video chain.
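
    As a rough illustration of the block-based comparison described above (per-block mean square error combined with a local-variance term between the reference and captured frames), here is a minimal NumPy sketch; the block size, the way the two statistics are combined, and the decision threshold are assumptions, not the authors' algorithm.

      # Minimal block-based full-reference check combining per-block MSE with a
      # local-variance difference (illustrative; not the authors' algorithm).
      import numpy as np

      def block_degradation_map(ref, test, block=16):
          """ref, test: 2-D grayscale arrays of equal shape. Returns a per-block score map."""
          rows, cols = ref.shape[0] // block, ref.shape[1] // block
          scores = np.zeros((rows, cols))
          for i in range(rows):
              for j in range(cols):
                  r = ref[i*block:(i+1)*block, j*block:(j+1)*block].astype(float)
                  t = test[i*block:(i+1)*block, j*block:(j+1)*block].astype(float)
                  mse = np.mean((r - t) ** 2)
                  var_diff = abs(r.var() - t.var())
                  scores[i, j] = mse + var_diff  # assumed combination rule
          return scores

      def frame_is_degraded(ref, test, threshold=50.0):
          # Flag the frame if any block exceeds an (assumed) degradation threshold.
          return bool((block_degradation_map(ref, test) > threshold).any())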

  20. Subsurface barrier verification technologies, informal report

    International Nuclear Information System (INIS)

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Uses of subsurface barriers include surrounding and/or containing buried waste, providing secondary confinement of underground storage tanks, directing or containing subsurface contaminant plumes, and restricting remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and, depending on use, have few or no breaches. A breach may form through numerous pathways, including discontinuous grout application, joints between panels, and cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers make detection of leaks challenging, and the problem is magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification.

  1. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  2. Clinical verification in homeopathy and allergic conditions.

    Science.gov (United States)

    Van Wassenhoven, Michel

    2013-01-01

    The literature on clinical research in allergic conditions treated with homeopathy includes a meta-analysis of randomised controlled trials (RCT) for hay fever with positive conclusions and two positive RCTs in asthma. Cohort surveys using validated Quality of Life questionnaires have shown improvement in asthma in children, general allergic conditions and skin diseases. Economic surveys have shown positive results in eczema, allergy, seasonal allergic rhinitis, asthma, food allergy and chronic allergic rhinitis. This paper reports clinical verification of homeopathic symptoms in all patients and especially in various allergic conditions in my own primary care practice. For preventive treatments in hay fever patients, Arsenicum album was the most effective homeopathic medicine followed by Nux vomica, Pulsatilla pratensis, Gelsemium, Sarsaparilla, Silicea and Natrum muriaticum. For asthma patients, Arsenicum iodatum appeared most effective, followed by Lachesis, Calcarea arsenicosa, Carbo vegetabilis and Silicea. For eczema and urticaria, Mezereum was most effective, followed by Lycopodium, Sepia, Arsenicum iodatum, Calcarea carbonica and Psorinum. The choice of homeopathic medicine depends on the presence of other associated symptoms and 'constitutional' features. Repertories should be updated by including results of such clinical verifications of homeopathic prescribing symptoms.

  3. DESIGN INFORMATION VERIFICATION FOR NUCLEAR SAFEGUARDS

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Bean; Richard R. M. Metcalf; Phillip C. Durst

    2009-07-01

    A critical aspect of international safeguards activities performed by the International Atomic Energy Agency (IAEA) is the verification that facility design and construction (including upgrades and modifications) do not create opportunities for nuclear proliferation. These Design Information Verification (DIV) activities require that IAEA inspectors compare current and past information about the facility to verify the operator’s declaration of proper use. The actual practice of DIV presents challenges to the inspectors due to the large amount of data generated, concerns about sensitive or proprietary data, the overall complexity of the facility, and the effort required to extract just the safeguards-relevant information. Planned and anticipated facilities will (especially in the case of reprocessing plants) be ever larger and increasingly complex, thus exacerbating the challenges. This paper reports the results of a workshop held at the Idaho National Laboratory in March 2009, which considered technologies and methods to address these challenges. The use of 3D Laser Range Finding, Outdoor Visualization System, Gamma-LIDAR, and virtual facility modeling, as well as methods to handle the facility data issues (quantity, sensitivity, and accessibility and portability for the inspector), were presented. The workshop attendees drew conclusions about the use of these techniques with respect to successfully employing them in an operating environment, using a Fuel Conditioning Facility walk-through as a baseline for discussion.

  4. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  5. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  6. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42mm miniature replica step gauge developed for optical scanner verification. Errors quantification and optimization of CT system set-up in terms of resolution and measurement accuracy are fundamental for use of CT scanning...

  7. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  8. Modular Verification of Linked Lists with Views via Separation Logic

    DEFF Research Database (Denmark)

    Jensen, Jonas Braband; Birkedal, Lars; Sestoft, Peter

    2011-01-01

    We present a separation logic specification and verification of linked lists with views, a data structure from the C5 collection library for .NET. A view is a generalization of the well-known concept of an iterator. Linked lists with views form an interesting case study for verification since...

  9. A transformation of SDL specifications : a step towards the verification

    NARCIS (Netherlands)

    Ioustinova, N.; Sidorova, N.; Bjorner, D.; Broy, M.; Zamulin, A.

    2001-01-01

    Industrial-size specifications/models (whose state space is often infinite) cannot be model checked in a direct way; instead, a verification model of the system is model checked. Program transformation is a way to build a finite-state verification model that can be submitted to a model checker.

  10. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a portable test system for verifying the operating condition of neutron area monitors. The device will allow users to check, at the installations where the instruments are used, that their calibration has been maintained, avoiding the use of equipment whose response to the neutron field is inadequate.

  11. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    From Title 37 (Patents, Trademarks, and Copyrights), Copyright Arbitration Royalty Panel Rules and Procedures, Rates and Terms for Preexisting Subscription ... (2010-07-01): provides for verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms...

  12. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    From Title 21 (Food and Drugs), Section 21.44, Verification of identity (2010-04-01): (a) An individual seeking access to records in a Privacy Act Record System may be... The identification required shall be suitable considering the nature of the records sought. No...

  13. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the ... logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  14. Neutron spectrometric methods for core inventory verification in research reactors

    International Nuclear Information System (INIS)

    Ellinger, A.; Filges, U.; Hansen, W.; Knorr, J.; Schneider, R.

    2002-01-01

    As a consequence of Non-Proliferation Treaty safeguards, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate, and the inspection methods are continually improved. The Core Inventory Verification method is therefore being developed as an indirect method for verifying the core inventory and checking the declared operation of research reactors.

  15. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    From Title 40 (Protection of Environment), ... Pollution Controls, Engine-Testing Procedures, Calculations and Data Requirements, § 1065.675 CLD quench verification calculations (2010-07-01): Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  16. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    From Title 37 (Patents, Trademarks, and Copyrights), Copyright Arbitration Royalty Panel Rules and Procedures, Rates and Terms for Certain Eligible ... (2010-07-01): ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and...

  17. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    From Title 45 (Public Welfare), ... Corporation Restrictions on Legal Assistance to Aliens, § 1626.7 Verification of eligible alien status (2010-10-01): (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  18. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  19. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes into compliance with verification and validation requirements.

  20. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V consists of two separate tasks: verification, which is a mathematical issue aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn consists of code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
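
    As a concrete illustration of the solution-verification step mentioned above, the sketch below applies Richardson extrapolation to a quantity computed on three systematically refined grids, estimating the observed order of accuracy and the fine-grid discretization error. The formulas are the standard textbook ones and the numbers are assumed; none of this is taken from the GBS code.

      # Solution-verification sketch: Richardson extrapolation on three grids with a
      # constant refinement ratio r (standard formulas; values below are assumed).
      import math

      def observed_order(f_coarse, f_medium, f_fine, r):
          """Observed order of accuracy p from solutions on three grids, where
          r = h_coarse / h_medium = h_medium / h_fine."""
          return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

      def richardson_extrapolate(f_medium, f_fine, r, p):
          """Estimate of the grid-converged value and of the fine-grid error."""
          f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
          return f_exact, abs(f_exact - f_fine)

      # Example with an assumed second-order-convergent quantity:
      f_coarse, f_medium, f_fine = 1.0400, 1.0100, 1.0025
      p = observed_order(f_coarse, f_medium, f_fine, r=2.0)
      f_ext, err = richardson_extrapolate(f_medium, f_fine, r=2.0, p=p)
      print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_ext:.4f}, error ~ {err:.1e}")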