Sample records for verification test dvt

  1. AXAF-I Low Intensity-Low Temperature (LILT) Testing of the Development Verification Test (DVT) Solar Panel (United States)

    Alexander, Doug; Edge, Ted; Willowby, Doug


    The planned orbit of the AXAF-I spacecraft will subject it to both short eclipses (less than 30 minutes for solar, less than 2 hours for lunar) and long Earth and lunar eclipses with combined conjunctive durations of up to 3 to 4 hours. Lack of proper Electrical Power System (EPS) conditioning prior to eclipse may cause loss of mission. To avoid this problem for short eclipses, it is necessary to off-point the solar array prior to or at the beginning of the eclipse to reduce the battery state of charge (SOC). This yields less overcharge during the high charge currents at sun entry. For long lunar eclipses, solar array pointing and load scheduling must be tailored to the profile of the eclipse. The battery SOC, loads, and solar array current-voltage (I-V) characteristics must be known or predictable to maintain the bus voltage within an acceptable range. To address engineering concerns about the electrical performance of the AXAF-I solar array under Low Intensity and Low Temperature (LILT) conditions, Marshall Space Flight Center (MSFC) engineers undertook special testing of the AXAF-I Development Verification Test (DVT) solar panel in September-November 1997. In the test, the DVT panel was installed in a thermal vacuum chamber with a large view window and a mechanical "flapper door". The DVT panel was "flash" tested with a Large Area Pulse Solar Simulator (LAPSS) at various fractional sun intensities and panel (solar cell) temperatures. The testing was unique with regard to the large size of the test article and the type of testing performed. The test setup, results, and lessons learned from the testing are presented.
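    The off-pointing strategy described above can be sketched as simple coulomb-counting bookkeeping. This is an illustrative sketch only; the capacity, current, and duration values below are invented, not AXAF-I flight numbers.

```python
# Illustrative coulomb-counting sketch of pre-eclipse off-pointing.
# All numbers (capacity, currents, durations) are hypothetical.

def soc_after(soc0, current_a, hours, capacity_ah):
    """Return battery state of charge after carrying current_a for the
    given duration (negative current = discharge), clamped to [0, 1]."""
    soc = soc0 + (current_a * hours) / capacity_ah
    return max(0.0, min(1.0, soc))

capacity = 40.0   # Ah, hypothetical
soc = 1.0         # fully charged before the maneuver

# Off-pointing the array for 30 minutes before a solar eclipse puts a
# 10 A load on the battery, deliberately lowering SOC so that the high
# charge currents at sun entry produce less overcharge.
soc = soc_after(soc, -10.0, 0.5, capacity)
print(f"SOC at eclipse entry: {soc:.3f}")
```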


    Energy Technology Data Exchange (ETDEWEB)

    Moran, B


    We present analytic solutions to two test problems that can be used to check the hydrodynamic implementation in computer codes designed to calculate the propagation of shocks in spherically convergent geometry. Our analysis is restricted to fluid materials with constant bulk modulus. In the first problem we present the exact initial acceleration and pressure gradient at the outer surface of a sphere subjected to an exponentially decaying pressure of the form P(t) = P_0 e^(-at). We show that finely-zoned hydro-code simulations are in good agreement with our analytic solution. In the second problem we discuss the implosions of incompressible spherical fluid shells and we present the radial pressure profile across the shell thickness. We also discuss a semi-analytic solution to the time-evolution of a nearly spherical shell with arbitrary but small initial 3-dimensional (3-D) perturbations on its inner and outer surfaces.
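    The applied boundary pressure in such a test can be evaluated directly from the stated form. The values of P_0 and a below are arbitrary illustrative choices, not those of the report.

```python
import math

# Evaluate the decaying boundary pressure P(t) = P0 * exp(-a*t) that a
# hydro code's applied boundary condition can be checked against.
P0, a = 1.0e9, 2.0   # Pa, 1/s (hypothetical values)

def pressure(t):
    return P0 * math.exp(-a * t)

print(pressure(0.0))        # full P0 at t = 0
print(pressure(0.5) / P0)   # ≈ exp(-1) ≈ 0.3679
```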

  3. Power Reactant Storage Assembly (PRSA) (Space Shuttle). PRSA hydrogen and oxygen DVT tank refurbishment (United States)


    The Power Reactant Storage Assembly (PRSA) liquid hydrogen Development Verification Test (H2 DVT) tank assembly (Beech Aircraft Corporation P/N 15548-0116-1, S/N 07399000SHT0001) and liquid oxygen (O2) DVT tank assembly (Beech Aircraft Corporation P/N 15548-0115-1, S/N 07399000SXT0001) were refurbished by Ball Electro-Optics and Cryogenics Division to provide NASA JSC, Propulsion and Power Division, the capability of performing engineering tests. The refurbishments incorporated the latest flight-configuration hardware and avionics changes necessary to make the tanks function like flight articles. This final report summarizes these refurbishment activities. Also included are up-to-date records of the pressure-time and cycle histories.

  4. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]


    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code, and to be used in conjunction with exact solutions to these problems generated using ExactPack.
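    The way an exact solution enters such a verification workflow can be sketched generically: the code's output is compared to the exact field in a discrete error norm. The `exact` function below is a stand-in, not the ExactPack API.

```python
import math

# Generic code-verification step: discrete L2 error between a simulated
# field and samples of an exact solution. `exact` is a stand-in function.

def l2_error(xs, simulated, exact):
    """Discrete L2 error between simulated values and exact(x) samples."""
    n = len(xs)
    return math.sqrt(sum((s - exact(x)) ** 2 for x, s in zip(xs, simulated)) / n)

exact = lambda x: math.sin(math.pi * x)      # stand-in exact solution
xs = [i / 10 for i in range(11)]
simulated = [exact(x) + 0.01 for x in xs]    # "code output" with a small bias
print(round(l2_error(xs, simulated, exact), 4))   # → 0.01
```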


    Energy Technology Data Exchange (ETDEWEB)

    Aleman, S


    The PORFLOW software package is a comprehensive mathematical model for simulation of multi-phase fluid flow, heat transfer, and mass transport in variably saturated porous and fractured media. PORFLOW can simulate transient or steady-state problems in Cartesian or cylindrical geometry. The porous medium may be anisotropic and heterogeneous and may contain discrete fractures or boreholes within the porous matrix. The theoretical models within the code provide a unified treatment of concepts relevant to fluid flow and transport. The main features of PORFLOW that are relevant to Performance Assessment modeling at the Savannah River National Laboratory (SRNL) include variably saturated flow and transport of parent and progeny radionuclides. This document describes the testing of a relevant sample of problems in PORFLOW and compares the outcomes of the simulations to analytical solutions or to other commercial codes. The testing consists of the following four groups. Group 1: Groundwater Flow; Group 2: Contaminant Transport; Group 3: Numerical Dispersion; and Group 4: Keyword Commands.
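    A classic analytic benchmark of the kind used to check contaminant-transport simulations is the Ogata-Banks solution for one-dimensional advection-dispersion from a continuous source. This is a generic example, not necessarily one of the PORFLOW test-group problems, and the parameter values are illustrative.

```python
import math

# Ogata-Banks analytic solution for 1D advection-dispersion of a
# continuous concentration source C0 at x = 0:
#   C/C0 = 0.5 * [erfc((x - v t)/(2 sqrt(D t)))
#                 + exp(v x / D) * erfc((x + v t)/(2 sqrt(D t)))]

def ogata_banks(x, t, v, D, c0=1.0):
    """Concentration at distance x and time t for velocity v, dispersion D."""
    s = 2.0 * math.sqrt(D * t)
    term1 = math.erfc((x - v * t) / s)
    term2 = math.exp(v * x / D) * math.erfc((x + v * t) / s)
    return 0.5 * c0 * (term1 + term2)

# At x = v*t the first term is erfc(0) = 1, so C/C0 sits a bit above 0.5:
print(round(ogata_banks(x=1.0, t=1.0, v=1.0, D=0.5), 3))   # ≈ 0.668
```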

  6. Infrared scanner concept verification test report (United States)

    Bachtel, F. D.


    The test results from a concept verification test conducted to assess the use of an infrared scanner as a remote temperature sensing device for the Space Shuttle program are presented. The temperature and geometric resolution limits, atmospheric attenuation effects (including conditions with fog and rain), and the problem of surface emissivity variations are included. It is concluded that the basic concept of using an infrared scanner to determine near-freezing surface temperatures is feasible. The major problem identified concerns infrared reflections, which result in significant errors if not controlled. Actions taken to manage these errors result in design and operational constraints to control the viewing angle and surface emissivity.
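    Why emissivity and reflections matter can be illustrated with a gray-body radiometric sketch. This uses a total-band Stefan-Boltzmann idealization rather than the scanner's actual spectral band, and all temperatures and emissivities are invented for illustration.

```python
# Gray-body sketch: a surface at t_surface with emissivity < 1 also
# reflects background radiation, so an ideal total-band scanner that
# assumes a blackbody reads a biased "apparent" temperature.

def apparent_temp_k(t_surface, t_background, emissivity):
    """Blackbody-equivalent temperature seen by an ideal total-band scanner."""
    radiance = emissivity * t_surface**4 + (1 - emissivity) * t_background**4
    return radiance ** 0.25

t_true = 273.0   # near-freezing surface, K (illustrative)
t_sky = 230.0    # cold sky background, K (illustrative)
for eps in (0.95, 0.80):
    t_app = apparent_temp_k(t_true, t_sky, eps)
    print(f"emissivity {eps}: apparent {t_app:.1f} K (error {t_app - t_true:+.1f} K)")
```

    The lower the emissivity, the larger the cold-sky bias, which is consistent with the report's conclusion that viewing angle and surface emissivity must be controlled.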

  7. Design, analysis, and test verification of advanced encapsulation system (United States)

    Garcia, A.; Minning, C.


    Procurement of 4 in x 4 in polycrystalline solar cells proceeded with some delays. A total of 1200 cells were procured for use in both the verification testing and the qualification testing. Additional thermal structural analyses were run and the data are presented. An outline of the verification testing is included with information on test specimen construction.

  8. Fracture mechanics life analytical methods verification testing (United States)

    Favenesi, J. A.; Clemons, T. G.; Riddell, W. T.; Ingraffea, A. R.; Wawrzynek, P. A.


    The objective was to evaluate NASCRAC (trademark) version 2.0, a second generation fracture analysis code, for verification and validity. NASCRAC was evaluated using a combination of comparisons to the literature, closed-form solutions, numerical analyses, and tests. Several limitations and minor errors were detected. Additionally, a number of major flaws were discovered. These major flaws were generally due to application of a specific method or theory, not due to programming logic. Results are presented for the following program capabilities: K versus a, J versus a, crack opening area, life calculation due to fatigue crack growth, tolerable crack size, proof test logic, tearing instability, creep crack growth, crack transitioning, crack retardation due to overloads, and elastic-plastic stress redistribution. It is concluded that the code is an acceptable fracture tool for K solutions of simplified geometries, for a limited number of J and crack opening area solutions, and for fatigue crack propagation with the Paris equation and constant amplitude loads when the Paris equation is applicable.
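    The constant-amplitude Paris-equation regime in which the code was judged acceptable can be sketched as a numerical life integration. All constants below (Paris C and m, stress range, crack sizes, and a geometry factor of 1) are invented for illustration; they are not values from the NASCRAC evaluation.

```python
import math

# Hedged sketch of fatigue life under the Paris equation,
#   da/dN = C * (dK)^m,  with dK = dsigma * sqrt(pi * a)
# (geometry factor Y = 1 assumed). Constants are illustrative only.
C, m = 1e-11, 3.0      # Paris constants (da/dN in m/cycle, K in MPa*sqrt(m))
dsigma = 100.0         # constant-amplitude stress range, MPa
a0, ac = 0.001, 0.01   # initial and critical crack lengths, m

def cycles(a0, ac, steps=100000):
    """Numerically integrate dN = da / (C * dK**m) from a0 to ac."""
    da = (ac - a0) / steps
    n = 0.0
    a = a0
    for _ in range(steps):
        dK = dsigma * math.sqrt(math.pi * (a + da / 2))  # midpoint dK
        n += da / (C * dK ** m)
        a += da
    return n

print(f"{cycles(a0, ac):.0f} cycles")   # on the order of 8e5 cycles
```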

  9. Orbit attitude processor. STS-1 bench program verification test plan (United States)

    Mcclain, C. R.


    A plan for the static verification of the STS-1 ATT PROC ORBIT software requirements is presented. The orbit version of the SAPIENS bench program is used to generate the verification data. A brief discussion of the simulation software and flight software modules is presented along with a description of the test cases.

  10. [News in the work-up of deep vein thrombosis (DVT)]. (United States)

    Wautrecht, J-C


    Deep vein thrombosis (DVT) is a component of venous thromboembolism (VTE), the other being pulmonary embolism (PE). Its incidence is 1 to 2 per 1,000 per year, and nearly 1 per 100 per year after age 80. The major complication of DVT is PE, which occurs in about one third of cases and is often asymptomatic but can be fatal. Another common complication, occurring in 20-50% of cases, is the post-thrombotic syndrome (PTS), which is likely to impair quality of life. Several issues remain unanswered when considering DVT. The optimal management of distal versus proximal DVT is not well codified. The diagnostic approach to DVT is essential: it is based on the estimation of clinical probability, the possible use of D-dimer testing, and compression ultrasonography. The new direct oral anticoagulants (NOACs) have been proven effective in phase 3 studies, but when should they be used and which should be chosen in real life? Wearing compression stockings to prevent PTS is recommended, but what is the definition of compression stockings and is there evidence of their efficacy? The purpose of this article is to provide useful information to primary care physicians managing DVT.
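    The diagnostic sequence the article describes (clinical probability, then selective D-dimer, then compression ultrasound) can be sketched as simple triage logic. This is a simplified illustration of the workflow, not a clinical decision rule; the two-level probability split is an assumption.

```python
# Hedged sketch of the suspected-DVT work-up sequence:
# clinical probability -> D-dimer (if low probability) -> ultrasound.

def dvt_workup(probability, d_dimer_positive=None):
    """Return the next step for a suspected DVT.

    probability: 'low' or 'high' pre-test clinical probability.
    d_dimer_positive: result of the D-dimer test, if already performed.
    """
    if probability == "low":
        if d_dimer_positive is None:
            return "order D-dimer"
        return "compression ultrasound" if d_dimer_positive else "DVT excluded"
    return "compression ultrasound"   # high probability: image directly

print(dvt_workup("low"))                          # → order D-dimer
print(dvt_workup("low", d_dimer_positive=False))  # → DVT excluded
print(dvt_workup("high"))                         # → compression ultrasound
```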

  11. Prediction of deep vein thrombosis after elective hip replacement surgery by preoperative clinical and haemostatic variables: the ECAT DVT Study. European Concerted Action on Thrombosis. (United States)

    Lowe, G D; Haverkate, F; Thompson, S G; Turner, R M; Bertina, R M; Turpie, A G; Mannucci, P M


    The European Concerted Action on Thrombosis (ECAT) DVT Study was a collaborative study of preoperative haemostatic tests in prediction of DVT (diagnosed by routine bilateral venography) after elective hip replacement. 480 patients were recruited in 11 centres across Europe. Clinical risk factors were assessed, and stored citrated plasma aliquots were centrally assayed for 29 haemostatic factors according to the ECAT methodology. 120 (32%) of 375 evaluable patients had DVT, and 41 (11%) had proximal DVT. Among clinical variables, DVT was significantly associated with increased age, obesity, and possibly non-use of stockings. Of the 29 haemostatic factors, mean preoperative levels were significantly higher in patients with subsequent DVT (on univariate analyses) for factor VIII activity, prothrombin fragment F1+2, thrombin-antithrombin complexes, and fibrin D-dimer; and significantly lower for APTT and APC sensitivity ratio. Factor V Leiden was also associated with DVT. Most of these variables were also associated with age, while D-dimer was higher in patients with varicose veins. On multivariate analyses including clinical variables, only a shorter APTT (locally but not centrally performed) and APC resistance showed a statistically significant association with DVT. We conclude that (a) DVT is common after elective hip replacement despite prophylaxis; (b) the study provides some evidence that DVT is associated with a preoperative hypercoagulable state; and (c) preoperative haemostatic tests do not add significantly to prediction of DVT from clinical variables, with the possible exception of APC resistance.
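    The kind of univariate association screen reported above can be sketched as an odds ratio with a confidence interval from a 2x2 table. The counts below are invented for illustration; they are not ECAT study data.

```python
import math

# Odds ratio and approximate 95% CI (Woolf logit method) from a 2x2 table.
# Counts are hypothetical, not from the ECAT DVT Study.

def odds_ratio_ci(a, b, c, d):
    """a, b = DVT / no-DVT with the risk factor; c, d = without it."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log odds ratio
    lo, hi = or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 60, 80, 195)
print(f"OR = {or_:.3f} (95% CI {lo:.2f}-{hi:.2f})")
```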

  12. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G


    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
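    The central quantitative step in such verification analysis is estimating the observed order of convergence from errors on successively refined grids and comparing it with the scheme's nominal order. A minimal sketch, with synthetic errors that behave like a second-order scheme:

```python
import math

# Observed order of convergence p from errors on two grids related by a
# fixed refinement ratio r:  p = log(e_coarse / e_fine) / log(r).

def observed_order(e_coarse, e_fine, refinement=2.0):
    return math.log(e_coarse / e_fine) / math.log(refinement)

# Synthetic errors that shrink like h^2 under grid doubling:
errors = [4.0e-3, 1.0e-3, 2.5e-4]
for ec, ef in zip(errors, errors[1:]):
    print(f"p = {observed_order(ec, ef):.2f}")   # → p = 2.00 (both pairs)
```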

  13. SRS Software Verification Pre Operational and Startup Test

    Energy Technology Data Exchange (ETDEWEB)

    HILL, L.F.


    This document defines testing for the software used to control the Sodium Removal System (SRS). The testing is conducted off-line from the physical plant using a simulator built into the software. This provides verification of proper software operation prior to performing the operational acceptance tests with the actual plant hardware.

  14. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection. (United States)


    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  15. A profile of lower-limb deep-vein thrombosis: the hidden menace of below-knee DVT

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, G.W. [Department of Clinical Radiology, Royal Infirmary of Edinburgh, Edinburgh (United Kingdom); Reid, J.H. [Department of Clinical Radiology, Borders General Hospital, Melrose (United Kingdom); Simpson, A.J. [Department of Respiratory Medicine, Royal Infirmary of Edinburgh, Edinburgh (United Kingdom); Murchison, J.T. [Department of Clinical Radiology, Royal Infirmary of Edinburgh, Edinburgh (United Kingdom)]


    Aims: To describe the anatomical site and laterality of deep-vein thrombosis (DVT) in symptomatic patients using contrast venography (CV), and to assess age, sex distribution, and accuracy of pre-test clinical suspicion of DVT. Methods: One thousand, five hundred and seventy-two patients undergoing CV because of a clinical suspicion of DVT at a large teaching hospital from October 1995 to March 2003 were prospectively studied. Results: Thrombi were demonstrated in 511 (32.5%) of all CV studies. Isolated, below-knee thrombi were identified in 29.4% of positive studies. There was a left-sided predominance of DVT (ratio 1.24:1) that was most evident in the elderly and in more proximal veins. Conclusion: Almost a third of positive cases were shown to be isolated, below-knee thrombi. These are thrombi that are more difficult to detect by non-invasive means. A left-sided predominance of DVT is evident.
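    The headline figures above can be checked with quick arithmetic on the quoted counts and percentages:

```python
# Arithmetic check of the figures quoted in the abstract.
positive, total = 511, 1572
print(f"{positive / total:.1%}")       # → 32.5% of CV studies positive

# 29.4% of positive studies were isolated below-knee thrombi
# (approximate count derived from the rounded percentage):
below_knee = round(0.294 * positive)
print(below_knee)                      # → 150
```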

  16. Hubble Space Telescope-Space Shuttle interface dynamic verification test (United States)

    Blair, Mark A.; Vadlamudi, Nagarjuna


    A test program has been developed for the interface between the Space Shuttle Orbiter and the Hubble Space Telescope which couples a standard modal test for a simple suspended structure with a novel 'interface verification' test. While the free-free modal test is used to verify the high-load-generating structural modes due to the interaction of internal components of the structure with the rest of the structure, the interface verification test verifies the character of the high-load-generating modes in which the structure reacts against the booster interface. The novel method excites the structure at a single payload-booster interface DOF, while all other interfaces are left free to move.

  17. Grid Modernization Laboratory Consortium - Testing and Verification

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob; Kim, Tom; Ellis, Abraham


    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.

  18. Testing and Formal Verification of Logarithmic Function Design (United States)

    Agarwal, Sanjeev; Bhuria, Indu


    A logarithmic function has been designed on the basis of multiplicative normalization, and its testing has been done using tetraMAX. It is observed that the design can contain 7050 possible faults, and tetraMAX ATPG can provide test coverage of 99.29%. Using Design Compiler, a .db file is generated, which is used for functional verification of the design with respect to the RTL design. Compare points are shown by cone views of the design.

  19. Clinical verification of a unilateral otolith test (United States)

    Wetzig, J.; Hofstetter-Degen, K.; Maurer, J.; von Baumgarten, R. J.

    In a previous study [13] we reported promising results for a new test to differentiate in vivo unilateral otolith functions. That study pointed to a need for further validation on known pathological cases. In this presentation we detail the results gathered on a group of clinically verified vestibular defectives (verum) and a normal (control) group. The subjects in the verum group were former patients of the ENT clinic of the university hospital. These subjects had usually suffered from neurinoma of the VIIth cranial nerve or inner ear infections. All had required surgical intervention including removal of the vestibular system. The patients were contacted usually two or more years postoperatively. A group of students from the pre-clinical and clinical phases of medical training served as controls. Both groups were subjected to standardized clinical tests. These tests served to reconfirm the intra- or postoperative diagnosis of unilateral vestibular loss in the verum group. In the control group they had to establish the normalcy of the responses of the vestibular system. Both groups then underwent testing on our eccentric rotary chair in the manner described before [13]. Preliminary results of the trials indicate that this test may indeed, for the first time, offer a chance to look at an isolated otolith apparatus in vivo.

  20. Dual Mode Inverter Control Test Verification

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, J.M.


    Permanent magnet motors with either sinusoidal back emf (permanent magnet synchronous motor [PMSM]) or trapezoidal back emf (brushless dc motor [BDCM]) do not have the ability to alter the air gap flux density (field weakening). Since the back emf increases with speed, the system must be designed to operate with the voltage obtained at its highest speed. Oak Ridge National Laboratory's (ORNL) Power Electronics and Electric Machinery Research Center (PEEMRC) has developed a dual mode inverter controller (DMIC) that overcomes this disadvantage. This report summarizes the results of tests to verify its operation. The standard PEEMRC 75 kW hard-switched inverter was modified to implement the field weakening procedure (silicon controlled rectifier enabled phase advance). A 49.5 hp motor rated at 2800 rpm was derated to a base of 400 rpm and 7.5 hp. The load, developed by a Kahn Industries hydraulic dynamometer, was measured with a Himmelstein and Company MCRT9-02TS torque meter. At the base conditions a current of 212 amperes produced the 7.5 hp. Tests were run at 400, 1215, and 2424 rpm. In each run, the current was no greater than 214 amperes. The horsepower values obtained in the three runs were 7.5, 9.3, and 8.12. These results verified the basic operation of the DMIC in producing a constant power speed ratio (CPSR) of six.
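    The constant-power claim can be checked with quick arithmetic on the quoted speeds and horsepower figures, using the standard hp-torque-rpm relation:

```python
# Arithmetic check of the constant-power speed ratio and the implied
# torque roll-off across the three test speeds.
base_rpm, top_rpm = 400, 2424
print(round(top_rpm / base_rpm, 2))   # → 6.06, i.e. a CPSR of about six

hp = [7.5, 9.3, 8.12]
rpm = [400, 1215, 2424]
# Power stays near the 7.5 hp base rating while torque falls with speed:
for p, n in zip(hp, rpm):
    torque_lbft = p * 5252 / n        # hp = torque(lb-ft) * rpm / 5252
    print(f"{n} rpm: {torque_lbft:.1f} lb-ft")
```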

  1. Workgroup for Hydraulic laboratory Testing and Verification of Hydroacoustic Instrumentation (United States)

    Fulford, Janice M.; Armstrong, Brandy N.; Thibodeaux, Kirk G.


    An international workgroup was recently formed for hydraulic laboratory testing and verification of hydroacoustic instrumentation used for water velocity measurements. The activities of the workgroup have included one face-to-face meeting, conference calls, and an inter-laboratory exchange of two acoustic meters among participating laboratories. Good agreement was found among four laboratories at higher tow speeds and poorer agreement at the lowest tow speed.


    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  3. The concept verification testing of materials science payloads (United States)

    Griner, C. S.; Johnston, M. H.; Whitaker, A.


    The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle payload projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other-agency experiment and support system concepts that may be used in the Shuttle. A dedicated materials science payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test, designated CVT Test IVA, was also held. The purpose of this test was to repeat Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.

  4. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)


    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  5. Standard practices for verification of speed for material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 These practices cover procedures and requirements for the calibration and verification of testing machine speed by means of standard calibration devices. This practice is not intended to be complete purchase specifications for testing machines. 1.2 These practices apply to the verification of the speed application and measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, setting, etc. In all cases the buyer/owner/user must designate the speed-measuring system(s) to be verified. 1.3 These practices give guidance, recommendations, and examples, specific to electro-mechanical testing machines. The practice may also be used to verify actuator speed for hydraulic testing machines. 1.4 This standard cannot be used to verify cycle counting or frequency related to cyclic fatigue testing applications. 1.5 Since conversion factors are not required in this practice, either SI units (mm/min), or English [in/min], can be used as the standa...

  6. NASA's Evolutionary Xenon Thruster (NEXT) Component Verification Testing (United States)

    Herman, Daniel A.; Pinero, Luis R.; Sovey, James S.


    Component testing is a critical facet of the comprehensive thruster life validation strategy devised by NASA's Evolutionary Xenon Thruster (NEXT) program. Component testing to date has consisted of long-duration high-voltage propellant isolator and high-cycle heater life validation testing. The high-voltage propellant isolator, a heritage design, will be operated under different environmental conditions in the NEXT ion thruster, requiring verification testing. The life test of two NEXT isolators was initiated with comparable voltage and pressure conditions but a higher temperature than measured for the NEXT prototype-model thruster. To date the NEXT isolators have accumulated 18,300 h of operation. Measurements indicate a negligible increase in leakage current over the testing duration to date. NEXT 1/2 in. heaters, whose manufacturing and control processes have heritage, were selected for verification testing based upon the change in physical dimensions, resulting in a higher operating voltage, as well as potential differences in thermal environment. The heater fabrication processes, developed for the International Space Station (ISS) plasma contactor hollow cathode assembly, were utilized with modification of heater dimensions to accommodate a larger cathode. Cyclic testing of five 1/2 in. diameter heaters was initiated to validate these modified fabrication processes while retaining high-reliability heaters. To date two of the heaters have been cycled to 10,000 cycles and suspended to preserve hardware. Three of the heaters have been cycled to failure, giving a B10 life of 12,615 cycles, approximately 6,000 more cycles than the established qualification B10 life of the ISS plasma contactor heaters.
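    A B10 life is the cycle count by which 10% of units are expected to fail. A minimal sketch of reading it off a Weibull life model; the shape and scale values below are invented, not fitted to the NEXT heater data, and the quoted 12,615-cycle figure comes from the program's own analysis.

```python
import math

# B10 life from a Weibull(shape, scale) model: the cycle count N with
# F(N) = 1 - exp(-(N/scale)^shape) = 0.10, i.e.
#   B10 = scale * (-ln(0.9))^(1/shape)

def weibull_b10(shape, scale):
    """Cycles at 10% cumulative failure probability."""
    return scale * (-math.log(0.9)) ** (1.0 / shape)

# Hypothetical parameters for illustration only:
print(round(weibull_b10(shape=3.0, scale=30000)))
```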

  7. DVT presentations to an emergency department: a study of guideline based care and decision making

    LENUS (Irish Health Repository)

    Lillis, D


    Pre-test probability scoring and blood tests for deep venous thrombosis (DVT) assessment are sensitive but not specific, leading to increased demands on radiology services. Three hundred and eighty-five patients presenting to an Emergency Department (ED) with suspected DVT were studied to explore the actual work-up of patients with possible DVT in terms of risk stratification, further investigation, and follow-up. Of the 205 patients with an initially negative scan, 36 (17.6%) were brought for review to the ED consultant clinic, and 34 (16.6%) underwent repeat compression ultrasound, with 5 (2.4%) demonstrating a DVT on the second scan. This is essentially the same diagnostic yield as in other, larger studies in which 100% of such patients had repeat scanning. Where there is ongoing concern, repeat above-knee compression ultrasound within one week will pick up a small number of deep venous thromboses.

  8. Verification Challenges of Dynamic Testing of Space Flight Hardware (United States)

    Winnitoy, Susan


    The Six Degree-of-Freedom Dynamic Test System (SDTS) is a test facility at the National Aeronautics and Space Administration (NASA) Johnson Space Center in Houston, Texas for performing dynamic verification of space structures and hardware. Some examples of past and current tests include the verification of on-orbit robotic inspection systems, space vehicle assembly procedures and docking/berthing systems. The facility is able to integrate a dynamic simulation of on-orbit spacecraft mating or demating using flight-like mechanical interface hardware. A force moment sensor is utilized for input to the simulation during the contact phase, thus simulating the contact dynamics. While the verification of flight hardware presents many unique challenges, one particular area of interest is with respect to the use of external measurement systems to ensure accurate feedback of dynamic contact. There are many commercial off-the-shelf (COTS) measurement systems available on the market, and the test facility measurement systems have evolved over time to include two separate COTS systems. The first system incorporates infra-red sensing cameras, while the second system employs a laser interferometer to determine position and orientation data. The specific technical challenges with the measurement systems in a large dynamic environment include changing thermal and humidity levels, operational area and measurement volume, dynamic tracking, and data synchronization. The facility is located in an expansive high-bay area that is occasionally exposed to outside temperature when large retractable doors at each end of the building are opened. The laser interferometer system, in particular, is vulnerable to the environmental changes in the building. The operational area of the test facility itself is sizeable, ranging from seven meters wide and five meters deep to as much as seven meters high. Both facility measurement systems have desirable measurement volumes and the accuracies vary

  9. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

    Yinghua (David) Guo


    The growth in the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e., capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification, and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e., what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection, and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  10. Flight testing vehicles for verification and validation of hypersonics technology (United States)

    Sacher, Peter W.


    Hypersonics technology has received renewed interest since various concepts for future fully reusable Space Transportation Systems (STS) using airbreathing propulsion for parts of the atmospheric flight have been proposed in different countries (e.g., US, CIS, Japan, France, Germany, and UK). To cover major developments in those countries, AGARD FDP formed Working Group 18 on 'Hypersonic Experimental and Computational Capabilities - Improvement and Validation'. Of major importance for the proof of feasibility of all these concepts is the definition of an overall convincing philosophy for a 'hypersonics technology development and verification concept' using ground simulation facilities (both experimental and numerical) and flight testing vehicles. Flying at hypersonic Mach numbers using airbreathing propulsion requires highly sophisticated design tools to provide reliable prediction of thrust minus aerodynamic drag to accelerate the vehicle during ascent. Using these design tools, existing uncertainties have to be minimized by a carefully performed code validation process. To a large degree the database required for this validation cannot be obtained on the ground. In addition, thermal loads due to hypersonic flow have to be predicted accurately by aerothermodynamic flow codes to provide the inputs needed to decide on materials and structures. Heat management for hypersonic flight vehicles is one of the key issues for any kind of successful flight demonstration. This paper identifies and discusses the role of flight testing during the verification and validation process of advanced hypersonic technology needed for flight in the atmosphere at hypersonic Mach numbers using airbreathing propulsion systems, both for weapons and for space transportation systems.

  11. Battery Technology Life Verification Test Manual Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Jon P. Christophersen


    The purpose of this Technology Life Verification Test (TLVT) Manual is to help guide developers in their effort to successfully commercialize advanced energy storage devices such as battery and ultracapacitor technologies. The experimental design and data analysis discussed herein are focused on automotive applications based on the United States Advanced Battery Consortium (USABC) electric vehicle, hybrid electric vehicle, and plug-in hybrid electric vehicle (EV, HEV, and PHEV, respectively) performance targets. However, the methodology can be equally applied to other applications as well. This manual supersedes the February 2005 version of the TLVT Manual (Reference 1). It includes criteria for statistically-based life test matrix designs as well as requirements for test data analysis and reporting. Calendar life modeling and estimation techniques, including a user's guide to the corresponding software tool, are now provided in the Battery Life Estimator (BLE) Manual (Reference 2).

  12. In-Space Engine (ISE-100) Development - Design Verification Test (United States)

    Trinh, Huu P.; Popp, Chris; Bullard, Brad


    In the past decade, NASA has formulated science mission concepts in anticipation of landing spacecraft on the lunar surface, meteoroids, and other planets. Advancing thruster technology for spacecraft propulsion systems has been considered for maximizing science payload. Starting in 2010, development of the In-Space Engine (designated ISE-100) has been carried out. The ISE-100 thruster design is based on heritage Missile Defense Agency (MDA) technology and is aimed at a system that is lightweight and efficient in terms of volume and packaging. It runs on a hypergolic bi-propellant combination: MON-25 (nitrogen tetroxide, N2O4, with 25% nitric oxide, NO) and MMH (monomethylhydrazine, CH6N2) for NASA spacecraft applications. This propellant combination provides a propulsion system capable of operating over a wide range of temperatures, from 50 C (122 F) down to -30 C (-22 F), to drastically reduce heater power. The thruster is designed to deliver 100 lb(sub f) of thrust with the capability of pulse-mode operation for a wide range of mission duty cycles (MDCs). Two thrusters were fabricated. As part of the engine development, this test campaign is dedicated to the design verification of the thruster. This presentation reports the efforts of the design verification hot-fire test program of the ISE-100 thruster, a collaboration between NASA Marshall Space Flight Center (MSFC) and Aerojet Rocketdyne (AR) test teams. The hot-fire tests were conducted at the Advance Mobile Propulsion Test (AMPT) facility in Durango, Colorado, from May 13 to June 10, 2016. This presentation also provides a summary of key points from the test results.

  13. Active Thermal Control Experiments for LISA Ground Verification Testing (United States)

    Higuchi, Sei; DeBra, Daniel B.


    The primary mission goal of LISA is detecting gravitational waves. LISA uses laser metrology to measure the distance between proof masses in three identical spacecraft. The total acceleration disturbance to each proof mass is required to be below 3 × 10^-15 m/s^2/√Hz. Optical path length variations on each optical bench must be kept below 40 pm/√Hz from 1 Hz down to 0.1 mHz. Thermal variations due to, for example, solar radiation or temperature gradients across the proof mass housing will distort the spacecraft, causing changes in the mass attraction and sensor location. We have developed a thermal control system for LISA gravitational reference sensor (GRS) ground verification testing which provides thermal stability better than 1 mK/√Hz, as well as active thermal control for the LISA spacecraft to compensate for solar irradiation. A thermally stable environment is essential for LISA performance verification. In a lab environment, specifications can be met with a considerable amount of insulation and thermal mass. For spacecraft, the very limited thermal mass calls for an active control system that can meet disturbance rejection and stability requirements simultaneously in the presence of long time delay. A simple proportional-plus-integral control law presently provides approximately 1 mK/√Hz of thermal stability for over 80 hours. Continuing development of a model-predictive feed-forward algorithm will extend performance below 1 mK/√Hz at f < 1 mHz and lower.
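    The proportional-plus-integral law mentioned in the abstract can be illustrated with a minimal discrete-time simulation. The plant model, gains, and temperatures below are invented for the sketch and are not LISA values:

```python
# Discrete-time PI temperature control of a first-order thermal plant.
# All plant constants and gains here are illustrative, not LISA values.

def simulate_pi(setpoint=20.0, t_ambient=15.0, steps=5000, dt=1.0,
                kp=2.0, ki=0.02, tau=200.0, gain=0.05):
    temp = t_ambient              # plant starts at ambient temperature
    integral = 0.0
    for _ in range(steps):
        error = setpoint - temp
        integral += error * dt
        heater = kp * error + ki * integral   # PI control law
        heater = max(0.0, heater)             # heater can only add heat
        # first-order plant: relaxation toward ambient plus heater input
        temp += dt * ((t_ambient - temp) / tau + gain * heater)
    return temp

final = simulate_pi()
print(abs(final - 20.0) < 0.01)
```

    The integral term is what removes the steady-state error that a purely proportional law would leave against the constant heat leak to ambient; the feed-forward extension mentioned in the abstract would add a predicted disturbance term to the heater command.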

  14. Verification Testing: Meet User Needs Figure of Merit (United States)

    Kelly, Bryan W.; Welch, Bryan W.


    Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real-world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as how accurate and reliable test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the data necessary to make an informed decision was not available, and the result was the complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed its recommendations, requirements, and rationale for the different components of VV&C. The intention was that recipients of M&S software would clearly understand, and have data from, the past development effort. This in turn would allow people who had not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform Verification on several MATLAB (The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts. This is intended to prevent the code from attempting to make incorrect or impossible

  15. Software Testing and Verification in Climate Model Development (United States)

    Clune, Thomas L.; Rood, RIchard B.


    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
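    A minimal example of the fine-grained "unit" testing the abstract advocates, applied to a small piece of scientific code. The routine below (a Tetens-style saturation vapor pressure formula, chosen here as a stand-in for a climate model subroutine) is checked against a literature reference value and a basic physical property, with no full-model run required:

```python
import math

def saturation_vapor_pressure(t_celsius: float) -> float:
    """Tetens approximation for saturation vapor pressure over water, in Pa."""
    return 610.78 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

# Fine-grained unit tests: a known reference point and a physical
# invariant (monotonicity in temperature).
def test_reference_point():
    # literature value is ~2339 Pa at 20 degrees C; allow 2% tolerance
    assert abs(saturation_vapor_pressure(20.0) - 2339.0) / 2339.0 < 0.02

def test_monotonic_in_temperature():
    temps = range(-20, 41, 5)
    values = [saturation_vapor_pressure(t) for t in temps]
    assert all(a < b for a, b in zip(values, values[1:]))

test_reference_point()
test_monotonic_in_temperature()
print("all unit tests passed")
```

    Tests of this granularity run in milliseconds, so a defect in one routine is isolated immediately instead of surfacing as a subtle drift in a multi-week simulation.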

  16. Testing Equation Method Modification for Demanding Energy Measurements Verification

    Directory of Open Access Journals (Sweden)

    Elena Kochneva


    Full Text Available The paper is devoted to mathematical approaches for the verification of measurements received from Automatic Meter Reading systems. Reliability of metering data can be improved by application of a new formulation named the Energy Flow Problem. The paper considers a demanding energy measurements verification method based on the analysis of groups of verification expressions. Bad data detection and calculation of estimate accuracy are presented using Automatic Meter Reading system data from a fragment of the Russian power system.
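    The "verification expressions" idea can be sketched as a balance check: metered inflows and outflows at a network node should sum to approximately zero (losses ignored here), and a large residual flags suspect meter data. The node names, readings, and tolerance below are hypothetical, not from the paper:

```python
# Bad-data detection via a balance "testing equation": at each network
# node, metered inflows minus metered outflows should be near zero.
# Node names, readings, and tolerance are hypothetical.

def balance_residual(inflows, outflows):
    return sum(inflows) - sum(outflows)

def check_node(name, inflows, outflows, tol=0.05):
    """Return (name, residual, ok); ok means the verification
    expression holds within tolerance (same units, e.g. MWh)."""
    r = balance_residual(inflows, outflows)
    return name, r, abs(r) <= tol

nodes = {
    "substation_A": ([120.0], [55.0, 40.0, 24.98]),  # residual ~0.02: ok
    "substation_B": ([80.0],  [30.0, 30.0, 15.0]),   # residual 5.0: flagged
}
for name, (inflows, outflows) in nodes.items():
    print(check_node(name, inflows, outflows))
```

    Grouping several such expressions per meter, as the paper proposes, lets the inconsistent measurement be identified rather than merely detected.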

  17. Computer Science and Technology: Validation, Verification, and Testing for the Individual Programmer. (United States)

    Branstad, Martha A.; And Others

    Guidelines are given for program testing and verification to ensure quality software for the programmer working alone in a computing environment with limited resources. The emphasis is on verification as an integral part of the software development. Guidance includes developing and planning testing as well as the application of other verification…

  18. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)


    This is the '94 annual report of the CANDU advanced fuel verification test project. This report describes the out-of-pile hydraulic tests at the CANDU hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows: (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) pressure drop tests at reactor operating conditions; (b) strength test during reload at static load conditions; (c) impact test during reload at impact load conditions; (d) endurance test for verification of fuel integrity during lifetime. (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) critical channel power sensitivity analysis; (b) CANDU-6 channel flow analysis; (c) flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author).

  19. A standardized framework for the validation and verification of clinical molecular genetic tests.

    NARCIS (Netherlands)

    Mattocks, C.J.; Morris, M.A.; Matthijs, G.; Swinnen, E.; Corveleyn, A.; Dequeker, E.; Muller, C.R.; Pratt, V.; Wallace, A.


    The validation and verification of laboratory methods and procedures before their use in clinical testing is essential for providing a safe and useful service to clinicians and patients. This paper outlines the principles of validation and verification in the context of clinical human molecular genetic testing.

  20. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory


    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  1. A standardized framework for the validation and verification of clinical molecular genetic tests. (United States)

    Mattocks, Christopher J; Morris, Michael A; Matthijs, Gert; Swinnen, Elfriede; Corveleyn, Anniek; Dequeker, Els; Müller, Clemens R; Pratt, Victoria; Wallace, Andrew


    The validation and verification of laboratory methods and procedures before their use in clinical testing is essential for providing a safe and useful service to clinicians and patients. This paper outlines the principles of validation and verification in the context of clinical human molecular genetic testing. We describe implementation processes, types of tests and their key validation components, and suggest some relevant statistical approaches that can be used by individual laboratories to ensure that tests are conducted to defined standards.

  2. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  3. The JPSS Ground Project Algorithm Verification, Test and Evaluation System (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.


    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geolocated Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning, instrument and product calibration, data quality support and monitoring, and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  4. A robust method using propensity score stratification for correcting verification bias for binary tests. (United States)

    He, Hua; McDermott, Michael P


    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified.
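    A simplified numeric sketch of stratified verification-bias correction, in the spirit of the Begg and Greenes approach the propensity-score method generalizes. Here discrete strata stand in for the homogeneous propensity-score subsamples the paper constructs, and all counts are synthetic:

```python
# Verification-bias-corrected sensitivity via stratified estimation,
# with discrete strata standing in for propensity-score strata.
# All counts below are synthetic.
# (test_result, stratum) -> (n_total, n_verified, n_diseased_among_verified)
counts = {
    (1, 0): (100, 80, 60),
    (1, 1): (100, 40, 20),
    (0, 0): (100, 20, 5),
    (0, 1): (100, 10, 1),
}

def corrected_sensitivity(counts):
    # Under MAR within (test, stratum), P(D | t, s) is estimable from
    # the verified subset; weight by the *full-sample* cell sizes.
    diseased_pos = sum(n * d / v for (t, s), (n, v, d) in counts.items() if t == 1)
    diseased_all = sum(n * d / v for (t, s), (n, v, d) in counts.items())
    return diseased_pos / diseased_all

def naive_sensitivity(counts):
    # Ignores the verification mechanism: uses verified subjects only.
    dis_pos = sum(d for (t, s), (n, v, d) in counts.items() if t == 1)
    dis_all = sum(d for (t, s), (n, v, d) in counts.items())
    return dis_pos / dis_all

print(corrected_sensitivity(counts))  # 0.78125
print(naive_sensitivity(counts))      # ~0.93, biased upward
```

    Because verification here is much more likely after a positive test, the naive estimate overstates sensitivity; the stratified estimator recovers the lower, corrected value.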

  5. Exploring the Causes of Distal Volcano-Tectonic (dVT) Seismicity Using Hydrothermal Modeling (United States)

    Ingebritsen, S.; Coulon, C.; Hsieh, P. A.; White, R. A.; Lowenstern, J. B.


    Based on observations of 111 volcanic eruptions, White and McCausland (JVGR, 2016) found that distal volcano-tectonic (dVT) seismicity typically preceded eruptions at long-dormant volcanoes by days to years. They hypothesized that precursory dVT seismicity reflects magma-induced fluid-pressure pulses that intersect critically stressed faults. We explored this idea using the USGS HYDROTHERM model, an open-source magmatic-hydrothermal code that simulates multiphase fluid and heat transport over the temperature range 0 to 1200 °C. We examined fluid pressure changes caused by a small (0.04 km^3) intrusion into host rock, and explored the effects of coordinate systems (Cartesian vs. radial), magma devolatilization rates (0-15 kg/s), and intrusion depths (5 and 7.5 km, above and below the brittle-ductile transition). Magma and host-rock permeabilities were key controlling parameters, and we tested a wide range of permeabilities (k) and permeability anisotropies (k_h/k_v), including constant k, k(depth), k(T), and k(depth,T,P) distributions. We examined a total of 1500 realizations to explore the parameter space. Propagation of pressure changes (∆P ≥ 0.1 bars) to the mean dVT location (6 km lateral distance, 6 km depth) was favored by focused fluid flow (i.e., Cartesian geometries), high devolatilization rates, and permeabilities similar to those found in geothermal reservoirs (k ~ 10^-16 to 10^-13 m^2). In Cartesian coordinates we found cases of ∆P ≥ 0.1 bars for every permeability in the range 10^-16 to 10^-13 m^2, whereas in radial coordinates with no devolatilization, ∆P < 0.1 bars occurred for all permeabilities. Changes in distal fluid pressure occurred before proximal changes given modest anisotropies (k_h/k_v ~ 10-100) typical of layered volcanic rocks. Invoking k(depth,T,P) and high, sustained devolatilization rates caused large dynamic fluctuations in k and P in the near-magma environment but had little effect on pressure changes at the distal dVT location. Intrusion below

  6. Developing Reading and Listening Comprehension Tests Based on the Sentence Verification Technique (SVT). (United States)

    Royer, James M.


    Describes a team-based approach for creating Sentence Verification Technique (SVT) tests, a development procedure that allows teachers and other school personnel to develop comprehension tests from curriculum materials in use in their schools. Finds that if tests are based on materials that are appropriate for the population to be tested, the…

  7. Utilization of DVT Prophylaxis in non ICU Hospitalized Patients

    Directory of Open Access Journals (Sweden)

    Sukhendu Shekhar Bhowmik


    Full Text Available The aim of this study was to assess all aspects of the routine clinical practice of DVT prophylaxis in non-ICU hospitalized (both medical and surgical) patients in various wards of a multispecialty tertiary care hospital in Kolkata, India. All patients admitted to the general wards of AMRI hospital were screened against inclusion and exclusion criteria. Patients meeting the inclusion criteria were assessed for risk factors and appropriateness of DVT prophylaxis on the third day of hospitalization, from August 2009 to April 2010. Thromboprophylaxis practices were further assessed against the recommendations given by the American College of Chest Physicians (ACCP) in the 8th ACCP Conference on Antithrombotic and Thrombolytic Therapy (June 2008). A total of 1938 patients were enrolled, of which 267 (13.78%) were excluded for not meeting the inclusion criteria and 1671 (86.22%) were included. Of the included patients, 331 (19.8%) received some form of prophylaxis, while the majority (80.2%) received none. Appropriateness of the prophylaxis practices was suboptimal: 81.57% of prophylaxis given was appropriate, while 18.43% of patients experienced inappropriate prophylaxis practices. Mechanical prophylaxis was used predominantly, with GCS used more than IPC. In pharmacological prophylaxis, LMWH was used more than UFH and appears to be the prophylaxis of choice. In spite of multiple guidelines on risk factor assessment for venous thromboembolism (VTE), utilization of deep venous thrombosis (DVT) prophylaxis remains less than satisfactory in non-ICU hospitalized patients.

  8. Towards a Theory for Integration of Mathematical Verification and Empirical Testing (United States)

    Lowry, Michael; Boyd, Mark; Kulkarni, Deepak


    From the viewpoint of a project manager responsible for the V&V (verification and validation) of a software system, mathematical verification techniques provide a possibly useful orthogonal dimension to otherwise standard empirical testing. However, the value they add to an empirical testing regime both in terms of coverage and in fault detection has been difficult to quantify. Furthermore, potential cost savings from replacing testing with mathematical verification techniques cannot be realized until the tradeoffs and synergies can be formulated. Integration of formal verification with empirical testing is also difficult because the idealized view of mathematical verification providing a correctness proof with total coverage is unrealistic and does not reflect the limitations imposed by computational complexity of mathematical techniques. This paper first describes a framework based on software reliability and formalized fault models for a theory of software design fault detection - and hence the utility of various tools for debugging. It then describes a utility model for integrating mathematical and empirical techniques with respect to fault detection and coverage analysis. It then considers the optimal combination of black-box testing, white-box (structural) testing, and formal methods in V&V of a software system. Using case studies from NASA software systems, it then demonstrates how this utility model can be used in practice.

  9. How to deal with double partial verification when evaluating two index tests in relation to a reference test?

    NARCIS (Netherlands)

    van Geloven, Nan; Brooze, Kimiko A.; Opmeer, Brent C.; Mol, Ben Willem; Zwinderman, Aeilko H.


    Research into the diagnostic accuracy of clinical tests is often hampered by single or double partial verification mechanisms, that is, not all patients have their disease status verified by a reference test, neither do all patients receive all tests under evaluation (index tests). We show methods

  10. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman


    The modeling of Amchitka underground nuclear tests conducted in 2002 is verified and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure of the subsurface, and bathymetric surveys to determine the bathymetric maps of the areas offshore from the Long Shot and Cannikin Sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output verification. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. 
Comparisons between new data and the original model, and conditioning on all available data using MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
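    The Metropolis algorithm underlying the Bayesian MCMC conditioning described above can be sketched on a toy problem: sampling the posterior of a single Gaussian mean given a handful of observations. This is illustrative only; the Amchitka work conditions multi-parameter groundwater models on field data, and all numbers below are invented:

```python
import math
import random

# Minimal Metropolis sampler: posterior of one model parameter
# (a Gaussian mean) conditioned on synthetic data.
random.seed(1)
data = [4.9, 5.2, 5.1, 4.8, 5.0]   # synthetic observations
sigma = 1.0                        # known observation noise

def log_posterior(mu):
    lp = -mu**2 / (2 * 10.0**2)    # broad N(0, 10^2) prior
    lp += sum(-(x - mu) ** 2 / (2 * sigma**2) for x in data)
    return lp

mu, samples = 0.0, []
for i in range(20000):
    proposal = mu + random.gauss(0.0, 0.5)
    # accept with probability min(1, posterior ratio)
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal
    if i >= 5000:                  # discard burn-in
        samples.append(mu)

posterior_mean = sum(samples) / len(samples)
print(abs(posterior_mean - 5.0) < 0.3)
```

    The chain's stationary distribution is the posterior, so its samples carry uncertainty backward from the data onto the parameter; pushing those samples through a forward model then propagates it into predictions, as the abstract describes.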

  11. Environmental Technology Verification--Baghouse Filtration Products: GE Energy QG061 Filtration Media (Tested September 2008) (United States)

    This report reviews the filtration and pressure drop performance of GE Energy's QG061 filtration media. Environmental Technology Verification (ETV) testing of this technology/product was conducted during a series of tests in September 2008. The objective of the ETV Program is to ...

  12. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  13. 78 FR 33132 - Quality Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test... (United States)


    ... COMMISSION Quality Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test... Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test Reactors.'' This guide... plate-type uranium-aluminum fuel elements used in research and test reactors (RTRs). ADDRESSES: Please...

  14. Comparison of intensities and rest periods for VO2max verification testing procedures. (United States)

    Nolan, P B; Beaven, M L; Dalleck, L


    We sought to determine the incidence of 'true' VO2max confirmation with the verification procedure across different protocols. 12 active participants (men n=6, women n=6) performed, in random order, 4 different maximal graded exercise tests (GXT) and verification bout protocols on 4 separate days. Conditions for the rest period and verification bout intensity were: A, 105% intensity, 20 min rest; B, 105% intensity, 60 min rest; C, 115% intensity, 20 min rest; D, 115% intensity, 60 min rest. VO2max confirmation (difference between peak VO2 on the GXT and the verification trial … VO2max confirmation across all exercise test conditions (intensity effect within 20 min recovery: χ²(1) = 4.800, p … VO2max confirmation with different rest periods. We recommend the use of 105% of the maximal GXT workload and 20 min rest periods when using verification trials to confirm VO2max in normally active populations. © Georg Thieme Verlag KG Stuttgart · New York.

  15. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols (United States)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio


    This paper covers the verification and protocol validation of distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation, for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, including the formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  16. Verification test calculations for the Source Term Code Package

    Energy Technology Data Exchange (ETDEWEB)

    Denning, R S; Wooton, R O; Alexander, C A; Curtis, L A; Cybulskis, P; Gieseke, J A; Jordan, H; Lee, K W; Nicolosi, S L


    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs.

  17. Dynamic Isotope Power System: technology verification phase. Test plan. 79-KIPS-6

    Energy Technology Data Exchange (ETDEWEB)

    Mohr, G.D.


The objective of this document is to outline the test plan for the KIPS Technology Verification Program. The test plan covers component-simulating (rig) testing, component testing, and system testing. Rig testing will prove concept feasibility, measure basic performance, and develop the hardware necessary before GDS component part manufacture is initiated. Component testing will measure basic performance and verify component integrity prior to GDS assembly. The GDS system testing will: simulate the flight system operation; determine the life-limiting components; measure performance and relate it to potential system lifetime; demonstrate 18+% DC generating efficiency; and perform a 5000 h endurance test with final configuration hardware.

  18. Effect of verification bias on the sensitivity of fecal occult blood testing: a meta-analysis. (United States)

    Rosman, Alan S; Korsten, Mark A


There is controversy regarding the sensitivity of fecal occult blood tests (FOBT) for detecting colorectal cancer. Many of the published studies failed to correct for verification bias, which may have inflated the sensitivity. A meta-analysis of published studies evaluating the sensitivity and specificity of chemical-based FOBT for colorectal cancer was performed. Studies were included if both cancer and control subjects underwent confirmatory testing. We also included studies that attempted to correct for verification bias by either performing colonoscopy on all subjects regardless of the FOBT result or by using longitudinal follow-up. We then compared the sensitivity, specificity, and other diagnostic characteristics of the studies that attempted to correct for verification bias (n=10) vs. those that did not correct for this bias (n=19). The pooled sensitivity of guaiac-based FOBT for colorectal cancer in studies without verification bias was significantly lower than in studies with this bias [0.36 (95% CI 0.25-0.47) vs. 0.70 (95% CI 0.60-0.80), p=0.001]. The pooled specificity of the studies without verification bias was higher [0.96 (95% CI 0.94-0.97) vs. 0.88 (95% CI 0.84-0.91), p<0.005]. There was no significant difference in the area under the summary receiver operating characteristic curves. More sensitive chemical-based FOBT methods (e.g., Hemoccult® SENSA®) had a higher sensitivity but a lower specificity than standard guaiac methods. The sensitivity of guaiac-based FOBT for colorectal cancer has been overestimated as a result of verification bias. This test may not be sensitive enough to serve as an effective screening option for colorectal cancer.
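The inflating effect of verification bias can be made concrete with a toy calculation: when test-positives are referred for colonoscopy more often than test-negatives, false negatives are under-counted in the verified 2x2 table. A minimal sketch, with illustrative counts chosen only to echo the pooled estimates above (not the paper's data):

```python
def observed_sensitivity(tp, fn, p_verify_pos, p_verify_neg):
    """Apparent sensitivity when only verified subjects enter the 2x2
    table. tp, fn: true-positive and false-negative counts in the full
    cohort; p_verify_pos / p_verify_neg: probability that a test-positive
    or test-negative subject is referred for the gold standard."""
    return (tp * p_verify_pos) / (tp * p_verify_pos + fn * p_verify_neg)

# Complete verification recovers the true sensitivity; preferential
# verification of positives inflates it.
true_se = observed_sensitivity(36, 64, 1.0, 1.0)
biased_se = observed_sensitivity(36, 64, 0.9, 0.2)
```

With 36 true positives and 64 false negatives (true sensitivity 0.36), verifying 90% of positives but only 20% of negatives yields an apparent sensitivity of about 0.72, the same direction and rough size of inflation as the pooled estimates above.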

  19. Verification Testing to Confirm VO2max in Altitude-Residing, Endurance-Trained Runners. (United States)

    Weatherwax, R M; Richardson, T B; Beltz, N M; Nolan, P B; Dalleck, L


    We sought to explore the utility of the verification trial to confirm individual attainment of 'true' VO2max in altitude-residing, endurance-trained runners during treadmill exercise. 24 elite endurance-trained men and women runners (age=21.5±3.3 yr, ht=174.8±9.3 cm, body mass=60.5±6.7 kg, PR 800 m 127.5±13.1 s) completed a graded exercise test (GXT) trial (VO2max=60.0±5.8 mL·kg(-1)·min(-1)), and returned 20 min after incremental exercise to complete a verification trial (VO2max=59.6±5.7 mL·kg(-1)·min(-1)) of constant load, supramaximal exercise. The incidence of 'true' VO2max confirmation using the verification trial was 24/24 (100%) with all participants revealing differences in VO2max≤3% (the technical error of our equipment) between the GXT and verification trials. These findings support use of the verification trial to confirm VO2max attainment in altitude-residing, endurance-trained runners. © Georg Thieme Verlag KG Stuttgart · New York.
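The agreement criterion above is simple to apply; a sketch of the confirmation check follows, noting that the 3% threshold is the technical error quoted for this study's equipment, not a universal value:

```python
def vo2max_confirmed(gxt_vo2, verif_vo2, tech_error_pct=3.0):
    """True when the verification-trial value agrees with the GXT value
    within the equipment's technical error (percent). Inputs in
    mL/kg/min."""
    return abs(verif_vo2 - gxt_vo2) / gxt_vo2 * 100.0 <= tech_error_pct

# The group means above (60.0 vs. 59.6) differ by well under 3%.
```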

  20. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen... (United States)


    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Contamination with Microorganisms... § 381.94 Contamination with Microorganisms; process control verification criteria and testing; pathogen... maintaining process controls sufficient to prevent fecal contamination. FSIS shall take further action as...

  1. Analysis, Test and Verification in The Presence of Variability (Dagstuhl Seminar 13091)

    DEFF Research Database (Denmark)


    This report documents the program and the outcomes of Dagstuhl Seminar 13091 “Analysis, Test and Verification in The Presence of Variability”. The seminar had the goal of consolidating and stimulating research on analysis of software models with variability, enabling the design of variability-awa...

  2. 40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements. (United States)


    ... ENGINES (CONTINUED) General Compliance Provisions for Control of Air Pollution From New and In-Use Light... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Manufacturer in-use verification testing requirements. 86.1845-01 Section 86.1845-01 Protection of Environment ENVIRONMENTAL PROTECTION...

  3. 40 CFR 86.1845-04 - Manufacturer in-use verification testing requirements. (United States)


    ... ENGINES (CONTINUED) General Compliance Provisions for Control of Air Pollution From New and In-Use Light... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Manufacturer in-use verification testing requirements. 86.1845-04 Section 86.1845-04 Protection of Environment ENVIRONMENTAL PROTECTION...

  4. Development and verification of a reciprocating test rig designed for investigation of piston ring tribology

    DEFF Research Database (Denmark)

    Pedersen, Michael Torben; Imran, Tajammal; Klit, Peder


    This paper describes the development and verification of a reciprocating test rig, which was designed to study the piston ring tribology. A crank mechanism is used to generate a reciprocating motion for a moving plate, which acts as the liner. A stationary block acting as the ring package is loaded...... against the plate using dead-weights. The block has two holders for test specimens, which form line contacts with the plate. A force transducer is used to measure the frictional force between the block and the plate. During verification of the test rig unwanted ripples on the signal recorded from...... the force transducer were discovered. An identification process is undertaken in order to find the source of this disturbance and to reduce the effect as much as possible. Second a reproducibility test is conducted to check the reliability of the test rig. The outcome of this work is a verified test rig...

  6. HDTS 2017.1 Testing and Verification Document

    Energy Technology Data Exchange (ETDEWEB)

Whiteside, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]


    This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents including (Foley and Powell, 2010; Dixon, 2012; Whiteside, 2017b). In this report we have created a suite of automated test cases and a system to analyze the results of those tests as well as documented the methodology to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect.

  7. RELAP5-3D Restart and Backup Verification Testing

    Energy Technology Data Exchange (ETDEWEB)

    Mesina, George L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)


Existing testing methodology for RELAP5-3D employs a set of test cases collected over two decades to test a variety of code features and run on a Linux or Windows platform. However, this set has numerous deficiencies in terms of code coverage, detail of comparison, running time, and testing fidelity of RELAP5-3D restart and backup capabilities. The test suite covers less than three quarters of the lines of code in the relap directory and just over half of those in the environmental library. Even in terms of code features, many are not covered. Moreover, the test set runs many problems long past the point necessary to test the relevant features. It requires standard problems to run to completion. This is unnecessary for features that can be tested in a short-running problem. For example, many trips and controls can be tested in the first few time steps, as can a number of fluid flow options. The testing system is also inaccurate. For the past decade, the diffem script has been the primary tool for checking that printouts from two different RELAP5-3D executables agree. This tool compares two output files to verify that all characters are the same except for those relating to date, time and a few other excluded items. The variable values printed on the output file are accurate to no more than eight decimal places. Therefore, calculations with errors in decimal places beyond those printed remain undetected. Finally, fidelity of restart is not tested except in the PVM sub-suite, and backup is not specifically tested at all. When a restart is made from any midway point of the base-case transient, the restart must produce the same values. When a backup condition occurs, the code repeats advancements with the same time step. A perfect backup can be tested by forcing RELAP5 to perform a backup by falsely setting a backup condition flag at a user-specified time. Comparison of the calculations of that run and those produced by the same input without the spurious condition should be identical.
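A tolerance-based numeric comparison makes the acceptance criterion explicit instead of requiring character-identical printouts. The following is a sketch of the idea, not the actual diffem script:

```python
import re

# Matches decimal numbers as printed by Fortran codes, including
# D-exponent notation (e.g. 1.2345678D+05).
FLOAT_RE = re.compile(r'[-+]?\d+\.\d+(?:[eEdD][-+]?\d+)?')

def compare_outputs(text_a, text_b, rel_tol=1e-8):
    """Return (index, a, b) tuples for every pair of corresponding
    floating-point values in two printouts that differ by more than
    rel_tol in relative terms."""
    def values(text):
        return [float(tok.lower().replace('d', 'e'))
                for tok in FLOAT_RE.findall(text)]
    mismatches = []
    for i, (a, b) in enumerate(zip(values(text_a), values(text_b))):
        if abs(a - b) / max(abs(a), abs(b), 1e-300) > rel_tol:
            mismatches.append((i, a, b))
    return mismatches
```

A character diff either passes or fails on the printed digits; here the tolerance is an explicit parameter that can be loosened for cross-platform runs or tightened toward machine precision.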

  8. DKIST enclosure modeling and verification during factory assembly and testing (United States)

    Larrakoetxea, Ibon; McBride, William; Marshall, Heather K.; Murga, Gaizka


The enclosure of the Daniel K. Inouye Solar Telescope (DKIST, formerly the Advanced Technology Solar Telescope, ATST) is unique in that, apart from protecting the telescope and its instrumentation from the weather, it holds the entrance aperture stop and is required to position it with millimeter-level accuracy. The compliance of the Enclosure design with the requirements, as of the Final Design Review in January 2012, was supported by mathematical models and other analyses, including structural and mechanical analyses (FEA), control models, ventilation analysis (CFD), thermal models, reliability analysis, etc. During Enclosure Factory Assembly and Testing, compliance with the requirements was verified using the real hardware, and the models created during the design phase were revisited. The tests performed during shutter mechanism subsystem (crawler test stand) functional and endurance testing (completed summer 2013) and two comprehensive system-level factory acceptance testing campaigns (FAT#1 in December 2013 and FAT#2 in March 2014) included functional and performance tests on all mechanisms, off-normal mode tests, mechanism wobble tests, creation of the Enclosure pointing map, control system tests, and vibration tests. The comparison of the assumptions used during the design phase with the properties measured during the test campaign provides an interesting reference for future projects.

  9. HDTS 2017.0 Testing and verification document

    Energy Technology Data Exchange (ETDEWEB)

Whiteside, Tad S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]


This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents including (Foley and Powell, 2010; Dixon, 2012). In this report we have created a suite of automated test cases and a system to analyze the results of those tests as well as documented the methodology to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect. These tests confirm HDTS version 2017.0 performs according to its specifications and documentation and that its performance meets the needs of its users at the Savannah River Site.

  10. Verification of test battery of motoric assumptions for tennis


    Křelina, Vladimír


This thesis focuses on testing the motoric assumptions of junior-category tennis players in certain sport games. The aim of this thesis is to compare the motoric test results of three tennis players of various performance levels in chosen sport games, and thus to define the substantive significance and specificity of each test with respect to tennis. The assumptions in the theoretical part are based on my Bachelor thesis. In said thesis I deal with the characteristics of tennis, the s...

  11. Small-scale fixed wing airplane software verification flight test (United States)

    Miller, Natasha R.

The increased demand for micro Unmanned Air Vehicles (UAVs), driven by military requirements, commercial use, and academia, is creating a need for the ability to quickly and accurately conduct low-Reynolds-number aircraft design. There exist several open source software programs, free or inexpensive, that can be used for large-scale aircraft design, but few software programs target the realm of low-Reynolds-number flight. XFLR5 is an open source, free-to-download software program that attempts to take into consideration the viscous effects that occur at low Reynolds number in airfoil design, 3D wing design, and 3D airplane design. An off-the-shelf, remote-control airplane was used as a test bed, modeled in XFLR5, and then compared to flight test data. Flight testing focused on the stability modes of the 3D plane, specifically the phugoid mode. Design and execution of the flight tests were accomplished for the RC airplane using methodology from full-scale military airplane test procedures. Results from flight test were not conclusive in determining the accuracy of the XFLR5 software program. There were several sources of uncertainty that did not allow for a full analysis of the flight test results. An off-the-shelf drone autopilot was used as a data collection device for flight testing. The precision and accuracy of the autopilot are unknown. Potential future work should investigate flight test methods for small-scale UAV flight.

  12. TQAP for Verification of Qualitative Lead Test Kits (United States)

There are lead-based paint test kits available to help home owners and contractors identify lead-based paint hazards before any Renovation, Repair, and Painting (RRP) activities take place so that proper health and safety measures can be enacted. However, many of these test kits ...

  13. Warm Water Oxidation Verification - Scoping and Stirred Reactor Tests

    Energy Technology Data Exchange (ETDEWEB)

    Braley, Jenifer C.; Sinkov, Sergey I.; Delegard, Calvin H.; Schmidt, Andrew J.


Scoping tests to evaluate the effects of agitation and pH adjustment on simulant sludge agglomeration and uranium metal oxidation at ~95 °C were performed under Test Instructions(a,b) and as per sections 5.1 and 5.2 of this Test Plan prepared by AREVA.(c) The thermal testing occurred during the week of October 4-9, 2010. The results are reported here. For this testing, two uranium-containing simulant sludge types were evaluated: (1) a full uranium-containing K West (KW) container sludge simulant consisting of nine predominant sludge components; (2) a 50:50 uranium-mole-basis mixture of uraninite [U(IV)] and metaschoepite [U(VI)]. This scoping study was conducted in support of the Sludge Treatment Project (STP) Phase 2 technology evaluation for the treatment and packaging of K-Basin sludge. The STP is managed by CH2M Hill Plateau Remediation Company (CHPRC) for the U.S. Department of Energy. Warm water (~95 °C) oxidation of sludge, followed by immobilization, has been proposed by AREVA and is one of the alternative flowsheets being considered to convert uranium metal to UO2 and eliminate H2 generation during final sludge disposition. Preliminary assessments of warm water oxidation have been conducted, and several issues have been identified that can best be evaluated through laboratory testing. The scoping evaluation documented here was specifically focused on the issue of the potential formation of high-strength sludge agglomerates at the proposed 95 °C process operating temperature. Prior hydrothermal tests conducted at 185 °C produced significant physiochemical changes to genuine sludge, including the formation of monolithic concretions/agglomerates that exhibited shear strengths in excess of 100 kPa (Delegard et al. 2007).


    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  15. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders (United States)

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  16. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops (Version 1.4) (United States)

    This generic verification protocol provides a detailed method for conducting and reporting results from verification testing of pesticide application technologies. It can be used to evaluate technologies for their potential to reduce spray drift, hence the term “drift reduction t...

  17. Shaking table test and verification of development of an ...

    Indian Academy of Sciences (India)

... semi-active hydraulic damper (ASHD) is converted to an interaction element (IE) of active interaction control (AIC). Systemic equations of motion, the control law and control rules of this proposed new AIC are studied in this research. A full-scale multiple-degrees-of-freedom shaking table test is conducted to verify the energy dissipation of ...

  18. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)


This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure and increase monotonically with mean stress. Very little modulus degradation was discernible from elastic results even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including that for impact prediction.

  19. Final tests and performances verification of the European ALMA antennas (United States)

    Marchiori, Gianpietro; Rampini, Francesco


The Atacama Large Millimeter Array (ALMA) is under erection in Northern Chile. The array consists of a large number (up to 64) of 12 m diameter antennas and a number of smaller antennas, to be operated on the Chajnantor plateau at 5000 m altitude. The antennas will operate up to 950 GHz, so their mechanical performance requirements, in terms of surface accuracy, pointing precision and dimensional stability, are very tight. The AEM consortium, constituted by Thales Alenia Space France, Thales Alenia Space Italy, European Industrial Engineering (EIE GROUP), and MT Mechatronics, is assembling and testing the 25 antennas. As of today, the first set of antennas has been delivered to ALMA for science. During the test phase with ESO and ALMA, the European antennas have shown excellent performance, meeting the specification requirements with wide margins. The purpose of this paper is to present the different results obtained during the test campaign: surface accuracy, pointing error, fast motion capability and residual delay. Also important were the test phases that led to the validation of the FE model, which showed that the antenna performs with a better margin than predicted at the design level, thanks in part to the assembly and integration techniques.

  20. Verification and application of the Iosipescu shear test method (United States)

    Walrath, D. E.; Adams, D. F.


Finite element models were used to study the effects of notch angle variations on the stress state within an Iosipescu shear test specimen. These analytical results were also studied to determine the feasibility of using strain gage rosettes and a modified extensometer to measure shear strains in this test specimen. Analytical results indicate that notch angle variations produced only small differences in simulated shear properties. Both strain gage rosettes and the modified extensometer were shown to be feasible shear strain transducers for the test method. The Iosipescu shear test fixture was redesigned to incorporate several improvements. These improvements include accommodation of a 50 percent larger specimen for easier measurement of shear strain, a clamping mechanism to relax strict tolerances on specimen width, and a self-contained alignment tool for use during specimen installation. A set of in-plane and interlaminar shear properties was measured for three graphite fabric/epoxy composites of T300/934 composite material. The three weave patterns were Oxford, 5-harness satin, and 8-harness satin.

  1. Interim report on verification and benchmark testing of the NUFT computer code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, K.H.; Nitao, J.J. [Lawrence Livermore National Lab., CA (United States); Kulshrestha, A. [Weiss Associates, Emeryville, CA (United States)


This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasi-analytical solutions. In the benchmark testing, results of the code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.

  2. [Implication of inverse-probability weighting method in the evaluation of diagnostic test with verification bias]. (United States)

    Kang, Leni; Zhang, Shaokai; Zhao, Fanghui; Qiao, Youlin


To evaluate and adjust for verification bias in screening or diagnostic tests. The inverse-probability weighting method was used to adjust the sensitivity and specificity of diagnostic tests, with an example from cervical cancer screening used to introduce the Compare Tests package in R software, in which the method can be implemented. Sensitivity and specificity calculated by the traditional method and by maximum likelihood estimation were compared to the results of the inverse-probability weighting method in the random-sampled example. The true sensitivity and specificity of the HPV self-sampling test were 83.53% (95%CI: 74.23-89.93) and 85.86% (95%CI: 84.23-87.36). In the analysis of data with randomly missing verification by the gold standard, the sensitivity and specificity calculated by the traditional method were 90.48% (95%CI: 80.74-95.56) and 71.96% (95%CI: 68.71-75.00), respectively. The adjusted sensitivity and specificity under the inverse-probability weighting method were 82.25% (95%CI: 63.11-92.62) and 85.80% (95%CI: 85.09-86.47), respectively, whereas they were 80.13% (95%CI: 66.81-93.46) and 85.80% (95%CI: 84.20-87.41) under the maximum likelihood estimation method. The inverse-probability weighting method could effectively adjust the sensitivity and specificity of a diagnostic test when verification bias exists, especially when complex sampling is involved.
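The weighting scheme can be sketched as follows: each verified subject is weighted by the inverse of their probability of verification, so the verified subsample stands in for the full cohort. A minimal sketch of the idea, not the Compare Tests package itself:

```python
import numpy as np

def ipw_sensitivity_specificity(test_pos, disease, verified, p_verify):
    """Inverse-probability-weighted sensitivity and specificity under
    partial verification. test_pos, disease, verified are 0/1 arrays
    (disease status is only trusted where verified == 1); p_verify holds
    each subject's probability of receiving the gold standard, which must
    be known or modeled (e.g. logistic regression on the test result)."""
    w = verified / p_verify  # unverified subjects get weight 0
    se = np.sum(w * test_pos * disease) / np.sum(w * disease)
    sp = (np.sum(w * (1 - test_pos) * (1 - disease))
          / np.sum(w * (1 - disease)))
    return se, sp
```

In a cohort where every test-positive is verified (weight 1) but only half of the test-negatives are (weight 2), the weighted counts reproduce the full-cohort sensitivity and specificity exactly, whereas a naive analysis of verified subjects alone would overstate sensitivity.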

  3. Modified porosity rate frost heave model and tests verification (United States)

    Ji, Zhi-qiang; Xu, Xue-yan


To avoid the complexity of modeling frost heave at the microscopic scale, a porosity rate function has been used in the prediction of frost heave phenomena. The approach explored in this paper is based on frost heave tests and the concept of segregation potential, which has been widely accepted by researchers, in order to find the proper form of the porosity rate function. In the frozen fringe the porosity rate function was derived as ṅ = B·e^(−a·Pe)·(∇T)²·(1−n), valid over the frozen-fringe temperature range (Ts < T < …). Tests were carried out to verify the model, and the comparison between test results and simulated results shows that the modified model is efficient for the prediction of frost heave and can be used in engineering practice.
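Numerically, the porosity rate function is straightforward to evaluate. The sketch below uses placeholder constants: B and a are material parameters fitted to frost heave tests, and the default values here are illustrative, not the paper's:

```python
import math

def porosity_rate(n, grad_T, Pe, B=1.0e-6, a=0.05):
    """dn/dt = B * exp(-a * Pe) * (grad_T)**2 * (1 - n)
    n: current porosity (0..1); grad_T: temperature gradient magnitude
    in the frozen fringe; Pe: pressure term in the exponential decay.
    B, a: fitted material constants (placeholder values)."""
    return B * math.exp(-a * Pe) * grad_T ** 2 * (1.0 - n)
```

Consistent with the form above, the rate vanishes as porosity approaches 1, grows with the square of the temperature gradient, and decays exponentially with pressure.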

  4. Residual flexibility test method for verification of constrained structural models (United States)

    Admire, John R.; Tinker, Michael L.; Ivey, Edward W.


A method is described for deriving constrained modes and frequencies from a reduced model based on a subset of the free-free modes plus the residual effects of neglected modes. The method involves a simple modification of the MacNeal and Rubin component mode representation to allow development of a verified constrained (fixed-base) structural model. Results for two spaceflight structures having translational boundary degrees of freedom show quick convergence of constrained modes using a measurable number of free-free modes plus the boundary partition of the residual flexibility matrix. This paper presents the free-free residual flexibility approach as an alternative test/analysis method when fixed-base testing proves impractical.

  5. Cryogenic Fluid Management Experiment (CFME) trunnion verification testing (United States)

    Bailey, W. J.; Fester, D. A.


The Cryogenic Fluid Management Experiment (CFME) was designed to characterize subcritical liquid hydrogen storage and expulsion in the low-g space environment. The CFME has now become the storage and supply tank for the Cryogenic Fluid Management Facility, which includes transfer line and receiver tanks as well. The liquid hydrogen storage and supply vessel is supported within a vacuum jacket by two fiberglass/epoxy composite trunnions, which were analyzed and designed. Analysis using the limited available data indicated the trunnion was the most fatigue-critical component in the storage vessel. Before committing the complete storage tank assembly to environmental testing, an experimental assessment was performed to verify the capability of the trunnion design to withstand expected vibration and loading conditions. Three tasks were conducted to evaluate trunnion integrity. The first determined the fatigue properties of the trunnion composite laminate materials. Tests at both ambient and liquid hydrogen temperatures showed composite material fatigue properties far in excess of those expected. Next, an assessment of the adequacy of the trunnion designs was performed (based on the tested material properties).

  6. Verification Test of Automated Robotic Assembly of Space Truss Structures (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.


A multidisciplinary program has been conducted at the Langley Research Center to develop operational procedures for supervised autonomous assembly of truss structures suitable for large-aperture antennas. The hardware and operations required to assemble a 102-member tetrahedral truss and attach 12 hexagonal panels were developed and evaluated. A brute-force automation approach was used to develop baseline assembly hardware and software techniques. However, as the system matured and operations were proven, upgrades were incorporated and assessed against the baseline test results. These upgrades included the use of distributed microprocessors to control dedicated end-effector operations, machine vision guidance for strut installation, and the use of an expert system-based executive-control program. This paper summarizes the developmental phases of the program, the results of several assembly tests, and a series of proposed enhancements. No problems that would preclude automated in-space assembly of truss structures have been encountered. The test system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  7. Manual and automated testing and verification of TEQ [ECI proprietary] (United States)

    Abhichandra, Ravi; Jasmine Pemeena Priyadarsini, M.


    The telecommunication industry has progressed from 1G to 4G, and now 5G is gaining prominence. Given the pace of this abrupt transformation, technological obsolescence is becoming a serious issue to deal with, and the rollout of each technology requires ample investment in network infrastructure and development. As a result, the industry is becoming more dynamic and strategy oriented: it requires professionals who not only understand the technology but can also evaluate it from a business perspective. The “Information Revolution” and the dramatic advances in telecommunications technology that made it possible currently drive the global economy in large part. As wireless networks become more advanced and far-reaching, we are redefining the notion of connectivity and the possibilities of communications technology. In this paper I test and verify the optical cards and automate this test procedure using “TEQ”, a new in-house technology developed by ECI TELECOM, which uses one of the optical cards itself to pump traffic at 100 Gbps.

  8. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media (tested May 2007) (United States)

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  9. Development and verification test of integral reactor major components

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others


    The conceptual designs for the SG, MCP, and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART. A canned motor pump was adopted in the conceptual design of the MCP. Linear-pulse-motor-type and ballscrew-type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, electro-magnetic design was performed for sizing the motors and electro-magnets. Prototypes of the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and a computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP, and CEDM were also investigated for fabricability.

  10. EQ3/6 software test and verification report 9/94

    Energy Technology Data Exchange (ETDEWEB)

    Kishi, T.


    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes, as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes EQPT, EQ3NR, and EQ6, together with the software library EQLIB, constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ``V and V`` report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. Physical control of the EQ3/6 software package and documentation is on a SUN SPARCstation. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three codes depend upon solving an n x n system of equations by the Newton-Raphson method. Thus, a great deal of emphasis was placed on the test and verification of this procedure, beginning with EQPT, the first code in the software package.
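    The abstract notes that all three codes reduce to solving an n x n system by the Newton-Raphson method. The following sketch illustrates that generic procedure (it is not EQ3/6's actual FORTRAN implementation; the function names and the toy 2x2 system are assumptions for illustration): at each step a linear n x n system J(x)·step = -f(x) is solved and the estimate is updated.

```python
import numpy as np

def newton_raphson(f, jac, x0, tol=1e-10, max_iter=50):
    """Solve f(x) = 0 for an n-vector x by Newton-Raphson iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Solve the n x n linear system J(x) * step = -f(x)
        step = np.linalg.solve(jac(x), -f(x))
        x = x + step
        if np.linalg.norm(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Toy 2x2 nonlinear system: x^2 + y^2 = 4 and x*y = 1
f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0]*v[1] - 1.0])
jac = lambda v: np.array([[2*v[0], 2*v[1]], [v[1], v[0]]])
root = newton_raphson(f, jac, [2.0, 0.5])
```

    Geochemical speciation-solubility codes assemble a much larger, stiffer system, but the inner loop has this same shape.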

  11. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2 (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.


    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
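    For a one-sided specification limit, the variables acceptance sampling decision reduces to comparing the standardized margin (USL - sample mean)/s against an acceptability constant k from the chosen plan. The sketch below is a generic illustration of that rule, not the NESC calculators themselves; the sample data and the k value are hypothetical.

```python
import statistics

def accept_lot_variables(measurements, usl, k):
    """One-sided variables acceptance sampling (k-method):
    accept the lot if (USL - mean) / s >= k, where k is the
    acceptability constant taken from the sampling plan."""
    xbar = statistics.mean(measurements)
    s = statistics.stdev(measurements)  # sample standard deviation
    return (usl - xbar) / s >= k

# Hypothetical sample of n = 10 measurements against an upper limit of 10.0
sample = [9.2, 9.5, 9.1, 9.4, 9.3, 9.6, 9.2, 9.4, 9.3, 9.5]
decision = accept_lot_variables(sample, usl=10.0, k=1.58)
```

    Because the decision uses the measured mean and spread rather than a pass/fail count, variables plans typically reach a given confidence with far smaller samples than attributes plans, which is the motivation cited in the abstract.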

  12. Shear Ram Verification Test Protocol (VTP) Best Practices

    Energy Technology Data Exchange (ETDEWEB)

    Lindley, Roy A. [Argonne National Lab. (ANL), Argonne, IL (United States); Braun, Joseph C. [Argonne National Lab. (ANL), Argonne, IL (United States)


    A blowout preventer (BOP) is a critical component used on subsea oil and gas wells during drilling, completion, and workover operations on the U.S. outer continental shelf (OCS). The purpose of the BOP is to seal oil and gas wells and, in the case of an emergency well-control event, to prevent the uncontrolled release of hydrocarbons. One of the most important components of the BOP is the hydraulically operated blind shear ram (BSR), which shears drilling-related components, such as drill pipe, casing, tubing, and wire-related tools, that may have been placed in the well. In addition to shearing these components, the BSR must form a seal to keep hydrocarbons within the well bore, even under the highest well-fluid pressures expected. The purpose of this document is for Argonne National Laboratory (ANL) to provide an independent view, based on current regulations and best practices, of how to test and confirm the operability and suitability of BSRs under realistic (or actual) well conditions.

  13. ATPD-2354 Revision 10 Verification Test, Disc Brake Version Only (16 NOV 06) Article Test of High Mobility Multipurpose Wheeled Vehicle (HMMWV-ECV) (United States)


    070169 ATPD-2354 Revision 10 Verification Test, Disc Brake Version Only (16 NOV 06), Article Test of High Mobility Multipurpose Wheeled Vehicle (HMMWV-ECV). To evaluate the different characteristics critical to the proper field service of the brake pad and rotor combination, an assortment of tests was conducted.

  14. Exploring different attributes of source information for speaker verification with limited test data. (United States)

    Das, Rohan Kumar; Mahadeva Prasanna, S R


    This work explores mel power difference of spectrum in subband, residual mel frequency cepstral coefficient, and discrete cosine transform of the integrated linear prediction residual for speaker verification under limited test data conditions. These three source features are found to capture different attributes of source information, namely, periodicity, smoothed spectrum information, and shape of the glottal signal, respectively. On the NIST SRE 2003 database, the proposed combination of the three source features performs better [equal error rate (EER): 20.19%, decision cost function (DCF): 0.3759] than the mel frequency cepstral coefficient feature (EER: 22.31%, DCF: 0.4128) for 2 s duration of test segments.
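    The equal error rate (EER) quoted above is the operating point at which the false-rejection rate equals the false-acceptance rate. A minimal threshold-sweep estimate is sketched below; the score arrays are invented for illustration, and real evaluations interpolate the DET curve rather than picking the nearest threshold.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Estimate the EER by sweeping thresholds over all observed scores
    and finding where false rejection and false acceptance coincide."""
    genuine = np.asarray(genuine, dtype=float)
    impostor = np.asarray(impostor, dtype=float)
    best_gap, best_eer = np.inf, None
    for t in np.sort(np.concatenate([genuine, impostor])):
        frr = np.mean(genuine < t)    # genuine trials rejected
        far = np.mean(impostor >= t)  # impostor trials accepted
        if abs(frr - far) < best_gap:
            best_gap, best_eer = abs(frr - far), (frr + far) / 2
    return best_eer

# Hypothetical similarity scores (higher = more likely the same speaker)
genuine = [0.9, 0.8, 0.75, 0.6, 0.55]
impostor = [0.5, 0.45, 0.4, 0.65, 0.3]
eer = equal_error_rate(genuine, impostor)
```

    A lower EER (and likewise a lower DCF, which additionally weights the two error types by cost and prior) indicates a better verification system, which is how the feature combinations above are compared.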

  15. Testing and verification of a novel single-channel IGBT driver circuit

    Directory of Open Access Journals (Sweden)

    Lukić Milan


    This paper presents a novel single-channel IGBT driver circuit together with a procedure for testing and verification. It is based on a specialized integrated circuit with a complete range of protective functions. Experiments are performed to test and verify its behaviour, and experimental results are presented in the form of oscilloscope recordings. It is concluded that the new driver circuit is compatible with modern IGBT transistors and power-converter demands and that it can be applied in new designs. It is part of a new 20 kW industrial-grade boost converter.

  16. Verification of MCNP and DANT/sys With the Analytic Benchmark Test Set

    Energy Technology Data Exchange (ETDEWEB)

    Parsons, D.K.; Sood, A.; Forster, R.A.; Little, R.C.


    The recently published analytic benchmark test set has been used to verify the multigroup option of MCNP and also the deterministic DANT/sys series of codes for criticality calculations. All seventy-five problems of the test set give values for K{sub eff} accurate to at least five significant digits. Flux ratios and flux shapes are also available for many of the problems. All seventy-five problems have been run by both the MCNP and DANT/sys codes and comparisons to K{sub eff} and flux shapes have been made. Results from this verification exercise are given below.

  17. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez


    In this paper, a method of verification of the power performance of a wind farm is presented. The method is based on Friedman’s test, a nonparametric statistical inference technique, and uses the information collected by the SCADA system from the sensors embedded in the wind turbines to carry out the power performance verification of the wind farm. Here, the guaranteed power curve of the wind turbines is treated as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method indicates whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out; the results show that the power performance of the wind farm under assessment was acceptable.
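    Friedman’s test ranks the related measurements within each block (here, e.g., each wind-speed bin) and checks whether the mean ranks of the treatments (turbines, plus the guaranteed curve) differ more than chance would allow. A minimal NumPy sketch of the statistic follows; the power values are invented, ties are ignored, and in practice a library routine such as scipy.stats.friedmanchisquare would be used.

```python
import numpy as np

def friedman_statistic(data):
    """Friedman chi-square statistic for an (n blocks x k treatments) table:
    rank within each block, then measure how far each treatment's mean
    rank deviates from the expected value (k + 1) / 2."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    # Rank each row, 1 = smallest (ties ignored in this sketch)
    ranks = np.argsort(np.argsort(data, axis=1), axis=1) + 1
    mean_ranks = ranks.mean(axis=0)
    return 12.0 * n / (k * (k + 1)) * np.sum((mean_ranks - (k + 1) / 2.0) ** 2)

# Rows: wind-speed bins; columns: three turbines plus the guaranteed curve (kW)
power = [[310, 305, 298, 312],
         [540, 528, 510, 545],
         [820, 800, 790, 830],
         [990, 975, 960, 1000]]
chi2 = friedman_statistic(power)
```

    The statistic is compared against a chi-square distribution with k - 1 degrees of freedom; a significant result triggers the pairwise multiple comparisons described in the abstract.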

  20. Verification testing of the PKI collector at Sandia National Laboratories, Albuquerque, New Mexico (United States)

    Hauger, J. S.; Pond, S. L.


    Verification testing of a solar collector was undertaken prior to its operation as part of an industrial process heat plant at Capitol Concrete Products in Topeka, Kansas. Testing was performed at a control plant installed at Sandia National Laboratory, Albuquerque, New Mexico (SNLA). Early results show that plant performance is even better than anticipated and far in excess of test criteria. Overall plant efficiencies of 65 to 80 percent were typical during hours of good insolation. A number of flaws and imperfections were detected during operability testing, the most important being a problem in elevation drive alignment due to a manufacturing error. All problems were corrected as they occurred and the plant, with over 40 hours of operation, is currently continuing operability testing in a wholly-automatic mode.

  1. Inferior vena cava anomalies-a common cause of DVT and PE commonly not diagnosed. (United States)

    Nanda, Sudip; Bhatt, Surya Prakash; Turki, Mohamed A


    A 62-year-old white male presented with recurrent pulmonary embolism (PE) despite having an inferior vena cava (IVC) filter. Investigations ruled out upper limb deep vein thrombosis (DVT) and IVC thrombus, the common causes of PE in the presence of an IVC filter. The culprit was a double IVC with a persisting left supracardinal vein that provided an alternate route for the leg DVT to cause PE. IVC anomalies have a propensity to cause lower limb DVT. Although rarely suspected, recent studies have revealed that IVC anomalies are not rare when anticipated and evaluated. Chest CT scans in cases of suspected idiopathic PE should extend down to the renal veins, as this will identify common IVC anomalies and allow therapy to prevent recurrent DVT to be instituted. A good-quality venacavagram should always precede any IVC filter placement, as this will identify almost all IVC anomalies, and appropriate steps can then prevent a recurrent PE.

  2. Deep Vein Thrombosis (DVT) / Pulmonary Embolism (PE) - Blood Clot Forming in a Vein (United States)

    Deep Vein Thrombosis and Pulmonary Embolism (DVT/PE) are often underdiagnosed and serious, but ... bloodstream to the lungs, causing a blockage called pulmonary embolism (PE). If the clot is small, and with ...

  3. Selected Examples of LDRD Projects Supporting Test Ban Treaty Verification and Nonproliferation

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Al-Ayat, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walter, W. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    The Laboratory Directed Research and Development (LDRD) Program at the DOE National Laboratories was established to ensure the scientific and technical vitality of these institutions and to enhance their ability to respond to evolving missions and anticipate national needs. LDRD allows the Laboratory directors to invest a percentage of their total annual budget in cutting-edge research and development projects within their mission areas. We highlight a selected set of LDRD-funded projects, in chronological order, that have helped provide the capabilities, people, and infrastructure that contributed greatly to our ability to respond to technical challenges in support of test ban treaty verification and nonproliferation.

  4. Optical Testing and Verification Methods for the James Webb Space Telescope Integrated Science Instrument Module Element (United States)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; et al.


    NASA's James Webb Space Telescope (JWST) is a 6.6 m diameter, segmented, deployable telescope for cryogenic (40 K) IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), which contains four science instruments (SIs) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested as a suite at the NASA Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM), a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows configuration control of observations and associated scripts, management of hardware and software limits and constraints, rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground-test thermal environment are compensated in alignment. These innovative methods for test planning, execution, and post-test analysis were instrumental in the verification program for the ISIM Element, and are described in enough detail for the reader to consider the innovations and lessons learned in future test programs.

  5. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. This practice is not intended to be complete purchase specifications for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...

  6. Process Sensitivity, Performance, and Direct Verification Testing of Adhesive Locking Features (United States)

    Golden, Johnny L.; Leatherwood, Michael D.; Montoya, Michael D.; Kato, Ken A.; Akers, Ed


    during assembly by measuring the dynamic prevailing torque. Adhesive locking features or LLCs are another method of providing redundant locking, but a direct verification method has not been used in aerospace applications to verify proper installation when using LLCs because of concern for damage to the adhesive bond. The reliability of LLCs has also been questioned due to failures observed during testing with coupons for process verification, although the coupon failures have often been attributed to a lack of proper procedures. It is highly desirable to have a direct method of verifying the LLC cure or bond integrity. The purpose of the Phase II test program was to determine if the torque applied during direct verification of an adhesive locking feature degrades that locking feature. This report documents the test program used to investigate the viability of such a direct verification method. Results of the Phase II testing were positive, and additional investigation of direct verification of adhesive locking features is merited.

  7. Verification of FPGA-Signal using the test board which is applied to Safety-related controller

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Youn-Hu; Yoo, Kwanwoo; Lee, Myeongkyun; Yun, Donghwa [SOOSAN ENS, Seoul (Korea, Republic of)


    This article presents a verification method for the BGA-type FPGA of a Programmable Logic Controller (PLC) developed as a safety-class device. The FPGA logic in a safety-class control device is the circuit that controls the overall logic of the PLC. Safety-related PLCs must meet international standard specifications, so V&V is performed according to an international standard in order to secure high reliability and safety, followed by a variety of verification steps for additional reliability and safety analysis. For efficient verification of test results, we propose a test using a newly designed BGA socket that resolves the problems of the conventional socket. Verification is divided into hardware and firmware verification, carried out in unit testing and integration testing. The proposed test method is simple and reduces cost through batch processing. In addition, it is advantageous for measuring signals from high-speed ICs because of the short pin length and the copper plating around the pins, and it prevents abrasion of the IC balls because they make no direct contact with the PCB. Therefore, it can be applied to BGA package testing, where it allows the designed logic and its operation to be verified easily.

  8. 77 FR 16868 - Quality Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test... (United States)


    ... COMMISSION Quality Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test...-Type Uranium-Aluminum Fuel Elements for Use in Research and Test Reactors,'' is temporarily identified... verifying the quality of plate-type uranium-aluminum fuel elements used in research and test reactors (RTRs...

  9. Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report (United States)

    Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.


    This report provides an overview of and results from the verification of the specifications that define the operational capabilities of the airborne and ground, L-Band and C-Band, Control and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.

  10. NDEC: A NEA platform for nuclear data testing, verification and benchmarking

    Directory of Open Access Journals (Sweden)

    Díez C.J.


    The selection, testing, verification and benchmarking of evaluated nuclear data consists, in practice, of putting an evaluated file through a number of checking steps in which different computational codes verify that the file and the data it contains comply with different requirements. These requirements range from format compliance to good performance in application cases, while at the same time physical constraints and agreement with experimental data are verified. At NEA, the NDEC (Nuclear Data Evaluation Cycle) platform aims at providing, in a user-friendly interface, a thorough diagnosis of the quality of a submitted evaluated nuclear data file. This diagnosis is based on the results of the different computational codes and routines which carry out the mentioned verifications, tests and checks. NDEC also seeks synergies with other existing NEA tools and databases, such as JANIS, DICE or NDaST, including them in its working scheme. Hence, this paper presents NDEC, its current development status and its usage in the JEFF nuclear data project.

  11. Test Method for Thermal Characterization of Li-Ion Cells and Verification of Cooling Concepts

    Directory of Open Access Journals (Sweden)

    Rouven Christen


    Temperature gradients, thermal cycling and temperatures outside the optimal operating range can have a significant influence on the reliability and lifetime of Li-ion battery cells. It is therefore essential for the developer of large-scale battery systems to know the thermal characteristics, such as heat source location, heat capacity and thermal conductivity, of a single cell in order to design appropriate cooling measures. This paper describes an advanced test facility which allows not only estimation of the thermal properties of a battery cell but also verification of proposed cooling strategies in operation. To do this, an active measuring unit consisting of a temperature and heat flux density sensor and a Peltier element was developed. These temperature/heat-flux sensing (THFS) units are uniformly arranged around a battery cell with a spatial resolution of 25 mm. Consequently, the temperature or heat flux density can be controlled individually, forming regions with constant temperature (cooling) or zero heat flux (insulation). This test setup covers the whole development loop, from thermal characterization to the design and verification of the proposed cooling strategy.

  12. Verification of cardiac mechanics software: benchmark problems and solutions for testing active and passive material behaviour. (United States)

    Land, Sander; Gurev, Viatcheslav; Arens, Sander; Augustin, Christoph M; Baron, Lukas; Blake, Robert; Bradley, Chris; Castro, Sebastian; Crozier, Andrew; Favino, Marco; Fastl, Thomas E; Fritz, Thomas; Gao, Hao; Gizzi, Alessio; Griffith, Boyce E; Hurtado, Daniel E; Krause, Rolf; Luo, Xiaoyu; Nash, Martyn P; Pezzuto, Simone; Plank, Gernot; Rossi, Simone; Ruprecht, Daniel; Seemann, Gunnar; Smith, Nicolas P; Sundnes, Joakim; Rice, J Jeremy; Trayanova, Natalia; Wang, Dafang; Jenny Wang, Zhinuo; Niederer, Steven A


    Models of cardiac mechanics are increasingly used to investigate cardiac physiology. These models are characterized by a high level of complexity, including the particular anisotropic material properties of biological tissue and the actively contracting material. A large number of independent simulation codes have been developed, but a consistent way of verifying the accuracy and replicability of simulations is lacking. To aid in the verification of current and future cardiac mechanics solvers, this study provides three benchmark problems for cardiac mechanics. These benchmark problems test the ability to accurately simulate pressure-type forces that depend on the deformed object's geometry; anisotropic, spatially varying material properties similar to those seen in the left ventricle; and active contractile forces. The benchmark was solved by 11 different groups to generate consensus solutions, with typical differences in higher-resolution solutions at approximately 0.5%, and consistent results between linear, quadratic and cubic finite elements as well as between different approaches to simulating incompressible materials. Online tools and solutions are made available to allow these tests to be used effectively in the verification of future cardiac mechanics software.

  13. Providing an empirical basis for optimizing the verification and testing phases of software development (United States)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.


    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e. dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) to measure the software system to be considered; and (2) to build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. Also we evaluate the accuracy of the model and the insights it provides into the software process.

  14. Constrained structural dynamic model verification using free vehicle suspension testing methods (United States)

    Blair, Mark A.; Vadlamudi, Nagarjuna


    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or, in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads such as the 12-ton Hubble Space Telescope. The equations developed in the paper show that by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced whose minima are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.


    Directory of Open Access Journals (Sweden)

    Dian Megasafitri


    Deep Vein Thrombosis (DVT) is the formation of a blood clot (thrombus) in a vein that channels blood back to the heart. Traumatic injury is one of the important risk factors for DVT formation. Thrombus formation involves three important factors, namely blood flow, blood components, and the blood vessel wall, known together as Virchow's triad. The classical finding of calf pain on dorsiflexion of the foot (Homans' sign) is specific but not sensitive and occurs in half of patients with DVT. A thorough history and physical examination are very important in the approach to patients suspected of having DVT, and radiological examination is an important part of diagnosing DVT. Although there are many choices of modality, level 1 clinical evidence now supports the use of pharmacologic therapy with Low-Molecular-Weight Heparin (LMWH) anticoagulants as the primary DVT prophylaxis agent. Different types of LMWH have different indications approved by the Food and Drug Administration (FDA) as DVT prophylaxis, based on the varieties of clinical evidence. Enoxaparin is the most widely indicated, as both prophylaxis and treatment for DVT. Tinzaparin is indicated as a therapy, but not as a DVT prophylaxis, in some groups of patients. Dalteparin is indicated as a prophylaxis, but not as a DVT therapy.

  16. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)


    Out-pile tests with a full-scale fuel assembly verify the design and evaluate the performance of the final products. HTL for the hydraulic tests and FAMeCT for the mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform the bending and vibration tests. The developed facilities were verified by comparison against reference data for the fuel assembly obtained at the Westinghouse Co.; the compared data agreed well within uncertainties. FRETONUS, a high-temperature, high-pressure fretting wear simulator, was developed, and a 500-hour performance test was conducted to check the integrity, endurance, and data acquisition capability of the simulator. Computational technology for turbulent flow analysis and finite element analysis was also developed. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  17. DVT surveillance program in the ICU: analysis of cost-effectiveness.

    Directory of Open Access Journals (Sweden)

    Ajai K Malhotra

    BACKGROUND: Venous thromboembolism (VTE) -- deep venous thrombosis (DVT) and/or pulmonary embolism (PE) -- in traumatized patients causes significant morbidity and mortality. The current study evaluates the effectiveness of DVT surveillance in reducing PE and performs a cost-effectiveness analysis. METHODS: All traumatized patients admitted to the adult ICU underwent twice-weekly DVT surveillance by bilateral lower-extremity venous Duplex examination (48-month surveillance period, SP). The rates of DVT and PE were recorded and compared to the rates observed in the 36-month pre-surveillance period (PSP). All patients in both periods received mechanical and pharmacologic prophylaxis unless contraindicated. Total costs -- diagnostic, therapeutic, and surveillance -- for both periods were recorded, and the incremental cost for each quality-adjusted life year (QALY) gained was calculated. RESULTS: 4234 patients were eligible (PSP: 1422; SP: 2812). The rate of DVT in the SP (2.8%) was significantly higher than in the PSP (1.3%, p<0.05), and the rate of PE in the SP (0.7%) was significantly lower than in the PSP (1.5%, p<0.05). Logistic regression demonstrated that surveillance was an independent predictor of increased DVT detection (OR: 2.53, CI: 1.462-4.378) and decreased PE incidence (OR: 0.487, CI: 0.262-0.904). The incremental cost was $509,091 per life saved in the base case, translating to $29,102 per QALY gained. A sensitivity analysis over four of the parameters used in the model indicated that the incremental cost ranged from $18,661 to $48,821 per QALY gained. CONCLUSIONS: Surveillance of traumatized ICU patients increases DVT detection and reduces PE incidence. The cost per QALY gained compares favorably with other interventions accepted by society.
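The cost-effectiveness arithmetic in the abstract can be sketched directly. The dollar figures below are the paper's reported base-case numbers; the implied QALYs-gained-per-life-saved figure is a back-calculation for illustration, not a value reported in the study.

```python
# Hedged sketch of an incremental cost-effectiveness ratio (ICER):
# cost per life saved divided by the QALYs gained per life saved
# gives the cost per QALY gained.

def icer_per_qaly(cost_per_life_saved: float, qalys_per_life: float) -> float:
    """Incremental cost per QALY gained."""
    return cost_per_life_saved / qalys_per_life

# The abstract reports $509,091/life saved and $29,102/QALY gained,
# implying roughly 17.5 discounted QALYs gained per life saved.
implied_qalys = 509_091 / 29_102
print(round(implied_qalys, 1))  # 17.5
```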

  18. Development of a test system for verification and validation of nuclear transport simulations

    Energy Technology Data Exchange (ETDEWEB)

    White, Morgan C [Los Alamos National Laboratory; Triplett, Brian S [GENERAL ELECTRIC; Anghaie, Samim [UNIV OF FL


    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems, and performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory, in collaboration with the University of Florida, has developed a methodology to automate the process of nuclear data verification and validation (V and V). This automated V and V process can efficiently test a number of data libraries using well-defined benchmark experiments, such as those in the International Criticality Safety Benchmark Experiment Project (ICSBEP). The process is implemented through an integrated set of Python scripts. Material and geometry data are read from an existing medium or given directly by the user to generate a benchmark experiment template file. The user specifies the choice of benchmark templates, codes, and libraries to form a V and V project. The Python scripts generate input decks for multiple transport codes from the templates, run and monitor individual jobs, and parse the relevant output automatically. The output can then be used to generate reports directly or can be stored in a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. The resource savings from this automated methodology could potentially be an enabling technology for more sophisticated data studies, such as nuclear data uncertainty quantification. Once deployed, this tool will allow the nuclear data community to more thoroughly test data libraries, leading to higher fidelity data in the future.
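The project-expansion step described above (templates × codes × libraries → individual jobs) can be sketched as follows. All names here (`Case`, `build_project`, `render_input`) are hypothetical placeholders for illustration, not the actual LANL tool's API, and the benchmark/code/library strings are examples only.

```python
# Minimal sketch of the automated V&V workflow: expand a set of benchmark
# templates over the chosen codes and data libraries into a list of jobs,
# each of which would then be rendered into an input deck and run.
from dataclasses import dataclass

@dataclass
class Case:
    benchmark: str
    code: str
    library: str

def render_input(case: Case) -> str:
    # In the real tool, a template file is filled with material/geometry data;
    # here we just return a descriptive stand-in string.
    return f"{case.benchmark}.inp for {case.code} with {case.library}"

def build_project(benchmarks, codes, libraries):
    """Cartesian expansion of a V&V project into individual jobs."""
    return [Case(b, c, l) for b in benchmarks for c in codes for l in libraries]

jobs = build_project(["ICSBEP-HEU-MET-FAST-001"], ["MCNP"],
                     ["ENDF/B-VII.1", "ENDF/B-VIII.0"])
print(len(jobs))  # 2
```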

  19. Digital volume tomography (DVT) as a diagnostic modality of the anterior skull base. (United States)

    Bremke, Martin; Sesterhenn, Andreas M; Murthum, Tobias; Al Hail, Amira; Bien, Siegfried; Werner, Jochen Alfred


    Because of its high resolution and relatively lower cost in comparison with modern helical CT scanners, digital volume tomography (DVT) can be recommended for diagnosis of the nasal cavity and paranasal sinuses. DVT is an advancement of panoramic tomography and is based on the principles of rotational tomography. It enables high-resolution visualization of osseous structures. The slices can be displayed in three orthogonal planes whose angles can be changed arbitrarily. Data volumes of up to 12×17 cm can be examined with the new generation of DVT devices. The aim of this study was to point out the potential of DVT at the anterior skull base. DVT scans with a cylindrical volume of 10 cm in diameter and 10 cm in height were performed in 23 patients. The identification of surgical key landmarks (uncinate process, middle turbinate, ethmoidal bulla, agger nasi cells, Haller cells, frontal recess, the anterior ethmoidal artery in its relationship to the skull base, the cribriform plate, and the sphenoidal sinus in relation to the optic nerve and the internal carotid artery) was evaluated. Display of the essential surgical key landmarks was possible in all patients.

  20. Inverse transport for the verification of the Comprehensive Nuclear Test Ban Treaty

    Directory of Open Access Journals (Sweden)

    J.-P. Issartel


    An international monitoring system is being built as a verification tool for the Comprehensive Test Ban Treaty. Forty stations will measure the concentration of radioactive noble gases daily on a worldwide basis. By handling preliminary real data, the paper introduces a new backtracking approach for the identification of sources of passive tracers after positive measurements. When several measurements are available, the ambiguity about possible sources is reduced significantly. The approach is validated against ETEX data. A distinction is made between adjoint and inverse transport, which are shown to be different, though equivalent, ideas. As an interesting side result, it is shown that, in the passive tracer dispersion equation, the diffusion stemming from a time-symmetric turbulence is necessarily a self-adjoint operator; this result is easily verified for the usual gradient closure but is more general.

  1. A digital volumetric tomography (DVT) study in the mandibular molar region for miniscrew placement during mixed dentition

    Directory of Open Access Journals (Sweden)

    Mayur S. Bhattad


    OBJECTIVE: To assess bone thickness for miniscrew placement in the mandible during mixed dentition by using digital volumetric tomography (DVT). MATERIAL AND METHODS: A total of 15 healthy patients aged 8-10 years old, with an early exfoliated mandibular second deciduous molar, were included. DVT images of one quadrant of the mandible were obtained using Kodak extraoral imaging systems and analyzed with Kodak dental imaging software. The error of the method (EM) was calculated using Dahlberg's formula. Mean and standard deviation were calculated at 6 and 8 mm from the cementoenamel junction (CEJ). A paired t-test was used to analyze the measurements. RESULTS: Buccal cortical bone thickness, mesiodistal width, and buccolingual bone depth at 6 mm were found to be 1.73 ± 0.41, 2.15 ± 0.49, and 13.18 ± 1.22 mm, respectively, while at 8 mm the measurements were 2.42 ± 0.34, 2.48 ± 0.33, and 13.65 ± 1.25 mm, respectively. EM for buccal cortical bone thickness, mesiodistal width, and buccolingual bone depth was 0.58, 0.40, and 0.48, respectively. The difference in measurement at 6 and 8 mm was significant for buccal cortical plate thickness (P < 0.05). CONCLUSION: Bone thickness measurement has shown promising evidence for safe placement of miniscrews in the mandible during mixed dentition. The use of miniscrews is the best alternative, even in younger patients.
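Dahlberg's formula, used in the abstract to compute the error of the method (EM), is EM = sqrt(Σd²/2n), where d is the difference between duplicate measurements of the same site and n is the number of pairs. A minimal sketch, with made-up duplicate measurements (not the study's data):

```python
# Dahlberg's error of the method: sqrt(sum of squared differences between
# first and second measurements, divided by twice the number of pairs).
import math

def dahlberg_em(first, second):
    diffs = [a - b for a, b in zip(first, second)]
    return math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))

# Illustrative duplicate measurements (mm):
print(round(dahlberg_em([1.7, 2.1, 13.2], [1.9, 2.0, 13.0]), 3))  # 0.122
```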

  2. An audit of intermittent pneumatic compression (IPC) in the prophylaxis of asymptomatic deep vein thrombosis (DVT). (United States)

    Illingworth, Clare; Timmons, Stephen


    This paper reports a prospective audit, against an existing baseline standard, of intermittent pneumatic compression (IPC) in the prophylaxis of asymptomatic deep vein thrombosis (DVT). The audit was conducted via a structured questionnaire using total population sampling, encompassing all theatre staff within one NHS trust. With regard to the standard, performance is good: IPC is the DVT prophylaxis of choice in the perioperative area and is used frequently on most patients. The findings of the audit do, however, highlight the need for appropriate local DVT risk assessment guidelines, essential to ensure that prophylaxis is administered to the correct at-risk groups, as prevention may be unjustified in low-risk groups and it may be inappropriate for the same regimen to be used for all patients.

  3. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    As a verification test of the SURF and SURFplus models in the xRage code, we use a propagating underdriven detonation wave in 1-D. This is essentially the only test case for which an accurate solution can be determined based on the theoretical structure of the solution. The solution consists of a steady ZND reaction zone profile joined with a scale-invariant rarefaction or Taylor wave and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front and the pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of the overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
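The quantitative error measure named above, a discrete L2 norm of the difference between numerical and exact pressure profiles, can be sketched as below. The grid values are invented for illustration; the actual test uses the exact ZND/Taylor-wave solution.

```python
# Discrete L2 error norm on a uniform grid of spacing dx:
# ||p_num - p_exact||_2 = sqrt( sum_i (p_num_i - p_exact_i)^2 * dx ).
import math

def l2_error(p_num, p_exact, dx):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p_num, p_exact)) * dx)

# Illustrative pressure samples (arbitrary units):
p_num = [1.00, 0.52, 0.26]
p_exact = [1.00, 0.50, 0.25]
print(round(l2_error(p_num, p_exact, dx=0.1), 4))  # 0.0071
```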

  4. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example. (United States)

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D


    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (e.g., sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy-to-apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
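The weighting principle behind the correction can be illustrated with a toy inverse-probability-weighting example: verified screen-negatives, sampled with fraction f, are up-weighted by 1/f. This shows only the weighting idea with invented numbers; the paper's actual method is a weighted GEE that handles several assays jointly.

```python
# Toy verification-bias correction for sensitivity. Among diseased subjects,
# sensitivity = P(screen positive | disease). Screen-positives are all
# verified; screen-negatives are verified with sampling fraction f_neg, so
# each verified diseased screen-negative stands for 1/f_neg subjects.

def corrected_sensitivity(d_pos: int, d_neg_verified: int, f_neg: float) -> float:
    """d_pos: diseased among screen-positives (all verified).
    d_neg_verified: diseased among the verified screen-negatives."""
    w = 1.0 / f_neg
    return d_pos / (d_pos + w * d_neg_verified)

naive = 90 / (90 + 2)                              # ignores undersampling
corrected = corrected_sensitivity(90, 2, f_neg=0.10)
print(round(naive, 3), round(corrected, 3))  # 0.978 0.818
```

Note how the naive estimate overstates sensitivity, exactly the bias the abstract describes.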

  5. Verification testing of the compression performance of the HEVC screen content coding extensions (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng


    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of the HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements showed a very substantial benefit in coding efficiency for the SCC extensions and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
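The Bjøntegaard-delta bit-rate metric used above is commonly computed by fitting a cubic of log-rate versus PSNR for each codec and integrating both fits over the overlapping PSNR range. A sketch of that common formulation follows; the rate/PSNR points are invented, not from the paper.

```python
# Common BD-rate formulation: fit log10(bitrate) vs PSNR with a cubic for
# anchor and test codecs, integrate both polynomials over the shared PSNR
# interval, and convert the mean log-rate gap into a percent rate change.
import numpy as np

def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    p_ref = np.polyfit(psnr_ref, np.log10(rates_ref), 3)
    p_test = np.polyfit(psnr_test, np.log10(rates_test), 3)
    lo = max(min(psnr_ref), min(psnr_test))
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref = np.polyval(np.polyint(p_ref), hi) - np.polyval(np.polyint(p_ref), lo)
    int_test = np.polyval(np.polyint(p_test), hi) - np.polyval(np.polyint(p_test), lo)
    avg_diff = (int_test - int_ref) / (hi - lo)
    return (10 ** avg_diff - 1) * 100  # negative => test codec saves bit rate

psnr = [32.0, 34.0, 36.0, 38.0]
ref = [1000, 2000, 4000, 8000]   # anchor bitrates (kbps), illustrative
test = [400, 800, 1600, 3200]    # same quality at 60% lower rate
print(round(bd_rate(ref, psnr, test, psnr), 1))  # -60.0
```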

  6. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili


    Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  8. The USP Performance Verification Test, Part II: collaborative study of USP's Lot P Prednisone Tablets. (United States)

    Glasgow, Maria; Dressman, Shawn; Brown, William; Foster, Thomas; Schuber, Stefan; Manning, Ronald G; Wahab, Samir Z; Williams, Roger L; Hauck, Walter W


    Periodic performance verification testing (PVT) is used by laboratories to assess and demonstrate proficiency and for other purposes as well. For dissolution, the PVT is specified in the US Pharmacopeia General Chapter Dissolution under the title Apparatus Suitability Test. For Apparatus 1 and 2, USP provides two reference standard tablets for this purpose. For each new lot of these reference standards, USP conducts a collaborative study. For new USP Lot P Prednisone Tablets, 28 collaborating laboratories provided data. The study was conducted with three sets of tablets: Lot O open label, Lot O blinded, and Lot P blinded. The blinded Lot O data were used for apparatus suitability testing. Acceptance limits were determined after dropping data due to failure of apparatus suitability, identification of data as unusual on control charts, or protocol violations. Results yielded acceptance criteria of (47, 82) for Apparatus 1 and (37, 70) for Apparatus 2. Results generally were similar for Lot P compared to results from Lot O except that the average percent dissolved for Lot P is greater than for Lot O with Apparatus 2.

  9. Validation of Test Methods for Air Leak Rate Verification of Spaceflight Hardware (United States)

    Oravec, Heather Ann; Daniels, Christopher C.; Mather, Janice L.


    As deep space exploration continues to be the goal of NASA's human spaceflight program, verification of the performance of spaceflight hardware becomes increasingly critical. Suitable test methods for verifying the leak rate of sealing systems are identified in program qualification testing requirements. One acceptable method for verifying the air leak rate of gas pressure seals is the tracer gas leak detector method. In this method, a tracer gas (commonly helium) leaks past the test seal and is transported to the leak detector where the leak rate is quantified. To predict the air leak rate, a conversion factor of helium-to-air is applied depending on the magnitude of the helium flow rate. The conversion factor is based on either the molecular mass ratio or the ratio of the dynamic viscosities. The current work was aimed at validating this approach for permeation-level leak rates using a series of tests with a silicone elastomer O-ring. An established pressure decay method with constant differential pressure was used to evaluate both the air and helium leak rates of the O-ring under similar temperature and pressure conditions. The results from the pressure decay tests showed, for the elastomer O-ring, that neither the molecular flow nor the viscous flow helium-to-air conversion factors were applicable. Leak rate tests were also performed using nitrogen and argon as the test gas. Molecular-mass- and viscosity-based helium-to-test-gas conversion factors were applied but did not correctly predict the measured leak rates of either gas. To further this study, the effect of pressure boundary conditions was investigated. Often, pressure decay leak rate tests are performed at a differential pressure of 101.3 kPa with atmospheric pressure on the downstream side of the test seal. In space applications, the differential pressure is similar, but with vacuum as the downstream pressure. The same O-ring was tested at four unique differential pressures ranging from 34.5 to 137.9 kPa.
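The two conversion factors the abstract refers to are commonly defined as follows: in free-molecular flow, conductance scales with 1/sqrt(molar mass); in viscous (laminar continuum) flow, it scales with 1/viscosity. A sketch with standard room-temperature handbook property values (not values from the paper):

```python
# Helium-to-air leak rate conversion under the two classical flow regimes.
import math

M_HE, M_AIR = 4.003, 28.97          # molar mass, g/mol
MU_HE, MU_AIR = 1.96e-5, 1.81e-5    # dynamic viscosity, Pa·s, ~20 °C

def air_rate_molecular(q_he: float) -> float:
    """Molecular flow: Q_air = Q_He * sqrt(M_He / M_air)."""
    return q_he * math.sqrt(M_HE / M_AIR)

def air_rate_viscous(q_he: float) -> float:
    """Viscous flow: Q_air = Q_He * (mu_He / mu_air)."""
    return q_he * (MU_HE / MU_AIR)

q_he = 1.0e-6  # measured helium leak rate, arbitrary units
print(round(air_rate_molecular(q_he) / q_he, 3),
      round(air_rate_viscous(q_he) / q_he, 3))  # 0.372 1.083
```

The two regimes predict air rates on opposite sides of the helium rate, which is part of why the paper's finding that neither factor applied to permeation-level elastomer leakage is notable.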

  10. Doppler scans of lower limbs' DVT: experience at University of Benin ...

    African Journals Online (AJOL)

    Venous diversion, subcutaneous oedema and leg ulcers were demonstrable in 89, 83 and 2 patients respectively. Conclusion: Lower limb DVT is common and the diagnosis can be performed by radiologists. Consequently, prompt treatment can be instituted, which will minimize the morbidity and mortality associated with ...

  11. Evaluation of anatomical structures of the os temporale using digital volume tomography (DVT)


    Dräger, Stephanie Johanna; Güldner, Christian (PD Dr. med.)


    Due to its complex anatomy, the pars petrosa of the os temporale places high demands on the radiological diagnosis of ear diseases. To this day, conventional computed tomography (MDCT, multi-detector-row CT) is considered the gold standard here. Since the method of digital volume tomography (DVT, cone beam computed tomography = CBCT) found its way into otorhinolaryngology several years ago, it has initially ...

  12. Verification testing to confirm VO2max attainment in persons with spinal cord injury. (United States)

    Astorino, Todd A; Bediamol, Noelle; Cotoia, Sarah; Ines, Kenneth; Koeu, Nicolas; Menard, Natasha; Nyugen, Brianna; Olivo, Cassandra; Phillips, Gabrielle; Tirados, Ardreen; Cruz, Gabriela Velasco


    Maximal oxygen uptake (VO2max) is a widely used measure of cardiorespiratory fitness, aerobic function, and overall health risk. Although VO2max has been measured for almost 100 yr, no standardized criteria exist to verify VO2max attainment. Studies document that attainment of 'true' VO2max during incremental exercise (INC) can be confirmed using a subsequent verification test (VER). In this study, we examined the efficacy of VER in persons with spinal cord injury (SCI). Repeated-measures, within-subjects study. University laboratory in San Diego, CA. Ten individuals with SCI (age and injury duration = 33.3 ± 10.5 yr and 6.8 ± 6.2 yr) and 10 able-bodied (AB) individuals (age = 24.1 ± 7.4 yr). Peak oxygen uptake (VO2peak) was determined during INC on an arm ergometer, followed by VER at 105 percent of peak power output (%PPO). Gas exchange data, heart rate (HR), and blood lactate concentration (BLa) were measured during exercise. Across all participants, VO2peak was highly related between protocols (ICC = 0.98) and the mean difference was 0.08 ± 0.11 L/min. Compared to INC, VO2peak from VER was not different in SCI (1.30 ± 0.45 L/min vs. 1.31 ± 0.43 L/min) but was higher in AB (1.63 ± 0.40 L/min vs. 1.76 ± 0.40 L/min). The data show similar VO2peak between incremental and verification tests in SCI, suggesting that VER confirms VO2max attainment. However, in AB participants completing arm ergometry, VER is essential to validate the appearance of 'true' VO2peak.

  13. Electromagnetic compatibility (EMC) standard test chamber upgrade requirements for spacecraft design verification tests (United States)

    Dyer, Edward F.


    In view of the serious performance deficiencies inherent in conventional modular and welded shielding EMC test enclosures, in which multipath reflections and resonant standing waves can damage flight hardware during RF susceptibility tests, NASA-Goddard has undertaken the modification of a 20 x 24 ft modular-shielded enclosure through installation of steel panels to which ferrite tiles will be mounted with epoxy. The internally reflected RF energy will thereby be absorbed, and exterior power-line noise will be reduced. Isolation of power-line filters and control of 60-Hz ground connections will also be undertaken in the course of upgrading.

  14. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project (United States)

    National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...


    The overall objective of the Environmental Testing and Verification Coatings and Coating Equipment Program is to verify pollution prevention and performance characteristics of coating technologies and make the results of the testing available to prospective coating technology use...

  16. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío


    XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define which tool to embed in XJML to execute FSPV.

  17. Validation and Verification (V and V) Testing on Midscale Flame Resistant (FR) Test Method (United States)


    and M. Grady, "High Intensity Thermal Testing of Protective Fabrics with CO2 Laser", STP1593, Tenth Symposium on Performance of Protective ... transmitted heat flux through a wide range of fabrics using the CO2 laser [5] has indicated that the thickness of the air gap has a very strong effect ... when alternate measurements of performance such as predicted depth of burn, transmitted fluence, and Energy Transmission Factor (ETF) are used to assess

  18. Ames Research Center Mars/Pathfinder Heat Shield Design Verification ARC-JET Test (United States)

    Tran, Huy K.; Hui, Frank; Wercinski, Paul; Cartledge, Alan; Tauber, Mike; Tran, Duoc T.; Chen, Y. K.; Arnold, James O. (Technical Monitor)


    Design verification tests were performed on samples representing the aerobrake of the Mars/Pathfinder vehicle. The test specimens consisted of the SLA-561V ablator bonded to the honeycomb structure. The primary objective was to evaluate the ablation material's performance and to measure temperatures within the ablator, at the structural bondline, and at the back sheet of the honeycomb structure. Other objectives were to evaluate the effect of ablative repair plug material treatment and voids in the heat shield. A total of 29 models were provided for testing in the Ames 60 MW arc-jet facility. Of these, 23 models were flat-faced and the six remaining models were curved-edge ones, intended to simulate the conditions on the curved rim of the forebody where the maximum shear occurred. Eight sets of test conditions were used. The stagnation point heating rates varied from 47 to 240 W/cm2 and the stagnation pressures from 0.15 to 0.27 atm. (The maximum flight values are 132 W/cm2 and 0.25 atm.) The majority of these runs were made at a nominal stagnation pressure of 0.25 atm. Two higher-pressure runs were made to check the current (denser) ablation material for spallation or other forms of thermal stress failure. Over 60% of the flat-faced models yielded good thermocouple data and all produced useful surface recession information. Of the five curved-edge models that were tested, only one gave good data; the remaining ones experienced model-holder failure. The test results can be summarized by noting that no failure of the ablative material was observed on any model. Also, the bondline temperature design limit of 250 °C was never reached within an equivalent flight time, despite a stagnation point heat load that exceeded the maximum flight value by up to 130%. At heating rates of over 200 W/cm2 and stagnation pressures of 0.25 atm or greater, the average surface recessions exceeded 0.5 cm on some models. The surface roughness increased dramatically at pressures above 0.25 atm and

  19. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)


    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction method, based on the streamline curvature method, is tested against a model axial-flow pump. The preliminary design is made by using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free vortex design method. The detailed blading design is then carried out by using an experimental database of double-circular-arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity, and stagger angle, a number of correlation equations are developed from the experimental database and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. An experimental measurement is conducted under a non-cavitating condition to obtain the off-design performance curve, and a cavitation test is also carried out by reducing the suction pressure. The experimental results agree very satisfactorily with the predictions of the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
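The iterative scheme mentioned above, solving coupled equations with an under-relaxation factor, can be sketched generically. The two-equation system below is a made-up linear stand-in with fixed point (2, 2), not the pump's eight blading-design correlations.

```python
# Generic fixed-point solve x = g(x) with under-relaxation factor
# 0 < alpha < 1: each update moves only a fraction alpha of the way toward
# g(x), damping oscillations in strongly coupled systems.
def solve_under_relaxed(g, x0, alpha=0.5, tol=1e-10, max_iter=1000):
    x = list(x0)
    for _ in range(max_iter):
        gx = g(x)
        x_new = [xi + alpha * (gi - xi) for xi, gi in zip(x, gx)]
        if max(abs(a - b) for a, b in zip(x_new, x)) < tol:
            return x_new
        x = x_new
    raise RuntimeError("did not converge")

# Toy coupled pair with fixed point (2, 2):
g = lambda x: [0.5 * x[1] + 1.0, 0.5 * x[0] + 1.0]
sol = solve_under_relaxed(g, [0.0, 0.0])
print([round(v, 6) for v in sol])  # [2.0, 2.0]
```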

  20. Some New Verification Test Problems for Multimaterial Diffusion on Meshes that are Non-Aligned with Material Boundaries

    Energy Technology Data Exchange (ETDEWEB)

    Dawes, Alan Sidney [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Malone, Christopher M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shashkov, Mikhail Jurievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    In this report a number of new verification test problems for multimaterial diffusion are presented. Using them, we show that homogenization of multimaterial cells in either Arbitrary Lagrangian Eulerian (ALE) or Eulerian simulations can lead to errors in the energy flow at the interfaces. Results are presented showing that significant improvements in predictive capability can be gained by using either a surrogate supermesh, such as Thin Mesh in FLAG, or the emerging method based on Static Condensation.

  1. Epidemiologic study of patients with DVT in Birjand Vali-e-asr hospital (2009-2014): Short Communication

    Directory of Open Access Journals (Sweden)

    Toba Kazemi


    Background and Aim: Deep vein thrombosis (DVT) is a condition that, in case of delay in diagnosis and treatment, can lead to serious complications such as pulmonary embolism. Given the importance of assessment and identification of diseases in every community, the current study aimed at assessing the epidemiology of DVT patients in Birjand. Materials and Methods: The present descriptive-analytical study was conducted on all DVT patients admitted to Birjand Vali-e-asr hospital between 2009 and 2014. A trained medical student completed each researcher-designed questionnaire based on an intern's history recording, a physician's orders, and a nurse's note. The patients were then contacted by telephone to determine their status and any disease complications, readmission, or death. Finally, the obtained data were encoded and analyzed with SPSS (V. 18) at the significance level P<0.05. Results: During the study period, 263 patients with DVT had been hospitalized in Birjand Vali-e-asr hospital, of whom 50.2% were males. Mean age of the subjects was 55.84 ± 18.45 years. In 98.1% of the cases the lower extremity was involved. The most prevalent risk factor was immobilization, and the least prevalent was a family history of DVT. Regarding the relationship between DVT risk factors and sex, only cigarette smoking showed a significant difference. During the 5 years, 3.8% of the population died due to DVT complications. Recurrent DVT was diagnosed in 6% of the patients and pulmonary emboli in 3.4%. Conclusion: Given that the most common risk factor for DVT in our study was immobilization, prophylaxis is necessary in high-risk patients in order to decrease the likelihood of DVT.

  2. A Case Report on VT from TV: DVT and PE from Prolonged Television Watching

    Directory of Open Access Journals (Sweden)

    Alan Lucerna


    Pulmonary embolus (PE) and deep vein thrombosis (DVT) are diagnoses that are commonly made in the emergency department. Well-known risk factors for thromboembolic events include immobility, malignancy, pregnancy, surgery, acquired or inherited thrombophilias, obesity, cigarette smoking, and hypertension. We present a case of a 59-year-old female who developed leg swelling after prolonged television watching and was found to have PE and DVT.

  3. Achievement of VO2max criteria during a continuous graded exercise test and a verification stage performed by college athletes. (United States)

    Mier, Constance M; Alexander, Ryan P; Mageean, Amanda L


    The purpose of this study was to determine the incidence of meeting specific VO2max criteria and to test the effectiveness of a VO2max verification stage in college athletes. Thirty-five subjects completed a continuous graded exercise test (GXT) to volitional exhaustion. The frequency of achieving various respiratory exchange ratio (RER) and age-predicted maximum heart rate (HRmax) criteria and a VO2 plateau within 2 and 2.2 ml·kg(-1)·min(-1) was determined. The number of subjects meeting the VO2max plateau criterion was 5 (≤2 ml·kg(-1)·min(-1)) and 7 (≤2.2 ml·kg(-1)·min(-1)); the RER criteria, 34 (≥1.05), 32 (≥1.10), and 24 (≥1.15); the HRmax criteria, 35. VO2max and HRmax did not differ between the GXT and the verification stage (53.6 ± 5.6 vs. 55.5 ± 5.6 ml·kg(-1)·min(-1) and 187 ± 7 vs. 187 ± 6 b·min(-1)); however, the RER was lower during the verification stage (1.15 ± 0.06 vs. 1.07 ± 0.07, p = 0.004). Six subjects achieved a similar VO2 (within 2.2 ml·kg(-1)·min(-1)), whereas 4 achieved a higher VO2 compared with the GXT. These data demonstrate that a continuous GXT limits the college athlete's ability to achieve a VO2max plateau and certain RER and HR criteria. The use of a verification stage increases the frequency of VO2max achievement and may be an effective method to improve the accuracy of VO2max measurements in college athletes.
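
    The criteria discussed above (VO2 plateau, RER, and age-predicted HRmax thresholds) lend themselves to a simple programmatic check. The function below is a hypothetical sketch: the cutoff values come from the abstract, but the 220 - age prediction and the 10 b·min(-1) margin are common conventions, not parameters stated in this study.

```python
def vo2max_criteria(vo2_delta, rer, hr, age,
                    plateau_cut=2.2, rer_cut=1.10, hr_margin=10):
    """Report which common VO2max criteria a GXT final stage satisfies.

    vo2_delta: change in VO2 between the last two stages (ml·kg-1·min-1)
    rer: respiratory exchange ratio; hr: peak heart rate (b·min-1)
    """
    hr_max_pred = 220 - age  # conventional age-predicted maximum heart rate
    return {
        "plateau": vo2_delta <= plateau_cut,   # small VO2 increase despite workload
        "rer": rer >= rer_cut,                 # RER criterion
        "hr": hr >= hr_max_pred - hr_margin,   # within margin of predicted HRmax
    }
```

    A subject meeting all three criteria would be judged to have reached VO2max under this rubric; the study's point is that a verification stage catches athletes who reach true VO2max without satisfying these secondary criteria.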

  4. Formal verification and testing: An integrated approach to validating Ada programs (United States)

    Cohen, Norman H.


    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  5. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen... (United States)


    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Contamination with microorganisms... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... controls sufficient to prevent fecal contamination. FSIS shall take further action as appropriate to ensure...

  6. Baghouse filtration products verification

    Energy Technology Data Exchange (ETDEWEB)

    Mycock, J.C.; Turner, J.H.; VanOsdell, D.W.; Farmer, J.R.; Brna, T.G.


    The paper introduces EPA's Air Pollution Control Technology Verification (APCT) program and then focuses on the immediate objective of the program: laboratory performance verification of cleanable filter media intended for the control of fine particulate emissions. Data collected during the laboratory verification testing, which simulates operation in full-scale fabric filters, will be used to show expected performance for collection of particles ≤ 2.5 micrometers in diameter.

  7. Current Status of Aerosol Generation and Measurement Facilities for the Verification Test of Containment Filtered Venting System in KAERI

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Il; An, Sang Mo; Ha, Kwang Soon; Kim, Hwan Yeol [KAERI, Daejeon (Korea, Republic of)


    In this study, the design of the aerosol generation and measurement systems is explained, present circumstances are described, and the aerosol test plan is presented. The Containment Filtered Venting System (CFVS) is one of the safety features used to reduce the amount of fission products released into the environment by depressurizing the containment. Since the Chernobyl accident, regulatory agencies in several European countries, such as France, Germany, and Sweden, have demanded the installation of the CFVS. Moreover, a feasibility study on the CFVS was also performed in the U.S. After the Fukushima accident, there is a need in Korea to improve containment venting or to install a depressurization facility. As part of a Ministry of Trade, Industry and Energy (MOTIE) project, KAERI has been conducting the integrated performance verification test of the CFVS. As a part of the test, aerosol generation and measurement systems were designed to simulate fission product behavior. To perform the integrated verification test of the CFVS, the aerosol generation and measurement system was designed and manufactured. The component operating conditions are determined in consideration of severe accident conditions. The test will be performed under normal conditions at first, and then under severe conditions of high pressure and high temperature. Difficulties that could disturb the elaborate test are expected, such as thermophoresis on the pipe and vapor condensation on the aerosol.

  8. Test tasks for verification of program codes for calculation of neutron-physical characteristics of the BN series reactors (United States)

    Tikhomirov, Georgy; Ternovikh, Mikhail; Smirnov, Anton; Saldikov, Ivan; Bahdanovich, Rynat; Gerasimov, Alexander


    A system of test tasks is presented, with the fast reactor BN-1200 with nitride fuel as the prototype. The system includes three tests based on different geometric models: a model of a fuel element in homogeneous and heterogeneous form, a model of a fuel assembly in height-heterogeneous and fully heterogeneous form, and a model of the active core of the BN-1200 reactor. Cross-verification of program codes was performed. The transition from simple geometry to more complex geometry allows the causes of discrepancies in the results to be identified during the early stage of cross-verification of codes. This system of tests can be applied for certification of engineering programs based on the Monte Carlo method for the calculation of full-scale models of reactor cores of the BN series. The developed tasks take into account the basic layout and structural features of the BN-1200 reactor. They are intended for the study of neutron-physical characteristics, estimation of the influence of heterogeneous structure, and estimation of the influence of the diffusion approximation. The development of the system of test tasks allowed independent testing of programs for the calculation of neutron-physical characteristics: the engineering programs JARFR and TRIGEX, and the codes MCU, TDMCC, and MMK based on the Monte Carlo method.
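
    Cross-verification of the kind described, in which several codes (JARFR, TRIGEX, MCU, TDMCC, MMK) compute the same test task, amounts to flagging pairwise discrepancies beyond a tolerance. A minimal sketch follows; the function name, the 200 pcm tolerance, and the k-eff values are illustrative assumptions, not results from the paper.

```python
def compare_codes(results, tol_pcm=200):
    """Flag pairwise k-eff discrepancies between codes exceeding tol_pcm.

    results: dict mapping code name -> computed k-eff for the same test task.
    Returns a list of (code_a, code_b, difference_in_pcm) tuples.
    """
    names = list(results)
    flags = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            diff_pcm = abs(results[a] - results[b]) * 1e5  # 1 pcm = 1e-5 in k-eff
            if diff_pcm > tol_pcm:
                flags.append((a, b, diff_pcm))
    return flags

# Hypothetical k-eff values for one test task (not from the paper).
flags = compare_codes({"MCU": 1.00000, "TDMCC": 1.00150, "JARFR": 1.00900})
```

    Running the comparison first on the simple fuel-element model and only then on the full core mirrors the paper's strategy of localizing the cause of a discrepancy early.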

  9. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Phyllis C.


    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release for recycle and reuse.

  10. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others


    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  11. Environmental Technology Verification: Baghouse filtration products--W.L. Gore & Associates L3650 filtration media (tested November--December 2009) (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  12. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6277 Filtration Media (Tested March 2011) (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  13. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6262 Filtration Media (Tested March 2011) (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  14. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6282 Filtration Media (Tested March - April 2011) (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  15. Verification and standardization of blood cell counters for routine clinical laboratory tests. (United States)

    Verbrugge, Sue Ellen; Huisman, Albert


    The use of automated blood cell counters (automated hematology analyzers) for diagnostic purposes is inextricably linked to clinical laboratories. However, the need for uniformity among the various methods and parameters is increasing and standardization of the automated analyzers is therefore crucial. Standardization not only involves procedures based on reference methods but it also involves validation, verification, quality assurance, and quality control, and it includes the involvement of several participants. This article discusses the expert guidelines and provides an overview of issues involved in complete blood count parameter reference methods and standardization of reporting units. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results (United States)

    Burken, John J.; Larson, Richard R.


    F-15 IFCS project goals are: a) Demonstrate Control Approaches that can Efficiently Optimize Aircraft Performance in both Normal and Failure Conditions ([A] & [B] failures); b) Advance Neural Network-Based Flight Control Technology for New Aerospace Systems Designs with a Pilot in the Loop. Gen II objectives include: a) Implement and Fly a Direct Adaptive Neural Network Based Flight Controller; b) Demonstrate the Ability of the System to Adapt to Simulated System Failures: 1) Suppress Transients Associated with Failure; 2) Re-Establish Sufficient Control and Handling of Vehicle for Safe Recovery; c) Provide Flight Experience for Development of Verification and Validation Processes for Flight Critical Neural Network Software.

  17. Verification test for three WindCube WLS7 LiDARs at the Høvsøre test site

    DEFF Research Database (Denmark)

    Gottschall, Julia; Courtney, Michael

    The report describes the procedure of testing ground-based WindCube lidars (manufactured by the French company Leosphere) at the Høvsøre test site in comparison to reference sensors mounted at a meteorological mast. Results are presented for three tested units – in detail for unit WLS7-0062, and in summary for units WLS7-0064 and WLS7-0066. The verification test covers the evaluation of measured mean wind speeds, wind directions and wind speed standard deviations. The data analysis is basically performed in terms of different kinds of regression analyses.
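
    The regression analyses mentioned above compare lidar-measured wind speeds against mast-mounted reference sensors. The ordinary least-squares sketch below is illustrative only (the function name and sample data are assumptions, not the report's data):

```python
def linear_regression(x, y):
    """Ordinary least-squares fit y = a*x + b, returning (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot
```

    For a well-performing lidar unit one would expect a slope near 1, an offset near 0 m/s, and a coefficient of determination close to 1 when regressing lidar wind speed on the mast reference.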

  18. Verification of consumers' experiences and perceptions of genetic discrimination and its impact on utilization of genetic testing. (United States)

    Barlow-Stewart, Kristine; Taylor, Sandra D; Treloar, Susan A; Stranger, Mark; Otlowski, Margaret


    To undertake a systematic process of verification of consumer accounts of alleged genetic discrimination. Verification of incidents reported in life insurance and other contexts that met the criteria of genetic discrimination, and of the impact of fear of such treatment, was undertaken, with consent, through interview, document analysis and, where appropriate, direct contact with the third party involved. The process comprised obtaining evidence that the alleged incident was accurately reported and determining whether the decision or action seemed to be justifiable and/or ethical. Reported incidents of genetic discrimination were verified in life insurance access, underwriting and coercion (9), applications for worker's compensation (1) and early release from prison (1), and in two cases of fear of discrimination impacting on access to genetic testing. Relevant conditions were inherited cancer susceptibility (8), Huntington disease (3), hereditary hemochromatosis (1), and polycystic kidney disease (1). In two cases, the reversal of an adverse underwriting decision to standard rate after intervention with insurers by genetics health professionals was verified. The mismatch between consumer and third-party accounts in three life insurance incidents involved miscommunication or lack of information provision by financial advisers. These first cases of verified genetic discrimination make it essential for policies and guidelines to be developed and implemented to ensure appropriate use of genetic test results in insurance underwriting, to promote education and training in the financial industry, and to provide support for consumers and health professionals undertaking challenges of adverse decisions.

  19. Air Force Research Laboratory Test and Evaluation, Verification and Validation of Autonomous Systems Challenge Exploration (United States)


    S and T&E. The results from these methods can be recorded in a modular fashion, enabling compositional verification of autonomous subcomponents at... Participants included: Behcet Acikmese (UTEXAS), Darryl Ahner (AFIT), Nick Armstrong-Crews (MIT), Dionisio de Niz (SEI/CMU), Georgios Fainekos (ASU), Karen Feigh (GATECH), Naira... • Part of the problem is the upfront design process and the system itself. • Modularization needs to be pushed more and further. How will things connect and work?

  20. Verification Test of the SURF and SURFplus Models in xRage: Part III Effect of Mesh Alignment

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    The previous studies used an underdriven detonation wave in 1-dimension (steady ZND reaction zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a verification test of the implementation of the SURF and SURFplus models in the xRage code. Since the SURF rate is a function of the lead shock pressure, the question arises as to the effect on accuracy of variations in the detected shock pressure due to the alignment of the shock front with the mesh. To study the effect of mesh alignment we simulate a cylindrically diverging detonation wave using a planar 2-D mesh. The leading issue is the magnitude of azimuthal asymmetries in the numerical solution. The 2-D test case does not have an exact analytic solution. To quantify the accuracy, the 2-D solution along rays through the origin is compared to a highly resolved 1-D simulation in cylindrical geometry.

  1. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Jianwei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); De Baere, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Vaccaro, S. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Schwalbach, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Liljenfeldt, Henrik [Swedish Nuclear Fuel and Waste Management Company (Sweden); Tobin, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel verification instrument.
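
    The quoted figures (e.g. an average relative standard deviation of 4.6% for neutron count rates) can be computed from per-assembly measured and predicted values as sketched below. The function name and the sample numbers are illustrative assumptions, not the campaign's data.

```python
def relative_deviations(measured, predicted):
    """Per-assembly relative deviations and their standard deviation in percent.

    measured, predicted: sequences of per-assembly signals (count rates etc.).
    Returns (list of relative deviations, relative standard deviation in %).
    """
    devs = [(p - m) / m for m, p in zip(measured, predicted)]
    n = len(devs)
    mean = sum(devs) / n
    rsd = (sum((d - mean) ** 2 for d in devs) / n) ** 0.5  # population std dev
    return devs, 100 * rsd

# Hypothetical measured vs predicted neutron count rates for two assemblies.
devs, rsd_pct = relative_deviations([100.0, 200.0], [104.0, 208.0])
```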

  2. Length of stay and economic consequences with rivaroxaban vs enoxaparin/vitamin K antagonist in patients with DVT and PE: findings from the North American EINSTEIN clinical trial program. (United States)

    Bookhart, Brahim K; Haskell, Lloyd; Bamber, Luke; Wang, Maria; Schein, Jeff; Mody, Samir H


    Venous thromboembolism (VTE) (deep vein thrombosis [DVT] and pulmonary embolism [PE]) represents a substantial economic burden to the healthcare system. Using data from the randomized EINSTEIN DVT and PE trials, this North American sub-group analysis investigated the potential of rivaroxaban to reduce the length of initial hospitalization in patients with acute symptomatic DVT or PE. A post-hoc analysis of hospitalization and length-of-stay (LOS) data was conducted in the North American sub-set of patients from the randomized, open-label EINSTEIN trial program. Patients received either rivaroxaban (15 mg twice daily for 3 weeks followed by 20 mg once daily; n = 405) or dose-adjusted subcutaneous enoxaparin overlapping with (guideline-recommended 'bridging' therapy) and followed by a vitamin K antagonist (VKA) (international normalized ratio = 2.0-3.0; n = 401). The open-label study design allowed for the comparison of LOS between treatment arms under conditions reflecting normal clinical practice. LOS was evaluated using investigator records of dates of admission and discharge. Analyses were carried out in the intention-to-treat population using parametric tests. Costs were applied to the LOS based on weighted mean cost per day for DVT and PE diagnoses obtained from the Healthcare Cost and Utilization Project dataset. Of 382 patients hospitalized, 321 (84%) had acute symptomatic PE; few DVT patients required hospitalization. Similar proportions of VTE patients were hospitalized in the rivaroxaban and enoxaparin/VKA treatment groups, 189/405 (47%) and 193/401 (48%), respectively. In hospitalized VTE patients, rivaroxaban treatment produced a 1.6-day mean reduction in LOS (median = 1 day) compared with enoxaparin/VKA (mean = 4.5 vs 6.1; median = 3 vs 4), translating to total costs that were $3419 lower in rivaroxaban-treated patients. In hospitalized North American patients with VTE, treatment with rivaroxaban produced a statistically significant reduction in LOS and associated hospitalization costs compared with enoxaparin/VKA.
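
    The cost translation works by applying a weighted mean cost per day to the LOS difference. The arithmetic below reproduces the reported 1.6-day reduction and infers the implied per-day cost; that per-day figure is derived here from the abstract's numbers, not stated in it.

```python
# Mean LOS in days for each arm, as reported in the abstract.
mean_los_riva, mean_los_enox = 4.5, 6.1
reduction = mean_los_enox - mean_los_riva   # 1.6-day mean reduction in LOS
# $3419 total saving over the reduction implies this weighted mean cost per day.
cost_per_day = 3419 / reduction             # about $2137/day (inferred, USD)
```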

  3. Voltage verification unit (United States)

    Martin, Edward J [Virginia Beach, VA


    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit can be accomplished without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  4. Ileofemoral Deep Vein Thrombosis (DVT) in Steroid Treated Lepra Type 2 Reaction Patient. (United States)

    Thangaraju, P; Giri, V C; Aravindan, U; Sajitha, V; Showkath Ali, M K


    In 1998 a 57-year-old man with skin lesions of 6 months' duration reported to the Central Leprosy Teaching and Research Institute (CLTRI), Chengalpattu. He was diagnosed as a case of borderline lepromatous leprosy with a type 2 lepra reaction, was treated with multibacillary multidrug therapy (MB-MDT) for a period of 12 months, and was released from treatment (RFT) in September 1999. For the reactions, the patient was treated with prednisolone for more than 10 months. After 14 years, in April 2013, the same patient presented to CLTRI with complaints of weakness of both hands and loss of sensation for 4 months. A diagnosis suggestive of MB relapse with neuritis was made, and the patient was started on MB-MDT for a period of 12 months, with an initial prednisolone dose of 25 mg OD, later increased to 40 mg for a painful swollen leg and to manage the neuritis-associated pain and swelling. The increased dose was not beneficial, and the patient was investigated for other pathology. Doppler ultrasound revealed a left ileofemoral deep vein thrombosis (DVT). Prednisolone was withdrawn and the patient was started on the anticoagulant heparin, followed by warfarin. During this period rifampicin was also withdrawn. Once the patient was in good condition, he was restarted on the MB-MDT regimen. Up to the 6th pulse the patient continued to show improvement in function without steroids and without any tenderness; he is taking multivitamins and undergoing regular physiotherapy. This DVT appears to be due to prednisolone, and such a causative relationship, though rare, should be kept in mind when patients on long-term steroid treatment, immobilized, or on prolonged bed rest present with such symptomatology.

  5. Primary HPV testing verification: A retrospective ad-hoc analysis of screening algorithms on women doubly tested for cytology and HPV. (United States)

    Tracht, Jessica; Wrenn, Allison; Eltoum, Isam-Eldin


    To evaluate human papillomavirus (HPV) testing as a primary screening tool, we retrospectively analyzed data comparing (1) HPV testing to the algorithms of the ATHENA Study: (2) cytology alone, (3) cytology with ASCUS triage in women 25-29, and (4) cotesting ≥ 30 or (5) cotesting ≥ 25. We retrospectively analyzed data from women tested with both cytology and HPV testing from 2010 to 2013. Cumulative risk (CR) for CIN3+ was calculated. Crude and verification bias adjusted (VBA) sensitivity, specificity, predictive values, likelihood ratios, colposcopy rates, and screening test numbers were compared. About 15,173 women (25-95, 7.1% testing. Nearly 1,184 (8.4%) had biopsies. About 19.4% had positive cytology, 14.5% had positive HPV. HPV testing unassociated with ASCUS was requested in 40% of women testing per CIN3+ diagnosed. While HPV-/NILM cotesting results are associated with low CIN3+ risk, HPV testing had similar screening performance to cotesting and to cytology alone. Additionally, HPV testing and cytology incur false negatives in nonoverlapping subsets of patients. Diagn. Cytopathol. 2017;45:580-586. © 2017 Wiley Periodicals, Inc.
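
    The performance measures compared in the study (sensitivity, specificity, predictive values, likelihood ratios) all follow from a 2x2 table of screening result versus CIN3+ outcome. A crude sketch (unadjusted for verification bias) with hypothetical counts; the function name and numbers are illustrative, not the study's data.

```python
def screening_metrics(tp, fp, fn, tn):
    """Crude screening-performance metrics from a 2x2 table.

    tp/fp/fn/tn: true/false positives and negatives against the CIN3+ reference.
    """
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "lr_pos": sens / (1 - spec),    # positive likelihood ratio
    }

# Hypothetical counts for one screening strategy.
m = screening_metrics(tp=80, fp=10, fn=20, tn=90)
```

    Verification-bias adjustment (as in the study's VBA estimates) reweights these counts for the fact that only screen-positive women tend to receive biopsy; the crude formulas above are the starting point.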

  6. Testing of the dual slab verification detector for attended measurements of the BN-350 dry storage casks

    Energy Technology Data Exchange (ETDEWEB)

    Santi, Peter A [Los Alamos National Laboratory; Browne, Michael C [Los Alamos National Laboratory; Williams, Richard B [Los Alamos National Laboratory; Parker, Robert F [Los Alamos National Laboratory


    The Dual Slab Verification Detector (DSVD) has been developed and built by Los Alamos National Laboratory in cooperation with the International Atomic Energy Agency (IAEA) as part of the dry storage safeguards system for the spent fuel from the BN-350 fast reactor. The detector consists of two rows of ³He tubes embedded in a slab of polyethylene which has been designed to be placed on the outer surface of the dry storage cask. The DSVD will be used to perform measurements of the neutron flux emanating from inside the dry storage cask at several locations around each cask to establish a neutron 'fingerprint' that is sensitive to the contents of the cask. The sensitivity of the fingerprinting technique to the removal of a specific amount of nuclear material from the cask is determined by the characteristics of the detector used to perform the measurements, the characteristics of the spent fuel being measured, and systematic uncertainties associated with the dry storage scenario. MCNPX calculations of the BN-350 dry storage casks and layout have shown that the neutron fingerprint verification technique using measurements from the DSVD would be sensitive to both the amount and location of material present within an individual cask. To confirm the performance of the neutron fingerprint technique in verifying the presence of BN-350 spent fuel in dry storage, an initial series of measurements has been performed to test the performance and characteristics of the DSVD. Results of these measurements will be presented and compared with MCNPX results.

  7. Future Combat System Spinout 1 Technical Field Test - Establishing and Implementing Models and Simulations System of Systems Verification, Validation and Accreditation Practices, Methodologies and Procedures (United States)


    IV&V: Independent Verification and Validation; JTRS: Joint Tactical Radio System; JVMF: Joint Variable Message Format; LDAP: Lightweight Directory Access Protocol; LDIF: LDAP Data Interchange Format; LSI: Lead Systems Integrator; LUT: Limited User Test; MCS: Mounted Combat System / Mobility Computer System

  8. State-of-the-art report for the testing and formal verification methods for FBD program

    Energy Technology Data Exchange (ETDEWEB)

    Jee, Eun Kyoung [KAIST, Daejeon (Korea, Republic of); Lee, Jang Soo; Lee, Young Jun [KAERI, Daejeon (Korea, Republic of); Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)


    The importance of PLC testing has increased in the nuclear I&C domain. While regulatory authorities require both functional and structural testing for safety system software, FBD testing has relied only on functional testing, and there has been little research on structural testing techniques for FBD programs. We aim to analyze current techniques related to FBD testing and to develop a structural testing technique appropriate to FBD programs. We developed structural test coverage criteria applicable to FBD programs, focusing on data paths from input edges to output edges of FBD programs. A data path condition (DPC), under which input data can flow into the output edge, is defined for each data path. We defined basic coverage, input condition coverage, and complex condition coverage criteria based on the formal definition of DPC. We also developed a measurement procedure for FBD testing adequacy and a supporting tool prototype.
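
    The coverage idea described can be phrased as: the fraction of data path conditions (DPCs) satisfied by at least one test input. The sketch below is a simplified illustration of that measurement, not the authors' tool; Python predicates stand in for DPCs extracted from an FBD program.

```python
def dpc_coverage(dpcs, test_inputs):
    """Fraction of data path conditions satisfied by at least one test input.

    dpcs: list of predicates over an input dict (stand-ins for formal DPCs).
    test_inputs: list of input dicts, one per test case.
    """
    covered = sum(1 for cond in dpcs if any(cond(t) for t in test_inputs))
    return covered / len(dpcs)

# Two toy DPCs over inputs 'a' and 'b'; one test case satisfies only the first.
dpcs = [lambda t: t["a"] > 0,
        lambda t: t["a"] > 0 and t["b"] > 0]
cov = dpc_coverage(dpcs, [{"a": 1, "b": 0}])
```

    An adequacy measurement procedure of the kind the report describes would report `cov` against a target threshold and identify the unsatisfied DPCs so that additional test inputs can be derived.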

  9. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes


    Kim, Jong-Bum; Jeong, Ji-Young; Lee, Tae-Ho; Kim, Sungkyun; Euh, Dong-Jin; Joo, Hyung-Kook


    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V&V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the co...

  10. Nist Microwave Blackbody: The Design, Testing, and Verification of a Conical Brightness Temperature Source (United States)

    Houtz, Derek Anderson

    Maximized emissivity is fundamental to a well-characterized blackbody. The chosen geometry is a microwave-absorber-coated copper cone. Electromagnetic and thermal simulations are introduced to optimize the design. Experimental verification of the simulated quantities confirms the predicted performance of the blackbody.

  11. Test/QA plan for the verification testing of alternative or reformulated liquid fuels, fuel additives, fuel emulsions, and lubricants for highway and nonroad use heavy-duty diesel engines (United States)

    This Environmental Technology Verification Program test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR P...

  12. Fabrication and characterization of gelatin-based test materials for verification of trace contraband vapor detectors. (United States)

    Staymates, Jessica L; Gillen, Greg


    This work describes a method to produce inexpensive and field deployable test materials that can be used to verify the performance of trace contraband vapor detection systems such as ion mobility spectrometers (IMS) currently deployed worldwide for explosives, narcotics, and chemical warfare agent (CWA) detection. Requirements for such field deployable test materials include long shelf life, portability, and low manufacturing costs. Reported here is a method for fabricating these test materials using encapsulation of high vapor pressure compounds, such as methyl salicylate (MS), into a gelatin matrix. Gelatin serves as a diffusion barrier allowing for controlled and sustained release of test vapors. Test materials were prepared by incorporating serial dilutions of MS into gelatin, which provide controlled analyte vapor release over 3 to 4 orders of magnitude of instrument response. The test materials are simple to prepare and have been shown to be stable for at least one year under controlled laboratory conditions.

  13. Review of waste package verification tests. Semiannual report, April 1984-September 1984. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Jain, H.; Veakis, E.; Soo, P.


    This ongoing study is part of a task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled release performance objectives. Work covered in this report includes crushed tuff packing material for use in a high level waste tuff repository. A review of available tests to quantify packing performance is given together with recommendations for future testing work. 27 refs., 6 figs., 3 tabs.


    The report gives results of testing three fuels in a small (732 kW) firetube package boiler to determine emissions of carbon monoxide (CO), nitrogen oxide (NO), particulate matter (PM), and total hydrocarbons (THCs). The tests were part of EPA's Environmental Technology Verificat...

  15. 76 FR 17287 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing (United States)


    ... emission testing body requirements) to improve the accuracy of emissions data. EPA is also amending other... rule, adding two new definitions, revising certain compliance dates, and clarifying the language and...-Compliant Air Emission Testing Body (AETB) Names C. Other Amendments 1. Compliance Dates for Units Adding...

  16. Verification of yield functions by biaxial tensile tests with rotated principal axes (United States)

    Ageba, Ryo; Ishiwatari, Akinobu; Hiramoto, Jiro


    A yield function is a critical factor in the accuracy of FEM simulation of steel sheet forming. Yld2000-2d by Barlat is an anisotropic yield function for shell elements. Uniaxial and biaxial tensile tests are required to identify the parameters of the Yld2000-2d function. In these tests, the principal axes of stress are normally either parallel or orthogonal to the rolling direction. In actual press forming, however, the principal axes of stress in the material are randomly oriented. Therefore, the actual material behavior may not be correctly expressed by a yield function identified from tests always conducted with the same principal axis directions. In this study, the accuracy of the anisotropic yield function is verified under biaxial stress using specimens with rotated principal axes. The results confirm that the accuracy of Yld2000-2d is adequate and that the identification tests are reasonable.

  17. Power Performance Verification of a Wind Farm Using the Friedman's Test

    National Research Council Canada - National Science Library

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L


    .... This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded...

  18. Ultrasonic Testing of Thick Walled Austenitic Welds: Modelling and Experimental Verification (United States)

    Köhler, B.; Müller, W.; Spies, M.; Schmitz, V.; Zimmer, A.; Langenberg, K.-J.; Mletzko, U.


    The testing of austenitic welds is difficult due to the elastically anisotropic properties of the weld grains. Therefore, the normal rules for the selection of testing conditions, such as appropriate wave modes, frequencies, and incident angles, cannot be applied in the usual way. In recent years several tools for simulating wave propagation in such testing situations have been developed. In this paper these tools are applied to an austenitic weld containing a crack grown by intergranular stress corrosion cracking (IGSCC). It is demonstrated that the combined application of several simulation tools achieves a stepwise narrowing of the parameter space. Eventually an optimized testing configuration is defined. The approach is validated experimentally.

  19. A Strategy for Automatic Quality Signing and Verification Processes for Hardware and Software Testing

    Directory of Open Access Journals (Sweden)

    Mohammed I. Younis


    Circuits in a production line. Comparatively, our results demonstrate that the proposed strategy outperforms the traditional block partitioning strategy, with mutation scores of 100% versus 90%, respectively, for the same number of test cases.

  20. Fatigue life prediction of liquid rocket engine combustor with subscale test verification (United States)

    Sung, In-Kyung

    Reusable rocket systems such as the Space Shuttle introduced a new era in propulsion system design for economic feasibility. Practical reusable systems require an order of magnitude increase in life. To achieve this, improved methods are needed to assess failure mechanisms and to predict the life cycle of rocket combustors. A general goal of the research was to demonstrate the use of a subscale rocket combustor prototype in a cost-effective test program. Life-limiting factors and metal behaviors under repeated loads were surveyed and reviewed. The life prediction theories are presented, with an emphasis on studies that used subscale test hardware for model validation. From this review, low cycle fatigue (LCF) and creep-fatigue interaction (ratcheting) were identified as the main life-limiting factors of the combustor. Several life prediction methods, including conventional and advanced viscoplastic models, were used to predict life cycles due to low cycle thermal stress, transient effects, and creep rupture damage. Creep-fatigue interaction and cyclic hardening were also investigated. A prediction method based on 2D beam theory was modified using 3D plate deformation theory to provide an extended prediction method. For experimental validation, two small-scale annular plug nozzle thrusters were designed, built, and tested. The test article was composed of a water-cooled liner, a plug annular nozzle, and a 200 psia precombustor that used decomposed hydrogen peroxide as the oxidizer and JP-8 as the fuel. The first combustor was tested cyclically at the Advanced Propellants and Combustion Laboratory at Purdue University. Testing was stopped after 140 cycles because of an unpredicted failure mechanism: an increasing hot spot in the location where failure was predicted. A second combustor was designed to avoid the previous failure; however, it was over-pressurized and deformed beyond repair during a cold-flow test.
The test results are discussed and compared to the analytical and numerical

  1. Metrology test object for dimensional verification in additive manufacturing of metals for biomedical applications. (United States)

    Teeter, Matthew G; Kopacz, Alexander J; Nikolov, Hristo N; Holdsworth, David W


    Additive manufacturing continues to increase in popularity and is being used in applications such as biomaterial ingrowth that requires sub-millimeter dimensional accuracy. The purpose of this study was to design a metrology test object for determining the capabilities of additive manufacturing systems to produce common objects, with a focus on those relevant to medical applications. The test object was designed with a variety of features of varying dimensions, including holes, cylinders, rectangles, gaps, and lattices. The object was built using selective laser melting, and the produced dimensions were compared to the target dimensions. Location of the test objects on the build plate did not affect dimensions. Features with dimensions less than 0.300 mm did not build or were overbuilt to a minimum of 0.300 mm. The mean difference between target and measured dimensions was less than 0.100 mm in all cases. The test object is applicable to multiple systems and materials, tests the effect of location on the build, uses a minimum of material, and can be measured with a variety of efficient metrology tools (including measuring microscopes and micro-CT). Investigators can use this test object to determine the limits of systems and adjust build parameters to achieve maximum accuracy. © IMechE 2014.
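    The accuracy analysis described above, comparing each target dimension with its as-built measurement and flagging features below the minimum buildable size, can be sketched in a few lines. The feature names and measured values here are invented for illustration, not the study's data; only the 0.300 mm minimum and the use of a mean target-vs-measured difference come from the abstract.

```python
# Hypothetical target vs. measured dimensions (mm) for a few test-object
# features. Features whose target is below 0.300 mm are flagged, mirroring
# the reported minimum buildable feature size.
MIN_FEATURE_MM = 0.300

features = {
    "hole_A":     (0.500, 0.463),
    "cylinder_B": (1.000, 1.058),
    "gap_C":      (0.200, 0.300),  # sub-minimum target, built at the minimum
}

diffs = [abs(target - measured) for target, measured in features.values()]
mean_abs_diff = sum(diffs) / len(diffs)
too_small = [name for name, (target, _) in features.items()
             if target < MIN_FEATURE_MM]

print(round(mean_abs_diff, 3))  # 0.065
print(too_small)                # ['gap_C']
```

With real micro-CT or measuring-microscope data in place of the invented values, the same loop gives the per-feature error budget an investigator would use to tune build parameters.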

  2. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN


    This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  3. Environmental assessment of general-purpose heat source safety verification testing

    Energy Technology Data Exchange (ETDEWEB)



    This Environmental Assessment (EA) was prepared to identify and evaluate potential environmental, safety, and health impacts associated with the Proposed Action to test General-Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) assemblies at the Sandia National Laboratories (SNL) 10,000-Foot Sled Track Facility, Albuquerque, New Mexico. RTGs are used to provide a reliable source of electrical power on board some spacecraft when solar power is inadequate during long duration space missions. These units are designed to convert heat from the natural decay of radioisotope fuel into electrical power. Impact test data are required to support DOE's mission to provide radioisotope power systems to NASA and other user agencies. The proposed tests will expand the available safety database regarding RTG performance under postulated accident conditions. Direct observations and measurements of GPHS/RTG performance upon impact with hard, unyielding surfaces are required to verify model predictions and to ensure the continual evolution of the RTG designs that perform safely under varied accident environments. The Proposed Action is to conduct impact testing of RTG sections containing GPHS modules with simulated fuel. End-On and Side-On impact test series are planned.

  4. Development and verification testing of automation and robotics for assembly of space structures (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.


    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  5. Verification of Emulated Channels in Multi-Probe Based MIMO OTA Testing Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; Nielsen, Jesper Ødum


    Standardization work for MIMO OTA testing methods is currently ongoing, where a multi-probe anechoic chamber based solution is an important candidate. In this paper, the probes located on an OTA ring are used to synthesize a plane wave field in the center of the OTA ring. This paper investigates...

  6. The test verification of 3D geodetic points and their changes

    Directory of Open Access Journals (Sweden)

    Vincent Jakub


    This paper presents approaches to congruency testing of 3D point field realisations using repeated measurements, investigates 3D point displacements in various spatial directions using test procedures, and discusses the determination of 3D point movements and their significance by means of confidence ellipsoids, together with their applications in practice.

  7. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    National Research Council Canada - National Science Library

    Hernandez, Wilmar; López-Presa, José; Maldonado-Correa, Jorge


    .... This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out...

  8. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    Before physically-meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely-trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation of user input. By providing rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.
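    The input-validation idea described above, checking each user-supplied value against an expected type and range and returning a helpful message immediately, can be sketched as follows. The card names, types, and bounds here are invented for illustration and are not NJOY21's actual input schema.

```python
# Minimal sketch of keyword-and-range input validation with helpful messages.
# SCHEMA maps a (hypothetical) input card name to (expected type, low, high).
SCHEMA = {
    "temperature": (float, 0.0, 1.0e4),   # Kelvin
    "ngroups":     (int,   1,   1000),    # number of energy groups
}

def validate(card, value):
    """Return None if the value is acceptable, else a human-readable error."""
    if card not in SCHEMA:
        return f"unknown card '{card}'; expected one of {sorted(SCHEMA)}"
    typ, lo, hi = SCHEMA[card]
    if not isinstance(value, typ):
        return f"card '{card}' expects {typ.__name__}, got {type(value).__name__}"
    if not lo <= value <= hi:
        return f"card '{card}' value {value} outside [{lo}, {hi}]"
    return None

print(validate("temperature", 293.6))   # None (valid)
print(validate("ngroups", 0))           # range violation message
```

Returning a specific message per failure mode, rather than aborting with a generic parse error, is what makes this style of validation feel "rapid and helpful" to the person writing the input file.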

  9. Verification of Overall Safety Factors In Deterministic Design Of Model Tested Breakwaters

    DEFF Research Database (Denmark)

    Burcharth, H. F.


    The paper deals with concepts of safety implementation in design. An overall safety factor concept is evaluated on the basis of a reliability analysis of a model tested rubble mound breakwater with monolithic super structure. Also discussed are design load identification and failure mode limit st...

  10. Test/QA Plan for Verification of Coliform Detection Technologies for Drinking Water (United States)

    The coliform detection technologies to be tested use chromogenic and fluorogenic growth media to detect coliforms and E. coli based on the enzymatic activity of these organisms. The systems consist of single-use sample containers that contain pre-measured reagents and can be u...

  11. Verification of nerve integrity after surgical intervention using quantitative sensory testing. (United States)

    Said-Yekta, Sareh; Smeets, Ralf; Esteves-Oliveira, Marcella; Stein, Jamal M; Riediger, Dieter; Lampert, Friedrich


    The aim of this study was to apply a standardized Quantitative Sensory Testing (QST) approach in patients to investigate whether oral surgery can lead to sensory changes, even if the patients do not report any sensory disturbances. Furthermore, this study determines the degree and duration of possible neuronal hyperexcitability due to local inflammatory trauma after oral surgery. Orofacial sensory functions were investigated by psychophysical means in 60 patients (30 male, 30 female) in the innervation areas of the infraorbital, mental, and lingual nerves after different interventions in oral surgery. The patients were tested 1 week, 4 weeks, 7 weeks, and 10 weeks postoperatively. As controls for bilateral sensory changes after unilateral surgery, tests were additionally performed in 20 volunteers who did not have any dental restorations. No differences were found between the control group and the control side of the patients. Although not one of the patients reported paresthesia or other sensory changes postoperatively, QST detected significant differences between the control and the test side in the mental and lingual regions. Test sides were significantly less sensitive for thermal parameters (cold, warm, and heat). No differences were found in the infraorbital region. Patients showed significantly decreased pain pressure thresholds on the operated side. QST monitored recovery over time in all patients. The results show that oral surgery can lead to sensory deficits in the mental and lingual region, even if the patients do not notice any sensory disturbances. The applied QST battery is a useful tool to investigate trigeminal nerve function in the early postoperative period. In light of the increasing forensic implications, this tool can serve to objectify clinical findings. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  12. Aerodynamics and performance verifications of test methods for laboratory fume cupboards. (United States)

    Tseng, Li-Ching; Huang, Rong Fung; Chen, Chih-Chieh; Chang, Cheng-Ping


    The laser-light-sheet-assisted smoke flow visualization technique is performed on a full-size, transparent, commercial grade chemical fume cupboard to diagnose the flow characteristics and to verify the validity of several current containment test methods. The visualized flow patterns identify the recirculation areas that inevitably exist in conventional fume cupboards because of their fundamental configurations and structures. Large-scale vortex structures exist around the side walls, the doorsill of the cupboard, and in the vicinity of the near-wake region of the manikin. The identified recirculation areas are taken as the 'dangerous' regions where the risk of turbulent dispersion of contaminants may be high. Several existing tracer gas containment test methods (BS 7258:1994, prEN 14175-3:2003 and ANSI/ASHRAE 110:1995) are conducted to verify the effectiveness of these methods in detecting contaminant leakage. By comparing the results of the flow visualization and the tracer gas tests, it is found that the local recirculation regions are more prone to contaminant leakage because of the complex interaction between the shear layers and the smoke movement through the mechanism of turbulent dispersion. From the point of view of aerodynamics, the present study verifies that the methodology of the prEN 14175-3:2003 protocol can produce more reliable and consistent results because it is based on region-by-region measurement and encompasses most of the area of the recirculation zone of the cupboard. A modified test method combining the region-by-region approach with the presence of the manikin shows substantially different containment results. A performance test method that better describes an operator's exposure and the correlation between flow characteristics and contaminant leakage properties is therefore suggested.

  13. Multi-level slug tests in highly permeable formations: 2. Hydraulic conductivity identification, method verification, and field applications (United States)

    Zlotnik, V.A.; McGuire, V.L.


    Using the developed theory and modified Springer-Gelhar (SG) model, an identification method is proposed for estimating hydraulic conductivity from multi-level slug tests. The computerized algorithm calculates hydraulic conductivity from both monotonic and oscillatory well responses obtained using a double-packer system. Field verification of the method was performed at a specially designed fully penetrating well of 0.1-m diameter with a 10-m screen in a sand and gravel alluvial aquifer (MSEA site, Shelton, Nebraska). During well installation, disturbed core samples were collected every 0.6 m using a split-spoon sampler. Vertical profiles of hydraulic conductivity were produced on the basis of grain-size analysis of the disturbed core samples. These results closely correlate with the vertical profile of horizontal hydraulic conductivity obtained by interpreting multi-level slug test responses using the modified SG model. The identification method was applied to interpret the response from 474 slug tests in 156 locations at the MSEA site. More than 60% of responses were oscillatory. The method produced a good match to experimental data for both oscillatory and monotonic responses using an automated curve matching procedure. The proposed method allowed us to drastically increase the efficiency of each well used for aquifer characterization and to process massive arrays of field data. Recommendations generalizing this experience to massive application of the proposed method are developed.
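    For readers unfamiliar with slug-test interpretation, the classical Hvorslev estimate for a *monotonic* head recovery gives the flavor of how a conductivity value is extracted from a well response; the study itself uses a modified Springer-Gelhar model, which additionally handles the oscillatory responses reported above. All numbers below are hypothetical.

```python
import math

def hvorslev_K(r_c, R_w, L_e, T0):
    """Classical Hvorslev slug-test estimate (not the modified SG model):
    K = r_c^2 * ln(L_e / R_w) / (2 * L_e * T0), SI units (m, s).

    r_c: casing radius, R_w: well radius, L_e: screen length,
    T0: basic time lag (time for the head to fall to ~37% of its
    initial displacement).
    """
    return r_c**2 * math.log(L_e / R_w) / (2.0 * L_e * T0)

# Hypothetical well: 0.05 m casing and well radius, 1 m screened interval,
# basic time lag of 10 s read off the semilog head-recovery plot.
K = hvorslev_K(r_c=0.05, R_w=0.05, L_e=1.0, T0=10.0)
print(f"{K:.2e} m/s")
```

An automated curve-matching procedure like the one in the study replaces the single time-lag reading with a fit of the full (possibly oscillatory) response curve, which is what makes processing hundreds of tests practical.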

  14. Radiological examinations of the anatomy of the inferior turbinate using digital volume tomography (DVT). (United States)

    Balbach, L; Trinkel, V; Guldner, C; Bien, S; Teymoortash, A; Werner, J A; Bremke, M


    Over the last 120 years there have been only a few descriptions of the anatomical dimensions of the inferior turbinate in the literature. Against this background, the current study evaluated the radiological dimensions of the inferior turbinate and the septum using DVT. The latest generation of the Accu-I-tomo was used, and the data of 100 adult patients were evaluated. The bony length was found to be 38.9 mm and the mucosal length 51.0 mm. The total mucosal thickness at different measuring points was between 8.1 mm and 10.9 mm, the bony thickness between 0.9 mm and 2.3 mm, and the bony height between 3.9 mm and 20.8 mm. The results of this radiological study highlight the importance of preoperative anatomical evaluation of radiological images. A preoperative focus on the individual anatomy is very important for the choice of an adequate surgical treatment. Today, new radiological techniques can help determine whether hypertrophied turbinates are caused by bone, mucosa, or both. This knowledge enables a targeted treatment concept.

  15. Automated particulate sampler for Comprehensive Test Ban Treaty verification (the DOE radionuclide aerosol sampler/analyzer) (United States)

    Bowyer, S. M.; Miley, H. S.; Thompson, R. C.; Hubbard, C. W.


    The Comprehensive Test Ban Treaty (CTBT) was recently signed by President Clinton and is intended to eliminate all nuclear weapons testing. One way which the treaty seeks to accomplish this is by the establishment of the International Monitoring System. As stated in the latest Working Papers of the Draft CTBT, "The International Monitoring System shall comprise facilities for seismological monitoring, radionuclide monitoring including certified laboratories, hydroacoustic monitoring, infrasound monitoring, and respective means of communication, and shall be supported by the International Data Centre of the Technical Secretariat". Radionuclide monitoring consists of both radionuclides associated with particulates and relevant noble gases. This type of monitoring is quite valuable since indications of a nuclear test in the form of radioactive particulate or radioactive noble gases may be detected at great distances from the detonation site. The system presented here is concerned only with radioactive particulate monitoring and is described as an automated sampler/analyzer which has been developed for the Department of Energy (DoE) at the Pacific Northwest National Laboratory (PNNL).

  16. Design of a Portable Test Facility for the ATLAS Tile Calorimeter Front-End Electronics Verification

    CERN Document Server

    Kim, HY; The ATLAS collaboration; Carrio, F; Moreno, P; Masike, T; Reed, R; Sandrock, C; Schettino, V; Shalyugin, A; Solans, C; Souza, J; Suter, R; Usai, G; Valero, A


    An FPGA-based motherboard with an embedded hardware processor is used to implement a portable test-bench for the full certification of Tile Calorimeter front-end electronics in the ATLAS experiment at CERN. This upgrade will also allow testing of future versions of the TileCal read-out electronics. Because of its light weight the new facility is highly portable, allowing on-detector validation using sophisticated algorithms. The new system comprises a front-end GUI running on an external portable computer which controls the motherboard. It also includes several dedicated daughter-boards that exercise the different specialized functionalities of the system. Apart from being used to evaluate different technologies for future upgrades, it will be used to certify the consolidation of the electronics by identifying low-frequency failures. The results of the tests presented here show that the new system is well suited for the 2013 ATLAS Long Shutdown. We discuss all requirements necessary to give full confidence...

  17. Cross-sectional imaging in dentomaxillofacial diagnostics: dose comparison of dental MSCT and NewTom® 9000 DVT; Schnittbildverfahren zur dentomaxillofazialen Diagnostik: Dosisvergleich von Dental-MSCT und NewTom® 9000 DVT

    Energy Technology Data Exchange (ETDEWEB)

    Coppenrath, E.; Meindl, T.; Reiser, M.; Mueller-Lisse, U. [Inst. fuer Klinische Radiologie, Ludwig-Maximilians-Univ. Muenchen (Germany)]; Draenert, F. [Mund- Kiefer- und Gesichtschirurgie, Ludwig-Maximilians-Univ. Muenchen (Germany)]; Lechel, U.; Veit, R. [Bundesamt fuer Strahlenschutz, Neuherberg (Germany)]


    Purpose: for nonsuperimposed, three-dimensional imaging of the jaws and teeth, multislice computed tomography (MSCT) can be performed, or alternatively digital volume tomography (DVT) as a cone beam technique can be applied. The radiation dose of both procedures was evaluated with different methods of dose assessment. Materials and methods: a 4-row MSCT (Volume Zoom, Siemens®) and a cone beam CT (NewTom QR-DVT 9000®) were compared regarding the radiation exposure of the patient during a dental examination. Organ dose and effective dose were estimated by thermoluminescence dosimetry (TLD) using an Alderson-Rando phantom for both devices. In addition, the effective dose of MSCT was calculated from the CTDIvol value at the scanner display and by the CT-Expo® program. Results: the effective dose of MSCT was 0.33 mSv for women (w) and 0.32 mSv for men (m) as measured with TLD in the Alderson-Rando phantom, 0.39/0.35 mSv (w/m) by CTDI calculation, and 0.39/0.33 mSv by the CT-Expo® program. The effective dose of the NewTom® QR-DVT 9000 from TLD measurement was 0.095/0.093 mSv (w/m). Conclusion: the radiation exposure of a typical dental examination with a NewTom® cone beam DVT is about one third of the MSCT dose. Both techniques, however, involve only moderate patient doses. Dosimetry methods as routinely used for MSCT cannot be applied to cone beam DVT. (orig.)
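    The "about one third" conclusion follows directly from the TLD effective doses quoted in the abstract; a two-line check reproduces the ratio (only the dose values from the abstract are used here).

```python
# Effective doses from TLD measurement (mSv), women/men, as reported above.
msct = {"w": 0.33, "m": 0.32}
dvt = {"w": 0.095, "m": 0.093}

# DVT-to-MSCT dose ratio per sex; both come out close to 1/3,
# consistent with "about one third of the MSCT dose".
ratios = {sex: dvt[sex] / msct[sex] for sex in msct}
print({sex: round(r, 2) for sex, r in ratios.items()})
```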

  18. Quantitative ultrasonic testing of acoustically anisotropic materials with verification on austenitic and dissimilar weld joints (United States)

    Boller, C.; Pudovikov, S.; Bulavinov, A.


    Austenitic stainless steel materials are widely used in a variety of industry sectors. In particular, the material is qualified to meet the design criteria of high quality in safety-related applications. For example, the primary loop of most of the nuclear power plants in the world is made of this material because of its high durability and corrosion resistance. Certain operating conditions may cause a range of changes in the integrity of the component, and therefore require nondestructive testing at reasonable intervals. These in-service inspections are often performed using ultrasonic techniques, in particular when cracking is of specific concern. However, the coarse, dendritic grain structure of the weld material, formed during the welding process, is extremely and unpredictably anisotropic. Such a structure is no longer direction-independent with respect to ultrasonic wave propagation; the ultrasonic beam deflects and redirects, and the wave front becomes distorted. Thus, the use of conventional ultrasonic testing techniques using fixed beam angles is very limited, and the application of ultrasonic Phased Array techniques becomes desirable. The "Sampling Phased Array" technique, invented and developed by Fraunhofer IZFP, allows the acquisition of time signals (A-scans) for each individual transducer element of the array, along with fast image reconstruction techniques based on synthetic focusing algorithms. The reconstruction considers the sound propagation from each image pixel to the individual sensor element. For anisotropic media, where the sound beam is deflected and the sound path is not known a priori, a novel phase adjustment technique called "Reverse Phase Matching" is implemented. By taking into account the anisotropy and inhomogeneity of the weld structure, a ray tracing algorithm for modeling the acoustic wave propagation and calculating the sound propagation time is applied. This technique can be utilized for 2D and 3D real-time image reconstruction.
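    The synthetic focusing at the heart of such reconstructions can be sketched for the simple isotropic case: delay-and-sum each pixel's contributions from every transmit/receive element pair. Everything below (element count, pitch, sound speed, reflector position, pulse shape) is an illustrative assumption, not data from the IZFP system, and the anisotropic-weld case would additionally need the ray-traced travel times that Reverse Phase Matching provides.

```python
import numpy as np

c = 5900.0          # assumed longitudinal sound speed in steel, m/s
fs = 50e6           # assumed sampling rate, Hz
n_elem = 16
pitch = 0.6e-3      # assumed element pitch, m
elems = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch  # element x-positions at z=0

target = np.array([1.0e-3, 20e-3])   # hypothetical point reflector (x, z) in m

# Synthesize full-matrix-capture A-scans: one Gaussian echo per tx/rx pair
n_t = 2048
t = np.arange(n_t) / fs
fmc = np.zeros((n_elem, n_elem, n_t))

def dist(x_e):
    return np.hypot(target[0] - x_e, target[1])

for tx in range(n_elem):
    for rx in range(n_elem):
        tof = (dist(elems[tx]) + dist(elems[rx])) / c
        fmc[tx, rx] = np.exp(-((t - tof) * 5e6) ** 2)  # envelope-only pulse

# Synthetic focusing: for each pixel, sum every A-scan at that pixel's
# round-trip travel time (isotropic straight-ray times only)
xs = np.linspace(-4e-3, 4e-3, 81)
zs = np.linspace(15e-3, 25e-3, 101)
image = np.zeros((zs.size, xs.size))
for iz, z in enumerate(zs):
    for ix, x in enumerate(xs):
        d = np.hypot(x - elems, z)            # pixel-to-element distances
        tof = (d[:, None] + d[None, :]) / c   # tx/rx round-trip times
        idx = np.clip((tof * fs).astype(int), 0, n_t - 1)
        image[iz, ix] = fmc[np.arange(n_elem)[:, None],
                            np.arange(n_elem)[None, :], idx].sum()

iz, ix = np.unravel_index(np.argmax(image), image.shape)
print(f"peak at x={xs[ix]*1e3:.1f} mm, z={zs[iz]*1e3:.1f} mm")
```

    The image maximum lands at the simulated reflector, which is the sanity check one would run before swapping the straight-ray travel times for ray-traced anisotropic ones.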

  19. Raven's progressive matrices test: scale construction and verification of "Flynn effect"


    Lopetegui, María Susana; Neer, Rosa Haydée; Rossi Casé, Lilia Elba


    In this paper, the scales of Raven's Progressive Matrices Test, General Scale and Advanced Scale, Series II, for the student population (third cycle of EGB and Polimodal) in the city of La Plata are presented. Considerations are made as regards both the increase in scores (Flynn effect) observed in relation to the previous scale (1964) and the different mean scores according to two age groups (13-16 and 17-18 years of age) and education mode. The findings enabled inferences related to the si...

  20. [Paracoagulation tests in laboratory verification of intravascular coagulation. Comparative clinical evaluation of the SDPS test and of the ethanol test with serial dilution (SD ethanol test)]. (United States)

    Doni, A; De Simonis, S E; Pasquale, G


    The authors note that the laboratory diagnosis of intravascular coagulation (I.C.) remains an unresolved methodological problem. The presence in circulating blood of certain molecules (fdp) deriving from the alteration of fibrinogen metabolism that underlies any intravascular coagulation syndrome can be detected by immunological methods. Paracoagulation tests, by contrast, detect fibrin monomers and fibrin degradation products (fdp) that are still clottable (nonenzymatic coagulation). Their positivity therefore constitutes indirect evidence of the action of thrombin on fibrinogen and comes closer to the pathogenetic event that is the real basis of any I.C. syndrome. The authors compare the two principal paracoagulation tests, the S.D.P.S. test according to Gurewich (1971 [7]) and the ethanol gelation test according to Godal (1966 [6]), whose details are given. The two tests were performed in parallel in 589 cases with clinical evidence or suspicion of I.C., mostly supported by the subsequent clinical course of the patients. Both tests were positive in only 164 cases (27.8%); both were non-positive in 109 cases (18.6%); the S.D.P.S. test alone was positive in 306 cases (51.9%); Godal's test alone was positive in 10 cases (1.7%). These data, supported by the clinical course, lead the authors to conclude that the S.D.P.S. test is more sensitive than Godal's test. The authors hypothesize that the S.D.P.S. test owes its greater sensitivity to its serial-dilution method, which achieves an optimal ratio between the paracoagulant agent and the clottable (paracoagulable) molecules. They therefore modify Godal's test by applying the serial-dilution principle to it as well; details are given. This new serial-dilution ethanol gelation test was then performed in parallel with the S.D.P.S. test in 314 cases with clinical evidence or suspicion of I.C., mostly supported by the subsequent clinical course of

  1. Verification of Compartmental Epidemiological Models using Metamorphic Testing, Model Checking and Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, Arvind [ORNL]; Steed, Chad A [ORNL]; Pullum, Laura L [ORNL]


    Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in the event of a widespread outbreak. However, a significant challenge within the community is the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios, including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
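    A minimal illustration of metamorphic testing applied to a compartmental model (a sketch of the general idea, not the authors' workflow): even without an oracle for the exact epidemic size, one can assert a metamorphic relation such as "raising the transmission rate must not shrink the final epidemic size". All parameter values below are arbitrary.

```python
def sir_final_size(beta, gamma=0.1, s0=0.99, i0=0.01, dt=0.05, steps=20000):
    """Integrate a basic SIR model (forward Euler) and return 1 - S_inf,
    the final epidemic size as a fraction of the population."""
    s, i = s0, i0
    for _ in range(steps):
        new_inf = beta * s * i * dt   # S -> I flow over one step
        rec = gamma * i * dt          # I -> R flow over one step
        s -= new_inf
        i += new_inf - rec
    return 1.0 - s

# Metamorphic relation MR1: final size is non-decreasing in the
# transmission rate beta. No knowledge of the true values is needed.
betas = [0.15, 0.2, 0.3, 0.5]
sizes = [sir_final_size(b) for b in betas]
assert all(a <= b + 1e-9 for a, b in zip(sizes, sizes[1:])), "MR1 violated"
print([round(s, 3) for s in sizes])
```

    A buggy implementation (e.g. a sign error in the recovery term) would typically break the monotonicity and trip the assertion, which is the appeal of metamorphic relations when no ground-truth output exists.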

  2. Review of waste package verification tests. Semiannual report, October 1984-March 1985

    Energy Technology Data Exchange (ETDEWEB)

    Soo, P. (ed.)


    The potential of WAPPA, a second-generation waste package system code, to meet the needs of the regulatory community is analyzed. The analysis includes an in-depth review of WAPPA's individual process models and a review of WAPPA's operation. It is concluded that the code is of limited use to the NRC in its present form. Recommendations for future improvement, usage, and implementation of the code are given. This report also describes the results of a testing program undertaken to determine the chemical environment that will be present near a high-level waste package emplaced in a basalt repository. For this purpose, low-carbon 1020 steel (a current BWIP reference container material), synthetic basaltic groundwater and a mixture of bentonite and basalt were exposed, in an autoclave, to expected conditions some period after repository sealing (150 °C, ≈10.4 MPa). Parameters measured include changes in gas pressure with time and gas composition, variation in dissolved oxygen (DO), pH and certain ionic concentrations of water in the packing material across an imposed thermal gradient, mineralogic alteration of the basalt/bentonite mixture, and carbon steel corrosion behavior. A second testing program was also initiated to check the likelihood of stress corrosion cracking of austenitic stainless steels and Incoloy 825, which are being considered for use as waste container materials in the tuff repository program. 82 refs., 70 figs., 27 tabs.

  3. Verification of Ares I Liftoff Acoustic Environments via the Ares I Scale Model Acoustic Test (United States)

    Counter, Douglas D.; Houston, Janice D.


    Launch environments, such as Liftoff Acoustic (LOA) and Ignition Overpressure (IOP), are important design factors for any vehicle and are dependent upon the design of both the vehicle and the ground systems. The NASA Constellation Program had several risks to the development of the Ares I vehicle linked to LOA which are used in the development of the vibro-acoustic environments. The risks included cost, schedule and technical impacts for component qualification due to high predicted vibro-acoustic environments. One solution is to mitigate the environment at the component level. However, where the environment is too severe to mitigate at the component level, reduction of the launch environments is required. The Ares I Scale Model Acoustic Test (ASMAT) program was implemented to verify the predicted Ares I launch environments and to determine the acoustic reduction for the LOA environment with an above deck water sound suppression system. The test article included a 5% scale Ares I vehicle model, tower and Mobile Launcher. Acoustic and pressure data were measured by approximately 200 instruments. The ASMAT results are compared to the Ares I LOA predictions and water suppression effectiveness results are presented.


    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Phillip A.; O' Hagan, Ryan; Shumaker, Brent; Hashemian, H. M.


    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, calibration verification and response-time testing of these transmitters and sensors are performed remotely, automatically, and hands-off, cover more portions of the system, and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response-time testing of its temperature and pressure sensors and refocus those resources toward equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also support the I&C aging management goals of the Department of Energy and the long-term operation of ATR.

  6. Verification test for helium panel of cryopump for DIII-D advanced divertor

    Energy Technology Data Exchange (ETDEWEB)

    Baxi, C.B.; Laughon, G.J.; Langhorn, A.R.; Schaubel, K.M.; Smith, J.P.; Gootgeld, A.M.; Campbell, G.L. (General Atomics, San Diego, CA (United States)); Menon, M.M. (Oak Ridge National Lab., TN (United States))


    It is planned to install a cryogenic pump in the lower divertor region of the DIII-D tokamak with a pumping speed of 50,000 l/s and an exhaust of 2670 Pa·l/s (20 Torr·l/s). A coaxial counterflow configuration has been chosen for the helium panel of this cryogenic pump. This paper evaluates cooldown rates and fluid stability of this configuration. A prototypic test was performed at General Atomics (GA) to increase confidence in the design. It was concluded that the helium panel cooldown rate agreed well with analytical prediction and was within acceptable limits. The design flow rate proved stable, and the two-phase pressure drop can be predicted quite accurately. 8 refs., 5 figs., 1 tab.

  7. Adaptive support for aircraft panel testing: New method and its experimental verification on a beam structure (United States)

    Sachau, Delf; Baschke, Manuel


    Acoustic transmissibility of aircraft panels is measured in full-scale test rigs. The panels are supported at their frames. These boundary conditions do not take into account the dynamic influence of the fuselage, which is significant in the frequency range below 300 Hz. This paper introduces a new adaptive boundary system (ABS). It combines accelerometers and electrodynamic shakers with real-time signal processing. The ABS accounts for the dynamic effect of the fuselage on the panel. The frames dominate the dynamic behaviour of the fuselage in the low-frequency range. Therefore, the new method is applied to a beam representing a frame of the aircraft structure. The experimental results are evaluated and the precision of the ABS is discussed. The theoretical apparent mass representing the cut-off part of a frame is calculated and compared with the apparent mass as provided by the ABS. It is explained how the experimental set-up limits the precision of the ABS.

  8. Mass-additive modal test method for verification of constrained structural models (United States)

    Admire, John R.; Tinker, Michael L.; Ivey, Edward W.


    A method for deriving constrained or fixed-base modes and frequencies from free-free modes of a structure with mass-loaded boundaries is developed. Problems associated with design and development of test fixtures can be avoided with such an approach. The analytical methodology presented is used to assess applicability of the mass-additive method for three types of structures and to determine the accuracy of derived constrained modes and frequencies. Results show that mass loading of the boundaries enables local interface modes to be measured within a desired frequency bandwidth, thus allowing constrained modes to be derived with considerably fewer free-free modes than for unloaded boundaries. Good convergence was obtained for a simple beam and a truss-like Shuttle payload, both of which had well-spaced modes and stiff interface support structures. Slow convergence was obtained for a space station module prototype, a shell-like structure having high modal density.

  9. Verification of a Proposed Clinical Electroacoustic Test Protocol for Personal Digital Modulation Receivers Coupled to Cochlear Implant Sound Processors. (United States)

    Nair, Erika L; Sousa, Rhonda; Wannagot, Shannon

    Guidelines established by the AAA currently recommend behavioral testing when fitting frequency-modulated (FM) systems to individuals with cochlear implants (CIs). A protocol for completing electroacoustic measures has not yet been validated for personal FM systems or digital modulation (DM) systems coupled to CI sound processors. In response, some professionals have used or adapted the AAA electroacoustic verification steps for fitting FM systems to hearing aids when fitting FM systems to CI sound processors. More recently, steps were outlined in a proposed protocol. The purpose of this research is to review and compare the electroacoustic test measures outlined in a 2013 article by Schafer and colleagues in the Journal of the American Academy of Audiology, titled "A Proposed Electroacoustic Test Protocol for Personal FM Receivers Coupled to Cochlear Implant Sound Processors", to the AAA electroacoustic verification steps for fitting FM systems to hearing aids, when fitting DM systems to CI users. Electroacoustic measures were conducted on 71 CI sound processors and Phonak Roger DM systems using the proposed protocol and an adapted AAA protocol. Phonak's recommended default receiver gain setting was used for each CI sound processor manufacturer and adjusted if necessary to achieve transparency. Electroacoustic measures were conducted on Cochlear and Advanced Bionics (AB) sound processors. In this study, 28 Cochlear Nucleus 5/CP810 sound processors, 26 Cochlear Nucleus 6/CP910 sound processors, and 17 AB Naida CI Q70 sound processors were coupled in various combinations to Phonak Roger DM dedicated receivers (25 Phonak Roger 14 receivers, the Cochlear dedicated receiver, and 9 Phonak Roger 17 receivers, the AB dedicated receiver) and 20 Phonak Roger Inspiro transmitters. Employing both the AAA and the Schafer et al. protocols, electroacoustic measurements were conducted with the Audioscan Verifit in a clinical setting on 71 CI sound processors and Phonak Roger DM systems to

  10. [Verification of the double dissociation model of shyness using the implicit association test]. (United States)

    Fujii, Tsutomu; Aikawa, Atsushi


    The "double dissociation model" of shyness proposed by Asendorpf, Banse, and Mücke (2002) was demonstrated in Japan by Aikawa and Fujii (2011). However, the generalizability of the double dissociation model of shyness was uncertain. The present study examined whether the results reported in Aikawa and Fujii (2011) would be replicated. In Study 1, college students (n = 91) completed explicit self-ratings of shyness and other personality scales. In Study 2, forty-eight participants completed the Implicit Association Test (IAT) for shyness, and their friends (n = 141) rated those participants on various personality scales. The results revealed that only the explicit self-concept ratings predicted other-rated low praise-seeking behavior, sociable behavior and high rejection-avoidance behavior (controlled shy behavior). Only the implicit self-concept measured by the shyness IAT predicted other-rated high interpersonal tension (spontaneous shy behavior). The results of this study are similar to the findings of the previous research, which supports the generalizability of the double dissociation model of shyness.

  11. SE and I system testability: The key to space system FDIR and verification testing (United States)

    Barry, Thomas; Scheffer, Terrance; Small, Lynn R.; Monis, Richard


    The key to implementing self-diagnosing design is a systems engineering task focused on design for testability concurrent with design for functionality. The design-for-testability process described here is the product of several years of DOD study and experience. Its application to the space station has begun on Work Package II under NASA and McDonnell direction. Other work package teams are being briefed by Harris Corporation with the hope of convincing them to embrace the process. For the purpose of this discussion, the term testability describes the systems engineering process by which designers can assure themselves and their reviewers that their designs are testable, that is, that they will support the downstream process of determining their functionality. Due to the complexity and density of present-day state-of-the-art designs, such as pipeline processors and high-speed integrated-circuit technology, testability feature design is a critical requirement of the functional design process. A systematic approach to space systems test and checkout, as well as fault detection, fault isolation, and reconfiguration (FDIR), will minimize operational costs and maximize operational efficiency. An effective design-for-testability program must be implemented by all contractors to ensure meeting this objective. The process is well understood, and the technology is here to support it.

  12. Verification Test of the SURF and SURFplus Models in xRage: Part II

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]


    The previous study used an underdriven detonation wave (steady ZND reaction zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a validation test of the implementation of the SURF and SURFplus models in the xRage code. Even with a fairly fine uniform mesh (12,800 cells for 100 mm) the detonation wave profile had limited resolution due to the thin reaction zone width (0.18 mm) for the fast SURF burn rate. Here we study the effect of finer resolution by comparing results of simulations with cell sizes of 8, 2 and 1 μm, which correspond to 25, 100 and 200 points within the reaction zone. With finer resolution the lead shock pressure is closer to the von Neumann spike pressure, and there is less noise in the rarefaction wave due to fluctuations within the reaction zone. As a result the average error decreases. The pointwise error is still dominated by the smearing of the pressure kink in the vicinity of the sonic point, which occurs at the end of the reaction zone.

  13. Test-retest reliability of probe-microphone verification in children fitted with open and closed hearing aid tips. (United States)

    Kim, Hannah; Ricketts, Todd A


    To investigate the test-retest reliability of real-ear aided response (REAR) measures in open and closed hearing aid fittings in children using appropriate probe-microphone calibration techniques (stored equalization for open fittings and concurrent equalization for closed fittings). Probe-microphone measurements were completed for two mini-behind-the-ear (BTE) hearing aids which were coupled to the ear using open and closed eartips via thin (0.9 mm) tubing. Before probe-microphone testing, the gain of each of the test hearing aids was programmed using an artificial ear simulator (IEC 711) and a Knowles Electronic Manikin for Acoustic Research to match the National Acoustic Laboratories-Non-Linear, version 1 targets for one of two separate hearing loss configurations using an Audioscan Verifit. No further adjustments were made, and the same amplifier gain was used within each hearing aid across both eartip configurations and all participants. Probe-microphone testing included real-ear occluded response (REOR) and REAR measures using the Verifit's standard speech signal (the carrot passage) presented at 65 dB sound pressure level (SPL). Two repeated probe-microphone measures were made for each participant with the probe-tube and hearing aid removed and repositioned between each trial in order to assess intrasubject measurement variability. These procedures were repeated using both open and closed domes. Thirty-two children, ages ranging from 4 to 14 yr. The test-retest standard deviations for open and closed measures did not exceed 4 dB at any frequency. There was also no significant difference between the open (stored equalization) and closed (concurrent equalization) methods. Reliability was particularly similar in the high frequencies and was also quite similar to that reported in previous research. There was no correlation between reliability and age, suggesting high reliability across all ages evaluated. 
The findings from this study suggest that reliable probe
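    The reliability statistic reported above can be illustrated with synthetic data (the REAR values, subject count and placement-noise level below are invented, not the study's measurements). One common definition of test-retest SD is the standard deviation of within-subject differences between the two trials at each frequency:

```python
import numpy as np

rng = np.random.default_rng(0)
freqs = [250, 500, 1000, 2000, 4000, 8000]   # Hz
n_subjects = 32

# Hypothetical REAR values (dB SPL): trial 2 repeats trial 1 plus
# probe-tube/hearing-aid replacement noise (assumed ~1.5 dB SD)
trial1 = 70 + 5 * rng.standard_normal((n_subjects, len(freqs)))
trial2 = trial1 + 1.5 * rng.standard_normal((n_subjects, len(freqs)))

# Test-retest SD per frequency: SD of within-subject trial differences
diff = trial2 - trial1
trt_sd = diff.std(axis=0, ddof=1)
for f, sd in zip(freqs, trt_sd):
    print(f"{f:>5} Hz: test-retest SD = {sd:.2f} dB")
```

    With a replacement-noise SD below about 4 dB, every frequency stays under the 4 dB ceiling the study reports, which is the kind of check such a table supports.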

  14. Addressing the sociotechnical drivers of quality improvement: a case study of post-operative DVT prophylaxis computerised decision support. (United States)

    Lesselroth, Blake J; Yang, Jianji; McConnachie, Judy; Brenk, Thomas; Winterbottom, Lisa


    Quality improvement (QI) initiatives characterised by iterative cycles of quantitative data analysis do not readily explain the organisational determinants of change. However, the integration of sociotechnical theory can inform more effective strategies. Our specific aims were to (1) describe a computerised decision support intervention intended to improve adherence with deep venous thrombosis (DVT) prophylaxis recommendations; and (2) show how sociotechnical theory, expressed in the 'Fit between Individuals, Task and Technology' (FITT) framework, can identify and clarify the facilitators and barriers to QI work. A multidisciplinary team developed and implemented electronic menus with DVT prophylaxis recommendations. Stakeholders were interviewed and human factors were analysed to optimise integration. Menu exposure, order placement and clinical performance were measured, using VistA data extraction and chart review. Performance compliance pre-implementation was 77%. There were 80-110 eligible cases per month. The initial menu use rate was 20%; after barriers were classified and addressed using the FITT framework, use improved to 50%-90%. Task, user and technology issues in the FITT model, and their interfaces, were identified and addressed. Workflow styles, concerns about the validity of guidelines, cycle times and the perceived ambiguity of risk were among the issues identified. DVT prophylaxis in a surgical setting is fraught with socio-political agendas, cognitive dissonance and misaligned expectations. These must be sought and articulated if organisations are to respond to internal resistance to change. This case study demonstrates that QI teams using information technology must understand the clinical context, even in mature electronic health record environments, in order to implement sustainable systems.

  15. Verification of Ceramic Structures (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit


    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
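    The Weibull transfer from elementary coupon data to a full-scale structure mentioned above is commonly done with weakest-link scaling. A minimal sketch, with all parameter values (modulus, characteristic strength, areas, reliability target) assumed for illustration rather than taken from the guideline:

```python
import math

# Hypothetical Weibull parameters from coupon bend tests (assumed values)
m = 10.0            # Weibull modulus
sigma0 = 300.0      # characteristic strength, MPa, at coupon area A0
A0 = 1.0e-4         # coupon stressed area, m^2

A = 0.5             # full-scale structure stressed area, m^2
Pf_target = 1e-4    # required probability of failure

# Weakest-link (area) scaling: Pf = 1 - exp(-(A/A0) * (sigma/sigma0)^m)
# Inverting for the allowable stress at the target reliability:
sigma_allow = sigma0 * (-math.log(1.0 - Pf_target) * A0 / A) ** (1.0 / m)
print(f"allowable design stress ~ {sigma_allow:.1f} MPa")  # ~51 MPa here
```

    The large stress knock-down relative to the coupon strength is exactly why the guideline insists on a consistent elementary-test database and on proof testing at full scale.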

  16. Laboratory testing and performance verification of the CHARIS integral field spectrograph (United States)

    Groff, Tyler D.; Chilcote, Jeffrey; Kasdin, N. Jeremy; Galvin, Michael; Loomis, Craig; Carr, Michael A.; Brandt, Timothy; Knapp, Gillian; Limbach, Mary Anne; Guyon, Olivier; Jovanovic, Nemanja; McElwain, Michael W.; Takato, Naruhisa; Hayashi, Masahiko


    delivered to the Subaru telescope in April 2016. This paper is a report on the laboratory performance of the spectrograph, and its current status in the commissioning process so that observers will better understand the instrument capabilities. We will also discuss the lessons learned during the testing process and their impact on future high-contrast imaging spectrographs for wavefront control.

  17. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)


    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  18. Predictive permeability model of faults in crystalline rocks; verification by joint hydraulic factor (JH) obtained from water pressure tests (United States)

    Barani, Hamidreza Rostami; Lashkaripour, Gholamreza; Ghafoori, Mohammad


    In the present study, a new model is proposed to predict the permeability per fracture in fault zones using a new parameter named the joint hydraulic factor (JH). JH is obtained from the Water Pressure Test (WPT) and modified by the degree of fracturing. The JH results correspond with quantitative fault zone descriptions, qualitative fracture characteristics, and fault rock properties. In this respect, a case study was done based on data collected from the Seyahoo dam site located in the east of Iran to provide a permeability prediction model for fault zone structures. Datasets including scan-lines, drill cores, and water pressure tests in andesite and basalt rocks were used to analyse the variability of in-situ relative permeability from fault zones to host rocks. The rock mass joint permeability quality, therefore, is defined by the JH. JH data analysis showed that the background sub-zone was commonly fractured, whereas the fault core had permeability characteristics nearly as low as the outer damage zone, represented by 8 Lu (1.3 × 10⁻⁴ m³/s) per fracture, with occasional peaks towards 12 Lu (2.0 × 10⁻⁴ m³/s) per fracture. The maximum JH value belongs to the inner damage zone, marginal to the fault core, with 14-22 Lu (2.3 × 10⁻⁴ to 3.6 × 10⁻⁴ m³/s) per fracture, locally exceeding 25 Lu (4.1 × 10⁻⁴ m³/s) per fracture. This gives an approximate 1:4:2 proportional relationship for JH between the fault core, inner damage zone, and outer damage zone of extensional fault zones in crystalline rocks. The results of the verification exercise revealed that the new approach is efficient and that the JH parameter is a reliable scale for fracture permeability change. It can be concluded that using short-duration hydraulic tests (WPTs) and fracture frequency (FF) to calculate the JH parameter makes it possible to describe a complex situation and to compare, discuss, and weigh the hydraulic quality so as to make predictions as to the permeability models and
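    The flow-rate equivalents quoted alongside the Lugeon values can be checked directly, since 1 Lu corresponds to a water take of 1 litre per minute (per metre of test interval at 1 MPa excess pressure). This sketch covers only that unit conversion, not the JH computation itself:

```python
# 1 Lu as a volumetric rate: 1 L/min = 1e-3 m^3 / 60 s ~ 1.67e-5 m^3/s
LU_TO_M3_PER_S = 1.0e-3 / 60.0

for lu in (8, 12, 14, 22, 25):
    q = lu * LU_TO_M3_PER_S
    print(f"{lu:>2} Lu per fracture ~ {q:.1e} m^3/s")
```

    The results reproduce the abstract's figures to within rounding (e.g. 8 Lu gives 1.3 × 10⁻⁴ m³/s), which confirms the per-fracture rates are straight Lugeon conversions.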

  19. Testing and Demonstrating Speaker Verification Technology in Iraqi-Arabic as Part of the Iraqi Enrollment Via Voice Authentication Project (IEVAP) in Support of the Global War on Terrorism (GWOT)

    National Research Council Canada - National Science Library

    Withee, Jeffrey W; Pena, Edwin D


    This thesis documents the findings of an Iraqi-Arabic language test and concept of operations for speaker verification technology as part of the Iraqi Banking System in support of the Iraqi Enrollment...


    Energy Technology Data Exchange (ETDEWEB)



    Active well coincidence counter assays have been performed on uranium metal highly enriched in ²³⁵U. The data obtained in the present program, together with highly enriched uranium (HEU) metal data obtained in other programs, have been analyzed using two approaches, the standard approach and an alternative approach developed at BNL. Analysis of the data with the standard approach revealed that the form of the relationship between the measured reals and the ²³⁵U mass varied, being sometimes linear and sometimes a second-order polynomial. In contrast, application of the BNL algorithm, which takes into consideration the totals, consistently yielded linear relationships between the totals-corrected reals and the ²³⁵U mass. The constants in these linear relationships varied with geometric configuration and level of enrichment. This indicates that, when the BNL algorithm is used, calibration curves can be established with fewer data points and with more certainty than if a standard algorithm is used. However, this potential advantage has only been established for assays of HEU metal. In addition, the method is sensitive to the stability of natural background in the measurement facility.
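    The linear calibration that the BNL algorithm yields can be sketched as an ordinary least-squares fit followed by inversion to assay an unknown item. The calibration points below are invented for illustration; in practice they would come from measured HEU standards of known mass:

```python
import numpy as np

# Hypothetical calibration points: (235U mass in g, totals-corrected reals rate)
mass = np.array([250.0, 500.0, 1000.0, 2000.0, 4000.0])
reals_corr = np.array([31.0, 60.5, 119.0, 241.0, 482.0])  # counts/s, illustrative

# Fit the linear relationship reals_corr = a * mass + b by least squares
A = np.vstack([mass, np.ones_like(mass)]).T
(a, b), *_ = np.linalg.lstsq(A, reals_corr, rcond=None)
print(f"slope a = {a:.4f} (counts/s)/g, intercept b = {b:.2f} counts/s")

# Invert the calibration to estimate the mass of an unknown item
unknown_rate = 180.0   # totals-corrected reals for the unknown, counts/s
mass_est = (unknown_rate - b) / a
print(f"estimated 235U mass ~ {mass_est:.0f} g")
```

    A linear relationship needs only two well-chosen standards in principle; the abstract's point is that the standard (uncorrected-reals) analysis sometimes needs a second-order polynomial and hence more calibration measurements.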


    EPA's Environmental Technology Verification program is designed to further environmental protection by accelerating the acceptance and use of improved and cost effective technologies. This is done by providing high-quality, peer reviewed data on technology performance to those in...

  2. Safety of a DVT chemoprophylaxis protocol following traumatic brain injury: a single center quality improvement initiative. (United States)

    Nickele, Christopher M; Kamps, Timothy K; Medow, Joshua E


    significant deep venous thrombosis (DVT) was 6.9 % (6 of 87). Three protocol patients (3.45 %) went to the operating room for surgery after the initiation of PTP; none of these patients had a measurable change in hemorrhage size on head CT. The protocol significantly changed behavior (p ...), increasing the percentage of patients receiving PTP during their hospitalization; whether long-term patient outcomes are affected is a potential goal for future study.

  3. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft (United States)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John


    Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete, and when FAA certification officials agree it is complete. Certification of adaptive flight control software verification is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper will present the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  4. Verification Games: Crowd-Sourced Formal Verification (United States)



  5. Density conversion factor determined using a cone-beam computed tomography unit NewTom QR-DVT 9000. (United States)

    Lagravère, M O; Fang, Y; Carey, J; Toogood, R W; Packota, G V; Major, P W


    The purpose of this study was to determine a conversion coefficient for Hounsfield Units (HU) to material density (g cm(-3)) obtained from cone-beam computed tomography (CBCT-NewTom QR-DVT 9000) data. Six cylindrical models of materials with different densities were made and scanned using the NewTom QR-DVT 9000 Volume Scanner. The raw data were converted into DICOM format and analysed using Merge eFilm and AMIRA to determine the HU of different areas of the models. There was no significant difference (P = 0.846) between the HU given by each piece of software. A linear regression was performed using the density, rho (g cm(-3)), as the dependent variable in terms of the HU (H). The regression equation obtained was rho = 0.002H - 0.381 with an R² value of 0.986. The standard error of the estimate is 27.104 HU for the Hounsfield Units and 0.064 g cm(-3) for the density. CBCT provides an effective option for determination of material density expressed as Hounsfield Units.
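    Applied as a conversion function, the regression reported above looks like this. The coefficients are taken from the abstract; the function name is ours.

```python
def hu_to_density(hu: float) -> float:
    """Material density (g/cm^3) estimated from a NewTom QR-DVT 9000
    Hounsfield Unit value, using the reported regression
    rho = 0.002*H - 0.381 (R^2 = 0.986)."""
    return 0.002 * hu - 0.381

# Example: a voxel reading of 1000 HU maps to about 1.62 g/cm^3.
```

    Note the quoted standard errors (27.104 HU, 0.064 g cm(-3)) bound the uncertainty of any single converted value.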

  6. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)


    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  7. The construction of environments for development of test and verification technology -The development of advanced instrumentation and control technology-

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Shick; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Kim, Jae Hee; Lee, Chang Soo [Korea Atomic Energy Res. Inst., Taejon (Korea, Republic of)


    Several problems were identified in digitalizing the I and C systems of NPPs. To resolve them, the scheme is divided into hardware and software. Hardware verification and validation analyzed common mode failure, the commercial grade dedication process, and electromagnetic compatibility. We reviewed codes and standards to establish consensus criteria among vendors and licensers. We then described the software licensing procedures under 10 CFR 50 and 10 CFR 52 of the United States Nuclear Regulatory Commission (NRC) and presented vendors' approaches to coping with the licensing barrier. Finally, we surveyed the technical issues related to developing and licensing high-integrity software for digital I and C systems. (Author).

  8. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin


    Full Text Available This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. In the review of static analysis, the deductive method and model checking are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of testing techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, along with the kinds of tools that can be applied to software when using dynamic analysis. 
Based on this work, a conclusion is drawn describing the most relevant problems of the analysis techniques, methods for their solution, and
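    To make the static/dynamic distinction discussed above concrete, here is a toy example (the function and inputs are ours, for illustration only): dynamic analysis runs the code on concrete inputs, one execution path at a time, while a symbolic executor enumerates path conditions and lets a constraint solver produce one witness input per path.

```python
def clamp(x: int) -> int:
    """Toy function with three execution paths."""
    if x < 0:
        return 0
    if x > 10:
        return 10
    return x

# Dynamic analysis: each concrete run exercises exactly one path.
assert clamp(3) == 3

# Symbolic-execution view: one (path condition, witness, expected) triple
# per path; the witnesses stand in for what a constraint solver returns.
paths = [
    ("x < 0",        -1, 0),
    ("x > 10",       11, 10),
    ("0 <= x <= 10",  5, 5),
]
for condition, witness, expected in paths:
    assert clamp(witness) == expected  # all three paths covered
```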

  9. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2 (United States)

    Platt, R.


    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  10. Automatic Verification of Autonomous Robot Missions (United States)


    for a mission related to the search for a biohazard. Keywords: mobile robots, formal verification, performance guarantees, automatic translation ... tested. ... Formal verification of systems is critical when failure creates a high cost, such as life or death scenarios. A variety of ... robot. ... Process algebras are specification languages that allow for formal verification of concurrent systems. Process Algebra for Robot

  11. Verification-based Software-fault Detection


    Gladisch, Christoph David


    Software is used in many safety- and security-critical systems. Software development is, however, an error-prone task. In this dissertation, new techniques for the detection of software faults (or software "bugs") are described which are based on formal deductive verification technology. The described techniques take advantage of information obtained during verification and combine verification technology with deductive fault detection and test generation in a unified way.

  12. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jong-Bum Kim


    Full Text Available The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR has been developed and the validation and verification (V&V activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1, produced satisfactory results, which were used for the computer codes V&V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  13. A plan for application system verification tests: The value of improved meteorological information, volume 1. [economic consequences of improved meteorological information (United States)


    The framework within which the Applications Systems Verification Tests (ASVTs) are performed and the economic consequences of improved meteorological information demonstrated is described. This framework considers the impact of improved information on decision processes, the data needs to demonstrate the economic impact of the improved information, the data availability, the methodology for determining and analyzing the collected data and demonstrating the economic impact of the improved information, and the possible methods of data collection. Three ASVTs are considered and program outlines and plans are developed for performing experiments to demonstrate the economic consequences of improved meteorological information. The ASVTs are concerned with the citrus crop in Florida, the cotton crop in Mississippi and a group of diverse crops in Oregon. The program outlines and plans include schedules, manpower estimates and funding requirements.

  14. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at Pollard Auditorium, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Pugh, C.E.; Bass, B.R.; Keeney, J.A. [comps.] [Oak Ridge National Lab., TN (United States)


    This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26--29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.

  15. Editor's Choice - Factors Associated with Long-Term Outcome in 191 Patients with Ilio-Femoral DVT Treated With Catheter-Directed Thrombolysis

    DEFF Research Database (Denmark)

    Foegh, P; Jensen, L P; Klitfod, L


    consecutive patients (203 limbs) attending a tertiary vascular centre at Gentofte University Hospital, Denmark underwent CDT. All patients had ultrasonically verified acute ilio-femoral DVT with open distal popliteal vein and calf veins. Patients were seen in the outpatient clinic 6 weeks, 3, 6, and 12 months...

  16. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta


    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  17. Complete Functional Verification


    Bormann, Joerg (Dr.)


    The dissertation describes a practically proven, particularly efficient approach for the verification of digital circuit designs. The approach outperforms simulation-based verification with respect to final circuit quality as well as required verification effort. In the dissertation, the paradigm of transaction-based verification is ported from simulation to formal verification. One consequence is a particular format of formal properties, called operation properties. Circuit descriptions are verifi...

  18. Rivaroxaban for the treatment of symptomatic deep-vein thrombosis and pulmonary embolism in Chinese patients: a subgroup analysis of the EINSTEIN DVT and PE studies. (United States)

    Wang, Yuqi; Wang, Chen; Chen, Zhong; Zhang, Jiwei; Liu, Zhihong; Jin, Bi; Ying, Kejing; Liu, Changwei; Shao, Yuxia; Jing, Zhicheng; Meng, Isabelle Ling; Prins, Martin H; Pap, Akos F; Müller, Katharina; Lensing, Anthonie Wa


    The worldwide EINSTEIN DVT and EINSTEIN PE studies randomized 8282 patients with acute symptomatic deep-vein thrombosis (DVT) and/or pulmonary embolism (PE) and, for the first time in trials in this setting, included patients in China. This analysis evaluates the results of these studies in this subgroup of patients. A total of 439 Chinese patients who had acute symptomatic DVT (n=211), or PE with or without DVT (n=228), were randomized to receive rivaroxaban (15 mg twice daily for 21 days, followed by 20 mg once daily) or standard therapy of enoxaparin overlapping with and followed by an adjusted-dose vitamin K antagonist, for 3, 6, or 12 months. The primary efficacy outcome was symptomatic recurrent venous thromboembolism. The principal safety outcome was major or non-major clinically relevant bleeding. The primary efficacy outcome occurred in seven (3.2%) of the 220 patients in the rivaroxaban group and in seven (3.2%) of the 219 patients in the standard-therapy group (hazard ratio, 1.04; 95% confidence interval 0.36-3.0; p=0.94). The principal safety outcome occurred in 13 (5.9%) patients in the rivaroxaban group and in 20 (9.2%) patients in the standard-therapy group (hazard ratio, 0.63; 95% confidence interval 0.31-1.26; p=0.19). Major bleeding was observed in no patients in the rivaroxaban group and in five (2.3%) patients in the standard-therapy group. In fragile patients (defined as age >75 years, creatinine clearance ...). EINSTEIN PE, NCT00439777; EINSTEIN DVT, NCT00440193.

  19. Structural Verification of the First Orbital Wonder of the World - The Structural Testing and Analysis of the International Space Station (ISS) (United States)

    Zipay, John J.; Bernstein, Karen S.; Bruno, Erica E.; Deloo, Phillipe; Patin, Raymond


    The International Space Station (ISS) can be considered one of the structural engineering wonders of the world. On par with the World Trade Center, the Colossus of Rhodes, the Statue of Liberty, the Great Pyramids, the Petronas towers and the Burj Khalifa skyscraper of Dubai, the ambition and scope of the ISS structural design, verification and assembly effort is a truly global success story. With its on-orbit life projected to be from its beginning in 1998 to the year 2020 (and perhaps beyond), all of those who participated in its development can consider themselves part of an historic engineering achievement representing all of humanity. The structural design and verification of the ISS could be the subject of many scholarly papers. Several papers have been written on the structural dynamic characterization of the ISS once it was assembled on-orbit [1], but the ground-based activities required to assure structural integrity and structural life of the individual elements from delivery to orbit through assembly and planned on-orbit operations have never been totally summarized. This paper is intended to give the reader an overview of some of the key decisions made during the structural verification planning for the elements of the U.S. On-Orbit Segment (USOS) as well as to summarize the many structural tests and structural analyses that were performed on its major elements. An effort is made for this paper to be summarily comprehensive, but as with all knowledge capture efforts of this kind, there are bound to be errors of omission. Should the reader discover any of these, please feel free to contact the principal author. The ISS (Figure 1) is composed of pre-integrated truss segments and pressurized elements supplied by NASA, the Russian Federal Space Agency (RSA), the European Space Agency (ESA) and the Japanese Aerospace Exploration Agency (JAXA). 
Each of these elements was delivered to orbit by a launch vehicle and connected to one another either robotically or

  20. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1 (United States)

    Platt, R.


    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-1A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.


    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  2. Standard practice for verification of testing frame and specimen alignment under tensile and compressive axial force application

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 Included in this practice are methods covering the determination of the amount of bending that occurs during the application of tensile and compressive forces to notched and unnotched test specimens in the elastic range and to plastic strains less than 0.002. These methods are particularly applicable to the force application rates normally used for tension testing, creep testing, and uniaxial fatigue testing.
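    The "amount of bending" this practice addresses is commonly expressed as percent bending: the bending strain as a percentage of the mean axial strain. Below is a simplified sketch, assuming four strain gauges spaced 90° around the specimen and taking the largest single-gauge deviation as the bending strain; the practice itself derives bending components from opposite gauge pairs, so this is an illustration rather than the standard's exact procedure.

```python
# Simplified percent-bending estimate from four strain-gauge readings
# spaced 90 degrees apart around the specimen (illustrative gauge layout).

def percent_bending(gauges: list[float]) -> float:
    axial = sum(gauges) / len(gauges)               # mean axial strain
    bending = max(abs(g - axial) for g in gauges)   # max local deviation
    return 100.0 * bending / axial

# Example: readings in microstrain under a nominally axial tensile force.
readings = [1020.0, 980.0, 1005.0, 995.0]
```

    For these illustrative readings the mean axial strain is 1000 microstrain and the largest deviation is 20 microstrain, i.e. 2% bending, which a verification of frame and specimen alignment would compare against the allowable limit.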

  3. Standard practice for verification of testing frame and specimen alignment under tensile and compressive axial force application

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 Included in this practice are methods covering the determination of the amount of bending that occurs during the application of tensile and compressive forces to notched and unnotched test specimens in the elastic range and to plastic strains less than 0.002. These methods are particularly applicable to the force application rates normally used for tension testing, creep testing, and uniaxial fatigue testing.

  4. Measurements of the local dose with regard to digital volume tomography (DVT) for the purpose of assessing deviations between phantom and human anatomy; Ortsdosismessungen an einer digitalen Volumen-Tomographieeinrichtung (DVT) hinsichtlich der Unterschiede zwischen einem Phantom und der menschlichen Anatomie

    Energy Technology Data Exchange (ETDEWEB)

    Neuwirth, J.; Hefner, A.; Ernst, G. [Austrian Research Centers (ARC), Radiation Safety and Applications (Austria)


    In dental radiography, digital volume tomography (DVT) is gaining more and more importance due to its capability for three-dimensional imaging of teeth and jaw and its reduced radiation dose in comparison to conventional computed tomography (CT). Contrary to other, well-documented radiographic procedures like dental panoramic X-ray imaging, there are no national or international guidelines or recommendations relating to DVT that regulate the designation of areas and standardize risk assessment. This study aims to assess the parameters necessary for local radiation protection in dental practices. This paper describes the results of measurements carried out in dental practices in order to evaluate the local dose at various distances and for different rotation times of DVT devices. (orig.)

  5. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J


    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta, Georgia, area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency was measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. Before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR

  6. Verification of ceramic structures

    NARCIS (Netherlands)

    Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.


    In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and

  7. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)


    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  8. Operational lessons learned in conducting a multi-country collaboration for vaccine safety signal verification and hypothesis testing: The global vaccine safety multi country collaboration initiative. (United States)

    Guillard-Maure, Christine; Elango, Varalakshmi; Black, Steven; Perez-Vilar, Silvia; Castro, Jose Luis; Bravo-Alcántara, Pamela; Molina-León, Helvert Felipe; Weibel, Daniel; Sturkenboom, Miriam; Zuber, Patrick L F


    Timely and effective evaluation of vaccine safety signals for newly developed vaccines introduced in low and middle- income countries (LMICs) is essential. The study tested the development of a global network of hospital-based sentinel sites for vaccine safety signal verification and hypothesis testing. Twenty-six sentinel sites in sixteen countries across all WHO regions participated, and 65% of the sites were from LMIC. We describe the process for the establishment and operationalization of such a network and the lessons learned in conducting a multi-country collaborative initiative. 24 out of the 26 sites successfully contributed data for the global analysis using standardised tools and procedures. Our study successfully confirmed the well-known risk estimates for the outcomes of interest. The main challenges faced by investigators were lack of adequate information in the medical records for case ascertainment and classification, and access to immunization data. The results suggest that sentinel hospitals intending to participate in vaccine safety studies strengthen their systems for discharge diagnosis coding, medical records and linkage to vaccination data. Our study confirms that a multi-country hospital-based network initiative for vaccine safety monitoring is feasible and demonstrates the validity and utility of large collaborative international studies to monitor the safety of new vaccines introduced in LMICs. Copyright © 2017. Published by Elsevier Ltd.

  9. Analysis, testing and verification of the behavior of composite pavements under Florida conditions using a heavy vehicle simulator (United States)

    Tapia Gutierrez, Patricio Enrique

    Whitetopping (WT) is a rehabilitation method to resurface deteriorated asphalt pavements. While some of these composite pavements have performed very well carrying heavy loads, others have shown poor performance with early cracking. With the objective of analyzing the applicability of WT pavements under Florida conditions, a total of nine full-scale WT test sections were constructed and tested using a Heavy Vehicle Simulator (HVS) in the APT facility at the FDOT Material Research Park. The test sections were instrumented to monitor both strain and temperature. A 3-D finite element model was developed to analyze the WT test sections. The model was calibrated and verified using measured FWD deflections and HVS load-induced strains from the test sections. The model was then used to evaluate the potential performance of these test sections under critical temperature-load conditions in Florida. Six of the WT pavement test sections had a bonded concrete-asphalt interface achieved by milling, cleaning, and spraying the asphalt surface with water. This method produced excellent bonding at the interface, with shear strength of 195 to 220 psi. Three of the test sections were intended to have an unbonded concrete-asphalt interface by applying a debonding agent to the asphalt surface. However, shear strengths between 119 and 135 psi and a careful analysis of the strain and temperature data indicated a partial bond condition. The computer model was able to satisfactorily model the behavior of the composite pavement mainly by considering material properties from standard laboratory tests and calibrating the spring elements used to model the interface. Reasonable matches between the measured and the calculated strains were achieved when a temperature-dependent AC elastic modulus was included in the analytical model. The expected numbers of repetitions of the 24-kip single axle loads at critical thermal condition were computed for the nine test sections based on maximum tensile stresses

  10. Modeling in the State Flow Environment to Support Launch Vehicle Verification Testing for Mission and Fault Management Algorithms in the NASA Space Launch System (United States)

    Trevino, Luis; Berg, Peter; England, Dwight; Johnson, Stephen B.


    Analysis methods and testing processes are essential activities in the engineering development and verification of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS). Central to mission success is reliable verification of the Mission and Fault Management (M&FM) algorithms for the SLS launch vehicle (LV) flight software. This is particularly difficult because M&FM algorithms integrate and operate LV subsystems, which themselves consist of diverse forms of hardware and software, with equally diverse integration across the engineering disciplines of the LV subsystems. M&FM operation of SLS requires a changing mix of LV automation. During pre-launch, the LV is operated primarily by the Kennedy Space Center (KSC) Ground Systems Development and Operations (GSDO) organization, with some LV automation of time-critical functions; during ascent, LV operations are much more autonomous, with crucial interactions with the Orion crew capsule, its astronauts, and mission controllers at the Johnson Space Center. M&FM algorithms must perform all nominal mission commanding via the flight computer to control LV states from pre-launch through disposal, and must also address failure conditions by initiating autonomous or commanded aborts (crew capsule escape from the failing LV), redundancy management of failing subsystems and components, and safing actions to reduce or prevent threats to ground systems and crew. To address the criticality of the verification testing of these algorithms, the NASA M&FM team has utilized the State Flow environment (SFE) with its existing Vehicle Management End-to-End Testbed (VMET) platform, which also hosts vendor-supplied physics-based LV subsystem models. The human-derived M&FM algorithms are designed and vetted in Integrated Development Teams composed of design and development disciplines such as Systems Engineering, Flight Software (FSW), Safety and Mission Assurance (S&MA) and major subsystems and vehicle elements

  11. Verification of both hydrogeological and hydrogeochemical code results by an on-site test in granitic rocks

    Directory of Open Access Journals (Sweden)

    Michal Polák


    Full Text Available The project entitled “Methods and tools for the evaluation of the effect of engineered barriers on distant interactions in the environment of a deep repository facility” deals with the ability to validate the behavior of applied engineered barriers with respect to hydrodynamic and migration parameters in the water-bearing granite environment of a radioactive waste deep repository facility. Parts of the project comprise a detailed mapping of the fracture network by means of geophysical and drilling surveys on the test site (an active granite quarry), construction of model objects (about 100 samples with the shape of cylinders, ridges and blocks), and the mineralogical, petrological and geochemical description of the granite. All the model objects were subjected to migration and hydrodynamic tests with the use of fluorescein and NaCl as tracers. The tests were performed on samples with simple fractures, injected fractures and with undisturbed integrity (verified by ultrasonic testing). The obtained hydrodynamic and migration parameters of the model objects were processed with the modeling software NAPSAC and FEFLOW. During the following two years, these results and parameters will be verified on the test site by means of a long-term field test, including the tuning of the software functionality.

  12. Verification Of Residual Strength Properties From Compression After Impact Tests On Thin CFRP Skin, A1 Honeycomb Composites (United States)

    Kalnins, Kaspars; Graham, Adrian J.; Sinnema, Gerben


    This article presents a study of CFRP/Al honeycomb panels subjected to low-velocity impact, which caused a reduction in strength. The main scope of the current study was to investigate experimental procedures, which are not well standardized, and later verify them with numerical simulations. To ensure the integrity of typical lightweight structural panels of modern spacecraft, knowledge about the impact energy required to produce clearly visible damage, and the resulting strength degradation, is of high importance. Readily available ‘heritage’ (1980s) sandwich structures with relatively thin skins were used for this initial investigation. After initial attempts to produce impact damage, it was decided to create quasi-static indentation instead of low-velocity impact, to cause barely visible damage. Forty-two edgewise Compression After Impact (CAI) test specimens were produced and tested up to failure, while the strain distribution was recorded by optical means during the tests. Ultrasonic C-scan inspection was used to identify the damage evolution before and after each test. The optical strain measurements acquired during the tests showed a sensitivity level capable of tracking the local buckling of the damaged region.

  13. Interim Letter Report - Verification Survey Results for Activities Performed in March 2009 for the Vitrification Test Facility Warehouse at the West Valley Demonstration Project, Ashford, New York

    Energy Technology Data Exchange (ETDEWEB)

    B.D. Estes


    The objective of the verification activities was to provide independent radiological surveys and data for use by the Department of Energy (DOE) to ensure that the building satisfies the requirements for release without radiological controls.

  14. Full-Scale Experimental Verification of Soft-Story-Only Retrofits of Wood-Frame Buildings using Hybrid Testing (United States)

    Elaina Jennings; John W. van de Lindt; Ershad Ziaei; Pouria Bahmani; Sangki Park; Xiaoyun Shao; Weichiang Pang; Douglas Rammer; Gary Mochizuki; Mikhail Gershfeld


    The FEMA P-807 Guidelines were developed for retrofitting soft-story wood-frame buildings based on existing data, and the method had not been verified through full-scale experimental testing. This article presents two different retrofit designs based directly on the FEMA P-807 Guidelines that were examined at several different seismic intensity levels. The...

  15. SU-F-T-364: Monte Carlo-Dose Verification of Volumetric Modulated Arc Therapy Plans Using AAPM TG-119 Test Patterns

    Energy Technology Data Exchange (ETDEWEB)

    Onizuka, R [Graduate School of Health Sciences, Kumamoto University (Japan); Araki, F; Ohno, T [Faculty of Life Sciences, Kumamoto University (Japan); Nakaguchi, Y [Kumamoto University Hospital (Japan)


    Purpose: To investigate Monte Carlo (MC)-based dose verification for VMAT plans created with a treatment planning system (TPS). Methods: The AAPM TG-119 test structure set was used for VMAT plans by the Pinnacle3 (convolution/superposition) TPS, using a Synergy radiation head with a 6 MV beam and the Agility MLC. The Synergy was simulated with the EGSnrc/BEAMnrc code, and VMAT dose distributions were calculated with the EGSnrc/DOSXYZnrc code under the same irradiation conditions as the TPS. VMAT dose distributions from the TPS and MC were compared with those of EBT3 film by 2-D gamma analysis with ±3%/3 mm criteria and a threshold of 30% of the prescribed dose. VMAT dose distributions of TPS and MC were also compared by DVHs and 3-D gamma analysis with ±3%/3 mm criteria and a threshold of 10%, and 3-D passing rates for PTVs and OARs were analyzed. Results: TPS dose distributions differed from those of film, especially for Head & neck. The dose difference between TPS and film results from the calculation accuracy for complex MLC motion, such as the tongue-and-groove effect. In contrast, MC dose distributions were in good agreement with those of film. This is because MC can fully model the MLC configuration and accurately reproduce the MLC motion between control points in VMAT plans. D95 of the PTV for Prostate, Head & neck, C-shaped, and Multi Target was 97.2%, 98.1%, 101.6%, and 99.7% for TPS and 95.7%, 96.0%, 100.6%, and 99.1% for MC, respectively. Similarly, 3-D gamma passing rates of each PTV for TPS vs. MC were 100%, 89.5%, 99.7%, and 100%, respectively. 3-D passing rates of the TPS were reduced for complex VMAT fields like Head & neck because MLCs are not modeled completely in the TPS. Conclusion: MC-calculated VMAT dose distributions are useful for the 3-D dose verification of VMAT plans produced by a TPS.
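
The gamma analysis used above to compare dose distributions can be sketched as a brute-force computation. The implementation below is a simplified illustration (same-grid 2-D distributions, global dose normalization), not the evaluation software used in the study:

```python
import numpy as np

def gamma_index(ref, eval_, spacing, dd=0.03, dta=3.0, threshold=0.1):
    """Brute-force global 2-D gamma analysis (sketch).

    ref, eval_: 2-D dose arrays on the same grid; spacing: grid pitch in mm.
    dd: dose criterion as a fraction of the max reference dose; dta: mm.
    threshold: fraction of max dose below which points are excluded
    (the study used 30% for film comparisons and 10% for 3-D analysis).
    """
    ref = np.asarray(ref, float)
    ev = np.asarray(eval_, float)
    dmax = ref.max()
    ys, xs = np.meshgrid(np.arange(ref.shape[0]) * spacing,
                         np.arange(ref.shape[1]) * spacing, indexing="ij")
    gamma = np.full(ref.shape, np.inf)
    for i in range(ref.shape[0]):
        for j in range(ref.shape[1]):
            if ref[i, j] < threshold * dmax:
                continue  # skip the low-dose region
            # squared distance-to-agreement and dose-difference terms
            dist2 = ((ys - ys[i, j]) ** 2 + (xs - xs[i, j]) ** 2) / dta ** 2
            dose2 = ((ev - ref[i, j]) / (dd * dmax)) ** 2
            gamma[i, j] = np.sqrt((dist2 + dose2).min())
    valid = np.isfinite(gamma)
    pass_rate = 100.0 * (gamma[valid] <= 1.0).mean()
    return gamma, pass_rate
```

For identical distributions the gamma value is zero everywhere above the dose threshold, so the passing rate is 100%.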

  16. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2 (United States)

    Platt, R.


    This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. The sequence in which the several phases of this test procedure take place is shown in Figure 1, but the sequence can be in any order.

  17. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, James W., LTC [Editor]


    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  18. SB-LOCA beyond the design basis in a PWR: experimental verification of AM procedures in the PKL test facility

    Energy Technology Data Exchange (ETDEWEB)

    Mull, T.; Schoen, B.; Umminger, K.; Wegner, R. [Framatome ANP GmbH, Erlangen (Germany)


    The integral test facility PKL at the Technical Center of Framatome ANP (formerly Siemens/KWU) in Erlangen, Germany, simulates a 1300 MWe western-type PWR. It is scaled by 1:145 in power and volume at original elevations. It features the entire primary side, including four symmetrically arranged coolant loops and auxiliary and safety systems, as well as the major part of the secondary side. The test series PKL III D, which was finished at the end of 1999, aimed at the exploration of safety margins and at the efficiency and optimization of operator-initiated accident management (AM) procedures. Among others, several tests with small primary breaks combined with additional system failures were performed. This presentation describes test D3.1. The scenario under investigation was a small primary break (24 cm{sup 2}) with simultaneous failure of the high pressure safety injection (HPSI), a beyond-design-basis scenario. For the German 1300 MWe PWRs, under such additional failure conditions, SB-LOCAs with leak sizes below 25 cm{sup 2} account for 18 % of the integral core damage frequency (CDF). This integral CDF can be estimated to be 3.1*10{sup -6} per year if no credit is taken for AM procedures. The break location in the test under consideration was in the cold leg between reactor coolant pump (RCP) and reactor pressure vessel (RPV). The assumed aggravating circumstances were HPSI failure and unavailability of 2 steam generators (SGs) as well as 3 out of 4 main steam relief and control valves (MS-RCV). The extra borating system was switched to injection mode at low pressurizer level but, by itself, would have been unable to maintain enough coolant to avoid the core being uncovered before the pressure reached the setpoint of the accumulators (ACCs). The accident was managed by additional utilization of the chemical and volume control system (CVCS) to inject water to partly compensate for the leak rate. The plant could be cooled down by 2 SGs using only one MS-RCV. The

  19. A 1:8.7 Scale Water Tunnel Verification & Validation Test of an Axial Flow Water Turbine

    Energy Technology Data Exchange (ETDEWEB)

    Fontaine, Arnold A. [Pennsylvania State Univ., University Park, PA (United States); Straka, William A. [Pennsylvania State Univ., University Park, PA (United States); Meyer, Richard S. [Pennsylvania State Univ., University Park, PA (United States); Jonson, Michael L. [Pennsylvania State Univ., University Park, PA (United States)


    As interest in waterpower technologies has increased over the last few years, there has been a growing need for a public database of measured data for these devices. This would provide a basic understanding of the technology and means to validate analytic and numerical models. Through collaboration between Sandia National Laboratories, Penn State University Applied Research Laboratory, and University of California, Davis, a new marine hydrokinetic turbine rotor was designed, fabricated at 1:8.7-scale, and experimentally tested to provide an open platform and dataset for further study and development. The water tunnel test of this three-bladed, horizontal-axis rotor recorded power production, blade loading, near-wake characterization, cavitation effects, and noise generation. This report documents the small-scale model test in detail and provides a brief discussion of the rotor design and an initial look at the results with comparison against low-order modeling tools. Detailed geometry and experimental measurements are released to Sandia National Laboratories as a data report addendum.

  20. Short communication: Parentage verification of South African ...

    African Journals Online (AJOL)

    Short communication: Parentage verification of South African Angora goats, using microsatellite markers. ... South African Journal of Animal Science. Journal Home ... Eighteen markers were tested in 192 South African Angora goats representing different family structures with known and unknown parent information.

  1. Seismic design verification of LMFBR structures

    Energy Technology Data Exchange (ETDEWEB)


    The report provides an assessment of the seismic design verification procedures currently used for nuclear power plant structures, a comparison of dynamic test methods available, and conclusions and recommendations for future LMFBR structures.

  2. TFE Verification Program

    Energy Technology Data Exchange (ETDEWEB)


    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  3. BIOMEX Experiment: Ultrastructural Alterations, Molecular Damage and Survival of the Fungus Cryomyces antarcticus after the Experiment Verification Tests (United States)

    Pacelli, Claudia; Selbmann, Laura; Zucconi, Laura; De Vera, Jean-Pierre; Rabbow, Elke; Horneck, Gerda; de la Torre, Rosa; Onofri, Silvano


    The search for traces of extinct or extant life in extraterrestrial environments is one of the main goals for astrobiologists; due to their ability to withstand stress-producing conditions, extremophiles are perfect candidates for astrobiological studies. The BIOMEX project aims to test the ability of biomolecules and cell components to preserve their stability under space and Mars-like conditions, while at the same time investigating the survival capability of microorganisms. The experiment has been launched into space and is being exposed on the EXPOSE-R2 payload, outside the International Space Station (ISS), over a time-span of 1.5 years. Along with a number of other extremophilic microorganisms, the Antarctic cryptoendolithic black fungus Cryomyces antarcticus CCFEE 515 has been included in the experiment. Before launch, dried colonies grown on Lunar and Martian regolith analogues were exposed to vacuum, irradiation, and temperature cycles in ground-based experiments (EVT1 and EVT2). Cultural and molecular tests revealed that the fungus survived on the rock analogues under space and simulated Martian conditions, showing only slight ultrastructural and molecular damage.

  4. Design of a Kaplan turbine for a wide range of operating head -Curved draft tube design and model test verification- (United States)

    KO, Pohan; MATSUMOTO, Kiyoshi; OHTAKE, Norio; DING, Hua


    For turbomachines, off-design performance improvement is challenging but critical for maximizing the operating area. In this paper, a curved draft tube for a medium-head Kaplan-type hydro turbine is introduced, and its significant effect on expanding the operating head range is discussed. Without adding any extra structure or working fluid for swirl destruction and damping, a carefully designed outline shape of the draft tube, with selected placement of center piers, successfully suppresses the growth of turbulence eddies and the transport of the swirl to the outlet. More kinetic energy is also recovered, and the head loss is reduced. Finally, the model test results are presented. An obvious performance improvement was found in the lower-net-head area, where the maximum efficiency improvement was measured at up to 20% without compromising the best efficiency point. Additionally, this design results in a new draft tube that is more compact in size and so leads to better construction and manufacturing cost performance for the prototype. The draft tube geometry design process considered the best efficiency point together with off-design points covering various net heads and discharges. The hydraulic performance and flow behavior were numerically previewed and visualized by solving the Reynolds-Averaged Navier-Stokes equations with the Shear Stress Transport turbulence model. The simulation assumed steady-state incompressible turbulent flow inside the flow passage, and the inlet boundary condition was the carefully simulated flow pattern from the runner outlet. For confirmation, the turbine efficiency performance over the entire operating area was verified by the model test.

  5. Test/QA plan for the verification testing of diesel exhaust catalysts, particulate filters and engine modification control technologies for highway and nonroad use diesel engines (United States)

    This ETV test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research (DER) describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR Part 89 for nonroad engines, will be ...

  6. Test/QA plan for the verification testing of selective catalytic reduction control technologies for highway, nonroad use heavy-duty diesel engines (United States)

    This ETV test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research (DER) describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR Part 89 for nonroad engines, will be ...

  7. Scientific Verification Test of Orbitec Deployable Vegetable Production System for Salad Crop Growth on ISS- Gas Exchange System design and function (United States)

    Eldemire, Ashleigh


    The ability to produce and maintain salad crops during long-term missions would be a great benefit to NASA; the renewable food supply would save cargo space, weight, and money. The ambient conditions of previous ground-controlled crop plant experiments do not reflect the microgravity and high CO2 concentrations present during orbit. It has been established that microgravity does not considerably alter plant growth (Monje, Stutte, Chapman, 2005). To support plants in a spacecraft environment, efficient and effective lighting and containment units are necessary. Three lighting systems were previously evaluated for radish growth in ambient air: fluorescent lamps in an Orbitec Biomass Production System Educational (BPSE), a combination of red, blue, and green LEDs in a Deployable Vegetable Production System (Veggie), and a combination of red and blue LEDs in a Veggie. When mass measurements compared the entire possible growing area versus the power consumed by the respective units, the Veggies clearly exceeded the BPSE, indicating that the LED units were a more resource-efficient means of growing radishes under ambient conditions than fluorescent lighting. To evaluate the most productive light-treatment system for a long-term space mission, a more closely simulated ISS environment is necessary. To induce a CO2-dense atmosphere inside the Veggies and BPSE, a gas exchange system has been developed to maintain a range of 1000-1200 ppm CO2 during a 21-day light-treatment experiment. This report details the design and function of the gas exchange system. The rehabilitation, troubleshooting, maintenance, and testing of the gas exchange system have been my major assignments. I have also contributed to the planting, daily measurements, and harvesting of the radish crops in the 21-day light-treatment verification test.
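
A controller that holds a 1000-1200 ppm band is typically implemented with hysteresis, so the injection valve does not chatter around a single setpoint. The sketch below is a hypothetical illustration of that control logic with made-up uptake and injection rates, not the actual system described in the report:

```python
def co2_controller(ppm, injecting, low=1000.0, high=1200.0):
    """Hysteresis (bang-bang) control of the CO2 injection valve:
    start injecting below the low setpoint, stop above the high one,
    and keep the current valve state while inside the band."""
    if ppm < low:
        return True
    if ppm > high:
        return False
    return injecting

def simulate(steps=200, ppm=1100.0, uptake=5.0, inject_rate=12.0):
    """Toy chamber model: concentration drifts down from plant uptake
    and rises while the valve injects (all rates are assumptions)."""
    injecting = False
    history = []
    for _ in range(steps):
        injecting = co2_controller(ppm, injecting)
        ppm += (inject_rate if injecting else 0.0) - uptake
        history.append(ppm)
    return history
```

In this toy simulation the concentration cycles within (slightly beyond, by one time step) the 1000-1200 ppm band, which is the intended behavior of a hysteresis band controller.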

  8. Verification and Validation of Flight Critical Systems Project (United States)

    National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...

  9. Verification of the Performance of a Vertical Ground Heat Exchanger Applied to a Test House in Melbourne, Australia

    Directory of Open Access Journals (Sweden)

    Koon Beng Ooi


    circulation pumps and fans require low power that can be supplied by photovoltaic thermal (PVT). The EnergyPlus™ v8.7 object modeling the PVT requires user-defined efficiencies, so a PVT will be tested in the experimental house.

  10. Online fingerprint verification. (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K


    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filterbank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.
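
A filterbank representation of the kind described (in the spirit of Gabor-filter "FingerCode" features) can be sketched as below. The kernel size, frequency, and cell tessellation are illustrative assumptions, not the study's parameters:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def gabor_kernel(size=11, sigma=3.0, theta=0.0, freq=0.25):
    """Even-symmetric Gabor kernel tuned to ridge orientation theta
    (radians) and ridge frequency freq (cycles/pixel)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)
    return g - g.mean()  # remove DC response

def fingercode(image, n_orientations=8, cell=8):
    """Filterbank feature vector: filter the image at several ridge
    orientations, then take the mean absolute response per cell."""
    feats = []
    for k in range(n_orientations):
        kern = gabor_kernel(theta=k * np.pi / n_orientations)
        # direct 'valid' correlation via sliding windows (slow but simple)
        win = sliding_window_view(image, kern.shape)
        resp = np.abs((win * kern).sum(axis=(-2, -1)))
        rh, rw = resp.shape
        for i in range(0, rh - cell + 1, cell):
            for j in range(0, rw - cell + 1, cell):
                feats.append(resp[i:i + cell, j:j + cell].mean())
    return np.array(feats)
```

Matching two fingerprints then reduces to comparing their feature vectors, e.g. by Euclidean distance, which avoids the minutiae-registration problem the abstract mentions.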

  11. Parking Space Verification

    DEFF Research Database (Denmark)

    Høg Peter Jensen, Troels; Thomsen Schmidt, Helge; Dyremose Bodin, Niels


    With the number of privately owned cars increasing, the issue of locating an available parking space becomes apparent. This paper deals with the verification of vacant parking spaces, using a vision-based system looking over parking areas. In particular, the paper proposes a binary classifier system, based on a Convolutional Neural Network, that is capable of determining whether a parking space is occupied or not. A benchmark database consisting of images captured from different parking areas, under different weather and illumination conditions, has been used to train and test the system. The system shows promising performance on the database, with an overall accuracy of 99.71%, and is robust to the variations in parking areas and weather conditions.
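
A minimal forward pass of such a binary CNN classifier can be sketched in NumPy. The architecture below (one convolution layer, ReLU, global average pooling, logistic output) is a toy illustration with untrained random weights, not the network used in the paper:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

rng = np.random.default_rng(0)

def conv2d(x, kernels):
    """Valid 2-D correlation of a single-channel image with a kernel
    stack of shape (n_filters, kh, kw)."""
    win = sliding_window_view(x, kernels.shape[1:])   # (H', W', kh, kw)
    return np.einsum("ijkl,fkl->fij", win, kernels)   # (n_filters, H', W')

def predict_occupied(patch, kernels, weights, bias):
    """Tiny CNN forward pass: conv -> ReLU -> global average pool ->
    logistic output in (0, 1), read as P(space is occupied)."""
    feat = np.maximum(conv2d(patch, kernels), 0.0)    # ReLU activation
    pooled = feat.mean(axis=(1, 2))                   # one value per filter
    z = pooled @ weights + bias
    return 1.0 / (1.0 + np.exp(-z))                   # sigmoid

# hypothetical untrained parameters and a random grayscale patch
kernels = rng.normal(size=(4, 3, 3))
weights = rng.normal(size=4)
patch = rng.normal(size=(16, 16))
p = predict_occupied(patch, kernels, weights, 0.0)
```

In practice the kernels and weights would be learned from a labeled benchmark database; the sketch only shows how the occupied/vacant decision is produced from an image patch.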

  12. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  13. Gender verification in competitive sports. (United States)

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E


    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  14. 40 CFR 86.1847-01 - Manufacturer in-use verification and in-use confirmatory testing; submittal of information and... (United States)


    ... Compliance Provisions for Control of Air Pollution From New and In-Use Light-Duty Vehicles, Light-Duty Trucks... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Manufacturer in-use verification and... 86.1847-01 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS...

  15. Environmental Technology Verification: Test Report of Mobile Source Selective Catalytic Reduction--Nett Technologies, Inc., BlueMAX 100 version A urea-based selective catalytic reduction technology (United States)

    Nett Technologies’ BlueMAX 100 version A Urea-Based SCR System utilizes a zeolite catalyst coating on a cordierite honeycomb substrate for heavy-duty diesel nonroad engines for use with commercial ultra-low–sulfur diesel fuel. This environmental technology verification (ETV) repo...

  16. Verification Account Management System (VAMS) (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  17. Verification is experimentation!

    NARCIS (Netherlands)

    Brinksma, Hendrik


    The formal verification of concurrent systems is usually seen as an example par excellence of the application of mathematical methods to computer science. Although the practical application of such verification methods will always be limited by the underlying forms of combinatorial explosion, recent

  18. Verification Is Experimentation!

    NARCIS (Netherlands)

    Brinksma, Hendrik


    The formal verification of concurrent systems is usually seen as an example par excellence of the application of mathematical methods to computer science. Although the practical application of such verification methods will always be limited by the underlying forms of combinatorial explosion, recent

  19. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S


    Full Text Available The performance of innovative environmental technologies can be verified by qualified third parties called "Verification Bodies". The "Statement of Verification" delivered at the end of the ETV process can be used as evidence that the claims made about...

  20. Automated continuous verification for numerical simulation

    Directory of Open Access Journals (Sweden)

    P. E. Farrell


    Full Text Available Verification is a process crucially important for the final users of a computational model: code is useless if its results cannot be relied upon. Typically, verification is seen as a discrete event, performed once and for all after development is complete. However, this does not reflect the reality that many geoscientific codes undergo continuous development of the mathematical model, discretisation and software implementation. Therefore, we advocate that in such cases verification must be continuous and happen in parallel with development: the desirability of their automation follows immediately. This paper discusses a framework for automated continuous verification of wide applicability to any kind of numerical simulation. It also documents a range of test cases to show the possibilities of the framework.
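
A continuous-verification harness of the kind advocated here can be sketched as a registry of test cases, each comparing a numerical result against an analytic solution under a stated tolerance, run automatically on every change. The case and tolerance below are illustrative:

```python
import math

# registry of verification test cases: name -> (function, tolerance)
TEST_CASES = {}

def verification_case(tol):
    """Decorator registering a test case with its acceptance tolerance."""
    def register(fn):
        TEST_CASES[fn.__name__] = (fn, tol)
        return fn
    return register

@verification_case(tol=1e-3)
def decay_forward_euler(dt=1e-4, t_end=1.0, k=2.0):
    """Integrate y' = -k*y, y(0) = 1 with forward Euler; return the
    numerical and analytic values at t_end."""
    y, t = 1.0, 0.0
    while t < t_end - 1e-12:
        y += dt * (-k * y)
        t += dt
    return y, math.exp(-k * t_end)

def run_all():
    """Run every registered case; return {name: (error, passed)}.
    In a CI setting this would run after each commit."""
    results = {}
    for name, (fn, tol) in TEST_CASES.items():
        numerical, exact = fn()
        err = abs(numerical - exact)
        results[name] = (err, err <= tol)
    return results
```

Hooking `run_all` into a version-control trigger turns verification from a one-off event into the continuous, automated process the abstract argues for.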

  1. Verification and Validation Studies for the LAVA CFD Solver (United States)

    Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.


    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
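
    The Method of Manufactured Solutions can be illustrated with a much simpler scheme than the solver in the paper (this sketch is not LAVA's setup): pick a known solution, discretise an operator, and confirm that the observed order of accuracy under grid refinement matches the scheme's formal order.

    ```python
    import math

    # MMS-style code verification sketch: "manufacture" u(x) = sin(x),
    # approximate u'' with a second-order central difference, and check
    # that the observed order of accuracy is ~2 under grid refinement.

    def discrete_error(n):
        """Max error of the central-difference u'' against the exact -sin(x)."""
        h = math.pi / n
        err = 0.0
        for i in range(1, n):
            x = i * h
            approx = (math.sin(x - h) - 2 * math.sin(x) + math.sin(x + h)) / h**2
            err = max(err, abs(approx - (-math.sin(x))))
        return err

    def observed_order(n):
        """Order of accuracy estimated from grids with n and 2n cells."""
        return math.log(discrete_error(n) / discrete_error(2 * n), 2)
    ```

    A code-verification test then asserts that `observed_order` is close to the scheme's formal order; a mismatch signals an implementation error rather than a modeling error.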

  2. Oral rivaroxaban versus standard therapy for the treatment of symptomatic venous thromboembolism: a pooled analysis of the EINSTEIN-DVT and PE randomized studies. (United States)

    Prins, Martin H; Lensing, Anthonie Wa; Bauersachs, Rupert; van Bellen, Bonno; Bounameaux, Henri; Brighton, Timothy A; Cohen, Alexander T; Davidson, Bruce L; Decousus, Hervé; Raskob, Gary E; Berkowitz, Scott D; Wells, Philip S


    Standard treatment for venous thromboembolism (VTE) consists of a heparin combined with vitamin K antagonists. Direct oral anticoagulants have been investigated for acute and extended treatment of symptomatic VTE; their use could avoid parenteral treatment and/or laboratory monitoring of anticoagulant effects. A prespecified pooled analysis of the EINSTEIN-DVT and EINSTEIN-PE studies compared the efficacy and safety of rivaroxaban (15 mg twice-daily for 21 days, followed by 20 mg once-daily) with standard therapy (enoxaparin 1.0 mg/kg twice-daily and warfarin or acenocoumarol). Patients were treated for 3, 6, or 12 months and followed for suspected recurrent VTE and bleeding. The prespecified noninferiority margin was 1.75. A total of 8282 patients were enrolled; 4151 received rivaroxaban and 4131 received standard therapy. The primary efficacy outcome occurred in 86 (2.1%) rivaroxaban-treated patients compared with 95 (2.3%) standard-therapy-treated patients (hazard ratio, 0.89; 95% confidence interval [CI], 0.66-1.19; prespecified noninferiority criterion met). Trial registration: EINSTEIN-DVT, NCT00440193.

  3. ALMA Science Verification (United States)

    Hills, R.


    As many of you are aware, ALMA has reached a very exciting point in the construction phase. After a year of testing the basic functionality of antennas and small arrays at the Chajnantor site at 5000m, we are now able to run full observations of scientific targets using at least 8 antennas and 4 receiver bands. We recently had a series of reviews of all aspects of the ALMA Project, resulting in a consensus that we will be ready to issue a Call for Proposals for Early Science projects at the end of the first quarter of 2011, with an expectation of starting these Early Science observations toward the end of 2011. ALMA Science Verification is the process by which we will demonstrate that the data that will be produced by ALMA during Early Science is valid. This is done by running full "end to end" tests of ALMA as a telescope. We will observe objects for which similar data are already available for other telescopes. This allows us to make direct quantitative comparisons of all aspects of the data cubes, in order to determine whether the ALMA instrumentation or software is introducing any artifacts.

  4. Software Verification of Orion Cockpit Displays (United States)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee


    NASA's latest spacecraft Orion is in the development process of taking humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.

  5. Guidelines for Formal Verification Systems (United States)


    This document explains the requirements for formal verification systems that are candidates for the NCSC’s Endorsed Tools List (ETL). It is primarily intended for developers of verification systems to use in the development of production-quality formal verification systems. It explains...the requirements and the process used to evaluate formal verification systems submitted to the NCSC for endorsement.

  6. Standard Verification System (SVS) (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  7. SSN Verification Service (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  8. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.


    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  9. Fiscal 1997 report of the development of high efficiency waste power generation technology. No.2 volume. Pilot plant verification test; Kokoritsu haikibutsu hatsuden gijutsu kaihatsu (pilot plant jissho shiken). 1997 nendo hokokusho (daini bunsatsu)

    Energy Technology Data Exchange (ETDEWEB)



    As to a high efficiency waste power generation system using general waste as fuel, the details of the following were described: design/construction management and operational study of pilot plant, design/manufacture/construction of pilot plant, and study of an optimal total system. Concerning the construction management and operational study, the paper described the application for governmental/official inspection procedures and taking inspection, process management of pilot plant, site patrol, safety management, management of trial run of pilot plant, drawing-up of a verification test plan and test run, etc. Relating to the design/manufacture/construction of pilot plant, an outline of the pilot plant was described. The paper also stated points to be considered in design of furnace structure and boiler structure, points to be considered of the verification test, etc. As to the study of an optimal total system, the following were described: survey of waste gasification/slagging power generation technology, basic study on RDF production process, survey of trends of waste power generation technology in the U.S., etc. 52 refs., 149 figs., 121 tabs.

  10. Current status of verification practices in clinical biochemistry in Spain. (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè


    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
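
    The kinds of rules the survey asks about (verification limits, delta checks) can be sketched as a small rule set. The limits and delta threshold below are purely illustrative, not values from the survey; real laboratories set site-specific criteria.

    ```python
    # Hedged sketch of an autoverification rule set for one analyte.
    # All numeric limits here are hypothetical placeholders.

    def autoverify(result, previous=None,
                   verify_low=2.0, verify_high=30.0, delta_limit=0.5):
        """Return (released, reason) for a single numeric result.

        verify_low/verify_high: verification limits (results outside
        them are held for manual review). delta_limit: maximum allowed
        fractional change versus the patient's previous result.
        """
        if not (verify_low <= result <= verify_high):
            return False, "outside verification limits"
        if previous is not None:
            # Delta check: flag a large relative change vs. the prior result
            delta = abs(result - previous) / previous
            if delta > delta_limit:
                return False, "delta check failed"
        return True, "autoverified"
    ```

    In practice such rules are layered with the other criteria the survey lists (instrument flags, sample-quality checks, concordance between parameters) before a result is auto-released.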

  11. Verification of Maximal Oxygen Uptake in Obese and Nonobese Children. (United States)

    Bhammar, Dharini M; Stickford, Jonathon L; Bernhardt, Vipa; Babb, Tony G


    The purpose of this study was to examine whether a supramaximal constant-load verification test at 105% of the highest work rate would yield a higher V˙O2max when compared with an incremental test in 10- to 12-yr-old nonobese and obese children. Nine nonobese (body mass index percentile = 57.5 ± 23.2) and nine obese (body mass index percentile = 97.9 ± 1.4) children completed a two-test protocol that included an incremental test followed 15 min later by a supramaximal constant-load verification test. The V˙O2max achieved in verification testing (nonobese = 1.71 ± 0.31 L·min⁻¹ and obese = 1.94 ± 0.47 L·min⁻¹) was significantly higher than that achieved during the incremental test (nonobese = 1.57 ± 0.27 L·min⁻¹ and obese = 1.84 ± 0.48 L·min⁻¹). There was no significant group-by-test (incremental vs. verification) interaction, suggesting that there was no effect of obesity on the difference between verification and incremental V˙O2max (P = 0.747). A verification test yielded significantly higher values of V˙O2max when compared with the incremental test in obese children. Similar results were observed in nonobese children. Supramaximal constant-load verification is a time-efficient and well-tolerated method for identifying the highest V˙O2 in nonobese and obese children.
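
    The logic of a verification bout can be sketched as a simple decision rule: the incremental V˙O2max is considered confirmed only if the verification value does not exceed it by more than some tolerance. The 3% tolerance below is an illustrative assumption, not a threshold from this study.

    ```python
    # Illustrative decision rule for a supramaximal verification test.
    # tol is a hypothetical fractional tolerance, not a study value.

    def vo2max_confirmed(incremental, verification, tol=0.03):
        """True if the verification-test V̇O2 does not exceed the
        incremental-test value by more than tol, i.e. the incremental
        test plausibly elicited a true maximum."""
        return verification <= incremental * (1 + tol)
    ```

    Applied to group means like those reported here, a verification value well above the incremental one would fail this rule, which is exactly why the authors argue the verification bout identifies the higher (truer) V˙O2max.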

  12. Structural verification of an aged composite reflector (United States)

    Lou, Michael C.; Tsuha, Walter S.


    A structural verification program applied to qualifying two heritage composite antenna reflectors for flight on the TOPEX satellite is outlined. The verification requirements and an integrated analyses/test approach employed to meet these requirements are described. Structural analysis results and qualification vibration test data are presented and discussed. It was determined that degradation of the composite and bonding materials caused by long-term exposure to an uncontrolled environment had not severely impaired the integrity of the reflector structures. The reflectors were assessed to be structurally adequate for the intended TOPEX application.

  13. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus


    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  14. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server



    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. These goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...
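
    The CDV loop (random stimulus, scoreboard check against a reference model, coverage deciding when to stop) can be mimicked in a few lines of Python. This is a conceptual analogue, not UVM/SystemVerilog; the DUT, golden model, and coverage goal are all illustrative.

    ```python
    import random

    # Toy coverage-driven verification loop. The DUT is a saturating
    # 4-bit adder; a reference model plays the scoreboard's role, and a
    # coverage set records which output values were exercised.

    def dut_add(a, b):
        return min(a + b, 15)      # device under test (saturates at 15)

    def reference_add(a, b):
        return min(a + b, 15)      # golden model used by the scoreboard

    def run_cdv(seed=0, goal=16):
        """Drive random stimuli until all `goal` output values are covered."""
        rng = random.Random(seed)
        coverage = set()
        while len(coverage) < goal:               # coverage closes the loop
            a, b = rng.randrange(16), rng.randrange(16)
            out = dut_add(a, b)
            assert out == reference_add(a, b)     # scoreboard check
            coverage.add(out)                     # coverage monitor
        return len(coverage)
    ```

    The key structural point carried over from UVM is that stimulus generation is random but progress is measured by coverage, so the run ends when the verification goal is met rather than when a fixed directed-test list is exhausted.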

  15. Certification and verification for calmac flat plate solar collector

    Energy Technology Data Exchange (ETDEWEB)


    This document contains information used in the certification and verification of the Calmac Flat Plate Collector. Contained are such items as test procedures and results, information on materials used, Installation, Operation, and Maintenance Manuals, and other information pertaining to the verification and certification.

  16. Grip-pattern verification for a smart gun

    NARCIS (Netherlands)

    Shang, X.; Groenland, J.P.J.; Groenland, J.P.J.; Veldhuis, Raymond N.J.

    In the biometric verification system of a smart gun, the rightful user of the gun is recognized based on grip-pattern recognition. It was found that the verification performance of grip-pattern recognition degrades strongly when the data for training and testing the classifier, respectively, have

  17. Report of the subpanel on methods of verification (United States)


    A program to improve the state of understanding and of the meaning of verification and the application of verification procedures to a variety of sensor systems is recommended. The program would involve an experimental hands-on data demonstration and evaluation of those procedures in a controlled test bed experiment.

  18. Monitoring and verification R&D

    Energy Technology Data Exchange (ETDEWEB)

    Pilat, Joseph F [Los Alamos National Laboratory; Budlong - Sylvester, Kory W [Los Alamos National Laboratory; Fearey, Bryan L [Los Alamos National Laboratory


    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for addressing other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  19. Nuclear disarmament verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.


    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  20. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy


    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  1. Acute thromboscintigraphy with (99m)Tc-apcitide: results of the phase 3 multicenter clinical trial comparing 99mTc-apcitide scintigraphy with contrast venography for imaging acute DVT. Multicenter Trial Investigators. (United States)

    Taillefer, R; Edell, S; Innes, G; Lister-James, J


    (99m)Tc-apcitide (formerly known as (99m)Tc-P280) is a radiolabeled peptide that binds with high affinity and specificity to the glycoprotein IIb/IIIa receptors expressed on the activated platelets that are involved in acute thrombosis. The purpose of the phase 3 multicenter clinical trials was to compare (99m)Tc-apcitide scintigraphy with contrast venography for imaging acute deep venous thrombosis (DVT). A total of 280 patients were enrolled in 2 clinical trials conducted in North America and Europe. Patients were to be within 10 d of onset of signs and symptoms of acute DVT or within 10 d of surgery associated with a high risk of DVT. (99m)Tc-apcitide scintigraphy and contrast venography were to be performed within 36 h. Planar scintigraphic images were obtained at 10, 60, and 120-180 min after injection. (99m)Tc-apcitide scintigrams and contrast venograms were read with masking and also by the institutional investigators. Of a total of 243 patients who were evaluable, 61.7% were receiving heparin at the time of imaging. Masked reading of (99m)Tc-apcitide scintigraphy, compared with masked reading of contrast venography, had a sensitivity, specificity, and agreement of 73.4%, 67.5%, and 69.1%, respectively, which met the prospectively defined target efficacy endpoint in both trials. Institutional reading of (99m)Tc-apcitide scintigraphy, compared with institutional reading of contrast venography, had a sensitivity, specificity, and agreement of 75.5%, 72.8%, and 74.0%, respectively. However, the entire trial population included patients with a history of DVT who may have had old, nonacute venous thrombi that could confound the venography results. Therefore, data from patients having no history of DVT or pulmonary embolism and who presented within 3 d of onset of signs and symptoms (n = 63), i.e., patients for whom a venogram would be expected to be positive only if acute DVT were present, also were analyzed as a subset. In these patients, institutional reading

  2. Turbulence Modeling Verification and Validation (United States)

    Rumsey, Christopher L.


    steps in the process. Verification insures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.

  3. Cognitive Bias in Systems Verification (United States)

    Larson, Steve


    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: Economics, Politics, Intelligence, and Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: Overconfidence -> questionable decisions to deploy; Availability -> inability to conceive critical tests; Representativeness -> overinterpretation of results; Positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  4. Writer identification and verification

    NARCIS (Netherlands)

    Schomaker, Lambert; Ratha, N; Govindaraju, V


    Writer identification and verification have gained increased interest recently, especially in the fields of forensic document examination and biometrics. Writer identification assigns a handwriting to one writer out of a set of writers. It determines whether or not a given handwritten text has in

  5. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael


    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
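
    The core idea of bytecode verification by data-flow analysis / abstract interpretation can be shown on a drastically simplified stack machine (a toy illustration, not the Java Virtual Machine's actual verifier or instruction set): values are abstracted to their types, and each instruction is checked against the abstract stack.

    ```python
    # Toy bytecode verifier: abstract interpretation over types.
    # code is a list of (opcode, argument) tuples for a tiny stack machine.

    def verify_bytecode(code):
        """Return True if every instruction sees operands of the
        expected type on the (abstract) stack, False otherwise."""
        stack = []  # abstract stack holding type names, not values
        for op, arg in code:
            if op == "push_int":
                stack.append("int")
            elif op == "push_str":
                stack.append("str")
            elif op == "iadd":  # integer add: consumes two ints, yields one
                if len(stack) < 2 or stack[-1] != "int" or stack[-2] != "int":
                    return False
                stack.pop()
                stack.pop()
                stack.append("int")
            else:
                return False    # unknown opcode rejected
        return True
    ```

    A real verifier must additionally handle branches (merging abstract states at join points, iterating to a fixed point), which is exactly where the definition/use information mentioned in the abstract comes into play.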

  6. Verification of method performance for clinical laboratories. (United States)

    Nichols, James H


    Method verification, a one-time process to determine performance characteristics before a test system is utilized for patient testing, is often confused with method validation, establishing the performance of a new diagnostic tool such as an internally developed or modified method. A number of international quality standards (International Organization for Standardization (ISO) and Clinical Laboratory Standards Institute (CLSI)), accreditation agency guidelines (College of American Pathologists (CAP), Joint Commission, U.K. Clinical Pathology Accreditation (CPA)), and regional laws (Clinical Laboratory Improvement Amendments of 1988 (CLIA'88)) exist describing the requirements for method verification and validation. Consumers of marketed test kits should verify method accuracy, precision, analytic measurement range, and the appropriateness of reference intervals to the institution's patient population. More extensive validation may be required for new methods and those manufacturer methods that have been modified by the laboratory, including analytic sensitivity and specificity. This manuscript compares the various recommendations for method verification and discusses the CLSI evaluation protocols (EP) that are available to guide laboratories in performing method verification experiments.
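
    A precision/accuracy verification experiment of the kind described can be sketched numerically (an illustrative simplification; the CLSI EP protocols specify replicate counts, durations, and statistical acceptance criteria in much more detail): run replicates of a control material and compare the observed imprecision and bias with the manufacturer's claims.

    ```python
    import statistics

    # Sketch of verifying precision and bias claims from replicate
    # measurements of a control material. Acceptance logic simplified.

    def verify_precision(replicates, target, claimed_cv_pct, claimed_bias_pct):
        """True if observed CV% and |bias%| are within the claims."""
        mean = statistics.mean(replicates)
        cv_pct = 100 * statistics.stdev(replicates) / mean
        bias_pct = 100 * (mean - target) / target
        return cv_pct <= claimed_cv_pct and abs(bias_pct) <= claimed_bias_pct
    ```

    Analytic measurement range and reference-interval appropriateness are verified with separate experiments (linearity series, reference-population samples), following the same pattern of comparing observed performance against a claim.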

  7. Combined use of clinical pre-test probability and D-dimer test in the diagnosis of preoperative deep venous thrombosis in colorectal cancer patients

    DEFF Research Database (Denmark)

    Stender, Mogens; Frøkjaer, Jens Brøndum; Hagedorn Nielsen, Tina Sandie


    The preoperative prevalence of deep venous thrombosis (DVT) in patients with colorectal cancer may be as high as 8%. In order to minimize the risk of pulmonary embolism, it is important to rule out preoperative DVT. A large study has confirmed that a negative D-dimer test in combination with a low clinical pre-test probability (PTP) can be safely used to rule out the tentative diagnosis of DVT in cancer patients. However, the accuracy in colorectal cancer patients is uncertain. This study assessed the diagnostic accuracy of a quantitative D-dimer assay in combination with the PTP score in ruling out... The negative predictive value, positive predictive value, sensitivity and specificity were 99% (95% confidence interval (CI), 95-100%), 17% (95% CI, 9-26%), 93% (95% CI, 68-100%) and 61% (95% CI, 53-68%), respectively. In conclusion, the combined use of pre-test probability and D-dimer test may be useful...
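
    The accuracy figures reported (NPV, PPV, sensitivity, specificity) all derive from a standard 2x2 table. As a sketch, with made-up counts rather than the study's data:

    ```python
    # Standard 2x2 diagnostic-accuracy computations. The counts used in
    # any example are illustrative, not taken from the study.

    def diagnostic_metrics(tp, fp, fn, tn):
        """tp/fp/fn/tn: true/false positives and negatives."""
        return {
            "sensitivity": tp / (tp + fn),  # of those with DVT, fraction detected
            "specificity": tn / (tn + fp),  # of those without, fraction cleared
            "ppv": tp / (tp + fp),          # positive predictive value
            "npv": tn / (tn + fn),          # negative predictive value
        }
    ```

    The clinically decisive number for a rule-out strategy is the NPV: a value near 99%, as reported here, means a negative combined PTP/D-dimer result leaves very little residual probability of DVT.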

  8. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)


    The verification of a water quality model is the one procedure most needed by decision makers evaluating model predictions, but it is often inadequate or not done at all. The results of a properly conducted verification provide the decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
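
    Residual-based verification statistics of the kind evaluated here can be sketched as follows (names follow common usage; this is not the paper's exact formulation or data):

    ```python
    import math

    # Simple verification statistics comparing model predictions with
    # observations via the residuals (predicted minus observed).

    def verification_stats(predicted, observed):
        residuals = [p - o for p, o in zip(predicted, observed)]
        n = len(residuals)
        me = sum(residuals) / n                    # mean error (bias)
        mae = sum(abs(r) for r in residuals) / n   # mean absolute error
        rmse = math.sqrt(sum(r * r for r in residuals) / n)  # root mean square error
        return me, mae, rmse
    ```

    Computing such measures per station and per state variable, as done for Chickamauga reservoir, is what lets spatial patterns of model inadequacy stand out.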

  9. A Tutorial on Text-Independent Speaker Verification

    Directory of Open Access Journals (Sweden)

    Frédéric Bimbot


    Full Text Available This paper presents an overview of a state-of-the-art text-independent speaker verification system. First, an introduction proposes a modular scheme of the training and test phases of a speaker verification system. Then, the speech parameterization most commonly used in speaker verification, namely cepstral analysis, is detailed. Gaussian mixture modeling, which is the speaker modeling technique used in most systems, is then explained. A few speaker modeling alternatives, namely neural networks and support vector machines, are mentioned. Normalization of scores is then explained, as this is a very important step to deal with real-world data. The evaluation of a speaker verification system is then detailed, and the detection error trade-off (DET) curve is explained. Several extensions of speaker verification are then enumerated, including speaker tracking and segmentation by speakers. Then, some applications of speaker verification are proposed, including on-site applications, remote applications, applications relative to structuring audio information, and games. Issues concerning the forensic area are then recalled, as we believe it is very important to inform people about the actual performance and limitations of speaker verification systems. This paper concludes by giving a few research trends in speaker verification for the next couple of years.
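The core decision in such a system is a likelihood-ratio test between a claimed-speaker model and a background model. A heavily simplified sketch, replacing the GMM/UBM of the paper with single diagonal Gaussians and synthetic "cepstral" frames, conveys the scoring idea:

```python
import numpy as np

# Minimal sketch of likelihood-ratio scoring in speaker verification.
# Real systems use Gaussian mixture models and a universal background
# model; here both are reduced to single diagonal Gaussians for brevity,
# and the feature frames are synthetic rather than cepstral coefficients.

def fit_diag_gaussian(X):
    return X.mean(axis=0), X.var(axis=0) + 1e-6

def log_likelihood(X, mean, var):
    # Sum of per-frame diagonal-Gaussian log-densities.
    return np.sum(-0.5 * (np.log(2 * np.pi * var) + (X - mean) ** 2 / var))

def verify(frames, target_model, background_model, threshold=0.0):
    # Accept when the average per-frame log-likelihood ratio exceeds
    # the decision threshold.
    llr = (log_likelihood(frames, *target_model)
           - log_likelihood(frames, *background_model)) / len(frames)
    return llr, llr > threshold

rng = np.random.default_rng(0)
target_train = rng.normal(loc=1.0, scale=1.0, size=(500, 12))
background = rng.normal(loc=-1.0, scale=1.0, size=(500, 12))
target_model = fit_diag_gaussian(target_train)
ubm = fit_diag_gaussian(background)

genuine = rng.normal(1.0, 1.0, size=(100, 12))
impostor = rng.normal(-1.0, 1.0, size=(100, 12))
print(verify(genuine, target_model, ubm)[1],
      verify(impostor, target_model, ubm)[1])
```

Sweeping the threshold over a set of genuine and impostor trials is what produces the DET curve discussed in the paper.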

  10. Quantitative reactive modeling and verification. (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.


    The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...

  12. Visual inspection for CTBT verification

    Energy Technology Data Exchange (ETDEWEB)

    Hawkins, W.; Wohletz, K.


    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  13. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)



    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed; a cost-sensitive classifier was found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that the system achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  14. GRAVITY Science Verification (United States)

    Mérand, A.; Berger, J.-P.; de Wit, W.-J.; Eisenhauer, F.; Haubois, X.; Paumard, T.; Schoeller, M.; Wittkowski, M.; Woillez, J.; Wolff, B.


    In the time between successfully commissioning an instrument and before offering it in the Call for Proposals for the first time, ESO gives the community at large an opportunity to apply for short Science Verification (SV) programmes. In 2016, ESO offered SV time for the second-generation Very Large Telescope Interferometer instrument GRAVITY. In this article we describe the selection process, outline the range of science cases covered by the approved SV programmes, and highlight some of the early scientific results.

  15. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty (United States)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias


    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the

  16. Gender verification of female athletes. (United States)

    Elsas, L J; Ljungqvist, A; Ferguson-Smith, M A; Simpson, J L; Genel, M; Carlson, A S; Ferris, E; de la Chapelle, A; Ehrhardt, A A


    The International Olympic Committee (IOC) officially mandated gender verification for female athletes beginning in 1968 and continuing through 1998. The rationale was to prevent masquerading males and women with "unfair, male-like" physical advantage from competing in female-only events. Visual observation and gynecological examination had been tried on a trial basis for two years at some competitions leading up to the 1968 Olympic Games, but these invasive and demeaning processes were jettisoned in favor of laboratory-based genetic tests. Sex chromatin and more recently DNA analyses for Y-specific male material were then required of all female athletes immediately preceding IOC-sanctioned sporting events, and many other international and national competitions following the IOC model. On-site gender verification has since been found to be highly discriminatory, and the cause of emotional trauma and social stigmatization for many females with problems of intersex who have been screened out from competition. Despite compelling evidence for the lack of scientific merit for chromosome-based screening for gender, as well as its functional and ethical inconsistencies, the IOC persisted in its policy for 30 years. The coauthors of this manuscript have worked with some success to rescind this policy through educating athletes and sports governors regarding the psychological and physical nature of sexual differentiation, and the inequities of genetic sex testing. In 1990, the International Amateur Athletics Federation (IAAF) called for abandonment of required genetic screening of women athletes, and by 1992 had adopted a fairer, medically justifiable model for preventing only male "impostors" in international track and field. At the recent recommendation of the IOC Athletes Commission, the Executive Board of the IOC has finally recognized the medical and functional inconsistencies and undue costs of chromosome-based methods. In 1999, the IOC ratified the abandonment of on

  17. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip


    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  18. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.


    In fingerprint verification, "verification" implies a user matching a fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  19. Thoughts on Verification of Nuclear Disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Dunlop, W H


    It is my pleasure to be here today to participate in this Conference. My thanks to the organizers for preparing such an interesting agenda on a very difficult topic. My effort in preparing my presentation was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48. And as many of you know, Lawrence Livermore National Laboratory is now, as of Oct 1st, under contract to Lawrence Livermore National Security, LLC. There has been a long history of how to view verification of arms control agreements. The basis for verification during the days of SALT was that verification would be based on each country's national technical means. For treaties dealing with strategic missiles this worked well, as the individual items subject to verification were of such a size that they were visible by the National Technical Means available at the time. And it was felt that the counting of missiles and launchers could be verified by our National Technical Means. For nuclear testing treaties, the use of seismic measurements developed into a capability that was reasonably robust for all but the smallest of nuclear tests. However, once we had the Threshold Test Ban Treaty, there was a significant problem in that the fidelity of the measurements was not sufficient to determine if a test was slightly above the 150 kt limit or slightly below it. This led some in the US to believe that the Soviet Union was not living up to the TTBT agreement. An on-site verification protocol was negotiated in 1988 and 1989 that allowed the US to make hydrodynamic yield measurements on Soviet tests above 50 kt yield and regional seismic measurements on all tests above 35 kt of yield, and the Soviets to make the same types of measurements on US tests to ensure that they were not over 150 kt. These on-site measurements were considered reasonably intrusive. Again the measurement capability was

  20. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo


    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems, a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  1. Earthquake induced rock shear through a deposition hole. Modelling of three model tests scaled 1:10. Verification of the bentonite material model and the calculation technique

    Energy Technology Data Exchange (ETDEWEB)

    Boergesson, Lennart (Clay Technology AB, Lund (Sweden)); Hernelind, Jan (5T Engineering AB, Vaesteraas (Sweden))


    Three model shear tests of very high quality, simulating a horizontal rock shear through a deposition hole at the centre of a canister, were performed in 1986. The tests and the results are described by /Boergesson 1986/. The tests simulated a deposition hole at the scale 1:10 with reference density of the buffer, a very stiff confinement simulating the rock, and a solid bar of copper simulating the canister. The three tests were almost identical, with the exception of the rate of shear, which was varied between 0.031 and 160 mm/s, i.e. by a factor of more than 5,000, and the density of the bentonite, which differed slightly. The tests were very well documented. Shear force, shear rate, total stress in the bentonite, strain in the copper and the movement of the top of the simulated canister were measured continuously during the shear. After the shear was finished, the equipment was dismantled and careful sampling of the bentonite, with measurement of water ratio and density, was made. The deformed copper 'canister' was also carefully measured after the test. The tests have been modelled with the finite element code Abaqus with the same models and techniques that were used for the full scale scenarios in SR-Site. The results have been compared with the measured results, which has yielded very valuable information about the relevance of the material models and the modelling technique. An elastic-plastic material model was used for the bentonite, where the stress-strain relations were derived from laboratory tests. The material model is made a function of both the density and the strain rate at shear. Since the shear is fast and takes place under undrained conditions, the density does not change during the tests. However, the strain rate varies largely with both the location of the elements and time. This can be taken into account in Abaqus by making the material model a function of the strain rate for each element. A similar model, based on tensile tests on the copper used in

  2. Optical secure image verification system based on ghost imaging (United States)

    Wu, Jingjing; Haobogedewude, Buyinggaridi; Liu, Zhengjun; Liu, Shutian


    Ghost imaging can perform Fourier-space filtering by tailoring the configuration. We propose a novel optical secure image verification system based on this theory, with the help of phase-matched filtering. In the verification process, the system key and the ID card, which contain the information of the correct image and the information to be verified, are put in the reference and the test paths, respectively. We demonstrate that the ghost imaging configuration can perform an incoherent correlation between the system key and the ID card. A correct verification manifests itself as a correlation peak in the ghost image. The primary image and the image to be verified are encrypted and encoded into pure phase masks beforehand for security. Multi-image secure verification can also be implemented in the proposed system.

  3. Mechanistic Model for Ash Deposit Formation in Biomass Suspension-Fired Boilers. Part 2: Model Verification by Use of Full Scale Tests

    DEFF Research Database (Denmark)

    Hansen, Stine Broholm; Jensen, Peter Arendt; Jappe Frandsen, Flemming


    A model for deposit formation in suspension firing of biomass has been developed. The model describes deposit build-up by diffusion and subsequent condensation of vapors, thermophoresis of aerosols, convective diffusion of small particles, impaction of large particles and reaction. The model...... describes particle sticking or rebound by a combination of a description of (visco)elastic particles impacting a solid surface and particle capture by a viscous surface. The model is used to predict deposit formation rates measured during tests conducted with probes in full-scale suspension-fired biomass...... boilers. The rates predicted by the model were reasonably able to follow the rates observed in the tests, although with some variation, primarily as overestimations of the deposit formation rates. It is considered that the capture properties of the deposit surface are overestimated. Further examination...

  4. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M


    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  5. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva


    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a methodology and a tool (PLCverif) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems.
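The essence of model checking is exhaustive exploration of a system's state space against a property. A toy illustration (PLCverif itself translates PLC programs into inputs for established model checkers; this sketch, with an invented two-variable "interlock", only conveys the idea):

```python
from collections import deque

# Toy explicit-state model checking: exhaustively explore the reachable
# states of a small (hypothetical) pump/valve controller via BFS and check
# a safety property on every reachable state. The controller is
# deliberately buggy so that a counterexample state is found.

def successors(state):
    pump_on, valve_open = state
    yield (not pump_on, valve_open)   # operator toggles the pump command
    yield (pump_on, pump_on)          # controller step: valve follows pump

def check_safety(initial, prop):
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if not prop(s):
            return False, s           # property violated: counterexample
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True, None                 # property holds on all reachable states

# Safety property: the valve must never be open while the pump is off.
ok, cex = check_safety((False, False), lambda s: not (s[1] and not s[0]))
print(ok, cex)
```

Unlike testing, this verdict covers every reachable state, which is why model checking complements testing for control logic; the challenge for real PLC programs is the much larger state space.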

  6. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)


    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.
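One common comparison measure in eigenvalue validation of this kind is the reactivity difference between calculated and reference k-effective values, expressed in pcm. A minimal sketch of the standard formula; the k-eff values below are hypothetical, not Shift results:

```python
# Reactivity difference in pcm (per cent mille, 1e-5) between a calculated
# eigenvalue and a reference (measured or benchmark) eigenvalue.
# The inputs in the example are hypothetical.

def reactivity_diff_pcm(k_calc, k_ref):
    return 1e5 * (k_calc - k_ref) / (k_calc * k_ref)

print(round(reactivity_diff_pcm(1.00150, 1.00000), 1))
```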

  7. Overview of Code Verification (United States)


    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  8. Nearest-Neighbor Estimation for ROC Analysis under Verification Bias. (United States)

    Adimari, Gianfranco; Chiogna, Monica


    For a continuous-scale diagnostic test, the receiver operating characteristic (ROC) curve is a popular tool for displaying the ability of the test to discriminate between healthy and diseased subjects. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the test result and other characteristics of the subjects. Estimators of the ROC curve based only on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias, in particular under the assumption that the true disease status, if missing, is missing at random (MAR). The MAR assumption means that the probability of missingness depends on the true disease status only through the test result and observed covariate information. However, the existing methods require parametric models for the (conditional) probability of disease and/or the (conditional) probability of verification, and hence are subject to model misspecification: a wrong specification of such parametric models can affect the behavior of the estimators, which can be inconsistent. To avoid misspecification problems, in this paper we propose a fully nonparametric method for the estimation of the ROC curve of a continuous test under verification bias. The method is based on nearest-neighbor imputation and adopts generic smooth regression models for both the probability that a subject is diseased and the probability that the subject is verified. Simulation experiments and an illustrative example show the usefulness of the new method.  Variance estimation is also discussed.
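The nearest-neighbor imputation idea can be sketched numerically: unverified subjects receive an imputed disease probability equal to the mean disease status of their k nearest verified neighbors (by test score), and the ROC curve is then computed over the full cohort. This is a simplification of the paper's estimator (no covariates, fixed k, synthetic data):

```python
import numpy as np

# Sketch of ROC estimation under verification bias with nearest-neighbor
# imputation. Verification is deliberately made score-dependent below, so
# a verified-only ROC would be biased; imputation restores the full cohort.

def knn_impute_disease(scores, disease, verified, k=5):
    p = np.empty(len(scores), dtype=float)
    v_idx = np.flatnonzero(verified)
    for i in range(len(scores)):
        if verified[i]:
            p[i] = disease[i]          # known status
        else:
            nearest = v_idx[np.argsort(np.abs(scores[v_idx] - scores[i]))[:k]]
            p[i] = disease[nearest].mean()   # imputed probability
    return p

def roc_points(scores, p):
    # Weighted empirical ROC: each subject counts p as diseased, 1-p as healthy.
    thresholds = np.sort(np.unique(scores))[::-1]
    pos, neg = p.sum(), (1 - p).sum()
    return [(((scores >= t) * (1 - p)).sum() / neg,    # FPR
             ((scores >= t) * p).sum() / pos)          # TPR
            for t in thresholds]

rng = np.random.default_rng(1)
disease = rng.random(400) < 0.3
scores = rng.normal(loc=disease.astype(float), scale=1.0)
verified = rng.random(400) < np.where(scores > 0, 0.9, 0.3)  # score-dependent
pts = roc_points(scores, knn_impute_disease(scores, disease, verified))
print(len(pts), pts[-1])
```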

  9. Round-Robin Verification and Final Development of the IEC 62788-1-5 Encapsulation Size Change Test; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Wohlgemuth, J.; Bokria, J.; Gu, X.; Honeker, C.; Murua, N.; Nickel, N.; Sakurai, K.; Shioda, T.; Tamizhmani, G.; Wang, E.; Yang, S.; Yoshihara, T.


    Polymeric encapsulation materials may change size when processed at typical module lamination temperatures. The relief of residual strain, trapped during the manufacture of encapsulation sheet, can affect module performance and reliability. For example, displaced cells and interconnects can lead to cell fracture; broken interconnects (open circuits and ground faults); delamination at interfaces; and void formation. A standardized test for the characterization of change in linear dimensions of encapsulation sheet has been developed and verified. The IEC 62788-1-5 standard quantifies the maximum change in linear dimensions that may occur, to allow for process control of size change. Developments incorporated into the Committee Draft (CD) of the standard, as well as the assessment of the repeatability and reproducibility of the test method, are described here. No pass/fail criteria are given in the standard; rather, a repeatable protocol to quantify the change in dimension is provided to aid those working with encapsulation. The round-robin experiment described here identified that the repeatability and reproducibility of measurements are on the order of 1%. Recent refinements to the test procedure to improve repeatability and reproducibility include: the use of a convection oven to improve the thermal equilibration time constant and its uniformity; well-defined measurement locations to reduce the effects of sampling size and location relative to the specimen edges; a standardized sand substrate that may be readily obtained to reduce friction that would otherwise complicate the results; defined specimen sampling, so that material is examined at known sites across the width and length of rolls; and examination of the encapsulation at the manufacturer's recommended processing temperature, except when a cross-linking reaction may limit the size change. EVA, for example, should be examined at 100 °C, between its melt transition (occurring up to 80 °C) and the onset of cross
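The controlled quantity is the percent change in linear dimensions of the sheet after thermal exposure. A minimal sketch of the computation; the marker distances in the example are hypothetical:

```python
# Percent change in a linear dimension of an encapsulation specimen,
# from marker distances measured before and after thermal exposure.
# Negative values indicate shrinkage. Example distances are hypothetical.

def size_change_percent(before_mm, after_mm):
    return 100.0 * (after_mm - before_mm) / before_mm

shrinkage = size_change_percent(before_mm=200.0, after_mm=195.0)
print(f"{shrinkage:+.1f}%")
```

With measurement repeatability and reproducibility on the order of 1%, as found in the round-robin, size changes of a few percent are comfortably resolvable.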

  10. Functions of social support and self-verification in association with loneliness, depression, and stress. (United States)

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny


    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  11. A simple quality assurance test tool for the visual verification of light and radiation field congruence using electronic portal imaging devices and computed radiography

    Directory of Open Access Journals (Sweden)

    Njeh Christopher F


    Full Text Available Abstract Background The radiation field on most megavoltage radiation therapy units is shown by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment. Hence it is imperative that the light field be congruent with the radiation field. Method A simple quality assurance tool has been designed for rapid and simple testing of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results Both the single- and double-exposure techniques were evaluated, with the double-exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be verified to within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group report 142 recommendation of a 2 mm tolerance. Conclusion The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence.
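The pass/fail logic implied by the abstract reduces to comparing measured light-field and radiation-field edge positions against the TG-142 tolerance. A hedged sketch; the edge coordinates below are hypothetical, not values from the study:

```python
# Congruence check of light vs. radiation field edges against the
# AAPM TG-142 2 mm tolerance. Edge positions (in mm, relative to the
# beam central axis) would come from EPID/CR marker analysis; the
# example values are hypothetical.

TG142_TOLERANCE_MM = 2.0

def congruence_ok(light_edges_mm, radiation_edges_mm, tol=TG142_TOLERANCE_MM):
    deviations = [abs(l - r) for l, r in zip(light_edges_mm, radiation_edges_mm)]
    worst = max(deviations)
    return worst <= tol, worst

# Four field edges: x1, x2, y1, y2 positions in mm for a 10 x 10 cm field.
ok, worst = congruence_ok([-50.0, 50.0, -50.0, 50.0],
                          [-49.2, 50.6, -50.4, 49.5])
print(ok, worst)
```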

  12. Reconfigurable system design and verification

    CERN Document Server

    Hsiung, Pao-Ann; Huang, Chun-Hsian


    Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains. The text e

  13. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    methods in the verification task. Today formal verification is finding increasing acceptance ... approaches that are major research issues in formal verification research today. There are four articles in this issue, which show up the different flavours in the approach to formal methods in verification. The first paper by Supratik ...

  14. Verification methodology manual for SystemVerilog

    CERN Document Server

    Bergeron, Janick; Hunter, Alan


    SystemVerilog is a unified language that serves both design and verification engineers by including RTL design constructs, assertions and a rich set of verification constructs. This book is based upon best verification practices by ARM, Synopsys and their customers. It is useful for those involved in the design or verification of a complex chip.

  15. Performance verification and system integration tests of the pulse shape processor for the soft x-ray spectrometer onboard ASTRO-H (United States)

    Takeda, Sawako; Tashiro, Makoto S.; Ishisaki, Yoshitaka; Tsujimoto, Masahiro; Seta, Hiromi; Shimoda, Yuya; Yamaguchi, Sunao; Uehara, Sho; Terada, Yukikatsu; Fujimoto, Ryuichi; Mitsuda, Kazuhisa


    The soft X-ray spectrometer (SXS) aboard ASTRO-H is equipped with dedicated digital signal processing units called pulse shape processors (PSPs). The X-ray microcalorimeter system SXS has 36 sensor pixels, which are operated at 50 mK to measure the heat input of X-ray photons and realize an energy resolution of 7 eV FWHM in the range 0.3-12.0 keV. Front-end signal processing electronics are used to filter and amplify the electrical pulse output from the sensor and for analog-to-digital conversion. The digitized pulses from the 36 pixels are multiplexed and sent to the PSP over low-voltage differential signaling lines. Each of the two identical PSP units consists of an FPGA board, which implements the hardware logic, and two CPU boards, which run the onboard software. The FPGA board triggers on every pixel event and stores the triggering information as a pulse waveform in the installed memory. The CPU boards read the event data to evaluate pulse heights by an optimal filtering algorithm. The evaluated X-ray photon data (including the pixel ID, energy, and arrival time information) are transferred to the satellite data recorder along with event quality information. The PSP units have been developed and tested with the engineering model (EM) and the flight model. Utilizing the EM PSP, we successfully verified the entire hardware system and the basic software design of the PSPs, including their communication capability and signal processing performance. In this paper, we show the key metrics of the EM test, such as accuracy and synchronicity of sampling clocks, event grading capability, and resultant energy resolution.
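    The pulse-height evaluation described in this record is based on optimal filtering; under a white-noise assumption it reduces to a matched filter. A minimal sketch (the exponential template and amplitude are illustrative, not actual SXS values):

```python
import math

# Illustrative double-exponential pulse template (not the real SXS pulse shape)
n = 200
template = [math.exp(-i / 30.0) - math.exp(-i / 5.0) for i in range(n)]

def pulse_height(record, template):
    # Matched (optimal) filter for white noise: the best linear unbiased
    # estimate of the amplitude is <record, template> / <template, template>
    num = sum(r * t for r, t in zip(record, template))
    den = sum(t * t for t in template)
    return num / den

# A noiseless record that is exactly 3.5x the template is recovered exactly
record = [3.5 * t for t in template]
h = pulse_height(record, template)
```

    In the flight system the filter is additionally weighted by the measured noise spectrum; the dot-product form above is the white-noise special case.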

  16. Numident Online Verification Utility (NOVU) (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  17. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer


    to the high complexity of both the dynamical system and the specification. Therefore, there is a need for methods capable of verifying complex specifications of complex systems. The verification of high dimensional continuous dynamical systems is the key to verifying general systems. In this thesis......, an abstraction approach is taken to the verification problem. A method is developed for abstracting continuous dynamical systems by timed automata. This method is based on subdividing the state space into cells by use of subdivisioning functions that are decreasing along the vector field. To allow....... It is shown that dual decomposition can be applied on the problem of generating barrier certificates, resulting in a compositional formulation of the safety verification problem. This makes the barrier certificate method applicable to the verification of high dimensional systems, but at the cost...

  18. Biometric verification of a subject through eye movements. (United States)

    Juhola, Martti; Zhang, Youming; Rasku, Jyrki


    Biometric verification of persons by matching digital fingerprint, face or iris images has advanced. Notwithstanding this progress, it is no easy computational task because of the great volume of complicated data. Since the 1990s, eye movements, previously applied only in various medical and psychological tests, have also been studied for computer interfaces. Such a short one-dimensional measurement signal contains less data than images and may therefore be simpler and faster to recognize. Using saccadic eye movements, we developed a computational verification method to reliably distinguish a legitimate person, or a subject in general, from others. We tested features extracted from signals recorded from saccadic eye movements. We used saccades of 19 healthy subjects and 21 otoneurological patients recorded with electro-oculography and an additional 40 healthy subjects recorded with a video camera system. Verification tests produced high accuracies. Copyright © 2012 Elsevier Ltd. All rights reserved.
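    Saccade-based verification methods typically extract features such as amplitude, peak velocity and duration from the recorded eye-position signal. A minimal sketch, using a synthetic logistic-shaped saccade rather than real recordings (the signal shape and sampling rate are illustrative):

```python
import math

def saccade_features(position, dt):
    # position: 1-D eye position samples (deg); dt: sample interval (s)
    velocity = [(position[i + 1] - position[i]) / dt
                for i in range(len(position) - 1)]
    amplitude = position[-1] - position[0]          # deg
    peak_velocity = max(abs(v) for v in velocity)   # deg/s
    duration = len(position) * dt                   # s
    return amplitude, peak_velocity, duration

# Synthetic 10-degree saccade sampled at 1 kHz (illustrative only)
dt = 0.001
pos = [10.0 / (1.0 + math.exp(-(t - 25) / 4.0)) for t in range(50)]
amp, pv, dur = saccade_features(pos, dt)
```

    Feature vectors of this kind, one per saccade, are then compared between the claimed identity's enrollment data and the test recording.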

  19. FEFTRA {sup TM} verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)


    FEFTRA is a finite element program package developed at VTT for the analysis of groundwater flow in Posiva's site evaluation programme, which seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of its verification. In 2006 a project was launched with the objective of reorganising all the material related to the existing verification cases and placing it in the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added to the new testing system. The documentation of each case was rewritten with the LaTeX document preparation system and added to the testing system in such a way that the whole test documentation (this report) can easily be generated in PostScript or PDF format. The current report is the updated version of the verification report published in 2007. At the moment the report mainly includes cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to analytical, semianalytical and/or other numerical solutions demonstrates the capability of FEFTRA to simulate such problems.

  20. Global survey of malaria rapid diagnostic test (RDT) sales, procurement and lot verification practices: assessing the use of the WHO-FIND Malaria RDT Evaluation Programme (2011-2014). (United States)

    Incardona, Sandra; Serra-Casas, Elisa; Champouillon, Nora; Nsanzabana, Christian; Cunningham, Jane; González, Iveth J


    Malaria rapid diagnostic tests (RDTs) play a critical role in malaria case management, and assurance of quality is a key factor to promote good adherence to test results. Since 2007, the World Health Organization (WHO) and the Foundation for Innovative New Diagnostics (FIND) have coordinated a Malaria RDT Evaluation Programme, comprising a pre-purchase performance evaluation (product testing, PT) and a pre-distribution quality control of lots (lot testing, LT), the former being the basis of WHO recommendations for RDT procurement. Comprehensive information on malaria RDTs sold worldwide based on manufacturers' data and linked to independent performance data is currently not available, and detailed knowledge of procurement practices remains limited. The use of the PT/LT Programme results as well as procurement and lot verification practices were assessed through a large-scale survey, gathering product-specific RDT sales and procurement data (2011-14 period) from a total of 32 manufacturers, 12 procurers and 68 National Malaria Control Programmes (NMCPs). Manufacturers' reports showed that RDT sales had more than doubled over the four years, and confirmed a trend towards increased compliance with the WHO procurement criteria (from 83% in 2011 to 93% in 2014). Country-level reports indicated that 74% of NMCPs procured only 'WHO-compliant' RDT products, although procurers' transactions datasets revealed a surprisingly frequent overlap of different products and even product types (e.g., Plasmodium falciparum-only and Plasmodium-pan) in the same year and country (60 and 46% of countries, respectively). Importantly, the proportion of 'non-complying' (i.e., PT low scored or not evaluated) products was found to be higher in the private health care sector than in the public sector (32% vs 5%), and increasing over time (from 22% of private sector sales in 2011 to 39% in 2014). An estimated 70% of the RDT market was covered by the LT programme. The opinion about the PT

  1. Solid waste operations complex engineering verification program plan

    Energy Technology Data Exchange (ETDEWEB)

    Bergeson, C.L.


    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP.

  2. Verification of computational models of cardiac electro-physiology. (United States)

    Pathmanathan, Pras; Gray, Richard A


    For computational models of cardiac activity to be used in safety-critical clinical decision-making, thorough and rigorous testing of the accuracy of predictions is required. The field of 'verification, validation and uncertainty quantification' has been developed to evaluate the credibility of computational predictions. The first stage, verification, is the evaluation of how well computational software correctly solves the underlying mathematical equations. The aim of this paper is to introduce novel methods for verifying multi-cellular electro-physiological solvers, a crucial first stage for solvers to be used with confidence in clinical applications. We define 1D-3D model problems with exact solutions for each of the monodomain, bidomain, and bidomain-with-perfusing-bath formulations of cardiac electro-physiology, which allow for the first time the testing of cardiac solvers against exact errors on fully coupled problems in all dimensions. These problems are carefully constructed so that they can be easily run using a general solver and can be used to greatly increase confidence that an implementation is correct, which we illustrate by testing one major solver, 'Chaste', on the problems. We then perform case studies on calculation verification (also known as solution verification) for two specific applications. We conclude by making several recommendations regarding verification in cardiac modelling. Copyright © 2013 John Wiley & Sons, Ltd.


    The U.S. Environmental Protection Agency Air Pollution Control Technology (APCT) Verification Center evaluates the performance of baghouse filtration products used primarily to control PM2.5 emissions. This verification statement summarizes the test results for W.L. Gore & Assoc....

  4. 40 CFR 1065.341 - CVS and batch sampler verification (propane check). (United States)


    ... (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications Flow-Related... engineering judgment and safe practices, this check may be performed using a gas other than propane, such as... components. (3) Poor mixing. Perform the verification as described in this section while traversing a...

  5. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra


    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures for reference and 5 signatures each from the original user, simple impostors and trained impostors as test signatures. The final system was tested with 50 participants with 3 references. The tests showed that system accuracy without impostors is 90.44897959% at threshold 44, with a rejection error rate (FNMR) of 5.2% and an acceptance error rate (FMR) of 4.35102%; with impostors, system accuracy is 80.1361% at threshold 27, with a rejection error rate (FNMR) of 15.6% and an average acceptance error rate (FMR) of 4.263946%, with details as follows: acceptance error 0.391837%, simple-impostor acceptance error 3.2% and trained-impostor acceptance error 9.2%.
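    The matching step described in this record uses dynamic time warping. A minimal sketch of DTW-based signature verification (the sequences and threshold are illustrative, not the study's features or tuned threshold):

```python
def dtw_distance(a, b):
    # Classic dynamic time warping between two 1-D sequences
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

def verify(test_sig, references, threshold):
    # Accept when the mean DTW distance to the reference signatures
    # is below a decision threshold
    mean_d = sum(dtw_distance(test_sig, r) for r in references) / len(references)
    return mean_d <= threshold

# Illustrative feature sequences for a genuine attempt and two references
genuine = [1, 2, 3, 4, 3, 2, 1]
refs = [[1, 2, 3, 4, 3, 2, 1], [1, 2, 4, 4, 3, 2, 1]]
accepted = verify(genuine, refs, 2.0)
```

    Sweeping the threshold over a labeled set of genuine and impostor attempts is what produces the FNMR/FMR trade-off reported in the abstract.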

  6. Seismic Surveillance. Nuclear Test Ban Verification (United States)


    as the Horn Graben. RFH appears to have been elevated in the Carboniferous while taphrogenesis has been related to the late Carboniferous - early... SCANDINAVIA: CARBONIFEROUS TO PRESENT The western part of S. Scandinavia has experienced volcanic activity as shown in Fig. 8. Paleozoic magmatism is... with age ranges from Permian to Tertiary. In summary, evidence of volcanic activity throughout the Carboniferous to the present has been found and is

  7. Seismic Surveillance - Nuclear Test Ban Verification (United States)


    shear in the lower crust below Skagerrak. Conventional rifting scenarios incorporating magmatic underplating of Moho is not considered tenable in our... (Fig. 3b). The Skagerrak block is detached along the FFZ (18) (Fig. 1 and 2c) where Permian magmatic activity has been reported (29,20). Finally

  8. Built-in-Test Verification Techniques (United States)


    are currently underway to develop military applications. The field of artificial intelligence generally includes natural language processing, robotics ...because of its applicability being limited to analog circuits. This narrowed the evaluation to the 11Mf and simulation approaches. The current flEA

  9. Code Verification by the Method of Manufactured Solutions

    Energy Technology Data Exchange (ETDEWEB)



    A procedure for code verification by the Method of Manufactured Solutions (MMS) is presented. Although the procedure requires a certain amount of creativity and skill, we show that MMS can be applied to a variety of engineering codes which numerically solve partial differential equations. This is illustrated by detailed examples from computational fluid dynamics. The strength of the MMS procedure is that it can identify any coding mistake that affects the order-of-accuracy of the numerical method. A set of examples which use a blind-test protocol demonstrates the kinds of coding mistakes that can (and cannot) be exposed via the MMS code verification procedure. The principal advantage of the MMS procedure over traditional methods of code verification is that code capabilities are tested in full generality. The procedure thus results in a high degree of confidence that all coding mistakes which prevent the equations from being solved correctly have been identified.
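    The MMS procedure can be illustrated on a simple solver: choose a manufactured solution, derive the source term analytically, run the code with that source, and check the observed order of accuracy. A sketch for a second-order 1-D Poisson solver (the solver and manufactured solution are illustrative, not taken from the report):

```python
import math

def solve_poisson(f, n):
    # Solve -u'' = f on [0,1] with u(0)=u(1)=0, second-order finite differences,
    # using the Thomas algorithm for the tridiagonal system
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    a = [-1.0] * (n - 1)                       # sub-diagonal
    b = [2.0] * (n - 1)                        # diagonal
    c = [-1.0] * (n - 1)                       # super-diagonal
    d = [h * h * f(x[i + 1]) for i in range(n - 1)]
    for i in range(1, n - 1):                  # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n + 1)
    u[n - 1] = d[-1] / b[-1]                   # back substitution
    for i in range(n - 3, -1, -1):
        u[i + 1] = (d[i] - c[i] * u[i + 2]) / b[i]
    return x, u

def mms_error(n):
    # Manufactured solution u_m = sin(pi x)  =>  source f = pi^2 sin(pi x)
    u_exact = lambda x: math.sin(math.pi * x)
    f = lambda x: math.pi ** 2 * math.sin(math.pi * x)
    x, u = solve_poisson(f, n)
    return max(abs(u[i] - u_exact(x[i])) for i in range(n + 1))

e1, e2 = mms_error(32), mms_error(64)
order = math.log(e1 / e2, 2)   # observed order; ~2 for a second-order scheme
```

    A coding mistake that degrades the discretization (say, a wrong stencil coefficient) would show up here as an observed order below 2.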

  10. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno


    The project 'Verification of the thermal design of electronic equipment' studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the 'Cool Electronics' research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  11. State of the Art: Signature Biometrics Verification

    Directory of Open Access Journals (Sweden)

    Nourddine Guersi


    Full Text Available This paper presents a comparative analysis of the performance of three estimation algorithms based on Gaussian mixture models (GMMs) for signature biometrics verification: Expectation Maximization (EM), the Greedy EM Algorithm (GEM) and the Figueiredo-Jain Algorithm (FJ). The simulation results show significant performance achievements. The test performance of EER = 5.49% for EM, EER = 5.04% for GEM and EER = 5.00% for FJ shows that the behavioural information scheme of signature biometrics is robust and has discriminating power, which can be explored for identity authentication.
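    The EM algorithm at the core of the compared estimators fits GMM parameters by alternating responsibility computation (E-step) and parameter re-estimation (M-step). A bare-bones 1-D sketch on synthetic data (not the signature features or models used in the study):

```python
import math, random

def em_gmm_1d(data, k=2, iters=50):
    # Bare-bones EM for a 1-D Gaussian mixture (no restarts, no
    # convergence test); crude min/max initialization assumes k=2
    mu = [min(data), max(data)]
    var = [1.0] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[j] / math.sqrt(2 * math.pi * var[j]) *
                 math.exp(-(x - mu[j]) ** 2 / (2 * var[j])) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M-step: re-estimate weights, means and variances
        for j in range(k):
            nj = sum(r[j] for r in resp)
            w[j] = nj / len(data)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = max(sum(r[j] * (x - mu[j]) ** 2
                             for r, x in zip(resp, data)) / nj, 1e-6)
    return w, mu, var

# Two well-separated synthetic clusters; EM should recover means near 0 and 10
random.seed(1)
data = ([random.gauss(0.0, 1.0) for _ in range(200)] +
        [random.gauss(10.0, 1.0) for _ in range(200)])
w, mu, var = em_gmm_1d(data)
```

    GEM and FJ differ mainly in how they choose the number of components and escape poor initializations; the E/M updates themselves are as above.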

  12. Achievement report for fiscal 1998 on the development of superconductor power application technology. 2. Research and development of superconducting wire and superconductive power generator, research of total system, research and development of refrigeration system, and verification test; 1998 nendo chodendo denryoku oyo gijutsu kaihatsu seika hokokusho. 2. Chodendo senzai no kenkyu kaihatsu, chodendo hatsudenki no kenkyu kaihatsu, total sytsem no kenkyu, reito system no kenkyu kaihatsu, jissho shiken

    Energy Technology Data Exchange (ETDEWEB)



    For the slow-excitation-response power generator, the rotor and stator of a 70,000 kW-class model were combined and subjected to an on-site verification test, with good results. The rotor was disassembled for inspection, and its members were found to be sound, with no problems in mechanical strength. For the quick-excitation-response type, a 70,000 kW model was experimentally built and subjected to an on-site verification test after a rotation and excitation test in the factory, and the pilot-machine concept design was reviewed. In the study of the total system, efforts continue on reviewing the model-machine test method, improving generator design and analytical methods, developing operating methods, and assessing the effect of introducing the generator into the power system. Since a helium refrigeration system is required to exhibit high reliability for application to power equipment and to be capable of continuous long-period operation, a system having reliability-enhanced constituents and appropriate redundancy was developed, and a verification study is under way that will continue for more than 10,000 hours. An oil-free low-temperature turbo refrigerator is also described. The latest quick-excitation-response rotor is also tested for verification. (NEDO)

  13. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)


    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
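    A hydrostatic column model of this kind integrates the hydrostatic relation dP = -ρ g dz over the stacked fluid layers to relate pressure at depth to wellhead pressure. A minimal sketch with illustrative densities and depths (not SPR data):

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_bottom, layers):
    # layers: list of (density kg/m^3, thickness m), ordered bottom to top.
    # Walk up the column, subtracting each layer's hydrostatic head.
    p = p_bottom
    for rho, h in layers:
        p -= rho * G * h
    return p

# Illustrative column: 600 m of brine below 400 m of pressurized nitrogen
p_top = wellhead_pressure(12.0e6, [(1200.0, 600.0), (150.0, 400.0)])
```

    In leak diagnosis, a measured wellhead pressure drifting away from this prediction (with the interface depth tracked over time) is the signature the model helps detect.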

  14. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R


    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  15. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti


    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
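    A forward-chaining monitor of the kind described keeps only the set of open obligations as state, firing a rule per incoming event rather than expanding a tableau. A minimal sketch for the FLTL response property "every req is eventually followed by an ack" (the property and event names are illustrative):

```python
class Monitor:
    # Single-state forward-chaining monitor for the response property
    # "every 'req' is eventually followed by an 'ack'" over an
    # expanding (finite but growing) trace.
    def __init__(self):
        self.pending = 0   # obligations not yet discharged

    def step(self, event):
        if event == "req":
            self.pending += 1      # rule fires: new obligation
        elif event == "ack" and self.pending:
            self.pending -= 1      # rule fires: discharge one obligation
        # verdict on the trace seen so far: True iff no open obligation
        return self.pending == 0

m = Monitor()
verdicts = [m.step(e) for e in ["req", "ack", "req", "req", "ack"]]
```

    The monitor's state never grows with the trace, which is the source of the scalability claimed in the abstract; tableau-based constructions, by contrast, can branch exponentially in the formula size.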

  16. Formal verification of mathematical software (United States)

    Sutherland, D.


    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  17. Pairwise Identity Verification via Linear Concentrative Metric Learning. (United States)

    Zheng, Lilei; Duffner, Stefan; Idrissi, Khalid; Garcia, Christophe; Baskurt, Atilla


    This paper presents a study of metric learning systems on pairwise identity verification, including pairwise face verification and pairwise speaker verification, respectively. These problems are challenging because the individuals in training and testing are mutually exclusive, and also due to the probable setting of limited training data. For such pairwise verification problems, we present a general framework of metric learning systems and employ the stochastic gradient descent algorithm as the optimization solution. We have studied both similarity metric learning and distance metric learning systems, of either a linear or shallow nonlinear model under both restricted and unrestricted training settings. Extensive experiments demonstrate that with limited training pairs, learning a linear system on similar pairs only is preferable due to its simplicity and superiority, i.e., it generally achieves competitive performance on both the labeled faces in the wild face dataset and the NIST speaker dataset. It is also found that a pretrained deep nonlinear model helps to improve the face verification results significantly.
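    Pairwise verification ultimately reduces to thresholding a similarity score between two feature vectors. A minimal sketch using fixed cosine similarity in place of a learned metric (the vectors and threshold are illustrative, not from the paper):

```python
import math

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def same_identity(feat_a, feat_b, threshold=0.8):
    # Pairwise verification: declare "same person" when the similarity
    # between the two feature vectors exceeds a decision threshold
    return cosine_similarity(feat_a, feat_b) >= threshold

a = [1.0, 2.0, 3.0]
b = [1.1, 1.9, 3.2]   # near-duplicate features
c = [3.0, -1.0, 0.5]  # unrelated features
match = same_identity(a, b)
mismatch = same_identity(a, c)
```

    Metric learning replaces the fixed cosine with a similarity parameterized by a learned matrix, trained so genuine pairs score above the threshold and impostor pairs below it.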

  18. Use of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments Across International Borders -Test/QA Plan (United States)

    The Environmental Technology Verification (ETV) – Environmental and Sustainable Technology Evaluations (ESTE) Program conducts third-party verification testing of commercially available technologies that may accomplish environmental program management goals. In this verification...

  19. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka


    Full Text Available The methodology of system requirements verification presented in this paper is a practical procedure for reducing some deficiencies in the specification of requirements. The main problem considered is creating a complete description of the system requirements without such deficiencies. Verification of the initially defined requirements is based on coloured Petri nets, which are useful for testing properties of system requirements such as completeness, consistency and optimality. An example of the lift controller is presented.

  20. A Formal Approach to the Verification of Networks on Chip

    Directory of Open Access Journals (Sweden)

    Schmaltz Julien


    Full Text Available Abstract The current technology allows the integration on a single die of complex systems-on-chip (SoCs) that are composed of manufactured blocks (IPs), interconnected through specialized networks on chip (NoCs). IPs have usually been validated by diverse techniques (simulation, test, formal verification), and the key problem remains the validation of the communication infrastructure. This paper addresses the formal verification of NoCs by means of a mechanized proof tool, the ACL2 theorem prover. A metamodel for NoCs has been developed and implemented in ACL2. This metamodel satisfies a generic correctness statement. Its verification for a particular NoC instance is reduced to discharging a set of proof obligations for each one of the NoC constituents. The methodology is demonstrated on a realistic and state-of-the-art design, the Spidergon network from STMicroelectronics.

  1. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.


    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  2. A microsatellite panel for triploid verification in the abalone Haliotis ...

    African Journals Online (AJOL)

    A method for ploidy verification of triploid and diploid Haliotis midae was developed using molecular microsatellite markers. In all, 30 microsatellite loci were tested in control populations. A final microsatellite multiplex consisting of seven markers was optimised and a complete protocol is reported. This protocol was ...

  3. Algebraic verification of a distributed summation algorithm


    Groote, Jan Friso; Springintveld, J.G.


    In this note we present an algebraic verification of Segall's Propagation of Information with Feedback (PIF) algorithm. This algorithm serves as a nice benchmark for verification exercises (see [2, 13, 8]). The verification is based on the methodology presented in [7] and demonstrates its applicability to distributed algorithms.

  4. Gender verification of female Olympic athletes. (United States)

    Dickinson, Barry D; Genel, Myron; Robinowitz, Carolyn B; Turner, Patricia L; Woods, Gary L


    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Problems include invalid screening tests, failure to understand the problems of intersex, the discriminatory singling out of women based only on laboratory results, and the stigmatization and emotional trauma experienced by individuals screened positive. Genuine sex-impostors have not been uncovered by laboratory-based genetic testing; however, gender verification procedures have resulted in substantial harm to a number of women athletes born with relatively rare genetic abnormalities. Individuals with sex-related genetic abnormalities raised as females have no unfair physical advantage and should not be excluded or stigmatized, including those with 5-alpha-steroid-reductase deficiency, partial or complete androgen insensitivity, and chromosomal mosaicism. In 1990, the International Amateur Athletics Federation (IAAF) called for ending genetic screening of female athletes and in 1992 adopted an approach designed to prevent only male impostors from competing. The IAAF recommended that the "medical delegate" have the ultimate authority in all medical matters, including the authority to arrange for the determination of the gender of the competitor if that approach is judged necessary. The new policy advocated by the IAAF, and conditionally adopted by the International Olympic Committee, protects the rights and privacy of athletes while safeguarding fairness of competition, and the American Medical Association recommends that it become the permanent approach.

  5. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)



    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  6. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    followed by a detailed layout of the various components of the system. The next step is the coding phase, which usually also includes testing of the individual modules that are coded. Coding is followed by testing of the components and successful ...

  7. Program Verification and System Dependability (United States)

    Jackson, Michael

    Formal verification of program correctness is a long-standing ambition, recently given added prominence by a “Grand Challenge” project. Major emphases have been on the improvement of languages for program specification and program development, and on the construction of verification tools. The emphasis on tools commands general assent, but while some researchers focus on narrow verification aimed only at program correctness, others want to pursue wide verification aimed at the larger goal of system dependability. This paper presents an approach to system dependability based on problem frames and suggests how this approach can be supported by formal software tools. Dependability is to be understood and evaluated in the physical and human problem world of a system. The complexity and non-formal nature of the problem world demand the development and evolution of normal designs and normal design practices for specialised classes of systems and subsystems. The problem frames discipline of systems analysis and development that can support normal design practices is explained and illustrated. The role of formal reasoning in achieving dependability is discussed and some conceptual, linguistic and software tools are suggested.

  8. Ultrasonic verification of composite structures

    NARCIS (Netherlands)

    Pelt, Maurice; de Boer, Robert Jan; Schoemaker, Christiaan; Sprik, Rudolf


    Ultrasonic Verification is a new method for monitoring large surface areas of CFRP by ultrasound with few sensors. The echo response of a transmitted pulse through the structure is compared with the response of a previously obtained reference signal to calculate a fidelity parameter.
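    The abstract does not define the fidelity parameter exactly; one plausible choice, shown here purely as an assumption, is the normalized zero-lag cross-correlation between the current echo and the stored reference:

```python
import math

def fidelity(reference, signal):
    """Normalized cross-correlation at zero lag: 1.0 means the echo
    matches the baseline exactly; values below 1 indicate a change
    (e.g. new damage) somewhere along the acoustic path."""
    num = sum(r * s for r, s in zip(reference, signal))
    den = math.sqrt(sum(r * r for r in reference) *
                    sum(s * s for s in signal))
    return num / den if den else 0.0
```

    A threshold on this parameter would then trigger closer inspection of the monitored area.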

  9. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas


    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples: a t...

  10. Private Verification for FPGA Bitstreams (United States)


    security risks. Keywords: Trust, Privacy, Hardware Trojan, Hardware Security, ASIC, FPGA, Bitstream. Introduction: Many effective verification... devices, but also to integrate PV-Bit, other Graf Research tools, and other commercial EDA tools into our overarching forward design trust flow philosophy

  11. Improved method for coliform verification.


    Diehl, J D


    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay.

  12. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander


    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present...


    Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft³. ...

  14. A nonparametric maximum likelihood estimator for the receiver operating characteristic curve area in the presence of verification bias. (United States)

    Zhou, X H


    The efficacy of a diagnostic test can be represented by the area under the receiver operating characteristic (ROC) curve. In estimating the ROC curve area, a common problem is verification bias resulting from selectively verifying a subset of patients initially tested. This paper proposes a simple verification bias correction procedure for estimating the ROC curve area and its corresponding variance.
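    One standard family of corrections (not necessarily the estimator proposed in this paper) weights each verified case by the inverse of its verification probability when forming the Mann-Whitney AUC estimate; a minimal sketch:

```python
def ipw_auc(scores, disease, verified, p_verify):
    """AUC (Mann-Whitney) estimate computed from verified cases only,
    with each case weighted by 1/P(verified) to correct for selective
    verification.  `disease` is meaningful only where verified[i] is True."""
    pos = [(s, 1.0 / p) for s, d, v, p
           in zip(scores, disease, verified, p_verify) if v and d]
    neg = [(s, 1.0 / p) for s, d, v, p
           in zip(scores, disease, verified, p_verify) if v and not d]
    num = den = 0.0
    for sp, wp in pos:
        for sn, wn in neg:
            w = wp * wn
            den += w
            if sp > sn:
                num += w        # correctly ordered pair
            elif sp == sn:
                num += 0.5 * w  # tie counts half
    return num / den if den else float("nan")
```

    With complete verification (all weights 1) this reduces to the usual nonparametric ROC area.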

  15. Verification of the computer code ATHLET in the framework of the external verification group ATHLET BETHSY test 5.2c - total loss of feedwater. Final report; Verifikation des ATHLET-Rechenprogramms im Rahmen der externen Verifikationsgruppe ATHLET BETHSY Test 5.2c - Totalverlust des Speisewassers. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Krepper, E.; Schaefer, F. [Forschungszentrum Rossendorf e.V. (FZR) (Germany). Inst. fuer Sicherheitsforschung


    In the framework of the external validation of the thermohydraulic code ATHLET Mod 1.1 Cycle D, which has been developed by GRS, post-test analyses of two experiments performed at the French integral test facility BETHSY were carried out. The BETHSY experiment 5.2c investigates the accident procedures in case of a total loss of feedwater at the steam generator secondary side. In such an accident, the emergency cooling of the reactor core with primary bleed and feed, the behaviour of the steam generators in case of dry-out, and the long-term behaviour of the test facility are of special interest. During the experiment the high pressure injection system, the hydroaccumulators and the low pressure injection system were available. The evaluation of the calculated results shows that all main phenomena are calculated in good agreement with the experiment. Various calculations indicate that the quality of the results depends strongly on the modelling of the heat losses of the facility, which were partly compensated by the trace heating. This trace heating was changed several times during the experiment to compensate for the changing heat losses. The exact modelling of the resulting heat losses has a strong influence on the course of the whole transient; in this test, insufficient modelling of the resulting heat losses may be the reason for deviations of the calculated transient from the observed one. The results show that the safety-relevant statements of the experiment could be reproduced by the code ATHLET. (orig.) [German abstract, translated: In the framework of the external validation of the accident code ATHLET developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit, available in version Mod 1.1 Cycle D, two experiments performed at the French test facility BETHSY were recalculated and analysed. Experiment 5.2c serves to investigate the emergency procedures in the event of a total loss of the

  16. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park


    Testing is the de facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness and is thus suitable for the verification of safety-critical systems; however, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.


    The Environmental Technology Verification report discusses the technology and performance of the Clarus C Hydrogen Peroxide Gas Generator, a biological decontamination device manufactured by BIOQUELL, Inc. The unit was tested by evaluating its ability to decontaminate seven types...

  18. Method and computer product to increase accuracy of time-based software verification for sensor networks (United States)

    Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA


    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
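    The hop-by-hop protocol described above can be sketched as a graph traversal in which each verified node vouches for its neighbors and failed nodes are routed around via alternative paths (node names and the `check` callback are illustrative, not the patent's actual interface):

```python
from collections import deque

def verify_network(graph, root, check):
    """Hop-by-hop software verification sketch: starting from the main
    verifier `root`, each verified node's neighbors are verified next,
    so every subject node is at most one hop from its verifier.  Nodes
    failing `check` are excluded; verification continues only along
    paths that avoid them, and no node is tested twice."""
    verified, failed = set(), set()
    queue = deque([root])
    while queue:
        node = queue.popleft()
        if node in verified or node in failed:
            continue
        if check(node):
            verified.add(node)
            queue.extend(n for n in graph[node]
                         if n not in verified and n not in failed)
        else:
            failed.add(node)  # downstream continues via other paths only
    return verified, failed
```

    In the ring example below, node C is still reached through D even though its direct neighbor B fails the check.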


    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental techn... design efficient processes for conducting performance tests of innovative technologies.

  20. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.


    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs.
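    Comparing the nominal margin to its uncertainty can be illustrated under a normal-distribution assumption (a sketch of the general idea; the paper's actual statistical treatment may differ):

```python
from math import erf, sqrt

def mov_reliability(nominal_margin, margin_sigma):
    """Reliability estimate as the probability that the true design
    margin of a motor operated valve is positive, assuming the margin
    uncertainty is normally distributed: Phi(margin / sigma)."""
    z = nominal_margin / margin_sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))
```

    A zero nominal margin gives 50% reliability, and a margin of several sigma gives a reliability suitable for extending the periodic verification interval.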

  1. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support


    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  2. Automatic quality verification of the TV sets (United States)

    Marijan, Dusica; Zlokolica, Vladimir; Teslic, Nikola; Pekovic, Vukota; Temerinac, Miodrag


    In this paper we propose a methodology for TV set verification, intended for detecting picture quality degradation and functional failures within a TV set. In the proposed approach we compare the TV picture captured from a TV set under investigation with the reference image for the corresponding TV set in order to assess the captured picture quality and, therefore, the acceptability of TV set quality. The methodology framework comprises a logic block for designing the verification process flow, a block for TV set quality estimation (based on image quality assessment) and a block for generating the defect tracking database. The quality assessment algorithm is a full-reference intra-frame approach which aims at detecting various TV-set-specific picture degradations arising from TV system hardware and software failures and from erroneous operational modes and settings. The proposed algorithm is a block-based scheme which incorporates the mean square error and a local variance between the reference and the tested image. The artifact detection algorithm is shown to be highly robust against brightness and contrast changes in TV sets. The algorithm is evaluated by performance comparison with other state-of-the-art image quality assessment metrics in terms of detecting TV picture degradations, such as illumination and contrast change, compression artifacts, picture misalignment, aliasing, blurring and other types of degradations that are due to defects within the TV set video chain.
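    A minimal sketch of the block-based idea, combining per-block mean square error with the local variance of the reference block (illustrative only, not the paper's exact metric or block size):

```python
def block_quality_map(ref_img, test_img, block=8):
    """For each block, compute the mean square error between the
    reference and tested images and the local variance of the reference
    block; a high MSE relative to the local variance flags a likely
    artifact.  Images are lists of equal-length rows of gray levels."""
    h, w = len(ref_img), len(ref_img[0])
    stats = []
    for by in range(0, h, block):
        for bx in range(0, w, block):
            pix = [(ref_img[y][x], test_img[y][x])
                   for y in range(by, min(by + block, h))
                   for x in range(bx, min(bx + block, w))]
            n = len(pix)
            mse = sum((r - t) ** 2 for r, t in pix) / n
            mean_r = sum(r for r, _ in pix) / n
            var_r = sum((r - mean_r) ** 2 for r, _ in pix) / n
            stats.append((by, bx, mse, var_r))
    return stats
```

    Normalizing the MSE by local variance is one simple way to gain robustness against global brightness and contrast changes.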

  3. Leak detection/verification

    Energy Technology Data Exchange (ETDEWEB)

    Krhounek, V.; Zdarek, J.; Pecinka, L. [Nuclear Research Institute, Rez (Czech Republic)


    Loss of coolant accident (LOCA) experiments performed as part of a Leak Before Break (LBB) analysis are very briefly summarized. The aim of these experiments was to postulate the leak rates of the coolant. Through-wall cracks were introduced into pipes by fatigue cycling and hydraulically loaded in a test device. Measurements included coolant pressure and temperature, quantity of leaked coolant, displacement of a specimen, and acoustic emission. Small cracks were plugged with particles in the coolant during testing. It is believed that plugging will have no effect in cracks with leak rates above 35 liters per minute. The leak rate safety margin of 10 is sufficient for cracks in which the leak rate is more than 5 liters per minute.

  4. Achievement report on developing superconductor power applied technologies in fiscal 1999 (2). Research and development of superconductor wire materials, research and development of superconductor power generators, research of total systems, research and development of freezing systems, and verification tests; 1999 nendo chodendo denryoku oyo gijutsu kaihatsu seika hokokusho. 2. Chodendo senzai no kenkyu kaihatsu / chodendo hatsudenki no kenkyu kaihatsu / total system no kenkyu / reito system no kenkyu kaihatsu / jissho shiken

    Energy Technology Data Exchange (ETDEWEB)



    With the objective of achieving higher efficiency, higher density, and higher stability in power systems, research and development has been performed on superconductor power generators. This paper summarizes the achievements thereof in fiscal 1999. A verification test was given on the rotor of an ultra high speed responding generator. In a sudden short circuit test using the different phase charging method, no anomalies such as quench generation or vibration changes were found, whereby the healthiness of the generator was verified. In the VVVF actuation test, knowledge was acquired on the actuation method to be used when the ultra high speed responding generator is applied to a combined cycle plant. After the verification test was completed, disassembly inspections such as visual checks and non-destructive tests were performed. With regard to the vacuum leakage found in the rotor under very low temperatures, the causes were presumed and countermeasures were discussed by observing the weld structures. In the design research, the conceptual design of the 200-MW pilot generator was reviewed by reflecting the results of the verification tests on the model generator. At the same time, a trial design was made for a 600-MW target generator. Finally, the achievements and evaluations of the technological issues allotted to each research member were summarized. (NEDO)


    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ


    The article presents the results of verification of the dynamic model of a drive system with a gear. Tests were carried out on the real object in different operating conditions, and simulation studies were carried out for the same assumed conditions. Comparison of the results obtained from those two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research by simulations using the dynamic model.

  6. Land surface Verification Toolkit (LVT) (United States)

    Kumar, Sujay V.


    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.
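    The kind of model-versus-observation statistics such a toolkit aggregates can be sketched as follows (illustrative of the general evaluation idea, not LVT's implementation):

```python
def evaluation_stats(model, obs):
    """Basic land-surface evaluation metrics: mean bias and
    root-mean-square error of model output against observations,
    computed over paired samples (e.g. soil moisture time series)."""
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = (sum((m - o) ** 2 for m, o in zip(model, obs)) / n) ** 0.5
    return bias, rmse
```

    In practice such metrics are computed per grid cell and per variable, then aggregated across the evaluation domain.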

  7. Formal verification of AI software (United States)

    Rushby, John; Whitehurst, R. Alan


    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  8. Kleene Algebra and Bytecode Verification (United States)


    Bytecode 2005 Preliminary Version. Kleene Algebra and Bytecode Verification. Lucja Kot, Dexter Kozen, Department of Computer Science, Cornell... first-order methods that inductively annotate program points with abstract values. In [6] we introduced a second-order approach based on Kleene algebra... form a left-handed Kleene algebra. The dataflow labeling is not achieved by inductively labeling the program with abstract values, but rather by

  9. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan (United States)


    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  10. Z-2 Architecture Description and Requirements Verification Results (United States)

    Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard


    The Z-2 Prototype Planetary Extravehicular Space Suit Assembly is a continuation of NASA's Z series of spacesuits. The Z-2 is another step in NASA's technology development roadmap leading to human exploration of the Martian surface. The suit was designed for maximum mobility at 8.3 psid, reduced mass, and to have high fidelity life support interfaces. As Z-2 will be man-tested at full vacuum in NASA JSC's Chamber B, it was manufactured as Class II, making it the most flight-like planetary walking suit produced to date. The Z-2 suit architecture is an evolution of previous EVA suits, namely the ISS EMU, Mark III, Rear Entry I-Suit and Z-1 spacesuits. The suit is a hybrid hard and soft multi-bearing, rear entry spacesuit. The hard upper torso (HUT) is an all-composite structure and includes a 2-bearing rolling convolute shoulder with Vernier sizing mechanism, removable suit port interface plate (SIP), elliptical hemispherical helmet and self-don/doff shoulder harness. The hatch is a hybrid aluminum and composite construction with Apollo style gas connectors, custom water pass-thru, removable hatch cage and interfaces to primary and auxiliary life support feed water bags. The suit includes Z-1 style lower arms with cam brackets for Vernier sizing and government furnished equipment (GFE) Phase VI gloves. The lower torso includes a telescopic waist sizing system, waist bearing, rolling convolute waist joint, hard brief, 2 bearing soft hip thigh, Z-1 style legs with ISS EMU style cam brackets for sizing, and conformal walking boots with ankle bearings. The Z-2 Requirements Verification Plan includes the verification of more than 200 individual requirements. The verification methods include test, analysis, inspection, demonstration or a combination of methods. Examples of unmanned requirements include suit leakage, proof pressure testing, operational life, mass, isometric man-loads, sizing adjustment ranges, internal and external interfaces such as in-suit drink bag

  11. Field verification of CO{sub 2}-foam

    Energy Technology Data Exchange (ETDEWEB)

    Martin, F.D.; Heller, J.P.; Weiss, W.W.


    In September 1989, the Petroleum Recovery Research Center (PRRC), a division of New Mexico Institute of Mining and Technology, received a grant from the US Department of Energy (DOE) for a project entitled ``Field Verification of CO{sub 2} Foam.'' The grant provided for an extension of the PRRC laboratory work to a field testing stage to be performed in collaboration with an oil producer actively conducting a CO{sub 2} flood. The objectives of this project are to: (1) conduct reservoir studies, laboratory tests, simulation runs, and field tests to evaluate the use of foam for mobility control or fluid diversion in a New Mexico CO{sub 2} flood, and (2) evaluate the concept of CO{sub 2}-foam in the field by using a reservoir where CO{sub 2} flooding is ongoing, characterizing the reservoir, modeling the process, and monitoring performance of the field test. Seven tasks were identified for the successful completion of the project: (1) evaluate and select a field site, (2) develop an initial site-specific plan, (3) conduct laboratory CO{sub 2}-foam mobility tests, (4) perform reservoir simulations, (5) design the foam slug, (6) implement a field test, and (7) evaluate results.

  12. Verification of Gyrokinetic codes: Theoretical background and applications (United States)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent


    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  13. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)


    In this paper, a prototype of a Requirements Tracking and Verification System (RTVS) for a distributed control system was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking, and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  14. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia


    Multifunction Vehicle Bus (MVB) is a critical component in the Train Communication Network (TCN), which is widely used in most modern train techniques of the transportation system. How to ensure the security of MVB has become an important issue, and traditional testing cannot ensure system correctness. The modeling and verification of the MVB system are the concern of this paper. Petri net and model checking methods are used to verify the MVB system. A Hierarchy Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of MVB. Synchronous and asynchronous methods are proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.
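    The model-checking core that makes such verification comprehensive is exhaustive state-space exploration; a minimal explicit-state sketch (the general technique only, not the paper's HCPN/M3C machinery):

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state model checking core: breadth-first exploration of
    all states reachable from `initial` via `successors`, returning the
    first state that violates `invariant`, or None if it holds
    everywhere in the reachable state space."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state  # counterexample state found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None
```

    Real protocol models replace the toy integer states below with structured states (marking of a Petri net, automata configurations), but the exhaustiveness argument is the same.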

  15. What is the Final Verification of Engineering Requirements? (United States)

    Poole, Eric


    This slide presentation reviews the development process through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All parties need to agree upon a formal requirements document, including any changes to the original. After the requirements have been developed, the engineering team begins to design the system; the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, and many verifications should be performed during the process. The verification methods used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The presentation also discusses the option of having the engineering team involved in all phases of development, as opposed to having another organization continue the process once the design is complete.
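    The assurance that every requirement carries a recognized verification method and a responsible organization can be sketched as a simple audit over a requirements matrix (requirement IDs, field names, and owners below are hypothetical):

```python
# The four verification methods named in the presentation.
VALID_METHODS = {"test", "inspection", "analysis", "demonstration"}

def audit_verification_matrix(matrix):
    """Check that every requirement entry lists at least one recognized
    verification method and names a responsible organization.
    Returns the sorted list of requirement IDs failing the audit."""
    problems = []
    for req_id, entry in matrix.items():
        methods = set(entry.get("methods", []))
        if not methods or not methods <= VALID_METHODS or not entry.get("owner"):
            problems.append(req_id)
    return sorted(problems)
```

    Running such an audit before each review cycle is one lightweight way to enforce the "every requirement is formally verified" assurance the plan calls for.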

  16. PROCEED and Crowd-sourced Formal Verification (United States)


    VA, November 7, 2011. PROCEED and Crowd-sourced Formal Verification... Crowd-sourced Formal Verification (CSFV). Approved for Public Release, Distribution Unlimited. The Problem: application-specific functions... Are there fundamental

  17. Formal Verification of Mathematical Software. Volume 2 (United States)


    RADC-TR-90-53, Vol II (of two), Final Technical Report, May 1990. AD-A223 633. FORMAL VERIFICATION OF MATHEMATICAL SOFTWARE. Odyssey... copies of this report unless contractual obligations or notices on a specific document require that it be returned. FORMAL VERIFICATION OF... 1 May 1986. Contract Expiration Date: 31 July 1989. Short Title of Work: Formal Verification of SDI Mathematical Software. Period of Work Covered: May 86

  18. 49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview? (United States)


    Title 49, Transportation, revised as of 2010-10-01: What does the MRO tell the employee at the beginning of the verification interview? Section 40.135, Office of the Secretary of Transportation, PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS, Medical Review Officers and the Verification Process § 40.135...

  19. The SeaHorn Verification Framework (United States)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.


    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.



  1. Verification of the SENTINEL-4 Focal Plane Subsystem (United States)

    Williges, C.; Hohn, R.; Rossmann, H.; Hilbert, S.; Uhlig, M.; Buchwinkler, K.; Reulke, R.


    The Sentinel-4 payload is a multi-spectral camera system which is designed to monitor atmospheric conditions over Europe. The German Aerospace Center (DLR) in Berlin, Germany conducted the verification campaign of the Focal Plane Subsystem (FPS) on behalf of Airbus Defense and Space GmbH, Ottobrunn, Germany. The FPS consists, inter alia, of two Focal Plane Assemblies (FPAs), one for the UV-VIS spectral range (305 nm … 500 nm), the second for NIR (750 nm … 775 nm). In this publication, we will present in detail the opto-mechanical laboratory set-up of the verification campaign of the Sentinel-4 Qualification Model (QM), which will also be used for the upcoming Flight Model (FM) verification. The test campaign consists mainly of radiometric tests performed with an integrating sphere as a homogeneous light source. The FPAs must be operated mainly at 215 K ± 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors as well as a reference detector homogeneously over a distance of approximately 1 m from outside the TVC. Furthermore, selected test analyses and results will be presented, showing that the Sentinel-4 FPS meets specifications.

  2. A methodology for the rigorous verification of Particle-in-Cell simulations (United States)

    Riva, Fabio; Beadle, Carrie F.; Ricci, Paolo


    A methodology to perform a rigorous verification of Particle-in-Cell (PIC) simulations is presented, both for assessing the correct implementation of the model equations (code verification) and for evaluating the numerical uncertainty affecting the simulation results (solution verification). The proposed code verification methodology is a generalization of the procedure developed for plasma simulation codes based on finite difference schemes that was described by Riva et al. [Phys. Plasmas 21, 062301 (2014)] and consists of an order-of-accuracy test using the method of manufactured solutions. The generalization of the methodology for PIC codes consists of accounting for numerical schemes intrinsically affected by statistical noise and providing a suitable measure of the distance between continuous, analytical distribution functions and finite samples of computational particles. The solution verification consists of quantifying both the statistical and discretization uncertainties. The statistical uncertainty is estimated by repeating the simulation with different pseudorandom number generator seeds. For the discretization uncertainty, the Richardson extrapolation is used to provide an approximation of the analytical solution and the grid convergence index is used as an estimate of the relative discretization uncertainty. The code verification methodology is successfully applied to a PIC code that numerically solves the one-dimensional, electrostatic, collisionless Vlasov-Poisson system. The solution verification methodology is applied to quantify the numerical uncertainty affecting the two-stream instability growth rate, which is numerically evaluated thanks to a PIC simulation.
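    The solution-verification step described above (Richardson extrapolation plus the grid convergence index) can be sketched as follows; the refinement ratio, safety factor, and the three growth-rate values are illustrative assumptions, not numbers from the paper:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on three grids
    with a constant refinement ratio r (coarse -> medium -> fine)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, fs=1.25):
    """Grid convergence index on the fine grid: an estimate of the
    relative discretization uncertainty (fs is Roache's safety factor)."""
    e21 = abs((f_medium - f_fine) / f_fine)
    return fs * e21 / (r**p - 1.0)

# Hypothetical instability growth rates from three successively refined runs:
f3, f2, f1 = 1.040, 1.010, 1.0025   # coarse, medium, fine
p = observed_order(f3, f2, f1, r=2.0)
uncertainty = gci_fine(f2, f1, r=2.0, p=p)
```

    With these made-up values the observed order comes out close to 2, and the GCI gives a relative discretization uncertainty of a few tenths of a percent on the fine-grid value.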

  3. Formal Verification at System Level (United States)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.


    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained by an ESA/ESTEC study, carried out in collaboration between INTECS and Sapienza University of Rome. The study focuses on SysML-based techniques for system-level functional requirements.

  4. Standard Practices for Verification and Calibration of Polarimeters

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 Polarimeters and polariscopes used for measuring stress in glass are described in Test Methods F218, C148, and C978. These instruments include a light source and several optical elements (polarizers, optical retarders, filters, and so forth) that require occasional cleaning, realigning, and calibration. The objective of these practices is to describe the calibration and verification procedures required to maintain these instruments in calibration and ensure that the optical setup is within specification for satisfactory measurements. 1.2 It is mandatory throughout these practices that both verification and calibration are carried out by qualified personnel who fully understand the concepts used in measurements of stress retardation and are experienced in the practices of measuring procedures described in Test Methods F218, C148, and C978. 1.3 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.

  5. Verification steps for the CMS event-builder software

    CERN Multimedia

    CERN. Geneva


    The CMS event-builder software is used to assemble event fragments into complete events at 100 kHz. The data originates at the detector front-end electronics, passes through several computers, and is transported from the underground to the high-level trigger farm on the surface. I will present the testing and verification steps a new software version has to pass before it is deployed in production. I will discuss the current practice and possible improvements.

  6. The verification basis of the PM-ALPHA code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Angelini, S. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety


    An overall verification approach for the PM-ALPHA code is presented and implemented. The approach consists of a stepwise testing procedure focused principally on the multifield aspects of the premixing phenomenon. Breakup is treated empirically, but it is shown that, through reasonable choices of the breakup parameters, consistent interpretations of existing integral premixing experiments can be obtained. The present capability is deemed adequate for bounding energetics evaluations. (author)

  7. Probabilistic Verification of Multi-Robot Missions in Uncertain Environments (United States)


    ...sensory histories when verifying a robot mission. The third contribution is experimental validation results presented to show the effectiveness of the approach. Missions are specified in the usability-tested [10] graphical programming frontend shown in Figure 1 (MissionLab/VIPARS System Architecture) and verified by VIPARS (Verification in...

  8. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    Energy Technology Data Exchange (ETDEWEB)

    Van Buren, Kendra L. [Los Alamos National Laboratory; Canfield, Jesse M. [Los Alamos National Laboratory; Hemez, Francois M. [Los Alamos National Laboratory; Sauer, Jeremy A. [Los Alamos National Laboratory


    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
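    The rate-of-convergence estimate mentioned above can be illustrated with a stand-in discretization, since HIGRAD itself is not public; here a central-difference derivative plays the role of the solver and the exact derivative that of the analytical solution:

```python
import math

def l2_error(h, f, dfdx, xs):
    """L2 norm of the central-difference error against the exact derivative."""
    errs = [(f(x + h) - f(x - h)) / (2.0 * h) - dfdx(x) for x in xs]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

f, dfdx = math.sin, math.cos
xs = [0.1 * i for i in range(1, 10)]

# Halve the spacing and measure the observed rate between consecutive levels.
hs = [0.1, 0.05, 0.025]
errors = [l2_error(h, f, dfdx, xs) for h in hs]
rates = [math.log(errors[i] / errors[i + 1]) / math.log(2.0) for i in range(2)]
# Central differences are second order, so both rates should approach 2.
```

    The same procedure applies to a full solver: compute an error norm against the analytical solution on successively refined meshes and check that the observed rate matches the scheme's formal order.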


    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova


    Full Text Available Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths, and melodic characteristics. Owing to the development of speech technology, there is now increased interest in expert speaker verification systems that offer high reliability and low labour intensity thanks to automated data processing for the expert analysis. System Description. We present a novel system that analyzes the similarity or distinction of speaker voices by comparing statistics of phone lengths, formant features, and melodic characteristics. The characteristic feature of the proposed system, based on a fusion of methods, is the weak correlation between the analyzed features, which lowers the speaker recognition error rate. An advantage of the system is the possibility of rapid analysis of recordings, since data preprocessing and decision making are automated. We describe the individual methods as well as their fusion to combine decisions. Main Results. We tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.
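    A minimal sketch of the score-fusion idea, with hypothetical scores, weights, and threshold (the abstract does not publish the actual fusion rule); the formant method is weighted highest to reflect its reported reliability:

```python
def fuse_scores(scores, weights):
    """Combine per-method similarity scores (0..1) with a weighted sum.
    Weakly correlated methods make the fused score more discriminative."""
    assert len(scores) == len(weights)
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def same_speaker(scores, weights, threshold=0.5):
    """Accept the pair of recordings if the fused score clears the threshold."""
    return fuse_scores(scores, weights) >= threshold

# Hypothetical scores from the formant, phone-length, and melodic analyses:
scores = {"formant": 0.82, "phone_length": 0.55, "melodic": 0.61}
weights = {"formant": 0.5, "phone_length": 0.25, "melodic": 0.25}
decision = same_speaker(list(scores.values()), list(weights.values()))
```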

  10. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  11. 9 CFR 417.8 - Agency verification. (United States)


    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan; (b...

  12. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  13. 78 FR 58492 - Generator Verification Reliability Standards (United States)


    ... Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for Generator... (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser... Registry, NERC has registered 901 generator owners within the United States. Currently, synchronous...

  14. A correlation-based fingerprint verification system

    NARCIS (Netherlands)

    Bazen, A.M.; Gerez, Sabih H.; Veelenturf, L.P.J.; van der Zwaag, B.J.; Verwaaijen, G.T.B.

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates

  15. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi


    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  16. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.


    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  17. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus


    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey...

  18. Oral rivaroxaban versus enoxaparin with vitamin K antagonist for the treatment of symptomatic venous thromboembolism in patients with cancer (EINSTEIN-DVT and EINSTEIN-PE): a pooled subgroup analysis of two randomised controlled trials. (United States)

    Prins, Martin H; Lensing, Anthonie W A; Brighton, Tim A; Lyons, Roger M; Rehm, Jeffrey; Trajanovic, Mila; Davidson, Bruce L; Beyer-Westendorf, Jan; Pap, Ákos F; Berkowitz, Scott D; Cohen, Alexander T; Kovacs, Michael J; Wells, Philip S; Prandoni, Paolo


    Patients with venous thromboembolism and cancer have a substantial risk of recurrent venous thromboembolism and bleeding during anticoagulant therapy. Although monotherapy with low-molecular-weight heparin is recommended in these patients, in clinical practice many patients with venous thromboembolism and cancer do not receive this treatment. We aimed to assess the efficacy and safety of a single-drug regimen with oral rivaroxaban compared with enoxaparin followed by vitamin K antagonists, in the subgroup of patients with cancer enrolled in the EINSTEIN-DVT and EINSTEIN-PE randomised controlled trials. We did a subgroup analysis of patients with active cancer (either at baseline or diagnosed during the study), a history of cancer, or no cancer who were enrolled in the EINSTEIN-DVT and EINSTEIN-PE trials. Eligible patients with deep-vein thrombosis (EINSTEIN-DVT) or pulmonary embolism (EINSTEIN-PE) were randomly assigned in a 1:1 ratio to receive rivaroxaban (15 mg twice daily for 21 days, followed by 20 mg once daily) or standard therapy (enoxaparin 1·0 mg/kg twice daily and warfarin or acenocoumarol; international normalised ratio 2·0-3·0). Randomisation with a computerised voice-response system was stratified according to country and intended treatment duration (3, 6, or 12 months). The prespecified primary efficacy and safety outcomes of both the trials and this subanalysis were symptomatic recurrent venous thromboembolism and clinically relevant bleeding, respectively. We did efficacy and mortality analyses in the intention-to-treat population, and bleeding analyses for time spent receiving treatment plus 2 days in the safety population (all patients who received at least one dose of study drug). The EINSTEIN-DVT and EINSTEIN-PE studies are registered with, numbers NCT00440193 and NCT00439777. In patients with active cancer (diagnosed at baseline or during treatment), recurrent venous thromboembolism occurred in 16 (5%) of 354 patients

  19. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng


    Full Text Available With the development of biometric verification, we proposed a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes only a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application and overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and our circuit system; they confirm that the proposed scheme provides excellent accuracy and low complexity. Moreover, we also proposed a multiple-state solution to handle heart-rate changes caused by sport. It should be the first to address the issue of sport in ECG verification.
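    A toy illustration of a mean-interval style comparison, assuming R-peak timestamps as input; the tolerance, timestamps, and the any-state acceptance rule are invented for the example, not taken from the paper:

```python
def mean_interval(r_peaks_ms):
    """Mean R-R interval from a list of R-peak timestamps in milliseconds."""
    intervals = [b - a for a, b in zip(r_peaks_ms, r_peaks_ms[1:])]
    return sum(intervals) / len(intervals)

def verify(template_peaks, probe_peaks, tolerance_ms=40.0):
    """Accept the probe if its mean interval is close to the template's.
    A multiple-state template (e.g. rest and exercise) would store several
    means and accept if any state matches, handling sport-induced changes."""
    return abs(mean_interval(template_peaks) - mean_interval(probe_peaks)) <= tolerance_ms

template = [0, 820, 1650, 2470, 3290]    # resting enrollment, roughly 73 bpm
probe_ok = [0, 800, 1610, 2400, 3200]    # same user at rest: accepted
probe_fast = [0, 520, 1050, 1560, 2080]  # after exercise: rejected by the rest state
```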

  20. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Budzien, Joanne Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferguson, Jim Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Harwell, Megan Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hickmann, Kyle Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Israel, Daniel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Magrogan, William Richard III [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Singleton, Jr., Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Srinivasan, Gowri [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walter, Jr, John William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Woods, Charles Nathan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  1. Achievement report on developing superconductor power applied technologies in fiscal 1999 (1). Research and development of superconductor wire materials, research and development of superconductor power generators, research of total systems, research and development of freezing systems, and verification tests; 1999 nendo chodendo denryoku oyo gijutsu kaihatsu seika hokokusho. 1. Chodendo senzai no kenkyu kaihatsu / chodendo hatsudenki no kenkyu kaihatsu / total system no kenkyu / reito system no kenkyu kaihatsu / jissho shiken

    Energy Technology Data Exchange (ETDEWEB)



    With the objective of achieving higher efficiency, higher density, and higher stability in power systems, research and development has been performed on superconductor power applied technologies. This paper summarizes the achievements in fiscal 1999. In research and development of superconductor wire materials, loss reduction and capacity increase were pursued for the Nb{sub 3}Sn wire material, and its mechanical properties and stability were evaluated. In research and development of superconductor generators, an ultra-high-speed-response generator was verified as sound in a sudden short-circuit test. A linkage test with an operating 77-kV system was performed, verifying that the superconductor generator can be operated stably against various disturbances. In research and development of the freezing systems, an improved system was built that achieved 11,390 hours of operation as a single system, owing to the high reliability of its oil-free structure. In the verification tests, the ultra-high-speed-response model generator was connected to the freezing system for tests such as the load test, onerous test, actuation test using the M-G system, and 77-kV system linkage test. The functions, reliability, and durability of the system were verified, and various data were acquired. (NEDO)

  2. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel? (United States)

    Schaun, Gustavo Z


    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. Several criteria have been proposed to validate the test. In this context, the plateau in oxygen uptake (V̇O2) occurs inconsistently, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, the secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and are often achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, in this review the knowledge on the verification phase is updated, and suggestions on how it can be performed (e.g. intensity, duration, recovery) are provided according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
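    The confirmation logic behind a verification phase can be sketched as a simple check; the 3% tolerance and the example values below are hypothetical illustrations, not figures from this review:

```python
def vo2max_confirmed(incremental_peak, verification_peak, tolerance=0.03):
    """Treat the incremental-test value as a 'true' VO2max when the
    verification-phase peak does not exceed it by more than a tolerance
    (the 3% figure here is illustrative, not a value from the review)."""
    return verification_peak <= incremental_peak * (1.0 + tolerance)

# Hypothetical peaks in ml/kg/min:
confirmed = vo2max_confirmed(52.0, 51.3)  # verification close to incremental
rejected = vo2max_confirmed(52.0, 55.0)   # verification clearly exceeds it
```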

  3. Survey of Verification and Validation Techniques for Small Satellite Software Development (United States)

    Jacklin, Stephen A.


    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
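    The run-time monitoring idea mentioned last can be sketched minimally, assuming telemetry arrives as (time, value) samples; the safe range, the samples, and the recovery hook are invented for the example:

```python
def monitor(samples, low, high, on_violation):
    """Minimal run-time monitor: check each telemetry sample against a
    safe range and invoke a recovery handler on the first violation."""
    for t, value in samples:
        if not (low <= value <= high):
            on_violation(t, value)
            return False
    return True

violations = []
telemetry = [(0, 3.3), (1, 3.2), (2, 5.1), (3, 3.3)]  # hypothetical bus voltage
ok = monitor(telemetry, low=3.0, high=3.6,
             on_violation=lambda t, v: violations.append((t, v)))
```

    In a fault-tolerant design, the handler would switch to a redundant component or a safe mode rather than just record the event.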

  4. 40 CFR 1065.303 - Summary of required calibration and verifications (United States)


    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications § 1065.303 Summary of... installation, within 370 days before testing and after major maintenance. Electrical power: Upon initial... the vehicle, prior to the start of the field test, and after maintenance such as pre-filter changes...

  5. Verification of BOUT++ by the method of manufactured solutions

    DEFF Research Database (Denmark)

    Dudson, B. D.; Madsen, Jens; Omotani, J.


    A verification exercise has been performed as part of a EUROfusion Enabling Research project, to rigorously test the correctness of the algorithms implemented in BOUT++, by testing order-of-accuracy convergence rates using the Method of Manufactured Solutions (MMS). We present tests of individual components...
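    The MMS procedure can be illustrated on a solver small enough to fit here; a 1-D Poisson problem stands in for BOUT++, with the manufactured solution u(x) = sin(pi x) and the source term derived from it analytically:

```python
import math

def max_error_mms(n):
    """Solve -u'' = S on (0,1) with u(0)=u(1)=0 and the manufactured source
    S(x) = pi^2 sin(pi x), whose exact solution is u(x) = sin(pi x).
    Returns the max-norm error of the second-order finite-difference solution."""
    h = 1.0 / n
    xs = [i * h for i in range(1, n)]
    # Tridiagonal system (2, -1)/h^2 solved with the Thomas algorithm.
    a = [-1.0] * (n - 1)  # sub-diagonal
    b = [2.0] * (n - 1)   # diagonal
    c = [-1.0] * (n - 1)  # super-diagonal
    d = [h * h * math.pi**2 * math.sin(math.pi * x) for x in xs]
    for i in range(1, n - 1):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(ui - math.sin(math.pi * x)) for ui, x in zip(u, xs))

e1, e2 = max_error_mms(32), max_error_mms(64)
order = math.log(e1 / e2, 2)  # should approach 2 for this second-order scheme
```

    Halving the mesh spacing should reduce the error by a factor of four, confirming the scheme's formal order of accuracy, which is exactly the convergence test MMS enables when no exact solution is otherwise known.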

  6. Retinal Verification Using a Feature Points-Based Biometric Pattern

    Directory of Open Access Journals (Sweden)

    M. Ortega


    Full Text Available Biometrics refers to identity verification of individuals based on physiological or behavioural characteristics. The typical authentication process consists of extracting a biometric pattern of the person and matching it against the stored pattern of the authorised user, obtaining a similarity value between patterns. In this work an efficient method for person authentication is presented. The biometric pattern of the system is a set of feature points representing landmarks in the retinal vessel tree. Pattern extraction and matching are described. A detailed analysis of similarity metric performance for the biometric system is also presented. A database with retina images sampled from users at different moments in time is used, simulating a demanding and realistic verification environment. Even in this scenario, the system establishes a wide confidence band for the metric threshold in which no errors are obtained on the training and test sets.
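    A toy sketch of feature-point matching with a simple similarity metric; the greedy matcher, the pixel tolerance, and the landmark coordinates are illustrative assumptions, not the paper's algorithm:

```python
def match_points(template, probe, tol=5.0):
    """Greedy nearest-neighbour matching of two landmark sets (assumed
    already aligned); a pair matches if within `tol` pixels. Returns the
    fraction of matched points as a similarity score in [0, 1]."""
    unmatched = list(probe)
    matched = 0
    for (x, y) in template:
        best = None
        for q in unmatched:
            d = ((x - q[0]) ** 2 + (y - q[1]) ** 2) ** 0.5
            if d <= tol and (best is None or d < best[0]):
                best = (d, q)
        if best is not None:
            matched += 1
            unmatched.remove(best[1])
    return matched / max(len(template), len(probe))

# Hypothetical vessel-tree landmarks (pixel coordinates):
template = [(10, 10), (40, 25), (70, 60), (90, 15)]
same_user = [(11, 9), (41, 26), (69, 61), (88, 16)]    # small acquisition jitter
other_user = [(5, 50), (60, 80), (20, 70), (95, 95)]   # unrelated landmarks
score_same = match_points(template, same_user)
score_other = match_points(template, other_user)
```

    Setting the accept/reject threshold between the two score populations is exactly the confidence-band analysis the paper performs over its similarity metrics.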

  7. Constructive Verification, Empirical Induction, and Falibilist Deduction: A Threefold Contrast

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern


    Full Text Available This article explores some open questions related to the problem of verification of theories in the context of empirical sciences by contrasting three epistemological frameworks. Each of these epistemological frameworks is based on a corresponding central metaphor, namely: (a Neo-empiricism and the gambling metaphor; (b Popperian falsificationism and the scientific tribunal metaphor; (c Cognitive constructivism and the object as eigen-solution metaphor. Each of these epistemological frameworks has also historically co-evolved with a certain statistical theory and method for testing scientific hypotheses, respectively: (a Decision-theoretic Bayesian statistics and Bayes factors; (b Frequentist statistics and p-values; (c Constructive Bayesian statistics and e-values. This article examines with special care the Zero Probability Paradox (ZPP, related to the verification of sharp or precise hypotheses. Finally, this article makes some remarks on Lakatos’ view of mathematics as a quasi-empirical science.

  8. Formal Verification of UML Profil

    DEFF Research Database (Denmark)

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar


    The Unified Modeling Language (UML) is based on the Model Driven Development (MDD) approach, which captures system functionality using a platform-independent model (PIM) and appropriate domain-specific languages. In UML-based system notations, the structural view is modelled by class, component and object diagrams, and the behavioural view by activity, use case, state, and sequence diagrams. However, UML does not provide a formal syntax, so its semantics is not formally definable; to assure correctness, we need to incorporate semantic reasoning through verification, specification, and refinement, and to integrate these into the development process. Our research motivation is to make the structural view easy to work with and to suggest a formal technique or method best applied to UML-based system development. We investigate the tools and methods broadly used for the formal...

  9. Task-specific style verification (United States)

    Pataki, Norbert; Cséri, Tamás; Szügyi, Zalán


    Programming antipatterns are commonly used patterns that make code unnecessarily complex and unmaintainable. Nevertheless, beginner programmers, such as students, often use them, and their usage should be eliminated from source code. Many antipatterns can be detected at compilation time with an appropriate parser tool. In this paper we argue for a new lint-like tool that detects typical programming antipatterns and is extensible to task-specific verifications. This tool is mainly developed to evaluate students' programs, but it can be used in industrial projects as well. Our approach is based on pattern matching on the abstract syntax tree provided by the Clang parser. We present our description language that specifies the antipatterns.
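
    The AST-based matching described above can be illustrated with a short sketch. The paper's tool works on Clang's C++ AST with a dedicated description language; the version below is a hypothetical analogue using Python's `ast` module to flag one classic antipattern (explicit comparison against a boolean literal):

```python
import ast

# Hypothetical sketch (not the paper's Clang-based tool): detect a classic
# antipattern -- explicit comparison against True/False -- by pattern
# matching on the abstract syntax tree, analogous to matching on Clang's AST.
class BoolCompareFinder(ast.NodeVisitor):
    def __init__(self):
        self.findings = []

    def visit_Compare(self, node):
        # Flag expressions such as `x == True` or `flag != False`.
        for comparator in node.comparators:
            if isinstance(comparator, ast.Constant) and isinstance(comparator.value, bool):
                self.findings.append(node.lineno)
        self.generic_visit(node)

def find_bool_compares(source):
    finder = BoolCompareFinder()
    finder.visit(ast.parse(source))
    return finder.findings  # line numbers where the antipattern occurs

student_code = "if done == True:\n    print('ok')\n"
print(find_bool_compares(student_code))  # -> [1]
```

    A task-specific check is added by writing another visitor method for the relevant node type, which is the same extension mechanism a matcher-based tool exposes through its description language.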

  10. Formal verification of algorithms for critical systems (United States)

    Rushby, John M.; Von Henke, Friedrich


    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  11. Report on results for fiscal 1997 on development of superconducting electric power application technology. Pt. 2. R and D of superconducting wire, R and D of superconducting generator, studies on total system, R and D of refrigeration system, and verification test; 1997 nendo chodendo denryoku oyo gijutsu kaihatsu seika hokokusho. Chodendo senzai no kenkyu kaihatsu, chodendo hatsudenki no kenkyu kaihatsu, total system no kenkyu, reito system no kenkyu kaihatsu, jissho shiken

    Energy Technology Data Exchange (ETDEWEB)



    Continuing from Part 1, this report covers the in-situ verification test of the slow-response type model machine rotor and the review of the conceptual design of the pilot machine. On the basis of the R and D results for various element technologies and partial models obtained before the previous fiscal year, the design, manufacturing and factory test had been conducted for a 70,000kW class slow-response type model machine rotor. This year, an in-situ verification test was performed, completing all planned test items. Using the technological results obtained in the design, manufacturing and test of the 70,000kW class model machine, the conceptual design of the 200,000kW class pilot machine is being reviewed. In the functional design, an accurate grasp of the thermal load is essential for attaining a large capacity in a superconducting generator; to this end, a thermal load analytical method was to be established for a torque tube heat exchanger. The reasonableness of the analysis was verified through comparison with the factory test result of the 70,000kW class slow-response type rotor, showing good agreement between calculation and measurement and explaining the dependency of the thermal load on rotational speed. (NEDO)

  12. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314 Tank Farm Restoration and Safe Operations

    Energy Technology Data Exchange (ETDEWEB)

    MCGREW, D.L.


    This Requirements Verification Report (RVR) for Project W-314 ''AN Farm to 200E Waste Transfer System'' package provides documented verification of design compliance to all the applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate.

  13. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.


    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
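
    The technology-integration idea can be illustrated with a toy calculation. Assuming, purely for illustration (this is not IVSEM's documented algorithm), that the four technologies detect an event independently, the integrated probability of detection combines the per-technology probabilities as follows:

```python
# Hypothetical sketch of integrating per-technology detection probabilities
# into a system-level probability of detection, assuming the four CTBT
# monitoring technologies detect independently. The numbers are invented.
def integrated_detection_probability(p_by_technology):
    miss = 1.0
    for p in p_by_technology.values():
        miss *= (1.0 - p)  # the event is missed only if every technology misses it
    return 1.0 - miss

probabilities = {
    "seismic": 0.80,
    "infrasound": 0.40,
    "radionuclide": 0.55,
    "hydroacoustic": 0.10,
}
print(round(integrated_detection_probability(probabilities), 4))  # -> 0.9514
```

    The integrated probability always exceeds the best single technology, which is the synergy effect the model lets the user explore.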

  14. Privacy Preserving Iris Based Biometric Identity Verification

    Directory of Open Access Journals (Sweden)

    Przemyslaw Strzelczyk


    Full Text Available Iris biometrics is considered one of the most accurate and robust methods of identity verification. Individually unique iris features can be presented in a compact binary form easily compared with a reference template to confirm identity. However, when templates or features are disclosed, iris biometrics is no longer suitable for verification. Therefore, there is a need to perform iris feature matching without revealing the features themselves or the reference template. The paper proposes an extension of the standard iris-based verification protocol that introduces a feature and template locking mechanism, which guarantees that no sensitive information is exposed. (Article in English)
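
    The paper's specific locking protocol is not reproduced in the record above. As a rough illustration of the general idea of template locking, here is a sketch in the spirit of a fuzzy commitment scheme (Juels and Wattenberg), using a simple repetition code in place of a real error-correcting code; all names and parameters are illustrative:

```python
import hashlib
import secrets

REP = 5  # each key bit is repeated 5 times, tolerating 2 bit errors per group

def lock_template(template_bits, key_bits):
    # XOR the biometric template with an error-correcting codeword of a
    # random key; store only the locked value and a hash of the key.
    codeword = [b for b in key_bits for _ in range(REP)]
    locked = [c ^ t for c, t in zip(codeword, template_bits)]
    digest = hashlib.sha256(bytes(key_bits)).hexdigest()
    return locked, digest  # neither value alone reveals the template or the key

def unlock_and_verify(locked, digest, probe_bits):
    # A fresh reading close to the enrolled template yields a noisy codeword
    # that still decodes (here, by majority vote per group) to the same key.
    noisy = [l ^ p for l, p in zip(locked, probe_bits)]
    decoded = [int(sum(noisy[i:i + REP]) > REP // 2) for i in range(0, len(noisy), REP)]
    return hashlib.sha256(bytes(decoded)).hexdigest() == digest

key = [secrets.randbelow(2) for _ in range(16)]
template = [secrets.randbelow(2) for _ in range(16 * REP)]
locked, digest = lock_template(template, key)
probe = list(template)
probe[0] ^= 1  # a slightly noisy fresh iris reading still verifies
print(unlock_and_verify(locked, digest, probe))  # -> True
```

    Real schemes use proper error-correcting codes matched to iris-code error rates; the repetition code here only shows the mechanism.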

  15. Verification of turbine and governor model of hydro generator unit for the purpose of load frequency control system simulation

    Directory of Open Access Journals (Sweden)

    Džepčeski Dane


    Full Text Available In this paper, verification of power plant component models is performed by evaluating the deviations between simulation results and test results of the component under study. The deviations of the simulation results from the test results, as well as the acceptability of their values for a particular model and purpose, are determined in the process of verification. The paper presents the test results, the model, and the verification procedure. The basis for the model synthesis and verification is the set of test results derived from measurements conducted to determine the operational performance of hydro unit R2 in PSP 'Bajina Bašta' connected to the load frequency control system.

  16. Report on results for fiscal 1997 on development of superconducting electric power application technology. Pt. 1. R and D of superconducting wire, R and D of superconducting generator, studies on total system, R and D of refrigeration system, and verification test; 1997 nendo chodendo denryoku oyo gijutsu kaihatsu seika hokokusho. 1. Chodendo senzai no kenkyu kaihatsu, chodendo hatsudenki no kenkyu kaihatsu, total system no kenkyu, reito system no kenkyu kaihatsu, jissho shiken

    Energy Technology Data Exchange (ETDEWEB)



    This report explains the outline as Part 1. In fiscal 1997, the 10th year of the project, the multi-cylindrical rotary model for which an in-situ verification test was finished was brought back to the plant and dismantled for examination, while the in-situ verification test of a slow-response type model machine rotor was conducted in combination with a refrigeration system. In addition, in the research of AC wire materials and oxide-based materials, studies were made with the purpose of achieving higher characteristics and longer wires. In metallic materials, a 10 kA NbTi conductor was developed, while in oxide-based materials, research was done on performance improvement and wire fabrication based on various synthesizing methods. The manufacturing, factory test and in-situ test were conducted for a 70,000kW model machine with the purpose of R and D of a 200,000kW class pilot machine. Examination was made of the test method of the 70,000kW class model machine, the operation technology of a superconducting generation system, and the effect of introducing the superconducting generator into a power system. For the conventional refrigeration system, a single unit test was carried out for the liquefaction, liquid storage capacity, etc., of the system. The 70,000kW class model machine was put through a test confirming general operation including the refrigeration system. (NEDO)

  17. Thermal Analyses and Verification for HAUSAT-2 Small Satellite

    Directory of Open Access Journals (Sweden)

    Mi-Hyeon Lee


    Full Text Available HAUSAT-2 is a nanosatellite with a mass of 25 kg being developed by the Space System Research Lab. at Hankuk Aviation University. This paper addresses HAUSAT-2 thermal analyses and their verification at the satellite system, electronic box, and PCB levels. The thermal model used for system-level and box-level thermal analyses was verified and corrected through thermal vacuum/balance testing. A new board-level thermal analysis methodology, modelling high-power-dissipating EEE parts directly, is proposed. The proposed methodology has been verified against test results.

  18. The verification basis of the ESPROSE.m code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Freeman, K.; Chen, X. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety


    An overall verification approach for the ESPROSE.m code is presented and implemented. The approach consists of a stepwise testing procedure from wave dynamics aspects to explosion coupling at the local level, and culminates with the consideration of propagating explosive events. Each step in turn consists of an array of analytical and experimental tests. The results indicate that, given the premixture composition, the prediction of energetics of large scale explosions in multidimensional geometries is within reach. The main need identified is for constitutive laws for microinteractions with reactor materials; however, reasonably conservative assessments are presently possible. (author)

  19. Standard Verification System Lite (SVS Lite) (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  20. Tackling Verification and Validation for Prognostics (United States)

    National Aeronautics and Space Administration — Verification and validation (V&V) has been identified as a critical phase in fielding systems with Integrated Systems Health Management (ISHM) solutions to...


    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  2. Language dependence in multilingual speaker verification

    CSIR Research Space (South Africa)

    Kleynhans, NT


    Full Text Available An investigation into the performance of current speaker verification technology within a multilingual context is presented. Using the Oregon Graduate Institute (OGI) Multi-Language Telephone Speech Corpus (MLTS) database, the authors found...

  3. Data Exchanges and Verifications Online (DEVO) (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  4. Polarimetric and Interferometric SAR Calibration Verification Methods (United States)

    Kim, Y.; van Zyl, J.


    It is necessary to calibrate SAR data in order to use the data for science applications. When both polarimetric and interferometric data are collected simultaneously, these SAR data can be used for cross-calibration and verification.

  5. Procedure Verification and Validation Toolset Project (United States)

    National Aeronautics and Space Administration — The proposed research is aimed at investigating a procedure verification and validation toolset, which will allow the engineers who are responsible for developing...

  6. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.


    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  7. Guidance and Control Software Project Data - Volume 3: Verification Documents (United States)

    Hayhurst, Kelly J. (Editor)


    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  8. Model Checking, Abstraction, and Compositional Verification (United States)


    a mathematical model of the design is proved to satisfy a precise specification. Model checking is one formal verification technique. It consists of...involving the sequencing of events in time. One of the main drawbacks of model checking is the state explosion problem. This problem occurs in systems...considers two methods for avoiding the state explosion problem in the context of model checking: compositional verification and abstraction

  9. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay


    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  10. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev


    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.
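
    The kind of cross-check described, comparing one thermal solution against an independent one, can be sketched on a textbook case. The example below (not FRAPCON or ABAQUS themselves) verifies a finite-difference solution of steady 1-D conduction with uniform heat generation against the exact analytic solution:

```python
# Illustrative code-verification sketch: solve steady 1-D conduction
# -k T'' = q on [0, L] with T(0) = T(L) = 0 by finite differences and
# compare the midpoint temperature with the analytic result q*L^2/(8k).
def fd_temperature_midpoint(q, k, L, n=49):
    h = L / (n + 1)
    d = q * h * h / k
    # Thomas algorithm for the tridiagonal system (-1, 2, -1) T = d
    c = [0.0] * n  # modified super-diagonal coefficients
    g = [0.0] * n  # modified right-hand side
    c[0], g[0] = -0.5, d / 2.0
    for i in range(1, n):
        denom = 2.0 + c[i - 1]
        c[i] = -1.0 / denom
        g[i] = (d + g[i - 1]) / denom
    T = [0.0] * n
    T[-1] = g[-1]
    for i in range(n - 2, -1, -1):
        T[i] = g[i] - c[i] * T[i + 1]
    return T[n // 2]  # node at x = L/2 when n is odd

def analytic_midpoint(q, k, L):
    return q * L * L / (8.0 * k)

print(abs(fd_temperature_midpoint(8.0, 1.0, 1.0) - analytic_midpoint(8.0, 1.0, 1.0)) < 1e-9)  # -> True
```

    For this quadratic solution the central-difference scheme is exact, so agreement to round-off is the expected verification outcome.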

  11. Scalable hardware verification with symbolic simulation

    CERN Document Server

    Bertacco, Valeria


    An innovative presentation of the theory of disjoint support decomposition, presenting novel results and algorithms, plus original and up-to-date techniques in formal verification. Provides an overview of current verification techniques, and unveils the inner workings of symbolic simulation. Focuses on new techniques that narrow the performance gap between the complexity of digital systems and the limited ability to verify them. Addresses key topics in need of future research.

  12. CGMF & FREYA Verification in MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    At the present time, the new and updated fission event generators included in MCNP6.2 have been verified to be functioning properly through a variety of detailed tests. This work describes the detailed verification steps taken to ensure these complicated fission event generators, FREYA and CGMF, are integrated into MCNP6 properly. Ultimately, with the knowledge that MCNP6 is making use of these models appropriately, we can now begin to validate the models against benchmarked experiments. Some benchmarks, including criticality and subcritical experiments interested in multiplication and bulk counting rates, are easy to model and understand but are likely insensitive to the detailed nature of these models. It will take some new measurements with coincidence detection capabilities to be able to stress the physics within each of these fission event generator models. Once the models are validated and it is understood where the models can truly be predictive, then we can study what SNM observables can be characterized for nonproliferation applications.

  13. Online Signature Verification Using Fourier Descriptors

    Directory of Open Access Journals (Sweden)

    Berrin Yanikoglu


    Full Text Available We present a novel online signature verification system based on the Fast Fourier Transform. The advantage of using the Fourier domain is the ability to compactly represent an online signature using a fixed number of coefficients. The fixed-length representation leads to fast matching algorithms and is essential in certain applications. The challenge on the other hand is to find the right preprocessing steps and matching algorithm for this representation. We report on the effectiveness of the proposed method, along with the effects of individual preprocessing and normalization steps, based on comprehensive tests over two public signature databases. We also propose to use the pen-up duration information in identifying forgeries. The best results obtained on the SUSIG-Visual subcorpus and the MCYT-100 database are 6.2% and 12.1% error rate on skilled forgeries, respectively. The fusion of the proposed system with our state-of-the-art Dynamic Time Warping (DTW system lowers the error rate of the DTW system by up to about 25%. While the current error rates are higher than state-of-the-art results for these databases, as an approach using global features, the system possesses many advantages. Considering also the suggested improvements, the FFT system shows promise both as a stand-alone system and especially in combination with approaches that are based on local features.
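
    The core idea of a fixed-length Fourier representation can be sketched as follows (an illustrative toy, not the authors' system): each (x, y) pen sample is encoded as a complex number, a discrete Fourier transform is taken, and a fixed number of coefficient magnitudes forms the descriptor, so signatures of different lengths become directly comparable vectors:

```python
import cmath

# Illustrative toy: a fixed-length Fourier descriptor for an online
# signature given as a sequence of (x, y) pen samples.
def fourier_descriptor(points, n_coeffs=8):
    z = [complex(x, y) for x, y in points]  # encode each sample as x + iy
    n = len(z)
    coeffs = []
    for k in range(n_coeffs):
        c = sum(z[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        # Keep only the magnitude, normalized by signature length, so the
        # descriptor has a fixed size regardless of how long the signature is.
        coeffs.append(abs(c) / n)
    return coeffs

def descriptor_distance(a, b):
    # Euclidean distance between two fixed-length descriptors.
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

reference = [(t, (3 * t) % 7) for t in range(32)]  # invented pen trajectory
probe = [(t, (3 * t) % 7) for t in range(32)]
print(descriptor_distance(fourier_descriptor(reference),
                          fourier_descriptor(probe)) < 1e-9)  # -> True
```

    A full system adds the preprocessing and normalization steps the abstract emphasizes, and a threshold on the distance decides acceptance.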

  14. Lidar profilers in the context of wind energy–a verification procedure for traceable measurements

    DEFF Research Database (Denmark)

    Gottschall, Julia; Courtney, Michael; Wagner, Rozenn


    a repeatable test. Second, a linear regression is applied to the data for each height. The third step is a bin-average analysis of the lidar error, i.e. the difference between the lidar and reference measurements, forming the basis for the ensuing uncertainty estimation. The results of the verification test...
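
    The regression and bin-average steps can be sketched as follows (illustrative only, with invented numbers; the actual procedure prescribes specific reference instruments, heights, and bin definitions):

```python
# Sketch of two verification steps for a lidar profiler against a mast
# reference at one height: (1) a least-squares linear regression of lidar
# vs. reference wind speed, (2) a bin-average of the lidar error per bin.
def linear_regression(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def bin_averaged_error(reference, lidar, bin_width=1.0):
    bins = {}
    for r, l in zip(reference, lidar):
        bins.setdefault(int(r // bin_width), []).append(l - r)  # lidar error
    return {k * bin_width: sum(v) / len(v) for k, v in sorted(bins.items())}

reference = [4.2, 4.8, 5.1, 5.9, 6.4, 6.8]  # cup-anemometer speeds (m/s), invented
lidar = [4.3, 4.7, 5.2, 6.0, 6.3, 6.9]      # lidar speeds at the same height, invented
slope, intercept = linear_regression(reference, lidar)
errors = bin_averaged_error(reference, lidar)
```

    The bin-averaged errors then feed the uncertainty estimation that the procedure attaches to subsequent lidar measurements.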

  15. Achievement report for fiscal 1996 on the research and development of superconductor technology to power generation. Pt. 1. Research and development of superconducting wire, generator, total system, and refrigeration system; and verification test; 1996 nendo chodendo denryoku oyo gijutsu kaihatsu seika hokokusho. 1. Chodendo senzai no kenkyu kaihatsu, chodendo hatsudenki no kenkyu kaihatsu, total system no kenkyu, reito system no kenkyu kaihatsu, jissho shiken

    Energy Technology Data Exchange (ETDEWEB)



    In the research and development of superconducting wires, studies are conducted to increase the current capacity of NbTi and Nb{sub 3}Sn metal wires and to improve their properties, and to increase the current capacity of oxide materials and improve their performance making full use of the features of each manufacturing method. In the development of superconducting generators, a slow excitation response type is tested for verification, and a good result is attained; and a quick excitation response type is tested for field winding static excitation, and good performance is exhibited. Using the results so far achieved, the 200,000kW class pilot machine concept design is reviewed. In the study of total systems, feasibility is studied of a quench test for the 70,000kW class machine through simulation analyses, etc. In the development of refrigeration systems, efforts are exerted to improve on the conventional type in terms of reliability and to further improve on the improved version in terms of performance and space-saving feature. One of the endeavors involves the development of a He Brayton cycle turbine driven compressor. A multilayer cylindrical rotor is verified in terms of functions, characteristics, reliability and durability, and various data are collected toward the development of a pilot machine. (NEDO)

  16. Fitting Noise Management Signal Processing Applying the American Academy of Audiology Pediatric Amplification Guideline: Verification Protocols. (United States)

    Scollie, Susan; Levy, Charla; Pourmand, Nazanin; Abbasalipour, Parvaneh; Bagatto, Marlene; Richert, Frances; Moodie, Shane; Crukley, Jeff; Parsa, Vijay


    Although guidelines for fitting hearing aids for children are well developed and have strong basis in evidence, specific protocols for fitting and verifying some technologies are not always available. One such technology is noise management in children's hearing aids. Children are frequently in high-level and/or noisy environments, and many options for noise management exist in modern hearing aids. Verification protocols are needed to define specific test signals and levels for use in clinical practice. This work aims to (1) describe the variation in different brands of noise reduction processors in hearing aids and the verification of these processors and (2) determine whether these differences are perceived by 13 children who have hearing loss. Finally, we aimed to develop a verification protocol for use in pediatric clinical practice. A set of hearing aids was tested using both clinically available test systems and a reference system, so that the impacts of noise reduction signal processing in hearing aids could be characterized for speech in a variety of background noises. A second set of hearing aids was tested across a range of audiograms and across two clinical verification systems to characterize the variance in clinical verification measurements. Finally, a set of hearing aid recordings that varied by type of noise reduction was rated for sound quality by children with hearing loss. Significant variation across makes and models of hearing aids was observed in both the speed of noise reduction activation and the magnitude of noise reduction. Reference measures indicate that noise-only testing may overestimate noise reduction magnitude compared to speech-in-noise testing. Variation across clinical test signals was also observed, indicating that some test signals may be more successful than others for characterization of hearing aid noise reduction. 
Children provided different sound quality ratings across hearing aids, and for one hearing aid rated the sound

  17. Field verification program for small wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Windward Engineering, LLC


    In 1999 Windward Engineering (Windward) was awarded a Cooperative Agreement under the Field Verification Program with the Department of Energy (DOE) to install two Whisper H40 wind turbines, one at the NREL National Wind Technology Center (NWTC) and one at a test site near Spanish Fork, Utah. After installation, the turbine at the NWTC was to be operated, maintained, and monitored by NREL while the turbine in Spanish Fork was to be administered by Windward. Under this award DOE and Windward defined the primary objectives of the project as follows: (1) Determine and demonstrate the reliability and energy production of a furling wind turbine at a site where furling will be a very frequent event and extreme gusts can be expected during the duration of the tests. (2) Make engineering measurements and conduct limited computer modeling of the furling behavior to improve the industry understanding of the mechanics and nature of furling. We believe the project has achieved these objectives. The turbine has operated for approximately three and a half years. We have collected detailed engineering data approximately 75 percent of that time. Some of these data were used in an ADAMS model validation that highlighted the accuracies and inaccuracies of the computer modeling for a passively furling wind turbine. We also presented three papers at the American Wind Energy Association (AWEA) Windpower conferences in 2001, 2002, and 2003. These papers addressed the following three topics: (a) general overview of the project [1], (b) furling operation during extreme wind events [2], and (c) extrapolation of extreme (design) loads [3]. We believe these papers have given new insight into the mechanics and nature of furling and have set the stage for future research. In this final report we will highlight some of the more interesting aspects of the project as well as summarize the data for the entire project. 
We will also present information on the installation of the turbines as well as

  18. Thrombelastography detects dabigatran at therapeutic concentrations in vitro to the same extent as gold-standard tests

    DEFF Research Database (Denmark)

    Solbeck, Sacha; Ostrowski, Sisse R; Stensballe, Jakob


    BACKGROUND/OBJECTIVES: Dabigatran is an oral anticoagulant approved for treatment of non-valvular atrial fibrillation, deep venous thrombosis (DVT), pulmonary embolism, and prevention of DVT following orthopedic surgery. Monitoring of the dabigatran level is essential in trauma and bleeding patients, but the available plasma-based assays may not sufficiently display its hemostatic effect. This study investigated the in vitro effect of different concentrations of dabigatran on whole blood thrombelastography (TEG) and its correlation to the specific but time-consuming plasma-based tests Hemoclot and Ecarin clotting time (ECT). TEG R performed comparably to the current gold-standard tests Hemoclot and ECT for assessing dabigatran and is applicable as a rapid and precise whole blood monitoring test for dabigatran-treated patients in the emergency setting.


    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...


    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy


    Energy Technology Data Exchange (ETDEWEB)



    The principal objectives of the DM1200 melter tests were to determine the effects of feed rheology, feed solid content, and bubbler configuration on glass production rate and off-gas system performance while processing the HLW AZ-101 and C-106/AY-102 feed compositions; characterize melter off-gas emissions; characterize the performance of the prototypical off-gas system components, as well as their integrated performance; characterize the feed, glass product, and off-gas effluents; and perform pre- and post-test inspections of system components. The specific objectives (including test success criteria) of this testing, along with how each objective was met, are outlined in a table. The data provided in this Final Report address the impacts of HLW melter feed rheology on melter throughput and validation of the simulated HLW melter feeds. The primary purpose of this testing is to further validate/verify the HLW melter simulants that have been used for previous melter testing and to support their continued use in developing melter and off-gas related processing information for the Project. The primary simulant property in question is rheology. Simulants and melter feeds used in all previous melter tests were produced by direct addition of chemicals; these feeds tend to be less viscous than the rheological upper-bound feeds made from actual wastes. Data provided here compare melter processing for the melter feed used in all previous DM100 and DM1200 tests (nominal melter feed) with feed adjusted by the feed vendor (NOAH Technologies) to be more viscous, thereby simulating more closely the upper-bounding feed produced from actual waste. This report provides results of tests that are described in the Test Plan for this work. The Test Plan is responsive to one of several test objectives covered in the WTP Test Specification for this work; consequently, only part of the scope described in the Test Specification was addressed in this particular Test Plan.
For the purpose of

  2. Guidelines for Sandia ASCI Verification and Validation Plans - Content and Format: Version 1.0

    Energy Technology Data Exchange (ETDEWEB)



    This report summarizes general guidelines for the development of Verification and Validation (V and V) plans for ASCI code projects at Sandia National Laboratories. The main content categories recommended by these guidelines for explicit treatment in Sandia V and V plans are (1) stockpile drivers influencing the code development project; (2) the key phenomena to be modeled by the individual code; (3) software verification strategy and test plan; and (4) code validation strategy and test plans. The authors of this document anticipate that the needed content of the V and V plans for the Sandia ASCI codes will evolve as time passes. These needs will be reflected by future versions of this document.

  3. Verification of Kaplan turbine cam curves realization accuracy at power plant

    Directory of Open Access Journals (Sweden)

    Džepčeski Dane


    Full Text Available Sustainability of approximately constant value of Kaplan turbine efficiency, for relatively large net head changes, is a result of turbine runner variable geometry. Dependence of runner blades position change on guide vane opening represents the turbine cam curve. The cam curve realization accuracy is of great importance for the efficient and proper exploitation of turbines and consequently complete units. Due to the reasons mentioned above, special attention has been given to the tests designed for cam curves verification. The goal of this paper is to provide the description of the methodology and the results of the tests performed in the process of Kaplan turbine cam curves verification.
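The cam relationship described above, runner blade position as a function of guide vane opening, can be checked point by point against a reference curve by interpolation. The sketch below is a minimal illustration of that verification step; the `vane`/`blade` knot values and the measured point are entirely hypothetical, not data from the paper.

```python
# Illustrative sketch: verifying realized cam-curve points against a
# reference cam curve by linear interpolation. All numbers are hypothetical.

def interp(x, xs, ys):
    """Piecewise-linear interpolation of y(x) over sorted knots xs."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])

# Hypothetical reference cam curve:
# guide vane opening (%) -> runner blade angle (deg)
vane = [20, 40, 60, 80, 100]
blade = [5.0, 12.0, 20.0, 27.0, 32.0]

def cam_error(opening, measured_blade_angle):
    """Deviation of a measured blade angle from the reference cam curve."""
    return measured_blade_angle - interp(opening, vane, blade)

print(round(cam_error(50, 16.5), 2))  # → 0.5 (measured 16.5 deg vs reference 16.0)
```

In an actual test the measured (opening, blade angle) pairs would come from position transducers on the unit, and the acceptable deviation band would be set by the cam-curve realization accuracy requirement.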

  4. Portable system for periodical verification of area monitors for neutrons; Sistema portatil para verificacao periodica de monitores de area para neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu, E-mail: [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Energia Nuclear; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W., E-mail: [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI). Lab. de Neutrons


    The Neutrons Laboratory is developing a portable test system for verifying the functioning conditions of neutron area monitors. This device will allow users to verify, at the installations where the instruments are used, that their calibration is maintained, avoiding the use of equipment whose response to neutron beams is inadequate.

  5. A novel dynamic acoustical model for speaker verification (United States)

    Li, Gongjun; Espy-Wilson, Carol


    In speaker verification, the conventional acoustical models (hidden Markov model and vector quantization) are not able to capture a speaker's dynamic characteristics. In this paper we describe a novel dynamic acoustical model. The training data are viewed as a concatenation of many speech-pattern samples, and the pattern matching involves a comparison of the pattern samples and the test speech. To reduce the amount of computation, a tree is generated to index the entrance to pattern samples using an expectation and maximization (EM) approach, and leaves in the tree are employed to quantize the feature vectors in the training data. The obtained leaf-number sequences are exploited in pattern matching as a temporal model. We use a DTW scheme and a GMM scheme to match the training data and the test speech. Experimental results on NIST'98 speaker recognition evaluation data show that the accuracy of speaker verification on 3- and 10-s test speech is raised from 71.1% and 75.2% for a baseline GMM-based system to 80.0% and 82.1% for the dynamic acoustical model, respectively. Furthermore, some pattern samples in the training data are correctly tracked by the test speech.
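The DTW matching step mentioned above can be illustrated with a minimal dynamic time warping implementation. This is a generic DTW sketch, not the paper's system; the one-dimensional "feature vectors" below are made up purely to show that a test utterance aligns more cheaply with a matching stored pattern than with a mismatched one.

```python
# Minimal dynamic time warping (DTW) sketch of the pattern-matching step
# described above; feature vectors and values are illustrative only.

def dtw(a, b):
    """DTW alignment cost between two sequences of feature vectors."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two frames
            d = sum((x - y) ** 2 for x, y in zip(a[i - 1], b[j - 1])) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# A test utterance aligned against two hypothetical stored speaker patterns:
pattern_a = [(0.0,), (1.0,), (2.0,)]
pattern_b = [(5.0,), (6.0,), (7.0,)]
test_utt = [(0.1,), (1.1,), (2.1,)]
# The utterance matches pattern_a far more cheaply than pattern_b.
```

In the paper's scheme such pattern samples are indexed by a tree of quantized leaf numbers rather than searched exhaustively, but the per-pair alignment cost is the same kind of DTW computation.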

  6. Solar Array Verification Analysis Tool (SAVANT) Developed (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert


    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  7. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer


    Full Text Available System relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: On the one hand, it natively provides versatile options to efficiently process tree or graph data structures. On the other hand, Prolog's non-determinism and backtracking ease testing of different variations of the program flow without much effort. A rule-based approach with Prolog allows the verification goals to be characterized in a concise and declarative way. In this paper, we describe our approach to verify the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then further tested against verification goals. The different program-flow branches due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study, where we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain specific language (DSL) in Prolog to express the verification goals.
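The kind of check described above, searching every branch of an execution tree for incorrect semaphore usage, can be sketched in a few lines. This toy analogue is in Python rather than Prolog, and the execution tree and action labels (`sem_take`, `sem_give`) are hypothetical; the real system derives such trees from C++ ASTs and expresses the property in CTL.

```python
# Toy analogue of the semaphore check described above: walk every branch
# of an execution tree and flag root-to-leaf paths where a semaphore is
# acquired but never released. Tree and labels are hypothetical.

# Execution tree encoded as (action, [subtrees]).
tree = ("enter", [
    ("sem_take", [
        ("work", [("sem_give", [])]),   # correct branch: take then give
        ("error_exit", []),             # faulty branch: leaks the semaphore
    ]),
])

def unbalanced_paths(node, path=()):
    """Yield root-to-leaf paths whose take/give counts do not match."""
    action, children = node
    path = path + (action,)
    if not children:
        if path.count("sem_take") != path.count("sem_give"):
            yield path
        return
    for child in children:
        yield from unbalanced_paths(child, path)

bad = list(unbalanced_paths(tree))
print(bad)  # → [('enter', 'sem_take', 'error_exit')]
```

Where Prolog would enumerate the branches by backtracking over facts, the Python sketch enumerates them explicitly with a recursive generator; the verification idea, one check per subtree of the full execution sequence, is the same.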

  8. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael


    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans... with the additional benefit of generating SSA as a side effect, which may be immediately useful for a subsequent dynamic compilation stage.

  9. Performing Verification and Validation in Reuse-Based Software Engineering (United States)

    Addy, Edward A.


    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  10. Status and Verification of Edge Plasma Turbulence Code BOUT

    Energy Technology Data Exchange (ETDEWEB)

    Umansky, M V; Xu, X Q; Dudson, B; LoDestro, L L; Myra, J R


    The BOUT code is a detailed numerical model of tokamak edge turbulence based on collisional plasma fluid equations. BOUT solves for time evolution of plasma fluid variables: plasma density N{sub i}, parallel ion velocity V{sub {parallel}i}, electron temperature T{sub e}, ion temperature T{sub i}, electric potential {phi}, parallel current j{sub {parallel}}, and parallel vector potential A{sub {parallel}}, in realistic 3D divertor tokamak geometry. The current status of the code, physics model, algorithms, and implementation is described. Results of verification testing are presented along with illustrative applications to tokamak edge turbulence.

  11. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E


    Most of CERN’s industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, which is a precise, mathematical method to check formalized requirements automatically against the system.

  12. HADES: Microprocessor Hazard Analysis via Formal Verification of Parameterized Systems

    Directory of Open Access Journals (Sweden)

    Lukáš Charvát


    Full Text Available HADES is a fully automated verification tool for pipeline-based microprocessors that aims at flaws caused by improperly handled data hazards. It focuses on single-pipeline microprocessors designed at the register transfer level (RTL and deals with read-after-write, write-after-write, and write-after-read hazards. HADES combines several techniques, including data-flow analysis, error pattern matching, SMT solving, and abstract regular model checking. It has been successfully tested on several microprocessors for embedded applications.

  13. Test

    DEFF Research Database (Denmark)

    Bendixen, Carsten


    A contribution giving a brief, introductory, perspective-setting and concept-clarifying account of the concept of testing in the educational field.

  14. High-level Synthesis Integrated Verification

    Directory of Open Access Journals (Sweden)

    M. Dossis


    Full Text Available It is widely known in the engineering community that more than 60% of the IC design project time is spent on verification. For the very complex contemporary chips, this may prove prohibitive for the IC to arrive on the market at the right time, and therefore valuable sales share may be lost by the developing industry. This problem is exacerbated by the fact that most conventional verification flows are highly repetitive and a great proportion of the project time is spent on last-moment simulations. In this paper we present an integrated approach to rapid, high-level verification, exploiting the advantages of a formal High-level Synthesis tool developed by the author. Verification in this work is supported at 3 levels: high-level program code, RTL simulation and rapid, generated C testbench execution. This paper is supported by strong experimental work with 3-4 popular design synthesis and verification tools that proves the principles of our methodology.

  15. Joint ETV/NOWATECH test plan for the Sorbisense GSW40 passive sampler (United States)

    The joint test plan is the implementation of a test design developed for verification of the performance of an environmental technology following the NOWATECH ETV method. The verification is a joint verification with the US EPA ETV scheme and the Advanced Monitoring Systems Cent...

  16. Use of prestudy heparin did not influence the efficacy and safety of rivaroxaban in patients treated for symptomatic venous thromboembolism in the EINSTEIN DVT and EINSTEIN PE studies. (United States)

    Prandoni, Paolo; Prins, Martin H; Cohen, Alexander T; Müller, Katharina; Pap, Ákos F; Tewes, Miriam C; Lensing, Anthonie W A


    In the EINSTEIN DVT and EINSTEIN PE studies, the majority of patients received heparins to bridge the period between venous thromboembolism (VTE) diagnosis confirmation and the start of the study. In contrast to vitamin K antagonists (VKAs), rivaroxaban may not require initial heparin treatment. To evaluate the effect of prestudy heparin on the efficacy and safety of rivaroxaban relative to enoxaparin/VKA, the 3-month incidence of recurrent VTE and the 14-day incidence of major and nonmajor clinically relevant bleeding were compared in patients who did and did not receive prestudy heparin. Of the 8,281 patients randomized, 6,937 (83.8%) received prestudy heparin (mean ± SD duration = rivaroxaban: 1.04 [± 0.74] days; enoxaparin 1.03 [± 0.42] days), and 1,344 (16.2%) did not. In patients who did not receive prestudy heparin, the incidences of recurrent VTE were similar in rivaroxaban (15 of 649, 2.3%) and enoxaparin/VKA (13 of 695, 1.9%) patients (adjusted hazard ratio [HR] = 1.11; 95% confidence interval [CI] = 0.52 to 2.37). The incidences of recurrent VTE were also similar in rivaroxaban (54 of 3,501, 1.5%) and enoxaparin/VKA (69 of 3,436, 2.0%) patients who did receive prestudy heparin (adjusted HR = 0.74; 95% CI = 0.52 to 1.06; p(interaction) = 0.32). The incidences of major or nonmajor clinically relevant bleeding with rivaroxaban were not significantly different from those with enoxaparin/VKA, either with (105 of 3,485, 3.0% vs. 104 of 3,428, 3.0%; adjusted HR = 0.98; 95% CI = 0.75 to 1.29) or without (24 of 645, 3.7% vs. 30 of 688, 4.4%; adjusted HR = 0.81; 95% CI = 0.46 to 1.40; p(interaction) = 0.68) prestudy heparin. Although the majority of patients in the EINSTEIN studies received prestudy heparin, there were no notable differences in the treatment effect of rivaroxaban versus enoxaparin/VKA in those who did and did not receive it. © 2015 by the Society for Academic Emergency Medicine.
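The percentages quoted alongside the raw counts above are crude incidences (events divided by patients at risk). As a sanity check on that arithmetic, the counts reported in the abstract reproduce the quoted percentages exactly:

```python
# Sketch reproducing the incidence percentages quoted above from the
# raw counts reported in the abstract (events / patients at risk).

def incidence_pct(events, n):
    """Crude incidence as a percentage, rounded to one decimal."""
    return round(100.0 * events / n, 1)

# Recurrent VTE without prestudy heparin:
riva_no_hep = incidence_pct(15, 649)   # rivaroxaban arm
enox_no_hep = incidence_pct(13, 695)   # enoxaparin/VKA arm

# Recurrent VTE with prestudy heparin:
riva_hep = incidence_pct(54, 3501)
enox_hep = incidence_pct(69, 3436)

print(riva_no_hep, enox_no_hep, riva_hep, enox_hep)  # → 2.3 1.9 1.5 2.0
```

Note the adjusted hazard ratios in the abstract are not recoverable from these counts alone, since they come from a covariate-adjusted survival model.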

  17. Simulator Semantics for System Level Formal Verification

    Directory of Open Access Journals (Sweden)

    Toni Mancini


    Full Text Available Many simulation-based Bounded Model Checking approaches to System Level Formal Verification (SLFV) have been devised. Typically such approaches exploit the capability of simulators to save computation time by saving and restoring the state of the system under simulation. However, even though such approaches aim at (bounded) formal verification, the simulator behaviour is not in fact formally modelled, and the proof of correctness of the proposed approaches basically relies on an intuitive notion of simulator behaviour. This gap makes it hard to check that the optimisations introduced to speed up the simulation do not omit checking relevant behaviours of the system under verification. The aim of this paper is to fill the above gap by presenting a formal semantics for simulators.
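The save/restore capability exploited by these approaches can be sketched with a toy simulator: a bounded search branches on inputs by restoring a saved snapshot instead of re-simulating the common prefix. Everything below (the counter "simulator", the property, the depth bound) is a made-up illustration of the idea, not the paper's formal semantics.

```python
# Toy sketch of simulation-based bounded model checking with simulator
# save/restore. The "simulator" is a made-up counter driven by 0/1 inputs.
import copy

class ToySimulator:
    """A trivial simulator: state is a counter stepped by 0/1 inputs."""
    def __init__(self):
        self.state = {"x": 0}

    def step(self, inp):
        self.state["x"] += 1 if inp else -1

    def save(self):
        return copy.deepcopy(self.state)

    def restore(self, snapshot):
        self.state = copy.deepcopy(snapshot)

def bounded_check(sim, depth, bad):
    """Explore all input sequences up to `depth`; return an input trace
    reaching a state satisfying `bad`, or None. Restores a snapshot at
    each branch instead of re-simulating the prefix."""
    if bad(sim.state):
        return []
    if depth == 0:
        return None
    snapshot = sim.save()
    for inp in (0, 1):
        sim.restore(snapshot)
        sim.step(inp)
        trace = bounded_check(sim, depth - 1, bad)
        if trace is not None:
            return [inp] + trace
    return None

sim = ToySimulator()
# "Property violation": the counter reaches 3 within 4 steps.
cex = bounded_check(sim, 4, lambda s: s["x"] >= 3)
print(cex)  # → [1, 1, 1]
```

The correctness concern raised in the abstract is precisely whether such snapshot-based pruning and reuse is faithful to the simulator's real behaviour, which is what a formal simulator semantics lets one prove.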


    Directory of Open Access Journals (Sweden)

    L.F. Zhandarova


    Full Text Available 80 case histories of patients with breast cancer were analyzed. During the preoperative examination, the objective and instrumental examination methods used raised suspicion of a malignant process, but no morphological verification was obtained. Physical examination revealed 75% of the cancer cases. Roentgenologic evidence of malignant tumor was found in 43.5% of the women. Ultrasound examination of the mammary glands showed cancer symptoms in 57.7% of patients. Despite repeated puncture aspiration biopsy, preoperative morphological examination proved negative. The reasons for morphological verification failure are connected with technical difficulties and the morphological features of tumor structure. Negative verification of the malignant process necessitated diagnostic partial mastectomy. To achieve ablasticity of excisional biopsy it is necessary to keep 2 cm from the tumor. Staged morphological diagnosis verified the diagnosis in all patients, allowing the adequate extent of surgical procedures to be chosen.

  19. Constraint Specialisation in Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick


    We present a method for specialising the constraints in constrained Horn clauses with respect to a goal. We use abstract interpretation to compute a model of a query-answer transformation of a given set of clauses and a goal. The effect is to propagate the constraints from the goal top-down and p... Experimental results on verification problems show that this is an effective transformation, both in our own verification tools (convex polyhedra analyser) and as a pre-processor to other Horn clause verification tools.

  20. Constraint specialisation in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick


    We present a method for specialising the constraints in constrained Horn clauses with respect to a goal. We use abstract interpretation to compute a model of a query–answer transformed version of a given set of clauses and a goal. The constraints from the model are then used to compute a speciali... underlying the clauses. Experimental results on verification problems show that this is an effective transformation, both in our own verification tools (based on a convex polyhedra analyser) and as a pre-processor to other Horn clause verification tools.

  1. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.


    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  2. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.


    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  3. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A|info:eu-repo/dai/nl/255170653

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  4. Radiation Biology Irradiator Dose Verification Survey. (United States)

    Pedersen, Kurt H; Kunugi, Keith A; Hammer, Clifford G; Culberson, Wesley S; DeWerd, Larry A


    Interest in standardized dosimetry for radiobiological irradiators has expanded over the last decade. At a symposium held at NIST, "The Importance of Standardization of Dosimetry in Radiobiology", a set of 12 criteria necessary for adequate irradiation was developed by the authors. Here we report on our review of dosimetry methods from various peer-reviewed publications, none of which satisfied all 12 criteria set forth by the authors of the NIAD/NCI/NIST proceedings. The inadequate reporting of dosimetry methods in the literature raises questions regarding the accuracy of the dose delivered to animal test subjects and the resulting experimental results. For this reason, we investigated the level of accuracy of dose delivery in radiation biology studies. We performed an irradiator output verification study of 12 radiation biology laboratories (7 gamma and 5 X-ray units) using polymethyl methacrylate (PMMA) mouse phantoms and thermoluminescent dosimeter (TLD) readouts at the University of Wisconsin Medical Radiation Research Center (UWMRRC). The laboratories housing each of these irradiators were asked to deliver specific doses to individual mouse phantoms. Simultaneously, mouse phantoms at the UWMRRC were irradiated with NIST-traceable reference beams representative of the subject laboratories' beam energies. The irradiated mouse phantoms were returned from the various institutions to the UWMRRC and the TLDs were processed, with their measured dose response compared to the known dose response of the calibration phantom TLDs. Of the five facilities using X-ray irradiators, only one delivered an output within 5% of the target dose. The dose differences for the other four X-ray irradiators ranged from 12 to 42%. These results indicate the potential need for standardization of dose determination and additional oversight of radiobiology investigations.
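The output-verification arithmetic behind the quoted 5% window and 12-42% deviations is simply a signed percent difference between the TLD-measured dose and the intended dose. The sketch below illustrates it; the 2.00 Gy target and 2.24 Gy reading are made-up numbers, not the study's raw data.

```python
# Sketch of the output-verification arithmetic described above: compare
# the dose measured by the mailed TLD phantom against the dose the
# facility intended to deliver. Numbers are illustrative only.

def dose_deviation_pct(measured_gy, target_gy):
    """Signed percent deviation of delivered dose from target dose."""
    return 100.0 * (measured_gy - target_gy) / target_gy

def within_tolerance(measured_gy, target_gy, tol_pct=5.0):
    """True if the irradiator output is within the +/- tol_pct window."""
    return abs(dose_deviation_pct(measured_gy, target_gy)) <= tol_pct

# A facility asked to deliver 2.00 Gy whose phantom TLDs read 2.24 Gy
# shows roughly a 12% overdose and falls outside the 5% window:
print(dose_deviation_pct(2.24, 2.00), within_tolerance(2.24, 2.00))
```

In the actual study the measured dose comes from comparing the returned TLD response to calibration-phantom TLDs irradiated in NIST-traceable reference beams, which is what makes the deviation traceable.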

  5. The formal verification of generic interpreters (United States)

    Windley, P.; Levitt, K.; Cohen, G. C.


    The task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  6. On Backward-Style Anonymity Verification (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  7. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand


    and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...
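The cooperation scheme described above, workers sharing the best cost found so far and pruning branches that already exceed it, can be sketched in miniature. This toy runs the "swarm" sequentially and uses a made-up weighted graph; it is an illustration of randomized search plus branch-and-bound pruning, not of Uppaal's actual engine.

```python
# Toy sketch of swarm verification with branch-and-bound: several
# randomized depth-first searches over the same state space share the
# best cost found so far. Graph and costs are hypothetical.
import random

# Weighted transition graph: state -> [(next_state, cost)]
graph = {
    "s0": [("s1", 2), ("s2", 5)],
    "s1": [("goal", 9), ("s2", 1)],
    "s2": [("goal", 3)],
    "goal": [],
}

def randomized_search(rng, best):
    """One swarm worker: random-order DFS from s0, pruned by the bound."""
    stack = [("s0", 0)]
    while stack:
        state, cost = stack.pop()
        if cost >= best[0]:          # branch-and-bound pruning
            continue
        if state == "goal":
            best[0] = cost           # new incumbent solution
            continue
        succs = list(graph[state])
        rng.shuffle(succs)           # each worker explores in its own order
        for nxt, w in succs:
            stack.append((nxt, cost + w))
    return best[0]

best = [float("inf")]                # shared cost bound across the swarm
for seed in range(4):                # four workers, run sequentially here
    randomized_search(random.Random(seed), best)
print(best[0])  # → 6  (optimal: s0 -> s1 -> s2 -> goal)
```

In a real swarm the workers run in parallel on a cluster and exchange incumbent costs asynchronously, but the pruning invariant is the same: a branch whose accumulated cost already meets the shared bound cannot improve the solution.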

  8. Towards Verification and Validation for Increased Autonomy (United States)

    Giannakopoulou, Dimitra


    This presentation goes over the work we have performed over the last few years on verification and validation of the next generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then moves on to identify the characteristics of ACAS X that are related to autonomy and to discuss the challenges that autonomy poses for V&V. All work presented has already been published.

  9. Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program (United States)

    Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby


    Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.

  10. Alignment Verification in the Early Stage of Service Design


    Tapandjieva, Gorica; Filipponi, Matteo; Wegmann, Alain


    Verification is a costly task, sometimes burdensome and tedious, requiring a strong formal background. To reduce the effort and cost invested in verification, we developed a model-driven approach for automatic verification of service properties, carried out in the early service design phase. Our approach is based on SEAM, a service modeling method, and it incorporates a verification system called Leon. With our approach, service designers do not need substantial understanding of specific formal and ver...

  11. Towards an automated system for the verification and diagnosis of intelligent VLSI circuits (United States)

    Velazco, Raoul; Ziade, Haissam

    The main features of a system designed to cope with both the verification and diagnosis of Very Large Scale Integration (VLSI) intelligent circuits are detailed. The system is composed of a validation program generator, the GAPT (French Acronym for automatic generation of test programs) software and a microprocessor dedicated verification system, the TEMAC functional tester. GAPT/TEMAC tools allow an easy implementation of a top down diagnosis procedure. Each diagnosis action is composed of symptom analysis, malfunction hypothesis statement, sequence generation, execution, and result evaluation. It was successfully used in various microprocessor qualification/validation experiments. The system capabilities and the diagnosis procedure are illustrated by an actual 68000 microprocessor diagnosis experiment.

  12. Further optimisations of constant Q cepstral processing for integrated utterance and text-dependent speaker verification

    DEFF Research Database (Denmark)

    Delgado, Hector; Todisco, Massimiliano; Sahidullah, Md


    Many authentication applications involving automatic speaker verification (ASV) demand robust performance using short-duration, fixed or prompted text utterances. Text constraints not only reduce the phone-mismatch between enrollment and test utterances, which generally leads to improved performance, but also provide an ancillary level of security. This can take the form of explicit utterance verification (UV). An integrated UV + ASV system should then verify access attempts which contain not just the expected speaker, but also the expected text content. This paper presents such a system...

  13. A Domain-specific Framework for Automated Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth


    The development of modern railway and tramway control systems represents a considerable challenge to both systems and software engineers: The goal to increase the traffic throughput while at the same time increasing the availability and reliability of railway operations leads to a demand for more...... in a demand for a higher degree of automation for the development verification, validation and test phases of projects, without impairing the thoroughness of safety-related quality measures and certification activities. Motivated by these considerations, this presentation describes an approach for automated...... construction and verification of railway control systems....

  14. Systemverilog for verification a guide to learning the testbench language features

    CERN Document Server

    Spear, Chris


    Based on the highly successful second edition, this extended edition of SystemVerilog for Verification: A Guide to Learning the Testbench Language Features teaches all verification features of the SystemVerilog language, providing hundreds of examples to clearly explain the concepts and fundamentals. It contains material for both the full-time verification engineer and the student learning this valuable skill. In the third edition, authors Chris Spear and Greg Tumbush start with how to verify a design, and then use that context to demonstrate the language features, including the advantages and disadvantages of different styles, allowing readers to choose between alternatives. This textbook contains end-of-chapter exercises designed to enhance students’ understanding of the material. Other features of this revision include: new sections on static variables, print specifiers, and DPI from the 2009 IEEE language standard; descriptions of UVM features such as factories, the test registry, and the config...

  15. Flexible prototype of modular multilevel converters for experimental verification of DC transmission and multiterminal systems

    DEFF Research Database (Denmark)

    Konstantinou, Georgios; Ceballos, Salvador; Gabiola, Igor


    Testing and verification of high-level and low-level control, modulation, fault handling and converter co-ordination for modular multilevel converters (MMCs) requires the development of experimental prototype converters. In this paper, we provide a complete overview of the MMC-based experimental...


  16. Verification testing of the Stormwater Management, Inc. StormFilter® Using Perlite Filter Media (United States)

    Verification testing of the Stormwater Management, Inc. StormFilter® Using Perlite Filter Media was conducted on a 0.7 acre drainage basin near downtown Griffin, Georgia. The system consists of an inlet bay, flow spreader, cartridge bay, overflow baffle, and outlet bay, housed in...

  17. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliy Yu. Meltsov


    Full Text Available This paper presents the results of the development of one of the modules of a system for the verification of parallel algorithms, which is used to verify the inference engine. The module is designed to build the specification of requirements whose feasibility on the algorithm must be proved (tested).

  18. 30 CFR 50.41 - Verification of reports. (United States)


    ..., INJURIES, ILLNESSES, EMPLOYMENT, AND COAL PRODUCTION IN MINES Maintenance of Records; Verification of Information § 50.41 Verification of reports. Upon request by MSHA, an operator shall allow MSHA to inspect and... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Verification of reports. 50.41 Section 50.41...

  19. 19 CFR 351.307 - Verification of information. (United States)


    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Verification of information. 351.307 Section 351... COUNTERVAILING DUTIES Information and Argument § 351.307 Verification of information. (a) Introduction. Prior to... verify relevant factual information. This section clarifies when verification will occur, the contents of...

  20. 7 CFR 1260.550 - Verification of information. (United States)


    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Verification of information. 1260.550 Section 1260... Cattlemen's Beef Promotion and Research Board § 1260.550 Verification of information. The Secretary may require verification of the information to determine eligibility for certification to make nominations...