Sample records for verification test series

  1. Test tasks for verification of program codes for calculation of neutron-physical characteristics of the BN series reactors (United States)

    Tikhomirov, Georgy; Ternovikh, Mikhail; Smirnov, Anton; Saldikov, Ivan; Bahdanovich, Rynat; Gerasimov, Alexander


    A system of test tasks is presented, using the BN-1200 fast reactor with nitride fuel as the prototype. The system includes three tests based on different geometric models: a fuel-element model in homogeneous and heterogeneous form, a fuel-assembly model in height-heterogeneous and fully heterogeneous form, and a model of the active core of the BN-1200 reactor. Cross-verification of the program codes was performed. The transition from simple geometry to more complex geometry makes it possible to identify the causes of discrepancies in the results at an early stage of cross-verification of the codes. This system of tests can be applied to certification of engineering programs, and of programs based on the Monte Carlo method, for calculation of full-scale models of BN-series reactor cores. The developed tasks take into account the basic layout and structural features of the BN-1200 reactor. They are intended for study of neutron-physical characteristics, estimation of the influence of the heterogeneous structure, and assessment of the influence of the diffusion approximation. The development of the system of test tasks allowed independent testing of programs for calculation of neutron-physical characteristics: the engineering programs JARFR and TRIGEX, and the Monte Carlo codes MCU, TDMCC, and MMK.

  2. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra


    A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was built through a testing process of 900 signatures belonging to 50 participants: 3 signatures per participant for reference, and 5 signatures each from the original user, simple impostors, and trained impostors as test signatures. The final system was tested with 50 participants and 3 references. These tests show that system accuracy without impostors is 90.44897959% at threshold 44, with a rejection error rate (FNMR) of 5.2% and an acceptance error rate (FMR) of 4.35102%. With impostors, system accuracy is 80.1361% at threshold 27, with a rejection error rate (FNMR) of 15.6% and an average acceptance error rate (FMR) of 4.263946%, with details as follows: acceptance errors of 0.391837%, acceptance errors for simple impostors of 3.2%, and acceptance errors for trained impostors of 9.2%.
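The matching step described above can be sketched with a minimal dynamic time warping (DTW) distance. The function name and the toy feature series below are illustrative only, not taken from the paper.

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

reference = [0.0, 1.0, 2.0, 1.0, 0.0]
test_sig = [0.0, 1.0, 1.0, 2.0, 1.0, 0.0]  # same shape, slightly stretched in time
print(dtw_distance(reference, test_sig))   # prints 0.0
```

A tested signature whose feature series warps cheaply onto the claimed reference (distance at or below the tuned threshold) is accepted as genuine; warping is what makes the comparison tolerant of the time stretching visible in the toy example.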


    Energy Technology Data Exchange (ETDEWEB)

    Moran, B


    We present analytic solutions to two test problems that can be used to check the hydrodynamic implementation in computer codes designed to calculate the propagation of shocks in spherically convergent geometry. Our analysis is restricted to fluid materials with constant bulk modulus. In the first problem we present the exact initial acceleration and pressure gradient at the outer surface of a sphere subjected to an exponentially decaying pressure of the form P(t) = P{sub 0}e{sup -at}. We show that finely-zoned hydro-code simulations are in good agreement with our analytic solution. In the second problem we discuss the implosions of incompressible spherical fluid shells and we present the radial pressure profile across the shell thickness. We also discuss a semi-analytic solution to the time-evolution of a nearly spherical shell with arbitrary but small initial 3-dimensional (3-D) perturbations on its inner and outer surfaces.

  4. Palmprint Verification Using Time Series Method

    Directory of Open Access Journals (Sweden)

    A. A. Ketut Agung Cahyawan Wiranatha


    The use of biometrics for automatic recognition is growing rapidly as a solution to security problems, and the palmprint is one of the most frequently used biometrics. This paper uses a two-step center-of-mass moment method for region-of-interest (ROI) segmentation and applies the time series method, combined with a block window method, for feature representation. Normalized Euclidean distance is used to measure the degree of similarity between two palmprint feature vectors. System testing was done using 500 palm samples, with 4 samples per palm as reference images and 6 samples as test images. Experimental results show that the system achieves high performance, with a success rate of about 97.33% (FNMR = 1.67%, FMR = 1.00%, T = 0.036).
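The similarity measure named above can be sketched as follows. The paper's exact normalization is not reproduced here, so dividing by the vector dimension is an assumption, and the feature values are illustrative stand-ins.

```python
import math

def normalized_euclidean(u, v):
    """Euclidean distance scaled by vector dimension (assumed normalization)."""
    if len(u) != len(v):
        raise ValueError("feature vectors must have equal length")
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)) / len(u))

def is_genuine(reference, candidate, threshold=0.036):
    """Accept the palmprint if the distance is at or below the threshold T."""
    return normalized_euclidean(reference, candidate) <= threshold

ref = [0.12, 0.50, 0.33, 0.08]
probe = [0.13, 0.49, 0.34, 0.08]  # close to the reference
print(normalized_euclidean(ref, probe))  # small value, well under T = 0.036
```

The threshold T trades off false rejections (FNMR) against false acceptances (FMR), exactly the two error rates reported in the abstract.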

  5. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    This document contains standardized definitions for several commonly used code verification test problems. The definitions are intended to contain sufficient information to set up each test problem in a computational physics code, and to be used in conjunction with exact solutions to these problems generated using ExactPack.


    Energy Technology Data Exchange (ETDEWEB)

    Aleman, S


    The PORFLOW software package is a comprehensive mathematical model for simulation of multi-phase fluid flow, heat transfer, and mass transport in variably saturated porous and fractured media. PORFLOW can simulate transient or steady-state problems in Cartesian or cylindrical geometry. The porous medium may be anisotropic and heterogeneous and may contain discrete fractures or boreholes within the porous matrix. The theoretical models within the code provide a unified treatment of concepts relevant to fluid flow and transport. The main features of PORFLOW that are relevant to Performance Assessment modeling at the Savannah River National Laboratory (SRNL) include variably saturated flow and transport of parent and progeny radionuclides. This document describes the testing of a relevant sample of problems in PORFLOW and comparison of the simulation results to analytical solutions or to other commercial codes. The testing consists of the following four groups. Group 1: Groundwater Flow; Group 2: Contaminant Transport; Group 3: Numerical Dispersion; and Group 4: Keyword Commands.

  7. Infrared scanner concept verification test report (United States)

    Bachtel, F. D.


    The test results from a concept verification test conducted to assess the use of an infrared scanner as a remote temperature sensing device for the Space Shuttle program are presented. The temperature and geometric resolution limits, atmospheric attenuation effects (including conditions with fog and rain), and the problem of surface emissivity variations are included. It is concluded that the basic concept of using an infrared scanner to determine near-freezing surface temperatures is feasible. The major problem identified concerns infrared reflections, which result in significant errors if not controlled. Actions taken to manage these errors result in design and operational constraints to control the viewing angle and surface emissivity.

  8. Design, analysis, and test verification of advanced encapsulation system (United States)

    Garcia, A.; Minning, C.


    Procurement of 4 in x 4 in polycrystalline solar cells proceeded with some delays. A total of 1200 cells were procured for use in both verification testing and qualification testing. Additional thermal-structural analyses were run, and the data are presented. An outline of the verification testing is included, with information on test specimen construction.

  9. Environmental Technology Verification--Baghouse Filtration Products: GE Energy QG061 Filtration Media (Tested September 2008) (United States)

    This report reviews the filtration and pressure drop performance of GE Energy's QG061 filtration media. Environmental Technology Verification (ETV) testing of this technology/product was conducted during a series of tests in September 2008. The objective of the ETV Program is to ...

  10. Fracture mechanics life analytical methods verification testing (United States)

    Favenesi, J. A.; Clemons, T. G.; Riddell, W. T.; Ingraffea, A. R.; Wawrzynek, P. A.


    The objective was to evaluate NASCRAC (trademark) version 2.0, a second-generation fracture analysis code, for verification and validity. NASCRAC was evaluated using a combination of comparisons to the literature, closed-form solutions, numerical analyses, and tests. Several limitations and minor errors were detected. Additionally, a number of major flaws were discovered. These major flaws were generally due to the application of a specific method or theory, not to programming logic. Results are presented for the following program capabilities: K versus a, J versus a, crack opening area, life calculation due to fatigue crack growth, tolerable crack size, proof test logic, tearing instability, creep crack growth, crack transitioning, crack retardation due to overloads, and elastic-plastic stress redistribution. It is concluded that the code is an acceptable fracture tool for K solutions of simplified geometries, for a limited number of J and crack opening area solutions, and for fatigue crack propagation with the Paris equation and constant-amplitude loads when the Paris equation is applicable.
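The last capability mentioned, fatigue crack propagation with the Paris equation under constant-amplitude loads, can be illustrated with a short numerical integration. The function name, the geometry factor Y, and the material numbers below are hypothetical stand-ins, not NASCRAC values.

```python
import math

def paris_life(a0, ac, delta_sigma, C, m, Y=1.0, steps=200_000):
    """Cycles to grow a crack from a0 to ac by midpoint integration of the
    Paris law da/dN = C * (dK)^m, with dK = Y * delta_sigma * sqrt(pi * a)."""
    da = (ac - a0) / steps
    cycles = 0.0
    for i in range(steps):
        a_mid = a0 + (i + 0.5) * da  # crack length at the midpoint of this step
        dK = Y * delta_sigma * math.sqrt(math.pi * a_mid)
        cycles += da / (C * dK ** m)
    return cycles

# Hypothetical steel-like numbers: C = 1e-12, m = 3, stress range 100,
# crack growing from 1 mm to 10 mm (consistent units assumed throughout).
print(f"{paris_life(1e-3, 1e-2, 100.0, 1e-12, 3.0):.3e}")
```

For m != 2 the integral also has a closed form, which is one way a verification exercise like this one can check a code's fatigue-life module against an exact answer.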

  11. Orbit attitude processor. STS-1 bench program verification test plan (United States)

    Mcclain, C. R.


    A plan for the static verification of the STS-1 ATT PROC ORBIT software requirements is presented. The orbit version of the SAPIENS bench program is used to generate the verification data. A brief discussion of the simulation software and flight software modules is presented along with a description of the test cases.

  12. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G


    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with the mathematical correctness of the numerical algorithms in a code, while validation deals with the physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
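A typical calculation behind such verification benchmarks is the observed order of convergence: run the code on two grids, measure the error against the exact solution on each, and check that the observed order matches the scheme's design order. A minimal sketch, with manufactured error values for illustration:

```python
import math

def observed_order(err_coarse, err_fine, ratio=2.0):
    """Observed order of accuracy from errors on two grids refined by `ratio`."""
    return math.log(err_coarse / err_fine) / math.log(ratio)

# A second-order scheme has error ~ C*h^2, so halving h cuts the error by 4x.
print(round(observed_order(4.0e-3, 1.0e-3), 6))  # ~2.0
```

An acceptance criterion of the "strong sense benchmark" kind can then be stated mechanically, e.g. that the observed order fall within some tolerance of the design order.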

  13. SRS Software Verification Pre Operational and Startup Test

    Energy Technology Data Exchange (ETDEWEB)

    HILL, L.F.


    This document defines testing for the software used to control the Sodium Removal System (SRS). The testing is conducted off-line from the physical plant by using a simulator built into the software. This provides verification of proper software operation prior to performing the operational acceptance tests with the actual plant hardware.

  14. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection. (United States)


    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...


    Verification testing of the Watts Premier M-Series M-15,000 RO Treatment System was conducted over a 31-day period from April 26, 2004, through May 26, 2004. This test was conducted at the Coachella Valley Water District (CVWD) Well 7802 in Thermal, California. The source water...

  16. Hubble Space Telescope-Space Shuttle interface dynamic verification test (United States)

    Blair, Mark A.; Vadlamudi, Nagarjuna


    A test program has been developed for the interface between the Space Shuttle Orbiter and the Hubble Space Telescope that couples a standard modal test for a simple suspended structure with a novel 'interface verification' test. While the free-free modal test is used to verify the high-load-generating structural modes due to the interaction of internal components of the structure with the rest of the structure, the interface verification test verifies the character of the high-load-generating modes in which the structure reacts against the booster interface. The novel method excites the structure at a single payload-booster interface DOF, while all other interfaces are left free to move.

  17. Grid Modernization Laboratory Consortium - Testing and Verification

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob; Kim, Tom; Ellis, Abraham


    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.

  18. Testing and Formal Verification of Logarithmic Function Design (United States)

    Agarwal, Sanjeev; Bhuria, Indu


    A logarithmic function has been designed on the basis of multiplicative normalization, and its testing has been done using TetraMAX. It is observed that 7050 possible faults can exist in the design, and the TetraMAX ATPG can provide test coverage of 99.29%. Using Design Compiler, a .db file is generated, which is used for functional verification of the design against the RTL design. Compare points are shown by cone views of the design.

  19. Clinical verification of a unilateral otolith test (United States)

    Wetzig, J.; Hofstetter-Degen, K.; Maurer, J.; von Baumgarten, R. J.

    In a previous study [13] we reported promising results for a new test to differentiate unilateral otolith function in vivo. That study pointed to a need for further validation on known pathological cases. In this presentation we detail the results gathered from a group of clinically verified vestibular defectives (verum) and a normal (control) group. The subjects in the verum group were former patients of the ENT clinic of the university hospital. These subjects had usually suffered from neurinoma of the VIIth cranial nerve or inner ear infections. All had required surgical intervention, including removal of the vestibular system. The patients were contacted usually two or more years postoperatively. A group of students from the pre-clinical and clinical phases of medical training served as controls. Both groups were subjected to standardized clinical tests. These tests served to reconfirm the intra- or postoperative diagnosis of unilateral vestibular loss in the verum group. In the control group they had to establish the normalcy of the responses of the vestibular system. Both groups then underwent testing on our eccentric rotary chair in the manner described before [13]. Preliminary results of the trials indicate that this test may indeed, for the first time, offer a chance to look at the isolated otolith apparatus in vivo.

  20. Dual Mode Inverter Control Test Verification

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, J.M.


    Permanent magnet motors with either sinusoidal back emf (permanent magnet synchronous motor [PMSM]) or trapezoidal back emf (brushless dc motor [BDCM]) do not have the ability to alter the air gap flux density (field weakening). Since the back emf increases with speed, the system must be designed to operate with the voltage obtained at its highest speed. Oak Ridge National Laboratory's (ORNL) Power Electronics and Electric Machinery Research Center (PEEMRC) has developed a dual mode inverter controller (DMIC) that overcomes this disadvantage. This report summarizes the results of tests to verify its operation. The standard PEEMRC 75 kW hard-switched inverter was modified to implement the field weakening procedure (silicon controlled rectifier enabled phase advance). A 49.5 hp motor rated at 2800 rpm was derated to a base of 400 rpm and 7.5 hp. The load, developed by a Kahn Industries hydraulic dynamometer, was measured with an MCRT9-02TS Himmelstein and Company torque meter. At the base conditions a current of 212 amperes produced the 7.5 hp. Tests were run at 400, 1215, and 2424 rpm. In each run, the current was no greater than 214 amperes. The horsepower values obtained in the three runs were 7.5, 9.3, and 8.12. These results verified the basic operation of the DMIC in producing a constant power speed ratio (CPSR) of six.
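The horsepower figures above follow the standard shaft-power relation hp = torque (lb-ft) x speed (rpm) / 5252. The torque value in the sketch below is back-calculated for illustration and is not taken from the report.

```python
def horsepower(torque_lbft, rpm):
    """Mechanical shaft power: hp = torque (lb-ft) * speed (rpm) / 5252."""
    return torque_lbft * rpm / 5252.0

# Base point: roughly 98.5 lb-ft at 400 rpm reproduces the 7.5 hp base rating.
print(round(horsepower(98.5, 400), 2))  # ~7.5 hp
# Top test speed over base speed gives the demonstrated speed ratio.
print(2424 / 400)  # 6.06, i.e. a CPSR of about six
```

Holding roughly constant power while speed rises sixfold is exactly what field weakening (here, SCR-enabled phase advance) buys over the base-speed-limited design.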

  1. Workgroup for Hydraulic laboratory Testing and Verification of Hydroacoustic Instrumentation (United States)

    Fulford, Janice M.; Armstrong, Brandy N.; Thibodeaux, Kirk G.


    An international workgroup was recently formed for hydraulic laboratory testing and verification of hydroacoustic instrumentation used for water velocity measurements. The activities of the workgroup have included one face-to-face meeting, conference calls, and an inter-laboratory exchange of two acoustic meters among the participating laboratories. Good agreement was found among the four laboratories at higher tow speeds, and poorer agreement at the lowest tow speed.


    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  3. Verification of MCNP and DANT/sys With the Analytic Benchmark Test Set

    Energy Technology Data Exchange (ETDEWEB)

    Parsons, D.K.; Sood, A.; Forster, R.A.; Little, R.C.


    The recently published analytic benchmark test set has been used to verify the multigroup option of MCNP and also the deterministic DANT/sys series of codes for criticality calculations. All seventy-five problems of the test set give values for K{sub eff} accurate to at least five significant digits. Flux ratios and flux shapes are also available for many of the problems. All seventy-five problems have been run by both the MCNP and DANT/sys codes and comparisons to K{sub eff} and flux shapes have been made. Results from this verification exercise are given below.
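A comparison "accurate to at least five significant digits", as stated above, can be checked mechanically. The helper below is a generic sketch, not code from the verification exercise, and the sample k-eff values are illustrative.

```python
import math

def round_sig(v, sig):
    """Round v to `sig` significant figures."""
    if v == 0:
        return 0.0
    digits = math.floor(math.log10(abs(v))) + 1  # digits left of the decimal point
    return round(v, sig - digits)

def agree_to_sig_figs(x, y, sig=5):
    """True when x and y are identical after rounding to `sig` significant figures."""
    return round_sig(x, sig) == round_sig(y, sig)

print(agree_to_sig_figs(0.997520, 0.9975204))  # True: agree to five figures
print(agree_to_sig_figs(0.997520, 0.997580))   # False: differ in the fifth figure
```

For a Monte Carlo result such as MCNP's, the statistical uncertainty would also have to be small enough for a five-figure comparison to be meaningful.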

  4. The concept verification testing of materials science payloads (United States)

    Griner, C. S.; Johnston, M. H.; Whitaker, A.


    The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle payload projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other-agency experiment and support-system concepts that may be used in the Shuttle. A dedicated materials science payload was tested in the General Purpose Laboratory to assess the requirements a space processing payload places on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test, designated CVT Test IVA, was also held. The purpose of this test was to repeat the Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.

  5. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)


    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  6. Standard practices for verification of speed for material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 These practices cover procedures and requirements for the calibration and verification of testing machine speed by means of standard calibration devices. This practice is not intended to be complete purchase specifications for testing machines. 1.2 These practices apply to the verification of the speed application and measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, setting, etc. In all cases the buyer/owner/user must designate the speed-measuring system(s) to be verified. 1.3 These practices give guidance, recommendations, and examples, specific to electro-mechanical testing machines. The practice may also be used to verify actuator speed for hydraulic testing machines. 1.4 This standard cannot be used to verify cycle counting or frequency related to cyclic fatigue testing applications. 1.5 Since conversion factors are not required in this practice, either SI units (mm/min), or English [in/min], can be used as the standa...

  7. NASA's Evolutionary Xenon Thruster (NEXT) Component Verification Testing (United States)

    Herman, Daniel A.; Pinero, Luis R.; Sovey, James S.


    Component testing is a critical facet of the comprehensive thruster life validation strategy devised by NASA's Evolutionary Xenon Thruster (NEXT) program. Component testing to date has consisted of long-duration high voltage propellant isolator and high-cycle heater life validation testing. The high voltage propellant isolator, a heritage design, will be operated under different environmental conditions in the NEXT ion thruster, requiring verification testing. The life test of two NEXT isolators was initiated with comparable voltage and pressure conditions and a higher temperature than measured for the NEXT prototype-model thruster. To date the NEXT isolators have accumulated 18,300 h of operation. Measurements indicate a negligible increase in leakage current over the testing duration to date. NEXT 1/2 in. heaters, whose manufacturing and control processes have heritage, were selected for verification testing based upon the change in physical dimensions resulting in a higher operating voltage as well as potential differences in thermal environment. The heater fabrication processes, developed for the International Space Station (ISS) plasma contactor hollow cathode assembly, were utilized with modification of heater dimensions to accommodate a larger cathode. Cyclic testing of five 1/2 in. diameter heaters was initiated to validate these modified fabrication processes while retaining high reliability heaters. To date two of the heaters have been cycled to 10,000 cycles and suspended to preserve hardware. Three of the heaters have been cycled to failure, giving a B10 life of 12,615 cycles, approximately 6,000 more cycles than the established qualification B10 life of the ISS plasma contactor heaters.

  8. Verification Challenges of Dynamic Testing of Space Flight Hardware (United States)

    Winnitoy, Susan


    The Six Degree-of-Freedom Dynamic Test System (SDTS) is a test facility at the National Aeronautics and Space Administration (NASA) Johnson Space Center in Houston, Texas for performing dynamic verification of space structures and hardware. Some examples of past and current tests include the verification of on-orbit robotic inspection systems, space vehicle assembly procedures and docking/berthing systems. The facility is able to integrate a dynamic simulation of on-orbit spacecraft mating or demating using flight-like mechanical interface hardware. A force moment sensor is utilized for input to the simulation during the contact phase, thus simulating the contact dynamics. While the verification of flight hardware presents many unique challenges, one particular area of interest is with respect to the use of external measurement systems to ensure accurate feedback of dynamic contact. There are many commercial off-the-shelf (COTS) measurement systems available on the market, and the test facility measurement systems have evolved over time to include two separate COTS systems. The first system incorporates infra-red sensing cameras, while the second system employs a laser interferometer to determine position and orientation data. The specific technical challenges with the measurement systems in a large dynamic environment include changing thermal and humidity levels, operational area and measurement volume, dynamic tracking, and data synchronization. The facility is located in an expansive high-bay area that is occasionally exposed to outside temperature when large retractable doors at each end of the building are opened. The laser interferometer system, in particular, is vulnerable to the environmental changes in the building. The operational area of the test facility itself is sizeable, ranging from seven meters wide and five meters deep to as much as seven meters high. Both facility measurement systems have desirable measurement volumes and the accuracies vary

  9. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

    Yinghua (David) Guo


    The growth of the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e., capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification, and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e., what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection, and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  10. Flight testing vehicles for verification and validation of hypersonics technology (United States)

    Sacher, Peter W.


    Hypersonics technology has obtained renewed interest since various concepts for future completely reusable Space Transportation Systems (STS) using airbreathing propulsion for the atmospheric portions of flight have been proposed in different countries (e.g., the US, CIS, Japan, France, Germany, and the UK). To cover major developments in those countries, AGARD FDP has formed Working Group 18 on 'Hypersonic Experimental and Computational Capabilities - Improvement and Validation'. Of major importance for the proof of feasibility of all these concepts is the definition of an overall convincing philosophy for a 'hypersonics technology development and verification concept' using ground simulation facilities (both experimental and numerical) and flight testing vehicles. Flying at hypersonic Mach numbers using airbreathing propulsion requires highly sophisticated design tools to provide reliable prediction of thrust minus aerodynamic drag to accelerate the vehicle during ascent. Using these design tools, existing uncertainties have to be minimized by a carefully performed code validation process. To a large degree, the database required for this validation cannot be obtained on the ground. In addition, thermal loads due to hypersonic flow have to be predicted accurately by aerothermodynamic flow codes to provide the inputs needed to decide on materials and structures. Heat management for hypersonic flight vehicles is one of the key issues for any kind of successful flight demonstration. This paper identifies and discusses the role of flight testing during the verification and validation process of advanced hypersonic technology needed for flight in the atmosphere at hypersonic Mach numbers using airbreathing propulsion systems, both for weapons and for space transportation systems.

  11. Battery Technology Life Verification Test Manual Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Jon P. Christophersen


    The purpose of this Technology Life Verification Test (TLVT) Manual is to help guide developers in their effort to successfully commercialize advanced energy storage devices such as battery and ultracapacitor technologies. The experimental design and data analysis discussed herein are focused on automotive applications based on the United States Advanced Battery Consortium (USABC) electric vehicle, hybrid electric vehicle, and plug-in hybrid electric vehicle (EV, HEV, and PHEV, respectively) performance targets. However, the methodology can be equally applied to other applications as well. This manual supersedes the February 2005 version of the TLVT Manual (Reference 1). It includes criteria for statistically based life test matrix designs as well as requirements for test data analysis and reporting. Calendar life modeling and estimation techniques, including a user's guide to the corresponding software tool, are now provided in the Battery Life Estimator (BLE) Manual (Reference 2).

  12. In-Space Engine (ISE-100) Development - Design Verification Test (United States)

    Trinh, Huu P.; Popp, Chris; Bullard, Brad


    In the past decade, NASA has formulated science mission concepts in anticipation of landing spacecraft on the lunar surface, meteoroids, and other planets. Advancing thruster technology for spacecraft propulsion systems has been considered for maximizing science payload. Starting in 2010, development of the In-Space Engine (designated ISE-100) has been carried out. The ISE-100 thruster design is based on heritage Missile Defense Agency (MDA) technology, aimed at a lightweight system that is efficient in terms of volume and packaging. It runs on a hypergolic bi-propellant system, MON-25 (nitrogen tetroxide, N2O4, with 25% nitric oxide, NO) and MMH (monomethylhydrazine, CH6N2), for NASA spacecraft applications. This propellant combination provides a propulsion system capable of operating over a wide range of temperatures, from 50 C (122 F) down to -30 C (-22 F), to drastically reduce heater power. The thruster is designed to deliver 100 lb(sub f) of thrust with the capability of pulse-mode operation for a wide range of mission duty cycles (MDCs). Two thrusters were fabricated. As part of the engine development, this test campaign is dedicated to design verification of the thruster. This presentation reports the efforts of the design verification hot-fire test program of the ISE-100 thruster, a collaboration between the NASA Marshall Space Flight Center (MSFC) and Aerojet Rocketdyne (AR) test teams. The hot-fire tests were conducted at the Advanced Mobile Propulsion Test (AMPT) facility in Durango, Colorado, from May 13 to June 10, 2016. This presentation also provides a summary of key points from the test results.

  13. Normed algebras and the geometric series test

    Directory of Open Access Journals (Sweden)

    Robert Kantrowitz


    Full Text Available The purpose of this article is to survey a class of normed algebras that share many central features of Banach algebras, save for completeness. The likeness of these algebras to Banach algebras derives from the fact that the geometric series test is valid, whereas the lack of completeness points to the failure of the absolute convergence test for series in the algebra. Our main result is a compendium of conditions that are all equivalent to the validity of the geometric series test for commutative unital normed algebras. Several examples in the final section showcase some incomplete normed algebras for which the geometric series test is valid, and still others for which it is not.
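    For orientation, the geometric series test in question can be stated as follows; this is the standard formulation for a unital normed algebra, not a quotation from the article:

```latex
% Geometric series test in a unital normed algebra (A, \|\cdot\|):
% if \|x\| < 1, then 1 - x is invertible, with
(1 - x)^{-1} \;=\; \sum_{n=0}^{\infty} x^{n},
\qquad
\bigl\| (1 - x)^{-1} \bigr\| \;\le\; \frac{1}{1 - \|x\|}.
```

    In a Banach algebra this follows from absolute convergence; the article's point is that the conclusion can hold in incomplete normed algebras as well.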

  14. Multi-frac test series. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, R A; Warpinski, N R; Finley, S J; Shear, R C


    This paper describes a series of five full-scale tests performed to evaluate various multi-frac concepts. The tests were conducted at the Nevada Test Site in horizontal boreholes drilled in ash-fall tuff from a tunnel under 1300 ft of overburden.

  15. Active Thermal Control Experiments for LISA Ground Verification Testing (United States)

    Higuchi, Sei; DeBra, Daniel B.


    The primary mission goal of LISA is detecting gravitational waves. LISA uses laser metrology to measure the distance between proof masses in three identical spacecraft. The total acceleration disturbance to each proof mass is required to be below 3 × 10^-15 m/s^2/√Hz. Optical path length variations on each optical bench must be kept below 40 pm/√Hz over 1 Hz to 0.1 mHz. Thermal variations due to, for example, solar radiation or temperature gradients across the proof mass housing will distort the spacecraft, causing changes in the mass attraction and sensor location. We have developed a thermal control system for LISA gravitational reference sensor (GRS) ground verification testing that provides thermal stability better than 1 mK/√Hz at low frequencies; similar active control is of interest for the LISA spacecraft itself to compensate for solar irradiation. A thermally stable environment is strongly demanded for LISA performance verification. In a lab environment, specifications can be met with a considerable amount of insulation and thermal mass. For spacecraft, the very limited thermal mass calls for an active control system that can meet disturbance rejection and stability requirements simultaneously in the presence of long time delay. A simple proportional plus integral control law presently provides approximately 1 mK/√Hz of thermal stability for over 80 hours. Continuing development of a model predictive feed-forward algorithm will extend performance to below 1 mK/√Hz at f < 1 mHz and lower.
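    The proportional-plus-integral control law described above can be sketched as a discrete-time loop. The gains, time step, and first-order plant model below are illustrative assumptions, not values from the LISA testbed:

```python
# Minimal discrete-time PI temperature regulator (all values illustrative).
def simulate_pi(kp=2.0, ki=0.5, dt=1.0, steps=200, setpoint=0.0):
    """Drive a toy first-order thermal plant toward `setpoint` (mK offset)."""
    temp = 5.0          # initial temperature error, mK (assumed)
    integral = 0.0
    history = []
    for _ in range(steps):
        error = setpoint - temp
        integral += error * dt
        heater = kp * error + ki * integral      # PI control law
        # Toy plant: slow passive relaxation plus heater response (assumed).
        temp += dt * (-0.05 * temp + 0.1 * heater)
        history.append(temp)
    return history

final = simulate_pi()[-1]   # residual error after the run
```

    With these gains the loop is stable (the closed-loop eigenvalues have magnitude below 1), so the initial 5 mK offset decays toward zero; a feed-forward term, as in the abstract, would further reject predictable disturbances.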

  16. Verification Testing: Meet User Needs Figure of Merit (United States)

    Kelly, Bryan W.; Welch, Bryan W.


    Verification is the process through which modeling and simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that the software accurately models and represents the real-world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as of how accurate and reliable the test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand why the space shuttle Columbia failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the data necessary to make an informed decision were not available, and the result was the complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA published its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed its recommendations, requirements, and rationale for the different components of VV&C. The intention is that people receiving M&S software can clearly understand the past development effort and have data from it, allowing those who have not worked with the software before to move forward with greater confidence and efficiency. This particular project performs Verification on several MATLAB (registered trademark of The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts.
This is intended to prevent the code from attempting to make incorrect or impossible

  17. Software Testing and Verification in Climate Model Development (United States)

    Clune, Thomas L.; Rood, RIchard B.


    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
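    A fine-grained "unit" test of the kind advocated can be illustrated in a few lines. The routine under test, a simplified Magnus-type saturation vapor pressure formula, is a hypothetical stand-in, not code from any actual climate model:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure over water, hPa (simplified Magnus form)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def test_saturation_vapor_pressure():
    # Anchor point: the formula returns exactly 6.112 hPa at 0 degrees C.
    assert abs(saturation_vapor_pressure(0.0) - 6.112) < 1e-9
    # Physical sanity check: warmer air supports higher vapor pressure.
    assert saturation_vapor_pressure(20.0) > saturation_vapor_pressure(10.0)

test_saturation_vapor_pressure()
```

    Tests at this granularity isolate a defect to a single routine, whereas a full-simulation regression test can only signal that something, somewhere, changed.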

  18. Testing Equation Method Modification for Demanding Energy Measurements Verification

    Directory of Open Access Journals (Sweden)

    Elena Kochneva


    Full Text Available The paper is devoted to mathematical approaches for the verification of measurements received from Automatic Meter Reading systems. The reliability of metering data can be improved by application of a new approach named the Energy Flow Problem. The paper considers a demanding energy measurements verification method based on the analysis of groups of verification expressions. Bad data detection and calculation of estimate accuracy are presented using Automatic Meter Reading system data from a fragment of the Russian power system.

  19. HDTS 2017.1 Testing and Verification Document

    Energy Technology Data Exchange (ETDEWEB)

    Whiteside, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)


    This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents (Foley and Powell, 2010; Dixon, 2012; Whiteside, 2017b). In this report we have created a suite of automated test cases and a system to analyze the results of those tests, and have documented the methodology to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect.

  20. Computer Science and Technology: Validation, Verification, and Testing for the Individual Programmer. (United States)

    Branstad, Martha A.; And Others

    Guidelines are given for program testing and verification to ensure quality software for the programmer working alone in a computing environment with limited resources. The emphasis is on verification as an integral part of the software development. Guidance includes developing and planning testing as well as the application of other verification…

  1. Falcon series data report: 1987 LNG vapor barrier verification field trials

    Energy Technology Data Exchange (ETDEWEB)

    Brown, T.C.; Cederwall, R.T.; Chan, S.T.; Ermak, D.L.; Koopman, R.P.; Lamson, K.C.; McClure, J.W.; Morris, L.K.


    A series of five liquefied natural gas spills of up to 66 m³ in volume were performed on water within a vapor barrier structure at Frenchman Flat on the Nevada Test Site as part of a joint government/industry study. This data report presents a description of the tests, the test apparatus, the instrumentation, the meteorological conditions, and the data from the tests. 16 refs., 27 figs., 8 tabs.

  2. Patch testing with Indian standard series

    Directory of Open Access Journals (Sweden)

    Narendra G


    Full Text Available One hundred patients (61 males, 39 females) suspected to have allergic contact dermatitis were patch tested with the Indian standard series (ISS). Forty-four showed one or more positive reactions. The frequent sensitizers observed were nickel sulphate, 12 (15%); potassium dichromate, 11 (13.75%); cobalt chloride and colophony, 7 each (8.75%); and fragrance mix and thiuram mix, 6 each (7.5%). The ISS differs from the European standard series by the inclusion of propylene glycol, nitrofurazone, gentamicin, chlorocresol, PEG-400, and ethylenediamine chloride, whereas the sesquiterpene lactone mix and primin allergens are excluded.

  3. HDTS 2017.0 Testing and verification document

    Energy Technology Data Exchange (ETDEWEB)

    Whiteside, Tad S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)


    This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents including (Foley and Powell, 2010; Dixon, 2012). In this report we have created a suite of automated test cases and a system to analyze the results of those tests as well as documented the methodology to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect. These tests confirm HDTS version 2017.0 performs according to its specifications and documentation and that its performance meets the needs of its users at the Savannah River Site.

  4. MEDLI2 Do No Harm Test Series (United States)

    Swanson, G. T.; Santos, J. A.; White, T. R.; Bruce, W. E.; Kuhl, C. A.; Wright, H. S.


    Mars 2020 will fly the Mars Entry, Descent, and Landing Instrumentation II (MEDLI2) sensor suite, consisting of a total of seventeen instrumented thermal sensor plugs, eight pressure transducers, two heat flux sensors, and one radiometer embedded in the thermal protection system (TPS). Of the MEDLI2 instrumentation, eleven instrumented thermal plugs and seven pressure transducers will be installed on the heatshield of the Mars 2020 vehicle, while the rest will be installed on the backshell. The goal of the MEDLI2 instrumentation is to directly inform the large performance uncertainties that contribute to the design and validation of a Mars entry system. A better understanding of the entry environment and TPS performance could lead to reduced design margins, enabling a greater payload mass fraction and smaller landing ellipses. To prove that the MEDLI2 system will not degrade the performance of the Mars 2020 TPS, an Aerothermal Do No Harm (DNH) test series was designed and conducted. As on Mars 2020's predecessor, the Mars Science Laboratory (MSL), the heatshield material will be Phenolic Impregnated Carbon Ablator (PICA); the Mars 2020 entry conditions are enveloped by the MSL design environments, so the development and qualification testing performed during MEDLI is sufficient to show that the similar MEDLI2 heatshield instrumentation will not degrade PICA performance. However, given that MEDLI did not include any backshell instrumentation, the MEDLI2 team was required to design and execute a DNH test series utilizing the backshell TPS material (SLA-561V) with the intended flight sensor suite. To meet the requirements handed down from Mars 2020, the MEDLI2 DNH test series emphasized the interactions of the MEDLI2 sensors and sensing locations with the surrounding backshell TPS and substructure. These interactions were characterized by performing environmental testing of four 12" by 12" test panels, which mimicked the construction of the backshell TPS and the

  5. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)


    This is the '94 annual report of the CANDU advanced fuel verification test project. The report describes the out-of-pile hydraulic tests at the CANDU-Hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows: (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) pressure drop tests at reactor operating conditions; (b) strength test during reload under static load; (c) impact test during reload under impact load; (d) endurance test for verification of fuel integrity during lifetime. (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) critical channel power sensitivity analysis; (b) CANDU-6 channel flow analysis; (c) flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author).

  6. A standardized framework for the validation and verification of clinical molecular genetic tests.

    NARCIS (Netherlands)

    Mattocks, C.J.; Morris, M.A.; Matthijs, G.; Swinnen, E.; Corveleyn, A.; Dequeker, E.; Muller, C.R.; Pratt, V.; Wallace, A.


    The validation and verification of laboratory methods and procedures before their use in clinical testing is essential for providing a safe and useful service to clinicians and patients. This paper outlines the principles of validation and verification in the context of clinical human molecular

  7. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory


    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  8. A standardized framework for the validation and verification of clinical molecular genetic tests. (United States)

    Mattocks, Christopher J; Morris, Michael A; Matthijs, Gert; Swinnen, Elfriede; Corveleyn, Anniek; Dequeker, Els; Müller, Clemens R; Pratt, Victoria; Wallace, Andrew


    The validation and verification of laboratory methods and procedures before their use in clinical testing is essential for providing a safe and useful service to clinicians and patients. This paper outlines the principles of validation and verification in the context of clinical human molecular genetic testing. We describe implementation processes, types of tests and their key validation components, and suggest some relevant statistical approaches that can be used by individual laboratories to ensure that tests are conducted to defined standards.
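    As a minimal illustration of the kind of statistical summary a laboratory might compute during test verification, the sketch below reports sensitivity and specificity with Wald confidence intervals; the counts and the function itself are invented for illustration, not taken from the paper:

```python
import math

def sens_spec(tp, fn, tn, fp, z=1.96):
    """Point estimates and Wald 95% CIs for sensitivity and specificity."""
    def prop_ci(successes, n):
        p = successes / n
        half = z * math.sqrt(p * (1 - p) / n)   # normal-approximation half-width
        return p, max(0.0, p - half), min(1.0, p + half)
    return {"sensitivity": prop_ci(tp, tp + fn),
            "specificity": prop_ci(tn, tn + fp)}

# Hypothetical verification run: 100 affected and 100 unaffected samples.
result = sens_spec(tp=95, fn=5, tn=98, fp=2)
```

    For estimates near 100%, as is common in genetic testing, an exact or Wilson interval is usually preferred over the Wald interval sketched here.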

  9. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  10. The JPSS Ground Project Algorithm Verification, Test and Evaluation System (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.


    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDR's) generated by sensors on the S-NPP into calibrated, geo-located Sensor Data Records (SDR's) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning, instrument and product calibration, data quality support and monitoring, and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCT's) and Look-Up Tables (LUT's) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGE's) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  11. A robust method using propensity score stratification for correcting verification bias for binary tests. (United States)

    He, Hua; McDermott, Michael P


    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified.
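    The stratified correction described above can be sketched as follows. The data, stratum count, and helper names are invented for illustration; for brevity a single propensity scale is used, whereas the paper estimates the propensity separately for test-positive and test-negative subjects:

```python
def stratified_disease_rate(subjects, n_strata=4):
    """Estimate P(disease) from partially verified data via propensity strata.

    `subjects` is a list of (propensity, disease) pairs, with disease None
    when the subject was never verified.  Within each stratum the verified
    subjects stand in for everyone in it (the MAR assumption); strata with
    no verified subjects are skipped, a simplification of the full method.
    """
    ranked = sorted(subjects, key=lambda s: s[0])
    size = len(ranked)
    total = 0.0
    for k in range(n_strata):
        stratum = ranked[k * size // n_strata:(k + 1) * size // n_strata]
        verified = [d for _, d in stratum if d is not None]
        if verified:
            total += (sum(verified) / len(verified)) * len(stratum)
    return total / size

def corrected_sensitivity(data, n_strata=4):
    """Verification-bias-corrected sensitivity P(T+ | D+).

    `data` is a list of (test, propensity, disease) triples.
    """
    positives = [(p, d) for t, p, d in data if t == 1]
    everyone = [(p, d) for t, p, d in data]
    p_pos_and_dis = (stratified_disease_rate(positives, n_strata)
                     * len(positives) / len(data))
    return p_pos_and_dis / stratified_disease_rate(everyone, n_strata)

# Toy fully verified data set: 4 diseased test-positives, 1 diseased
# test-negative, 5 healthy test-negatives; naive sensitivity = 4/5.
data = [(1, 0.9, 1)] * 4 + [(0, 0.5, 1)] + [(0, 0.3, 0)] * 5
sens = corrected_sensitivity(data)
```

    With complete verification the corrected estimate reduces to the naive one; the correction matters when verification depends on the test result and covariates.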

  12. Verification Test of Automated Robotic Assembly of Space Truss Structures (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.


    A multidisciplinary program has been conducted at the Langley Research Center to develop operational procedures for supervised autonomous assembly of truss structures suitable for large-aperture antennas. The hardware and operations required to assemble a 102-member tetrahedral truss and attach 12 hexagonal panels were developed and evaluated. A brute-force automation approach was used to develop baseline assembly hardware and software techniques. However, as the system matured and operations were proven, upgrades were incorporated and assessed against the baseline test results. These upgrades included the use of distributed microprocessors to control dedicated end-effector operations, machine vision guidance for strut installation, and the use of an expert system-based executive-control program. This paper summarizes the developmental phases of the program, the results of several assembly tests, and a series of proposed enhancements. No problems that would preclude automated in-space assembly of truss structures have been encountered. The test system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  13. Developing Reading and Listening Comprehension Tests Based on the Sentence Verification Technique (SVT). (United States)

    Royer, James M.


    Describes a team-based approach for creating Sentence Verification Technique (SVT) tests, a development procedure that allows teachers and other school personnel to develop comprehension tests from curriculum materials in use in their schools. Finds that if tests are based on materials that are appropriate for the population to be tested, the…

  14. Towards a Theory for Integration of Mathematical Verification and Empirical Testing (United States)

    Lowry, Michael; Boyd, Mark; Kulkarni, Deepak


    From the viewpoint of a project manager responsible for the V&V (verification and validation) of a software system, mathematical verification techniques provide a possibly useful orthogonal dimension to otherwise standard empirical testing. However, the value they add to an empirical testing regime both in terms of coverage and in fault detection has been difficult to quantify. Furthermore, potential cost savings from replacing testing with mathematical verification techniques cannot be realized until the tradeoffs and synergies can be formulated. Integration of formal verification with empirical testing is also difficult because the idealized view of mathematical verification providing a correctness proof with total coverage is unrealistic and does not reflect the limitations imposed by computational complexity of mathematical techniques. This paper first describes a framework based on software reliability and formalized fault models for a theory of software design fault detection - and hence the utility of various tools for debugging. It then describes a utility model for integrating mathematical and empirical techniques with respect to fault detection and coverage analysis. It then considers the optimal combination of black-box testing, white-box (structural) testing, and formal methods in V&V of a software system. Using case studies from NASA software systems, it then demonstrates how this utility model can be used in practice.

  15. How to deal with double partial verification when evaluating two index tests in relation to a reference test?

    NARCIS (Netherlands)

    van Geloven, Nan; Brooze, Kimiko A.; Opmeer, Brent C.; Mol, Ben Willem; Zwinderman, Aeilko H.


    Research into the diagnostic accuracy of clinical tests is often hampered by single or double partial verification mechanisms, that is, not all patients have their disease status verified by a reference test, neither do all patients receive all tests under evaluation (index tests). We show methods

  16. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman


    The modeling of Amchitka underground nuclear tests conducted in 2002 is verified and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure of the subsurface, and bathymetric surveys to determine the bathymetric maps of the areas offshore from the Long Shot and Cannikin Sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output verification. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. 
Comparisons between new data and the original model, and conditioning on all available data using MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
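    The Bayesian MCMC conditioning described above can be sketched with a minimal random-walk Metropolis sampler. The Gaussian prior, likelihood, and observations below are invented stand-ins for the actual groundwater model, chosen only to show the accept/reject mechanics:

```python
import math
import random

def metropolis(log_post, x0, n_samples=5000, step=0.5, seed=1):
    """Random-walk Metropolis sampler for a 1-D posterior density."""
    rng = random.Random(seed)
    x, samples = x0, []
    lp = log_post(x)
    for _ in range(n_samples):
        cand = x + rng.gauss(0.0, step)          # propose a nearby value
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:  # Metropolis accept/reject
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Invented example: weak N(0, 10^2) prior on a porosity-like parameter,
# Gaussian likelihood from four observations with unit noise.
obs = [1.8, 2.1, 2.3, 1.9]
def log_post(theta):
    log_prior = -theta ** 2 / (2 * 10.0 ** 2)
    log_lik = -sum((y - theta) ** 2 for y in obs) / 2.0
    return log_prior + log_lik

samples = metropolis(log_post, x0=0.0)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
```

    As in the abstract, the chain converges to a stationary posterior: the diffuse prior is pulled toward the data mean near 2.0, and predictions would then be made by running the model over these posterior samples.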

  17. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  18. 78 FR 33132 - Quality Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test... (United States)


    ... COMMISSION Quality Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test... Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test Reactors.'' This guide... plate-type uranium-aluminum fuel elements used in research and test reactors (RTRs). ADDRESSES: Please...

  19. Comparison of intensities and rest periods for VO2max verification testing procedures. (United States)

    Nolan, P B; Beaven, M L; Dalleck, L


    We sought to determine the incidence of 'true' VO2max confirmation with the verification procedure across different protocols. 12 active participants (men n=6, women n=6) performed, in random order, 4 different maximal graded exercise tests (GXT) and verification bout protocols on 4 separate days. Conditions for the rest period and verification bout intensity were: A - 105% intensity, 20 min rest; B - 105% intensity, 60 min rest; C - 115% intensity, 20 min rest; D - 115% intensity, 60 min rest. VO2max confirmation (agreement between the peak VO2 of the GXT and that of the verification trial) differed with verification bout intensity within the 20 min recovery condition (χ2(1)=4.800, p<0.05), but did not differ between rest periods. We recommend the use of 105% of the maximal GXT workload and 20 min rest periods when using verification trials to confirm VO2max in normally active populations. © Georg Thieme Verlag KG Stuttgart · New York.
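    A confirmation check of this kind can be sketched in a few lines. The 3% tolerance used below is a common convention in the VO2max verification literature and an assumption here, not the exact criterion from this study; the values are hypothetical:

```python
def vo2max_confirmed(gxt_peak, verification_peak, tol_pct=3.0):
    """Treat VO2max as confirmed when the verification-bout peak VO2 does
    not exceed the GXT peak by more than `tol_pct` percent (assumed rule)."""
    return verification_peak <= gxt_peak * (1.0 + tol_pct / 100.0)

# Hypothetical peak VO2 values in mL/kg/min.
confirmed = vo2max_confirmed(gxt_peak=52.0, verification_peak=53.0)
not_confirmed = vo2max_confirmed(gxt_peak=52.0, verification_peak=55.0)
```

    A verification peak well above the GXT peak suggests the graded test ended before a true maximum, so the "plateau" seen there was not VO2max.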

  20. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols (United States)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio


    This paper covers verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, including the formulation of new qualitative and quantitative measures and of time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  1. Verification test calculations for the Source Term Code Package

    Energy Technology Data Exchange (ETDEWEB)

    Denning, R S; Wooton, R O; Alexander, C A; Curtis, L A; Cybulskis, P; Gieseke, J A; Jordan, H; Lee, K W; Nicolosi, S L


    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs.
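    A hand-calculation-style conservation check of the kind described can be expressed in a few lines; the region names and masses below are invented, and the point is only the bookkeeping that such a check automates:

```python
def mass_is_conserved(inventory_before, inventory_after, tol=1e-6):
    """Check that total mass (kg) is unchanged across a code step,
    within a relative tolerance."""
    total_before = sum(inventory_before.values())
    total_after = sum(inventory_after.values())
    return abs(total_after - total_before) <= tol * max(total_before, 1.0)

# Invented example: core material relocating between regions during meltdown.
before = {"core": 100_000.0, "lower_plenum": 0.0, "containment": 0.0}
after = {"core": 60_000.0, "lower_plenum": 39_999.9999, "containment": 0.0001}
ok = mass_is_conserved(before, after)
```

    As the abstract notes, such checks are good at exposing a code that violates a conservation law, but passing them says nothing about whether the physics models represent reality; that is the separate job of validation.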

  2. Validation of Test Methods for Air Leak Rate Verification of Spaceflight Hardware (United States)

    Oravec, Heather Ann; Daniels, Christopher C.; Mather, Janice L.


    As deep space exploration continues to be the goal of NASA's human spaceflight program, verification of the performance of spaceflight hardware becomes increasingly critical. Suitable test methods for verifying the leak rate of sealing systems are identified in program qualification testing requirements. One acceptable method for verifying the air leak rate of gas pressure seals is the tracer gas leak detector method. In this method, a tracer gas (commonly helium) leaks past the test seal and is transported to the leak detector where the leak rate is quantified. To predict the air leak rate, a helium-to-air conversion factor is applied depending on the magnitude of the helium flow rate. The conversion factor is based on either the molecular mass ratio or the ratio of the dynamic viscosities. The current work was aimed at validating this approach for permeation-level leak rates using a series of tests with a silicone elastomer O-ring. An established pressure decay method with constant differential pressure was used to evaluate both the air and helium leak rates of the O-ring under similar temperature and pressure conditions. The results from the pressure decay tests showed, for the elastomer O-ring, that neither the molecular flow nor the viscous flow helium-to-air conversion factors were applicable. Leak rate tests were also performed using nitrogen and argon as the test gas. Molecular mass and viscosity based helium-to-test-gas conversion factors were applied, but did not correctly predict the measured leak rates of either gas. To further this study, the effect of pressure boundary conditions was investigated. Often, pressure decay leak rate tests are performed at a differential pressure of 101.3 kPa with atmospheric pressure on the downstream side of the test seal. In space applications, the differential pressure is similar, but with vacuum as the downstream pressure. The same O-ring was tested at four unique differential pressures ranging from 34.5 to 137.9 kPa.
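The helium-to-air conversion the abstract describes can be sketched numerically. The gas-property values below are approximate room-temperature figures, and the function itself is a hypothetical illustration of the two standard scaling rules, not a procedure from the study:

```python
import math

# Approximate gas properties near 20 °C (assumed values for illustration)
M_AIR, M_HE = 28.97, 4.003          # molar mass, g/mol
MU_AIR, MU_HE = 18.2e-6, 19.6e-6    # dynamic viscosity, Pa·s

def air_leak_from_helium(q_he, regime):
    """Predict an air leak rate from a measured helium leak rate.

    Molecular flow: Q scales as 1/sqrt(M), so Q_air = Q_He * sqrt(M_He/M_air).
    Viscous (laminar) flow: Q scales as 1/mu, so Q_air = Q_He * mu_He/mu_air.
    """
    if regime == "molecular":
        return q_he * math.sqrt(M_HE / M_AIR)
    if regime == "viscous":
        return q_he * MU_HE / MU_AIR
    raise ValueError("regime must be 'molecular' or 'viscous'")

q_he = 1.0e-6  # measured helium leak rate, Pa·m³/s
print(air_leak_from_helium(q_he, "molecular"))  # ≈ 3.7e-7
print(air_leak_from_helium(q_he, "viscous"))    # ≈ 1.08e-6
```

Note how far apart the two factors sit (roughly 2.7× in molecular flow vs. 0.93× in viscous flow), which is why identifying the flow regime matters; the study's finding was that for permeation-level elastomer leakage neither rule held.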

  3. Dynamic Isotope Power System: technology verification phase. Test plan. 79-KIPS-6

    Energy Technology Data Exchange (ETDEWEB)

    Mohr, G.D.


    The objective of this document is to outline the test plan for the KIPS Technology Verification Program. The test plan covers component-simulating (rig) testing, component testing, and system testing. Rig testing will prove concept feasibility, measure basic performance, and develop the hardware necessary prior to initiation of GDS component part manufacture. Component testing will measure basic performance and verify component integrity prior to GDS assembly. The GDS system testing will: simulate the flight system operation; determine the life-limiting components; measure performance and relate it to potential system lifetime; demonstrate 18+% DC generating efficiency; and perform a 5000-h endurance test with final-configuration hardware.

  4. Effect of verification bias on the sensitivity of fecal occult blood testing: a meta-analysis. (United States)

    Rosman, Alan S; Korsten, Mark A


    There is controversy regarding the sensitivity of fecal occult blood tests (FOBT) for detecting colorectal cancer. Many of the published studies failed to correct for verification bias, which may have inflated the sensitivity. A meta-analysis of published studies evaluating the sensitivity and specificity of chemical-based FOBT for colorectal cancer was performed. Studies were included if both cancer and control subjects underwent confirmatory testing. We also included studies that attempted to correct for verification bias by either performing colonoscopy on all subjects regardless of the FOBT result or by using longitudinal follow-up. We then compared the sensitivity, specificity, and other diagnostic characteristics of the studies that attempted to correct for verification bias (n=10) vs. those that did not correct for this bias (n=19). The pooled sensitivity of guaiac-based FOBT for colorectal cancer in studies without verification bias was significantly lower than in studies with this bias [0.36 (95% CI 0.25-0.47) vs. 0.70 (95% CI 0.60-0.80), p=0.001]. The pooled specificity of the studies without verification bias was higher [0.96 (95% CI 0.94-0.97) vs. 0.88 (95% CI 0.84-0.91), p<0.005]. There was no significant difference in the area under the summary receiver operating characteristic curves. More sensitive chemical-based FOBT methods (e.g., Hemoccult® SENSA®) had a higher sensitivity but a lower specificity than standard guaiac methods. The sensitivity of guaiac-based FOBT for colorectal cancer has been overestimated as a result of verification bias. This test may not be sensitive enough to serve as an effective screening option for colorectal cancer.

  5. Verification Testing to Confirm VO2max in Altitude-Residing, Endurance-Trained Runners. (United States)

    Weatherwax, R M; Richardson, T B; Beltz, N M; Nolan, P B; Dalleck, L


    We sought to explore the utility of the verification trial to confirm individual attainment of 'true' VO2max in altitude-residing, endurance-trained runners during treadmill exercise. 24 elite endurance-trained men and women runners (age=21.5±3.3 yr, ht=174.8±9.3 cm, body mass=60.5±6.7 kg, PR 800 m 127.5±13.1 s) completed a graded exercise test (GXT) trial (VO2max=60.0±5.8 mL·kg(-1)·min(-1)), and returned 20 min after incremental exercise to complete a verification trial (VO2max=59.6±5.7 mL·kg(-1)·min(-1)) of constant load, supramaximal exercise. The incidence of 'true' VO2max confirmation using the verification trial was 24/24 (100%) with all participants revealing differences in VO2max≤3% (the technical error of our equipment) between the GXT and verification trials. These findings support use of the verification trial to confirm VO2max attainment in altitude-residing, endurance-trained runners. © Georg Thieme Verlag KG Stuttgart · New York.
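The study's confirmation criterion (a verification-trial VO2max within the 3% technical error of the GXT value) reduces to a one-line check. The sketch below is a hypothetical illustration using the group means reported above, not the authors' analysis code:

```python
def vo2max_confirmed(gxt, verification, tolerance=0.03):
    """'True' VO2max is confirmed when the verification-trial value lies
    within the equipment's technical error (here 3%) of the GXT value."""
    return abs(verification - gxt) / gxt <= tolerance

# Group means from the abstract: GXT 60.0, verification 59.6 mL·kg⁻¹·min⁻¹
print(vo2max_confirmed(60.0, 59.6))  # True: difference is about 0.7%
```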

  6. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen... (United States)


    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Contamination with Microorganisms... § 381.94 Contamination with Microorganisms; process control verification criteria and testing; pathogen... maintaining process controls sufficient to prevent fecal contamination. FSIS shall take further action as...

  7. Analysis, Test and Verification in The Presence of Variability (Dagstuhl Seminar 13091)

    DEFF Research Database (Denmark)


    This report documents the program and the outcomes of Dagstuhl Seminar 13091 “Analysis, Test and Verification in The Presence of Variability”. The seminar had the goal of consolidating and stimulating research on analysis of software models with variability, enabling the design of variability-awa...

  8. 40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements. (United States)


    ... ENGINES (CONTINUED) General Compliance Provisions for Control of Air Pollution From New and In-Use Light... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Manufacturer in-use verification testing requirements. 86.1845-01 Section 86.1845-01 Protection of Environment ENVIRONMENTAL PROTECTION...

  9. 40 CFR 86.1845-04 - Manufacturer in-use verification testing requirements. (United States)


    ... ENGINES (CONTINUED) General Compliance Provisions for Control of Air Pollution From New and In-Use Light... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Manufacturer in-use verification testing requirements. 86.1845-04 Section 86.1845-04 Protection of Environment ENVIRONMENTAL PROTECTION...


  11. Development and verification of a reciprocating test rig designed for investigation of piston ring tribology

    DEFF Research Database (Denmark)

    Pedersen, Michael Torben; Imran, Tajammal; Klit, Peder


    This paper describes the development and verification of a reciprocating test rig, which was designed to study the piston ring tribology. A crank mechanism is used to generate a reciprocating motion for a moving plate, which acts as the liner. A stationary block acting as the ring package is loaded...... against the plate using dead-weights. The block has two holders for test specimens, which form line contacts with the plate. A force transducer is used to measure the frictional force between the block and the plate. During verification of the test rig unwanted ripples on the signal recorded from...... the force transducer were discovered. An identification process is undertaken in order to find the source of this disturbance and to reduce the effect as much as possible. Second a reproducibility test is conducted to check the reliability of the test rig. The outcome of this work is a verified test rig...

  12. RELAP5-3D Restart and Backup Verification Testing

    Energy Technology Data Exchange (ETDEWEB)

    Mesina, George L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)


    Existing testing methodology for RELAP5-3D employs a set of test cases collected over two decades to test a variety of code features, run on Linux or Windows platforms. However, this set has numerous deficiencies in terms of code coverage, detail of comparison, running time, and fidelity of testing of the RELAP5-3D restart and backup capabilities. The test suite covers less than three quarters of the lines of code in the relap directory and just over half of those in the environmental library. Even in terms of code features, many are not covered. Moreover, the test set runs many problems long past the point necessary to test the relevant features, and requires standard problems to run to completion. This is unnecessary for features that can be tested in a short-running problem; for example, many trips and controls can be tested in the first few time steps, as can a number of fluid flow options. The testing system is also inaccurate. For the past decade, the diffem script has been the primary tool for checking that printouts from two different RELAP5-3D executables agree. This tool compares two output files to verify that all characters are the same except for those relating to date, time, and a few other excluded items. The variable values printed in the output file are accurate to no more than eight decimal places; therefore, calculations with errors in decimal places beyond those printed remain undetected. Finally, fidelity of restart is not tested except in the PVM sub-suite, and backup is not specifically tested at all. When a restart is made from any midway point of the base-case transient, the restart must produce the same values. When a backup condition occurs, the code repeats advancements with the same time step. A perfect backup can be tested by forcing RELAP5 to perform a backup by falsely setting a backup condition flag at a user-specified time; the calculations of that run should match those produced by the same input without the spurious condition.
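A diffem-style comparison can be sketched as follows. The regular expression and the sample output lines are hypothetical, but the idea, character-level equality except for excluded date/time items, matches the description above (and inherits the same limitation: digits beyond the printed precision are never compared):

```python
import re

# Lines containing run metadata (dates, times) are excluded from comparison.
# The timestamp pattern here is an assumed format, not RELAP5-3D's actual one.
EXCLUDE = re.compile(r"\d{2}/\d{2}/\d{4}|\d{2}:\d{2}:\d{2}")

def outputs_match(lines_a, lines_b):
    """Compare two output listings line by line, skipping excluded items."""
    if len(lines_a) != len(lines_b):
        return False
    for a, b in zip(lines_a, lines_b):
        if EXCLUDE.search(a) and EXCLUDE.search(b):
            continue  # both lines carry run metadata; ignore differences
        if a != b:
            return False
    return True

base  = ["RELAP5-3D output", "run 01/02/2024 10:00:00", "p = 1.01325e+05"]
rerun = ["RELAP5-3D output", "run 03/04/2024 11:30:00", "p = 1.01325e+05"]
print(outputs_match(base, rerun))  # True
```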

  13. DKIST enclosure modeling and verification during factory assembly and testing (United States)

    Larrakoetxea, Ibon; McBride, William; Marshall, Heather K.; Murga, Gaizka


    The Daniel K. Inouye Solar Telescope (DKIST, formerly the Advanced Technology Solar Telescope, ATST) is unique in that its Enclosure, apart from protecting the telescope and its instrumentation from the weather, holds the entrance aperture stop and is required to position it with millimeter-level accuracy. The compliance of the Enclosure design with the requirements, as of the Final Design Review in January 2012, was supported by mathematical models and other analyses, including structural and mechanical analyses (FEA), control models, ventilation analysis (CFD), thermal models, and reliability analysis. During Enclosure factory assembly and testing, compliance with the requirements was verified using the real hardware, and the models created during the design phase were revisited. The tests performed during shutter mechanism subsystem (crawler test stand) functional and endurance testing (completed summer 2013) and two comprehensive system-level factory acceptance testing campaigns (FAT#1 in December 2013 and FAT#2 in March 2014) included functional and performance tests on all mechanisms, off-normal mode tests, mechanism wobble tests, creation of the Enclosure pointing map, control system tests, and vibration tests. The comparison of the assumptions used during the design phase with the properties measured during the test campaign provides an interesting reference for future projects.

  14. Verification of test battery of motoric assumptions for tennis


    Křelina, Vladimír


    This thesis focuses on testing the motoric assumptions of junior-category tennis players in certain sport games. The aim of this thesis is to compare the results of the motoric tests of three tennis players of various performance levels in chosen sport games, and thus to define the substantive significance and specificity of each test with respect to tennis. The assumptions in the theoretical part are based on my Bachelor thesis. In that thesis I deal with the characteristics of tennis, the s...

  15. Small-scale fixed wing airplane software verification flight test (United States)

    Miller, Natasha R.

    The increased demand for micro Unmanned Air Vehicles (UAVs), driven by military requirements, commercial use, and academia, is creating a need for the ability to quickly and accurately conduct low-Reynolds-number aircraft design. There exist several open-source software programs, free or inexpensive, that can be used for large-scale aircraft design, but few software programs target the realm of low-Reynolds-number flight. XFLR5 is an open-source, free-to-download software program that attempts to take into consideration the viscous effects that occur at low Reynolds number in airfoil design, 3D wing design, and 3D airplane design. An off-the-shelf, remote-control airplane was used as a test bed to model in XFLR5 and then compared to flight-test data. Flight testing focused on the stability modes of the 3D plane, specifically the phugoid mode. Design and execution of the flight tests were accomplished for the RC airplane using methodology from full-scale military airplane test procedures. Results from the flight tests were not conclusive in determining the accuracy of the XFLR5 software program; there were several sources of uncertainty that did not allow for a full analysis of the flight-test results. An off-the-shelf drone autopilot was used as the data-collection device for flight testing, and its precision and accuracy are unknown. Potential future work should investigate flight-test methods for small-scale UAV flight.
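For context, the period of the phugoid mode targeted by such flight tests can be roughly predicted from Lanchester's classical approximation, which depends only on trim airspeed and gravity. The 15 m/s airspeed below is an assumed value for a small RC airplane, not data from the study:

```python
import math

# Lanchester phugoid approximation: omega ≈ sqrt(2) * g / V,
# independent of aircraft geometry and mass.
g = 9.81   # gravitational acceleration, m/s^2
V = 15.0   # assumed trim airspeed for a small RC airplane, m/s

omega = math.sqrt(2) * g / V        # phugoid natural frequency, rad/s
period = 2 * math.pi / omega        # phugoid period, s
print(round(period, 1))  # ≈ 6.8 s
```

Such a short, lightly damped oscillation is hard to resolve cleanly with a hobby-grade autopilot logger, which is consistent with the uncertainty the abstract reports.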

  16. Testing for the Equality of Integration Orders of Multiple Series

    Directory of Open Access Journals (Sweden)

    Man Wang


    Testing for the equality of integration orders is an important topic in time series analysis because it constitutes an essential step in testing for (fractional) cointegration in the bivariate case. For the multivariate case, there are several versions of cointegration, and the version given in Robinson and Yajima (2002) has received much attention. In this definition, a time series vector is partitioned into several sub-vectors, and the elements in each sub-vector have the same integration order. Furthermore, this time series vector is said to be cointegrated if there exists a cointegration in any of the sub-vectors. Under such a circumstance, testing for the equality of integration orders constitutes an important problem. However, for multivariate fractionally integrated series, most tests focus on stationary and invertible series and become invalid under the presence of cointegration. Hualde (2013) overcomes these difficulties with a residual-based test for a bivariate time series. For the multivariate case, one possible extension of this test involves testing over an array of bivariate series, which becomes computationally challenging as the dimension of the time series increases. In this paper, a one-step residual-based test is proposed to deal with the multivariate case that overcomes the computational issue. Under certain regularity conditions, the test statistic has an asymptotic standard normal distribution under the null hypothesis of equal integration orders and diverges to infinity under the alternative. As reported in a Monte Carlo experiment, the proposed test possesses satisfactory sizes and powers.
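As a simplified illustration of comparing integration orders (not the one-step residual-based statistic proposed in the paper), the memory parameter of each series can be estimated by the classical log-periodogram (GPH) regression and the two estimates compared with a Wald-type statistic. The bandwidth choice and the independence assumption between the two estimates are simplifications:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the memory parameter d:
    regress log I(lam_j) on -2*log(2*sin(lam_j/2)) over the first m frequencies."""
    n = len(x)
    m = m or int(n ** 0.5)                      # conventional bandwidth choice
    lam = 2 * np.pi * np.arange(1, m + 1) / n   # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = -2 * np.log(2 * np.sin(lam / 2))
    slope, _ = np.polyfit(regressor, np.log(I), 1)
    se = np.pi / np.sqrt(24 * m)                # asymptotic standard error
    return slope, se

rng = np.random.default_rng(0)
x = rng.standard_normal(2048)             # white noise: d = 0
y = np.cumsum(rng.standard_normal(2048))  # random walk: d = 1
dx, se_x = gph_estimate(x)
dy, se_y = gph_estimate(y)

# Wald-type statistic for H0: d_x = d_y (ignores any cross-correlation)
z = (dx - dy) / np.sqrt(se_x ** 2 + se_y ** 2)
print(round(dx, 2), round(dy, 2), abs(z) > 1.96)
```

Unlike the paper's proposal, this pairwise construction scales poorly with dimension and is invalid under cointegration, which is exactly the gap the one-step residual-based test addresses.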

  17. TQAP for Verification of Qualitative Lead Test Kits (United States)

    There are lead-based paint test kits available to help home owners and contractors identify lead-based paint hazards before any Renovation, Repair, and Painting (RRP) activities take place so that proper health and safety measures can be enacted. However, many of these test kits ...

  18. Warm Water Oxidation Verification - Scoping and Stirred Reactor Tests

    Energy Technology Data Exchange (ETDEWEB)

    Braley, Jenifer C.; Sinkov, Sergey I.; Delegard, Calvin H.; Schmidt, Andrew J.


    Scoping tests to evaluate the effects of agitation and pH adjustment on simulant sludge agglomeration and uranium metal oxidation at ≈95 °C were performed under Test Instructions and as per sections 5.1 and 5.2 of this Test Plan prepared by AREVA. The thermal testing occurred during the week of October 4-9, 2010. The results are reported here. For this testing, two uranium-containing simulant sludge types were evaluated: (1) a full uranium-containing K West (KW) container sludge simulant consisting of nine predominant sludge components; (2) a 50:50 uranium-mole-basis mixture of uraninite [U(IV)] and metaschoepite [U(VI)]. This scoping study was conducted in support of the Sludge Treatment Project (STP) Phase 2 technology evaluation for the treatment and packaging of K-Basin sludge. The STP is managed by CH2M Hill Plateau Remediation Company (CHPRC) for the U.S. Department of Energy. Warm water (≈95 °C) oxidation of sludge, followed by immobilization, has been proposed by AREVA and is one of the alternative flowsheets being considered to convert uranium metal to UO₂ and eliminate H₂ generation during final sludge disposition. Preliminary assessments of warm water oxidation have been conducted, and several issues have been identified that can best be evaluated through laboratory testing. The scoping evaluation documented here was specifically focused on the issue of the potential formation of high-strength sludge agglomerates at the proposed 95 °C process operating temperature. Prior hydrothermal tests conducted at 185 °C produced significant physiochemical changes to genuine sludge, including the formation of monolithic concretions/agglomerates that exhibited shear strengths in excess of 100 kPa (Delegard et al. 2007).


    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  20. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders (United States)

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  1. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops (Version 1.4) (United States)

    This generic verification protocol provides a detailed method for conducting and reporting results from verification testing of pesticide application technologies. It can be used to evaluate technologies for their potential to reduce spray drift, hence the term “drift reduction t...

  2. Shaking table test and verification of development of an ...

    Indian Academy of Sciences (India)

    ... semi-active hydraulic damper (ASHD) is converted to an interaction element (IE) of active interaction control (AIC). Systemic equations of motion, control law and control rules of this proposed new AIC are studied in this research. A full-scale multiple-degrees-of-freedom shaking table test is conducted to verify the energy dissipation of ...

  3. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)


    This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure and increase monotonically with mean stress. Very little modulus degradation was discernible from elastic results even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and the observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including that for impact prediction.

  4. Final tests and performances verification of the European ALMA antennas (United States)

    Marchiori, Gianpietro; Rampini, Francesco


    The Atacama Large Millimeter Array (ALMA) is under erection in northern Chile. The array consists of a large number (up to 64) of 12 m diameter antennas and a number of smaller antennas, to be operated on the Chajnantor plateau at 5000 m altitude. The antennas will operate up to 950 GHz, so their mechanical requirements, in terms of surface accuracy, pointing precision, and dimensional stability, are very tight. The AEM consortium, constituted by Thales Alenia Space France, Thales Alenia Space Italy, European Industrial Engineering (EIE GROUP), and MT Mechatronics, is assembling and testing the 25 antennas. As of today, the first set of antennas has been delivered to ALMA for science. During the test phase with ESO and ALMA, the European antennas have shown excellent performance, comfortably meeting the specification requirements. The purpose of this paper is to present the different results obtained during the test campaign: surface accuracy, pointing error, fast motion capability, and residual delay. Also important were the test phases that led to the validation of the FE model, showing that the antenna performs with a better margin than predicted at the design level, thanks also to the assembly and integration techniques.

  5. Verification and application of the Iosipescu shear test method (United States)

    Walrath, D. E.; Adams, D. F.


    Finite element models were used to study the effects of notch angle variations on the stress state within an Iosipescu shear test specimen. These analytical results were also studied to determine the feasibility of using strain gage rosettes and a modified extensometer to measure shear strains in this test specimen. Analytical results indicate that notch angle variations produced only small differences in simulated shear properties. Both strain gage rosettes and the modified extensometer were shown to be feasible shear strain transducers for the test method. The Iosipescu shear test fixture was redesigned to incorporate several improvements, including accommodation of a 50 percent larger specimen for easier measurement of shear strain, a clamping mechanism to relax strict tolerances on specimen width, and a self-contained alignment tool for use during specimen installation. A set of in-plane and interlaminar shear properties was measured for three graphite fabric/epoxy composites of T300/934 material. The three weave patterns were Oxford, 5-harness satin, and 8-harness satin.

  6. Patch-testing with plastics and glues series allergens. (United States)

    Shmidt, Eugenia; Farmer, Sara A; Davis, Mark D P


    Few US studies have reported results of patch testing with plastics and glues. To report our institution's results of testing patients suspected of allergy to plastics and glues with a comprehensive plastics and glues series and to compare these results with previously published data. Retrospective review of results of patch-testing with plastics and glues allergens at our institution between 2000 and 2007. In total, 444 patients were patch-tested with up to 56 plastics and glues allergens in the specialized series and up to five plastics and glues allergens in a baseline series. Positive-reaction rates were compared to other patch testing reports. Of patients, 97 (22%) had irritant reactions, and 201 (45%) had at least one allergic reaction. Bis(2-dimethylaminoethyl) ether 1%, benzoyl peroxide 1%, epoxy resin, bisphenol F 0.25%, 2-hydroxyethyl methacrylate 2%, and 2-hydroxyethyl acrylate 0.1% had the highest allergy reaction rates. Testing with specialized series identified 193 patients with plastics and glues allergy, of whom 162 were not identified by testing with baseline series alone. For patients suspected of allergy to plastics and glues, patch-testing with specialized series of plastics and glues allergens is an important adjunct to patch-testing with baseline series.

  7. Interim report on verification and benchmark testing of the NUFT computer code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, K.H.; Nitao, J.J. [Lawrence Livermore National Lab., CA (United States); Kulshrestha, A. [Weiss Associates, Emeryville, CA (United States)


    This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasi-analytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.

  8. [Implication of inverse-probability weighting method in the evaluation of diagnostic test with verification bias]. (United States)

    Kang, Leni; Zhang, Shaokai; Zhao, Fanghui; Qiao, Youlin


    To evaluate and adjust for the verification bias existing in screening or diagnostic tests, the inverse-probability weighting method was used to adjust the sensitivity and specificity of the diagnostic tests, with an example of cervical cancer screening used to introduce the CompareTests package in R, with which the method can be implemented. Sensitivity and specificity calculated by the traditional method and by maximum likelihood estimation were compared to the results from the inverse-probability weighting method in the random-sampled example. The true sensitivity and specificity of the HPV self-sampling test were 83.53% (95%CI: 74.23-89.93) and 85.86% (95%CI: 84.23-87.36). In the analysis of data with randomly missing verification by the gold standard, the sensitivity and specificity calculated by the traditional method were 90.48% (95%CI: 80.74-95.56) and 71.96% (95%CI: 68.71-75.00), respectively. The adjusted sensitivity and specificity under the inverse-probability weighting method were 82.25% (95%CI: 63.11-92.62) and 85.80% (95%CI: 85.09-86.47), respectively, whereas they were 80.13% (95%CI: 66.81-93.46) and 85.80% (95%CI: 84.20-87.41) under the maximum likelihood estimation method. The inverse-probability weighting method can effectively adjust the sensitivity and specificity of a diagnostic test when verification bias exists, especially under complex sampling.
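The weighting idea can be shown in a minimal numerical sketch. Prevalence, test accuracy, and verification probabilities below are made-up values (not the paper's cervical-screening data): verification depends on the test result, so the naive sensitivity computed from verified subjects only is biased upward, and weighting each verified subject by 1/P(verified | test result) removes the bias:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
disease = rng.random(n) < 0.05                     # assumed 5% prevalence
# Assumed test accuracy: sensitivity 0.84, false-positive rate 0.14
test_pos = np.where(disease, rng.random(n) < 0.84, rng.random(n) < 0.14)

p_verify = np.where(test_pos, 0.9, 0.3)            # positives verified more often
verified = rng.random(n) < p_verify
w = 1.0 / p_verify                                 # inverse-probability weights

# Naive estimate uses only verified subjects -> biased upward
naive_sens = (test_pos & disease & verified).sum() / (disease & verified).sum()

# IPW estimate reweights verified subjects back to the full cohort
ipw_sens = ((w * (test_pos & disease & verified)).sum()
            / (w * (disease & verified)).sum())

print(round(naive_sens, 3), round(ipw_sens, 3))  # naive ≈ 0.94, IPW ≈ 0.84
```

The naive estimate lands near 0.94 because test-positive diseased subjects are three times as likely to be verified, while the weighted estimate recovers the true 0.84 up to sampling noise.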

  9. Modified porosity rate frost heave model and tests verification (United States)

    Ji, Zhi-qiang; Xu, Xue-yan


    To avoid the complexity of modeling frost heave at the microscale, a porosity rate function has been used in the prediction of the frost heave phenomenon. The approach explored in this paper is based on frost heave tests and the concept of the segregation potential, which has been widely accepted by researchers, in order to find the proper form of the porosity rate function. In the frozen fringe the porosity rate function was derived as ṅ = B·e^(−aPe)·(∇T)²·(1 − n) for Ts < T < Tf. Tests were carried out to verify the model, and the comparison between test results and simulated results shows that the modified model is efficient for the prediction of frost heave and can be used in engineering practice.
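A minimal numerical sketch of a porosity-rate law of this form, integrated in time with forward Euler, is shown below. All parameter values are hypothetical, chosen only to display the qualitative behavior (porosity in the frozen fringe grows toward 1 as ice segregates):

```python
import math

# Forward-Euler integration of a porosity-rate law of the form
# ndot = B * exp(-a * Pe) * (grad T)^2 * (1 - n); all values are assumed.
B, a = 1.0e-3, 0.05      # material constants (hypothetical)
Pe = 50.0                # effective pressure term (hypothetical)
grad_T = 5.0             # temperature gradient in the frozen fringe (hypothetical)

n, dt = 0.3, 10.0        # initial porosity and time step, s
for _ in range(100):
    ndot = B * math.exp(-a * Pe) * grad_T ** 2 * (1 - n)
    n += ndot * dt
print(round(n, 3))  # ≈ 0.912: porosity has grown from 0.3 toward 1
```

Because the rate is proportional to (1 − n), porosity approaches 1 exponentially; the (∇T)² factor concentrates ice growth where the thermal gradient is steepest.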

  10. Residual flexibility test method for verification of constrained structural models (United States)

    Admire, John R.; Tinker, Michael L.; Ivey, Edward W.


    A method is described for deriving constrained modes and frequencies from a reduced model based on a subset of the free-free modes plus the residual effects of neglected modes. The method involves a simple modification of the MacNeal and Rubin component mode representation to allow development of a verified constrained (fixed-base) structural model. Results for two spaceflight structures having translational boundary degrees of freedom show quick convergence of constrained modes using a measurable number of free-free modes plus the boundary partition of the residual flexibility matrix. This paper presents the free-free residual flexibility approach as an alternative test/analysis method when fixed-base testing proves impractical.

  11. Cryogenic Fluid Management Experiment (CFME) trunnion verification testing (United States)

    Bailey, W. J.; Fester, D. A.


    The Cryogenic Fluid Management Experiment (CFME) was designed to characterize subcritical liquid hydrogen storage and expulsion in the low-g space environment. The CFME has now become the storage and supply tank for the Cryogenic Fluid Management Facility, which includes a transfer line and receiver tanks as well. The liquid hydrogen storage and supply vessel is supported within a vacuum jacket by two fiberglass/epoxy composite trunnions, which were analyzed and designed for this application. Analysis using the limited available data indicated the trunnion was the most fatigue-critical component in the storage vessel. Before committing the complete storage tank assembly to environmental testing, an experimental assessment was performed to verify the capability of the trunnion design to withstand the expected vibration and loading conditions. Three tasks were conducted to evaluate trunnion integrity. The first determined the fatigue properties of the trunnion composite laminate materials; tests at both ambient and liquid hydrogen temperatures showed composite material fatigue properties far in excess of those expected. Next, an assessment of the adequacy of the trunnion designs was performed, based on the tested material properties.

  12. Manual and automation testing and verification of TEQ [ECI PROPRIETARY] (United States)

    Abhichandra, Ravi; Jasmine Pemeena Priyadarsini, M.


    The telecommunication industry has progressed from 1G to 4G, and now 5G is gaining prominence. Given the pace of this transformation, technological obsolescence is becoming a serious issue to deal with, and the execution of each technology requires ample investment in network, infrastructure, development, etc. As a result, the industry is becoming more dynamic and strategy oriented; it requires professionals who not only understand technology but can also evaluate it from a business perspective. The “Information Revolution” and the dramatic advances in telecommunications technology that have made it possible currently drive the global economy in large part. As wireless networks become more advanced and far-reaching, we are redefining the notion of connectivity and the possibilities of communications technology. In this paper I test and verify the optical cards and automate the test procedure using “TEQ”, a new in-house technology developed by ECI TELECOM, which uses one of the optical cards itself to pump traffic at 100 Gbps.

  13. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media ( tested May 2007) (United States)

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  14. Development and verification test of integral reactor major components

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others


    The conceptual designs for the SG, MCP, and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART, and a canned motor pump was adopted in the conceptual design of the MCP. Linear pulse motor type and ball-screw type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electro-magnetic design was performed for sizing the motors and electro-magnets. Prototypes for the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and the computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP, and CEDM were also investigated for fabricability.

  15. EQ3/6 software test and verification report 9/94

    Energy Technology Data Exchange (ETDEWEB)

    Kishi, T.


    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes, as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes EQPT, EQ3NR, and EQ6, together with the software library EQLIB, constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ``V and V`` report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN 77 source code and runs on nine platforms ranging from workstations to supercomputers. Physical control of the EQ3/6 software package and documentation is maintained on a SUN SPARCstation. Walkthroughs of each principal software package (EQPT, EQ3NR, and EQ6) were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three codes depended upon solving an n x n matrix equation by the Newton-Raphson method. Thus, a great deal of emphasis was placed on the test and verification of this procedure, beginning with the first code in the software package, EQPT.
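
The n x n Newton-Raphson solve singled out above is the standard multivariate iteration x_{k+1} = x_k - J^{-1} f(x_k). A minimal sketch for a 2 x 2 system (the example system is illustrative, not a geochemical one from EQ3/6):

```python
# Newton-Raphson iteration for a small nonlinear system, the kind of
# n x n solve identified as the common computational core of the codes.

def newton2(f, jac, x0, tol=1e-12, max_iter=50):
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        if abs(f1) < tol and abs(f2) < tol:
            return x, y
        a, b, c, d = jac(x, y)            # Jacobian J = [[a, b], [c, d]]
        det = a * d - b * c
        # Newton step: subtract J^{-1} f, using the explicit 2x2 inverse
        x -= ( d * f1 - b * f2) / det
        y -= (-c * f1 + a * f2) / det
    return x, y

# Example system: x^2 + y^2 = 4 and x*y = 1
f = lambda x, y: (x * x + y * y - 4.0, x * y - 1.0)
jac = lambda x, y: (2 * x, 2 * y, y, x)
x, y = newton2(f, jac, (2.0, 0.5))
print(x * x + y * y, x * y)   # both residual equations satisfied
```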

  16. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2 (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.


    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
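
A variables acceptance-sampling decision of the kind these calculators implement can be sketched with the one-sided k-method: accept the lot when the sample mean clears the lower specification limit by at least k sample standard deviations. The data and plan parameters below are illustrative, not taken from the NESC calculators:

```python
# One-sided acceptance sampling by variables (k-method sketch):
# accept the lot when (mean - LSL) / s >= k.
from statistics import mean, stdev

def accept_by_variables(sample, lsl, k):
    m, s = mean(sample), stdev(sample)   # sample mean, sample std dev
    return (m - lsl) / s >= k

# Illustrative sample of a measured characteristic (n = 8)
sample = [10.2, 10.5, 10.1, 10.4, 10.3, 10.6, 10.2, 10.4]
print(accept_by_variables(sample, lsl=9.0, k=2.0))    # comfortably accepted
print(accept_by_variables(sample, lsl=10.2, k=2.0))   # rejected: margin too small
```

The appeal over attributes sampling is that each measurement carries distance-to-limit information, so far fewer samples are needed for the same assurance, which is the trade the report's empirical tests probe.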

  17. Shear Ram Verification Test Protocol (VTP) Best Practices

    Energy Technology Data Exchange (ETDEWEB)

    Lindley, Roy A. [Argonne National Lab. (ANL), Argonne, IL (United States); Braun, Joseph C. [Argonne National Lab. (ANL), Argonne, IL (United States)


    A blowout preventer (BOP) is a critical component used on subsea oil and gas wells during drilling, completion, and workover operations on the U.S. Outer Continental Shelf (OCS). The purpose of the BOP is to seal oil and gas wells and, in the case of an emergency well-control event, to prevent the uncontrolled release of hydrocarbons. One of the most important components of the BOP is the hydraulically operated blind shear ram (BSR), which shears drilling-related components, such as drill pipe, casing, tubing, and wireline tools, that may have been placed in the well. In addition to shearing these components, the BSR must form a seal to keep hydrocarbons within the well bore, even under the highest well-fluid pressures expected. The purpose of this document is for Argonne National Laboratory (ANL) to provide an independent view, based on current regulations and best practices, for testing and confirming the operability and suitability of BSRs under realistic (or actual) well conditions.

  18. ATPD-2354 Revision 10 Verification Test, Disc Brake Version Only (16 NOV 06) Article Test of High Mobility Multipurpose Wheeled Vehicle (HMMWV-ECV) (United States)


    To evaluate the different characteristics critical to the proper field service of the brake pads and rotor combination, an assortment of tests was conducted.

  19. Testing for intracycle determinism in pseudoperiodic time series. (United States)

    Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A


    A determinism test is proposed based on the well-known surrogate data method. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (i.e., short-term) determinism in pseudoperiodic time series, for which standard methods of surrogate analysis do not apply. The approach is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series, and the results show the applicability of the proposed test.
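
A minimal sketch of the two-step procedure (deseasonalize, then compare against surrogates), using an invented pseudoperiodic series and a crude nearest-neighbour predictability statistic; this illustrates the idea, not the authors' exact statistic:

```python
# Two-step intracycle determinism check: (1) subtract the mean cycle,
# (2) compare one-step predictability of the residual against shuffled
# surrogates, which destroy any deterministic structure.
import math, random, statistics

random.seed(1)

def one_step_error(s):
    # Nearest-neighbour one-step prediction error: find the most similar
    # other point and predict its successor (lower = more predictable).
    err = []
    for i in range(len(s) - 1):
        j = min((k for k in range(len(s) - 1) if k != i),
                key=lambda k: abs(s[k] - s[i]))
        err.append(abs(s[j + 1] - s[i + 1]))
    return statistics.mean(err)

# Invented pseudoperiodic series: a period-25 cycle plus a deterministic
# (chaotic logistic-map) intracycle component.
n, period = 300, 25
y, ys = 0.4, []
for t in range(n):
    y = 3.9 * y * (1 - y)
    ys.append(y)
x = [math.sin(2 * math.pi * t / period) + 0.5 * ys[t] for t in range(n)]

# Step 1: remove the periodic component (subtract the mean cycle)
cycle = [statistics.mean(x[p::period]) for p in range(period)]
resid = [x[t] - cycle[t % period] for t in range(n)]

# Step 2: surrogate comparison on the residual
e_data = one_step_error(resid)
e_surr = statistics.mean(one_step_error(random.sample(resid, len(resid)))
                         for _ in range(5))
print(e_data < e_surr)   # residual is more predictable than surrogates
```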

  20. Exploring different attributes of source information for speaker verification with limited test data. (United States)

    Das, Rohan Kumar; Mahadeva Prasanna, S R


    This work explores mel power difference of spectrum in subband, residual mel frequency cepstral coefficient, and discrete cosine transform of the integrated linear prediction residual for speaker verification under limited test data conditions. These three source features are found to capture different attributes of source information, namely, periodicity, smoothed spectrum information, and shape of the glottal signal, respectively. On the NIST SRE 2003 database, the proposed combination of the three source features performs better [equal error rate (EER): 20.19%, decision cost function (DCF): 0.3759] than the mel frequency cepstral coefficient feature (EER: 22.31%, DCF: 0.4128) for 2 s duration of test segments.
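
For reference, the equal error rate quoted above can be computed from genuine/impostor score lists by sweeping a decision threshold until the false-rejection and false-acceptance rates cross. The scores below are toy values, not the NIST SRE 2003 data:

```python
# Equal error rate (EER) from verification scores: the operating point
# where false-rejection rate (FRR) equals false-acceptance rate (FAR).

def eer(genuine, impostor):
    best = None
    for thr in sorted(genuine + impostor):
        frr = sum(g < thr for g in genuine) / len(genuine)
        far = sum(i >= thr for i in impostor) / len(impostor)
        gap = abs(frr - far)
        if best is None or gap < best[0]:
            best = (gap, (frr + far) / 2)   # EER estimate at crossover
    return best[1]

genuine = [0.9, 0.8, 0.75, 0.6, 0.55]    # same-speaker trial scores
impostor = [0.5, 0.45, 0.4, 0.65, 0.3]   # impostor trial scores
print(eer(genuine, impostor))            # 0.2, i.e. a 20% EER
```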

  1. Testing and verification of a novel single-channel IGBT driver circuit

    Directory of Open Access Journals (Sweden)

    Lukić Milan


    This paper presents a novel single-channel IGBT driver circuit together with a procedure for its testing and verification. The driver is based on a specialized integrated circuit with a complete range of protective functions. Experiments were performed to test and verify its behaviour, and the results are presented in the form of oscilloscope recordings. It is concluded that the new driver circuit is compatible with modern IGBT transistors and power-converter demands and can be applied in new designs. It is part of a new 20 kW industrial-grade boost converter.

  2. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez


    In this paper, a method for verifying the power performance of a wind farm is presented. The method is based on Friedman's test, a nonparametric statistical inference technique, and uses the information collected by the SCADA system from sensors embedded in the wind turbines to carry out the power performance verification. The guaranteed power curve of the wind turbines is treated as one more turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of turbines with respect to their power performance. The proposed method indicates whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it allows wind farm owners to know whether their wind farm has either a perfect or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out; the results show that its power performance was acceptable.
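
The core of the method, Friedman's rank test across blocks, can be sketched in pure Python. Blocks are time periods, treatments are turbines, and, as described above, the guaranteed curve is treated as one more "turbine"; the power values are invented and ties are not handled:

```python
# Friedman chi-square statistic: rank treatments within each block,
# then test whether the mean ranks differ across treatments.

def friedman_stat(blocks):
    # blocks: list of per-period rows, one value per turbine
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0.0] * k
    for row in blocks:
        order = sorted(range(k), key=lambda j: row[j])
        for r, j in enumerate(order, start=1):   # ranks 1..k (no ties)
            rank_sums[j] += r
    return (12.0 / (n * k * (k + 1))) * sum(s * s for s in rank_sums) \
        - 3.0 * n * (k + 1)

# 4 periods x 3 "turbines" (last column: guaranteed power curve)
power = [[1.02, 0.96, 1.00],
         [1.05, 0.94, 1.00],
         [1.01, 0.97, 1.00],
         [1.04, 0.95, 1.00]]
print(friedman_stat(power))
```

Here the statistic is 8.0, which exceeds the 5% chi-square critical value with k-1 = 2 degrees of freedom (5.99), so these toy turbines' performances would be judged to differ; a multiple comparison step would then locate which pairs differ.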

  5. Verification testing of the PKI collector at Sandia National Laboratories, Albuquerque, New Mexico (United States)

    Hauger, J. S.; Pond, S. L.


    Verification testing of a solar collector was undertaken prior to its operation as part of an industrial process heat plant at Capitol Concrete Products in Topeka, Kansas. Testing was performed at a control plant installed at Sandia National Laboratories, Albuquerque, New Mexico (SNLA). Early results show that plant performance is even better than anticipated and far in excess of the test criteria: overall plant efficiencies of 65 to 80 percent were typical during hours of good insolation. A number of flaws and imperfections were detected during operability testing, the most important being a problem in elevation drive alignment due to a manufacturing error. All problems were corrected as they occurred, and the plant, with over 40 hours of operation, is continuing operability testing in a wholly automatic mode.

  6. Selected Examples of LDRD Projects Supporting Test Ban Treaty Verification and Nonproliferation

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Al-Ayat, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walter, W. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    The Laboratory Directed Research and Development (LDRD) Program at the DOE National Laboratories was established to ensure the scientific and technical vitality of these institutions and to enhance their ability to respond to evolving missions and anticipate national needs. LDRD allows the Laboratory directors to invest a percentage of their total annual budget in cutting-edge research and development projects within their mission areas. We highlight a selected set of LDRD-funded projects, in chronological order, that have provided capabilities, people, and infrastructure that contributed greatly to our ability to respond to technical challenges in support of test ban treaty verification and nonproliferation.

  7. Optical Testing and Verification Methods for the James Webb Space Telescope Integrated Science Instrument Module Element (United States)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; et al.


    NASA's James Webb Space Telescope (JWST) is a 6.6 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (~40 K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), which contains four science instruments (SIs) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested as a suite at the NASA Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM), a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the tests. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground-test thermal environment are compensated for in alignment. We describe how these innovative methods for test planning, execution, and post-test analysis were instrumental in the verification program for the ISIM element, in enough detail to allow the reader to consider these innovations and lessons learned in future testing for other programs.

  8. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. This practice is not intended to be complete purchase specifications for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...

  9. Process Sensitivity, Performance, and Direct Verification Testing of Adhesive Locking Features (United States)

    Golden, Johnny L.; Leatherwood, Michael D.; Montoya, Michael D.; Kato, Ken A.; Akers, Ed


    during assembly by measuring the dynamic prevailing torque. Adhesive locking features or LLCs are another method of providing redundant locking, but a direct verification method has not been used in aerospace applications to verify proper installation when using LLCs because of concern for damage to the adhesive bond. The reliability of LLCs has also been questioned due to failures observed during testing with coupons for process verification, although the coupon failures have often been attributed to a lack of proper procedures. It is highly desirable to have a direct method of verifying the LLC cure or bond integrity. The purpose of the Phase II test program was to determine if the torque applied during direct verification of an adhesive locking feature degrades that locking feature. This report documents the test program used to investigate the viability of such a direct verification method. Results of the Phase II testing were positive, and additional investigation of direct verification of adhesive locking features is merited.

  10. Testing of the dual slab verification detector for attended measurements of the BN-350 dry storage casks

    Energy Technology Data Exchange (ETDEWEB)

    Santi, Peter A [Los Alamos National Laboratory; Browne, Michael C [Los Alamos National Laboratory; Williams, Richard B [Los Alamos National Laboratory; Parker, Robert F [Los Alamos National Laboratory


    The Dual Slab Verification Detector (DSVD) has been developed and built by Los Alamos National Laboratory in cooperation with the International Atomic Energy Agency (IAEA) as part of the dry storage safeguards system for the spent fuel from the BN-350 fast reactor. The detector consists of two rows of ³He tubes embedded in a slab of polyethylene designed to be placed on the outer surface of a dry storage cask. The DSVD will be used to measure the neutron flux emanating from inside the dry storage cask at several locations around each cask to establish a neutron 'fingerprint' that is sensitive to the contents of the cask. The sensitivity of the fingerprinting technique to the removal of a specific amount of nuclear material from the cask is determined by the characteristics of the detector used to perform the measurements, the characteristics of the spent fuel being measured, and systematic uncertainties associated with the dry storage scenario. MCNPX calculations of the BN-350 dry storage casks and layout have shown that the neutron fingerprint verification technique using measurements from the DSVD would be sensitive to both the amount and the location of material present within an individual cask. To confirm the performance of the neutron fingerprint technique in verifying the presence of BN-350 spent fuel in dry storage, an initial series of measurements has been performed to test the performance and characteristics of the DSVD. Results of these measurements will be presented and compared with MCNPX results.
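
As an illustration of the fingerprint idea (not the actual DSVD analysis), a measured count profile around a cask can be compared with a reference profile against Poisson counting statistics; all numbers below are invented:

```python
# Toy neutron "fingerprint" check: flag any measurement position whose
# counts deviate from the reference profile by more than n-sigma,
# assuming Poisson counting uncertainty sigma = sqrt(N).
import math

def fingerprint_ok(reference, measured, n_sigma=3.0):
    for ref, meas in zip(reference, measured):
        sigma = math.sqrt(ref)               # counting statistics
        if abs(meas - ref) > n_sigma * sigma:
            return False                     # profile no longer matches
    return True

ref = [10400, 9800, 10150, 9900]             # baseline counts per position
print(fingerprint_ok(ref, [10350, 9750, 10210, 9940]))   # consistent
print(fingerprint_ok(ref, [10350, 9750, 8500, 9940]))    # local deficit
```

A real verification would also fold in systematic uncertainties and source-term changes, as the abstract notes, but the position-by-position comparison is the essence of the fingerprint.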

  11. Verification of FPGA-Signal using the test board which is applied to Safety-related controller

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Youn-Hu; Yoo, Kwanwoo; Lee, Myeongkyun; Yun, Donghwa [SOOSAN ENS, Seoul (Korea, Republic of)


    This article provides a verification method for the BGA-type FPGA of a Programmable Logic Controller (PLC) developed as a safety-class device. The FPGA logic in the safety-class control device is the circuit that controls the overall logic of the PLC. A safety-related PLC must meet international standard specifications; for this reason, V&V according to an international standard is applied in order to secure high reliability and safety, which entails a variety of verification steps for additional reliability and safety analysis. For efficient verification of test results, we propose a test using a newly changed BGA socket that resolves the problems of the conventional socket. Verification is divided into hardware and firmware verification, carried out in unit testing and integration testing. The proposed test method is simple and reduces cost through batch processing. In addition, it is advantageous for measuring signals from high-speed ICs owing to the short pin length and the copper plating around the pins, and it prevents abrasion of the IC balls because there is no direct contact with the PCB. Therefore, it can be applied to BGA package testing, and we can easily verify the logic as well as the operation of the designed device.

  12. 77 FR 16868 - Quality Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test... (United States)


    ... COMMISSION Quality Verification for Plate-Type Uranium-Aluminum Fuel Elements for Use in Research and Test...-Type Uranium-Aluminum Fuel Elements for Use in Research and Test Reactors,'' is temporarily identified... verifying the quality of plate-type uranium-aluminum fuel elements used in research and test reactors (RTRs...

  13. Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report (United States)

    Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.


    This report provides an overview of, and results from, the verification of the specifications that define the operational capabilities of the airborne and ground, L-band and C-band, Control and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.

  14. NDEC: A NEA platform for nuclear data testing, verification and benchmarking

    Directory of Open Access Journals (Sweden)

    Díez C.J.


    The selection, testing, verification and benchmarking of evaluated nuclear data consists, in practice, of putting an evaluated file through a number of checking steps in which different computational codes verify that the file and the data it contains comply with different requirements. These requirements range from format compliance to good performance in application cases, while at the same time physical constraints and agreement with experimental data are verified. At NEA, the NDEC (Nuclear Data Evaluation Cycle) platform aims at providing, in a user-friendly interface, a thorough diagnosis of the quality of a submitted evaluated nuclear data file. The diagnosis is based on the results of different computational codes and routines which carry out the mentioned verifications, tests and checks. NDEC also seeks synergies with other existing NEA tools and databases, such as JANIS, DICE or NDaST, including them in its working scheme. Hence, this paper presents NDEC, its current development status and its usage in the JEFF nuclear data project.

  15. Test Method for Thermal Characterization of Li-Ion Cells and Verification of Cooling Concepts

    Directory of Open Access Journals (Sweden)

    Rouven Christen


    Temperature gradients, thermal cycling, and temperatures outside the optimal operating range can significantly influence the reliability and lifetime of Li-ion battery cells. It is therefore essential for the developer of large-scale battery systems to know the thermal characteristics of a single cell, such as heat source location, heat capacity, and thermal conductivity, in order to design appropriate cooling measures. This paper describes an advanced test facility that allows not only estimation of the thermal properties of a battery cell but also verification of proposed cooling strategies in operation. To this end, an active measuring unit consisting of a temperature and heat flux density sensor and a Peltier element was developed. These temperature/heat flux sensing (THFS) units are uniformly arranged around a battery cell with a spatial resolution of 25 mm, so that the temperature or heat flux density can be controlled individually, forming regions with constant temperature (cooling) or zero heat flux (insulation). This test setup covers the whole development loop, from thermal characterization to the design and verification of the proposed cooling strategy.
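
As a minimal illustration of what such heat-flux instrumentation enables, a lumped heat capacity can be estimated by integrating the measured net heat flow into the cell and dividing by the temperature rise. The numbers below are invented for the sketch:

```python
# Lumped heat capacity estimate from sampled heat-flow data:
# C = Q / dT, with Q obtained by time-integrating the heat flow.

def heat_capacity(heat_flow_w, dt_s, temp_rise_k):
    # heat_flow_w: sampled total heat flow into the cell [W]
    energy_j = sum(heat_flow_w) * dt_s     # rectangle-rule integration
    return energy_j / temp_rise_k          # C = Q / dT  [J/K]

flow = [5.0] * 120                         # 5 W sustained for 120 s
print(heat_capacity(flow, dt_s=1.0, temp_rise_k=0.75))   # 800.0 J/K
```

In the real facility the THFS units would supply the per-face heat flux densities to be summed over the cell surface, and losses through "insulated" (zero-flux) regions would be controlled rather than assumed.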

  16. Verification of cardiac mechanics software: benchmark problems and solutions for testing active and passive material behaviour. (United States)

    Land, Sander; Gurev, Viatcheslav; Arens, Sander; Augustin, Christoph M; Baron, Lukas; Blake, Robert; Bradley, Chris; Castro, Sebastian; Crozier, Andrew; Favino, Marco; Fastl, Thomas E; Fritz, Thomas; Gao, Hao; Gizzi, Alessio; Griffith, Boyce E; Hurtado, Daniel E; Krause, Rolf; Luo, Xiaoyu; Nash, Martyn P; Pezzuto, Simone; Plank, Gernot; Rossi, Simone; Ruprecht, Daniel; Seemann, Gunnar; Smith, Nicolas P; Sundnes, Joakim; Rice, J Jeremy; Trayanova, Natalia; Wang, Dafang; Jenny Wang, Zhinuo; Niederer, Steven A


    Models of cardiac mechanics are increasingly used to investigate cardiac physiology. These models are characterized by a high level of complexity, including the particular anisotropic material properties of biological tissue and the actively contracting material. A large number of independent simulation codes have been developed, but a consistent way of verifying the accuracy and replicability of simulations is lacking. To aid in the verification of current and future cardiac mechanics solvers, this study provides three benchmark problems for cardiac mechanics. These benchmark problems test the ability to accurately simulate pressure-type forces that depend on the deformed object's geometry, anisotropic and spatially varying material properties similar to those seen in the left ventricle, and active contractile forces. The benchmark was solved by 11 different groups to generate consensus solutions, with typical differences in higher-resolution solutions at approximately 0.5%, and consistent results between linear, quadratic and cubic finite elements as well as different approaches to simulating incompressible materials. Online tools and solutions are made available to allow these tests to be effectively used in verification of future cardiac mechanics software.

  17. Providing an empirical basis for optimizing the verification and testing phases of software development (United States)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.


    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low- from high-fault-density components so that the testing/verification effort can be concentrated where it is needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e., dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) measure the software system to be considered, and (2) build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at NASA/GSFC into two fault density classes: low and high. We also evaluate the accuracy of the model and the insights it provides into the software process.

  18. Constrained structural dynamic model verification using free vehicle suspension testing methods (United States)

    Blair, Mark A.; Vadlamudi, Nagarjuna


    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads, such as the 12-ton Hubble Space Telescope. The development of equations in the paper will show that by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced that will have minima that are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.
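The key property used above, that minima (antiresonances) of a drive-point transfer function measured at the interface of a free-free structure fall at the natural frequencies of the structure constrained at that interface, can be illustrated with a toy two-mass model. All numbers below are hypothetical, not taken from the paper:

```python
import numpy as np

# Free-free 2-DOF chain: interface mass m1, payload mass m2, spring k.
# Drive-point FRF at the interface:
#   H11(w) = (k - m2 w^2) / ((k - m1 w^2)(k - m2 w^2) - k^2)
# Its antiresonance (minimum of |H11|) should coincide with the natural
# frequency sqrt(k/m2) of the payload grounded at the interface.
m1, m2, k = 2.0, 1.0, 100.0

w = np.linspace(0.5, 11.9, 2000)        # rad/s, below the elastic resonance
H11 = (k - m2 * w**2) / ((k - m1 * w**2) * (k - m2 * w**2) - k**2)

w_antires = w[np.argmin(np.abs(H11))]   # frequency of the FRF minimum
w_constrained = np.sqrt(k / m2)         # exact constrained-mode frequency
print(w_antires, w_constrained)         # both close to 10 rad/s
```

The same idea scales to the measured transfer functions of a suspended payload: the minima, not the peaks, locate the launch-constrained modes.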

  19. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)


    Out-pile tests on a full-scale fuel assembly verify the design and evaluate the performance of the final products. HTL for the hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow induced vibration test. FAMeCT can perform the bending and vibration tests. Verification of the developed facilities was carried out against reference data for the fuel assembly obtained at the Westinghouse Co., and the compared data agreed well within uncertainties. FRETONUS, a high-temperature, high-pressure fretting wear simulator, was developed. A performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. Computational capabilities for turbulent flow analysis and finite element analysis were also developed. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  20. Multiple (Two) Met Bel 601 In Series Ultimate Vacuum Testing

    Energy Technology Data Exchange (ETDEWEB)

    Restivo, M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]


    SRNL Environmental and Chemical Process Technology (E&CPT) was requested to perform testing of vacuum pumps per a verbal request from the Customer, SRNL Hydrogen Processing Technology. Tritium Operations is currently having difficulties procuring the Normetex® Model 15 m3/hr (9 CFM) vacuum pump (formerly Normetex Pompes, now Eumeca SARL). One possible solution proposed by Hydrogen Processing Technology personnel is to use two Senior Aerospace Metal Bellows MB-601 vacuum pumps piped with the heads in series, and the pumps in series (Figure 1 below). This memorandum documents the ultimate vacuum testing that was performed to determine if this concept was a viable alternate vacuum pump strategy. This testing dovetails with previous pump evaluations documented in references 1 and 2.

  1. Development of a test system for verification and validation of nuclear transport simulations

    Energy Technology Data Exchange (ETDEWEB)

    White, Morgan C [Los Alamos National Laboratory]; Triplett, Brian S [General Electric]; Anghaie, Samim [Univ. of Florida]


    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems, and performing this task manually is tedious and time-consuming. The nuclear data team at Los Alamos National Laboratory, in collaboration with the University of Florida, has developed a methodology to automate the process of nuclear data verification and validation (V and V). This automated V and V process can efficiently test a number of data libraries using well defined benchmark experiments, such as those in the International Criticality Safety Benchmark Experiment Project (ICSBEP). The process is implemented through an integrated set of Python scripts. Material and geometry data are read from an existing medium or given directly by the user to generate a benchmark experiment template file. The user specifies the choice of benchmark templates, codes, and libraries to form a V and V project. The Python scripts generate input decks for multiple transport codes from the templates, run and monitor individual jobs, and parse the relevant output automatically. The output can then be used to generate reports directly or can be stored into a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. The resource savings by using this automated methodology could potentially be an enabling technology for more sophisticated data studies, such as nuclear data uncertainty quantification. Once deployed, this tool will allow the nuclear data community to more thoroughly test data libraries leading to higher fidelity data in the future.
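The template-render, run, parse, report cycle described above can be sketched in a few lines of Python. Everything here is hypothetical (template format, k-eff output text, benchmark name); the real tool drives external transport codes, which this stub only imitates:

```python
import re

# Sketch of the automated V&V flow: render benchmark templates into input
# decks, "run" each job, parse k-eff from the output, and report
# calculated-over-expected (C/E) ratios per (benchmark, library) pair.
TEMPLATE = "benchmark={name} library={lib}"

def render_deck(name, lib):
    return TEMPLATE.format(name=name, lib=lib)

def run_job(deck):
    # Stand-in for launching a transport code; a fabricated output stream
    # lets the parsing step be demonstrated end to end.
    return f"... final k-eff = 0.99850 ...  # deck: {deck}"

def parse_keff(output):
    return float(re.search(r"k-eff\s*=\s*([\d.]+)", output).group(1))

def run_project(benchmarks, libraries):
    report = {}
    for name, expected in benchmarks.items():
        for lib in libraries:
            keff = parse_keff(run_job(render_deck(name, lib)))
            report[(name, lib)] = keff / expected   # C/E ratio
    return report

report = run_project({"icsbep-heu-met-fast-001": 1.0000}, ["lib-a", "lib-b"])
print(report)
```

In the real pipeline the stubbed `run_job` would submit and monitor batch jobs, and the report dictionary would feed a database for later uncertainty studies.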

  2. Testing frequency-domain causality in multivariate time series. (United States)

    Faes, Luca; Porta, Alberto; Nollo, Giandomenico


    We introduce a new hypothesis-testing framework, based on surrogate data generation, to assess, in the frequency domain, causality among multivariate (MV) time series. The approach extends the traditional Fourier transform (FT) method for generating surrogate data in a MV process and adapts it to the specific issue of causality. It generates causal FT (CFT) surrogates with FT modulus taken from the original series, and FT phase taken from a set of series with causal interactions set to zero over the direction of interest and preserved over all other directions. Two different zero-setting procedures, acting on the parameters of a MV autoregressive (MVAR) model fitted on the original series, were used to test the null hypotheses of absence of direct causal influence (CFTd surrogates) and of full (direct and indirect) causal influence (CFTf surrogates), respectively. CFTf and CFTd surrogates were utilized in combination with the directed coherence (DC) and the partial DC (PDC) spectral causality estimators, respectively. Simulations reproducing different causality patterns in linear MVAR processes demonstrated the better accuracy of CFTf and CFTd surrogates with respect to traditional FT surrogates. Application on real MV biological data measured from healthy humans, i.e., heart period, arterial pressure, and respiration variability, as well as multichannel EEG signals, showed that CFT surrogates disclose causal patterns in accordance with expected cardiorespiratory and neurophysiological mechanisms.
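The traditional FT surrogate that the CFT method extends keeps the FT modulus of the original series and randomizes the phases. A minimal single-series sketch (the causal, MVAR-based phase substitution of the paper is not reproduced here):

```python
import numpy as np

# Classic phase-randomized FT surrogate: preserve the amplitude spectrum,
# destroy the phase structure.  DC and Nyquist bins are kept real so the
# inverse transform of the real-input FFT stays real.
rng = np.random.default_rng(0)
x = rng.standard_normal(256)

F = np.fft.rfft(x)
phase = rng.uniform(0.0, 2.0 * np.pi, F.size)
phase[0] = 0.0                      # keep DC real
phase[-1] = 0.0                     # keep Nyquist real (even length)
surrogate = np.fft.irfft(np.abs(F) * np.exp(1j * phase), n=x.size)

# The amplitude spectrum is preserved exactly; only the phases differ.
print(np.allclose(np.abs(np.fft.rfft(surrogate)), np.abs(F)))  # True
```

The CFT variants replace the uniform random phases with phases computed from series whose MVAR causal couplings have been zeroed along the direction under test.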

  3. SPE5 Sub-Scale Test Series Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Vandersall, Kevin S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reeves, Robert V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); DeHaven, Martin R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Strickland, Shawn L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    A series of two SPE5 sub-scale tests was performed to experimentally confirm that a booster system designed and evaluated in prior tests would properly initiate the PBXN-110 case charge fill. To conduct the experiments, a canister was designed to contain the nominally 50 mm diameter booster tube with an outer fill of approximately 150 mm diameter by 150 mm in length. The canisters were filled with PBXN-110 at NAWS-China Lake and shipped back to LLNL for testing in the High Explosives Applications Facility (HEAF). Piezoelectric crystal pins were placed on the outside of the booster tube before filling, and a series of piezoelectric crystal pins along with Photonic Doppler Velocimetry (PDV) probes were placed on the outer surface of the canister to measure the relative timing and magnitude of the detonation. The 2 piezoelectric crystal pins integral to the booster design were also utilized along with a series of either piezoelectric crystal pins or piezoelectric polymer pads on the top of the canister or outside case that utilized direct contact, gaps, or different thicknesses of RTV cushions to obtain time of arrival data to evaluate the response in preparation for the large-scale SPE5 test. To further quantify the margin of the booster operation, the 1st test (SPE5SS1) was functioned with both detonators and the 2nd test (SPE5SS2) was functioned with only 1 detonator. A full detonation of the material occurred in both experiments, as indicated by the pin timing and PDV signals. The piezoelectric pads were found to provide a greater measured signal magnitude during the testing with an RTV layer present; the improved response is due to the larger measurement surface area of the pad. This report will detail the experiment design, canister assembly for filling, final assembly, experiment firing, presentation of the diagnostic results, and a discussion of the results.

  4. Inverse transport for the verification of the Comprehensive Nuclear Test Ban Treaty

    Directory of Open Access Journals (Sweden)

    J.-P. Issartel


    An international monitoring system is being built as a verification tool for the Comprehensive Test Ban Treaty. Forty stations will measure on a worldwide daily basis the concentration of radioactive noble gases. Using preliminary real data, the paper introduces a new backtracking approach for identifying sources of passive tracers after positive measurements. When several measurements are available, the ambiguity about possible sources is reduced significantly. The approach is validated against ETEX data. A distinction is made between adjoint and inverse transport, which are shown to be different though equivalent ideas. As an interesting side result, it is shown that, in the passive tracer dispersion equation, the diffusion stemming from a time-symmetric turbulence is necessarily a self-adjoint operator, a result easily verified for the usual gradient closure but valid more generally.
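The self-adjointness claim for the usual gradient closure can be spelled out in one line; this is the standard integration-by-parts argument (with boundary terms dropped), not a derivation taken from the paper. For a symmetric diffusivity tensor $K$ and test fields $\varphi, \psi$:

$$\int_\Omega \varphi\,\nabla\!\cdot\!(K\nabla\psi)\,dV \;=\; -\int_\Omega \nabla\varphi\cdot K\,\nabla\psi\,dV \;=\; \int_\Omega \psi\,\nabla\!\cdot\!(K\nabla\varphi)\,dV ,$$

so the diffusion operator $\nabla\!\cdot\!(K\nabla\,\cdot\,)$ is self-adjoint whenever $K = K^{\mathsf T}$.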

  5. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    As a verification test of the SURF and SURFplus models in the xRage code we use a propagating underdriven detonation wave in 1-D. This is one of the few test cases for which an accurate solution can be determined based on the theoretical structure of the solution. The solution consists of a steady ZND reaction zone profile joined with a scale invariant rarefaction or Taylor wave and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: a pressure discontinuity at the lead shock front and pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of the overall accuracy, the L2 norm of the difference of the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
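The L2-norm error measure used above is easy to state concretely. A toy illustration (an ODE with a known exact solution, not the detonation problem): integrate u' = cos(x), u(0) = 0 with forward Euler, compare against the exact u = sin(x), and watch the discrete L2 error shrink under grid refinement:

```python
import numpy as np

# Discrete L2 error norm: sqrt(dx * sum((u_num - u_exact)^2)), evaluated
# for a first-order scheme at two resolutions.
def l2_error(n):
    x, dx = np.linspace(0.0, np.pi, n, retstep=True)
    u = np.zeros(n)
    for i in range(n - 1):                 # forward Euler, first order
        u[i + 1] = u[i] + dx * np.cos(x[i])
    return np.sqrt(dx * np.sum((u - np.sin(x)) ** 2))

coarse, fine = l2_error(200), l2_error(400)
print(coarse / fine)   # ~2: error halves when dx halves (first order)
```

For a PDE solver the same norm is computed over the pressure field against the exact ZND/Taylor-wave solution, and the refinement ratio exposes the effective convergence order near and away from the discontinuities.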

  6. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example. (United States)

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D


    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (e.g., sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy-to-apply and accurate method for addressing verification bias in studies of multiple screening methods.
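The direction of the bias, and the inverse-probability-weighting idea that underlies weighted estimating equations, can be shown with a small simulation. This is an illustrative sketch, not the paper's GEE estimator: every screen positive is verified, while screen negatives are verified with probability 0.25:

```python
import numpy as np

# Simulated cohort: disease prevalence 0.3, true sensitivity 0.9,
# true specificity 0.8; partial verification of screen negatives.
rng = np.random.default_rng(1)
n, prevalence, sens, spec = 200_000, 0.3, 0.9, 0.8

d = rng.random(n) < prevalence                                 # true disease
t = np.where(d, rng.random(n) < sens, rng.random(n) > spec)    # screen result
pi = np.where(t, 1.0, 0.25)          # verification probability by result
verified = rng.random(n) < pi

# Naive sensitivity uses verified subjects only; weighting each verified
# subject by 1/pi restores the full-cohort composition.
naive = (t & d & verified).sum() / (d & verified).sum()
w = 1.0 / pi
weighted = (w * (t & d & verified)).sum() / (w * (d & verified)).sum()
print(round(naive, 3), round(weighted, 3))   # naive inflated; weighted ~0.9
```

Because diseased screen negatives are undersampled, the naive estimate overstates sensitivity (here roughly 0.97 versus the true 0.9), while the weighted estimate recovers the truth.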

  7. Design and testing of the Series III AMTEC cell

    Energy Technology Data Exchange (ETDEWEB)

    Mital, R.; Sievers, R.K.


    This paper describes the design and testing of the Series III (S3) Alkali Metal Thermal to Electric Converter (AMTEC) cell which is capable of high efficiency (15--25%) and high power density (100--150 W/kg). Compared to the Series II cell which is being developed primarily for space power systems, the Series III cell design provides a significantly higher β″-alumina solid electrolyte (BASE) tube packing density around the heat source, thereby increasing cell power and minimizing heat loss. The prototype S3 cell will have 96 BASE tubes and is expected to produce about 150 We. In this cell design the BASE tube assemblies are mounted on a cylindrical tube support plate. The BASE tubes are arranged like spokes on a wheel. The inner cylinder, concentric to the tube support plate, is the hot side of the cell and the outer cylinder is the condenser. Since the prototype S3 cell will be the first of its kind, an engineering cell with the same dimensions as the prototype but with 24 BASE tubes was built first. The purpose of this cell was to identify and resolve structural, thermal, manufacturing and sodium management issues before launching into the build of a complete 96 BASE tube cell. The engineering cell has been successfully built and tested. Data from the engineering cell have been used to calibrate the SINDA/FLUINT code to predict the prototype cell performance more accurately. The build of the prototype 96 BASE tube cells is now in progress. This paper presents the design and development of the prototype S3 cell. The fabrication and testing of the first S3 engineering cell are discussed next. Based on the test data of the engineering cell, the anticipated thermal performance of the prototype cells predicted by the calibrated SINDA model is also presented.

  8. AXAF-I Low Intensity-Low Temperature (LILT) Testing of the Development Verification Test (DVT) Solar Panel (United States)

    Alexander, Doug; Edge, Ted; Willowby, Doug


    The planned orbit of the AXAF-I spacecraft will subject it to both short eclipses (less than 30 minutes for solar, less than 2 hours for lunar) and long earth and lunar eclipses with a combined conjunctive duration of up to 3 to 4 hours. Lack of proper Electrical Power System (EPS) conditioning prior to eclipse may cause loss of mission. To avoid this problem, for short eclipses, it is necessary to off-point the solar array prior to or at the beginning of the eclipse to reduce the battery state of charge (SOC). This yields less overcharge during the high charge currents at sun entry. For long lunar eclipses, solar array pointing and load scheduling must be tailored for the profile of the eclipse. The battery SOC, loads, and solar array current-voltage (I-V) must be known or predictable to maintain the bus voltage within an acceptable range. To address engineering concerns about the electrical performance of the AXAF-I solar array under Low Intensity and Low Temperature (LILT) conditions, Marshall Space Flight Center (MSFC) engineers undertook special testing of the AXAF-I Development Verification Test (DVT) solar panel in September-November 1997. In the test the DVT panel was installed in a thermal vacuum chamber with a large view window with a mechanical "flapper door". The DVT panel was "flash" tested with a Large Area Pulse Solar Simulator (LAPSS) at various fractional sun intensities and panel (solar cell) temperatures. The testing was unique with regard to the large size of the test article and the type of testing performed. The test setup, results, and lessons learned from the testing will be presented.

  9. Verification testing of the compression performance of the HEVC screen content coding extensions (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng


    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements showed a very substantial benefit in coding efficiency for the SCC extensions, and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
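The Bjøntegaard-delta bit-rate metric cited above compares two rate-distortion curves by fitting log-rate as a polynomial in PSNR and averaging the gap over the overlapping quality range. A compact sketch with made-up RD points (the standard BD procedure, not data from the paper):

```python
import numpy as np

# BD-rate: cubic fit of log10(rate) vs PSNR per codec, integrate over the
# shared PSNR interval, convert the mean log-rate gap to a percent saving.
def bd_rate(psnr_ref, rate_ref, psnr_test, rate_test):
    p_ref = np.polyfit(psnr_ref, np.log10(rate_ref), 3)
    p_test = np.polyfit(psnr_test, np.log10(rate_test), 3)
    lo = max(min(psnr_ref), min(psnr_test))
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref = np.polyval(np.polyint(p_ref), hi) - np.polyval(np.polyint(p_ref), lo)
    int_test = np.polyval(np.polyint(p_test), hi) - np.polyval(np.polyint(p_test), lo)
    avg_diff = (int_test - int_ref) / (hi - lo)   # mean log10 rate gap
    return (10 ** avg_diff - 1) * 100             # percent bit-rate change

psnr = [32.0, 35.0, 38.0, 41.0]
rates = [1000.0, 2000.0, 4000.0, 8000.0]          # kbps, reference codec
halved = [r / 2 for r in rates]                   # same quality at half rate
print(round(bd_rate(psnr, rates, psnr, halved), 2))  # -50.0 (% saving)
```

A codec that reaches the same PSNR at half the bit rate yields a BD-rate of -50%, which is the scale on which the 40-90% savings above are reported.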

  10. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili


    Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI. The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI, and the measuring software is tested in the proposed virtual environment. Taking the involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, including the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing an ideal new platform for testing of complex workpiece-measuring software without calibrated artifacts.

  12. The USP Performance Verification Test, Part II: collaborative study of USP's Lot P Prednisone Tablets. (United States)

    Glasgow, Maria; Dressman, Shawn; Brown, William; Foster, Thomas; Schuber, Stefan; Manning, Ronald G; Wahab, Samir Z; Williams, Roger L; Hauck, Walter W


    Periodic performance verification testing (PVT) is used by laboratories to assess and demonstrate proficiency, and for other purposes as well. For dissolution, the PVT is specified in the US Pharmacopeia General Chapter Dissolution under the title Apparatus Suitability Test. For Apparatus 1 and 2, USP provides two reference standard tablets for this purpose. For each new lot of these reference standards, USP conducts a collaborative study. For new USP Lot P Prednisone Tablets, 28 collaborating laboratories provided data. The study was conducted with three sets of tablets: Lot O open label, Lot O blinded, and Lot P blinded. The blinded Lot O data were used for apparatus suitability testing. Acceptance limits were determined after dropping data due to failure of apparatus suitability, identification of data as unusual on control charts, or protocol violations. Results yielded acceptance criteria of (47, 82) for Apparatus 1 and (37, 70) for Apparatus 2. Results for Lot P were generally similar to those for Lot O, except that the average percent dissolved for Lot P was greater than for Lot O with Apparatus 2.

  13. Evaluation of LLTR Series II test A-2 results. [Large Leak Test Rig

    Energy Technology Data Exchange (ETDEWEB)

    Whipple, J C; Shoopak, B F; Chen, K; Fan, C K; Odegaard, T K


    Series II Test A-2 employed a double-ended guillotine (DEG) tube rupture 122 in. above the lower end of the LLTR shroud under typical evaporator startup conditions. The leak site was located 2 in. below Spacer No. 4, at the same location as Test A-1b, which employed nitrogen as the inert non-reactive injection fluid. The test yielded peak pressures of 375 psig in the leak site region and 485 psig at the upper tubesheet approximately 10 ms and 12 ms, respectively, after tube rupture. Higher peak temperatures (approx. 2200 °F) were measured in this test than during Series I sodium-water reaction testing (peak temperatures measured during Series I were about 1900 °F maximum). These high peak temperatures occurred in Test A-2 long after the tube rupture (approx. 8 seconds) and did not contribute to the acoustic peak pressures produced in the first few milliseconds.

  14. Verification testing to confirm VO2max attainment in persons with spinal cord injury. (United States)

    Astorino, Todd A; Bediamol, Noelle; Cotoia, Sarah; Ines, Kenneth; Koeu, Nicolas; Menard, Natasha; Nyugen, Brianna; Olivo, Cassandra; Phillips, Gabrielle; Tirados, Ardreen; Cruz, Gabriela Velasco


    Maximal oxygen uptake (VO2max) is a widely used measure of cardiorespiratory fitness, aerobic function, and overall health risk. Although VO2max has been measured for almost 100 yr, no standardized criteria exist to verify VO2max attainment. Studies document that incidence of 'true' VO2max obtained from incremental exercise (INC) can be confirmed using a subsequent verification test (VER). In this study, we examined the efficacy of VER in persons with spinal cord injury (SCI). Design: repeated measures, within-subjects study. Setting: university laboratory in San Diego, CA. Participants: ten individuals (age and injury duration = 33.3 ± 10.5 yr and 6.8 ± 6.2 yr) with SCI and 10 able-bodied (AB) individuals (age = 24.1 ± 7.4 yr). Methods: peak oxygen uptake (VO2peak) was determined during INC on an arm ergometer, followed by VER at 105 percent of peak power output (%PPO). Gas exchange data, heart rate (HR), and blood lactate concentration (BLa) were measured during exercise. Results: across all participants, VO2peak was highly related between protocols (ICC = 0.98) and the mean difference was 0.08 ± 0.11 L/min. Compared to INC, VO2peak from VER was not different in SCI (1.30 ± 0.45 L/min vs. 1.31 ± 0.43 L/min) but higher in AB (1.63 ± 0.40 L/min vs. 1.76 ± 0.40 L/min). Conclusion: data show similar VO2peak between incremental and verification tests in SCI, suggesting that VER confirms VO2max attainment. However, in AB participants completing arm ergometry, VER is essential to validate the appearance of 'true' VO2peak.
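The confirmation logic can be sketched as a simple check. The 3% cutoff below is an assumption for illustration only; the abstract compares protocol means but states no specific criterion:

```python
# VO2max confirmation sketch: accept the incremental-test VO2peak when the
# verification bout does not exceed it by more than `tol` (fractional).
# The 3% tolerance is a hypothetical choice, not taken from the study.
def vo2max_confirmed(vo2_inc, vo2_ver, tol=0.03):
    return vo2_ver <= vo2_inc * (1.0 + tol)

# Group means from the abstract (L/min): SCI vs able-bodied participants.
print(vo2max_confirmed(1.30, 1.31))   # True  -> SCI: VO2max confirmed
print(vo2max_confirmed(1.63, 1.76))   # False -> AB: VER exceeded INC by ~8%
```

Applied to the group means above, the check reproduces the study's conclusion: confirmation in SCI, and a meaningfully higher VER value in the able-bodied arm-ergometry group.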

  15. Gas Generation from K East Basin Sludges - Series II Testing

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Samuel A.; Delegard, Calvin H.; Schmidt, Andrew J.; Sell, Rachel L.; Silvers, Kurt L.; Gano, Susan R.; Thornton, Brenda M.


    This report describes work to examine the gas generation behavior of actual K East (KE) Basin floor, pit and canister sludge. Mixed and unmixed and fractionated KE canister sludge were tested, along with floor and pit sludges from areas in the KE Basin not previously sampled. The first report in this series focused on gas generation from KE floor and canister sludge collected using a consolidated sampling technique. The third report will present results of gas generation testing of irradiated uranium fuel fragments with and without sludge addition. The path forward for management of the K Basin Sludge is to retrieve, ship, and store the sludge at T Plant until final processing at some future date. Gas generation will impact the designs and costs of systems associated with retrieval, transportation and storage of sludge.

  16. Electromagnetic compatibility (EMC) standard test chamber upgrade requirements for spacecraft design verification tests (United States)

    Dyer, Edward F.


    In view of the serious performance deficiencies inherent in conventional modular and welded shielding EMC test enclosures, in which multipath reflections and resonant standing waves can damage flight hardware during RF susceptibility tests, NASA-Goddard has undertaken the modification of a 20 x 24 ft modular-shielded enclosure through installation of steel panels to which ferrite tiles will be mounted with epoxy. The internally reflected RF energy will thereby be absorbed, and exterior power-line noise will be reduced. Isolation of power-line filters and control of 60-Hz ground connections will also be undertaken in the course of upgrading.

  17. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project (United States)

    National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...


    The overall objective of the Environmental Testing and Verification Coatings and Coating Equipment Program is to verify pollution prevention and performance characteristics of coating technologies and make the results of the testing available to prospective coating technology use...

  19. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío


    XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define which tool to embed in XJML to execute FSPV.

  20. Validation and Verification (V and V) Testing on Midscale Flame Resistant (FR) Test Method (United States)


    and M. Grady, “High Intensity Thermal Testing of Protective Fabrics with CO2 Laser”, STP1593 on the Tenth Symposium on Performance of Protective...transmitted heat flux through a wide range of fabrics using the CO2 laser [5] has indicated that the thickness of the air gap has a very strong effect...when alternate measurements of performance such as predicted depth of burn, transmitted fluence and Energy Transmission Factor (ETF) are used to assess

  1. Ames Research Center Mars/Pathfinder Heat Shield Design Verification ARC-JET Test (United States)

    Tran, Huy K.; Hui, Frank; Wercinski, Paul; Cartledge, Alan; Tauber, Mike; Tran, Duoc T.; Chen, Y. K.; Arnold, James O. (Technical Monitor)


    Design verification tests were performed on samples representing the aerobrake of the Mars/Pathfinder vehicle. The test specimens consisted of the SLA-561V ablator bonded to the honeycomb structure. The primary objective was to evaluate the ablation materials performance and to measure temperatures within the ablator, at the structural bondline and at the back sheet of the honeycomb structure. Other objectives were to evaluate the effect of ablative repair plug material treatment and voids in the heat shield. A total of 29 models were provided for testing in the Ames 60MW arc-jet facility. Of these, 23 models were flat-faced and six remaining models were curved edge ones, intended to simulate the conditions on the curved rim of the forebody where the maximum shear occurred. Eight sets of test conditions were used. The stagnation point heating rates varied from 47 to 240 W/cm2 and the stagnation pressures from 0.15 to 0.27 atm. (The maximum flight values are 132 W/cm2 and 0.25 atm) The majority of these runs were made at a nominal stagnation pressure of 0.25 atm. Two higher pressure runs were made to check the current (denser) ablation material for spallation, or other forms of thermal stress failure. Over 60% of the flatfaced models yielded good thermocouple data and all produced useful surface recession information. Of the five curved-edge models that were tested, only one gave good data; the remaining ones experienced model-holder failure. The test results can be summarized by noting that no failure of the ablative material was observed on any model. Also, the bondline temperature design limit of 250 C was never reached within an equivalent flight time despite a stagnation point heat load that exceeded the maximum flight value by up to 130%. At heating rates of over 200W/cm2 and stagnation pressures of 0.25 atm, or greater, the average surface recessions exceeded 0.5 cm on some models. 
The surface roughness increased dramatically at pressures above 0.25 atm.

  2. Absolute homogeneity test of Kelantan catchment precipitation series (United States)

    Ros, Faizah Che; Tosaka, Hiroyuki; Sasaki, Kenji; Sidek, Lariyah Mohd; Basri, Hidayah


    Along the Kelantan River in the north-east of Peninsular Malaysia, several areas are damaged by flood during the north-east monsoon season every year. It is vital to predict the expected behavior of precipitation and river runoff in order to reduce flood damage in an area under rapid urbanization and to support future planning. Nevertheless, the accuracy and reliability of any hydrological and climate study depend on the quality of the data used. Factors such as the method of gauging and data collection, the station environment, station relocation, and the reliability of the measurement tools all affect the homogeneity of precipitation records. Hence, in this study, the homogeneity of long precipitation data series is checked via absolute homogeneity testing, consisting of four methods: the Pettitt test, the standard normal homogeneity test (SNHT), the Buishand range test, and the Von Neumann ratio test. The annual rainfall amounts derived from the daily precipitation records at stations located in Kelantan and operated by the Department of Irrigation and Drainage Malaysia were considered. Missing values were completed using correlation and regression and the inverse distance method. The data network consists of 103 precipitation gauging stations, of which 31 are inactive, 6 had missing precipitation values for more than five consecutive years, and 16 have records shorter than twenty years; in total, 50 gauging stations were evaluated in this analysis. With the application of the above methods and further graphical analysis, inhomogeneity was detected at 4 stations, and the remaining 46 stations were found to be homogeneous.
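Two of the four absolute homogeneity tests named above have simple closed forms and can be sketched in pure Python (an illustrative sketch, not the study's code; function names are ours):

```python
import math

def pettitt_test(x):
    """Pettitt's non-parametric change-point test. Returns the index of the
    last point before the most probable break, the K statistic, and the
    approximate two-sided p-value."""
    n = len(x)
    sign = lambda d: (d > 0) - (d < 0)
    # U_k compares every value up to k with every value after k
    U = [sum(sign(x[i] - x[j]) for i in range(k + 1) for j in range(k + 1, n))
         for k in range(n - 1)]
    K = max(abs(u) for u in U)
    k_hat = U.index(max(U, key=abs))
    p = min(1.0, 2.0 * math.exp(-6.0 * K * K / (n ** 3 + n ** 2)))
    return k_hat, K, p

def von_neumann_ratio(x):
    """Mean squared successive difference over the variance; close to 2 for
    a homogeneous series, well below 2 when a break is present."""
    n = len(x)
    mean = sum(x) / n
    return (sum((x[i] - x[i + 1]) ** 2 for i in range(n - 1))
            / sum((v - mean) ** 2 for v in x))
```

For an artificial series with a level shift, such as ten zeros followed by ten fives, `pettitt_test` locates the break after index 9 with a small p-value, and the Von Neumann ratio drops far below 2.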

  3. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)]


    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction, based on the streamline curvature method, is tested against a model axial-flow pump. The preliminary design is made using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free-vortex design method. The detailed blading design is then carried out using an experimental database of double-circular-arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity, and stagger angle, a number of correlation equations are developed from the experimental database, and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. An experimental measurement is conducted under a non-cavitating condition to obtain the off-design performance curve, and a cavitation test is also carried out by reducing the suction pressure. The experimental results compare very satisfactorily with the predictions of the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
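The free-vortex design step mentioned above can be sketched numerically (a toy illustration under our own assumptions, not the authors' code): swirl obeys c_u · r = const with uniform axial velocity, so the ideal (Euler) head is the same at every radius while the blade angles twist from hub to tip.

```python
import math

def blade_angles(r, r_mid, cu_mid, cx, omega):
    """Relative flow angles (degrees from axial) at radius r for a
    free-vortex design: no inlet pre-swirl, exit swirl cu ~ 1/r."""
    cu = cu_mid * r_mid / r        # free-vortex swirl distribution
    U = omega * r                  # local blade speed
    beta_in = math.degrees(math.atan2(U, cx))
    beta_out = math.degrees(math.atan2(U - cu, cx))
    return beta_in, beta_out

def euler_head(r, r_mid, cu_mid, omega, g=9.81):
    """Ideal Euler head rise; independent of radius by construction,
    since omega * r * cu is constant along the span."""
    return omega * r * (cu_mid * r_mid / r) / g
```

Evaluating the two functions at a hub and a tip radius shows the characteristic blade twist (larger stagger toward the tip) with an identical head at both stations.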

  4. Ecotoxicological test systems proceedings of a series of workshops

    Energy Technology Data Exchange (ETDEWEB)

    Hammons, A.S. (ed.)


    A series of six workshops was conducted by the Environmental Sciences Division, Oak Ridge National Laboratory, to identify laboratory methods and data evaluation techniques for predicting the environmental effects of chemical substances. Methods were evaluated for their potential for standardization and for use in the ecological hazard and risk assessment processes under the Toxic Substances Control Act. The workshops addressed assessment and policy requirements of multispecies toxicology test procedures, mathematical models useful in hazard and risk assessments, and methods for measuring effects of chemicals on terrestrial and aquatic population interactions and ecosystem properties. The workshops were primarily used as a mechanism to gather information about research in progress. This information was part of the data base used to prepare a critical review of laboratory methods for ecological toxicology.

  5. Some New Verification Test Problems for Multimaterial Diffusion on Meshes that are Non-Aligned with Material Boundaries

    Energy Technology Data Exchange (ETDEWEB)

    Dawes, Alan Sidney [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Malone, Christopher M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Shashkov, Mikhail Jurievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]


    In this report a number of new verification test problems for multimaterial diffusion will be shown. Using them we will show that homogenization of multimaterial cells in either Arbitrary Lagrangian Eulerian (ALE) or Eulerian simulations can lead to errors in the energy flow at the interfaces. Results will be presented that show that significant improvements and predictive capability can be gained by using either a surrogate supermesh, such as Thin Mesh in FLAG, or the emerging method based on Static Condensation.
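The homogenization error the report targets can be illustrated with a one-dimensional analogue (our own toy example, not taken from the report): for diffusion through two material slabs in series, the exact effective conductivity is the harmonic mean, whereas naive volume-weighted (arithmetic) homogenization of a mixed cell overestimates the energy flow across the interface whenever the materials differ.

```python
def arithmetic_mean_k(k1, k2, f1=0.5):
    """Naive volume-weighted homogenization of a mixed cell."""
    return f1 * k1 + (1.0 - f1) * k2

def harmonic_mean_k(k1, k2, f1=0.5):
    """Exact effective conductivity for slabs in series (flux-preserving)."""
    return 1.0 / (f1 / k1 + (1.0 - f1) / k2)
```

With k1 = 1 and k2 = 100, the arithmetic mean gives 50.5 while the series-exact harmonic mean is about 1.98, an overestimate of the through-flux by a factor of roughly 25.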

  6. Science Library of Test Items. Volume Sixteen. Mastery Testing Program. Series 6. Tests M66-M91. (United States)

    New South Wales Dept. of Education, Sydney (Australia).

    As part of a series of tests to measure mastery of specific skills in the natural sciences, print masters of tests 66 through 91 are provided. Among the areas covered are: carbon compounds; evolution; map reading; genetics; energy; chemical formulae; electricity; graphs; metric measures; solubility; and physical separations. Many tests contain…

  7. Development and verification testing of automation and robotics for assembly of space structures (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.


    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  8. Raven's progressive matrices test: scale construction and verification of "Flynn effect"


    Lopetegui, María Susana; Neer, Rosa Haydée; Rossi Casé, Lilia Elba


    In this paper, the scales of Raven's Progressive Matrices Test, General Scale and Advanced Scale, Series II, for the student population (third cycle of EGB and Polimodal) in the city of La Plata are presented. Considerations are made as regards both the increase in scores (Flynn effect) observed in relation to the previous scale (1964) and the different mean scores according to two age groups (13-16 and 17-18 years of age) and education mode. The findings enabled inferences related to the si...

  9. Achievement of VO2max criteria during a continuous graded exercise test and a verification stage performed by college athletes. (United States)

    Mier, Constance M; Alexander, Ryan P; Mageean, Amanda L


    The purpose of this study was to determine the incidence of meeting specific VO2max criteria and to test the effectiveness of a VO2max verification stage in college athletes. Thirty-five subjects completed a continuous graded exercise test (GXT) to volitional exhaustion. The frequency of achieving various respiratory exchange ratio (RER) and age-predicted maximum heart rate (HRmax) criteria, and of a VO2 plateau within 2 and 2.2 ml·kg(-1)·min(-1), was determined. The number of subjects achieving a VO2max plateau was 5 (≤2 ml·kg(-1)·min(-1)) and 7 (≤2.2 ml·kg(-1)·min(-1)); the RER criteria, 34 (≥1.05), 32 (≥1.10), and 24 (≥1.15); and the HRmax criterion, 35. VO2max and HRmax did not differ between the GXT and the verification stage (53.6 ± 5.6 vs. 55.5 ± 5.6 ml·kg(-1)·min(-1) and 187 ± 7 vs. 187 ± 6 b·min(-1)); however, the RER was lower during the verification stage (1.15 ± 0.06 vs. 1.07 ± 0.07, p = 0.004). Six subjects achieved a similar VO2 (within 2.2 ml·kg(-1)·min(-1)), whereas 4 achieved a higher VO2 compared with the GXT. These data demonstrate that a continuous GXT limits the college athlete's ability to achieve a VO2max plateau and certain RER and HR criteria. The use of a verification stage increases the frequency of VO2max achievement and may be an effective method to improve the accuracy of VO2max measurements in college athletes.
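The plateau and RER criteria discussed above can be written as simple screening predicates (a sketch with thresholds taken from the abstract; the function names are ours, and real protocols compare stage-averaged rather than instantaneous values):

```python
def vo2_plateau(stage_vo2, threshold=2.2):
    """Plateau criterion: the rise in VO2 (ml/kg/min) between the last
    two completed stages falls below the chosen threshold."""
    return (stage_vo2[-1] - stage_vo2[-2]) < threshold

def meets_rer(peak_rer, cutoff=1.10):
    """Secondary criterion: peak respiratory exchange ratio at or above
    the cutoff (the abstract examines 1.05, 1.10 and 1.15)."""
    return peak_rer >= cutoff
```

A subject whose VO2 rose only 1.5 ml/kg/min over the final stage would satisfy either plateau threshold, while a 5 ml/kg/min rise satisfies neither.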

  10. Formal verification and testing: An integrated approach to validating Ada programs (United States)

    Cohen, Norman H.


    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  11. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen... (United States)


    ... 9 Animals and Animal Products ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... controls sufficient to prevent fecal contamination. FSIS shall take further action as appropriate to ensure...

  12. Environmental assessment of general-purpose heat source safety verification testing

    Energy Technology Data Exchange (ETDEWEB)



    This Environmental Assessment (EA) was prepared to identify and evaluate potential environmental, safety, and health impacts associated with the Proposed Action to test General-Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) assemblies at the Sandia National Laboratories (SNL) 10,000-Foot Sled Track Facility, Albuquerque, New Mexico. RTGs are used to provide a reliable source of electrical power on board some spacecraft when solar power is inadequate during long duration space missions. These units are designed to convert heat from the natural decay of radioisotope fuel into electrical power. Impact test data are required to support DOE's mission to provide radioisotope power systems to NASA and other user agencies. The proposed tests will expand the available safety database regarding RTG performance under postulated accident conditions. Direct observations and measurements of GPHS/RTG performance upon impact with hard, unyielding surfaces are required to verify model predictions and to ensure the continual evolution of the RTG designs that perform safely under varied accident environments. The Proposed Action is to conduct impact testing of RTG sections containing GPHS modules with simulated fuel. End-On and Side-On impact test series are planned.

  13. Baghouse filtration products verification

    Energy Technology Data Exchange (ETDEWEB)

    Mycock, J.C.; Turner, J.H.; VanOsdell, D.W.; Farmer, J.R.; Brna, T.G.


    The paper introduces EPA's Air Pollution Control Technology Verification (APCT) program and then focuses on the immediate objective of the program: laboratory performance verification of cleanable filter media intended for the control of fine particulate emissions. Data collected during the laboratory verification testing, which simulates operation in full-scale fabric filters, will be used to show expected performance for collection of particles ≤ 2.5 micrometers in diameter.

  14. Current Status of Aerosol Generation and Measurement Facilities for the Verification Test of Containment Filtered Venting System in KAERI

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Il; An, Sang Mo; Ha, Kwang Soon; Kim, Hwan Yeol [KAERI, Daejeon (Korea, Republic of)]


    In this study, the design of the aerosol generation and measurement systems is explained, present circumstances are described, and the aerosol test plan is shown. The Containment Filtered Venting System (CFVS) is one of the safety features used to reduce the amount of fission products released into the environment by depressurizing the containment. Since the Chernobyl accident, regulatory agencies in several European countries, such as France, Germany, and Sweden, have demanded the installation of the CFVS, and a feasibility study on the CFVS was also performed in the U.S. After the Fukushima accident, there is a need in Korea to improve containment venting or to install a depressurization facility. As part of a Ministry of Trade, Industry and Energy (MOTIE) project, KAERI has been conducting an integrated performance verification test of the CFVS. As part of the test, aerosol generation and measurement systems were designed and manufactured to simulate fission product behavior. The component operating conditions were determined in consideration of severe accident conditions. The tests will first be performed under normal conditions and then under severe conditions of high pressure and high temperature. Difficulties that could disturb the tests are expected, such as thermophoresis in the piping and vapor condensation on the aerosol.

  15. Patch testing with hair cosmetic series in Europe

    DEFF Research Database (Denmark)

    Uter, Wolfgang; Bensefa-Colas, Lynda; Frosch, Peter


    The aim of the present review was to collect information on the current practice of using 'hair cosmetic series', and to discuss this against the background of evidence concerning consumer/professional exposure and regulatory aspects (Annex II of the Cosmetics Regulation), to finally derive a recommendation for a 'European hair cosmetic series'. An up-to-date 'European hair cosmetics series', as recommended in the present article, should (i) include broadly used and/or potent contact allergens, (ii) eliminate substances of only historical concern, and (iii) be continually updated as new evidence emerges.

  16. [Paracoagulation tests in laboratory verification of intravascular coagulation. Comparative clinical evaluation of the SDPS test and of the ethanol test with serial dilution (SD ethanol test)]. (United States)

    Doni, A; De Simonis, S E; Pasquale, G


    the patients. In this second series of 314 patients, the two tests were both positive in 138 cases (43.9%) and both non-positive in 33 cases (10.5%); the S.D.P.S. test was positive alone in 108 cases (34.3%), and the S.D.E.G. test was positive alone in 35 cases (11.3%). Thus, in this second series the S.D.P.S. test remains the more sensitive, showing positivity in 246 cases (78.2%), whereas the S.D.E.G. test was positive in 173 cases (55.2%). In conclusion, applying serial dilutions to the Ethanol Gelation test as well increases its sensitivity (from 29.5% to 55.2%), but it does not reach that of the S.D.P.S. test. The S.D.E.G. test may be genuinely useful in clinical practice because it gives information within 10 min, but a non-positive result must be supported by the S.D.P.S. test, which, on the other hand, gives reliable information only after 24 hours.

  17. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Phyllis C.


    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.

  18. Experimental verification and stability state space analysis of CLL-T series parallel resonant converter with fuzzy controller (United States)

    Nagarajan, Chinnadurai; Madheswaran, Muthusamy


    This paper presents a closed-loop CLL-T (capacitor-inductor-inductor) series-parallel resonant converter (SPRC); the converter has been simulated and its performance analyzed. A three-element CLL-T SPRC working under load-independent operation (voltage-type and current-type load) is presented. The stability and AC analysis of the CLL-T SPRC has been developed using the state-space technique, and the output voltage is regulated by a fuzzy controller. The simulation study indicates the superiority of fuzzy control over conventional control methods, and the proposed approach is expected to provide better voltage regulation under dynamic load conditions. A prototype 300 W, 100 kHz converter was designed and built for experimental demonstration, and the dynamic and steady-state performance of the CLL-T SPRC is compared with the simulation studies.
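The state-space stability analysis mentioned above can be illustrated generically (a toy second-order series-RLC state model of our own, not the authors' CLL-T model): the system is stable when every eigenvalue of the state matrix A lies strictly in the left half-plane.

```python
import cmath

def rlc_state_matrix(R, L, C):
    """States: inductor current and capacitor voltage of a series RLC
    branch -- a stand-in for the full resonant-tank model."""
    return [[-R / L, -1.0 / L],
            [1.0 / C, 0.0]]

def eigenvalues_2x2(A):
    """Closed-form eigenvalues from the trace and determinant."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4.0 * det)
    return (tr + disc) / 2.0, (tr - disc) / 2.0

def is_stable(A):
    """Hurwitz criterion: all eigenvalues have negative real parts."""
    return all(lam.real < 0 for lam in eigenvalues_2x2(A))
```

A damped tank (R > 0) is stable, whereas the lossless limit (R = 0) places the eigenvalues on the imaginary axis and fails the strict criterion.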

  19. Series paralelas al Rorschach: validación en nuestro medio de la serie de Parisi-Pes y del Test de Zulliger Rorschach parallel series: local validation of the Parisi-Pes series and the Z Test

    Directory of Open Access Journals (Sweden)

    Ana María Núñez


    This article stems from the need to validate, in our own setting, parallel series to the Rorschach Test, in order to be able to replace it when required. The increasing dissemination of this technique outside the psychological community can lead to a learning effect that may prevent the psychodiagnostic tool from being used. This article reviews the different series proposed as parallel to the Rorschach Test and presents the results of two studies: the first concerns the Parisi-Pes series, created by the Roman Rorschach School, little known locally but validated in a setting with sociocultural characteristics similar to ours (Project UBACyT P039); the second concerns the Zulliger Test (Z Test), frequently used in occupational psychology, in both its individual and group versions (Project UBACyT P005).

  20. Breaks in MODIS time series portend vegetation change: verification using long-term data in an arid grassland ecosystem. (United States)

    Browning, Dawn M; Maynard, Jonathan J; Karl, Jason W; Peters, Debra C


    Frequency and severity of extreme climatic events are forecast to increase in the 21st century. Predicting how managed ecosystems may respond to climatic extremes is complicated by uncertainty associated with knowing when, where, and how long the effects of extreme events will be manifest in an ecosystem. In water-limited ecosystems with high inter-annual variability in rainfall, it is important to be able to distinguish responses that result from seasonal fluctuations in rainfall from long-term directional increases or decreases in precipitation. A tool that successfully distinguishes seasonal from directional biomass responses would allow land managers to make informed decisions about prioritizing mitigation strategies, allocating human resource monitoring efforts, and mobilizing resources to withstand extreme climatic events. We leveraged long-term observations (2000-2013) of quadrat-level plant biomass at multiple locations across a semiarid landscape in southern New Mexico to verify the use of Normalized Difference Vegetation Index (NDVI) time series derived from 250-m Moderate Resolution Imaging Spectroradiometer (MODIS) data as a proxy for changes in aboveground productivity. This period encompassed years of sustained drought (2000-2003) and record-breaking high rainfall (2006 and 2008) followed by subsequent drought years (2011 through 2013) that resulted in a restructuring of plant community composition in some locations. Our objective was to decompose vegetation patterns derived from MODIS NDVI over this period into contributions from (1) the long-term trend, (2) seasonal cycle, and (3) unexplained variance using the Breaks for Additive Season and Trend (BFAST) model. BFAST breakpoints in NDVI trend and seasonal components were verified with field-estimated biomass at 15 sites that differed in species richness, vegetation cover, and soil properties. We found that 34 of 45 breaks in NDVI trend reflected large changes in mean biomass and 16 of 19 seasonal
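A much-simplified sketch of the BFAST idea (additive seasonal adjustment followed by a single least-squares break search; the real BFAST fits piecewise linear trends with multiple breaks, and these helper names are ours):

```python
def deseasonalize(x, period):
    """Remove a mean seasonal cycle (additive model), returning the
    trend-plus-remainder component."""
    seas = [sum(x[i::period]) / len(x[i::period]) for i in range(period)]
    centre = sum(seas) / period
    seas = [s - centre for s in seas]      # centre the seasonal cycle
    return [v - seas[i % period] for i, v in enumerate(x)]

def single_break(x):
    """Break index minimising the residual sum of squares of a
    piecewise-constant (two-segment mean) fit."""
    def sse(seg):
        mu = sum(seg) / len(seg)
        return sum((v - mu) ** 2 for v in seg)
    return min(range(2, len(x) - 1), key=lambda k: sse(x[:k]) + sse(x[k:]))
```

On a synthetic series with a repeating seasonal pattern and an abrupt level shift, the break search applied to the deseasonalized component recovers the shift location exactly.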

  1. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)


    We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons.
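The two statistics the abstract pairs can be sketched as follows (a plain Lo-MacKinlay-style variance ratio without the bias and heteroskedasticity corrections used in practice, plus an absolute-moment analogue; the function names are ours):

```python
def variance_ratio(returns, q):
    """Variance of overlapping q-period returns over q times the
    one-period variance; approximately 1 for a random walk, below 1
    under negative serial dependence (mean reversion)."""
    n = len(returns)
    mu = sum(returns) / n
    var1 = sum((r - mu) ** 2 for r in returns) / (n - 1)
    rq = [sum(returns[i:i + q]) for i in range(n - q + 1)]
    varq = sum((r - q * mu) ** 2 for r in rq) / (len(rq) - 1)
    return varq / (q * var1)

def abs_moment_ratio(returns, q):
    """Absolute-moment analogue: E|r(q)| / (sqrt(q) * E|r|)."""
    n = len(returns)
    rq = [sum(returns[i:i + q]) for i in range(n - q + 1)]
    return (sum(abs(r) for r in rq) / len(rq)) / (
        q ** 0.5 * sum(abs(r) for r in returns) / n)
```

For i.i.d. returns both ratios sit near 1; a strongly alternating (negatively dependent) series drives the variance ratio toward 0.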

  2. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others]


    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  3. Environmental Technology Verification: Baghouse filtration products--W.L. Gore & Associates L3650 filtration media (tested November--December 2009) (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  4. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6277 Filtration Media (Tested March 2011) (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  5. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6262 Filtration Media (Tested March 2011) (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  6. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6282 Filtration Media (Tested March - April 2011) (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  7. Verification and standardization of blood cell counters for routine clinical laboratory tests. (United States)

    Verbrugge, Sue Ellen; Huisman, Albert


    The use of automated blood cell counters (automated hematology analyzers) for diagnostic purposes is inextricably linked to clinical laboratories. However, the need for uniformity among the various methods and parameters is increasing and standardization of the automated analyzers is therefore crucial. Standardization not only involves procedures based on reference methods but it also involves validation, verification, quality assurance, and quality control, and it includes the involvement of several participants. This article discusses the expert guidelines and provides an overview of issues involved in complete blood count parameter reference methods and standardization of reporting units. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results (United States)

    Burken, John J.; Larson, Richard R.


    F-15 IFCS project goals are: (a) demonstrate control approaches that can efficiently optimize aircraft performance in both normal and failure conditions ([A] and [B] failures); (b) advance neural-network-based flight control technology for new aerospace system designs with a pilot in the loop. Gen II objectives include: (a) implement and fly a direct adaptive neural-network-based flight controller; (b) demonstrate the ability of the system to adapt to simulated system failures: (1) suppress transients associated with failure, (2) re-establish sufficient control and handling of the vehicle for safe recovery; (c) provide flight experience for the development of verification and validation processes for flight-critical neural network software.

  9. Verification test for three WindCube WLS7 LiDARs at the Høvsøre test site

    DEFF Research Database (Denmark)

    Gottschall, Julia; Courtney, Michael

    The report describes the procedure of testing ground-based WindCube lidars (manufactured by the French company Leosphere) at the Høvsøre test site in comparison with reference sensors mounted on a meteorological mast. Results are presented for three tested units – in detail for unit WLS7-0062, and in summary for units WLS7-0064 and WLS7-0066. The verification test covers the evaluation of measured mean wind speeds, wind directions, and wind speed standard deviations. The data analysis is basically performed in terms of different kinds of regression analyses.

  10. Verification of consumers' experiences and perceptions of genetic discrimination and its impact on utilization of genetic testing. (United States)

    Barlow-Stewart, Kristine; Taylor, Sandra D; Treloar, Susan A; Stranger, Mark; Otlowski, Margaret


    To undertake a systematic process of verification of consumer accounts of alleged genetic discrimination. Verification of incidents reported in life insurance and other contexts that met the criteria of genetic discrimination, and of the impact of fear of such treatment, was undertaken, with consent, through interview, document analysis and, where appropriate, direct contact with the third party involved. The process comprised obtaining evidence that the alleged incident was accurately reported and determining whether the decision or action seemed to be justifiable and/or ethical. Reported incidents of genetic discrimination were verified in life insurance access, underwriting and coercion (9), applications for worker's compensation (1) and early release from prison (1), and in two cases of fear of discrimination impacting on access to genetic testing. The relevant conditions were inherited cancer susceptibility (8), Huntington disease (3), hereditary hemochromatosis (1), and polycystic kidney disease (1). In two cases, the reversal of an adverse underwriting decision to standard rate after intervention with insurers by genetics health professionals was verified. The mismatch between consumer and third-party accounts in three life insurance incidents involved miscommunication or a lack of information provision by financial advisers. These first cases of verified genetic discrimination make it essential for policies and guidelines to be developed and implemented to ensure appropriate use of genetic test results in insurance underwriting, to promote education and training in the financial industry, and to provide support for consumers and health professionals undertaking challenges of adverse decisions.

  11. Irradiation effects test series, test IE-5. Test results report. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Croucher, D. W.; Yackle, T. R.; Allison, C. M.; Ploger, S. A.


    Test IE-5, conducted in the Power Burst Facility at the Idaho National Engineering Laboratory, employed three 0.97-m-long pressurized water reactor type fuel rods, fabricated from previously irradiated zircaloy-4 cladding, and one similar rod fabricated from unirradiated cladding. The objectives of the test were to evaluate the influence of simulated fission products, cladding irradiation damage, and fuel rod internal pressure on pellet-cladding interaction during a power ramp and on fuel rod behavior during film boiling operation. The four rods were subjected to a preconditioning period, a power ramp to an average fuel rod peak power of 65 kW/m, and steady state operation for one hour at a coolant mass flux of 4880 kg/(s·m²) for each rod. After a flow reduction to 1800 kg/(s·m²), film boiling occurred on one rod. Additional flow reductions to 970 kg/(s·m²) produced film boiling on the three remaining fuel rods. The maximum time in film boiling was 80 s. The rod having the highest initial internal pressure (8.3 MPa) failed 10 s after the onset of film boiling. A second rod failed about 90 s after reactor shutdown. The report contains a description of the experiment, the test conduct, the test results, and results from the preliminary postirradiation examination. Calculations using a transient fuel rod behavior code are compared with the test results.

  12. Testing Non-Stationarity in Selected Macroeconomic Series from ...

    African Journals Online (AJOL)

    The study tested for stationarity in a selected set of macroeconomic variables (some constructed) from Sudan over the period 1969 to 1998. Augmented Dickey-Fuller tests were employed to test for the presence of unit roots. The study found that unit roots existed in most variables, namely, private investment, public investment, real ...
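The Dickey-Fuller family of tests regresses the first difference of a series on its lagged level; a unit root is rejected when the t-statistic on the lagged level falls below a negative critical value (roughly -2.86 at the 5% level for the constant-only case). As a hedged illustration (not the study's own code, and non-augmented), the core regression can be sketched in pure Python:

```python
def dickey_fuller(y):
    """Basic (non-augmented) Dickey-Fuller test with a constant.

    Regresses dy_t = a + b*y_{t-1} + e_t by OLS and returns (b, t_stat),
    where t_stat is the t-statistic of b. A strongly negative t_stat is
    evidence against the unit-root (non-stationarity) hypothesis.
    """
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    x = y[:-1]                                  # lagged levels
    n = len(dy)
    mx, md = sum(x) / n, sum(dy) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (di - md) for xi, di in zip(x, dy))
    b = sxy / sxx                               # OLS slope
    a = md - b * mx                             # OLS intercept
    resid = [di - (a + b * xi) for xi, di in zip(x, dy)]
    s2 = sum(r * r for r in resid) / (n - 2)    # residual variance
    t_stat = b / (s2 / sxx) ** 0.5
    return b, t_stat

# A strongly mean-reverting (stationary) toy series: the test rejects a unit root.
y = [0.0]
for t in range(40):
    y.append(0.5 * y[-1] + (1.0 if t % 2 == 0 else -1.0))
b, t_stat = dickey_fuller(y)
```

In practice one would use the augmented version with lagged difference terms and proper critical values (e.g. statsmodels' `adfuller`); the sketch above shows only the core regression.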

  13. Evaluation of suspected cosmetic induced facial dermatoses with the use of Indian standard series and cosmetic series patch test. (United States)

    Rastogi, Madhur Kant; Gupta, Astha; Soodan, Puneet Singh; Mishra, Nitin; Gahalaut, Pratik


    Awareness of skin beauty and cosmetic elegance has received worldwide attention in today's youth-oriented society. Along with a careful, detailed history and thorough examination, the patch test is considered the cornerstone of the diagnosis of allergic contact dermatitis. Fifty patients with a suspected clinical diagnosis of facial contact dermatitis due to cosmetics who attended the Department of Dermatology were included in this hospital-based study. The patch test was applied on the upper back using the 32 allergens of the Indian cosmetic series and the 20 known allergens of the Indian standard battery series, procured from Systopic Pharmaceutical Ltd. After application of the patch test, patients returned at 48 h and 72 h for reading of the results. Of the 50 patients, 32 (64%) were female (36% housewives) and 18 (36%) male (12% farmers). Itching was the most common presenting symptom, in 39 patients (78%); the least common were hypopigmentation and pain (2% each). The forehead was the most common site of involvement, in 25 patients (50%); the least common were the cheeks, in 15 patients (30%). Erythema was the commonest morphological presentation, seen in 36 patients (72%). Hair dye was the suspected cause in the largest number of patients, 13 (26%). The most common antigen showing patch test positivity was paraphenylenediamine, in nine patients (18%). Positive test reactions were significantly more frequent with the Indian standard series than with the cosmetic series (p=.0053, Fisher's exact test). In India, unlike western countries, there is no legislation requiring the labelling of ingredients on cosmetics, so identifying the causative ingredient remains the main challenge in cosmetic dermatitis.

  14. Ceftriaxone Induced Hypersensitivity Reactions Following Intradermal Skin Test: Case Series

    Directory of Open Access Journals (Sweden)

    Sereen Rose Thomson


    Full Text Available The incidence of cephalosporin-induced hypersensitivity reactions in non-penicillin-allergic patients is about 1.7%, and in penicillin-allergic patients it is about 3-5%. In fact, cephalosporins are considered the first choice in penicillin-allergic patients who need antibiotic therapy intraoperatively. Prompt identification of patients with beta-lactam allergy would lead to improved utilization of antibiotics and reduced occurrence of resistant strains. We hereby present a series of cases in which ceftriaxone has been implicated in the manifestation of various hypersensitivity reactions. We also highlight some of the errors, risk factors and other drugs that can precipitate a hypersensitivity reaction.

  15. Air Force Research Laboratory Test and Evaluation, Verification and Validation of Autonomous Systems Challenge Exploration (United States)


    S and T&E. The results from these methods can be recorded in a modular fashion, enabling compositional verification of autonomous subcomponents at...Behcet Acikmese  UTEXAS  Darryl Ahner  AFIT  Nick Armstrong‐Crews  MIT  Dionisio de Niz  SEI/CMU  Georgios Fainekos  ASU   Karen Feigh  GATECH  Naira...problem is the upfront design process and the  system itself.  • Modularization  needs to be pushed more and further.  How will things connect and work

  16. Mixed Portmanteau Test for Diagnostic Checking of Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand


    Full Text Available Model criticism is an important stage of model building, and goodness-of-fit tests provide a set of tools for diagnostic checking of the fitted model. Several such tests are suggested in the literature. These tests use autocorrelations or partial autocorrelations of the residuals to assess the adequacy of the fitted model; the main idea underlying these portmanteau tests is to identify any dependence structure which is yet unexplained by the fitted model. In this paper, we suggest mixed portmanteau tests based on both the autocorrelation and partial autocorrelation functions of the residuals. We derive the asymptotic distribution of the mixed test and study its size and power using Monte Carlo simulations.
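The classic portmanteau statistic of this kind is the Ljung-Box Q, which pools squared residual autocorrelations over the first m lags and is referred to a chi-square distribution. A minimal sketch of that standard statistic (illustrative only; the authors' mixed test additionally uses partial autocorrelations):

```python
def acf(x, k):
    """Lag-k sample autocorrelation of a series x."""
    n, m = len(x), sum(x) / len(x)
    num = sum((x[t] - m) * (x[t - k] - m) for t in range(k, n))
    den = sum((xi - m) ** 2 for xi in x)
    return num / den

def ljung_box(resid, m):
    """Ljung-Box portmanteau statistic Q over lags 1..m.

    Under an adequate ARMA(p, q) fit, Q is approximately chi-square
    distributed with m - p - q degrees of freedom; a large Q signals
    residual dependence left unexplained by the model.
    """
    n = len(resid)
    return n * (n + 2) * sum(acf(resid, k) ** 2 / (n - k)
                             for k in range(1, m + 1))

# Strongly alternating residuals leave obvious lag-1 dependence:
resid = [1.0, -1.0] * 10
r1 = acf(resid, 1)          # -0.95
q1 = ljung_box(resid, 1)    # 20.9, far beyond any chi-square(1) cutoff
```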

  17. Verification Test of the SURF and SURFplus Models in xRage: Part III Effect of Mesh Alignment

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    The previous studies used an underdriven detonation wave in one dimension (a steady ZND reaction zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a verification test of the implementation of the SURF and SURFplus models in the xRage code. Since the SURF rate is a function of the lead shock pressure, the question arises as to the effect on accuracy of variations in the detected shock pressure due to the alignment of the shock front with the mesh. To study the effect of mesh alignment, we simulate a cylindrically diverging detonation wave using a planar 2-D mesh. The leading issue is the magnitude of azimuthal asymmetries in the numerical solution. The 2-D test case does not have an exact analytic solution. To quantify the accuracy, the 2-D solution along rays through the origin is compared to a highly resolved 1-D simulation in cylindrical geometry.

  18. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Jianwei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); De Baere, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Vaccaro, S. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Schwalbach, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Liljenfeldt, Henrik [Swedish Nuclear Fuel and Waste Management Company (Sweden); Tobin, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  19. Comparing entropy with tests for randomness as a measure of complexity in time series

    CERN Document Server

    Gan, Chee Chun


    Entropy measures have become increasingly popular as an evaluation metric for complexity in the analysis of time series data, especially in physiology and medicine. Entropy measures the rate of information gain, or degree of regularity, in a time series, e.g., a heartbeat. Ideally, entropy should be able to quantify the complexity of any underlying structure in the series, as well as determine whether the variation arises from a random process. Unfortunately, most current entropy measures are unable to perform the latter differentiation. Thus, a high entropy score indicates a random or chaotic series, whereas a low score indicates a high degree of regularity. This leads to the observation that current entropy measures are equivalent to evaluating how random a series is, or conversely the degree of regularity in a time series. This raises the possibility that existing tests for randomness, such as the runs test or permutation test, may have similar utility in diagnosing certain conditions. This paper compares various t...
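The runs test mentioned above is easy to state: count maximal runs of observations above and below the median, and compare that count to what independence would predict. A minimal sketch, assuming ties at the median are simply dropped:

```python
def runs_test(x):
    """Wald-Wolfowitz runs test z-statistic for randomness.

    Observations equal to the median are discarded. A large positive z
    means too many runs (rapid alternation); a large negative z means
    too few runs (clustering or trend). |z| > ~2 rejects randomness
    at roughly the 5% level under the normal approximation.
    """
    s = sorted(x)
    n = len(s)
    median = (s[n // 2 - 1] + s[n // 2]) / 2 if n % 2 == 0 else s[n // 2]
    signs = [v > median for v in x if v != median]
    n1 = sum(signs)                      # count above the median
    n2 = len(signs) - n1                 # count below the median
    runs = 1 + sum(signs[i] != signs[i - 1] for i in range(1, len(signs)))
    expected = 2.0 * n1 * n2 / (n1 + n2) + 1
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)) / \
          ((n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - expected) / var ** 0.5

z_alt = runs_test([1, 2] * 10)             # perfectly alternating: z ≈ +4.1
z_trend = runs_test([1] * 10 + [2] * 10)   # one level shift: z ≈ -4.1
```

Note that both examples score as highly non-random, yet the alternating series would receive a very different entropy score from the shifted one, which is the contrast the paper explores.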

  20. Power-cooling mismatch test series. Test PCM-2A; test results report. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Cawood, G.W.; Holman, G.W.; Martinson, Z.R.; Legrand, B.L.


    The report describes the results of an in-pile experimental investigation of pre- and post-critical heat flux (CHF) behavior of a single 36-inch-long, pressurized water reactor (PWR) type, UO₂-fueled, zircaloy-clad fuel rod. The nominal coolant conditions for pressure and temperature were representative of those found in a commercial PWR. Nine separate departure from nucleate boiling (DNB) cycles were performed by either of two methods: (a) decreasing the coolant flow rate while the fuel rod power was held constant, or (b) increasing the fuel rod power while the coolant flow rate was held constant. DNB was obtained during eight of the nine cycles performed. For the final flow reduction, the mass flux was decreased to 6.1 × 10⁵ lb/hr-ft² at a constant peak linear heat generation rate of 17.8 kW/ft. The fuel rod was allowed to remain in film boiling for about 210 seconds during this final flow reduction. The fuel rod remained intact during the test. Results of on-line measurements of the fuel rod behavior are presented together with discussion of instrument performance. A comparison of the data with Fuel Rod Analysis Program-Transient 2 (FRAP-T2) computer code calculations is included.


    The Wet-Weather Flow Technologies Pilot of the EPA's Environmental Technology Verification (ETV) Program, under a partnership with NSF International, has verified the performance of the USFilter/Stranco Products chemical induction mixer used for disinfection of wet-weather flows. The USFilter t...

  2. The microcomputer scientific software series 4: testing prediction accuracy. (United States)

    H. Michael Rauscher


    A computer program, ATEST, is described in this combination user's guide / programmer's manual. ATEST provides users with an efficient and convenient tool to test the accuracy of predictors. As input ATEST requires observed-predicted data pairs. The output reports the two components of accuracy, bias and precision.
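With observed-predicted pairs in hand, the two accuracy components that ATEST reports can be computed directly: bias as the mean signed error, and precision as the spread of the errors about that bias. A hedged sketch of the arithmetic (not ATEST's own code; the exact precision formula ATEST uses is not stated in the abstract):

```python
def accuracy_components(observed, predicted):
    """Split prediction accuracy into bias and precision.

    bias      = mean of (predicted - observed), the systematic error
    precision = sample standard deviation of the errors about the bias
                (smaller means more precise)
    """
    errors = [p - o for o, p in zip(observed, predicted)]
    n = len(errors)
    bias = sum(errors) / n
    var = sum((e - bias) ** 2 for e in errors) / (n - 1)
    return bias, var ** 0.5

# A predictor that always overshoots by 1 is biased but perfectly precise:
bias, precision = accuracy_components([1, 2, 3, 4], [2, 3, 4, 5])
```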

  3. An Intelligence Test Series for Blind and Low Vision Children. (United States)

    Dekker, R.; And Others


    This article summarizes results of statistical analyses of an intelligence test for 155 braille-educated blind and low-vision children, aged 6-15, in the Netherlands. Results indicate some accuracy in predicting academic achievement; factor analysis indicates 4 interpretable factors in children with and without usable vision. (Author/PB)

  4. Biostatistics Series Module 2: Overview of Hypothesis Testing. (United States)

    Hazra, Avijit; Gogtay, Nithya


    Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference, which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is below a conventional threshold (typically 0.05), the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test. While this may be of utility in
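As a concrete instance of the two-group numerical comparison described above, Welch's t-statistic can be computed in a few lines; converting it to a P value then only requires the t-distribution CDF (available in scipy.stats, not used in this self-contained sketch). An illustrative example with made-up data:

```python
def welch_t(a, b):
    """Welch's t-statistic for two independent samples.

    Uses sample variances separately (no equal-variance assumption).
    The statistic is then referred to a t distribution (with
    Welch-Satterthwaite degrees of freedom) to obtain the P value.
    """
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / (va / na + vb / nb) ** 0.5

t = welch_t([1, 2, 3], [2, 3, 4])   # ≈ -1.2247
```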

  5. Voltage verification unit (United States)

    Martin, Edward J [Virginia Beach, VA


    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol: a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  6. X-0557 modified Steven tests: series I and II.

    Energy Technology Data Exchange (ETDEWEB)

    Straight, J. W. (James W.); Osborn, M. A. (Michael A.); Coulter, W. L. (William L.); Mang, J. T. (Joseph T.); Anderson, M. C. (Mark C.); Idar, D. J. (Deanne J.)


    Low-velocity mechanical impact leading to unintentional reaction is of concern in accident scenarios involving the handling, transport, and storage of high explosives (HE). These have been investigated using different experimental techniques, from small- to large-scale, including, but not limited to, the drop-weight impact, Taylor anvil impact, Susan [1], and, more recently, the Steven and Modified Steven tests [2-8]. Ideally, the data will be used to further advance 3-D finite element analysis predictive capability with improved bulk constitutive HE models for the assessment of HE response to mechanical insult. Our overall objectives for these experiments were to (1) evaluate the HE reaction threshold behavior for two different lots of X-0557, and (2) characterize the degree of reaction violence relative to a detonation. This report summarizes our single-impact test results on the two different lots of X-0557 in Modified Steven targets.

  7. Removal of Arsenic, Iron, Manganese, and Ammonia in Drinking Water: Nagaoka International Corporation CHEMILES NCL Series Water Treatment System (United States)

    The Nagaoka International Corporation CHEMILES NCL Series system was tested to verify its performance for the reduction of multiple contaminants including: arsenic, ammonia, iron, and manganese. The objectives of this verification, as operated under the conditions at the test si...

  8. Testing the homogeneity of short-term surface solar radiation series in Europe (United States)

    Hakuba, Maria Z.; Sanchez-Lorenzo, Arturo; Folini, Doris; Wild, Martin


    Non-climatic factors, such as changes in instruments or the relocation of meteorological stations, can cause sudden shifts or gradual biases in a climate data time series. The use of such inhomogeneous time series in data analysis might lead to false conclusions about climate variability and change. In this work, we test the homogeneity of 172 surface solar radiation (SSR) monthly series over Europe available in the Global Energy Balance Archive (GEBA) during the period 2000-2007. Four absolute homogeneity tests are applied to each series, and a classification into inhomogeneous and homogeneous stations is given. The results show that 20 of the 172 series (11.6% of the total) are inhomogeneous at the 99% significance level. The mean time series of both data sets, the original one and the one with only the homogeneous series, show positive linear trends (0.59 and 0.70 W m⁻² yr⁻¹, respectively). The omission of the inhomogeneous series increases the original trend by 0.11 W m⁻² yr⁻¹, or 1.1 W m⁻² decade⁻¹. Our results highlight the importance of testing the homogeneity of SSR time series before any trend analysis is performed.
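The paper applies four absolute homogeneity tests without requiring reference series. As one illustrative member of that family (not necessarily one of the four the authors used), the rank-based Pettitt test locates the most likely break point in a single series:

```python
import math

def pettitt(x):
    """Pettitt change-point (homogeneity) test.

    Returns (k, K, p): the most likely break point k (the series splits
    into x[:k] and x[k:]), the statistic K = max |U_k|, and the usual
    approximate two-sided significance p ≈ 2*exp(-6K²/(n³+n²)).
    A small p flags the series as inhomogeneous.
    """
    n = len(x)
    sgn = lambda v: (v > 0) - (v < 0)
    u = [sum(sgn(x[j] - x[i]) for i in range(k) for j in range(k, n))
         for k in range(1, n)]                   # U_k for k = 1..n-1
    k = max(range(len(u)), key=lambda i: abs(u[i])) + 1
    big_k = abs(u[k - 1])
    p = 2.0 * math.exp(-6.0 * big_k ** 2 / (n ** 3 + n ** 2))
    return k, big_k, p

# An artificial upward jump halfway through a 20-point series is
# detected at the correct position with high significance:
k, K, p = pettitt([0.0] * 10 + [5.0] * 10)
```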

  9. Primary HPV testing verification: A retrospective ad-hoc analysis of screening algorithms on women doubly tested for cytology and HPV. (United States)

    Tracht, Jessica; Wrenn, Allison; Eltoum, Isam-Eldin


    To evaluate human papillomavirus (HPV) testing as a primary screening tool, we retrospectively analyzed data comparing (1) HPV testing to the algorithms of the ATHENA Study: (2) cytology alone, (3) cytology with ASCUS triage in women 25-29, and (4) cotesting ≥ 30 or (5) cotesting ≥ 25. We retrospectively analyzed data from women tested with both cytology and HPV testing from 2010 to 2013. Cumulative risk (CR) for CIN3+ was calculated. Crude and verification bias adjusted (VBA) sensitivity, specificity, predictive values, likelihood ratios, colposcopy rates, and screening test numbers were compared. About 15,173 women (25-95, 7.1% testing. Nearly 1,184 (8.4%) had biopsies. About 19.4% had positive cytology, 14.5% had positive HPV. HPV testing unassociated with ASCUS was requested in 40% of women testing per CIN3+ diagnosed. While HPV-/NILM cotesting results are associated with low CIN3+ risk, HPV testing had screening performance similar to cotesting and to cytology alone. Additionally, HPV testing and cytology incur false negatives in nonoverlapping subsets of patients. Diagn. Cytopathol. 2017;45:580-586. © 2017 Wiley Periodicals, Inc.
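The screening-performance quantities compared in the study (sensitivity, specificity, predictive values, likelihood ratios) all derive from the same 2×2 table of test result versus verified disease status. A minimal sketch with hypothetical counts (not the paper's data, and without the verification-bias adjustment the authors also apply):

```python
def screening_metrics(tp, fp, fn, tn):
    """Screening performance measures from a 2x2 table.

    tp/fp/fn/tn: true/false positives and negatives against the
    verified (e.g. biopsy-confirmed CIN3+) reference standard.
    """
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "lr_pos": sens / (1 - spec),    # positive likelihood ratio
        "lr_neg": (1 - sens) / spec,    # negative likelihood ratio
    }

# Hypothetical counts purely for illustration:
m = screening_metrics(tp=90, fp=20, fn=10, tn=80)
```

Verification bias arises because only screen-positive women tend to get the biopsy that defines tp/fp/fn/tn; the crude values above are what the VBA correction in the paper adjusts.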

  10. Future Combat System Spinout 1 Technical Field Test - Establishing and Implementing Models and Simulations System of Systems Verification, Validation and Accreditation Practices, Methodologies and Procedures (United States)


    IV&V Independent Verification and Validation JTRS Joint Tactical Radio System JVMF Joint Variable Message Format LDAP Lightweight Directory Access...Protocol LDIF LDAP Data Interchange Format LSI Lead Systems Integrator LUT Limited User Test MCS Mounted Combat System / Mobility Computer System

  11. Perseus B Taxi Tests in Preparation for a New Series of Flight Tests (United States)


    The Perseus B remotely piloted aircraft taxis on the runway at Edwards Air Force Base, California, before a series of development flights at NASA's Dryden flight Research Center. The Perseus B is the latest of three versions of the Perseus design developed by Aurora Flight Sciences under NASA's Environmental Research Aircraft and Sensor Technology (ERAST) program. Perseus B is a remotely piloted aircraft developed as a design-performance testbed under NASA's Environmental Research Aircraft and Sensor Technology (ERAST) project. Perseus is one of several flight vehicles involved in the ERAST project. A piston engine, propeller-powered aircraft, Perseus was designed and built by Aurora Flight Sciences Corporation, Manassas, Virginia. The objectives of Perseus B's ERAST flight tests have been to reach and maintain horizontal flight above altitudes of 60,000 feet and demonstrate the capability to fly missions lasting from 8 to 24 hours, depending on payload and altitude requirements. The Perseus B aircraft established an unofficial altitude record for a single-engine, propeller-driven, remotely piloted aircraft on June 27, 1998. It reached an altitude of 60,280 feet. In 1999, several modifications were made to the Perseus aircraft including engine, avionics, and flight-control-system improvements. These improvements were evaluated in a series of operational readiness and test missions at the Dryden Flight Research Center, Edwards, California. Perseus is a high-wing monoplane with a conventional tail design. Its narrow, straight, high-aspect-ratio wing is mounted atop the fuselage. The aircraft is pusher-designed with the propeller mounted in the rear. This design allows for interchangeable scientific-instrument payloads to be placed in the forward fuselage. The design also allows for unobstructed airflow to the sensors and other devices mounted in the payload compartment. 
The Perseus B that underwent test and development in 1999 was the third generation of the Perseus

  12. State-of-the-art report for the testing and formal verification methods for FBD program

    Energy Technology Data Exchange (ETDEWEB)

    Jee, Eun Kyoung [KAIST, Daejeon (Korea, Republic of); Lee, Jang Soo; Lee, Young Jun [KAERI, Daejeon (Korea, Republic of); Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)


    The importance of PLC testing has increased in the nuclear I&C domain. While regulation authorities require both functional and structural testing for safety system software, FBD testing has relied only on functional testing, and there has been little research on structural testing techniques for FBD programs. We aim to analyze current techniques related to FBD testing and to develop a structural testing technique appropriate to FBD programs. We developed structural test coverage criteria applicable to FBD programs, focusing on data paths from input edges to output edges of FBD programs. A data path condition (DPC), under which input data can flow into the output edge, is defined for each data path. We defined basic coverage, input condition coverage, and complex condition coverage criteria based on the formal definition of DPC. We also developed a measurement procedure for FBD testing adequacy and a supporting tool prototype.

  13. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes


    Kim, Jong-Bum; Jeong, Ji-Young; Lee, Tae-Ho; Kim, Sungkyun; Euh, Dong-Jin; Joo, Hyung-Kook


    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V&V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the co...

  14. Nist Microwave Blackbody: The Design, Testing, and Verification of a Conical Brightness Temperature Source (United States)

    Houtz, Derek Anderson

    maximized emissivity are fundamental to a well characterized blackbody. The chosen geometry is a microwave absorber coated copper cone. Electromagnetic and thermal simulations are introduced to optimize the design. Experimental verifications of the simulated quantities confirm the predicted performance of the blackbody.


  16. Test/QA plan for the verification testing of alternative or reformulated liquid fuels, fuel additives, fuel emulsions, and lubricants for highway and nonroad use heavy-duty diesel engines (United States)

    This Environmental Technology Verification Program test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR P...

  17. Fabrication and characterization of gelatin-based test materials for verification of trace contraband vapor detectors. (United States)

    Staymates, Jessica L; Gillen, Greg


    This work describes a method to produce inexpensive and field deployable test materials that can be used to verify the performance of trace contraband vapor detection systems such as ion mobility spectrometers (IMS) currently deployed worldwide for explosives, narcotics, and chemical warfare agent (CWA) detection. Requirements for such field deployable test materials include long shelf life, portability, and low manufacturing costs. Reported here is a method for fabricating these test materials using encapsulation of high vapor pressure compounds, such as methyl salicylate (MS), into a gelatin matrix. Gelatin serves as a diffusion barrier allowing for controlled and sustained release of test vapors. Test materials were prepared by incorporating serial dilutions of MS into gelatin, which provide controlled analyte vapor release over 3 to 4 orders of magnitude of instrument response. The test materials are simple to prepare and have been shown to be stable for at least one year under controlled laboratory conditions.

  18. Review of waste package verification tests. Semiannual report, April 1984-September 1984. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Jain, H.; Veakis, E.; Soo, P.


    This ongoing study is part of a task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled release performance objectives. Work covered in this report includes crushed tuff packing material for use in a high level waste tuff repository. A review of available tests to quantify packing performance is given together with recommendations for future testing work. 27 refs., 6 figs., 3 tabs.


    The report gives results of testing three fuels in a small (732 kW) firetube package boiler to determine emissions of carbon monoxide (CO), nitrogen oxide (NO), particulate matter (PM), and total hydrocarbons (THCs). The tests were part of EPA's Environmental Technology Verificat...

  20. 76 FR 17287 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing (United States)


    ... emission testing body requirements) to improve the accuracy of emissions data. EPA is also amending other... rule, adding two new definitions, revising certain compliance dates, and clarifying the language and...-Compliant Air Emission Testing Body (AETB) Names C. Other Amendments 1. Compliance Dates for Units Adding...

  1. Verification of yield functions by biaxial tensile tests with rotated principal axes (United States)

    Ageba, Ryo; Ishiwtari, Akinobu; Hiramoto, Jiro


    A yield function is a critical factor contributing to the accuracy of FEM simulation of steel sheet forming. Yld2000-2d by Barlat is an anisotropic yield function for shell elements. Uniaxial and biaxial tensile tests are required to identify the parameters of the Yld2000-2d function. In such tests, the principal axes of stress are normally either parallel or orthogonal to the rolling direction. However, the principal axes of stress are randomly oriented in actual press forming. Therefore, the actual material behavior may not be correctly expressed by a yield function identified from tests always conducted with the same principal axis directions. In this study, the accuracy of the anisotropic yield function is verified under biaxial stress with different principal axes, in tests using specimens with rotated principal axes. The results confirm that the accuracy of Yld2000-2d is adequate and that the identification tests are reasonable.

  2. Power Performance Verification of a Wind Farm Using the Friedman's Test

    National Research Council Canada - National Science Library

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L


    .... This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded...

  3. Ultrasonic Testing of Thick Walled Austenitic Welds: Modelling and Experimental Verification (United States)

    Köhler, B.; Müller, W.; Spies, M.; Schmitz, V.; Zimmer, A.; Langenberg, K.-J.; Mletzko, U.


    The testing of austenitic welds is difficult due to the elastically anisotropic properties of the weld grains. Therefore, the normal rules for selecting testing conditions such as appropriate wave modes, frequencies and incident angles cannot be applied in the usual way. In recent years, several tools for simulating wave propagation in such testing situations have been developed. In this paper these tools are applied to an austenitic weld containing a crack grown by intergranular stress corrosion cracking (IGSCC). It is demonstrated that the combined application of several simulation tools achieves a stepwise narrowing of the parameter space. Eventually an optimized testing configuration is defined. The approach is validated experimentally.

  4. A Strategy for Automatic Quality Signing and Verification Processes for Hardware and Software Testing

    Directory of Open Access Journals (Sweden)

    Mohammed I. Younis


    Circuits in a production line. Our results demonstrate that the proposed strategy outperforms the traditional block partitioning strategy, achieving a mutation score of 100% versus 90% with the same number of test cases.

  5. Fatigue life prediction of liquid rocket engine combustor with subscale test verification (United States)

    Sung, In-Kyung

    Reusable rocket systems such as the Space Shuttle introduced a new era in propulsion system design for economic feasibility. Practical reusable systems require an order-of-magnitude increase in life. To achieve this, improved methods are needed to assess failure mechanisms and to predict the life cycles of rocket combustors. A general goal of the research was to demonstrate the use of a subscale rocket combustor prototype in a cost-effective test program. Life-limiting factors and metal behavior under repeated loads were surveyed and reviewed. The life prediction theories are presented, with an emphasis on studies that used subscale test hardware for model validation. From this review, low cycle fatigue (LCF) and creep-fatigue interaction (ratcheting) were identified as the main life-limiting factors of the combustor. Several life prediction methods, including conventional and advanced viscoplastic models, were used to predict life cycles due to low cycle thermal stress, transient effects, and creep rupture damage. Creep-fatigue interaction and cyclic hardening were also investigated. A prediction method based on 2D beam theory was modified using 3D plate deformation theory to provide an extended prediction method. For experimental validation, two small-scale annular plug nozzle thrusters were designed, built and tested. The test article was composed of a water-cooled liner, a plug annular nozzle and a 200 psia precombustor that used decomposed hydrogen peroxide as the oxidizer and JP-8 as the fuel. The first combustor was tested cyclically at the Advanced Propellants and Combustion Laboratory at Purdue University. Testing was stopped after 140 cycles because of an unpredicted failure mechanism: a growing hot spot at the location where failure was predicted. A second combustor was designed to avoid the previous failure; however, it was over-pressurized and deformed beyond repair during a cold-flow test.
The test results are discussed and compared to the analytical and numerical

  6. Metrology test object for dimensional verification in additive manufacturing of metals for biomedical applications. (United States)

    Teeter, Matthew G; Kopacz, Alexander J; Nikolov, Hristo N; Holdsworth, David W


    Additive manufacturing continues to increase in popularity and is being used in applications such as biomaterial ingrowth that requires sub-millimeter dimensional accuracy. The purpose of this study was to design a metrology test object for determining the capabilities of additive manufacturing systems to produce common objects, with a focus on those relevant to medical applications. The test object was designed with a variety of features of varying dimensions, including holes, cylinders, rectangles, gaps, and lattices. The object was built using selective laser melting, and the produced dimensions were compared to the target dimensions. Location of the test objects on the build plate did not affect dimensions. Features with dimensions less than 0.300 mm did not build or were overbuilt to a minimum of 0.300 mm. The mean difference between target and measured dimensions was less than 0.100 mm in all cases. The test object is applicable to multiple systems and materials, tests the effect of location on the build, uses a minimum of material, and can be measured with a variety of efficient metrology tools (including measuring microscopes and micro-CT). Investigators can use this test object to determine the limits of systems and adjust build parameters to achieve maximum accuracy. © IMechE 2014.
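
    The comparison the record describes (target versus as-built dimensions, with a 0.300 mm build floor) can be sketched as a small report function; the feature names, dimensions, and dict layout below are hypothetical, not taken from the study.

    ```python
    def dimension_report(features, build_floor=0.300):
        """features maps name -> (target_mm, measured_mm). Returns the mean
        absolute target-vs-measured difference and the names of features
        whose target dimension lies below the build floor (reported in the
        study as failing to build or being overbuilt to ~0.300 mm)."""
        flagged = [name for name, (target, _) in features.items()
                   if target < build_floor]
        diffs = [abs(measured - target) for target, measured in features.values()]
        return sum(diffs) / len(diffs), flagged
    ```

    With hypothetical values such as `{"hole_A": (0.500, 0.470), "gap_B": (0.200, 0.300), "cyl_C": (1.000, 1.040)}`, `gap_B` is flagged as below the build floor and the mean error stays under the 0.100 mm figure the authors report.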

  7. Nondestructive Testing and Evaluation of Wood—50 Years of Research: International Nondestructive Testing and Evaluation of Wood Symposium Series (United States)

    Robert J. Ross; Xiping Wang


    The International Nondestructive Testing and Evaluation of Wood Symposium Series was initiated by Washington State University and the USDA Forest Products Laboratory (FPL) in 1963 with the convening of a symposium on the topic of nondestructive testing of wood at FPL. Including that meeting, 17 symposia have been held during the last 50 years at various sites around...

  8. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN


    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors that influence verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  9. Verification of Emulated Channels in Multi-Probe Based MIMO OTA Testing Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; Nielsen, Jesper Ødum


    Standardization work for MIMO OTA testing methods is currently ongoing, where a multi-probe anechoic chamber based solution is an important candidate. In this paper, the probes located on an OTA ring are used to synthesize a plane wave field in the center of the OTA ring. This paper investigates...

  10. The test verification of 3D geodetic points and their changes

    Directory of Open Access Journals (Sweden)

    Vincent Jakub


    Full Text Available This paper presents approaches for congruency checks of 3D point field realisations using repeated measurements. It investigates 3D point displacement in various spatial directions using test procedures, and discusses how 3D point movements and their significance can be determined from confidence ellipsoids, with applications in practice.
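
    The ellipsoid-based significance decision alluded to above reduces to a quadratic-form test: a displacement d between two epochs is significant when d lies outside the confidence ellipsoid of the coordinate differences. A minimal sketch, with the 95% chi-square critical value hard-coded (scipy.stats.chi2.ppf(0.95, 3) gives the exact figure):

    ```python
    import numpy as np

    CHI2_95_3DOF = 7.815  # 95% critical value of chi-square, 3 degrees of freedom

    def displacement_significant(d, cov, crit=CHI2_95_3DOF):
        """True if the 3D displacement vector d falls outside the confidence
        ellipsoid defined by cov, the covariance matrix of the coordinate
        differences between epochs: test d^T cov^-1 d > crit."""
        d = np.asarray(d, dtype=float)
        q = d @ np.linalg.solve(np.asarray(cov, dtype=float), d)
        return bool(q > crit)
    ```

    With 1 mm coordinate standard deviations (cov = 1e-6·I in m²), a 10 mm shift is flagged as a movement while a 1 mm shift is not, matching the intuition that displacements within measurement noise stay inside the ellipsoid.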

  11. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    National Research Council Canada - National Science Library

    Hernandez, Wilmar; López-Presa, José; Maldonado-Correa, Jorge


    .... This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out...

  12. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    Before physically-meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely-trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation of user input. By giving rapid and helpful responses to users as they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of input checking and testing practices in NJOY21.
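
    As a flavor of the input-checking-plus-testing workflow described, here is a hypothetical validator written in the test-driven style, where each rule earns a test alongside the code. The card layout and field names are invented for illustration and are not the real NJOY21 input schema.

    ```python
    def check_card(card):
        """Validate one hypothetical processing-run card of the form
        {'mat': int, 'temps': [float, ...]} and return a list of
        human-readable error messages (an empty list means valid input)."""
        errors = []
        mat = card.get("mat")
        if not isinstance(mat, int) or not 1 <= mat <= 9999:
            errors.append("'mat' must be an integer MAT number in 1..9999")
        temps = card.get("temps", [])
        if not temps:
            errors.append("'temps' must list at least one temperature")
        elif any(t <= 0 for t in temps):
            errors.append("temperatures must be positive (kelvin)")
        return errors
    ```

    A well-formed card returns an empty list, while an empty card yields one message per violated rule: the kind of rapid, helpful feedback on input files that the abstract describes.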

  13. Verification of Overall Safety Factors In Deterministic Design Of Model Tested Breakwaters

    DEFF Research Database (Denmark)

    Burcharth, H. F.


    The paper deals with concepts of safety implementation in design. An overall safety factor concept is evaluated on the basis of a reliability analysis of a model tested rubble mound breakwater with monolithic super structure. Also discussed are design load identification and failure mode limit st...

  14. Test/QA Plan for Verification of Coliform Detection Technologies for Drinking Water (United States)

    The coliform detection technologies to be tested use chromatogenic and fluorogenic growth media to detect coliforms and E. coli based on the enzymatic activity of these organisms. The systems consist of single-use sample containers that contain pre-measured reagents and can be u...

  15. ALMA Science Verification (United States)

    Hills, R.


    As many of you are aware, ALMA has reached a very exciting point in the construction phase. After a year of testing the basic functionality of antennas and small arrays at the Chajnantor site at 5000m, we are now able to run full observations of scientific targets using at least 8 antennas and 4 receiver bands. We recently had a series of reviews of all aspects of the ALMA Project, resulting in a consensus that we will be ready to issue a Call for Proposals for Early Science projects at the end of the first quarter of 2011, with an expectation of starting these Early Science observations toward the end of 2011. ALMA Science Verification is the process by which we will demonstrate that the data that will be produced by ALMA during Early Science is valid. This is done by running full "end to end" tests of ALMA as a telescope. We will observe objects for which similar data are already available for other telescopes. This allows us to make direct quantitative comparisons of all aspects of the data cubes, in order to determine whether the ALMA instrumentation or software is introducing any artifacts.

  16. Verification of nerve integrity after surgical intervention using quantitative sensory testing. (United States)

    Said-Yekta, Sareh; Smeets, Ralf; Esteves-Oliveira, Marcella; Stein, Jamal M; Riediger, Dieter; Lampert, Friedrich


    The aim of this study was to apply a standardized Quantitative Sensory Testing (QST) approach in patients to investigate whether oral surgery can lead to sensory changes, even if the patients do not report any sensory disturbances. Furthermore, this study determines the degree and duration of possible neuronal hyperexcitability due to local inflammatory trauma after oral surgery. Orofacial sensory functions were investigated by psychophysical means in 60 patients (30 male, 30 female) in innervation areas of infraorbital nerves, mental nerves and lingual nerves after different interventions in oral surgery. The patients were tested 1 week, 4 weeks, 7 weeks, and 10 weeks postoperatively. As controls for bilateral sensory changes after unilateral surgery, tests were additionally performed in 20 volunteers who did not have any dental restorations. No differences were found between the control group and the control side of the patients. Although not 1 of the patients reported paresthesia or other sensory changes postoperatively, QST detected significant differences between the control and the test side in the mental and lingual regions. Test sides were significantly less sensitive for thermal parameters (cold, warm, and heat). No differences were found in the infraorbital region. Patients showed significantly decreased pain pressure thresholds on the operated side. QST monitored recovery over time in all patients. The results show that oral surgery can lead to sensory deficits in the mental and lingual region, even if the patients do not notice any sensory disturbances. The applied QST battery is a useful tool to investigate trigeminal nerve function in the early postoperative period. In light of the increasing forensic implication, this tool can serve to objectify clinical findings. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  17. Aerodynamics and performance verifications of test methods for laboratory fume cupboards. (United States)

    Tseng, Li-Ching; Huang, Rong Fung; Chen, Chih-Chieh; Chang, Cheng-Ping


    The laser-light-sheet-assisted smoke flow visualization technique is performed on a full-size, transparent, commercial-grade chemical fume cupboard to diagnose the flow characteristics and to verify the validity of several current containment test methods. The visualized flow patterns identify the recirculation areas that inevitably exist in conventional fume cupboards because of their fundamental configurations and structures. Large-scale vortex structures exist around the side walls, the doorsill of the cupboard and in the vicinity of the near-wake region of the manikin. The identified recirculation areas are taken as the 'dangerous' regions where the risk of turbulent dispersion of contaminants may be high. Several existing tracer gas containment test methods (BS 7258:1994, prEN 14175-3:2003 and ANSI/ASHRAE 110:1995) are conducted to verify their effectiveness in detecting contaminant leakage. By comparing the results of the flow visualization and the tracer gas tests, it is found that the local recirculation regions are more prone to contaminant leakage because of the complex interaction between the shear layers and the smoke movement through the mechanism of turbulent dispersion. From the point of view of aerodynamics, the present study verifies that the methodology of the prEN 14175-3:2003 protocol can produce more reliable and consistent results because it is based on region-by-region measurement and encompasses most of the area of the recirculation zone of the cupboard. A modified test method combining the region-by-region approach with the presence of the manikin shows substantially different containment results. A performance test method that better describes an operator's exposure and the correlation between flow characteristics and contaminant leakage properties is therefore suggested.


    This report reflects verification testing of a catalytic muffler for diesel trucks produced by Donaldson Corp. The muffler was tested on low-sulfur and ultra-low-sulfur fuel and was shown to reduce emissions.

  19. Multi-level slug tests in highly permeable formations: 2. Hydraulic conductivity identification, method verification, and field applications (United States)

    Zlotnik, V.A.; McGuire, V.L.


    Using the developed theory and modified Springer-Gelhar (SG) model, an identification method is proposed for estimating hydraulic conductivity from multi-level slug tests. The computerized algorithm calculates hydraulic conductivity from both monotonic and oscillatory well responses obtained using a double-packer system. Field verification of the method was performed at a specially designed fully penetrating well of 0.1-m diameter with a 10-m screen in a sand and gravel alluvial aquifer (MSEA site, Shelton, Nebraska). During well installation, disturbed core samples were collected every 0.6 m using a split-spoon sampler. Vertical profiles of hydraulic conductivity were produced on the basis of grain-size analysis of the disturbed core samples. These results closely correlate with the vertical profile of horizontal hydraulic conductivity obtained by interpreting multi-level slug test responses using the modified SG model. The identification method was applied to interpret the response from 474 slug tests in 156 locations at the MSEA site. More than 60% of responses were oscillatory. The method produced a good match to experimental data for both oscillatory and monotonic responses using an automated curve matching procedure. The proposed method allowed us to drastically increase the efficiency of each well used for aquifer characterization and to process massive arrays of field data. Recommendations generalizing this experience to massive application of the proposed method are developed.
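
    The distinction between the monotonic and oscillatory well responses mentioned above is conveniently captured by a damped-oscillator idealization of the linearized slug-test equations (a common simplification in the Springer-Gelhar family; the function below is our sketch, not the modified SG model itself).

    ```python
    import numpy as np

    def slug_response(t, omega, zeta):
        """Normalized head response h(t)/h0 for a slug test idealized as a
        damped harmonic oscillator with natural frequency omega (rad/s) and
        damping ratio zeta. zeta < 1 gives the oscillatory responses seen in
        >60% of the field tests; zeta > 1 gives monotonic decay (zeta == 1,
        the critically damped case, is not handled here)."""
        t = np.asarray(t, dtype=float)
        if zeta < 1.0:  # underdamped: decaying oscillation, h(0)=1, h'(0)=0
            wd = omega * np.sqrt(1.0 - zeta**2)
            return np.exp(-zeta * omega * t) * (
                np.cos(wd * t) + zeta * omega / wd * np.sin(wd * t))
        # overdamped: sum of two real exponentials, h(0)=1, h'(0)=0
        s = omega * np.sqrt(zeta**2 - 1.0)
        r1, r2 = -zeta * omega + s, -zeta * omega - s
        return (r2 * np.exp(r1 * t) - r1 * np.exp(r2 * t)) / (r2 - r1)
    ```

    In a curve-matching workflow, (omega, zeta) would be fitted to the recorded head data and hydraulic conductivity then extracted from the fitted parameters; the sketch stops at the forward model.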

  20. Hypersensitivity reactions to metallic implants-diagnostic algorithm and suggested patch test series for clinical use

    DEFF Research Database (Denmark)

    Schalock, Peter C; Menné, Torkil; Johansen, Jeanne D


    transformation tests, for hypersensitivity reactions to implanted metal devices. Patch test evaluation is the gold standard for metal hypersensitivity, although the results may be subjective. Regarding pre-implant testing, those patients with a reported history of metal dermatitis should be evaluated by patch...... testing. Those without a history of dermatitis should not be tested unless considerable concern exists. Regarding post-implant testing, a subset of patients with metal hypersensitivity may develop cutaneous or systemic reactions to implanted metals following implant. For symptomatic patients, a diagnostic...... algorithm to guide the selection of screening allergen series for patch testing is provided. At a minimum, an extended baseline screening series and metal screening is necessary. Static and dynamic orthopaedic implants, intravascular stent devices, implanted defibrillators and dental and gynaecological...

  1. Automated particulate sampler for Comprehensive Test Ban Treaty verification (the DOE radionuclide aerosol sampler/analyzer) (United States)

    Bowyer, S. M.; Miley, H. S.; Thompson, R. C.; Hubbard, C. W.


    The Comprehensive Test Ban Treaty (CTBT) was recently signed by President Clinton and is intended to eliminate all nuclear weapons testing. One way which the treaty seeks to accomplish this is by the establishment of the International Monitoring System. As stated in the latest Working Papers of the Draft CTBT, "The International Monitoring System shall comprise facilities for seismological monitoring, radionuclide monitoring including certified laboratories, hydroacoustic monitoring, infrasound monitoring, and respective means of communication, and shall be supported by the International Data Centre of the Technical Secretariat". Radionuclide monitoring consists of both radionuclides associated with particulates and relevant noble gases. This type of monitoring is quite valuable since indications of a nuclear test in the form of radioactive particulate or radioactive noble gases may be detected at great distances from the detonation site. The system presented here is concerned only with radioactive particulate monitoring and is described as an automated sampler/analyzer which has been developed for the Department of Energy (DoE) at the Pacific Northwest National Laboratory (PNNL).

  2. Design of a Portable Test Facility for the ATLAS Tile Calorimeter Front-End Electronics Verification

    CERN Document Server

    Kim, HY; The ATLAS collaboration; Carrio, F; Moreno, P; Masike, T; Reed, R; Sandrock, C; Schettino, V; Shalyugin, A; Solans, C; Souza, J; Suter, R; Usai, G; Valero, A


    An FPGA-based motherboard with an embedded hardware processor is used to implement a portable testbench for the full certification of Tile Calorimeter front-end electronics in the ATLAS experiment at CERN. The upgrade will also allow testing of future versions of the TileCal read-out electronics. Because of its light weight, the new facility is highly portable, allowing on-detector validation using sophisticated algorithms. The new system comprises a front-end GUI running on an external portable computer which controls the motherboard. It also includes several dedicated daughter-boards that exercise the different specialized functionalities of the system. Apart from being used to evaluate different technologies for future upgrades, it will be used to certify the consolidation of the electronics by identifying low-frequency failures. The results of the tests presented here show that the new system is well suited for the 2013 ATLAS Long Shutdown. We discuss all requirements necessary to give full confidence...

  3. Quantitative ultrasonic testing of acoustically anisotropic materials with verification on austenitic and dissimilar weld joints (United States)

    Boller, C.; Pudovikov, S.; Bulavinov, A.


    Austenitic stainless steel materials are widely used in a variety of industry sectors. In particular, the material is qualified to meet the design criteria of high quality in safety-related applications. For example, the primary loop of most of the nuclear power plants in the world is made of this material, owing to its high durability and corrosion resistance. Certain operating conditions may cause a range of changes in the integrity of the component, and therefore require nondestructive testing at reasonable intervals. These in-service inspections are often performed using ultrasonic techniques, in particular when cracking is of specific concern. However, the coarse, dendritic grain structure of the weld material, formed during the welding process, is extremely and unpredictably anisotropic. Such a structure is no longer direction-independent with respect to ultrasonic wave propagation; the ultrasonic beam deflects and redirects, and the wave front becomes distorted. Thus, the use of conventional ultrasonic testing techniques with fixed beam angles is very limited, and the application of ultrasonic Phased Array techniques becomes desirable. The "Sampling Phased Array" technique, invented and developed by Fraunhofer IZFP, allows the acquisition of time signals (A-scans) for each individual transducer element of the array along with fast image reconstruction techniques based on synthetic focusing algorithms. The reconstruction considers the sound propagation from each image pixel to the individual sensor element. For anisotropic media, where the sound beam is deflected and the sound path is not known a priori, a novel phase adjustment technique called "Reverse Phase Matching" is implemented. By taking into account the anisotropy and inhomogeneity of the weld structure, a ray tracing algorithm for modeling the acoustic wave propagation and calculating the sound propagation time is applied. This technique can be utilized for 2D and 3D real-time image reconstruction.
The

  4. Verification of Compartmental Epidemiological Models using Metamorphic Testing, Model Checking and Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, Arvind [ORNL; Steed, Chad A [ORNL; Pullum, Laura L [ORNL


    Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
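
    The metamorphic-testing idea is easy to illustrate on a toy compartmental model (this sketch is ours, not the ORNL workflow): instead of knowing the exact output, one checks relations that must hold between runs, such as population conservation and scale invariance of the frequency-dependent SIR model.

    ```python
    def sir_step(s, i, r, beta, gamma, dt):
        """One forward-Euler step of the frequency-dependent SIR model."""
        n = s + i + r
        new_inf = beta * s * i / n * dt   # S -> I transitions
        new_rec = gamma * i * dt          # I -> R transitions
        return s - new_inf, i + new_inf - new_rec, r + new_rec

    def simulate(s0, i0, r0, beta=0.3, gamma=0.1, dt=0.1, steps=1000):
        """Integrate the SIR model and return the final (S, I, R) state."""
        state = (s0, i0, r0)
        for _ in range(steps):
            state = sir_step(*state, beta, gamma, dt)
        return state
    ```

    Two metamorphic relations for this model: the final S + I + R must equal the initial population, and doubling every initial compartment must double every output compartment. An implementation bug (say, accidentally typing a density-dependent force of infection) breaks the second relation immediately, without any reference output being needed.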

  5. Review of waste package verification tests. Semiannual report, October 1984-March 1985

    Energy Technology Data Exchange (ETDEWEB)

    Soo, P. (ed.)


    The potential of WAPPA, a second-generation waste package system code, to meet the needs of the regulatory community is analyzed. The analysis includes an in-depth review of WAPPA's individual process models and a review of WAPPA's operation. It is concluded that the code is of limited use to the NRC in its present form. Recommendations for future improvement, usage, and implementation of the code are given. This report also describes the results of a testing program undertaken to determine the chemical environment that will be present near a high-level waste package emplaced in a basalt repository. For this purpose, low-carbon 1020 steel (a current BWIP reference container material), synthetic basaltic groundwater, and a mixture of bentonite and basalt were exposed, in an autoclave, to the conditions expected some period after repository sealing (150°C, ≈10.4 MPa). Parameters measured include changes in gas pressure with time and gas composition, variation in dissolved oxygen (DO), pH and certain ionic concentrations of water in the packing material across an imposed thermal gradient, mineralogic alteration of the basalt/bentonite mixture, and carbon steel corrosion behavior. A second testing program was also initiated to check the likelihood of stress corrosion cracking of austenitic stainless steels and Incoloy 825, which are being considered for use as waste container materials in the tuff repository program. 82 refs., 70 figs., 27 tabs.

  6. Verification of Ares I Liftoff Acoustic Environments via the Ares I Scale Model Acoustic Test (United States)

    Counter, Douglas D.; Houston, Janice D.


    Launch environments, such as Liftoff Acoustic (LOA) and Ignition Overpressure (IOP), are important design factors for any vehicle and are dependent upon the design of both the vehicle and the ground systems. The NASA Constellation Program had several risks to the development of the Ares I vehicle linked to LOA which are used in the development of the vibro-acoustic environments. The risks included cost, schedule and technical impacts for component qualification due to high predicted vibro-acoustic environments. One solution is to mitigate the environment at the component level. However, where the environment is too severe to mitigate at the component level, reduction of the launch environments is required. The Ares I Scale Model Acoustic Test (ASMAT) program was implemented to verify the predicted Ares I launch environments and to determine the acoustic reduction for the LOA environment with an above deck water sound suppression system. The test article included a 5% scale Ares I vehicle model, tower and Mobile Launcher. Acoustic and pressure data were measured by approximately 200 instruments. The ASMAT results are compared to the Ares I LOA predictions and water suppression effectiveness results are presented.


    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent; Hashemian, H. M.


    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, the calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off, cover more portions of the system, and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources toward equipment reliability needs. More importantly, the implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also support the I&C aging management goals of the Department of Energy and the long-term operation of ATR.

  9. Verification test for helium panel of cryopump for DIII-D advanced divertor

    Energy Technology Data Exchange (ETDEWEB)

    Baxi, C.B.; Laughon, G.J.; Langhorn, A.R.; Schaubel, K.M.; Smith, J.P.; Gootgeld, A.M.; Campbell, G.L. (General Atomics, San Diego, CA (United States)); Menon, M.M. (Oak Ridge National Lab., TN (United States))


    It is planned to install a cryogenic pump in the lower divertor portion of the DIII-D tokamak with a pumping speed of 50,000 ℓ/s and an exhaust of 2670 Pa·ℓ/s (20 Torr·ℓ/s). A coaxial counterflow configuration has been chosen for the helium panel of this cryogenic pump. This paper evaluates the cooldown rates and fluid stability of this configuration. A prototypic test was performed at General Atomics (GA) to increase confidence in the design. It was concluded that the helium panel cooldown rate agreed well with analytical predictions and was within acceptable limits. The design flow rate proved stable, and the two-phase pressure drop could be predicted quite accurately. 8 refs., 5 figs., 1 tab.
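    The exhaust figure quoted above in both unit systems can be cross-checked with the Torr-to-pascal conversion (1 Torr ≈ 133.322 Pa); a minimal sketch of that check:

```python
# Cross-check of the quoted cryopump exhaust: 2670 Pa-l/s vs 20 Torr-l/s.
TORR_TO_PA = 133.322  # pascals per torr

exhaust_torr_l_s = 20.0
exhaust_pa_l_s = exhaust_torr_l_s * TORR_TO_PA
# ~2666 Pa-l/s, consistent with the 2670 Pa-l/s quoted in the abstract.
```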

  10. Adaptive support for aircraft panel testing: New method and its experimental verification on a beam structure (United States)

    Sachau, Delf; Baschke, Manuel


    The acoustic transmissibility of aircraft panels is measured in full-scale test rigs, where the panels are supported at their frames. These boundary conditions do not account for the dynamic influence of the fuselage, which is significant in the frequency range below 300 Hz. This paper introduces a new adaptive boundary system (ABS) that combines accelerometers and electrodynamic shakers with real-time signal processing, so that the ABS represents the dynamic effect of the fuselage on the panel. Because the frames dominate the dynamic behaviour of a fuselage in the low-frequency range, the new method is applied to a beam representing a frame of the aircraft structure. The experimental results are evaluated and the precision of the ABS is discussed. The theoretical apparent mass representing the cut-off part of a frame is calculated and compared with the apparent mass provided by the ABS, and it is explained how the experimental set-up limits the precision of the ABS.

  11. Mass-additive modal test method for verification of constrained structural models (United States)

    Admire, John R.; Tinker, Michael L.; Ivey, Edward W.


    A method for deriving constrained or fixed-base modes and frequencies from free-free modes of a structure with mass-loaded boundaries is developed. Problems associated with design and development of test fixtures can be avoided with such an approach. The analytical methodology presented is used to assess applicability of the mass-additive method for three types of structures and to determine the accuracy of derived constrained modes and frequencies. Results show that mass loading of the boundaries enables local interface modes to be measured within a desired frequency bandwidth, thus allowing constrained modes to be derived with considerably fewer free-free modes than for unloaded boundaries. Good convergence was obtained for a simple beam and a truss-like Shuttle payload, both of which had well-spaced modes and stiff interface support structures. Slow convergence was obtained for a space station module prototype, a shell-like structure having high modal density.


    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ


    The article presents the results of the verification of a dynamic model of a drive system with gears. Tests were carried out on the real object under different operating conditions, and simulations were run for the same assumed conditions. Comparing the results of these two series of tests helped determine the suitability of the model and verify whether experimental research can be replaced by simulations using the dynamic model.

  13. Verification of a Proposed Clinical Electroacoustic Test Protocol for Personal Digital Modulation Receivers Coupled to Cochlear Implant Sound Processors. (United States)

    Nair, Erika L; Sousa, Rhonda; Wannagot, Shannon

    Guidelines established by the AAA currently recommend behavioral testing when fitting frequency-modulated (FM) systems to individuals with cochlear implants (CIs). A protocol for completing electroacoustic measures has not yet been validated for personal FM systems or digital modulation (DM) systems coupled to CI sound processors. In response, some professionals have used or altered the AAA electroacoustic verification steps for fitting FM systems to hearing aids when fitting FM systems to CI sound processors; more recently, steps were outlined in a proposed protocol. The purpose of this research is to review and compare the electroacoustic test measures outlined in a 2013 article by Schafer and colleagues in the Journal of the American Academy of Audiology, titled "A Proposed Electroacoustic Test Protocol for Personal FM Receivers Coupled to Cochlear Implant Sound Processors", with the AAA electroacoustic verification steps for fitting FM systems to hearing aids when fitting DM systems to CI users. Electroacoustic measures were conducted on 71 CI sound processors and Phonak Roger DM systems using the proposed protocol and an adapted AAA protocol. Phonak's recommended default receiver gain setting was used for each CI sound processor manufacturer and adjusted if necessary to achieve transparency. Electroacoustic measures were conducted on Cochlear and Advanced Bionics (AB) sound processors. In this study, 28 Cochlear Nucleus 5/CP810 sound processors, 26 Cochlear Nucleus 6/CP910 sound processors, and 17 AB Naida CI Q70 sound processors were coupled in various combinations to Phonak Roger DM dedicated receivers (25 Phonak Roger 14 receivers, the Cochlear dedicated receiver, and 9 Phonak Roger 17 receivers, the AB dedicated receiver) and 20 Phonak Roger Inspiro transmitters. Employing both the AAA and the Schafer et al protocols, electroacoustic measurements were conducted with the Audioscan Verifit in a clinical setting on 71 CI sound processors and Phonak Roger DM systems to

  14. [Verification of the double dissociation model of shyness using the implicit association test]. (United States)

    Fujii, Tsutomu; Aikawa, Atsushi


    The "double dissociation model" of shyness proposed by Asendorpf, Banse, and Mücke (2002) was demonstrated in Japan by Aikawa and Fujii (2011). However, the generalizability of the double dissociation model of shyness was uncertain. The present study examined whether the results reported in Aikawa and Fujii (2011) would be replicated. In Study 1, college students (n = 91) completed explicit self-ratings of shyness and other personality scales. In Study 2, forty-eight participants completed an IAT (Implicit Association Test) for shyness, and their friends (n = 141) rated those participants on various personality scales. The results revealed that only the explicit self-concept ratings predicted other-rated low praise-seeking behavior, sociable behavior, and high rejection-avoidance behavior (controlled shy behavior). Only the implicit self-concept measured by the shyness IAT predicted other-rated high interpersonal tension (spontaneous shy behavior). These results are similar to the findings of the previous research, which supports the generalizability of the double dissociation model of shyness.

  15. SE and I system testability: The key to space system FDIR and verification testing (United States)

    Barry, Thomas; Scheffer, Terrance; Small, Lynn R.; Monis, Richard


    The key to implementing self-diagnosing designs is a systems engineering task focused on design for testability concurrent with design for functionality. The design-for-testability process described here is the product of several years of DOD study and experience. Its application to the space station has begun on Work Package II under NASA and McDonnell direction. Other work package teams are being briefed by Harris Corporation with the hope of convincing them to embrace the process. For the purpose of this discussion, the term testability describes the systems engineering process by which designers can assure themselves and their reviewers that their designs are TESTABLE, that is, that they will support the downstream process of determining their functionality. Given the complexity and density of present-day state-of-the-art designs, such as pipeline processors and high-speed integrated circuit technology, designing in testability features is a critical part of the functional design process. A systematic approach to space systems test and checkout, as well as to fault detection, fault isolation, and reconfiguration (FDIR), will minimize operational costs and maximize operational efficiency. An effective design-for-testability program must be implemented by all contractors to ensure this objective is met. The process is well understood, and the technology is here to support it.

  16. Verification Test of the SURF and SURFplus Models in xRage: Part II

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    The previous study used an underdriven detonation wave (steady ZND reaction-zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a validation test of the implementation of the SURF and SURFplus models in the xRage code. Even with a fairly fine uniform mesh (12,800 cells for 100 mm), the detonation wave profile had limited resolution due to the thin reaction-zone width (0.18 mm) for the fast SURF burn rate. Here we study the effect of finer resolution by comparing results of simulations with cell sizes of 8, 2 and 1 μm, which correspond to 25, 100 and 200 points within the reaction zone. With finer resolution the lead shock pressure is closer to the von Neumann spike pressure, and there is less noise in the rarefaction wave due to fluctuations within the reaction zone. As a result, the average error decreases. The pointwise error is still dominated by the smearing of the pressure kink in the vicinity of the sonic point, which occurs at the end of the reaction zone.
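    The quoted resolutions can be checked with simple arithmetic by dividing the reaction-zone width by each cell size; a sketch (the 25/100/200 figures in the abstract appear to be rounded):

```python
# Number of computational cells spanning the 0.18 mm SURF reaction zone
# at each of the three mesh resolutions studied.
reaction_zone_um = 0.18 * 1000  # reaction-zone width in micrometres

for cell_um in (8, 2, 1):
    cells = reaction_zone_um / cell_um
    print(f"{cell_um} um cells -> ~{cells:.0f} points in the reaction zone")
```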

  17. Test-retest reliability of probe-microphone verification in children fitted with open and closed hearing aid tips. (United States)

    Kim, Hannah; Ricketts, Todd A


    To investigate the test-retest reliability of real-ear aided response (REAR) measures in open and closed hearing aid fittings in children, using appropriate probe-microphone calibration techniques (stored equalization for open fittings and concurrent equalization for closed fittings). Probe-microphone measurements were completed for two mini behind-the-ear (BTE) hearing aids coupled to the ear using open and closed eartips via thin (0.9 mm) tubing. Before probe-microphone testing, the gain of each test hearing aid was programmed using an artificial ear simulator (IEC 711) and a Knowles Electronics Manikin for Acoustic Research to match the National Acoustic Laboratories-Non-Linear, version 1 targets for one of two separate hearing loss configurations using an Audioscan Verifit. No further adjustments were made, and the same amplifier gain was used within each hearing aid across both eartip configurations and all participants. Probe-microphone testing included real-ear occluded response (REOR) and REAR measures using the Verifit's standard speech signal (the carrot passage) presented at 65 dB sound pressure level (SPL). Two repeated probe-microphone measures were made for each participant, with the probe tube and hearing aid removed and repositioned between trials, in order to assess intrasubject measurement variability. These procedures were repeated using both open and closed domes. Thirty-two children, ages ranging from 4 to 14 yr, participated. The test-retest standard deviations for open and closed measures did not exceed 4 dB at any frequency. There was also no significant difference between the open (stored equalization) and closed (concurrent equalization) methods. Reliability was particularly similar in the high frequencies and was also quite similar to that reported in previous research. There was no correlation between reliability and age, suggesting high reliability across all ages evaluated. The findings from this study suggest that reliable probe

  18. Verification of Ceramic Structures (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit


    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
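    The Weibull transfer from elementary test data to a full-scale structure mentioned above follows the classical weakest-link size effect; a minimal sketch, with all parameter values invented for illustration (they are not from the ESA guideline):

```python
import math

def weibull_failure_prob(sigma, sigma0, m, volume_ratio=1.0):
    """Weakest-link failure probability: P_f = 1 - exp(-(V/V0)*(sigma/sigma0)**m)."""
    return 1.0 - math.exp(-volume_ratio * (sigma / sigma0) ** m)

# Illustrative values only: a Weibull modulus of 10 and a coupon
# characteristic strength of 100 MPa, typical orders for a ceramic.
m_mod = 10.0
sigma0 = 100.0

p_coupon = weibull_failure_prob(60.0, sigma0, m_mod)
# Same applied stress on a structure with 50x the coupon's stressed volume:
p_structure = weibull_failure_prob(60.0, sigma0, m_mod, volume_ratio=50.0)
# The larger effective volume raises the failure probability at equal stress,
# which is why coupon statistics must be scaled before use at system level.
```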

  19. Laboratory testing and performance verification of the CHARIS integral field spectrograph (United States)

    Groff, Tyler D.; Chilcote, Jeffrey; Kasdin, N. Jeremy; Galvin, Michael; Loomis, Craig; Carr, Michael A.; Brandt, Timothy; Knapp, Gillian; Limbach, Mary Anne; Guyon, Olivier; Jovanovic, Nemanja; McElwain, Michael W.; Takato, Naruhisa; Hayashi, Masahiko


    The CHARIS integral field spectrograph was delivered to the Subaru telescope in April 2016. This paper reports on the laboratory performance of the spectrograph and its current status in the commissioning process, so that observers will better understand the instrument's capabilities. We also discuss the lessons learned during the testing process and their impact on future high-contrast imaging spectrographs for wavefront control.

  20. Performance Evaluation and Quality Assurance Management during the Series Power Tests of LHC Main Lattice Magnets

    CERN Document Server

    Siemko, A


    Within the LHC magnet program, the series production of superconducting dipoles and quadrupoles has recently been completed in industry, and all magnets were cold-tested at CERN. The main features of these magnets are: a two-in-one structure, a 56 mm aperture, two-layer coils wound from 15.1 mm wide Nb-Ti cables, and all-polyimide insulation. This paper reviews the process of power test quality assurance and performance evaluation applied during the LHC magnet series tests. The main test results for magnets tested in both supercritical and superfluid helium, including quench training, conductor performance, magnet protection efficiency and electrical integrity, are presented and discussed in terms of the design parameters and the requirements of the LHC project.

  1. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)


    The document compiles in a single volume several verification and validation studies performed for the PLTEMP/ANL code during the years of its development and improvement. Some studies that are available in the open literature are simply referenced at the outset and are not included in the document. PLTEMP has been used in the conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted; a list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. Model verification is usually done by comparing the code with a hand calculation, a spreadsheet calculation, or a Mathematica calculation. Model validation is done by comparing the code with experimental data or with a more extensively validated code such as RELAP5.

  2. Recommendation to include fragrance mix 2 and hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral) in the European baseline patch test series

    DEFF Research Database (Denmark)

    Bruze, Magnus; Andersen, Klaus Ejner; Goossens, An


    BACKGROUND: The currently used fragrance mix in the European baseline patch test series (baseline series) fails to detect a substantial number of clinically relevant fragrance allergies. OBJECTIVE: To investigate whether it is justified to include hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyr...

  3. Predictive permeability model of faults in crystalline rocks; verification by joint hydraulic factor (JH) obtained from water pressure tests (United States)

    Barani, Hamidreza Rostami; Lashkaripour, Gholamreza; Ghafoori, Mohammad


    In the present study, a new model is proposed to predict the permeability per fracture in fault zones using a new parameter named the joint hydraulic factor (JH). JH is obtained from the Water Pressure Test (WPT) and modified by the degree of fracturing. The results of JH correspond with quantitative fault zone descriptions, qualitative fracture properties, and fault rock properties. In this respect, a case study was carried out based on data collected from the Seyahoo dam site, located in the east of Iran, to provide a permeability prediction model of fault zone structures. Datasets including scan-lines, drill cores, and water pressure tests in terrain of andesite and basalt rocks were used to analyse the variability of in-situ relative permeability across a range from fault zones to host rocks. The rock mass joint permeability quality, therefore, is defined by the JH. JH data analysis showed that the background sub-zone commonly had fractures, whereas the fault core had permeability characteristics nearly as low as the outer damage zone, represented by 8 Lu (1.3 × 10⁻⁴ m³/s) per fracture, with occasional peaks towards 12 Lu (2 × 10⁻⁴ m³/s) per fracture. The maximum JH value belongs to the inner damage zone, marginal to the fault core, with 14-22 Lu (2.3 × 10⁻⁴-3.6 × 10⁻⁴ m³/s) per fracture, locally exceeding 25 Lu (4.1 × 10⁻⁴ m³/s) per fracture. This gives an approximate proportional relationship for JH of 1:4:2 between the fault core, inner damage zone, and outer damage zone of extensional fault zones in crystalline rocks. The results of the verification exercise revealed that the new approach is efficient and that the JH parameter is a reliable scale for fracture permeability change. It can be concluded that using short-duration hydraulic tests (WPTs) and fracture frequency (FF) to calculate the JH parameter makes it possible to describe a complex situation and to compare, discuss, and weigh the hydraulic quality in order to make predictions as to the permeability models and
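    The Lugeon values above convert to the quoted volumetric flows if one Lugeon is read as 1 litre per minute (per metre of test interval at a 1 MPa reference pressure); a sketch of that conversion, under that assumption:

```python
def lugeon_to_m3_per_s(lu):
    """Convert a Lugeon value (read as 1 litre/min per metre of test interval
    at a 1 MPa reference pressure) to a volumetric flow in m^3/s."""
    return lu / 1000.0 / 60.0  # litres -> m^3, minutes -> seconds

# The per-fracture figures quoted in the abstract:
for lu in (8, 12, 14, 22, 25):
    print(f"{lu:>2} Lu ~ {lugeon_to_m3_per_s(lu):.1e} m^3/s")
```

8 Lu comes out as about 1.3 × 10⁻⁴ m³/s and 12 Lu as 2 × 10⁻⁴ m³/s, closely matching the abstract's figures; the 22 and 25 Lu values land within rounding of the quoted ones.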

  4. Hypersensitivity reactions to metallic implants-diagnostic algorithm and suggested patch test series for clinical use

    DEFF Research Database (Denmark)

    Schalock, Peter C; Menné, Torkil; Johansen, Jeanne D


    Cutaneous and systemic hypersensitivity reactions to implanted metals are challenging to evaluate and treat. Although they are uncommon, they do exist, and require appropriate and complete evaluation. This review summarizes the evidence regarding evaluation tools, especially patch and lymphocyte transformation tests, for hypersensitivity reactions to implanted metal devices. Patch test evaluation is the gold standard for metal hypersensitivity, although the results may be subjective. Regarding pre-implant testing, those patients with a reported history of metal dermatitis should be evaluated by patch... An algorithm to guide the selection of screening allergen series for patch testing is provided. At a minimum, an extended baseline screening series and metal screening is necessary. Static and dynamic orthopaedic implants, intravascular stent devices, implanted defibrillators and dental and gynaecological...

  5. Testing and Demonstrating Speaker Verification Technology in Iraqi-Arabic as Part of the Iraqi Enrollment Via Voice Authentication Project (IEVAP) in Support of the Global War on Terrorism (GWOT)

    National Research Council Canada - National Science Library

    Withee, Jeffrey W; Pena, Edwin D


    This thesis documents the findings of an Iraqi-Arabic language test and concept of operations for speaker verification technology as part of the Iraqi Banking System in support of the Iraqi Enrollment...


    Energy Technology Data Exchange (ETDEWEB)



    Active well coincidence counter assays have been performed on uranium metal highly enriched in ²³⁵U. The data obtained in the present program, together with highly enriched uranium (HEU) metal data obtained in other programs, have been analyzed using two approaches: the standard approach and an alternative approach developed at BNL. Analysis of the data with the standard approach revealed that the form of the relationship between the measured reals and the ²³⁵U mass varied, being sometimes linear and sometimes a second-order polynomial. In contrast, application of the BNL algorithm, which takes the totals into consideration, consistently yielded linear relationships between the totals-corrected reals and the ²³⁵U mass. The constants in these linear relationships varied with geometric configuration and level of enrichment. This indicates that, when the BNL algorithm is used, calibration curves can be established with fewer data points and with more certainty than with a standard algorithm. However, this potential advantage has only been established for assays of HEU metal. In addition, the method is sensitive to the stability of the natural background in the measurement facility.
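    The reported linearity of the BNL approach amounts to fitting a straight calibration line of totals-corrected reals against ²³⁵U mass; a minimal least-squares sketch on invented data (the numbers are illustrative only, not measured values):

```python
# Hypothetical calibration points: (235U mass in g, totals-corrected reals rate).
# Values are invented; the point is the straight-line fit, which the BNL
# algorithm is reported to yield consistently.
masses = [200.0, 500.0, 1000.0, 2000.0, 4000.0]
rates  = [12.1,  29.8,  60.5,  119.7, 240.2]

n = len(masses)
mx = sum(masses) / n
my = sum(rates) / n
slope = sum((x - mx) * (y - my) for x, y in zip(masses, rates)) / \
        sum((x - mx) ** 2 for x in masses)
intercept = my - slope * mx

# With the calibration line in hand, an unknown item's mass follows
# from its measured totals-corrected reals rate:
mass_estimate = (150.0 - intercept) / slope
```

Fewer points suffice for such a line than for a second-order polynomial, which is the practical advantage the abstract attributes to the BNL algorithm.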


    EPA's Environmental Technology Verification program is designed to further environmental protection by accelerating the acceptance and use of improved and cost-effective technologies. This is done by providing high-quality, peer-reviewed data on technology performance to those in...

  8. NASA Boeing 757 HIRF test series low power on-the-ground tests

    Energy Technology Data Exchange (ETDEWEB)

    Poggio, A.J.; Pennock, S.T.; Zacharias, R.A.; Avalle, C.A.; Carney, H.L. [National Aeronautics and Space Administration, Langley AFB, VA (United States). Langley Research Center


    The data acquisition phase of a program intended to provide data for the validation of computational, analytical, and experimental techniques for the assessment of electromagnetic effects in commercial transports; for the checkout of instrumentation for following test programs; and for the support of protection engineering of airborne systems has been completed. Funded by the NASA Fly-By-Light/Power-By-Wire Program, the initial phase involved on-the-ground electromagnetic measurements using the NASA Boeing 757 and was executed in the LESLI Facility at the USAF Phillips Laboratory. The major participants in this project were LLNL, NASA Langley Research Center, Phillips Laboratory, and UIE, Inc. The tests were performed over a five-week period from September through November 1994. Measurements were made of the fields coupled into the aircraft interior and of the signals induced in selected structures and equipment under controlled illumination by RF fields. A characterization of the ground was also performed to permit ground effects to be included in forthcoming validation exercises. This report and the associated test plan included as an appendix define the overall on-the-ground test program. They include descriptions of the test rationale and test layout, and samples of the data. In this report, a detailed description of each executed test is provided, as is the data identification (data id) relating each specific test to its relevant data files. Samples of some inferences from the data that will be useful in protection engineering and EM effects mitigation are also presented. The test plan that guided the execution of the tests, a test report by UIE, Inc., and the report describing the concrete pad characterization are included as appendices.

  9. Standard test method for exfoliation corrosion susceptibility in 2XXX and 7XXX Series Aluminum Alloys (EXCO Test)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 This test method covers a procedure for constant immersion exfoliation corrosion (EXCO) testing of high-strength 2XXX and 7XXX series aluminum alloys. Note 1—This test method was originally developed for research and development purposes; however, it is referenced, in specific material specifications, as applicable for evaluating production material (refer to Section 14 on Precision and Bias). 1.2 This test method applies to all wrought products such as sheet, plate, extrusions, and forgings produced from conventional ingot metallurgy process. 1.3 This test method can be used with any form of specimen or part that can be immersed in the test solution. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  10. Patch testing with the European baseline series fragrance markers: A 2016 update. (United States)

    Ung, C Y; White, J M L; White, I R; Banerjee, P; McFadden, J P


    Fragrance contact allergy is common and is currently screened for with the European baseline series fragrance markers: Fragrance Mix 1, Fragrance Mix 2, Myroxylon pereirae and hydroxyisohexyl 3-cyclohexene carboxaldehyde. To investigate the validity of patch testing with these fragrance markers in detecting fragrance allergy to the 26 individual fragrance substances for which cosmetic ingredient labelling is mandatory in the European Union, we conducted a retrospective review of the patch test records of all eczema patients who underwent testing with the European baseline series extended with the individual fragrance substances during the period 2015-2016. 359 patients (17.2%) reacted to one or more allergens from the labelled fragrance substance series and/or a fragrance marker from the European baseline series. The allergens positive with the greatest frequencies were oxidised linalool (154; 7.4%, 95% CI 6.3%-8.6%), oxidised limonene (89; 4.3%, 95% CI 3.4%-5.2%) and Evernia furfuracea (44; 2.1%, 95% CI 1.5%-2.8%). Of the 319 patients who reacted to any of the labelled fragrance substances, only 130 (40.8%) also reacted to a baseline series fragrance marker. The sensitivity of our history-taking for fragrance allergy was 25.7%. Given these evolving trends in fragrance allergy, patch testing with Fragrance Mix 1, Fragrance Mix 2, Myroxylon pereirae and hydroxyisohexyl 3-cyclohexene carboxaldehyde is no longer sufficient to screen for fragrance allergy. This article is protected by copyright. All rights reserved.
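    The confidence intervals quoted above are consistent with a normal-approximation interval for a binomial proportion; a sketch, where the total of roughly 2087 patients tested is inferred from 359 being 17.2% of the cohort and should be treated as an assumption:

```python
import math

def proportion_ci(k, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion k/n."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

n_tested = 2087                       # assumed: 359 reactors = 17.2% of cohort
lo, hi = proportion_ci(154, n_tested)  # oxidised linalool positives
# The interval lands near the 6.3%-8.6% quoted for oxidised linalool.
```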

  11. Mars2020 Entry, Descent, and Landing Instrumentation 2 (MEDLI2) Do No Harm Test Series (United States)

    Swanson, Gregory; Santos, Jose; White, Todd; Bruce, Walt; Kuhl, Chris; Wright, Henry


    A total of seventeen instrumented thermal sensor plugs, eight pressure transducers, two heat flux sensors, and one radiometer are planned to be utilized on the Mars 2020 mission's thermal protection system (TPS) as part of the Mars Entry, Descent, and Landing Instrumentation II (MEDLI2) project. Of the MEDLI2 instrumentation, eleven instrumented thermal plugs and seven pressure transducers will be installed on the heatshield of the Mars 2020 vehicle, while the rest will be installed on the backshell. The goal of the MEDLI2 instrumentation is to directly inform the large performance uncertainties that contribute to the design and validation of a Mars entry system. A better understanding of the entry environment and TPS performance could lead to reduced design margins, enabling a greater payload mass fraction and smaller landing ellipses. To prove that the MEDLI2 system will not degrade the performance of the Mars 2020 TPS, an aerothermal Do No Harm (DNH) test series was designed and conducted. As on Mars 2020's predecessor, the Mars Science Laboratory (MSL), the heatshield material will be Phenolic Impregnated Carbon Ablator (PICA); the Mars 2020 entry conditions are enveloped by the MSL design environments, so the development and qualification testing performed during MEDLI is sufficient to show that the similar MEDLI2 heatshield instrumentation will not degrade PICA performance. However, given that MEDLI did not include any backshell instrumentation, the MEDLI2 team was required to design and execute a DNH test series utilizing the backshell TPS material (SLA-561V) with the intended flight sensor suite. To meet the requirements handed down from Mars 2020, the MEDLI2 DNH test series emphasized the interaction of the MEDLI2 sensors and sensing locations with the surrounding backshell TPS and substructure. These interactions were characterized by performing environmental testing of four 12 by 12 test panels, which mimicked the construction of the backshell TPS and

  12. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft (United States)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John


    Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete and the point at which FAA certification officials agree it is complete. Certification of adaptive flight control software is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper presents the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  13. Experiment data report: Gap Conductance Test Series, Test GC 1-3 postirradiation examination. [BWR]

    Energy Technology Data Exchange (ETDEWEB)

    Murdock, B. A.


    The results of the postirradiation examination of four boiling water reactor type, zircaloy-clad, UO/sub 2/-fueled rods tested as part of the Thermal Fuels Behavior Program are discussed. These rods were employed in Gap Conductance Test GC 1-3 which was conducted to obtain experimental data from which test rod gap conductance values could be determined by both the steady state integral kdT and the power oscillation methods. The postirradiation examination results provided will aid in interpreting and understanding the experimental data obtained during Test GC 1-3 and in evaluating the effect of fuel behavior on the fuel rod thermal response and interpreted gap conductances. Fuel rod fill gas composition and pressure and rod power profiles are discussed. Evidence is presented showing that significant amounts of water had been present in two of the four fuel rods during testing. For the two fuel rods that remained intact during the test, measurements of fuel pellet-to-cladding gap, as well as the surface area of the fuel cracks at several axial locations are presented. A total effective radial gap is calculated and the fuel structure and porosity are analyzed.

  14. Operational Overview for UAS Integration in the NAS Project Flight Test Series 3 (United States)

    Valkov, Steffi B.; Sternberg, Daniel; Marston, Michael


The National Aeronautics and Space Administration Unmanned Aircraft Systems Integration in the National Airspace System Project has conducted a series of flight tests intended to support the reduction of barriers that prevent unmanned aircraft from flying without the required waivers from the Federal Aviation Administration. The 2015 Flight Test Series 3 supported two separate test configurations. The first configuration investigated the timing of Detect and Avoid alerting thresholds using a radar-equipped unmanned vehicle and multiple live intruders flown at varying encounter geometries. The second configuration included a surrogate unmanned vehicle (flown from a ground control station, with a safety pilot on board) flying a mission in a virtual air traffic control airspace sector using research pilot displays and Detect and Avoid advisories to maintain separation from live and virtual aircraft. The test was conducted over an eight-week span within the R-2508 Special Use Airspace. Over 200 encounters were flown for the first configuration, and although the second configuration was cancelled after three data collection flights, Flight Test 3 proved invaluable for planning, managing, and executing this type of integrated flight test.

  15. Verification Games: Crowd-Sourced Formal Verification (United States)




    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, S. K.; Martinson, Z. R.


The results of the pretest analyses for Test RIA 1-4 are presented. Test RIA 1-4 consists of a 3x3 array of previously irradiated MAP! fuel rods. The rods have 5.7% enriched UO{sub 2} fuel in zircaloy-4 cladding with an average burnup of 5300 MWd/t. The objective of Test RIA 1-4 is to provide information regarding loss of coolable fuel rod geometry following an RIA event for a radial-average peak fuel enthalpy equivalent to the present licensing criterion of 1172 J/g (280 cal/g UO{sub 2}). Radial-average peak fuel enthalpies of 1172 J/g (280 cal/g), 1077 J/g (257 cal/g), and 978 J/g (234 cal/g) for the corner, side, and center fuel rods, respectively, are planned to be achieved during a 2.7 ms reactor period power burst. The results of the FRAP-T5 analyses indicate that all nine rods will fail within 26 ms from the start of the power burst due to pellet-cladding mechanical interaction. All of the rods will undergo partial fuel melting. All rods will operate under extended film boiling (>30 s) conditions, and about 70% of the cladding length is expected to be molten. Approximately 15% of the cladding thickness will be oxidized. Fuel swelling due to fission gas release and melting, combined with fuel and cladding fragmentation, will probably produce a complete coolant flow blockage within the flow shroud.
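The enthalpy targets above are cal/g limits expressed in SI units. As a quick plausibility check of those conversions (assuming the thermochemical calorie, 4.184 J/cal, since the source does not state which calorie convention applies):

```python
# Convert the quoted RIA fuel-enthalpy targets from cal/g to J/g.
# Assumption: thermochemical calorie (4.184 J); the abstract's quoted
# J/g values agree with this conversion to within a few J/g.
CAL_TO_J = 4.184

def cal_per_g_to_j_per_g(cal_per_g):
    """Convert a radial-average peak fuel enthalpy from cal/g to J/g."""
    return cal_per_g * CAL_TO_J

for cal in (280, 257, 234):  # corner, side, and center rod targets
    print(f"{cal} cal/g -> {cal_per_g_to_j_per_g(cal):.0f} J/g")
```

The quoted 1172, 1077, and 978 J/g values match this conversion to within rounding of the calorie convention used.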

  17. AGARD Flight Test Instrumentation Series. Volume 15. Gyroscopic Instruments and Their Application to Flight Testing (United States)


… all flight test applications are on the market. Typical values for these errors are … (2.46) … (2.47). Environmental influences such as … characterize the Foucault-modulated Schuler oscillation, i.e. the oscillation of a freely swinging pendulum of length R on the rotating earth. The … usage. Platform systems now available on the market can, for instance, no longer be operated if there is failure of one gyro. The sensors still

  18. Primin in the European standard patch test series for 20 years

    DEFF Research Database (Denmark)

    Zachariae, Claus; Engkilde, Kåre; Johansen, Jeanne Duus


Primin was included in the European standard series (ESS) in 1984. In 2000, a primin-free variant of Primula obconica, the main source of contact allergy to primin, was introduced on the market. The aim of this study was to analyse the trends of primin allergy in 13 986 consecutively patch-tested … allergy to primin was seen (P … age groups. The frequency was 0.5% during 2000-2004. Contact allergy has been rare since 2000. The low frequency of positive patch tests to primin does not support inclusion in the ESS in our region.

  19. Patch testing with rubber series in Europe: a critical review and recommendation. (United States)

    Warburton, Katharine L; Uter, Wolfgang; Geier, Johannes; Spiewak, Radoslaw; Mahler, Vera; Crépy, Marie-Noëlle; Schuttelaar, Marie Louise; Bauer, Andrea; Wilkinson, Mark


    Rubber additives constitute an important group of contact allergens, particularly in certain occupations. To collect information regarding the current practice of using a 'rubber series' in Europe, and discuss this against the background of evidence concerning the prevalence of allergy in order to derive a recommendation for a 'European rubber series'. The following were performed: (i) a survey targeting all members of the COST action 'StanDerm' consortium, (ii) analysis of rubber contact allergy data in the database of the European Surveillance System on Contact Allergies, and (iii) a literature review. Information from 13 countries was available, from one or several departments of dermatology, and occasionally occupational health. Apart from some substances tested only in single departments, a broad overlap regarding important allergens was evident, but considerable variation existed between departments. An up-to-date 'European rubber series' is recommended, with the exclusion of substances only of historical concern. A 'supplementary rubber series' containing allergens of less proven importance, requiring further analysis, is recommended for departments specializing in occupational contact allergy. These should be continually updated as new evidence emerges. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. A new non-parametric stationarity test of time series in the time domain

    KAUST Repository

    Jin, Lei


    © 2015 The Royal Statistical Society and Blackwell Publishing Ltd. We propose a new double-order selection test for checking second-order stationarity of a time series. To develop the test, a sequence of systematic samples is defined via Walsh functions. Then the deviations of the autocovariances based on these systematic samples from the corresponding autocovariances of the whole time series are calculated and the uniform asymptotic joint normality of these deviations over different systematic samples is obtained. With a double-order selection scheme, our test statistic is constructed by combining the deviations at different lags in the systematic samples. The null asymptotic distribution of the statistic proposed is derived and the consistency of the test is shown under fixed and local alternatives. Simulation studies demonstrate well-behaved finite sample properties of the method proposed. Comparisons with some existing tests in terms of power are given both analytically and empirically. In addition, the method proposed is applied to check the stationarity assumption of a chemical process viscosity readings data set.
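The paper's Walsh-function systematic samples and double-order selection statistic are too involved for a short sketch, but the underlying idea, comparing subsample autocovariances against the whole-series autocovariances, can be illustrated with a much simpler block-based variant (the block construction, lags, and data here are illustrative, not from the paper):

```python
import numpy as np

def autocov(x, lag):
    """Sample autocovariance at a given lag (denominator n)."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    n = len(x)
    return float(np.dot(xc[:n - lag], xc[lag:]) / n)

def block_deviations(x, n_blocks=4, max_lag=2):
    """Deviations of per-block autocovariances from the whole-series
    autocovariances; under second-order stationarity these are small."""
    x = np.asarray(x, dtype=float)
    return [autocov(b, lag) - autocov(x, lag)
            for b in np.array_split(x, n_blocks)
            for lag in range(max_lag + 1)]

rng = np.random.default_rng(1)
stationary = rng.standard_normal(2000)
# standard deviation triples halfway through: second-order nonstationary
var_break = np.concatenate([rng.standard_normal(1000),
                            3.0 * rng.standard_normal(1000)])
d_stat = max(abs(d) for d in block_deviations(stationary))
d_break = max(abs(d) for d in block_deviations(var_break))
print(f"max |deviation|, stationary series: {d_stat:.3f}")
print(f"max |deviation|, variance-break series: {d_break:.3f}")
```

The variance-break series produces much larger deviations than the stationary one; the paper's contribution is to make this comparison rigorous, deriving the joint asymptotic normality of such deviations and combining them into a consistent test statistic.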

  1. Patch and Prick Tests in Hand Eczema: Results of A Sixty Seven Patient Series

    Directory of Open Access Journals (Sweden)

    Bilge Fettahlıoğlu Karaman


Full Text Available Objective: The patch and prick tests have a place in the management of patients with hand eczema. In this study, we investigated whether certain clinical features of patients with hand eczema could predict skin test results. Methods: At Çukurova University Faculty of Medicine, 67 consecutive patients with hand eczema were evaluated in terms of disease duration, morphology, and severity. All patients underwent patch testing with the European Standard Series and prick testing with routine aeroallergens. Results: The patch test was positive for at least one allergen in 46.3% of the patients, whereas this rate was 23.9% for the prick test. Patients who had complained of hand eczema for at least three years were significantly more likely to have contact sensitivity [odds ratio (OR) 0.9]. Although not statistically significant, patients with keratotic and/or licheniform hand eczema were less likely to be sensitized (OR 0.3). The severity of hand eczema was not predictive of patch test positivity, and no predictor of prick test positivity was found. Conclusion: We strongly recommend patch testing in all patients with long-standing hand eczema.

  2. SB-LOCA beyond the design basis in a PWR experimental verification of am procedures in the PKL test facility

    Energy Technology Data Exchange (ETDEWEB)

    Mull, T.; Schoen, B.; Umminger, K.; Wegner, R. [Framatome ANP GmbH, Erlangen (Germany)


The integral test facility PKL at the Technical Center of Framatome ANP (formerly Siemens/KWU) in Erlangen, Germany, simulates a 1300 MWe western-type PWR. It is scaled 1:145 in power and volume, with original elevations. It features the entire primary side, including four symmetrically arranged coolant loops and auxiliary and safety systems, as well as the major part of the secondary side. The test series PKL III D, which was finished at the end of 1999, aimed at the exploration of safety margins and at the efficiency and optimization of operator-initiated accident management (AM) procedures. Among others, several tests with small primary breaks combined with additional system failures were performed. This presentation describes test D3.1. The scenario under investigation was a small primary break (24 cm{sup 2}) with simultaneous failure of the high pressure safety injection (HPSI), a beyond-design-basis scenario. For the German 1300 MWe PWRs, under such additional failure conditions, SB-LOCAs with leak sizes below 25 cm{sup 2} account for 18% of the integral core damage frequency (CDF). This integral CDF can be estimated to be 3.1*10{sup -6} per year if no credit is taken for AM procedures. The break location in the test under consideration was in the cold leg between the reactor coolant pump (RCP) and the reactor pressure vessel (RPV). The assumed aggravating circumstances were HPSI failure and unavailability of 2 steam generators (SGs) as well as 3 out of 4 main steam relief and control valves (MS-RCV). The extra borating system was switched to injection mode at low pressurizer level but, by itself, would have been unable to maintain enough coolant to avoid the core being uncovered before the pressure reached the setpoint of the accumulators (ACCs). The accident was managed by additional utilization of the chemical and volume control system (CVCS) to inject water and partly compensate the leak rate. The plant could be cooled down by 2 SGs using only one MS-RCV.

  3. Approximate k-NN delta test minimization method using genetic algorithms: Application to time series

    CERN Document Server

    Mateo, F; Gadea, Rafael; Sovilj, Dusan


    In many real world problems, the existence of irrelevant input variables (features) hinders the predictive quality of the models used to estimate the output variables. In particular, time series prediction often involves building large regressors of artificial variables that can contain irrelevant or misleading information. Many techniques have arisen to confront the problem of accurate variable selection, including both local and global search strategies. This paper presents a method based on genetic algorithms that intends to find a global optimum set of input variables that minimize the Delta Test criterion. The execution speed has been enhanced by substituting the exact nearest neighbor computation by its approximate version. The problems of scaling and projection of variables have been addressed. The developed method works in conjunction with MATLAB's Genetic Algorithm and Direct Search Toolbox. The goodness of the proposed methodology has been evaluated on several popular time series examples, and also ...
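The Delta Test criterion the genetic algorithm minimizes can be sketched compactly. The version below uses an exact brute-force nearest-neighbour search rather than the approximate k-NN of the paper, and the data and variable names are illustrative, not from the paper:

```python
import numpy as np

def delta_test(X, y):
    """Delta Test: nonparametric estimate of the output noise variance,
    delta = (1 / 2N) * sum_i (y[nn(i)] - y[i])**2,
    where nn(i) is the nearest neighbour of X[i] in input space."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(y)
    # brute-force pairwise squared distances; exclude self-matches
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)
    nn = d2.argmin(axis=1)
    return float(((y[nn] - y) ** 2).sum() / (2 * n))

# A subset containing only the relevant input should score close to the
# true noise variance; adding irrelevant inputs inflates the estimate.
rng = np.random.default_rng(0)
x_rel = rng.uniform(-1.0, 1.0, size=(400, 1))   # relevant input
x_irr = rng.uniform(-1.0, 1.0, size=(400, 3))   # irrelevant inputs
y = np.sin(3.0 * x_rel[:, 0]) + 0.1 * rng.standard_normal(400)
d_rel = delta_test(x_rel, y)
d_all = delta_test(np.hstack([x_rel, x_irr]), y)
print(f"Delta Test, relevant input only: {d_rel:.4f}")
print(f"Delta Test, with 3 irrelevant inputs: {d_all:.4f}")
```

In the paper's setting, a genetic algorithm searches over binary inclusion masks of the candidate inputs, using this value as the fitness to minimize; the subset with only the relevant input scores near the true noise variance (0.01 here), while irrelevant inputs inflate the estimate.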

  4. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)


    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  5. VISAR Validation Test Series at the Light Initiated High Explosive (LIHE) facility.

    Energy Technology Data Exchange (ETDEWEB)

    Covert, Timothy Todd


A velocity interferometer system for any reflector (VISAR) was recently deployed at the Light Initiated High Explosive (LIHE) facility to measure the velocity of an explosively accelerated flyer plate. The velocity data from the flyer plate experiments, using the vendor's fringe constant of 100 m/s/fringe, were consistently lower than model predictions. The goal of the VISAR validation test series was to confirm the VISAR system fringe constant. A low-velocity gas gun was utilized to impact and accelerate a target at the LIHE facility. VISAR velocity data from the accelerated target were compared against an independent velocity measurement. The data from this test series did in fact reveal that the fringe constant was significantly higher than the vendor's specification. The correct fringe constant for the LIHE VISAR system has been determined to be 123 m/s/fringe. The LIHE facility recently completed a Phase I test series to develop an explosively accelerated flyer plate (X-Flyer). The X-Flyer impulse technique consists of first spraying a thin layer of silver acetylide-silver nitrate explosive onto a thin flyer plate. The explosive is then initiated using an intense flash of light. The explosive detonation accelerates the flyer across a small air gap towards the test item. The impact of the flyer with the test item creates a shock pulse and an impulsive load in the test unit. The goal of Phase I of the X-Flyer development series was to validate the technique theory and design process. One of the key parameters that control the shock pulse and impulsive load is the velocity of the flyer at impact. To measure this key parameter, a VISAR was deployed at the LIHE facility. The VISAR system was assembled by Sandia personnel from the Explosive Projects and Diagnostics department. The VISAR was a three-leg, push-pull system using a fixed delay cavity. The primary optical components
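Because a VISAR velocity is the fringe count times the fringe constant, the corrected constant rescales previously reduced data by a simple ratio. A minimal sketch (the two constants are from the abstract; the example velocity is illustrative):

```python
# How the validated fringe constant rescales velocities that were
# reduced with the vendor's specification value.
VENDOR_VPF = 100.0     # m/s per fringe, vendor specification
VALIDATED_VPF = 123.0  # m/s per fringe, LIHE validation result

def corrected_velocity(reported_m_s):
    """Rescale a velocity reduced with the vendor VPF to the validated VPF.

    Velocity = (fringe count) x (fringe constant), so the fringe count
    cancels and only the ratio of the two constants matters.
    """
    return reported_m_s * VALIDATED_VPF / VENDOR_VPF

print(corrected_velocity(500.0))  # a reported 500 m/s becomes 615 m/s
```

This 23% upward rescaling is consistent with the observation that the vendor-constant data were consistently lower than model predictions.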

  6. Spectral-Based Volume Sensor Testbed Algorithm Development, Test Series VS2 (United States)


Fragments of the report's abbreviation list survive: SFA, Smoke and Fire Alert, a VIDS product of Fastcom Technology, S.A.; SigniFire, a VIDS product of axonX, LLC; SP, Shortpass; VS3, Volume Sensor Test Series 3; VSD-8, Visual Smoke Detection System, a VIDS product of Fire Sentry Corp. … 189 through -253 … Manuscript approved … substitute for VS2-223, Smoldering Cables, FOV …

  7. Fuel Cycle Research and Development Accident Tolerant Fuels Series 1 (ATF-1) Irradiation Testing FY 2016 Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Core, Gregory Matthew [Idaho National Lab. (INL), Idaho Falls, ID (United States)


This report contains a summary of irradiation testing of Fuel Cycle Research and Development (FCRD) Accident Tolerant Fuels Series 1 (ATF-1) experiments performed at Idaho National Laboratory (INL) in FY 2016. ATF-1 irradiation testing work performed in FY 2016 included design, analysis, and fabrication of ATF-1B drop-in capsule ATF-1 series experiments and irradiation testing of ATF-1 capsules in the ATR.

  8. The construction of environments for development of test and verification technology -The development of advanced instrumentation and control technology-

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Shick; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Kim, Jae Hee; Lee, Chang Soo [Korea Atomic Energy Res. Inst., Taejon (Korea, Republic of)


Several problems were identified in digitalizing the I and C systems of NPPs. The resolution scheme is divided into hardware and software. Hardware verification and validation addressed common-mode failure, the commercial-grade dedication process, and electromagnetic compatibility. We reviewed codes and standards to establish consensus criteria among vendors and licensers. We then described the software licensing procedures under 10 CFR 50 and 10 CFR 52 of the United States Nuclear Regulatory Commission (NRC) and presented vendors' approaches to coping with the licensing barrier. Finally, we surveyed the technical issues related to developing and licensing high-integrity software for digital I and C systems. (Author).

  9. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin


Full Text Available This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, and compatibility. The methods differ both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification, with particular attention to symbolic execution. The review of static analysis discusses the deductive method and model-checking methods, and the pros and cons of each method are emphasized. A classification of test techniques is considered for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies that can reduce the number of false positives in situations where the current state of the program combines two or more states obtained on different execution paths or in working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified with static methods. The article also addresses the identification of dependencies within the framework of abstract interpretation and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, together with the kinds of tools that can be applied when using them. Based on this work, a conclusion is drawn describing the most relevant problems of these analysis techniques and methods for their solution and

  10. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2 (United States)

    Platt, R.


This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.


    Directory of Open Access Journals (Sweden)

    Maria Dencheva


Full Text Available Restoration of teeth and tooth rows in the maxillofacial area is the final stage of patient treatment and a basic goal for dentists. A varied set of modern and classic dental materials is used for this purpose. For every patient with proven allergy to different kinds of allergens, the choice of each material during treatment is very specific and strictly individual. In everyday oral diagnostics, a standardized set of diagnostic allergens is used to prove allergy to dental materials. The set has been developed on the basis of all dental materials existing and permitted by the Bulgarian authorities, as well as professional series. The difference between the standardized diagnostic allergens developed in our country and the existing ready-for-use series is that the former are made from the final product, i.e. the material in the form introduced into the oral cavity, where it persists for varying periods of time, sometimes for tens of years. This allows early or late contact allergic reactions with symptoms in the oral cavity and on the skin, the maxillofacial area, the head and neck, and the entire organism. The current article introduces the results obtained in research project №28/2011, “Research on the type of sensibilisation to contemporary dental materials and development of set of allergens for its diagnosing through epicutaneous patch testing”, funded by the Committee of Medical Science of MU Sofia (CMC). Through this project, the first Bulgarian series for epicutaneous testing was created and initially studied, with the aim of proving the allergenic potential of the dental materials most frequently used by dentists.

  12. Automatic Verification of Autonomous Robot Missions (United States)


… for a mission related to the search for a biohazard. Keywords: mobile robots, formal verification, performance guarantees, automatic translation. … tested. Related Work: formal verification of systems is critical when failure creates a high cost, such as life-or-death scenarios. A variety of … robot. PARS: process algebras are specification languages that allow for formal verification of concurrent systems. Process Algebra for Robot …

  13. Verification-based Software-fault Detection


    Gladisch, Christoph David


Software is used in many safety- and security-critical systems. Software development is, however, an error-prone task. In this dissertation, new techniques for the detection of software faults (or software "bugs") are described which are based on a formal deductive verification technology. The described techniques take advantage of information obtained during verification and combine verification technology with deductive fault detection and test generation in a unified way.

  14. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)


A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for an extended period of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
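As a rough illustration of what a hydrostatic column model computes, the sketch below relates wellhead pressure to cavern pressure through the hydrostatic head of the stacked fluid columns in the well. This is a generic sketch with assumed round-number densities and heights, not the SNL HCM or SPR data:

```python
# Generic hydrostatic-column sketch: wellhead pressure equals cavern
# pressure minus the sum of rho*g*h over the fluid columns in the well.
# All densities, heights, and the cavern pressure are assumed values.
G = 9.81  # m/s^2

def wellhead_pressure(cavern_pressure_pa, columns):
    """columns: list of (density kg/m^3, height m), from wellhead down."""
    head = sum(rho * G * h for rho, h in columns)
    return cavern_pressure_pa - head

# nitrogen over crude oil over brine, heights in metres (illustrative)
cols = [(200.0, 300.0), (850.0, 400.0), (1200.0, 100.0)]
p_cavern = 12.0e6  # Pa, assumed cavern pressure
print(wellhead_pressure(p_cavern, cols) / 1e6, "MPa at wellhead")
```

In a leak-tight well the interface positions, and hence the column heights, stay fixed, so the predicted wellhead pressure tracks cavern pressure; a drifting interface changes the head term and shows up as the distinct pressure behavior the model is meant to flag.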

  15. Recent achievements on tests of series gyrotrons for W7-X and planned extension at the KIT gyrotron test facility

    Energy Technology Data Exchange (ETDEWEB)

    Schmid, M., E-mail: [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Institute for Pulsed Power and Microwave Technology (IHM) (Germany); Choudhury, A. Roy; Dammertz, G. [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Institute for Pulsed Power and Microwave Technology (IHM) (Germany); Erckmann, V. [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Max-Planck-Institute for Plasmaphysics, Association EURATOM-IPP, Greifswald (Germany); Gantenbein, G.; Illy, S. [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Institute for Pulsed Power and Microwave Technology (IHM) (Germany); Jelonnek, J. [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Institute for Pulsed Power and Microwave Technology (IHM) (Germany); Institute of High Frequency Techniques and Electronics (IHE) (Germany); Kern, S. [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Institute for Pulsed Power and Microwave Technology (IHM) (Germany); Legrand, F. [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Thales Electron Devices, Vélicy (France); Rzesnicki, T.; Samartsev, A. [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Institute for Pulsed Power and Microwave Technology (IHM) (Germany); Schlaich, A.; Thumm, M. [Karlsruhe Institute of Technology, Association EURATOM-KIT, Karlsruhe (Germany); Institute for Pulsed Power and Microwave Technology (IHM) (Germany); Institute of High Frequency Techniques and Electronics (IHE) (Germany)


Highlights: ► Solution found to suppress parasitic beam tunnel oscillations on high power gyrotrons. ► Electron beam sweeping technique to avoid plastic deformation on collector of high power gyrotrons. ► Ongoing investigations on limitations of gyrotron efficiency. ► Upgrade of 10 MW CW modulator for gyrotrons with multistage depressed collectors. -- Abstract: Parasitic beam tunnel oscillations have been hampering the series production of gyrotrons for W7-X. This problem has now been overcome thanks to the introduction of a specially corrugated beam tunnel. Two gyrotrons equipped with the new beam tunnel have fully passed the acceptance tests. Despite excellent power capability, the expected efficiency has not yet been achieved, possibly due to the presence of parasitic oscillations suspected to be dynamic after-cavity interactions (ACIs) or due to insufficient electron beam quality. Both theoretical and experimental investigations on these topics are ongoing. On previous W7-X gyrotrons collector fatigue has been observed, though it has not (yet) led to any failures. The plastic deformation occurring on the collector has now been eliminated by the strict use (on all gyrotrons) of a sweeping method which combines the conventional 7 Hz solenoid sweeping technique with a 50 Hz transverse-field sweep system. Starting in 2013, the gyrotron test facility at KIT will be enhanced, chiefly with a new 10 MW DC modulator capable of testing gyrotrons up to 4 MW CW output power with multi-stage depressed collectors.

  16. Acceptance tests and their results for 1st Pre-Series Cryoline (PTCL) of ITER (United States)

    Kapoor, H.; Garg, A.; Shah, N.; Muralidhara, S.; Choukekar, K.; Dash, B.; Gaur, V.; Madeenavalli, S.; Patel, P.; Kumar, U.; Jadon, M.; Shukla, V.; Sarkar, B.; Sarvaiya, Y.; Mukherjee, D.; Dutta, A.; Murugan, KV.; Gajera, S.; Joshi, B.; Panjwani, R.


The Pre-Series Cryoline (PTCL) for ITER is a representative cryoline from the complicated network of all cryolines for the ITER project. It is ∼28 m in length with the same cross-section (1:1), including main line (ML) and branch line (BL), as the ITER torus & cryostat cryoline. Geometrically, it has bends at different angles, i.e. 90°, 120°, 135° & 160°, comprising a T-section & Z-section. The PTCL has been fabricated in 5 different elements based on installation feasibility. The mechanical & instrumentation installation, such as the mounting of sensors and displacement sensors, has been completed. The PTCL test has been performed after complete installation of the PTCL and integration with the existing test facility at the ITER-India cryogenics laboratory in order to verify thermal performance and mechanical integrity. The primary objectives evaluated during the PTCL test are (i) thermal performance of the PTCL, (ii) measurement of the temperature profile on the thermal shield of the PTCL, (iii) stress measurement at critical locations, and (iv) measurement of the Outer Vacuum Jacket (OVJ) temperature during the Break of Insulation Vacuum (BIV) test. The paper summarizes the methodology and observed results of the PTCL.

  17. Frozen Gaussian series representation of the imaginary time propagator theory and numerical tests. (United States)

    Zhang, Dong H; Shao, Jiushu; Pollak, Eli


    Thawed Gaussian wavepackets have been used in recent years to compute approximations to the thermal density matrix. From a numerical point of view, it is cheaper to employ frozen Gaussian wavepackets. In this paper, we provide the formalism for the computation of thermal densities using frozen Gaussian wavepackets. We show that the exact density may be given in terms of a series, in which the zeroth-order term is the frozen Gaussian. A numerical test of the methodology is presented for deep tunneling in the quartic double-well potential. In all cases, the series is observed to converge. The convergence of the diagonal density matrix element is much faster than that of the antidiagonal one, suggesting that the methodology should be especially useful for the computation of partition functions. As a by-product of this study, we find that the density matrix in configuration space can have more than two saddle points at low temperatures. This has implications for the use of the quantum instanton theory.

  18. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jong-Bum Kim


    The design of the Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed, and the validation and verification (V&V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of the test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer code V&V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program, with the integral effect test facility STELLA-2, is in the detailed design stage. The sodium thermal-hydraulic experiment loop for the finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics tests in subchannels of a wire-wrapped rod bundle have been carried out for safety analysis in the core, and the dynamic characteristics test of the upper internal structure has been performed for the seismic analysis model of the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for the control rod drive mechanism driving parts, and drop tests of the CRA under scram conditions were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR are explained with significant results.

  19. A plan for application system verification tests: The value of improved meteorological information, volume 1. [economic consequences of improved meteorological information (United States)


    The framework within which the Applications Systems Verification Tests (ASVTs) are performed and the economic consequences of improved meteorological information demonstrated is described. This framework considers the impact of improved information on decision processes, the data needs to demonstrate the economic impact of the improved information, the data availability, the methodology for determining and analyzing the collected data and demonstrating the economic impact of the improved information, and the possible methods of data collection. Three ASVTs are considered and program outlines and plans are developed for performing experiments to demonstrate the economic consequences of improved meteorological information. The ASVTs are concerned with the citrus crop in Florida, the cotton crop in Mississippi and a group of diverse crops in Oregon. The program outlines and plans include schedules, manpower estimates and funding requirements.

  20. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at Pollard Auditorium, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Pugh, C.E.; Bass, B.R.; Keeney, J.A. [comps.] [Oak Ridge National Lab., TN (United States)]


    This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26-29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.

  1. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta


    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based testing.

  2. Complete Functional Verification


    Bormann, Joerg (Dr.)


    The dissertation describes a practically proven, particularly efficient approach to the verification of digital circuit designs. The approach outperforms simulation-based verification with respect to final circuit quality as well as required verification effort. In the dissertation, the paradigm of transaction-based verification is ported from simulation to formal verification. One consequence is a particular format of formal properties, called operation properties. Circuit descriptions are verified...

  3. Testing deformation hypotheses by constraints on a time series of geodetic observations (United States)

    Velsink, Hiddo


    In geodetic deformation analysis observations are used to identify form and size changes of a geodetic network, representing objects on the earth's surface. The network points are monitored, often continuously, because of suspected deformations. A deformation may affect many points during many epochs. The problem is that the best description of the deformation is, in general, unknown. To find it, different hypothesised deformation models have to be tested systematically for agreement with the observations. The tests have to be capable of stating with a certain probability the size of detectable deformations, and to be datum invariant. A statistical criterion is needed to find the best deformation model. Existing methods do not fulfil these requirements. Here we propose a method that formulates the different hypotheses as sets of constraints on the parameters of a least-squares adjustment model. The constraints can relate to subsets of epochs and to subsets of points, thus combining time series analysis and congruence model analysis. The constraints are formulated as nonstochastic observations in an adjustment model of observation equations. This gives an easy way to test the constraints and to get a quality description. The proposed method aims at providing a good discriminating method to find the best description of a deformation. The method is expected to improve the quality of geodetic deformation analysis. We demonstrate the method with an elaborate example.
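    The constraint-testing idea above can be illustrated with a small numerical sketch. The example below is hypothetical (one levelled point observed twice at each of two epochs, with the no-deformation hypothesis added as a heavily weighted pseudo-observation) and is not taken from the paper; it only shows how a constraint formulated as a nonstochastic observation can be adjusted and tested.

```python
import numpy as np

# Hypothetical example: heights (m) of one point levelled twice at each of
# two epochs; the parameters x1, x2 are the epoch heights.
y = np.array([10.002, 10.001, 10.014, 10.015])
A = np.array([[1.0, 0.0],   # epoch-1 observations -> x1
              [1.0, 0.0],
              [0.0, 1.0],   # epoch-2 observations -> x2
              [0.0, 1.0]])
sigma = 0.002               # observation standard deviation (m)
w_obs = np.full(4, 1.0 / sigma**2)

def adjust(A, y, w):
    """Weighted least-squares adjustment: returns estimates and residuals."""
    W = np.diag(w)
    x = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return x, y - A @ x

# Free (unconstrained) adjustment
x_free, v_free = adjust(A, y, w_obs)

# Deformation hypothesis H0 "x1 - x2 = 0" as a nonstochastic observation:
# an appended constraint row with a very large weight (effectively exact).
A_con = np.vstack([A, [1.0, -1.0]])
y_con = np.append(y, 0.0)
w_con = np.append(w_obs, 1e12)
x_con, v_con = adjust(A_con, y_con, w_con)

# Global test: increase of the weighted residual sum under the constraint,
# approximately chi-square with 1 degree of freedom if H0 holds.
ssr_free = np.sum(w_obs * v_free**2)
ssr_con = np.sum(w_obs * v_con[:4]**2)
T = ssr_con - ssr_free
print(f"estimated height change: {x_free[1] - x_free[0]:+.4f} m, T = {T:.1f}")
```

    With these invented numbers the constrained fit degrades strongly (T ≈ 42 against a chi-square(1) critical value of 3.84 at 5%), so the no-deformation hypothesis would be rejected, i.e., a 13 mm height change is detected.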

  4. Illustration of the WPS benefit through BATMAN test series: Tests on large specimens under WPS loading configurations

    Energy Technology Data Exchange (ETDEWEB)

    Yuritzinn, T.; Ferry, L.; Chapuliot, S.; Mongabure, P. [CEA, DEN/DANS/DM2S/SEMT/LISN, Nucl Engn Div, Syst and Struct Modeling Dept, F-91191 Gif Sur Yvette, (France); Moinereau, D.; Dahl, A. [EdF/MMC, F-77818 Moret Sur Loing, (France); Gilles, P. [AREVA-NP, F-92084 Paris, (France)


    To study the effects of warm pre-stressing (WPS) on the toughness of reactor pressure vessel steel, the Commissariat à l'Énergie Atomique, in collaboration with Électricité de France and AREVA-NP, has carried out a study combining modeling and a series of experiments on large specimens submitted to a thermal shock or isothermal cooling. The tests were made on 18MND5 ferritic steel bars containing a short or long fatigue pre-crack. The warm pre-stressing effect was confirmed both for a fast thermal shock creating a gradient across the thickness of the bar and for gradual uniform cooling. In both cases, no propagation was observed during the thermal transient. Fracture occurred under low-temperature conditions at the end of the test, when the tensile load was increased. The failure loads recorded were substantially higher than those during pre-stressing. To illustrate the benefit of the WPS effect, numerical interpretations were performed using either global-approach or local-approach criteria. The WPS effect, and the capability of the models to predict it, was thus clearly demonstrated. (authors)

  5. Common families across test series—how many do we need? (United States)

    G.R. Johnson


    In order to compare families that are planted on different sites, many forest tree breeding programs include common families in their different series of trials. Computer simulation was used to examine how many common families were needed in each series of progeny trials in order to reliably compare families across series. Average gain and its associated variation...

  6. Structural Verification of the First Orbital Wonder of the World - The Structural Testing and Analysis of the International Space Station (ISS) (United States)

    Zipay, John J.; Bernstein, Karen S.; Bruno, Erica E.; Deloo, Phillipe; Patin, Raymond


    The International Space Station (ISS) can be considered one of the structural engineering wonders of the world. On par with the World Trade Center, the Colossus of Rhodes, the Statue of Liberty, the Great Pyramids, the Petronas towers and the Burj Khalifa skyscraper of Dubai, the ambition and scope of the ISS structural design, verification and assembly effort is a truly global success story. With its on-orbit life projected to be from its beginning in 1998 to the year 2020 (and perhaps beyond), all of those who participated in its development can consider themselves part of an historic engineering achievement representing all of humanity. The structural design and verification of the ISS could be the subject of many scholarly papers. Several papers have been written on the structural dynamic characterization of the ISS once it was assembled on-orbit [1], but the ground-based activities required to assure structural integrity and structural life of the individual elements from delivery to orbit through assembly and planned on-orbit operations have never been totally summarized. This paper is intended to give the reader an overview of some of the key decisions made during the structural verification planning for the elements of the U.S. On-Orbit Segment (USOS) as well as to summarize the many structural tests and structural analyses that were performed on its major elements. An effort is made for this paper to be summarily comprehensive, but as with all knowledge capture efforts of this kind, there are bound to be errors of omission. Should the reader discover any of these, please feel free to contact the principal author. The ISS (Figure 1) is composed of pre-integrated truss segments and pressurized elements supplied by NASA, the Russian Federal Space Agency (RSA), the European Space Agency (ESA) and the Japanese Aerospace Exploration Agency (JAXA). 
Each of these elements was delivered to orbit by a launch vehicle and connected to one another either robotically or

  7. Retrospective testing and case series study of porcine delta coronavirus in U.S. swine herds. (United States)

    McCluskey, Brian J; Haley, Charles; Rovira, Albert; Main, Rodger; Zhang, Yan; Barder, Sunny


    Porcine deltacoronavirus (PDCoV) was first reported in the United States (US) in February 2014. This was the second novel swine enteric coronavirus detected in the US since May 2013. In this study, we conducted retrospective testing of samples submitted to three veterinary diagnostic laboratories where qualifying biological samples were derived from previously submitted diagnostic case submissions from US commercial swine farms with a clinical history of enteric disease or from cases that had been previously tested for transmissible gastroenteritis virus, PEDV, or rotavirus. Overall, 2286 banked samples were tested from 27 States. Samples were collected in 3 separate years and in 17 different months. Test results revealed 4 positive samples, 3 collected in August 2013 and 1 collected in October 2013. In addition, a case series including 42 operations in 10 States was conducted through administration of a survey. Survey data collected included information on characteristics of swine operations that had experienced PDCoV clinical signs. Special emphasis was placed on obtaining descriptive estimates of biosecurity practices and disease status over time of each operation. Clinical signs of PDCoV were reported to be similar to those of PEDV. The average number of animals on each operation exhibiting clinical signs (morbidity) and the average number of case fatalities was greatest for suckling and weaned pigs. Average operation-level weaned pig morbidity was greatest in the first week of the outbreak while average operation-level suckling pig case fatality was greatest in the second week of the outbreak. The survey included questions regarding biosecurity practices for visitors and operation employees; trucks, equipment and drivers; and feed sources. These questions attempted to identify a likely pathway of introduction of PDCoV onto the operations surveyed. Published by Elsevier B.V.

  8. Testing Homeopathy in Mouse Emotional Response Models: Pooled Data Analysis of Two Series of Studies

    Directory of Open Access Journals (Sweden)

    Paolo Bellavite


    Two previous investigations were performed to assess the activity of Gelsemium sempervirens (Gelsemium s.) in mice, using emotional response models. These two series are pooled and analysed here. Gelsemium s. in various homeopathic centesimal dilutions/dynamizations (4C, 5C, 7C, 9C, and 30C), a placebo (solvent vehicle), and the reference drugs diazepam (1 mg/kg body weight) or buspirone (5 mg/kg body weight) were delivered intraperitoneally to groups of albino CD1 mice, and their effects on animal behaviour were assessed by the light-dark (LD) choice test and the open-field (OF) exploration test. Up to 14 separate replications were carried out in fully blind and randomised conditions. Pooled analysis demonstrated highly significant effects of Gelsemium s. 5C, 7C, and 30C on the OF parameter "time spent in central area" and of Gelsemium s. 5C, 9C, and 30C on the LD parameters "time spent in lit area" and "number of light-dark transitions," without any sedative action or adverse effects on locomotion. This pooled data analysis confirms and reinforces the evidence that Gelsemium s. regulates emotional responses and behaviour of laboratory mice in a nonlinear fashion with dilution/dynamization.

  9. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1 (United States)

    Platt, R.


    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.


    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  11. Standard practice for verification of testing frame and specimen alignment under tensile and compressive axial force application

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 Included in this practice are methods covering the determination of the amount of bending that occurs during the application of tensile and compressive forces to notched and unnotched test specimens in the elastic range and to plastic strains less than 0.002. These methods are particularly applicable to the force application rates normally used for tension testing, creep testing, and uniaxial fatigue testing.
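    As an illustration of the kind of bending determination such a practice involves, the sketch below computes percent bending from four longitudinal strain gauges spaced 90° around a specimen. The formulas are the commonly used ones (axial strain as the gauge mean, bending components as half the difference of each opposite gauge pair); they are illustrative and not quoted from the standard.

```python
import math

def percent_bending(e1, e2, e3, e4):
    """Percent bending from four longitudinal strain gauges placed 90 deg
    apart around the specimen (e1/e3 and e2/e4 are opposite pairs).
    Axial strain is the mean of the gauges; the two bending components are
    half the difference of each opposite pair. Illustrative formulas only."""
    axial = (e1 + e2 + e3 + e4) / 4.0
    b1 = (e1 - e3) / 2.0
    b2 = (e2 - e4) / 2.0
    return 100.0 * math.hypot(b1, b2) / abs(axial)

# Readings in microstrain from a nominally axial tension check
pb = percent_bending(1020.0, 1010.0, 980.0, 990.0)
print(f"percent bending: {pb:.2f}%")
```

    With these readings the axial strain is 1000 µε and the resultant bending strain is about 22.4 µε, i.e., roughly 2.2% bending, which a laboratory would compare against the allowable limit for its test type.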

  13. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J


    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta Georgia area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency was measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. Before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR

  14. Preliminary characterization of binary karst aquifers with tracer tests and time series analysis (United States)

    Ferrari, J. A.; Calux, A. S.; Hiruma, S. T.; Armani, G.; Karmann, I.


    The studied site is a polygonal karst developed in a synclinal structure in the Atlantic Rainforest, southeastern Brazil. The carbonate surface (10.4 km2) receives allogenic recharge from drainage basins (13.9 km2) formed in psammitic rocks. Two main springs drain the karst on opposite flanks of the syncline: Alambari (AL) and Ouro Grosso (OG). The karst lies within a conservation unit, and the hydrological investigation supports its management. Qualitative dye tracer tests were performed to identify the recharge areas of the two springs. Monitoring stations at the springs measured water discharge (Q) and specific conductance (SC) every hour. Rainfall (R) was measured by a pluviometer connected to an event logger. The time series (2014 to 2016) were analyzed with autocorrelation (ACF) and cross-correlation functions (CCF) to compare the flow dynamics of both systems. Tracer tests indicate that the AL spring drains most of the area. Field observations show that the main volume of perennial sink waters is related to this spring. The average values of the parameters from the hydrologic monitoring are: AL - Q = 0.6 m3/s, SC = 137.7 µS cm-1; OG - Q = 0.1 m3/s, SC = 158.2 µS cm-1. The mean annual rainfall in the region is 1250 mm. The global analysis of Q (daily average) with the ACF shows that the memory effect in OG is 3 times higher than that obtained for AL. The same analysis for SC shows that the memory is 1.5 times higher in AL. The CCF was also used to analyze the relations between the R, Q and SC time series (on an hourly basis). When analyzing the CCF for R × Q, the maximum value occurs after 4 h for AL (r = 0.31) and after 3 h for OG (r = 0.25). Contrasting results were observed when the CCF was applied to R × SC. The CCF for AL shows the usual behavior, with a "negative peak" (after 13 h) that represents the pulse of fresh infiltrated rainwater, whereas OG shows a "negative peak" (after 2 h), followed by a 50 h peak (peaks identified with 99% confidence intervals). The
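    The ACF "memory effect" and the rainfall-discharge CCF lag described above can be reproduced on synthetic data. The series below (impulsive rainfall convolved with an exponential recession and delayed by 4 h) is invented for illustration; only the analysis pattern mirrors the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented hourly series: impulsive rainfall and a spring discharge that
# responds ~4 h later with a slow, karst-like exponential recession.
n = 5000
rain = (rng.random(n) < 0.02) * rng.exponential(5.0, n)
kernel = np.exp(-np.arange(200) / 50.0)   # recession time scale ~50 h
q = np.convolve(rain, kernel)[:n]
q = np.roll(q, 4)                         # 4 h response delay
q[:4] = 0.0

def acf(x, lag):
    """Sample autocorrelation at a positive lag."""
    d = x - x.mean()
    return np.dot(d[:-lag], d[lag:]) / np.dot(d, d)

def ccf(x, y, lag):
    """Correlation of y with x shifted `lag` samples earlier."""
    dx, dy = x - x.mean(), y - y.mean()
    return np.dot(dx[:-lag], dy[lag:]) / np.sqrt(np.dot(dx, dx) * np.dot(dy, dy))

# "Memory effect": first lag at which the discharge ACF drops below 0.2
memory = next(lag for lag in range(1, n) if acf(q, lag) < 0.2)

# Response time: lag of the maximum rain -> discharge cross-correlation
response = max(range(1, 48), key=lambda lag: ccf(rain, q, lag))
print(f"memory effect ~ {memory} h, response lag = {response} h")
```

    A long memory effect (slow ACF decay) indicates a well-buffered system like OG, while the CCF peak lag estimates the rainfall-to-discharge response time, as in the 3-4 h lags reported above.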

  15. Patch Test Results with Standard and Cosmetic Series in Patients with Suspected Cosmetic-Induced Contact Dermatitis

    Directory of Open Access Journals (Sweden)

    Şenay Hacıoğlu


    Background and Design: Our aim was to evaluate hypersensitivity to cosmetic chemicals in patients with clinically suspected cosmetic-induced contact dermatitis in Bursa and the South Marmara Region (Turkey) by patch testing with standard and cosmetic series. Material and Method: Seventy-three patients with clinically suspected contact dermatitis due to cosmetics were patch tested with the European standard series and a cosmetic series. The patch test results were analyzed as percentages. The χ2 test was used to demonstrate the relationship between cosmetic products and cosmetic allergens. Results: 90.4% of patients in our study group were female and 9.6% were male; the median age was 37.5 (range 16-71) years. The most commonly involved parts of the body were the face (49.3%), hands (16.4%), periocular region (6.8%), lips (6.8%), and the neck (5.5%). The most common offending cosmetic products causing allergic contact dermatitis were soaps and cleansing lotions (32.8%), moisturizer creams (21.9%), make-up (15.0%), and hair dyes (9.6%). 41.0% of patients showed a positive reaction to at least one cosmetic allergen included in either the standard or the cosmetic series. The cosmetic allergens in the standard series and their rates of positivity were, in descending order: fragrance mix (6.8%), lanolin alcohols (5.5%), paraphenylenediamine (2.7%), colophony (1.4%), paraben mix (1.4%), formaldehyde (1.4%), and methylchloroisothiazolinone (Kathon CG). The most common offending cosmetic allergen groups were preservatives (21.9%), antioxidants (8.2%) and fragrances (6.8%). Conclusion: Allergic or irritant contact dermatitis due to cosmetics should be considered in cases of eczema involving the face, neck, eyelids, lips, scalp or hands. Patch testing with a cosmetic series beside the standard series would be more helpful in detecting the responsible allergen(s).

  16. Z-2 Architecture Description and Requirements Verification Results (United States)

    Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard


    The Z-2 Prototype Planetary Extravehicular Space Suit Assembly is a continuation of NASA's Z series of spacesuits. The Z-2 is another step in NASA's technology development roadmap leading to human exploration of the Martian surface. The suit was designed for maximum mobility at 8.3 psid, reduced mass, and to have high fidelity life support interfaces. As Z-2 will be man-tested at full vacuum in NASA JSC's Chamber B, it was manufactured as Class II, making it the most flight-like planetary walking suit produced to date. The Z-2 suit architecture is an evolution of previous EVA suits, namely the ISS EMU, Mark III, Rear Entry I-Suit and Z-1 spacesuits. The suit is a hybrid hard and soft multi-bearing, rear entry spacesuit. The hard upper torso (HUT) is an all-composite structure and includes a 2-bearing rolling convolute shoulder with Vernier sizing mechanism, removable suit port interface plate (SIP), elliptical hemispherical helmet and self-don/doff shoulder harness. The hatch is a hybrid aluminum and composite construction with Apollo style gas connectors, custom water pass-thru, removable hatch cage and interfaces to primary and auxiliary life support feed water bags. The suit includes Z-1 style lower arms with cam brackets for Vernier sizing and government furnished equipment (GFE) Phase VI gloves. The lower torso includes a telescopic waist sizing system, waist bearing, rolling convolute waist joint, hard brief, 2 bearing soft hip thigh, Z-1 style legs with ISS EMU style cam brackets for sizing, and conformal walking boots with ankle bearings. The Z-2 Requirements Verification Plan includes the verification of more than 200 individual requirements. The verification methods include test, analysis, inspection, demonstration or a combination of methods. Examples of unmanned requirements include suit leakage, proof pressure testing, operational life, mass, isometric man-loads, sizing adjustment ranges, internal and external interfaces such as in-suit drink bag

  17. Metabolic and Subjective Results Review of the Integrated Suit Test Series (United States)

    Norcross, J.R.; Stroud, L.C.; Klein, J.; Desantis, L.; Gernhardt, M.L.


    Crewmembers will perform a variety of exploration and construction activities on the lunar surface. These activities will be performed while inside an extravehicular activity (EVA) spacesuit. In most cases, human performance is compromised inside an EVA suit compared with a crewmember's unsuited performance baseline. Subjects completed different EVA-type tasks, ranging from ambulation to geology and construction activities, in different lunar analog environments including overhead suspension, underwater, and 1-g lunar-like terrain, in both suited and unsuited conditions. In the suited condition, the Mark III (MKIII) EVA technology demonstrator suit was used, and suit pressure and suit weight were the parameters tested. In the unsuited conditions, weight, mass, center of gravity (CG), terrain type and navigation were the parameters. Tests were not fully crossed; to the extent possible, one parameter was varied while all others were held at their most nominal settings. Oxygen consumption (VO2), modified Cooper-Harper (CH) ratings of operator compensation and ratings of perceived exertion (RPE) were measured for each trial. For each variable, a lower value correlates with more efficient task performance. Due to a low sample size, statistical significance was not attainable. Initial findings indicate that suit weight, CG and the operational environment can have a large impact on human performance during EVA. Systematic, prospective testing series such as those performed to date will enable a better understanding of the crucial interactions of the human, the EVA suit system and their environment. However, work remains to be done to confirm these findings. These data have been collected using only unsuited subjects and one EVA suit prototype that is known to fit poorly on a large demographic of the astronaut population. 
Key findings need to be retested using an EVA suit prototype better suited to a

  18. Verification of ceramic structures

    NARCIS (Netherlands)

    Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.


    In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and

  19. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)


    The main features of the IAEA safeguards verification system, which non-nuclear-weapon states parties to the NPT are obliged to accept, are described. Verification activities and problems in Iraq and North Korea are discussed.

  20. Formulating and testing a method for perturbing precipitation time series to reflect anticipated climatic changes

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen; Georgiadis, Stylianos; Gregersen, Ida Bülow


    Urban water infrastructure has very long planning horizons, and planning is thus very dependent on reliable estimates of the impacts of climate change. Many urban water systems are designed using time series with a high temporal resolution. To assess the impact of climate change on these systems, similarly high-resolution precipitation time series for future climate are necessary. Climate models cannot at their current resolutions provide these time series at the relevant scales. Known methods for stochastic downscaling of climate change to urban hydrological scales have known shortcomings in constructing realistic climate-changed precipitation time series at the sub-hourly scale. In the present study we present a deterministic methodology to perturb historical precipitation time series at the minute scale to reflect non-linear expectations to climate change. The methodology shows good skill...
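    A minimal sketch of a deterministic, intensity-dependent perturbation of a minute-scale precipitation series is shown below. The scaling scheme (a larger factor for the highest wet intensities, reflecting the non-linear expectation that extremes intensify more than mean precipitation) is an assumption for illustration, not the methodology of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented minute-scale precipitation record (mm/min) for one month:
# mostly dry minutes, occasional heavy-tailed wet intensities.
n = 60 * 24 * 30
precip = np.where(rng.random(n) < 0.01, rng.gamma(0.5, 0.2, n), 0.0)

def perturb(series, factor_low=1.05, factor_high=1.30, q=0.90):
    """Deterministic, intensity-dependent perturbation: wet values above
    the q-quantile of wet intensities are scaled by factor_high, the other
    wet values by factor_low. Illustrative scheme only: extremes are
    assumed to intensify more than ordinary rainfall."""
    wet = series[series > 0]
    threshold = np.quantile(wet, q)
    out = series.copy()
    heavy = series >= threshold
    out[heavy] *= factor_high
    out[(series > 0) & ~heavy] *= factor_low
    return out

future = perturb(precip)
change = future.sum() / precip.sum() - 1.0
print(f"wet minutes: {(precip > 0).sum()}, total volume change: {change:+.1%}")
```

    Because the heaviest 10% of wet minutes carry a disproportionate share of the total volume, the overall volume increase ends up well above the 5% applied to ordinary wet minutes, illustrating the non-linear character such perturbation schemes aim for.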

  1. Operational lessons learned in conducting a multi-country collaboration for vaccine safety signal verification and hypothesis testing: The global vaccine safety multi country collaboration initiative. (United States)

    Guillard-Maure, Christine; Elango, Varalakshmi; Black, Steven; Perez-Vilar, Silvia; Castro, Jose Luis; Bravo-Alcántara, Pamela; Molina-León, Helvert Felipe; Weibel, Daniel; Sturkenboom, Miriam; Zuber, Patrick L F


    Timely and effective evaluation of vaccine safety signals for newly developed vaccines introduced in low- and middle-income countries (LMICs) is essential. The study tested the development of a global network of hospital-based sentinel sites for vaccine safety signal verification and hypothesis testing. Twenty-six sentinel sites in sixteen countries across all WHO regions participated, and 65% of the sites were from LMICs. We describe the process for the establishment and operationalization of such a network and the lessons learned in conducting a multi-country collaborative initiative. 24 of the 26 sites successfully contributed data for the global analysis using standardised tools and procedures. Our study successfully confirmed the well-known risk estimates for the outcomes of interest. The main challenges faced by investigators were the lack of adequate information in the medical records for case ascertainment and classification, and access to immunization data. The results suggest that sentinel hospitals intending to participate in vaccine safety studies strengthen their systems for discharge diagnosis coding, medical records and linkage to vaccination data. Our study confirms that a multi-country hospital-based network initiative for vaccine safety monitoring is feasible and demonstrates the validity and utility of large collaborative international studies to monitor the safety of new vaccines introduced in LMICs. Copyright © 2017. Published by Elsevier Ltd.

  2. Analysis, testing and verification of the behavior of composite pavements under Florida conditions using a heavy vehicle simulator (United States)

    Tapia Gutierrez, Patricio Enrique

    Whitetopping (WT) is a rehabilitation method used to resurface deteriorated asphalt pavements. While some of these composite pavements have performed very well carrying heavy loads, others have shown poor performance with early cracking. With the objective of analyzing the applicability of WT pavements under Florida conditions, a total of nine full-scale WT test sections were constructed and tested using a Heavy Vehicle Simulator (HVS) in the APT facility at the FDOT Material Research Park. The test sections were instrumented to monitor both strain and temperature. A 3-D finite element model was developed to analyze the WT test sections. The model was calibrated and verified using measured FWD deflections and HVS load-induced strains from the test sections, and was then used to evaluate the potential performance of these test sections under critical temperature-load conditions in Florida. Six of the WT pavement test sections had a bonded concrete-asphalt interface, achieved by milling, cleaning and spraying the asphalt surface with water. This method produced excellent bonding at the interface, with shear strengths of 195 to 220 psi. Three of the test sections were intended to have an unbonded concrete-asphalt interface, achieved by applying a debonding agent to the asphalt surface. However, shear strengths between 119 and 135 psi and a careful analysis of the strain and temperature data indicated a partial bond condition. The computer model was able to satisfactorily model the behavior of the composite pavement mainly by using material properties from standard laboratory tests and calibrating the spring elements used to model the interface. Reasonable matches between the measured and calculated strains were achieved when a temperature-dependent AC elastic modulus was included in the analytical model. The expected numbers of repetitions of the 24-kip single-axle loads at the critical thermal condition were computed for the nine test sections based on maximum tensile stresses

  3. Modeling in the State Flow Environment to Support Launch Vehicle Verification Testing for Mission and Fault Management Algorithms in the NASA Space Launch System (United States)

    Trevino, Luis; Berg, Peter; England, Dwight; Johnson, Stephen B.


    Analysis methods and testing processes are essential activities in the engineering development and verification of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS). Central to mission success is reliable verification of the Mission and Fault Management (M&FM) algorithms for the SLS launch vehicle (LV) flight software. This is particularly difficult because M&FM algorithms integrate and operate LV subsystems, which themselves consist of diverse forms of hardware and software, with equally diverse integration across the engineering disciplines of the LV subsystems. M&FM operation of the SLS requires a changing mix of LV automation: during pre-launch the LV is primarily operated by the Kennedy Space Center (KSC) Ground Systems Development and Operations (GSDO) organization, with some LV automation of time-critical functions, while LV operations during ascent are much more autonomous and have crucial interactions with the Orion crew capsule, its astronauts, and with mission controllers at the Johnson Space Center. M&FM algorithms must perform all nominal mission commanding via the flight computer to control LV states from pre-launch through disposal, and must also address failure conditions by initiating autonomous or commanded aborts (crew capsule escape from the failing LV), redundancy management of failing subsystems and components, and safing actions to reduce or prevent threats to ground systems and crew. To address the criticality of the verification testing of these algorithms, the NASA M&FM team has utilized the State Flow environment (SFE) with its existing Vehicle Management End-to-End Testbed (VMET) platform, which also hosts vendor-supplied physics-based LV subsystem models. The human-derived M&FM algorithms are designed and vetted in Integrated Development Teams composed of design and development disciplines such as Systems Engineering, Flight Software (FSW), Safety and Mission Assurance (S&MA) and major subsystems and vehicle elements

  4. Verification of both the hydrogeological and hydrogeochemical code results by an on-site test in granitic rocks

    Directory of Open Access Journals (Sweden)

    Michal Polák


    The project entitled “Methods and tools for the evaluation of the effect of engineered barriers on distant interactions in the environment of a deep repository facility” deals with the ability to validate the behavior of applied engineered barriers on hydrodynamic and migration parameters in the water-bearing granite environment of a radioactive waste deep repository facility. A part of the project comprises a detailed mapping of the fracture network by means of geophysical and drilling surveys on the test-site (an active granite quarry), the construction of model objects (about 100 samples shaped as cylinders, ridges and blocks), and the mineralogical, petrological and geochemical description of the granite. All the model objects were subjected to migration and hydrodynamic tests with the use of fluorescein and NaCl as tracers. The tests were performed on samples with simple fractures, with injected fractures, and with an undisturbed integrity (verified by ultrasonic testing). The obtained hydrodynamic and migration parameters of the model objects were processed with the modeling software NAPSAC and FEFLOW. During the following two years, these results and parameters will be verified on the test-site by means of a long-term field test, including the tuning of the software functionality.

  5. Verification Of Residual Strength Properties From Compression After Impact Tests On Thin CFRP Skin, A1 Honeycomb Composites (United States)

    Kalnins, Kaspars; Graham, Adrian J.; Sinnema, Gerben


    This article presents a study of CFRP/Al honeycomb panels subjected to low-velocity impact, which caused a reduction in strength. The main scope of the current study was to investigate experimental procedures, which are not well standardized, and later verify them with numerical simulations. To ensure the integrity of typical lightweight structural panels of modern spacecraft, knowledge about the impact energy required to produce clearly visible damage, and the resulting strength degradation, is of high importance. Readily available ‘heritage’ (1980s) sandwich structure with relatively thin skin was used for this initial investigation. After initial attempts to produce impact damage, it was decided to create quasi-static indentation instead of low-velocity impact to cause barely visible damage. Forty-two edgewise Compression After Impact (CAI) test specimens were produced and tested up to failure, while recording the strain distribution by optical means during the tests. Ultrasonic C-scan inspection was used to identify the damage evolution before and after each test. The optical strain measurements acquired during the tests showed a sensitivity level capable of tracking the local buckling of the damaged region.

  6. Interim Letter Report - Verification Survey Results for Activities Performed in March 2009 for the Vitrification Test Facility Warehouse at the West Valley Demonstration Project, Ashford, New York

    Energy Technology Data Exchange (ETDEWEB)

    B.D. Estes


    The objective of the verification activities was to provide independent radiological surveys and data for use by the Department of Energy (DOE) to ensure that the building satisfies the requirements for release without radiological controls.

  7. Full-Scale Experimental Verification of Soft-Story-Only Retrofits of Wood-Frame Buildings using Hybrid Testing (United States)

    Elaina Jennings; John W. van de Lindt; Ershad Ziaei; Pouria Bahmani; Sangki Park; Xiaoyun Shao; Weichiang Pang; Douglas Rammer; Gary Mochizuki; Mikhail Gershfeld


    The FEMA P-807 Guidelines were developed for retrofitting soft-story wood-frame buildings based on existing data, and the method had not been verified through full-scale experimental testing. This article presents two different retrofit designs based directly on the FEMA P-807 Guidelines that were examined at several different seismic intensity levels. The...

  8. SU-F-T-364: Monte Carlo-Dose Verification of Volumetric Modulated Arc Therapy Plans Using AAPM TG-119 Test Patterns

    Energy Technology Data Exchange (ETDEWEB)

    Onizuka, R [Graduate School of Health Sciences, Kumamoto University (Japan); Araki, F; Ohno, T [Faculty of Life Sciences, Kumamoto University (Japan); Nakaguchi, Y [Kumamoto University Hospital (Japan)


    Purpose: To investigate Monte Carlo (MC)-based dose verification of VMAT plans produced by a treatment planning system (TPS). Methods: The AAPM TG-119 test structure set was used for VMAT plans in the Pinnacle3 TPS (convolution/superposition), using a Synergy radiation head with a 6 MV beam and the Agility MLC. The Synergy was simulated with the EGSnrc/BEAMnrc code, and VMAT dose distributions were calculated with the EGSnrc/DOSXYZnrc code under the same irradiation conditions as the TPS. VMAT dose distributions from the TPS and MC were compared with those of EBT3 film by 2-D gamma analysis with ±3%/3 mm criteria and a threshold of 30% of the prescribed dose. VMAT dose distributions from the TPS and MC were also compared by DVHs and 3-D gamma analysis with ±3%/3 mm criteria and a threshold of 10%, and 3-D passing rates for PTVs and OARs were analyzed. Results: TPS dose distributions differed from those of film, especially for Head & neck. The dose difference between the TPS and film arises from the calculation accuracy for complex MLC motion, such as the tongue-and-groove effect. In contrast, MC dose distributions were in good agreement with those of film. This is because MC can fully model the MLC configuration and accurately reproduce the MLC motion between control points in VMAT plans. D95 of the PTV for Prostate, Head & neck, C-shaped, and Multi Target was 97.2%, 98.1%, 101.6%, and 99.7% for the TPS and 95.7%, 96.0%, 100.6%, and 99.1% for MC, respectively. Similarly, 3-D gamma passing rates of each PTV for TPS vs. MC were 100%, 89.5%, 99.7%, and 100%, respectively. 3-D passing rates of the TPS were reduced for complex VMAT fields like Head & neck because the MLCs are not modeled completely in the TPS. Conclusion: MC-calculated VMAT dose distributions are useful for the 3-D dose verification of VMAT plans produced by a TPS.
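The ±3%/3 mm gamma comparison used above can be illustrated with a minimal 1-D sketch (global normalization). The profile values and function names here are ours for illustration, not the study's data:

```python
import numpy as np

# Illustrative 1-D global gamma analysis with 3%/3 mm criteria: for each
# reference point, gamma is the minimum combined dose-difference /
# distance-to-agreement metric over all evaluated points; gamma <= 1 passes.
def gamma_index(ref_dose, eval_dose, positions_mm, dose_crit=0.03, dist_mm=3.0):
    """Return the gamma value at each reference point."""
    ref_dose = np.asarray(ref_dose, float)
    eval_dose = np.asarray(eval_dose, float)
    positions_mm = np.asarray(positions_mm, float)
    d_max = ref_dose.max()  # global normalization dose
    gammas = []
    for r, d_ref in zip(positions_mm, ref_dose):
        dist2 = ((positions_mm - r) / dist_mm) ** 2
        dose2 = ((eval_dose - d_ref) / (dose_crit * d_max)) ** 2
        gammas.append(np.sqrt(dist2 + dose2).min())
    return np.array(gammas)

ref = np.array([100.0, 98.0, 50.0, 10.0])   # toy reference profile
ev  = np.array([101.0, 97.0, 52.0, 10.5])   # toy evaluated profile
pos = np.array([0.0, 3.0, 6.0, 9.0])        # positions in mm
passing_rate = np.mean(gamma_index(ref, ev, pos) <= 1.0)
```

A 2-D or 3-D analysis works the same way, with the distance term extended over all voxels within the search radius; the low-dose threshold mentioned in the abstract simply excludes reference points below a cutoff from the passing-rate statistic.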

  9. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2 (United States)

    Platt, R.


    This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. The sequence in which the several phases of this test procedure shall take place is shown in Figure 1, but the phases can be performed in any order.

  10. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, James W., LTC [Editor


    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  11. Anomalous transient uplift observed at the Lop Nor, China nuclear test site using satellite radar interferometry time-series analysis

    National Research Council Canada - National Science Library

    P. Vincent; S. M. Buckley; D. Yang; S. F. Carle


    ... is observed at the Lop Nor, China nuclear test site using ERS satellite SAR data. Using an InSAR time-series analysis method, we show that an increase in absolute uplift with time is observed between 1997 and 1999...

  12. Dropping the Devil's Advocate: One Novice Language Tester's Shifting Interactional Practices across a Series of Speaking Tests (United States)

    Leyland, Christopher; Greer, Tim; Rettig-Miki, Ellen


    This study employs longitudinal Conversation Analysis (CA) to examine one TA's follow-up contributions in a series of EFL group discussion tests. By tracking the TA's interactional practices across 18 groups, we observe how she adapts her turn design by increasingly aligning towards that of the novice English speakers. The TA initially attempts to…

  13. A 1:8.7 Scale Water Tunnel Verification & Validation Test of an Axial Flow Water Turbine

    Energy Technology Data Exchange (ETDEWEB)

    Fontaine, Arnold A. [Pennsylvania State Univ., University Park, PA (United States); Straka, William A. [Pennsylvania State Univ., University Park, PA (United States); Meyer, Richard S. [Pennsylvania State Univ., University Park, PA (United States); Jonson, Michael L. [Pennsylvania State Univ., University Park, PA (United States)


    As interest in waterpower technologies has increased over the last few years, there has been a growing need for a public database of measured data for these devices. This would provide a basic understanding of the technology and means to validate analytic and numerical models. Through collaboration between Sandia National Laboratories, Penn State University Applied Research Laboratory, and University of California, Davis, a new marine hydrokinetic turbine rotor was designed, fabricated at 1:8.7-scale, and experimentally tested to provide an open platform and dataset for further study and development. The water tunnel test of this three-bladed, horizontal-axis rotor recorded power production, blade loading, near-wake characterization, cavitation effects, and noise generation. This report documents the small-scale model test in detail and provides a brief discussion of the rotor design and an initial look at the results with comparison against low-order modeling tools. Detailed geometry and experimental measurements are released to Sandia National Laboratories as a data report addendum.

  14. Short communication: Parentage verification of South African ...

    African Journals Online (AJOL)

    Short communication: Parentage verification of South African Angora goats, using microsatellite markers. ... South African Journal of Animal Science. Journal Home ... Eighteen markers were tested in 192 South African Angora goats representing different family structures with known and unknown parent information.

  15. Seismic design verification of LMFBR structures

    Energy Technology Data Exchange (ETDEWEB)


    The report provides an assessment of the seismic design verification procedures currently used for nuclear power plant structures, a comparison of available dynamic test methods, and conclusions and recommendations for future LMFBR structures.

  16. TFE Verification Program

    Energy Technology Data Exchange (ETDEWEB)


    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  17. BIOMEX Experiment: Ultrastructural Alterations, Molecular Damage and Survival of the Fungus Cryomyces antarcticus after the Experiment Verification Tests (United States)

    Pacelli, Claudia; Selbmann, Laura; Zucconi, Laura; De Vera, Jean-Pierre; Rabbow, Elke; Horneck, Gerda; de la Torre, Rosa; Onofri, Silvano


    The search for traces of extinct or extant life in extraterrestrial environments is one of the main goals for astrobiologists; due to their ability to withstand stress producing conditions, extremophiles are perfect candidates for astrobiological studies. The BIOMEX project aims to test the ability of biomolecules and cell components to preserve their stability under space and Mars-like conditions, while at the same time investigating the survival capability of microorganisms. The experiment has been launched into space and is being exposed on the EXPOSE-R2 payload, outside of the International Space Station (ISS) over a time-span of 1.5 years. Along with a number of other extremophilic microorganisms, the Antarctic cryptoendolithic black fungus Cryomyces antarcticus CCFEE 515 has been included in the experiment. Before launch, dried colonies grown on Lunar and Martian regolith analogues were exposed to vacuum, irradiation and temperature cycles in ground based experiments (EVT1 and EVT2). Cultural and molecular tests revealed that the fungus survived on rock analogues under space and simulated Martian conditions, showing only slight ultra-structural and molecular damage.

  18. Design of a Kaplan turbine for a wide range of operating head -Curved draft tube design and model test verification- (United States)

    KO, Pohan; MATSUMOTO, Kiyoshi; OHTAKE, Norio; DING, Hua


    Off-design performance improvement of turbomachines is challenging but critical for maximising the usable operating area. In this paper, a curved draft tube for a medium-head Kaplan-type hydro turbine is introduced, and its significant effect on expanding the operating head range is discussed. Without adding any extra structure or working fluid for swirl destruction and damping, a carefully designed draft tube outline shape with a selected placement of center-piers successfully suppresses the growth of turbulent eddies and the transport of swirl to the outlet. More kinetic energy is also recovered, and the head loss is improved. Finally, the model test results are presented. An obvious performance improvement was found in the lower net head area, where the maximum efficiency improvement was measured at up to 20% without compromising the best efficiency point. Additionally, this design yields a draft tube that is more compact in size, leading to better construction and manufacturing cost performance for the prototype. The draft tube geometry design process considered the best efficiency point together with off-design points covering various net heads and discharges. The hydraulic performance and flow behavior were numerically previewed and visualized by solving the Reynolds-Averaged Navier-Stokes equations with the Shear Stress Transport turbulence model. The simulation assumed steady-state incompressible turbulent flow inside the flow passage, and the inlet boundary condition was the carefully simulated flow pattern from the runner outlet. For confirmation, the turbine efficiency performance over the entire operating area was verified by model test.

  19. Test/QA plan for the verification testing of diesel exhaust catalysts, particulate filters and engine modification control technologies for highway and nonroad use diesel engines (United States)

    This ETV test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research (DER) describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR Part 89 for nonroad engines, will be ...

  20. Test/QA plan for the verification testing of selective catalytic reduction control technologies for highway, nonroad use heavy-duty diesel engines (United States)

    This ETV test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research (DER) describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR Part 89 for nonroad engines, will be ...

  1. Scientific Verification Test of Orbitec Deployable Vegetable Production System for Salad Crop Growth on ISS- Gas Exchange System design and function (United States)

    Eldemire, Ashleigh


    The ability to produce and maintain salad crops during long-term missions would be a great benefit to NASA; the renewable food supply would save cargo space, weight and money. The ambient conditions of previous ground-controlled crop plant experiments do not reflect the microgravity and high CO2 concentrations present during orbit. It has been established that microgravity does not considerably alter plant growth (Monje, Stutte, & Chapman, 2005). To support plants in a spacecraft environment, efficient and effective lighting and containment units are necessary. Three lighting systems were previously evaluated for radish growth in ambient air: fluorescent lamps in an Orbitec Biomass Production System Educational (BPSE), a combination of red, blue, and green LEDs in a Deployable Vegetable Production System (Veggie), and a combination of red and blue LEDs in a Veggie. When mass measurements compared the entire possible growing area against the power consumed by the respective units, the Veggies clearly exceeded the BPSE, indicating that the LED units were a more resource-efficient means of growing radishes under ambient conditions than fluorescent lighting. To evaluate the most productive light treatment system for a long-term space mission, a more closely simulated ISS environment is necessary. To induce a CO2-dense atmosphere inside the Veggies and BPSE, a gas exchange system has been developed to maintain 1000-1200 ppm CO2 during a 21-day light treatment experiment. This report details the design and function of the gas exchange system. The rehabilitation, troubleshooting, maintenance and testing of the gas exchange system have been my major assignments. I have also contributed to the planting, daily measurements and harvesting of the radish crops in the 21-day light treatment verification test.

  2. The Spanish standard patch test series: 2016 update by the Spanish Contact Dermatitis and Skin Allergy Research Group (GEIDAC). (United States)

    Hervella-Garcés, M; García-Gavín, J; Silvestre-Salvador, J F


    The Spanish standard patch test series, as recommended by the Spanish Contact Dermatitis and Skin Allergy Research Group (GEIDAC), has been updated for 2016. The new series replaces the 2012 version and contains the minimum set of allergens recommended for routine investigation of contact allergy in Spain from 2016 onwards. Four haptens -clioquinol, thimerosal, mercury, and primin- have been eliminated owing to a low frequency of relevant allergic reactions, while 3 new allergens -methylisothiazolinone, diazolidinyl urea, and imidazolidinyl urea- have been added. GEIDAC has also modified the recommended aqueous solution concentrations for the 2 classic, major haptens methylchloroisothiazolinone and methylisothiazolinone, which are now to be tested at 200ppm in aqueous solution, and formaldehyde, which is now to be tested in a 2% aqueous solution. Updating the Spanish standard series is one of the functions of GEIDAC, which is responsible for ensuring that the standard series is suited to the country's epidemiological profile and pattern of contact sensitization. Copyright © 2016 AEDV. Publicado por Elsevier España, S.L.U. All rights reserved.

  3. Results of patch testing to personal care product allergens in a standard series and a supplemental cosmetic series: an analysis of 945 patients from the Mayo Clinic Contact Dermatitis Group, 2000-2007. (United States)

    Wetter, David A; Yiannias, James A; Prakash, Amy V; Davis, Mark D P; Farmer, Sara A; el-Azhary, Rokea A


    Patch testing to a standard screening series of allergens in combination with supplemental cosmetic allergens is often used to diagnose allergic contact dermatitis due to personal care products. To report results of patch testing to skin care product allergens contained in a standard series and a supplemental cosmetic series, and to compare the efficacy of this combined series in detecting positive reactions to personal care product allergens with the efficacy of various standard screening series. Positive reaction rates to skin care product allergens were tabulated for patients who underwent patch testing to both standard and cosmetic series allergens at Mayo Clinic between 2000 and 2007. Data were compared with skin care allergens detected on standard screening series, including the thin-layer rapid use epicutaneous (TRUE) test. Of 945 patch-tested patients, 68.4% had at least one positive reaction and 47.3% had at least two positive reactions. Also, 49.4% of patients reacted to at least one preservative; 31.2% reacted to at least one fragrance/botanical additive. Compared with use of our standard series and cosmetic series, use of the TRUE test would have missed 22.5% of patients with preservative allergy, 11.3% with fragrance/botanical allergy, and 17.3% with vehicle allergy. Limitations include the various allergens tested over time, patch test reading by residents, and lack of confirmation of allergens in the personal care products. Standard patch-test screening series miss a substantial number of patients with skin care product ingredient allergy. Copyright © 2009 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.

  4. Testing for seasonal unit roots in monthly panels of time series

    NARCIS (Netherlands)

    R.M. Kunst (Robert); Ph.H.B.F. Franses (Philip Hans)


    We consider the problem of testing for seasonal unit roots in monthly panel data. To this aim, we generalize the quarterly CHEGY test to the monthly case. This parametric test is contrasted with a new nonparametric test, which is the panel counterpart to the univariate RURS test that

  5. Verification and Validation of Flight Critical Systems Project (United States)

    National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...

  6. Verification of the Performance of a Vertical Ground Heat Exchanger Applied to a Test House in Melbourne, Australia

    Directory of Open Access Journals (Sweden)

    Koon Beng Ooi


    circulation pumps and fans require low power that can be supplied by photovoltaic thermal (PVT). The EnergyPlus™ v8.7 object modeling the PVT requires user-defined efficiencies, so a PVT will be tested in the experimental house.

  7. Visualization of and Software for Omnibus Test Based Change Detected in a Time Series of Polarimetric SAR Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning


    Based on an omnibus likelihood ratio test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution, and a factorization of this test statistic with associated p-values, change analysis in a time series of multilook polarimetric SAR data in the covariance matrix representation is carried out. The omnibus test statistic and its factorization detect if and when change occurs. Using airborne EMISAR and spaceborne RADARSAT-2 data, this paper focuses on change detection based on the p-values, on visualization of change at pixel as well as segment level, and on computer software.
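The omnibus statistic for equality of k complex Wishart matrices can be sketched compactly; this is our reading of the standard formulation, with illustrative variable names, and omits the chi-squared approximation with the small-sample correction factor used for the p-values:

```python
import numpy as np

# Sketch of the omnibus likelihood ratio statistic for H0: Sigma_1 = ... =
# Sigma_k, where each X_i is the sum of n looks of a p x p complex covariance
# matrix. ln Q = n (p k ln k + sum_i ln|X_i| - k ln|sum_i X_i|); ln Q = 0 when
# all matrices are equal, and increasingly negative with change.
def omnibus_lnq(matrices, n):
    """Return ln Q; a large -2 ln Q indicates change somewhere in the series."""
    k = len(matrices)
    p = matrices[0].shape[0]
    total = sum(matrices)
    logdets = [np.log(np.linalg.det(x).real) for x in matrices]
    return n * (p * k * np.log(k) + sum(logdets)
                - k * np.log(np.linalg.det(total).real))

same = [np.eye(3, dtype=complex)] * 4
no_change = omnibus_lnq(same, n=13)   # exactly 0: no evidence of change
```

The factorization mentioned in the abstract splits this omnibus statistic into a sequence of marginal tests, one per time point, which is what pins down *when* the change occurs.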

  8. Online fingerprint verification. (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K


    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
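The filter-bank idea can be sketched minimally: a bank of oriented Gabor filters captures local ridge structure that minutiae miss. All parameter values and function names below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Sketch of a filter-bank (Gabor) fingerprint representation in the spirit of
# the FingerCode approach: a local ridge patch is scored against Gabor kernels
# at a bank of orientations; the response profile forms the feature vector.
def gabor_kernel(theta, freq=0.25, sigma=3.0, half=7):
    ax = np.arange(-half, half + 1)
    x, y = np.meshgrid(ax, ax)
    xr = x * np.cos(theta) + y * np.sin(theta)   # coordinate along orientation
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def orientation_scores(patch, n_orientations=8):
    """Matched-filter response of a ridge patch to each orientation."""
    return np.array([abs(np.sum(patch * gabor_kernel(np.pi * i / n_orientations)))
                     for i in range(n_orientations)])

# A synthetic patch with vertical ridges (intensity varies along x only).
ax = np.arange(-7, 8)
x, _ = np.meshgrid(ax, ax)
patch = np.cos(2 * np.pi * 0.25 * x)
scores = orientation_scores(patch)   # strongest response at orientation 0
```

Because the feature vector has fixed length, two fingerprints can be compared by a simple distance between vectors, avoiding the unregistered point-matching problem of minutiae.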

  9. Parking Space Verification

    DEFF Research Database (Denmark)

    Høg Peter Jensen, Troels; Thomsen Schmidt, Helge; Dyremose Bodin, Niels


    With the number of privately owned cars increasing, the issue of locating an available parking space becomes apparent. This paper deals with the verification of vacant parking spaces, using a vision-based system looking over parking areas. In particular, the paper proposes a binary classifier system, based on a Convolutional Neural Network, that is capable of determining whether a parking space is occupied or not. A benchmark database consisting of images captured from different parking areas, under different weather and illumination conditions, has been used to train and test the system. The system shows promising performance on the database, with an overall accuracy of 99.71%, and is robust to the variations in parking areas and weather conditions.
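The classification step can be sketched as a minimal CNN-style forward pass, written from scratch for illustration; the architecture, weights, and names below are toy assumptions, not the paper's trained network:

```python
import numpy as np

# Toy forward pass of a CNN-style binary classifier (occupied vs. vacant):
# one convolution, ReLU, global average pooling, and a sigmoid output.
def conv2d(img, kernel):
    """Valid-mode 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def predict_occupied(img, kernel, weight, bias):
    feat = np.maximum(conv2d(img, kernel), 0.0)             # conv + ReLU
    pooled = feat.mean()                                    # global average pool
    return 1.0 / (1.0 + np.exp(-(weight * pooled + bias)))  # sigmoid probability

rng = np.random.default_rng(0)
img = rng.random((8, 8))              # stand-in for a parking-space image patch
kernel = rng.standard_normal((3, 3))  # untrained toy filter
p = predict_occupied(img, kernel, weight=1.0, bias=0.0)
```

A real system would stack several such convolution layers with learned filters and train end-to-end on the labeled benchmark images; the sketch only shows the data flow from patch to occupancy probability.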

  10. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  11. Testing the heterospecific attraction hypothesis with time-series data on species co-occurrence


    Sebastian-Gonzalez, E.; Sanchez-Zapata, J. A.; Botella, F.; Ovaskainen, O.


    The distributional patterns of actively moving animals are influenced by the cues that the individuals use for choosing sites into which they settle. Individuals may gather information about habitat quality using two types of strategies, either directly assessing the relevant environmental factors, or using the presence of conspecifics or heterospecifics as an indirect measure of habitat quality. We examined patterns of heterospecific attraction with observational time-series data on a community of seven waterbird species breeding in artificial irrigation ponds. ...

  12. Gender verification in competitive sports. (United States)

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E


    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  13. Test of 6-in. -thick pressure vessels. Series 3: intermediate test vessel V-7. [PWR and BWR

    Energy Technology Data Exchange (ETDEWEB)

    Merkle, J.G.; Robinson, G.C.; Holz, P.P.; Smith, J.E.; Bryan, R.H.


    The test of intermediate test vessel V-7 was a crack-initiation fracture test of a 152-mm-thick (6-in.), 990-mm-OD (39-in.) vessel of ASTM A533, grade B, class 1 steel plate with a sharp outside surface flaw 457 mm (18 in.) long and about 135 mm (5.3 in.) deep. The vessel was heated to 91°C (196°F) and pressurized hydraulically until leakage through the flaw terminated the test at a peak pressure of 147 MPa (21,350 psi). Fracture toughness data obtained by testing precracked Charpy-V and compact-tension specimens machined from a prolongation of the cylindrical test shell were used in pretest analyses of the flawed vessel. The vessel, as expected, did not burst. Upon depressurization, the ruptured ligament closed so as to maintain static pressure without leakage at about 129 MPa (18,700 psi).

  14. Testing a series of causal propositions relating time in child care to children’s externalizing behavior


    McCartney, Kathleen Ann; Burchinal, Margaret; Bub, Kristen L.; Owen, Margaret T.; Belsky, Jay; Clarke-Stewart, Alison


    Prior research has documented associations between child care hours and children’s externalizing behavior. A series of longitudinal analyses were conducted to address five propositions, each testing the hypothesis that child care hours cause externalizing behavior. Data from the NICHD Study of Early Child Care and Youth Development were used in this investigation, because they include repeated measures of child care experiences, externalizing behavior, and family characteristics. There were ...

  15. Effective Strategies for Dealing with Test Anxiety. Teacher to Teacher Series. (United States)

    Collins, Lisa

    Test anxiety is exceedingly common among learners in adult basic education. Any one or more of the following can cause individuals to experience test anxiety: learned behavior resulting from the expectations of parents, teachers, or significant others; associations that students have built between grades or test performance and personal worth;…

  16. Essentials of WJ III[TM] Tests of Achievement Assessment. Essentials of Psychological Assessment Series. (United States)

    Mather, Nancy; Wendling, Barbara J.; Woodcock, Richard W.

    The widely used Woodcock-Johnson (WJ) Tests of Achievement have been separated into two distinct batteries, Achievement and Cognitive Abilities. This book is designed to help busy mental health professionals acquire the knowledge and skills they need to use the third revision of the WJ Tests of Achievement (WJ III ACH), including administration,…

  17. Optimal planning of series resistor to control time constant of test circuit for high-voltage AC circuit-breakers

    Directory of Open Access Journals (Sweden)

    Yoon-Ho Kim


    The equivalent test circuit that can deliver both short-circuit current and recovery voltage is used to verify the performance of high-voltage circuit breakers. Most of the parameters in this circuit can be obtained by using a simple calculation or a simulation program. The ratings of the circuit breaker include rated short-circuit breaking current, rated short-circuit making current, rated operating sequence of the circuit breaker and rated short-time current. Among these ratings, the short-circuit making capacity of the circuit breaker is expressed as a peak value, unlike the breaking capacity, which is expressed as an RMS value. A series resistor or super-excitation is used to control the peak value of the short-circuit current in the equivalent test circuit. When using a series resistor, a higher rating of circuit breakers leads to a higher thermal capacity, thereby requiring additional space. Therefore, an effective, optimal design of the series resistor is essential. This paper proposes a method for reducing thermal capacity and selecting the optimal resistance to limit the making current by controlling the DC time constant of the test circuit.

  18. 40 CFR 86.1847-01 - Manufacturer in-use verification and in-use confirmatory testing; submittal of information and... (United States)


    ... Compliance Provisions for Control of Air Pollution From New and In-Use Light-Duty Vehicles, Light-Duty Trucks... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Manufacturer in-use verification and... 86.1847-01 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS...

  19. Environmental Technology Verification: Test Report of Mobile Source Selective Catalytic Reduction--Nett Technologies, Inc., BlueMAX 100 version A urea-based selective catalytic reduction technology (United States)

    Nett Technologies’ BlueMAX 100 version A Urea-Based SCR System utilizes a zeolite catalyst coating on a cordierite honeycomb substrate for heavy-duty diesel nonroad engines for use with commercial ultra-low–sulfur diesel fuel. This environmental technology verification (ETV) repo...

  20. AGARD Flight Test Instrumentation Series. Volume 13. Practical Aspects of Instrumentation System Installation (United States)


    Harford House, 7-9 Charlotte St, London, W1P 1HD. PREFACE: Soon after its founding in 1952, the Advisory Group for Aerospace Research and Development... sensitivity of -90 dBm results in a signal strength margin of 34.2 dB. This would indicate that nulls of 10 dB would still leave a usable margin of 24.2... 1973. B5 White, Donald R. J., "A Handbook Series on Electromagnetic Interference and Compatibility," First Edition, Published in USA by Don White

  1. Guidance and Control Software Project Data - Volume 3: Verification Documents (United States)

    Hayhurst, Kelly J. (Editor)


    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  2. Results from Nevada Nuclear Waste Storage Investigations (NNWSI) Series 3 spent fuel dissolution tests

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.N.


    The dissolution and radionuclide release behavior of spent fuel in groundwater is being studied by the Yucca Mountain Project (YMP), formerly the Nevada Nuclear Waste Storage Investigations (NNWSI) Project. Specimens prepared from pressurized water reactor fuel rod segments were tested in sealed stainless steel vessels in Nevada Test Site J-13 well water at 85°C and 25°C. The test matrix included three specimens of bare-fuel particles plus cladding hulls, two fuel rod segments with artificially defected cladding and water-tight end fittings, and an undefected fuel rod section with watertight end fittings. Periodic solution samples were taken during test cycles with the sample volumes replenished with fresh J-13 water. Test cycles were periodically terminated and the specimens restarted in fresh J-13 water. The specimens were run for three cycles for a total test duration of 15 months. 22 refs., 32 figs., 26 tabs.

  3. Open-source software for demand forecasting of clinical laboratory test volumes using time-series analysis

    Directory of Open Access Journals (Sweden)

    Emad A Mohammed


    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand.
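
    The two Holt-Winters variants named in this record are standard exponential-smoothing models. As an illustration (a minimal sketch, not code from the tool described; the smoothing parameters and season length below are arbitrary choices), the additive variant can be written in plain Python:

```python
def holt_winters_additive(series, m, alpha, beta, gamma, horizon):
    """Additive Holt-Winters forecasting: level + trend + seasonal offsets.

    series  -- historical observations (e.g. weekly test volumes)
    m       -- season length (periods per seasonal cycle)
    horizon -- number of future periods to forecast
    """
    # Initialize level from the first season's mean, and trend from the
    # average per-period change between the first two seasons.
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / (m * m)
    seasonals = [series[i] - level for i in range(m)]  # seasonal offsets

    for i, y in enumerate(series):
        prev_level = level
        s = seasonals[i % m]
        level = alpha * (y - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seasonals[i % m] = gamma * (y - level) + (1 - gamma) * s

    # Point forecasts: extrapolate the trend, reuse the seasonal offsets.
    n = len(series)
    return [level + (h + 1) * trend + seasonals[(n + h) % m]
            for h in range(horizon)]
```

    On a deterministic trend-plus-seasonal series the forecasts track the true continuation closely; real test-volume data would of course carry noise, and the multiplicative variant differs only in dividing by, rather than subtracting, the seasonal component.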

  4. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis. (United States)

    Mohammed, Emad A; Naugler, Christopher


    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.

  5. Verification Account Management System (VAMS) (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  6. Verification is experimentation!

    NARCIS (Netherlands)

    Brinksma, Hendrik


    The formal verification of concurrent systems is usually seen as an example par excellence of the application of mathematical methods to computer science. Although the practical application of such verification methods will always be limited by the underlying forms of combinatorial explosion, recent

  7. Verification Is Experimentation!

    NARCIS (Netherlands)

    Brinksma, Hendrik


    The formal verification of concurrent systems is usually seen as an example par excellence of the application of mathematical methods to computer science. Although the practical application of such verification methods will always be limited by the underlying forms of combinatorial explosion, recent

  8. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S


    The performance of innovative environmental technologies can be verified by qualified third parties called "Verification Bodies". The "Statement of Verification" delivered at the end of the ETV process can be used as evidence that the claims made about...

  9. Test Anxiety: Theory, Assessment, and Treatment. The Series in Clinical and Community Psychology. (United States)

    Spielberger, Charles D., Ed.; Vagg, Peter R., Ed.

    It is not surprising that a broad array of treatment programs have been developed to reduce test anxiety, since the consequences can be serious. The contributions in this volume review and evaluate the theory of test anxiety, its measurement, its manifestations, and possible treatments and their outcomes. The following chapters are included: (1)…

  10. Statistical Considerations in Choosing a Test Reliability Coefficient. ACT Research Report Series, 2012 (10) (United States)

    Woodruff, David; Wu, Yi-Fang


    The purpose of this paper is to illustrate alpha's robustness and usefulness, using actual and simulated educational test data. The sampling properties of alpha are compared with the sampling properties of several other reliability coefficients: Guttman's lambda[subscript 2], lambda[subscript 4], and lambda[subscript 6]; test-retest reliability;…

  11. Standard test method for determining the susceptibility to intergranular corrosion of 5XXX series Aluminum alloys by mass loss after exposure to nitric acid (NAMLT Test)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 This test method describes a procedure for constant immersion intergranular corrosion testing of 5XXX series aluminum alloys. 1.2 This test method is applicable only to wrought products. 1.3 This test method covers type of specimen, specimen preparation, test environment, and method of exposure. 1.4 The values stated in SI units are to be regarded as the standard. The values given in parentheses are for information only. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  12. Testing the Effectiveness of Cognitive Analytic Therapy for Hypersexuality Disorder: An Intensive Time-Series Evaluation. (United States)

    Kellett, Stephen; Simmonds-Buckley, Mel; Totterdell, Peter


    The evidence base for treatment of hypersexuality disorder (HD) has few studies with appropriate methodological rigor. This study therefore conducted a single case experiment of cognitive analytic therapy (CAT) for HD using an A/B design with extended follow-up. Cruising, pornography usage, masturbation frequency and associated cognitions and emotions were measured daily in a 231-day time series. Following a three-week assessment baseline (A: 21 days), treatment was delivered via outpatient sessions (B: 147 days), with the follow-up period lasting 63 days. Results show that cruising and pornography usage extinguished. The total sexual outlet score no longer met caseness, and the primary nomothetic hypersexuality outcome measure met recovery criteria. Reduced pornography consumption was mediated by reduced obsessionality and greater interpersonal connectivity. The utility of the CAT model for intimacy problems shows promise. Directions for future HD outcome research are also provided.

  13. Pesticide patch test series for the assessment of allergic contact dermatitis among banana plantation workers in panama. (United States)

    Penagos, Homero; Ruepert, Clemens; Partanen, Timo; Wesseling, Catharina


    Irritant contact dermatitis and allergic contact dermatitis (ACD) are frequent among agricultural workers and require targeted interventions. Patch testing is necessary for differential diagnosis, but patch testing with pesticides is uncommon. This study explores the frequency of ACD and sensitization to pesticides among highly exposed banana plantation workers. Frequently and recently used pesticides on banana plantations in Divala, Panama, were documented. A pesticide patch test tray specific for this population was prepared. A structured interview was administered to 366 participants, followed by a complete skin examination. The pesticide patch test series, as well as a standard patch test series, was applied to 37 workers with dermatoses likely to be pesticide related and to 23 control workers without dermatoses. The pesticide patch tests identified 15 cases (41%) of ACD (20 positive reactions) among the 37 workers diagnosed with pesticide dermatosis. Three controls had allergic reactions to pesticides (4 positive reactions). The pesticides were carbaryl (5 cases), benomyl (4 cases), ethoprophos (3), chlorothalonil (2), imazalil (2), glyphosate (2), thiabendazole (2), chlorpyrifos (1), oxyfluorfen (1), propiconazole (1), and tridemorph (1). Ethoprophos and tridemorph had not been previously identified as sensitizers. Thus, the prevalence of ACD was 0.03 (15 of 366). On the basis of observed prevalences of positive patch-test reactions among the subgroups with and without dermatoses, we estimated that > or = 16% of the entire population may be sensitized to pesticides. Sensitization to pesticides among banana plantation workers is a frequent occupational health problem. Pesticide patch test trays should be used in assessing skin diseases in highly exposed workers.

  14. Automated continuous verification for numerical simulation

    Directory of Open Access Journals (Sweden)

    P. E. Farrell


    Verification is a process crucially important for the final users of a computational model: code is useless if its results cannot be relied upon. Typically, verification is seen as a discrete event, performed once and for all after development is complete. However, this does not reflect the reality that many geoscientific codes undergo continuous development of the mathematical model, discretisation and software implementation. Therefore, we advocate that in such cases verification must be continuous and happen in parallel with development: the desirability of their automation follows immediately. This paper discusses a framework for automated continuous verification of wide applicability to any kind of numerical simulation. It also documents a range of test cases to show the possibilities of the framework.
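
    Very schematically, continuous verification reduces to a registry of cheap test cases with known analytic answers that a CI hook runs on every commit. The harness and the two cases below are invented for illustration and are not part of the framework this paper describes:

```python
import math

def run_verification_suite(cases):
    """Run every registered verification case; return name -> pass/fail.

    Each case pairs a cheap computation with a known analytic answer and
    an acceptance tolerance, so the whole suite can run on each commit.
    """
    results = {}
    for name, case in cases.items():
        error = abs(case["run"]() - case["expected"])
        results[name] = error <= case["tolerance"]
    return results

# Hypothetical verification cases with exact analytic answers.
CASES = {
    # Midpoint-rule quadrature of sin on [0, pi]; the exact integral is 2.
    "midpoint_quadrature": {
        "run": lambda: sum(math.sin((i + 0.5) * math.pi / 1000) * math.pi / 1000
                           for i in range(1000)),
        "expected": 2.0,
        "tolerance": 1e-5,
    },
    # Partial geometric series sum of 0.5**k for k < 60; the limit is 2.
    "geometric_series": {
        "run": lambda: sum(0.5 ** k for k in range(60)),
        "expected": 2.0,
        "tolerance": 1e-12,
    },
}
```

    A CI configuration would then simply fail the build whenever any value returned by run_verification_suite(CASES) is False.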

  15. Profile of GMAT® Testing: North American Report. Testing Years 2010 through 2014. GMAC® Data-to-Go Series (United States)

    Chisholm, Alex


    This Data-to-Go brief summarizes five year GMAT testing trends for US and Canadian residents, and race/ethnicity breakdowns for US citizens. It includes: (1) GMAT exams taken by US region, US state of residence, and race/ethnicity of examinees (US citizens only), (2) GMAT exams taken by Canadian residents, by Canadian province, (3) GMAT exams…

  16. Profile of GMAT® Testing: Citizenship Report: Testing Years 2010 through 2014. GMAC® Data-to-Go Series (United States)

    Chisholm, Alex


    This report summarizes five-year global GMAT testing trends and includes: (1) GMAT exams taken by citizenship; (2) GMAT exams taken by gender; (3) Mean age of GMAT examinees; (4) Mean GMAT Total score; and (5) GMAT score-sending breakdowns by program type (MBA, non-MBA master's, and doctoral/other), TY2014. This data brief can help build candidate…

  17. Abrasion Testing of Products Containing Nanomaterials, SOP-R-2: Scientific Operating Procedure Series: Release (R) (United States)


    Related Documents: ASTM C 153-07 - Abrasion Resistance of Dimension Stone Subjected to Foot Traffic Using a Rotary Platform, Double-Head Abraser; ASTM D... the process of wearing away by friction. Abrader - wear testing instrument to evaluate abrasion resistance; also referred to as an abraser. Abrasion... permits a return to its starting position. In the case of the rotary platform test method, it consists of one complete rotation of the specimen. ERDC

  18. Statistical test for dynamical nonstationarity in observed time-series data

    CERN Document Server

    Kennel, M B


    Information in the time distribution of points in a state space reconstructed from observed data yields a test for "nonstationarity". Framed in terms of a statistical hypothesis test, this numerical algorithm can discern whether some underlying slow changes in parameters have taken place. The method examines a fundamental object in nonlinear dynamics, the geometry of orbits in state space, with corrections to overcome difficulties in real dynamical data which cause naive statistics to fail.

  19. Verification and Validation Studies for the LAVA CFD Solver (United States)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.


    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
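
    The Method of Manufactured Solutions mentioned above is easy to demonstrate on a toy problem (a generic sketch, not LAVA code): choose u(x) = sin(pi x), derive the forcing term of -u'' = f analytically, solve on two grids with a second-order scheme, and confirm that the observed order of accuracy is about 2:

```python
import math

def solve_poisson(n, f):
    """Solve -u'' = f on (0,1) with u(0)=u(1)=0 using second-order
    central differences on n interior points (Thomas algorithm)."""
    h = 1.0 / (n + 1)
    a = [-1.0] * n                    # sub-diagonal
    b = [2.0] * n                     # main diagonal
    c = [-1.0] * n                    # super-diagonal
    d = [h * h * f((i + 1) * h) for i in range(n)]
    for i in range(1, n):             # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * n                     # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u

def max_error(n):
    """Max nodal error against the manufactured solution sin(pi x)."""
    f = lambda x: math.pi ** 2 * math.sin(math.pi * x)   # derived forcing term
    u = solve_poisson(n, f)
    h = 1.0 / (n + 1)
    return max(abs(u[i] - math.sin(math.pi * (i + 1) * h)) for i in range(n))

# Observed order of accuracy from two grids with h = 1/20 and h = 1/40.
e_coarse, e_fine = max_error(19), max_error(39)
observed_order = math.log(e_coarse / e_fine, 2)
```

    For a correctly implemented second-order scheme, observed_order comes out very close to 2; a discretization bug typically drops it toward 1 or 0, which is exactly the signal code verification looks for.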

  20. Profile of GMAT® Testing: Residence Report. Testing Years 2010 through 2014. GMAC® Data-to-Go Series (United States)

    Chisholm, Alex


    This report summarizes five-year global GMAT testing trends and includes: (1) GMAT exams taken by residence; (2) GMAT exams taken by gender; (3) Mean age of GMAT examinees; (4) Mean GMAT Total score; and (5) GMAT score-sending breakdowns by program type (MBA, non-MBA master's, and doctoral/other), TY 2014. This data brief can be used to help build…

  1. Testing the heterospecific attraction hypothesis with time-series data on species co-occurrence (United States)

    Sebastián-González, Esther; Sánchez-Zapata, José Antonio; Botella, Francisco; Ovaskainen, Otso


    The distributional patterns of actively moving animals are influenced by the cues that the individuals use for choosing sites into which they settle. Individuals may gather information about habitat quality using two types of strategies, either directly assessing the relevant environmental factors, or using the presence of conspecifics or heterospecifics as an indirect measure of habitat quality. We examined patterns of heterospecific attraction with observational time-series data on a community of seven waterbird species breeding in artificial irrigation ponds. We fitted to the data a multivariate logistic regression model, which attributes the presence–absence of each species to a set of environmental and spatial covariates, to the presence of con- and heterospecifics in the previous year and to the presence of heterospecifics in the same year. All species showed a clear tendency to continue breeding in the same sites where they were observed in the previous year. Additionally, the presence of heterospecifics, both in the previous year and in the same year, generally increased the probability that the focal species was found breeding on a given pond. Our data thus give support for the heterospecific attraction hypothesis, though causal inference should be confirmed with manipulative experiments. PMID:20462909

  2. A generalized Grubbs-Beck test statistic for detecting multiple potentially influential low outliers in flood series (United States)

    Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.


    The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as “less-than” values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
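
    For orientation, the classic single-outlier Grubbs-Beck screen that the paper generalizes can be sketched as follows. The 10%-significance critical value uses the polynomial approximation given in Bulletin 17B (roughly valid for sample sizes between 10 and 149), and the flow values in the usage example are fabricated:

```python
import math

def grubbs_beck_low_threshold(flows):
    """Classic single Grubbs-Beck 10% low-outlier threshold (Bulletin 17B).

    Works on base-10 logs of the annual peak flows; flows below the
    returned threshold are flagged as potential low outliers.
    """
    logs = [math.log10(q) for q in flows]
    n = len(logs)
    mean = sum(logs) / n
    var = sum((x - mean) ** 2 for x in logs) / (n - 1)
    s = math.sqrt(var)
    # 10%-significance one-sided critical value, polynomial approximation.
    k_n = -0.9043 + 3.345 * math.sqrt(math.log10(n)) - 0.4046 * math.log10(n)
    return 10.0 ** (mean - k_n * s)

def flag_low_outliers(flows):
    """Return the flows falling below the Grubbs-Beck threshold."""
    threshold = grubbs_beck_low_threshold(flows)
    return [q for q in flows if q < threshold]
```

    For a 13-year series such as flag_low_outliers([500, 620, ..., 3000, 12]), only the extreme low flow falls below the threshold. The generalized test in the paper sweeps this idea over multiple candidate low flows so that several PILFs can be identified at once; flagged values are then treated as "less-than" data by the Expected Moments Algorithm.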

  3. 10 years of patch testing with the (meth)acrylate series. (United States)

    Kanerva, L; Jolanki, R; Estlander, T


    Statistics on 10 years of patch testing with 30 (meth)acrylates were compiled. Altogether 275 patients were patch tested and 48 patients (17.5%) had an allergic reaction to at least 1 (meth)acrylate. The (meth)acrylates most often provoking an allergic patch test reaction were 2-hydroxyethyl acrylate (2-HEA; 12.1%), 2-hydroxypropyl methacrylate (2-HPMA; 12.0%) and 2-hydroxyethyl methacrylate (2-HEMA; 11.4%). No allergic reactions were caused by 2-ethylhexyl acrylate (2-EHA), 2,2-bis[4-(methacryloxy)phenyl]propane (BIS-MA), trimethylolpropane triacrylate (TMPTA), oligotriacrylate 480 (OTA 480), N,N-methylenebisacrylamide (MBAA), or ethyl cyanoacrylate (ECA). The frequency of allergic patch test reactions presented cannot be considered as a "ranking" list of the most sensitizing (meth)acrylate compounds. In order to be able to judge the sensitization capacity of various (meth)acrylate compounds in humans, it would be necessary to have detailed information on the exposure history of the patients studied, including the purity of the (meth)acrylate compounds. Currently, this is not possible because (meth)acrylate-containing products regularly contain undeclared (meth)acrylate compounds.

  4. Testing for a Common Volatility Process and Information Spillovers in Bivariate Financial Time Series Models

    NARCIS (Netherlands)

    J. Chen (Jinghui); M. Kobayashi (Masahito); M.J. McAleer (Michael)


    The paper considers the problem as to whether financial returns have a common volatility process in the framework of stochastic volatility models that were suggested by Harvey et al. (1994). We propose a stochastic volatility version of the ARCH test proposed by Engle and Susmel (1993),

  5. Some basic tests on time series outliers | Olewuezi | Journal of the ...

    African Journals Online (AJOL)

    Under Peirce's criterion, the finding of two outliers was contradicted by Chauvenet's criterion and Grubbs' test, because Peirce's criterion accounts for the case where there is more than one suspect data point at once. Journal of the Nigerian Association of Mathematical Physics, Volume 15 (November, 2009), pp 101 - 106 ...

  6. Design, Fabrication, and Testing of a Scalable Series Augmented Railgun Research Platform (United States)


    higher velocity shots was grainy but retained the aluminum metallic tone whereas the root radius of the non-augmented shots was obscured by... blackened deposits. Although the current levels experienced in this testing are far less than the 900 kA threshold for root radius melting observed by

  7. 77 FR 38282 - Final Test Guidelines; OCSPP 850 Series; Notice of Availability (United States)


    ... Assessment Division (7403M), Office of Pollution Prevention and Toxics, Environmental Protection Agency, 1200... Toxics Docket (OPPT Docket), Environmental Protection Agency Docket Center (EPA/DC), EPA West Bldg., Rm... Plants, Cyanobacteria, and Terrestrial Soil Core Microcosm. Group F--Field Test Data Reporting Guidelines...

  8. Software Verification of Orion Cockpit Displays (United States)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee


NASA's latest spacecraft, Orion, is being developed to take humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized how user inputs are coded and simulated in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.

  9. Retrospective case series analysis of penicillin allergy testing in a UK specialist regional allergy clinic. (United States)

    Richter, A G; Wong, G; Goddard, S; Heslegrave, J; Derbridge, C; Srivastava, S; Diwakar, L; Huissoon, A P; Krishna, M T


    Penicillin allergy is the most common drug allergy. Skin testing for the major (PPL) and minor determinants (MDMs) of penicillin offers increased sensitivity and specificity over in vitro testing alone. Following a worldwide absence of reagents, a new kit was licensed in the UK in 2008 (Diater, Spain) and this report evaluates its use in a UK specialist allergy clinic. Prospective data on 50 consecutive patients tested with the new reagents were collected. The departmental protocol is adapted from the 2003 EAACI position paper. 14% (7/50) and 12% (6/50) of patients were diagnosed with immediate and non-immediate reactions respectively. The negative predictive value of the PPL and MDM reagents at the neat concentration for an immediate reaction was 93% (true negatives 37, false negatives 3). Two patients experienced systemic reactions to DPT in the absence of demonstrable specific IgE. None of the patients were diagnosed using skin prick testing alone or at lower concentrations of IDT. Five patients were diagnosed at the IDT stage and two at the DPT stage in the absence of demonstrable specific IgE. Six patients were diagnosed with non-immediate reactions, two on IDT alone and four following IDT and DPT. The new PPL and MDM determinants offer enhanced sensitivity when evaluating β-lactam hypersensitivity; however, there are limitations to the current testing regimens. The UK would benefit from local guidelines, which incorporate the new reagents and acknowledge the high amoxicillin prescription rate and the relatively lower specialist-to-patient ratio in this country.
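The negative predictive value quoted above is simply TN / (TN + FN); a quick sketch using the abstract's counts (37 true negatives, 3 false negatives):

```python
def negative_predictive_value(true_negatives, false_negatives):
    """NPV: the fraction of negative test results that are truly
    negative, i.e. TN / (TN + FN)."""
    return true_negatives / (true_negatives + false_negatives)

# Counts reported in the abstract: 37 true negatives, 3 false negatives.
npv = negative_predictive_value(37, 3)
print(npv)  # → 0.925, reported in the abstract rounded to 93%
```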

  10. Guidelines for Formal Verification Systems (United States)


This document explains the requirements for formal verification systems that are candidates for the NCSC's Endorsed Tools List (ETL). It is primarily intended for developers of verification systems to use in the development of production-quality formal verification systems. It explains the requirements and the process used to evaluate formal verification systems submitted to the NCSC for endorsement.

  11. AGARD flight test techniques series. Volume 9: Aircraft exterior noise measurement and analysis techniques (United States)

    Heller, H.


    Testing and analysis techniques to measure aircraft noise primarily for purposes of noise certification as specified by the 'International Civil Aviation Organization', ICAO are described. The relevant aircraft noise certification standards and recommended practices are presented in detail for subsonic jet aircraft, for heavy and light propeller-driven aircraft, and for helicopters. The practical execution of conducting noise certification tests is treated in depth. The characteristics and requirements of the acoustic and non-acoustic instrumentation for data acquisition and data processing are discussed, as are the procedures to determine the special noise measures - effective perceived noise level (EPNL) and maximum overall A-weighted noise level (L sub pA,max) - that are required for the noise certification of different types of aircraft. The AGARDograph also contains an extensive, although selective, discussion of test and analysis techniques for more detailed aircraft noise studies by means of either flight experiments or full-scale and model-scale wind tunnel experiments. Appendices provide supplementary information.

  12. Mortality among Military Participants at the 1957 PLUMBBOB Nuclear Weapons Test Series and on Leukemia among Participants at the SMOKY Test (United States)

    Caldwell, Glyn G.; Zack, Matthew M.; Mumma, Michael T.; Falk, Henry; Heath, Clark W.; Till, John E.; Chen, Heidi; Boice, John D.


Health effects following low doses of ionizing radiation are uncertain. Military veterans at the Nevada Test Site (NTS) during the SMOKY atmospheric nuclear weapons test in 1957 were reported to be at increased risk for leukemia in 1979, but this increase was not evaluated with respect to radiation dose. The SMOKY test was one of 30 tests in 1957 within the PLUMBBOB test series. These early studies led to public laws where atomic veterans could qualify for compensation for presumptive radiogenic diseases. A retrospective cohort study was conducted of 12,219 veterans at the PLUMBBOB test series, including 3,020 at the SMOKY nuclear test. Mortality follow-up was through 2010 and observed causes of death were compared with expected causes based on general population rates. Radiation dose to red bone marrow was based on individual dose reconstructions, and Cox proportional hazards models were used to evaluate dose response for all leukemias other than chronic lymphocytic leukemia (non-CLL leukemia). Vital status was determined for 95.3% of the 12,219 veterans. The dose to red bone marrow was low (mean 3.2 mGy, maximum 500 mGy). Military participants at the PLUMBBOB nuclear test series remained relatively healthy after 53 years and died at a lower rate than the general population. In contrast, and in comparison with national rates, the SMOKY participants showed significant increases in all causes of death, respiratory cancer, leukemia, nephritis and nephrosis, and accidents, possibly related in part to lifestyle factors common to enlisted men who made up 81% of the SMOKY cohort. Compared with national rates, a statistically significant excess of non-CLL leukemia was observed among SMOKY participants (Standardized Mortality Ratio = 1.89, 95% CI 1.24–2.75, n = 27) but not among PLUMBBOB participants after excluding SMOKY (SMR = 0.87, 95% CI 0.64–1.51, n = 47). Leukemia risk, initially reported to be significantly increased among SMOKY participants, remained elevated, but this risk
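A standardized mortality ratio of this kind, with its Poisson confidence interval, can be sketched using Byar's approximation to the exact limits. In the example below, the expected death count (~14.3) is back-calculated from the reported SMR of 1.89, an assumption rather than a figure given in the abstract.

```python
import math

def smr_with_ci(observed, expected, z=1.96):
    """Standardized Mortality Ratio = observed / expected deaths, with
    a confidence interval from Byar's approximation to the exact
    Poisson limits on the observed count."""
    smr = observed / expected
    o = observed
    lower = o * (1 - 1/(9*o) - z/(3*math.sqrt(o)))**3 / expected
    upper = (o + 1) * (1 - 1/(9*(o + 1)) + z/(3*math.sqrt(o + 1)))**3 / expected
    return smr, lower, upper

# 27 observed non-CLL leukemia deaths; expected count is illustrative.
smr, lo, hi = smr_with_ci(27, 14.3)
print(f"SMR {smr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → SMR 1.89 (95% CI 1.24-2.75)
```

With these inputs the sketch reproduces the interval reported in the abstract, which is consistent with an exact-Poisson-style calculation having been used.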

  13. Multistage Adaptive Testing for a Large-Scale Classification Test: Design, Heuristic Assembly, and Comparison with Other Testing Modes. ACT Research Report Series, 2012 (6) (United States)

    Zheng, Yi; Nozawa, Yuki; Gao, Xiaohong; Chang, Hua-Hua


    Multistage adaptive tests (MSTs) have gained increasing popularity in recent years. MST is a balanced compromise between linear test forms (i.e., paper-and-pencil testing and computer-based testing) and traditional item-level computer-adaptive testing (CAT). It combines the advantages of both. On one hand, MST is adaptive (and therefore more…

  14. MIDDLE NORTH Series Pre-DICE THROW I, II and DICE THROW Test Execution Report (United States)


the Test Bed Area (III-169); 3-102, SRI Remote Transmitter Located on North Oscura Peak (III-170); 3-103, SRI Repeater Station and Receiving Station (III-172) ... upper-skin outside and inside surfaces as well as on the web of the rib which made up the outboard boundary of panel number one. Pressure gages were ... SRI, one at North Oscura Peak and the other at their receiver station. The intercom units were placed in office trailers, instrumentation vans and shop

  15. Standard Verification System (SVS) (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  16. SSN Verification Service (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  17. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.


The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  18. Fiscal 1997 report of the development of high efficiency waste power generation technology. No.2 volume. Pilot plant verification test; Kokoritsu haikibutsu hatsuden gijutsu kaihatsu (pilot plant jissho shiken). 1997 nendo hokokusho (daini bunsatsu)

    Energy Technology Data Exchange (ETDEWEB)



As to a high efficiency waste power generation system using general waste as fuel, the details of the following were described: design/construction management and operational study of the pilot plant, design/manufacture/construction of the pilot plant, and study of an optimal total system. Concerning the construction management and operational study, the paper described the application for governmental/official inspection procedures and the undergoing of inspections, process management of the pilot plant, site patrols, safety management, management of the trial run of the pilot plant, drawing up of a verification test plan and test runs, etc. Relating to the design/manufacture/construction of the pilot plant, an outline of the pilot plant was given, together with points to be considered in the design of the furnace and boiler structures and in the verification test. As to the study of an optimal total system, the following were described: a survey of waste gasification/slagging power generation technology, a basic study on the RDF production process, a survey of trends in waste power generation technology in the U.S., etc. 52 refs., 149 figs., 121 tabs.

  19. Single-Event Transient Testing of Low Dropout PNP Series Linear Voltage Regulators (United States)

    Adell, Philippe; Allen, Gregory


As demand for high-speed, on-board, digital-processing integrated circuits on spacecraft increases (field-programmable gate arrays and digital signal processors in particular), the need for the next-generation point-of-load (POL) regulator becomes a prominent design issue. Shrinking process nodes have resulted in core rails dropping to values close to 1.0 V, drastically reducing margin for the standard switching converters or regulators that power digital ICs. The goal of this task is to perform SET characterization of several commercial POL converters through laser and heavy-ion testing, and to discuss the impact of these results on state-of-the-art digital-processing ICs.

  20. Effects of automated alerts on unnecessarily repeated serology tests in a cardiovascular surgery department: a time series analysis

    Directory of Open Access Journals (Sweden)

    Zapletal Eric


Full Text Available Abstract. Background: Laboratory testing is frequently unnecessary, particularly repetitive testing. Among the interventions proposed to reduce unnecessary testing, Computerized Decision Support Systems (CDSS) have been shown to be effective, but their impact depends on their technical characteristics. The objective of the study was to evaluate the impact of a Serology-CDSS providing point-of-care reminders of existing serology results, embedded in a Computerized Physician Order Entry system at a university teaching hospital in Paris, France. Methods: A CDSS was implemented in the Cardiovascular Surgery department of the hospital in order to decrease inappropriate repetition of viral serology tests (HBV). A time series analysis was performed to assess the impact of the alert on physicians' practices. The study took place between January 2004 and December 2007. The primary outcome was the proportion of unnecessarily repeated HBs antigen tests over the periods of the study. A test was considered unnecessary when it was ordered within 90 days after a previous test for the same patient. A secondary outcome was the proportion of potentially unnecessary HBs antigen test orders cancelled after an alert display. Results: In the pre-intervention period, 3,480 viral serology tests were ordered, of which 538 (15.5%) were unnecessarily repeated. During the intervention period, of the 2,095 HBs antigen tests performed, 330 unnecessary repetitions (15.8%) were observed. Before the intervention, the mean proportion of unnecessarily repeated HBs antigen tests increased by 0.4% per month (absolute increase, 95% CI 0.2% to 0.6%); after the intervention this trend changed significantly (p = 0.02), resulting in a stable proportion of unnecessarily repeated HBs antigen tests. A total of 380 unnecessary tests were ordered among 500 alerts displayed (compliance rate 24%). Conclusions: The proportion of unnecessarily repeated tests immediately dropped after CDSS implementation and remained stable, contrasting with the significant
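The study's operational definition of an unnecessary repeat (a same-patient order within 90 days of the previous order of the same test) can be sketched as follows; the patient IDs, test code, and dates are invented for illustration.

```python
from datetime import date, timedelta

WINDOW = timedelta(days=90)  # a repeat within 90 days counts as unnecessary

def count_unnecessary(orders):
    """Count orders placed within 90 days of the previous order of the
    same test for the same patient. `orders` is a list of
    (patient_id, test_code, order_date) tuples in any order."""
    last_seen = {}
    unnecessary = 0
    for patient, test, day in sorted(orders, key=lambda o: o[2]):
        key = (patient, test)
        if key in last_seen and day - last_seen[key] <= WINDOW:
            unnecessary += 1
        # Each order starts a new 90-day window, flagged or not
        # (a design choice; one could also keep the original date).
        last_seen[key] = day
    return unnecessary

orders = [
    (1, "HBsAg", date(2006, 1, 10)),
    (1, "HBsAg", date(2006, 2, 9)),   # 30 days later: flagged
    (2, "HBsAg", date(2006, 1, 1)),
    (2, "HBsAg", date(2006, 5, 1)),   # 120 days later: allowed
]
print(count_unnecessary(orders))  # → 1
```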

  1. Current status of verification practices in clinical biochemistry in Spain. (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè


    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization mainly in the algorithms used and the criteria and verification limits applied. A survey in clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but amount of use (percentage of test autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of them, with autoverification ability. Criteria and rules for seven routine biochemical tests were obtained.
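Two of the rule types surveyed, verification limits and the delta check, can be sketched in a few lines. The potassium limits and the 50% relative-change threshold below are illustrative assumptions, not values taken from the survey.

```python
def autoverify(result, low, high, previous=None, max_rel=0.5):
    """Release a result automatically only if it lies inside the
    verification limits and, when a previous result exists for the
    patient, passes a delta check (relative change <= max_rel)."""
    within_limits = low <= result <= high
    delta_ok = previous is None or (
        previous != 0 and abs(result - previous) / abs(previous) <= max_rel
    )
    return within_limits and delta_ok

# Potassium (mmol/L) with illustrative verification limits 3.5-5.1:
print(autoverify(4.2, 3.5, 5.1, previous=4.0))  # → True  (released)
print(autoverify(6.8, 3.5, 5.1, previous=4.0))  # → False (held for review)
```

Results that fail either rule fall through to technical or medical verification, matching the tiered process the survey describes.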

  2. Verification of Maximal Oxygen Uptake in Obese and Nonobese Children. (United States)

    Bhammar, Dharini M; Stickford, Jonathon L; Bernhardt, Vipa; Babb, Tony G


The purpose of this study was to examine whether a supramaximal constant-load verification test at 105% of the highest work rate would yield a higher V˙O2max when compared with an incremental test in 10- to 12-yr-old nonobese and obese children. Nine nonobese (body mass index percentile = 57.5 ± 23.2) and nine obese (body mass index percentile = 97.9 ± 1.4) children completed a two-test protocol that included an incremental test followed 15 min later by a supramaximal constant-load verification test. The V˙O2max achieved in verification testing (nonobese = 1.71 ± 0.31 L·min⁻¹ and obese = 1.94 ± 0.47 L·min⁻¹) was significantly higher than that achieved during the incremental test (nonobese = 1.57 ± 0.27 L·min⁻¹ and obese = 1.84 ± 0.48 L·min⁻¹; P < 0.05). There was no significant group × test (incremental vs. verification) interaction, suggesting that there was no effect of obesity on the difference between verification and incremental V˙O2max (P = 0.747). A verification test yielded significantly higher values of V˙O2max when compared with the incremental test in obese children. Similar results were observed in nonobese children. Supramaximal constant-load verification is a time-efficient and well-tolerated method for identifying the highest V˙O2 in nonobese and obese children.
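A verification bout is typically judged against the incremental value with a tolerance criterion; the 3% cut-off below is an illustrative assumption (criteria vary across laboratories), with the group means taken from the abstract.

```python
def vo2max_confirmed(incremental, verification, tolerance=0.03):
    """Treat the incremental V̇O2max as a true maximum only if the
    supramaximal verification bout does not exceed it by more than
    `tolerance` (3% here, an illustrative cut-off)."""
    return verification <= incremental * (1 + tolerance)

# Nonobese group means from the abstract (L/min): the verification
# value exceeded the incremental one by ~9%, so the incremental test
# alone would have underestimated V̇O2max.
print(vo2max_confirmed(1.57, 1.71))  # → False
```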

  3. Reliability of the four series 15-second vertical jumping test

    Directory of Open Access Journals (Sweden)

    Jefferson Eduardo Hespanhol


    Full Text Available OBJECTIVE: The aim of this study was to verify the test-retest reliability of the vertical jump test with four series of 15 seconds (TSVI). METHODS: Eighteen male athletes, divided into 11 handball players (25.74 ± 4.71 years; 85.84 ± 7.63 kg; 182.14 ± 3.46 cm) and 7 basketball players (18.60 ± 0.77 years; 83.32 ± 10.02 kg; 188.14 ± 5.76 cm), volunteered for this study. The variables studied in the test and retest were peak power (PP), mean power (PM), and fatigue index (IF). Performance on these variables was measured with the vertical jump test of four 15-second series with 10 seconds of recovery between series. Statistical treatment used descriptive techniques and the intraclass correlation coefficient (ICC). RESULTS: The results showed a high ICC for measures repeated on different days for all variables: PP (R = 0.992; p = 0.0360), PM (R = 0.993; p = 0.0107), and IF (R = 0.981; p = 0.0556). They also indicated high test-retest correlations for the quality indicators of the countermovement jump without arm swing (CMJ) technique (R = 0.991; p = 0.0800), for the number of jumps over 15 and 60 seconds of work (NSV15s, R = 0.936; p = 0.0062 and NSV60s, R = 0.978; p = 0.0139), and for jump height over 15 and 60 seconds of work (SV15s, R = 0.993; p = 0.0467; and SV60s, R = 0.988; p = 0.0014). CONCLUSION: The data analysis indicates that the TSVI provides a reliable measure for estimating explosive strength endurance through the PM and IF variables.
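The test's outcome variables, peak power, mean power, and fatigue index, can be computed from per-series power outputs. The fatigue-index formula below is one common definition (the abstract does not give the exact one), and the power values are invented for illustration.

```python
def jump_series_indices(series_powers):
    """Peak power (PP), mean power (MP) and fatigue index (FI) from the
    power output of each 15-second series. FI here is the percentage
    drop from the best to the worst series, one common definition."""
    pp = max(series_powers)
    mp = sum(series_powers) / len(series_powers)
    fi = (pp - min(series_powers)) / pp * 100
    return pp, mp, fi

# Hypothetical power outputs (W) for the four 15-second series.
pp, mp, fi = jump_series_indices([1200.0, 1100.0, 1000.0, 900.0])
print(pp, mp, fi)  # → 1200.0 1050.0 25.0
```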

  4. Structural verification of an aged composite reflector (United States)

    Lou, Michael C.; Tsuha, Walter S.


    A structural verification program applied to qualifying two heritage composite antenna reflectors for flight on the TOPEX satellite is outlined. The verification requirements and an integrated analyses/test approach employed to meet these requirements are described. Structural analysis results and qualification vibration test data are presented and discussed. It was determined that degradation of the composite and bonding materials caused by long-term exposure to an uncontrolled environment had not severely impaired the integrity of the reflector structures. The reflectors were assessed to be structurally adequate for the intended TOPEX application.

  5. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus


    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  6. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server



    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting a Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with an structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  7. Certification and verification for calmac flat plate solar collector

    Energy Technology Data Exchange (ETDEWEB)


    This document contains information used in the certification and verification of the Calmac Flat Plate Collector. Contained are such items as test procedures and results, information on materials used, Installation, Operation, and Maintenance Manuals, and other information pertaining to the verification and certification.

  8. Grip-pattern verification for a smart gun

    NARCIS (Netherlands)

    Shang, X.; Groenland, J.P.J.; Groenland, J.P.J.; Veldhuis, Raymond N.J.

    In the biometric verification system of a smart gun, the rightful user of the gun is recognized based on grip-pattern recognition. It was found that the verification performance of grip-pattern recognition degrades strongly when the data for training and testing the classifier, respectively, have

  9. Report of the subpanel on methods of verification (United States)


    A program to improve the state of understanding and of the meaning of verification and the application of verification procedures to a variety of sensor systems is recommended. The program would involve an experimental hands-on data demonstration and evaluation of those procedures in a controlled test bed experiment.

  10. Monitoring and verification R&D

    Energy Technology Data Exchange (ETDEWEB)

    Pilat, Joseph F [Los Alamos National Laboratory; Budlong - Sylvester, Kory W [Los Alamos National Laboratory; Fearey, Bryan L [Los Alamos National Laboratory


    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  11. Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement. (United States)

    Hicks, Jennifer L; Uchida, Thomas K; Seth, Ajay; Rajagopal, Apoorva; Delp, Scott L


    Computational modeling and simulation of neuromusculoskeletal (NMS) systems enables researchers and clinicians to study the complex dynamics underlying human and animal movement. NMS models use equations derived from physical laws and biology to help solve challenging real-world problems, from designing prosthetics that maximize running speed to developing exoskeletal devices that enable walking after a stroke. NMS modeling and simulation has proliferated in the biomechanics research community over the past 25 years, but the lack of verification and validation standards remains a major barrier to wider adoption and impact. The goal of this paper is to establish practical guidelines for verification and validation of NMS models and simulations that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies. In particular, we review a general process for verification and validation applied to NMS models and simulations, including careful formulation of a research question and methods, traditional verification and validation steps, and documentation and sharing of results for use and testing by other researchers. Modeling the NMS system and simulating its motion involves methods to represent neural control, musculoskeletal geometry, muscle-tendon dynamics, contact forces, and multibody dynamics. For each of these components, we review modeling choices and software verification guidelines; discuss variability, errors, uncertainty, and sensitivity relationships; and provide recommendations for verification and validation by comparing experimental data and testing robustness. We present a series of case studies to illustrate key principles. In closing, we discuss challenges the community must overcome to ensure that modeling and simulation are successfully used to solve the broad spectrum of problems that limit human mobility.

  12. Mortality among military participants at the 1957 PLUMBBOB nuclear weapons test series and from leukemia among participants at the SMOKY test. (United States)

    Caldwell, Glyn G; Zack, Matthew M; Mumma, Michael T; Falk, Henry; Heath, Clark W; Till, John E; Chen, Heidi; Boice, John D


Health effects following low doses of ionizing radiation are uncertain. Military veterans at the Nevada Test Site (NTS) during the SMOKY atmospheric nuclear weapons test in 1957 were reported to be at increased risk for leukemia in 1979, but this increase was not evaluated with respect to radiation dose. The SMOKY test was one of 30 tests in 1957 within the PLUMBBOB test series. These early studies led to public laws where atomic veterans could qualify for compensation for presumptive radiogenic diseases. A retrospective cohort study was conducted of 12,219 veterans at the PLUMBBOB test series, including 3,020 at the SMOKY nuclear test. Mortality follow-up was through 2010 and observed causes of death were compared with expected causes based on general population rates. Radiation dose to red bone marrow was based on individual dose reconstructions, and Cox proportional hazards models were used to evaluate dose response for all leukemias other than chronic lymphocytic leukemia (non-CLL leukemia). Vital status was determined for 95.3% of the 12,219 veterans. The dose to red bone marrow was low (mean 3.2 mGy, maximum 500 mGy). Military participants at the PLUMBBOB nuclear test series remained relatively healthy after 53 years and died at a lower rate than the general population. In contrast, and in comparison with national rates, the SMOKY participants showed significant increases in all causes of death, respiratory cancer, leukemia, nephritis and nephrosis, and accidents, possibly related in part to lifestyle factors common to enlisted men who made up 81% of the SMOKY cohort. Compared with national rates, a statistically significant excess of non-CLL leukemia was observed among SMOKY participants (Standardized Mortality Ratio = 1.89, 95% CI 1.24-2.75, n = 27) but not among PLUMBBOB participants after excluding SMOKY (SMR = 0.87, 95% CI 0.64-1.51, n = 47). Leukemia risk, initially reported to be significantly increased among SMOKY

  13. Nuclear disarmament verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.


    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  14. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy


    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  15. Turbulence Modeling Verification and Validation (United States)

    Rumsey, Christopher L.


    steps in the process. Verification ensures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.
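The method of manufactured solutions mentioned above can be illustrated with a minimal toy example of my own (not drawn from the paper): pick an exact solution, derive the forcing term it implies, solve the discretized problem on successively finer grids, and confirm that the error shrinks at the scheme's theoretical order.

```python
import math

def mms_convergence(grids=(16, 32, 64)):
    """Manufactured-solution check for a 1-D Poisson solver.

    Chosen exact solution: u(x) = sin(pi x) on [0, 1] with u(0) = u(1) = 0,
    which forces f(x) = pi^2 sin(pi x) in -u'' = f.  A second-order central
    difference scheme should exhibit an observed order of accuracy near 2.
    """
    errors = []
    for n in grids:
        h = 1.0 / n
        # h^2 * f at interior nodes x_i = i h, i = 1..n-1
        d = [h * h * math.pi ** 2 * math.sin(math.pi * i * h) for i in range(1, n)]
        # tridiagonal system -u_{i-1} + 2 u_i - u_{i+1} = h^2 f_i,
        # solved with the Thomas algorithm (sub/super diagonals are -1)
        b = [2.0] * (n - 1)
        for i in range(1, n - 1):          # forward elimination
            w = 1.0 / b[i - 1]
            b[i] -= w
            d[i] += w * d[i - 1]
        u = [0.0] * (n - 1)
        u[-1] = d[-1] / b[-1]
        for i in range(n - 3, -1, -1):     # back substitution
            u[i] = (d[i] + u[i + 1]) / b[i]
        err = max(abs(u[i - 1] - math.sin(math.pi * i * h)) for i in range(1, n))
        errors.append(err)
    # observed order between successive grid refinements (h halved each time)
    return [math.log(errors[i] / errors[i + 1], 2) for i in range(len(errors) - 1)]
```

The observed orders come out close to 2, matching the scheme's formal order; a bug in the discretization would show up as a degraded order, which is exactly the diagnostic MMS provides.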

  16. Cognitive Bias in Systems Verification (United States)

    Larson, Steve


    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  17. Writer identification and verification

    NARCIS (Netherlands)

    Schomaker, Lambert; Ratha, N; Govindaraju, V


    Writer identification and verification have gained increased interest recently, especially in the fields of forensic document examination and biometrics. Writer identification assigns a handwriting to one writer out of a set of writers. It determines whether or not a given handwritten text has in

  18. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael


    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  19. The effects of an educational meeting and subsequent computer reminders on the ordering of laboratory tests by rheumatologists: an interrupted time series analysis

    NARCIS (Netherlands)

    Lesuis, Nienke; den Broeder, Nathan; Boers, Nadine; Piek, Ester; Teerenstra, Steven; Hulscher, Marlies; van Vollenhoven, Ronald; den Broeder, Alfons A.


    To examine the effects of an educational meeting and subsequent computer reminders on the number of ordered laboratory tests. Using interrupted time series analysis we assessed whether trends in the number of laboratory tests ordered by rheumatologists between September 2012 and September 2015 at

  20. Verification of method performance for clinical laboratories. (United States)

    Nichols, James H


    Method verification, a one-time process to determine performance characteristics before a test system is utilized for patient testing, is often confused with method validation, establishing the performance of a new diagnostic tool such as an internally developed or modified method. A number of international quality standards (International Organization for Standardization (ISO) and Clinical Laboratory Standards Institute (CLSI)), accreditation agency guidelines (College of American Pathologists (CAP), Joint Commission, U.K. Clinical Pathology Accreditation (CPA)), and regional laws (Clinical Laboratory Improvement Amendments of 1988 (CLIA'88)) exist describing the requirements for method verification and validation. Consumers of marketed test kits should verify method accuracy, precision, analytic measurement range, and the appropriateness of reference intervals to the institution's patient population. More extensive validation may be required for new methods and those manufacturer methods that have been modified by the laboratory, including analytic sensitivity and specificity. This manuscript compares the various recommendations for method verification and discusses the CLSI evaluation protocols (EP) that are available to guide laboratories in performing method verification experiments.

  1. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)


    The verification of a water quality model is the procedure most needed by decision makers evaluating a model's predictions, but it is often inadequate or not done at all. The results of a properly conducted verification provide decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
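Residual-based verification measures of the kind the abstract refers to can be sketched as follows; the exact formulas below are my reading of standard definitions (bias, RMSE, and the standard error of the residuals), not formulas taken from the paper.

```python
import math

def verification_stats(predicted, observed):
    """Residual-based measures for comparing model predictions to observations."""
    resid = [p - o for p, o in zip(predicted, observed)]
    n = len(resid)
    mean_err = sum(resid) / n                      # bias of the predictions
    rmse = math.sqrt(sum(r * r for r in resid) / n)  # overall error magnitude
    # standard error of the residuals about their mean (spread of the error)
    se = math.sqrt(sum((r - mean_err) ** 2 for r in resid) / (n - 1))
    return {"mean_error": mean_err, "rmse": rmse, "std_error": se}
```

Computing these per state variable and per location is what produces the spatial distributions of verification measures described above.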

  2. A Tutorial on Text-Independent Speaker Verification

    Directory of Open Access Journals (Sweden)

    Frédéric Bimbot


    Full Text Available This paper presents an overview of a state-of-the-art text-independent speaker verification system. First, an introduction proposes a modular scheme of the training and test phases of a speaker verification system. Then, the speech parameterization most commonly used in speaker verification, namely cepstral analysis, is detailed. Gaussian mixture modeling, which is the speaker modeling technique used in most systems, is then explained. A few speaker modeling alternatives, namely neural networks and support vector machines, are mentioned. Normalization of scores is then explained, as this is a very important step in dealing with real-world data. The evaluation of a speaker verification system is then detailed, and the detection error trade-off (DET) curve is explained. Several extensions of speaker verification are then enumerated, including speaker tracking and segmentation by speakers. Then, some applications of speaker verification are proposed, including on-site applications, remote applications, applications for structuring audio information, and games. Issues concerning the forensic area are then recalled, as we believe it is very important to inform people about the actual performance and limitations of speaker verification systems. This paper concludes by giving a few research trends in speaker verification for the next couple of years.
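A DET curve of the kind mentioned above is traced by sweeping a decision threshold over genuine and impostor trial scores and recording the two error rates at each point. A minimal sketch (the score convention and data are hypothetical; here a higher score means a stronger claim that the trial is genuine):

```python
def det_points(genuine, impostor, thresholds):
    """(false match rate, false non-match rate) at each decision threshold."""
    pts = []
    for t in thresholds:
        fnmr = sum(s < t for s in genuine) / len(genuine)    # genuine trials rejected
        fmr = sum(s >= t for s in impostor) / len(impostor)  # impostor trials accepted
        pts.append((fmr, fnmr))
    return pts
```

Plotting FNMR against FMR over a dense grid of thresholds (conventionally on normal-deviate axes) yields the DET curve used to compare verification systems.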

  3. Quantitative reactive modeling and verification. (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.


    The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...

  5. Visual inspection for CTBT verification

    Energy Technology Data Exchange (ETDEWEB)

    Hawkins, W.; Wohletz, K.


    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  6. Historical estimates of external gamma exposure and collective external gamma exposure from testing at the Nevada Test Site. I. Test series through HARDTACK II, 1958

    Energy Technology Data Exchange (ETDEWEB)

    Anspaugh, L.R.; Church, B.W.


    In 1959, the Test Manager's Committee to Establish Fallout Doses calculated estimated external gamma exposure at populated locations based upon measurements of external gamma-exposure rate. Using these calculations and estimates of population, we have tabulated the collective estimated external gamma exposures for communities within established fallout patterns. The total collective estimated external gamma exposure is 85,000 person-R. The greatest collective exposures occurred in three general areas: Saint George, Utah; Ely, Nevada; and Las Vegas, Nevada. Three events, HARRY (May 19, 1953), BEE (March 22, 1955), and SMOKY (August 31, 1957), accounted for over half of the total collective estimated external gamma exposure. The bases of the calculational models for external gamma exposure of ''infinite exposure,'' ''estimated exposure,'' and ''one year effective biological exposure'' are explained. 4 figs., 7 tabs.

  7. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)



    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  8. GRAVITY Science Verification (United States)

    Mérand, A.; Berger, J.-P.; de Wit, W.-J.; Eisenhauer, F.; Haubois, X.; Paumard, T.; Schoeller, M.; Wittkowski, M.; Woillez, J.; Wolff, B.


    In the time between successfully commissioning an instrument and before offering it in the Call for Proposals for the first time, ESO gives the community at large an opportunity to apply for short Science Verification (SV) programmes. In 2016, ESO offered SV time for the second-generation Very Large Telescope Interferometer instrument GRAVITY. In this article we describe the selection process, outline the range of science cases covered by the approved SV programmes, and highlight some of the early scientific results.

  9. Contact allergy to acrylates/methacrylates in the acrylate and nail acrylics series in southern Sweden: simultaneous positive patch test reaction patterns and possible screening allergens. (United States)

    Teik-Jin Goon, Anthony; Bruze, Magnus; Zimerson, Erik; Goh, Chee-Leok; Isaksson, Marléne


    In a recent study we showed that all our dental personnel/patients were detected with 2-hydroxyethyl methacrylate (2-HEMA) and 2,2-bis[4-(2-hydroxy-3-methacryloxypropoxy)phenyl]propane (bis-GMA). We studied 90 patients tested to the acrylate and nail acrylics series at our department over a 10 year period to see whether screening allergens could be found. Patch testing with an acrylate and nail acrylics series was performed. Among the 10 acrylate/methacrylate-allergic occupational dermatitis patients tested to the acrylate series, the most common allergens were triethyleneglycol diacrylate (TREGDA, 8), diethyleneglycol diacrylate (5), and 1,4-butanediol diacrylate (BUDA, 5). All 10 of these patients would have been picked up by a short screening series combining TREGDA, 2-hydroxypropyl methacrylate (2-HPMA), and BUDA or 1,6-hexanediol diacrylate (HDDA). Among the 14 acrylate/methacrylate-allergic nail patients, the most common allergens were ethylene glycol dimethacrylate (EGDMA, 11), 2-HEMA, (9), and triethyleneglycol dimethacrylate (9). Screening for 3 allergens i.e. 2-HEMA plus EGDMA plus TREGDA, would have detected all 14 nail patients. A short screening series combining 2-HEMA, EGDMA, TREGDA, 2-HPMA, bis-GMA, and BUDA or HDDA would have picked up all our past study patients (dental, industrial, and nail) with suspected allergy to acrylate/methacrylate allergens.

  10. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty (United States)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias


    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the

  11. A methodology for model-based development and automated verification of software for aerospace systems (United States)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the tool SCADE Suite (Esterel Technology), an integrated design environment that covers all the requirements for our methodology. The SCADE Suite is well established in avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  12. Change detection in a time series of polarimetric SAR data by an omnibus test statistic and its factorization (Conference Presentation) (United States)

    Nielsen, Allan A.; Conradsen, Knut; Skriver, Henning


    Test statistics for comparison of real (as opposed to complex) variance-covariance matrices exist in the statistics literature [1]. In earlier publications we have described a test statistic for the equality of two variance-covariance matrices following the complex Wishart distribution with an associated p-value [2]. We showed their application to bitemporal change detection and to edge detection [3] in multilook, polarimetric synthetic aperture radar (SAR) data in the covariance matrix representation [4]. The test statistic and the associated p-value are also described in [5]. In [6] we focussed on the block-diagonal case, we elaborated on some computer implementation issues, and we gave examples on the application to change detection in both full and dual polarization bitemporal, bifrequency, multilook SAR data. In [7] we described an omnibus test statistic Q for the equality of k variance-covariance matrices following the complex Wishart distribution. We also described a factorization of Q = R2 R3 … Rk where Q and Rj determine if and when a difference occurs. Additionally, we gave p-values for Q and Rj. Finally, we demonstrated the use of Q and Rj and the p-values for change detection in truly multitemporal, full polarization SAR data. Here we illustrate the methods by means of airborne L-band SAR data (EMISAR) [8,9]. The methods may also be applied to other polarimetric SAR data, such as data from Sentinel-1, COSMO-SkyMed, TerraSAR-X, ALOS, and RadarSat-2, and to single-pol data. The account given here closely follows that given in our recent IEEE TGRS paper [7]. Selected References [1] Anderson, T. W., An Introduction to Multivariate Statistical Analysis, John Wiley, New York, third ed. (2003). [2] Conradsen, K., Nielsen, A. A., Schou, J., and Skriver, H., "A test statistic in the complex Wishart distribution and its application to change detection in polarimetric SAR data," IEEE Transactions on Geoscience and Remote Sensing 41(1): 4-19, 2003. [3] Schou, J
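For the dual-polarization case (2x2 complex covariance matrices), the omnibus criterion Q described above has the closed form Q = k^{pnk} prod_j |X_j|^n / |X|^{nk} with X = sum_j X_j. The sketch below is my paraphrase of that formula, assuming all k matrices share the same number of looks n; it evaluates ln Q, which is 0 for identical matrices and negative otherwise.

```python
import math

def det2(m):
    """Determinant of a 2x2 complex matrix given as nested lists."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def omnibus_lnq(mats, n, p=2):
    """ln Q omnibus criterion for equality of k complex Wishart matrices.

    mats: list of k sample covariance matrices (2x2, nested lists of complex);
    n: number of looks, assumed equal for all matrices (an assumption of
    this sketch, matching the equal-n form of the criterion).
    """
    k = len(mats)
    # element-wise sum X of the k sample covariance matrices
    total = [[sum(m[i][j] for m in mats) for j in range(p)] for i in range(p)]
    return n * (p * k * math.log(k)
                + sum(math.log(abs(det2(m))) for m in mats)
                - k * math.log(abs(det2(total))))
```

In the full method, -2 rho ln Q is compared to a chi-squared distribution to obtain a p-value; the correction constant rho and the factorization into the Rj terms are omitted from this sketch.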

  13. Gender verification of female athletes. (United States)

    Elsas, L J; Ljungqvist, A; Ferguson-Smith, M A; Simpson, J L; Genel, M; Carlson, A S; Ferris, E; de la Chapelle, A; Ehrhardt, A A


    The International Olympic Committee (IOC) officially mandated gender verification for female athletes beginning in 1968 and continuing through 1998. The rationale was to prevent masquerading males and women with "unfair, male-like" physical advantage from competing in female-only events. Visual observation and gynecological examination had been tried on a trial basis for two years at some competitions leading up to the 1968 Olympic Games, but these invasive and demeaning processes were jettisoned in favor of laboratory-based genetic tests. Sex chromatin and more recently DNA analyses for Y-specific male material were then required of all female athletes immediately preceding IOC-sanctioned sporting events, and many other international and national competitions following the IOC model. On-site gender verification has since been found to be highly discriminatory, and the cause of emotional trauma and social stigmatization for many females with problems of intersex who have been screened out from competition. Despite compelling evidence for the lack of scientific merit for chromosome-based screening for gender, as well as its functional and ethical inconsistencies, the IOC persisted in its policy for 30 years. The coauthors of this manuscript have worked with some success to rescind this policy through educating athletes and sports governors regarding the psychological and physical nature of sexual differentiation, and the inequities of genetic sex testing. In 1990, the International Amateur Athletics Federation (IAAF) called for abandonment of required genetic screening of women athletes, and by 1992 had adopted a fairer, medically justifiable model for preventing only male "impostors" in international track and field. At the recent recommendation of the IOC Athletes Commission, the Executive Board of the IOC has finally recognized the medical and functional inconsistencies and undue costs of chromosome-based methods. 
In 1999, the IOC ratified the abandonment of on

  14. Formal Verification of Large Software Systems (United States)

    Yin, Xiang; Knight, John


    We introduce a scalable proof structure to facilitate formal verification of large software systems. In our approach, we mechanically synthesize an abstract specification from the software implementation, match its static operational structure to that of the original specification, and organize the proof as the conjunction of a series of lemmas about the specification structure. By setting up a different lemma for each distinct element and proving each lemma independently, we obtain the important benefit that the proof scales easily for large systems. We present details of the approach and an illustration of its application on a challenge problem from the security domain.


  15. Verification of statistical cloudiness estimations for Europe

    Directory of Open Access Journals (Sweden)

    Zoltán Imecs


    Full Text Available Verification of statistical cloudiness estimations for Europe. The climate forcing induced by cloud cover is one of the main uncertainties in climate change predictions; for cloudiness, even the sign of the trend is not consistent within a given region. In this sense, further investigation of the behavior of cloudiness is warranted. In this study a statistical estimation of total cloudiness is elaborated using the method of instrumental variables. For this analysis, surface-observed monthly mean cloudiness data were used for the period 1973-1996. In the second part of the study, the results are verified against an independent satellite-retrieved data series for the period 2005-2011. The verification shows that the statistical estimation reproduces the measured values with an RMSE of 7.3%, that the difference between the measured and predicted changes of cloudiness is 1.44%, and that the real data show a stronger decrease of cloudiness than the estimation indicated. The main difference between the observed and predicted values is evident in the frequency distributions, which show a shift towards lower values in the observed data that is not present in the estimated values. In the geographical distribution of the estimation errors, a difference is detected between water surfaces and continental regions.

  16. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip


    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  17. Identifying Autocorrelation Generated by Various Error Processes in Interrupted Time-Series Regression Designs: A Comparison of AR1 and Portmanteau Tests (United States)

    Huitema, Bradley E.; McKean, Joseph W.


    Regression models used in the analysis of interrupted time-series designs assume statistically independent errors. Four methods of evaluating this assumption are the Durbin-Watson (D-W), Huitema-McKean (H-M), Box-Pierce (B-P), and Ljung-Box (L-B) tests. These tests were compared with respect to Type I error and power under a wide variety of error…
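The two families of tests compared above are easy to sketch from their textbook definitions; the implementation below is a generic illustration (critical values, the Huitema-McKean variant, and the Box-Pierce form are omitted).

```python
def durbin_watson(resid):
    """D-W statistic: ~2 for independent errors, toward 0 for positive
    autocorrelation, toward 4 for negative autocorrelation."""
    num = sum((resid[i] - resid[i - 1]) ** 2 for i in range(1, len(resid)))
    return num / sum(r * r for r in resid)

def ljung_box(resid, max_lag):
    """Ljung-Box portmanteau Q over lags 1..max_lag.

    Compared to a chi-squared distribution with max_lag degrees of freedom
    to test the joint hypothesis of zero autocorrelation at all lags.
    """
    n = len(resid)
    mean = sum(resid) / n
    c0 = sum((r - mean) ** 2 for r in resid)
    q = 0.0
    for k in range(1, max_lag + 1):
        # lag-k sample autocorrelation of the residuals
        ck = sum((resid[i] - mean) * (resid[i - k] - mean) for i in range(k, n))
        q += (ck / c0) ** 2 / (n - k)
    return n * (n + 2) * q
```

A run of like-signed residuals drives D-W toward 0 and Q upward, while strictly alternating residuals drive D-W toward 4; both patterns violate the independence assumption of interrupted time-series regression.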


    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy

  19. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.


    In fingerprint verification, a user's fingerprint is matched against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  20. Thoughts on Verification of Nuclear Disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Dunlop, W H


    It is my pleasure to be here today to participate in this Conference. My thanks to the organizers for preparing such an interesting agenda on a very difficult topic. My effort in preparing my presentation was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48. And as many of you know, Lawrence Livermore National Laboratory is now, as of Oct 1st, under contract to the Lawrence Livermore National Security LLC. There has been a long history of how to view verification of arms control agreements. The basis for verification during the days of SALT was that verification would be based on each country's national technical means. For treaties dealing with strategic missiles this worked well, as the individual items subject to verification were of such a size that they were visible by the National Technical Means available at the time. And it was felt that the counting of missiles and launchers could be verified by our National Technical Means. For nuclear testing treaties, the use of seismic measurements developed into a capability that was reasonably robust for all but the smallest of nuclear tests. However, once we had the Threshold Test Ban Treaty, there was a significant problem in that the fidelity of the measurements was not sufficient to determine if a test was slightly above the 150 kt limit or slightly below the 150 kt limit. This led some in the US to believe that the Soviet Union was not living up to the TTBT agreement. An on-site verification protocol was negotiated in 1988 and 1989 that allowed the US to make hydrodynamic yield measurements on Soviet tests above 50 kt yield and regional seismic measurements on all tests above 35 kt of yield; and the Soviets to make the same type of measurements on US tests to ensure that they were not over 150 kt. These on-site measurements were considered reasonably intrusive. Again the measurement capability was

  1. The LHC Main Quadrupoles during Series Fabrication

    CERN Document Server

    Tortschanoff, Theodor; Durante, M; Hagen, P; Klein, U; Krischel, D; Payn, A; Rossi, L; Schellong, B; Schmidt, P; Simon, F; Schirm, K-M; Todesco, E


    By the end of August 2005 about 320 of the 400 main LHC quadrupole magnets have been fabricated and about 220 of them assembled into their cold masses, together with corrector magnets. About 130 of them have been cold tested in their cryostats and most of the quadrupoles exceeded their nominal excitation, i.e. 12,000 A, after no more than two training quenches. During this series fabrication, the quality of the magnets and cold masses was thoroughly monitored by means of warm magnetic field measurements, of strict geometrical checking, and of various electrical verifications. A number of modifications were introduced in order to improve the magnet fabrication, mainly correction of the coil geometry for achieving the specified field quality and measures for avoiding coil insulation problems. Further changes concern the electrical connectivity and insulation of instrumentation, and of the corrector magnets inside the cold masses. The contact resistances for the bus-bar connections to the quench protection diode...

  2. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo


    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems, a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open-source formal tools and the ways in which they can be leveraged in digital design workflows.

  3. Fourier series

    CERN Document Server

    Tolstov, Georgi P


    Richard A. Silverman's series of translations of outstanding Russian textbooks and monographs is well-known to people in the fields of mathematics, physics, and engineering. The present book is another excellent text from this series, a valuable addition to the English-language literature on Fourier series.This edition is organized into nine well-defined chapters: Trigonometric Fourier Series, Orthogonal Systems, Convergence of Trigonometric Fourier Series, Trigonometric Series with Decreasing Coefficients, Operations on Fourier Series, Summation of Trigonometric Fourier Series, Double Fourie
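As a small illustration of the trigonometric series the book treats (a sketch, not an excerpt from the text): the square wave f(x) = sign(sin x) has the expansion (4/π) Σ sin((2k+1)x)/(2k+1) over k ≥ 0, and its partial sums converge pointwise on (0, π).

```python
import math

# Illustrative sketch: partial sums of the trigonometric Fourier series
# of the square wave f(x) = sign(sin x), whose expansion is
# (4/pi) * sum over k >= 0 of sin((2k+1)x) / (2k+1).

def square_wave_partial_sum(x, n_terms):
    """Sum the first n_terms odd harmonics of the square-wave series."""
    return (4.0 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_terms)
    )

# Pointwise convergence on (0, pi): at x = pi/2 the partial sums approach 1.
approx = square_wave_partial_sum(math.pi / 2, 200)
```

Near the jump at x = 0 the partial sums overshoot by a fixed fraction (the Gibbs phenomenon), which is one of the convergence topics such a text covers.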

  4. Earthquake induced rock shear through a deposition hole. Modelling of three model tests scaled 1:10. Verification of the bentonite material model and the calculation technique

    Energy Technology Data Exchange (ETDEWEB)

    Boergesson, Lennart (Clay Technology AB, Lund (Sweden)); Hernelind, Jan (5T Engineering AB, Vaesteraas (Sweden))


    Three model shear tests of very high quality, simulating a horizontal rock shear through a deposition hole at the centre of a canister, were performed in 1986. The tests and the results are described by /Boergesson 1986/. The tests simulated a deposition hole at the scale 1:10 with the reference density of the buffer, a very stiff confinement simulating the rock, and a solid bar of copper simulating the canister. The three tests were almost identical with the exception of the shear rate, which varied between 0.031 and 160 mm/s (a factor of more than 5,000), and the density of the bentonite, which differed slightly. The tests were very well documented. Shear force, shear rate, total stress in the bentonite, strain in the copper, and the movement of the top of the simulated canister were measured continuously during the shear. After shearing was finished, the equipment was dismantled and careful sampling of the bentonite, with measurement of water ratio and density, was carried out. The deformed copper 'canister' was also carefully measured after the test. The tests have been modelled with the finite element code Abaqus with the same models and techniques that were used for the full scale scenarios in SR-Site. The results have been compared with the measured results, which has yielded very valuable information about the relevancy of the material models and the modelling technique. An elastic-plastic material model was used for the bentonite, where the stress-strain relations have been derived from laboratory tests. The material model is made a function of both the density and the strain rate at shear. Since the shear is fast and takes place under undrained conditions, the density is not changed during the tests. However, strain rate varies largely with both the location of the elements and time. This can be taken into account in Abaqus by making the material model a function of the strain rate for each element. A similar model, based on tensile tests on the copper used in

  5. Optical secure image verification system based on ghost imaging (United States)

    Wu, Jingjing; Haobogedewude, Buyinggaridi; Liu, Zhengjun; Liu, Shutian


    Ghost imaging can perform Fourier-space filtering by tailoring the configuration. We propose a novel optical secure image verification system based on this principle, with the help of phase-matched filtering. In the verification process, the system key and the ID card, which contain the information of the correct image and the information to be verified, are put in the reference and test paths, respectively. We demonstrate that the ghost imaging configuration can perform an incoherent correlation between the system key and the ID card. A correct verification manifests itself as a correlation peak in the ghost image. For security, the primary image and the image to be verified are encrypted and encoded into pure phase masks beforehand. Multi-image secure verification can also be implemented in the proposed system.

  6. An Evaluation of a Two-Stage Testlet Design for Computerized Testing. Law School Admission Council Computerized Testing Report. LSAC Research Report Series. (United States)

    Reese, Lynda M.; Schnipke, Deborah L.

    A two-stage design provides a way of roughly adapting item difficulty to test-taker ability. All test takers take a parallel stage-one test, and based on their scores, they are routed to tests of different difficulty levels in the second stage. This design provides some of the benefits of standard computer adaptive testing (CAT), such as increased…


    This report covers an environmental verification of the emissions characteristics of a Donaldson Corp. catalytic muffler and catalytic crankcase emissions control. It was found that the systems reduced emissions.

  8. Mechanistic Model for Ash Deposit Formation in Biomass Suspension-Fired Boilers. Part 2: Model Verification by Use of Full Scale Tests

    DEFF Research Database (Denmark)

    Hansen, Stine Broholm; Jensen, Peter Arendt; Jappe Frandsen, Flemming


    A model for deposit formation in suspension firing of biomass has been developed. The model describes deposit build-up by diffusion and subsequent condensation of vapors, thermophoresis of aerosols, convective diffusion of small particles, impaction of large particles, and reaction. The model...... describes particle sticking or rebound by a combination of a description of (visco)elastic particles impacting a solid surface and particle capture by a viscous surface. The model is used to predict deposit formation rates measured during tests conducted with probes in full-scale suspension-fired biomass...... boilers. The rates predicted by the model were reasonably able to follow the rates observed in the tests, although with some variation, primarily as overestimations of the deposit formation rates. It is considered that the capture properties of the deposit surface are overestimated. Further examination...

  9. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M


    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  10. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva


    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a methodology and a tool (PLCverif) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems.

  11. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)


    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  12. Overview of Code Verification (United States)


    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  13. Nearest-Neighbor Estimation for ROC Analysis under Verification Bias. (United States)

    Adimari, Gianfranco; Chiogna, Monica


    For a continuous-scale diagnostic test, the receiver operating characteristic (ROC) curve is a popular tool for displaying the ability of the test to discriminate between healthy and diseased subjects. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the test result and other characteristics of the subjects. Estimators of the ROC curve based only on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct for verification bias, in particular under the assumption that the true disease status, if missing, is missing at random (MAR). The MAR assumption means that the probability of missingness depends on the true disease status only through the test result and observed covariate information. However, the existing methods require parametric models for the (conditional) probability of disease and/or the (conditional) probability of verification, and hence are subject to model misspecification: a wrong specification of such parametric models can affect the behavior of the estimators, which can be inconsistent. To avoid misspecification problems, in this paper we propose a fully nonparametric method for the estimation of the ROC curve of a continuous test under verification bias. The method is based on nearest-neighbor imputation and adopts generic smooth regression models for both the probability that a subject is diseased and the probability that it is verified. Simulation experiments and an illustrative example show the usefulness of the new method. Variance estimation is also discussed.
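The quantity under study can be made concrete with a toy computation (scores invented for illustration, not data from the paper): the empirical area under the ROC curve equals the Mann-Whitney probability that a diseased subject's score exceeds a healthy subject's. Verification bias arises when the true labels used below are only observed for a score-dependent subset of subjects.

```python
# Illustrative sketch (toy scores): the empirical AUC of a continuous-scale
# test is the proportion of diseased/healthy pairs in which the diseased
# subject scores higher, with ties counted as 1/2.

def empirical_auc(diseased_scores, healthy_scores):
    """P(diseased score > healthy score) over all pairs, ties count 0.5."""
    wins = 0.0
    for d in diseased_scores:
        for h in healthy_scores:
            if d > h:
                wins += 1.0
            elif d == h:
                wins += 0.5
    return wins / (len(diseased_scores) * len(healthy_scores))

auc = empirical_auc([0.9, 0.8, 0.7], [0.6, 0.8, 0.2])  # 7.5 of 9 pairs ≈ 0.833
```

Dropping unverified subjects before this computation is exactly the complete-case estimator that the paper's nearest-neighbor imputation is designed to improve upon.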

  14. Round-Robin Verification and Final Development of the IEC 62788-1-5 Encapsulation Size Change Test; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Wohlgemuth, J.; Bokria, J.; Gu, X.; Honeker, C.; Murua, N.; Nickel, N.; Sakurai, K.; Shioda, T.; Tamizhmani, G.; Wang, E.; Yang, S.; Yoshihara, T.


    Polymeric encapsulation materials may change size when processed at typical module lamination temperatures. The relief of residual strain, trapped during the manufacture of encapsulation sheet, can affect module performance and reliability. For example, displaced cells and interconnects can lead to: cell fracture; broken interconnects (open circuits and ground faults); delamination at interfaces; and void formation. A standardized test for the characterization of change in linear dimensions of encapsulation sheet has been developed and verified. The IEC 62788-1-5 standard quantifies the maximum change in linear dimensions that may occur, to allow for process control of size change. Developments incorporated into the Committee Draft (CD) of the standard, as well as the assessment of the repeatability and reproducibility of the test method, are described here. No pass/fail criteria are given in the standard; rather, a repeatable protocol to quantify the change in dimension is provided to aid those working with encapsulation. The round-robin experiment described here identified that the repeatability and reproducibility of measurements is on the order of 1%. Recent refinements to the test procedure to improve repeatability and reproducibility include: the use of a convection oven to improve the thermal equilibration time constant and its uniformity; well-defined measurement locations to reduce the effects of sampling size (and location) relative to the specimen edges; a standardized sand substrate that may be readily obtained to reduce friction that would otherwise complicate the results; defined specimen sampling, so that material is examined at known sites across the width and length of rolls; and examination of encapsulation at the manufacturer's recommended processing temperature, except when a cross-linking reaction may limit the size change. EVA, for example, should be examined at 100 °C, between its melt transition (occurring up to 80 °C) and the onset of cross

  15. ICUD-0420 Testing high resolution synthetic rainfall time series representing current and future climates on CSO and other indicators

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen; Davidsen, S.; Löwe, Roland


    Combined sewer systems have a long technical lifetime, thus climate change must be taken into consideration when designing new CSO structures, basins, and pipe system enhancements. At the same time, the performance is highly dependent on antecedent conditions in the sewer system and is therefore...... best modelled using LTS. In the present study, we calculate indicators related to CSO statistics using synthetic time series created with different methodologies for both present and future climatic conditions. The methodology for synthetic rainfall generation influences the obtained results along...

  16. Functions of social support and self-verification in association with loneliness, depression, and stress. (United States)

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny


    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  17. A simple quality assurance test tool for the visual verification of light and radiation field congruent using electronic portal images device and computed radiography

    Directory of Open Access Journals (Sweden)

    Njeh Christopher F


    Background: The radiation field on most megavoltage radiation therapy units is shown by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment, so it is imperative that the light field be congruent with the radiation field. Method: A simple quality assurance tool has been designed for rapid and simple testing of the light field and radiation field using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results: Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Congruence of the light and radiation fields could be detected to within 1 mm, which satisfies the American Association of Physicists in Medicine Task Group 142 recommendation of a 2 mm tolerance. Conclusion: The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence.

  18. Reconfigurable system design and verification

    CERN Document Server

    Hsiung, Pao-Ann; Huang, Chun-Hsian


    Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains. The text e

  19. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    methods in the verification task. Today formal verification is finding increasing acceptance ... approaches that are major research issues in formal verification research today. There are four articles in this issue, which show up the different flavours in the approach to formal methods in verification. The first paper by Supratik ...

  20. Verification methodology manual for SystemVerilog

    CERN Document Server

    Bergeron, Janick; Hunter, Alan


    SystemVerilog is a unified language that serves both design and verification engineers by including RTL design constructs, assertions and a rich set of verification constructs. This book is based upon best verification practices by ARM, Synopsys and their customers. It is useful for those involved in the design or verification of a complex chip.

  1. Performance verification and system integration tests of the pulse shape processor for the soft x-ray spectrometer onboard ASTRO-H (United States)

    Takeda, Sawako; Tashiro, Makoto S.; Ishisaki, Yoshitaka; Tsujimoto, Masahiro; Seta, Hiromi; Shimoda, Yuya; Yamaguchi, Sunao; Uehara, Sho; Terada, Yukikatsu; Fujimoto, Ryuichi; Mitsuda, Kazuhisa


    The soft X-ray spectrometer (SXS) aboard ASTRO-H is equipped with dedicated digital signal processing units called pulse shape processors (PSPs). The X-ray microcalorimeter system SXS has 36 sensor pixels, which are operated at 50 mK to measure the heat input of X-ray photons and realize an energy resolution of 7 eV FWHM in the range 0.3-12.0 keV. Front-end signal processing electronics are used to filter and amplify the electrical pulse output from the sensor and for analog-to-digital conversion. The digitized pulses from the 36 pixels are multiplexed and sent to the PSP over low-voltage differential signaling lines. Each of the two identical PSP units consists of an FPGA board, which hosts the hardware logic, and two CPU boards, which run the onboard software. The FPGA board triggers on every pixel event and stores the triggering information as a pulse waveform in the installed memory. The CPU boards read the event data to evaluate pulse heights by an optimal filtering algorithm. The evaluated X-ray photon data (including the pixel ID, energy, and arrival time information) are transferred to the satellite data recorder along with event quality information. The PSP units have been developed and tested with the engineering model (EM) and the flight model. Utilizing the EM PSP, we successfully verified the entire hardware system and the basic software design of the PSPs, including their communication capability and signal processing performance. In this paper, we show the key metrics of the EM test, such as accuracy and synchronicity of sampling clocks, event grading capability, and resultant energy resolution.
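The pulse height evaluation mentioned above can be sketched in simplified form. This is not the flight algorithm: real optimal filtering weights samples by the measured noise spectrum in the frequency domain, while the sketch below only fits a template amplitude by least squares in the time domain, and the template values are invented for illustration.

```python
# Hedged sketch: a simplified, noiseless time-domain analogue of pulse
# height estimation. The amplitude of a known template in a sampled pulse
# is the least-squares fit <samples, template> / <template, template>.

def pulse_height(samples, template):
    """Least-squares amplitude of `template` in `samples`."""
    num = sum(s * t for s, t in zip(samples, template))
    den = sum(t * t for t in template)
    return num / den

template = [0.0, 1.0, 0.6, 0.3, 0.1]      # invented pulse shape
pulse = [3.0 * t for t in template]       # a noiseless pulse of height 3
height = pulse_height(pulse, template)    # recovers ≈ 3.0
```

Weighting this fit by the inverse noise power at each frequency is what turns the template fit into the "optimal filter" that maximizes energy resolution.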

  2. TET-1- A German Microsatellite for Technology On -Orbit Verification (United States)

    Föckersperger, S.; Lattner, K.; Kaiser, C.; Eckert, S.; Bärwald, W.; Ritzmann, S.; Mühlbauer, P.; Turk, M.; Willemsen, P.


    Due to the high safety standards in the space industry, every new product must go through a verification process before qualifying for operation in a space system. Within the verification process the payload undergoes a series of tests which prove that it is in accordance with mission requirements in terms of function, reliability, and safety. Important verification components are the qualification for use on the ground as well as the On-Orbit Verification (OOV), i.e. proof that the product is suitable for use under space conditions (on-orbit). Here it is demonstrated that the product functions under conditions which cannot, or can only partially, be simulated on the ground. The OOV Program of the DLR serves to bridge the gap between the product tested and qualified on the ground and the utilization of the product in space. Due to regular and short-term availability of flight opportunities, industry and research facilities can verify their latest products under space conditions and demonstrate their reliability and marketability. The Technologie-Erprobungs-Träger TET (Technology Experiments Carrier) comprises the core elements of the OOV Program. A programmatic requirement of the OOV Program is that a satellite bus already verified in orbit be used in the first segment of the program. An analysis of suitable satellite buses showed that a realization of the TET satellite bus based on the BIRD satellite bus fulfilled the programmatic requirements best. Kayser-Threde was selected by DLR as Prime Contractor to perform the project together with its major subcontractors: Astro- und Feinwerktechnik, Berlin, for the platform development, and DLR-GSOC for the ground segment development. TET is now designed to be a modular and flexible micro-satellite for any orbit between 450 and 850 km altitude and inclination between 53° and SSO. With an overall mass of 120 kg, TET is able to accommodate experiments of up to 50 kg. A multipurpose payload supply system

  3. Assessing the Value-Added by the Environmental Testing Process with the Aide of Physics/Engineering of Failure Evaluations (United States)

    Cornford, S.; Gibbel, M.


    NASA's Code QT Test Effectiveness Program is funding a series of applied research activities focused on utilizing the principles of physics and engineering of failure and those of engineering economics to assess and improve the value-added by the various validation and verification activities to organizations.

  4. Trends in patch-test results and allergen changes in the standard series: a Mayo Clinic 5-year retrospective review (January 1, 2006, to December 31, 2010). (United States)

    Wentworth, Ashley B; Yiannias, James A; Keeling, James H; Hall, Matthew R; Camilleri, Michael J; Drage, Lisa A; Torgerson, Rochelle R; Fett, Debra D; Prakash, Amy V; Scalf, Leigh Ann; Allen, Eve M; Johnson, Janis S; Singh, Nidhi; Nordberg Linehan, Diane L; Killian, Jill M; Davis, Mark D P


    Patch testing is essential for identification of culprits causing allergic contact dermatitis. We sought to identify trends and allergen changes in our standard series during 2006 to 2010, compared with our previous report (2001-2005). We conducted a retrospective review of patch-test results. A total of 3115 patients were tested with a mean of 73.0 allergens. Since our prior report, 8 allergens were added to the standard series; 14 were deleted. Significantly higher rates of allergic positive reaction were documented for carba mix, 3%, and Disperse Orange 3, 1%. Rates were lower for 10 allergens: neomycin sulfate, 20%; gold sodium thiosulfate, 0.5%; hexahydro-1,3,5-tris(2-hydroxyethyl)triazine, 1%; disperse blue 124, 1%; disperse blue 106, 1%; diazolidinyl urea, 1%; hexylresorcinol, 0.25%; diazolidinyl urea, 1% aqueous; 2-bromo-2-nitropropane-1,3-diol, 0.25%; and lidocaine, 5%. Many final patch-test readings for many allergens were categorized as mild reactions (erythema only). Overall allergenicity and irritancy rates declined significantly since our prior report. Results were generally comparable with those in a North American Contact Dermatitis Group report from 2005 to 2006. This was a retrospective study; there is a lack of long-term follow-up. Since our previous report, our standard series composition has changed, and overall rates of allergenicity and irritancy have decreased. Notably, many final patch-test readings showed mild reactions. Copyright © 2013 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
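The abstract above reports rates that were "significantly higher" or "lower" between the two reporting periods. A standard way to judge such differences is a two-proportion z-test; the sketch below uses invented counts, not data from the study.

```python
import math

# Hypothetical sketch (counts invented): a two-proportion z-test of the
# kind used to call a change in patch-test positivity rates significant.

def two_proportion_z(x1, n1, x2, n2):
    """Z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 60/300 positive reactions in one period vs 30/300 in another: z is
# roughly 3.4, beyond the 1.96 two-sided 5% cutoff.
z = two_proportion_z(60, 300, 30, 300)
```

With the large allergen panels used in such reviews, a correction for multiple comparisons would normally accompany per-allergen tests like this.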

  5. SERI Desiccant Cooling Test Facility. Status report. Preliminary data on the performance of a rotary parallel-passage silica-gel dehumidifier

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, K.J.


    This report describes the SERI Desiccant Cooling Test Facility. The facility can test bench-scale rotary dehumidifiers over a wide range of controlled conditions. We constructed and installed in the test loop a prototype parallel-passage rotary dehumidifier that has spirally wound polyester tape coated with silica gel. The initial tests gave satisfactory results indicating that approximately 90% of the silica gel was active and the overall Lewis number of the wheel was near unity. The facility has several minor difficulties including an inability to control humidity satisfactorily and nonuniform and highly turbulent inlet velocities. To completely validate the facility requires a range of dehumidifier designs. Several choices are available including constructing a second parallel-passage dehumidifier with the passage spacing more uniform.

  6. Intelligence Testing and Minority Students: Foundations, Performance Factors, and Assessment Issues. Racial and Ethnic Minority Psychology Series. (United States)

    Valencia, Richard R.; Suzuki, Lisa A.

    This book examines intelligence assessment among ethnic minority children. Part 1, "Foundations," includes: (1) "Historical Issues" (e.g., emergence of intelligence testing in Europe and ideology of the intelligence testing movement); and (2) "Multicultural Perspective of Intelligence: Theory and Measurement Issues"…

  7. Numident Online Verification Utility (NOVU) (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  8. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer


    to the high complexity of both the dynamical system and the specification. Therefore, there is a need for methods capable of verifying complex specifications of complex systems. The verification of high dimensional continuous dynamical systems is the key to verifying general systems. In this thesis......, an abstraction approach is taken to the verification problem. A method is developed for abstracting continuous dynamical systems by timed automata. This method is based on subdividing the state space into cells by use of subdivisioning functions that are decreasing along the vector field. To allow....... It is shown that dual decomposition can be applied on the problem of generating barrier certificates, resulting in a compositional formulation of the safety verification problem. This makes the barrier certificate method applicable to the verification of high dimensional systems, but at the cost...
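
    The barrier-certificate idea mentioned above can be illustrated with a toy example: for a dynamical system, a function B(x) that is non-positive on the initial set, positive on the unsafe set, and non-increasing along the vector field certifies that trajectories starting in the initial set never reach the unsafe set. Below is a minimal sketch (the one-dimensional system and the candidate B are illustrative, not from the thesis) that checks the three conditions numerically on a grid:

```python
def f(x):
    """Toy vector field: x' = -x + x**3 / 8 (illustrative system)."""
    return -x + x**3 / 8.0

def B(x):
    """Candidate barrier: negative inside |x| < 1, positive outside."""
    return x**2 - 1.0

def dB_dt(x):
    """Lie derivative of B along f: dB/dt = B'(x) * f(x)."""
    return 2.0 * x * f(x)

def check_barrier(n=2001):
    """Check the barrier conditions on a grid over [-2, 2]:
    B <= 0 on the initial set |x| <= 0.5, B > 0 on the unsafe
    boundary |x| = 2, and dB/dt <= 0 on the whole domain."""
    xs = [-2.0 + 4.0 * i / (n - 1) for i in range(n)]
    init_ok = all(B(x) <= 0.0 for x in xs if abs(x) <= 0.5)
    unsafe_ok = B(-2.0) > 0.0 and B(2.0) > 0.0
    decrease_ok = all(dB_dt(x) <= 1e-12 for x in xs)
    return init_ok and unsafe_ok and decrease_ok
```

In practice barrier certificates are found by convex optimization (e.g. sum-of-squares programming) rather than grid checking; the grid version only illustrates the conditions being verified, and the dual decomposition discussed in the thesis addresses how such searches scale to high dimensions.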

  9. AGARD Flight Test Series. Volume 10. Weapon Delivery Analysis and Ballistic Flight Testing (L’Analyse du Largage d’Armes et les en Vol Balistique). (United States)


    …a working group on flight test techniques was re-established to carry out this task. The monographs in this series (with the exception of the … this volume are listed below. AGARD can be proud that these qualified individuals kindly agreed to share their knowledge and …innovations in the field of flight test instrumentation led, in 1968, to the re-establishment of the working group on flight test instrumentation

  10. Structured Programming Series. Volume 15. Validation and Verification Study (United States)


    …inspection is the moderator. He need not be a technical expert, but he must manage the inspection team and provide leadership. He must use … The algorithm timer is most often used by the designers of products. There are several other good descriptions of this technique referenced in … Moderator: the key person in a successful inspection. He need not be a technical expert, but he must manage the inspection team and offer leadership. He

  11. Development and verification of an agent-based model of opinion leadership. (United States)

    Anderson, Christine A; Titler, Marita G


    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model was performing, consistent with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The
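
    A parameter sweep of the kind described, in which the model input variables are adjusted systematically over all combinations, can be sketched as follows. The parameter names, levels, and the stub model below are hypothetical stand-ins, not the actual Anderson-Titler model; the grid is merely sized to yield 288 scenarios as in the study:

```python
import itertools
import random

# Hypothetical parameter grid; names and levels are illustrative only.
# The product of the level counts is 2*3*4*3*2*2 = 288 scenarios.
GRID = {
    "n_nurses":         [20, 40],
    "mean_credibility": [0.25, 0.50, 0.75],
    "belief_strength":  [0.2, 0.4, 0.6, 0.8],
    "seek_threshold":   [0.3, 0.5, 0.7],
    "uncertainty":      [0.1, 0.2],
    "ticks":            [50, 100],
}

def run_model(params, seed=0):
    """Stub for one simulation run; returns a toy outcome measure in [0, 1]
    (a stand-in for, e.g., the fraction of agents emerging as opinion leaders)."""
    rng = random.Random(seed)
    base = params["belief_strength"] * params["mean_credibility"]
    return max(0.0, min(1.0, base + rng.gauss(0.0, 0.01)))

def parameter_sweep(grid):
    """Run the model once for every combination of input levels."""
    keys = list(grid)
    results = []
    for values in itertools.product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        results.append((params, run_model(params)))
    return results

results = parameter_sweep(GRID)  # one (params, output) pair per scenario
```

The simulated outputs for each scenario would then be analyzed statistically, as in the study, to check that the model behaves consistently with the posited relationships.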

  12. Two-phase flow experiments on Counter-Current Flow Limitation in a model of the hot leg of a pressurized water reactor (2015 test series)

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, Matthias; Lucas, Dirk; Pietruske, Heiko; Szalinski, Lutz


    Counter-Current Flow Limitation (CCFL) is of importance for PWR safety analyses in several accident scenarios connected with loss of coolant. Based on the experience obtained during a first series of hot leg tests, new experiments on counter-current flow limitation were conducted in the TOPFLOW pressure vessel. The test series comprises air-water tests at 1 and 2 bar as well as steam-water tests at 10, 25 and 50 bar. During the experiments the flow structure was observed along the hot leg model using a high-speed camera and web-cams. In addition, pressure was measured at several positions along the horizontal part, and the water levels in the reactor-simulator and steam-generator-simulator tanks were determined. This report documents the experimental setup, including the description of operational and special measuring techniques, the experimental procedure and the data obtained. From these data, flooding curves were derived based on the Wallis parameter. The results show a slight shift of the curves in dependence on the pressure, as well as a slight decrease of the slope with increasing pressure. Additional investigations concern hysteresis effects and the frequencies of liquid slugs; the latter depend on pressure and the mass flow rate of the injected water. The data are available for CFD model development and validation.
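
    The Wallis parameter used to reduce flooding data of this kind is the non-dimensional superficial velocity j*_k = j_k · sqrt(rho_k / (g · D · (rho_l − rho_g))). A minimal sketch in Python, with illustrative fluid properties and a generic Wallis-type flooding correlation (the constants m and C are fitted to data in practice; the numbers below are not from the TOPFLOW tests):

```python
import math

def wallis_parameter(j, rho_phase, rho_liquid, rho_gas, D, g=9.81):
    """Non-dimensional superficial velocity:
    j* = j * sqrt(rho_phase / (g * D * (rho_l - rho_g)))."""
    return j * math.sqrt(rho_phase / (g * D * (rho_liquid - rho_gas)))

def flooding_C(jg_star, jl_star, m=1.0):
    """Wallis-type flooding correlation: sqrt(jg*) + m * sqrt(jl*) = C."""
    return math.sqrt(jg_star) + m * math.sqrt(jl_star)

# Example: air-water at roughly 1 bar in a D = 0.25 m pipe (illustrative).
rho_l, rho_g, D = 998.0, 1.2, 0.25
jg_star = wallis_parameter(1.00, rho_g, rho_l, rho_g, D)  # gas phase
jl_star = wallis_parameter(0.05, rho_l, rho_l, rho_g, D)  # liquid phase
```

Plotting sqrt(jl*) against sqrt(jg*) for the measured flooding points yields the flooding curves mentioned in the report.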

  13. Biometric verification of a subject through eye movements. (United States)

    Juhola, Martti; Zhang, Youming; Rasku, Jyrki


    Biometric verification of persons by matching digital fingerprint, face or iris images has advanced. Notwithstanding this progress, it is no easy computational task because of the great volume of complicated data. Since the 1990s, eye movements, previously applied only in various medical and psychological tests, have also been studied for computer interfaces. Such a short one-dimensional measurement signal contains less data than images and may therefore be simpler and faster to recognize. Using saccadic eye movements, we developed a computational verification method to reliably distinguish a legitimate person, or a subject in general, from others. We tested features extracted from signals recorded from saccadic eye movements: saccades of 19 healthy subjects and 21 otoneurological patients recorded with electro-oculography, and an additional 40 healthy subjects recorded with a video camera system. Verification tests produced high accuracies. Copyright © 2012 Elsevier Ltd. All rights reserved.
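
    Typical features extracted from a saccadic eye-movement signal include amplitude, duration, and peak velocity. The sketch below computes these from a synthetic one-dimensional position trace; the signal shape, sampling rate, and thresholds are illustrative assumptions, not the authors' protocol:

```python
import math

FS = 1000  # sampling rate in Hz (illustrative)

def synthetic_saccade(amplitude_deg=10.0, duration_s=0.05, n=200):
    """Sigmoid-shaped eye-position trace (degrees) with a saccade mid-record."""
    s = duration_s * FS / 8.0  # sigmoid time scale in samples
    return [amplitude_deg / (1.0 + math.exp(-(i - n // 2) / s))
            for i in range(n)]

def saccade_features(signal, fs=FS):
    """Amplitude (deg), peak velocity (deg/s), and duration (s) of a saccade."""
    velocity = [(b - a) * fs for a, b in zip(signal, signal[1:])]
    peak_velocity = max(abs(v) for v in velocity)
    amplitude = signal[-1] - signal[0]
    # Duration: time where |velocity| exceeds 10% of its peak (simple criterion).
    active = sum(1 for v in velocity if abs(v) > 0.1 * peak_velocity)
    return {"amplitude": amplitude,
            "peak_velocity": peak_velocity,
            "duration": active / fs}
```

Feature vectors of this kind, computed per saccade, would then feed a matching step that accepts or rejects the claimed identity.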

  14. FEFTRA {sup TM} verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)


    FEFTRA is a finite element program package developed at VTT for the analyses of groundwater flow in Posiva's site evaluation programme, which seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of the verification of the code. In 2006 a project was launched with the objective of reorganising all the material related to the existing verification cases and placing it in the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LaTeX document preparation system and added into the testing system in such a way that the whole test documentation (this report) can easily be generated in PostScript or PDF format. The current report is the updated version of the verification report published in 2007. At the moment the report mainly includes the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. the simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to the analytical, semianalytical and/or other numerical solutions demonstrates the capability of FEFTRA to simulate such problems
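
    The automated testing system described, which recalculates selected cases, checks new results against old approved results, and constructs a summary of the test run, can be sketched as follows (the case names, tolerances, and data layout are hypothetical, not FEFTRA's actual harness):

```python
import math

def compare_results(new, approved, rel_tol=1e-6, abs_tol=1e-9):
    """Compare a vector of new results to approved reference values."""
    if len(new) != len(approved):
        return False, "length mismatch"
    worst = max(abs(n - a) for n, a in zip(new, approved))
    ok = all(math.isclose(n, a, rel_tol=rel_tol, abs_tol=abs_tol)
             for n, a in zip(new, approved))
    return ok, f"max abs deviation = {worst:.3e}"

def run_test_suite(cases):
    """cases: {name: (new_results, approved_results)} -> {name: (status, detail)}."""
    summary = {}
    for name, (new, approved) in sorted(cases.items()):
        ok, detail = compare_results(new, approved)
        summary[name] = ("PASS" if ok else "FAIL", detail)
    return summary

# Example: one case matching its approved reference, one deviating.
summary = run_test_suite({
    "head_steady": ([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]),
    "salinity":    ([1.0, 2.0],      [1.0, 2.5]),
})
```

In a real setup the new and approved result vectors would be read from files under version control, and the summary would be rendered into the generated test documentation.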

  15. Operational Overview for Unmanned Aircraft Systems (UAS) Integration in the National Airspace (NAS) Project Flight Test Series 3 (United States)

    Valkov, Steffi


    This presentation is a high-level overview of the flight testing that took place in 2015 for the UAS-NAS project. All topics are discussed at a high level; no technical details are provided.

  16. Global survey of malaria rapid diagnostic test (RDT) sales, procurement and lot verification practices: assessing the use of the WHO-FIND Malaria RDT Evaluation Programme (2011-2014). (United States)

    Incardona, Sandra; Serra-Casas, Elisa; Champouillon, Nora; Nsanzabana, Christian; Cunningham, Jane; González, Iveth J


    Malaria rapid diagnostic tests (RDTs) play a critical role in malaria case management, and assurance of quality is a key factor to promote good adherence to test results. Since 2007, the World Health Organization (WHO) and the Foundation for Innovative New Diagnostics (FIND) have coordinated a Malaria RDT Evaluation Programme, comprising a pre-purchase performance evaluation (product testing, PT) and a pre-distribution quality control of lots (lot testing, LT), the former being the basis of WHO recommendations for RDT procurement. Comprehensive information on malaria RDTs sold worldwide based on manufacturers' data and linked to independent performance data is currently not available, and detailed knowledge of procurement practices remains limited. The use of the PT/LT Programme results as well as procurement and lot verification practices were assessed through a large-scale survey, gathering product-specific RDT sales and procurement data (2011-14 period) from a total of 32 manufacturers, 12 procurers and 68 National Malaria Control Programmes (NMCPs). Manufacturers' reports showed that RDT sales had more than doubled over the four years, and confirmed a trend towards increased compliance with the WHO procurement criteria (from 83% in 2011 to 93% in 2014). Country-level reports indicated that 74% of NMCPs procured only 'WHO-compliant' RDT products, although procurers' transactions datasets revealed a surprisingly frequent overlap of different products and even product types (e.g., Plasmodium falciparum-only and Plasmodium-pan) in the same year and country (60 and 46% of countries, respectively). Importantly, the proportion of 'non-complying' (i.e., PT low scored or not evaluated) products was found to be higher in the private health care sector than in the public sector (32% vs 5%), and increasing over time (from 22% of private sector sales in 2011 to 39% in 2014). An estimated 70% of the RDT market was covered by the LT programme. The opinion about the PT

  17. Flying Qualities Flight Testing of Digital Flight Control Systems. Flight Test Techniques Series - Volume 21 (les Essais en vol des performances des systemes de ommande de vol numeriques) (United States)


    …the control paths. These build tests utilize an automated ground test facility known as the Automatic Test Equipment (ATE), which contains its … their safety. Translator's note: in the second paragraph the author places heavy emphasis on test preparation and the analysis of … automatically switched functions are operated. All onboard transmitters are exercised across their frequency range at normal and, where possible, at

  18. Solid waste operations complex engineering verification program plan

    Energy Technology Data Exchange (ETDEWEB)

    Bergeson, C.L.


    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP.

  19. Verification of computational models of cardiac electro-physiology. (United States)

    Pathmanathan, Pras; Gray, Richard A


    For computational models of cardiac activity to be used in safety-critical clinical decision-making, thorough and rigorous testing of the accuracy of predictions is required. The field of 'verification, validation and uncertainty quantification' has been developed to evaluate the credibility of computational predictions. The first stage, verification, is the evaluation of how well computational software correctly solves the underlying mathematical equations. The aim of this paper is to introduce novel methods for verifying multi-cellular electro-physiological solvers, a crucial first stage for solvers to be used with confidence in clinical applications. We define 1D-3D model problems with exact solutions for each of the monodomain, bidomain, and bidomain-with-perfusing-bath formulations of cardiac electro-physiology, which allow for the first time the testing of cardiac solvers against exact errors on fully coupled problems in all dimensions. These problems are carefully constructed so that they can be easily run using a general solver and can be used to greatly increase confidence that an implementation is correct, which we illustrate by testing one major solver, 'Chaste', on the problems. We then perform case studies on calculation verification (also known as solution verification) for two specific applications. We conclude by making several recommendations regarding verification in cardiac modelling. Copyright © 2013 John Wiley & Sons, Ltd.

  20. Unspecific chronic low back pain – a simple functional classification tested in a case series of patients with spinal deformities

    Directory of Open Access Journals (Sweden)

    Werkmann Mario


    Full Text Available Abstract Background Up to now, chronic low back pain without radicular symptoms is not classified and is referred to in the international literature as "unspecific". For specific bracing of this patient group we use simple physical tests to predict the brace type the patient is most likely to benefit from. Based on these physical tests we have developed a simple functional classification of "unspecific" low back pain in patients with spinal deformities. Methods Between January 2006 and July 2007 we tested 130 patients (116 females and 14 males) with spinal deformities (average age 45 years, range 14 to 69 years) and chronic unspecific low back pain (pain for > 24 months) along with the indication for brace treatment. Some of the patients had symptoms of spinal claudication (n = 16). The "sagittal realignment test" (SRT), a lumbar hyperextension test, and the "sagittal delordosation test" (SDT) were applied. Additionally, 3 female patients with spondylolisthesis were tested, including one with symptoms of spinal claudication; 2 of these patients were 14 years of age and the other 43 years old at the time of testing. Results 117 patients reported significant pain relief in the SRT and 13 in the SDT (>/= 2 steps in the Roland & Morris VRS). 3 patients had no significant pain relief in either test. Pain intensity was high (3.29) before performing the physical tests (VRS scale 0-5) and low (1.37) while performing them, for the whole sample of patients. The differences were highly significant in the Wilcoxon test (z = -3.79; p ...). In the 16 patients who did not respond to the SRT in the manual investigation we found hypermobility at L5/S1 or a spondylolisthesis at level L5/S1. In the other patients, who responded well to the SRT, loss of lumbar lordosis was the main issue, a finding which, according to the scientific literature, correlates well with low back pain. The 3 patients who did not

  1. An omnibus likelihood test statistic and its factorization for change detection in time series of polarimetric SAR data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning


    Based on an omnibus likelihood ratio test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution with an associated p-value and a factorization of this test statistic, change analysis in a short sequence of multilook, polarimetric SAR data...... in the covariance matrix representation is carried out. The omnibus test statistic and its factorization detect if and when change(s) occur. The technique is demonstrated on airborne EMISAR L-band data but may be applied to Sentinel-1, Cosmo-SkyMed, TerraSAR-X, ALOS and RadarSat-2 or other dual- and quad-/full-pol, and even single-pol, data....
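
    For k covariance matrices X_i following a Wishart distribution with n looks each, the omnibus statistic for equality is Q = k^{pkn} (prod_i |X_i|^n) / |sum_i X_i|^{kn}, so that ln Q = n (p k ln k + sum_i ln|X_i| − k ln|sum_i X_i|); Q = 1 (ln Q = 0) when all matrices are equal and Q < 1 otherwise. The sketch below implements this for real-valued 2x2 matrices as a simplification (the SAR data in the paper are complex Wishart, and the paper additionally derives the p-value and the factorization):

```python
import math

def det2(m):
    """Determinant of a 2x2 matrix given as nested lists."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def madd(a, b):
    """Elementwise sum of two 2x2 matrices."""
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

def ln_q_omnibus(mats, n_looks, p=2):
    """ln Q = n (p k ln k + sum_i ln|X_i| - k ln|sum_i X_i|)."""
    k = len(mats)
    total = mats[0]
    for m in mats[1:]:
        total = madd(total, m)
    sum_log_dets = sum(math.log(det2(m)) for m in mats)
    return n_looks * (p * k * math.log(k)
                      + sum_log_dets
                      - k * math.log(det2(total)))
```

Large negative ln Q (small Q) is evidence against equality, i.e. evidence that a change occurred somewhere in the sequence; the factorization then localizes when.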

  2. Test of 6-in. -thick pressure vessels. Series 4: intermediate test vessels V-5 and V-9 with inside nozzle corner cracks. [BWR and PWR

    Energy Technology Data Exchange (ETDEWEB)

    Merkle, J.G.; Robinson, G.C.; Holz, P.P.; Smith, J.E.


    Failure testing is described for two 99-cm-diam (39-in.), 15.2-cm-thick (6-in.) steel pressure vessels, each containing one flawed nozzle. Vessel V-5 was tested at 88 °C (190 °F) and failed by leaking without fracturing after extensive stable crack growth. Vessel V-9 was tested at 25 °C (75 °F) and failed by fracturing. Material properties measured before the tests were used for pretest and posttest fracture analyses. Test results supported by analysis indicate that inside nozzle corner cracks are not subject to plane strain under pressure loading. The preparation of inside nozzle corner cracks is described in detail. Extensive experimental data are tabulated and plotted.

  3. Infinite series

    CERN Document Server

    Hirschman, Isidore Isaac


    This text for advanced undergraduate and graduate students presents a rigorous approach that also emphasizes applications. Encompassing more than the usual amount of material on the problems of computation with series, the treatment offers many applications, including those related to the theory of special functions. Numerous problems appear throughout the book. The first chapter introduces the elementary theory of infinite series, followed by a relatively complete exposition of the basic properties of Taylor series and Fourier series. Additional subjects include series of functions and the app

  4. Design Specification and Verification

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen


    This is chapter 7 in an edited book that presents a number of issues of fundamental importance for the design of integrated hardware software products such as embedded, communication, and multimedia systems. This book is the result of a series of Ph.D. courses on hardware/software co-design held...

  5. An on-road shock and vibration response test series utilizing worst case and statistical analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Cap, J.S. [Sandia National Labs., Albuquerque, NM (US). Mechanical and Thermal Environments Dept.


    Defining the maximum expected shock and vibration responses for an on-road truck transportation environment is strongly dependent on the amount of response data that can be obtained. One common test scheme consists of measuring response data over a relatively short prescribed road course and then reviewing that data to obtain the maximum response levels. The more mathematically rigorous alternative is to collect an unbiased ensemble of response data during a long road trip. This paper compares data gathered both ways during a recent on-road certification test for a tractor trailer van being designed by Sandia.
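
    The two schemes can be contrasted in a few lines: taking the maximum response over a short prescribed course versus computing a statistical bound (for example, a normal tolerance bound) from a long-trip ensemble. The synthetic data and the mean + k·sigma bound below are illustrative only, not Sandia's method or data:

```python
import random
import statistics

random.seed(42)  # reproducible synthetic data

# Synthetic ensemble of peak accelerations (g) gathered over a long road trip.
ensemble = [abs(random.gauss(3.0, 0.8)) for _ in range(500)]

# Scheme 1: maximum response observed over a short prescribed road course
# (here simply the first 20 ensemble members stand in for the short course).
short_course_max = max(ensemble[:20])

# Scheme 2: a statistical bound from the full unbiased ensemble, e.g. a
# normal bound mean + k*sigma (k = 2.33 for roughly the 99th percentile
# of a normal distribution).
mu = statistics.fmean(ensemble)
sigma = statistics.stdev(ensemble)
bound_99 = mu + 2.33 * sigma
```

The short-course maximum depends strongly on which stretch of road happened to be driven, while the ensemble bound comes with a stated coverage probability; this is the trade-off the paper examines.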

  6. Contact allergy to (meth)acrylates in the dental series in southern Sweden: simultaneous positive patch test reaction patterns and possible screening allergens. (United States)

    Goon, Anthony T J; Isaksson, Marléne; Zimerson, Erik; Goh, Chee Leok; Bruze, Magnus


    Contact allergy to dental allergens is a well-studied subject, more so among dental professionals than dental patients. A total of 1632 subjects had been patch tested with either the dental patient series or the dental personnel series at the Department of Occupational and Environmental Dermatology, Malmö, Sweden. Positive patch tests to (meth)acrylate allergens were seen in 2.3% (30/1322) of the dental patients and 5.8% (18/310) of the dental personnel. The most common allergen for both groups was 2-hydroxyethyl methacrylate (2-HEMA), followed by ethyleneglycol dimethacrylate (EGDMA), triethyleneglycol dimethacrylate, and methyl methacrylate. Of these 48 subjects, 47 (29 dental patients and 18 dental personnel) had positive patch tests to 2-HEMA. All 30 subjects who had a positive reaction to EGDMA had a simultaneous positive reaction to 2-HEMA. One dental patient reacted only to 2,2-bis[4-(2-hydroxy-3-methacryloxypropoxy)phenyl]propane (bis-GMA). From our data, screening for (meth)acrylate contact allergy with 2-HEMA alone would have picked up 96.7% (29/30) of our (meth)acrylate-allergic dental patients and 100% (18/18) of our (meth)acrylate-allergic dental personnel. The addition of bis-GMA for dental patients would increase the pick-up rate to 100%.

  7. Change detection in a time series of polarimetric SAR data by an omnibus test statistic and its factorization

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning


    in the covariance matrix representation is carried out. The omnibus test statistic and its factorization detect if and when change(s) occur. The technique is demonstrated on airborne EMISAR L-band data but may be applied to Sentinel-1, Cosmo-SkyMed, TerraSAR-X, ALOS and RadarSat-2 or other dual- and quad...

  8. Fragmentation of Solid Materials Using Shock Tubes. Part 2: First Test Series in a Large Diameter Shock Tube (United States)


    Figure 2. Typical test setup including shock tube, expansion section, sample mounting and fragment containment structure … architectural precast panel (p = 50 psi, t = 100 msec) as predicted by LS-DYNA …overpressure airblast, creating conditions where material properties (e.g., strength, density) dominate structural response (e.g., bending, flexing

  9. A Life-Size and Near Real-Time Test of Irrigation Scheduling with a Sentinel-2 Like Time Series (SPOT4-Take5) in Morocco

    Directory of Open Access Journals (Sweden)

    Michel Le Page


    Full Text Available This paper describes the setting and results of a real-time experiment of irrigation scheduling by a time series of optical satellite images under real conditions, which was carried out on durum wheat in the Haouz plain (Marrakech, Morocco) during the 2012/13 agricultural season. For the purpose of this experiment, the irrigation of a reference plot was driven by the farmer according to mainly empirical irrigation scheduling, while the test plot irrigations were managed following the FAO-56 method, driven by remote sensing. Images came from the SPOT4 (Take5) data set, which aimed at delivering image time series at a decametric resolution with less than a five-day satellite revisit, similar to the time series the ESA Sentinel-2 satellites will produce in the coming years. With a Root Mean Square Error (RMSE) of 0.91 mm per day, the comparison between daily actual evapotranspiration measured by eddy-covariance and the simulated one is satisfactory, and even better at a five-day integration (0.59 mm per day). Finally, despite a chaotic beginning of the experiment (the experimental plot had not been irrigated to get rid of a slaking crust, which prevented good emergence), our plot caught up and yielded almost the same grain crop with 14% less irrigation water. This experiment opens up interesting opportunities for the operational scheduling of flood irrigation sectors, which dominate in the semi-arid Mediterranean area.
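
    The FAO-56 scheduling logic referred to above amounts to a daily root-zone water balance: crop evapotranspiration ETc = Kc × ET0 depletes the soil reservoir, and an irrigation is triggered when depletion exceeds the readily available water. A minimal sketch with illustrative parameter values (not those of the Haouz experiment):

```python
# Minimal FAO-56-style root-zone water balance for irrigation scheduling.
# All parameter values are illustrative, not those of the experiment.
TAW = 120.0       # total available water in the root zone, mm
RAW = 0.55 * TAW  # readily available water (depletion threshold), mm

def schedule(et0_series, kc_series, rain_series, dose=30.0):
    """Daily water balance: irrigate `dose` mm when depletion exceeds RAW.
    Returns the list of days (0-indexed) on which irrigation is triggered."""
    depletion, events = 0.0, []
    for day, (et0, kc, rain) in enumerate(zip(et0_series, kc_series,
                                              rain_series)):
        etc = kc * et0  # crop evapotranspiration, mm/day
        depletion = max(0.0, depletion + etc - rain)
        if depletion > RAW:
            depletion = max(0.0, depletion - dose)
            events.append(day)
        depletion = min(depletion, TAW)  # excess beyond TAW drains away
    return events

# Example: 60 rain-free days with ET0 = 5 mm/day and mid-season Kc = 1.0.
events = schedule([5.0] * 60, [1.0] * 60, [0.0] * 60)
```

In the experiment, Kc and the depletion bookkeeping were driven by the satellite time series rather than fixed tables, which is what makes near real-time imagery useful for scheduling.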

  10. Developing a standard method for apnea testing in the determination of brain death for patients on venoarterial extracorporeal membrane oxygenation: a pediatric case series. (United States)

    Jarrah, Rima J; Ajizian, Samuel J; Agarwal, Swati; Copus, Scott C; Nakagawa, Thomas A


    The revised guidelines for the determination of brain death in infants and children stress that apnea testing is an integral component in determining brain death based on clinical criteria. Unfortunately, these guidelines provide no process for apnea testing during the determination of brain death in patients supported on venoarterial extracorporeal membrane oxygenation. We review three pediatric patients supported on venoarterial extracorporeal membrane oxygenation who underwent apnea testing during their brain death evaluation. This is the only published report to elucidate a reliable, successful method for apnea testing in pediatric patients supported on venoarterial extracorporeal membrane oxygenation. Retrospective case series. Two tertiary care PICUs in university teaching hospitals. Three pediatric patients supported by venoarterial extracorporeal membrane oxygenation after cardiopulmonary arrest. After neurologic examinations demonstrated cessation of brain function in accordance with current pediatric brain death guidelines, apnea testing was performed on each child while supported on venoarterial extracorporeal membrane oxygenation. In two of the three cases, the patients remained hemodynamically stable with normal oxygen saturations as venoarterial extracorporeal membrane oxygenation sweep gas was weaned and apnea testing was undertaken. Apnea testing demonstrating no respiratory effort was successfully completed in these two cases. The third patient became hemodynamically unstable, invalidating the apnea test. Apnea testing on venoarterial extracorporeal membrane oxygenation can be successfully undertaken in the evaluation of brain death. We provide a suggested protocol for apnea testing while on venoarterial extracorporeal membrane oxygenation that is consistent with the updated pediatric brain death guidelines. 

  11. Aerodynamic and Hydrodynamic Tests of a Family of Models of Flying Hulls Derived from a Streamline Body -- NACA Model 84 Series (United States)

    Parkinson, John B; Olson, Roland E; Draley, Eugene C; Luoma, Arvo A


    A series of related forms of flying-boat hulls representing various degrees of compromise between aerodynamic and hydrodynamic requirements was tested in Langley Tank No. 1 and in the Langley 8-foot high-speed tunnel. The purpose of the investigation was to provide information regarding the penalties in water performance resulting from further aerodynamic refinement and, as a corollary, to provide information regarding the penalties in range or payload resulting from the retention of certain desirable hydrodynamic characteristics. The information should form a basis for over-all improvements in hull form.


    The U.S. Environmental Protection Agency Air Pollution Control Technology (APCT) Verification Center evaluates the performance of baghouse filtration products used primarily to control PM2.5 emissions. This verification statement summarizes the test results for W.L. Gore & Assoc....

  13. 40 CFR 1065.341 - CVS and batch sampler verification (propane check). (United States)


    ... (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications Flow-Related... engineering judgment and safe practices, this check may be performed using a gas other than propane, such as... components. (3) Poor mixing. Perform the verification as described in this section while traversing a...

  14. Seismic Surveillance. Nuclear Test Ban Verification (United States)


    …as the Horn Graben. RFH appears to have been elevated in the Carboniferous, while taphrogenesis has been related to the late Carboniferous - early… SCANDINAVIA: CARBONIFEROUS TO PRESENT The western part of S. Scandinavia has experienced volcanic activity, as shown in Fig. 8. Paleozoic magmatism is… with age ranges from Permian to Tertiary. In summary, evidence of volcanic activity throughout the Carboniferous to the present has been found and is

  15. Seismic Surveillance - Nuclear Test Ban Verification (United States)


    …shear in the lower crust below Skagerrak. Conventional rifting scenarios incorporating magmatic underplating of the Moho are not considered tenable in our… (Fig. 3b). The Skagerrak block is detached along the FFZ (18) (Fig. 1 and 2c), where Permian magmatic activity has been reported (29,20). Finally

  16. Built-in-Test Verification Techniques (United States)


    …are currently underway to develop military applications. The field of artificial intelligence generally includes natural language processing, robotics … because of its applicability being limited to analog circuits. This narrowed the evaluation to the 11Mf and simulation approaches. The current flEA





    Directory of Open Access Journals (Sweden)

    M. Nekouei Shahraki


    Full Text Available The recent advances in the field of computer vision have opened the doors to many opportunities for taking advantage of these techniques and technologies in many fields and applications. The high demand for these systems in today's and future vehicles implies a high production volume of video cameras. These criteria make it critical to design test systems that deliver fast and accurate calibration and optical-testing capabilities. In this paper we introduce a new generation of test-stands delivering high calibration quality in single-shot calibration of fisheye surround-view cameras. This incorporates important geometric features from bundle-block calibration, delivers very high (sub-pixel) calibration accuracy, makes possible a very fast calibration procedure (a few seconds), and realizes autonomous calibration via machines. We have used the geometrical shape of a spherical helix (a 3D spherical spiral) with special geometrical characteristics, having a uniform radius which corresponds to uniform motion. This geometrical feature was mechanically realized using three-dimensional truncated icosahedrons, which practically allow the implementation of a spherical helix on multiple surfaces. Furthermore, the test-stand enables us to perform many other important optical tests, such as stray-light testing, enabling us to evaluate certain qualities of the camera optical module.
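    The spherical-helix target geometry can be sketched with a simple parametrization (an illustrative reconstruction, not the authors' actual design data; the function and parameter names below are invented): the polar angle sweeps from pole to pole while the azimuth completes a fixed number of turns, so every sample point keeps the same uniform radius.

```python
import math

def spherical_helix(radius, turns, n_points):
    """Sample points along a spherical helix (3D spherical spiral).

    The polar angle sweeps from pole to pole while the azimuth completes
    `turns` revolutions; every point lies at the same `radius`, matching
    the uniform-radius property described in the abstract.
    """
    pts = []
    for i in range(n_points):
        t = i / (n_points - 1)            # parameter 0 .. 1
        theta = t * math.pi               # polar angle, pole to pole
        phi = t * turns * 2.0 * math.pi   # azimuth, `turns` revolutions
        pts.append((radius * math.sin(theta) * math.cos(phi),
                    radius * math.sin(theta) * math.sin(phi),
                    radius * math.cos(theta)))
    return pts

# 200 calibration-target samples on a unit sphere, 8 turns
pts = spherical_helix(1.0, turns=8, n_points=200)
```

    Because every point lies on the sphere, such a curve can be realized on the facets of a truncated icosahedron as the abstract describes, while a camera under test sees targets at a constant object distance.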

  19. Code Verification by the Method of Manufactured Solutions

    Energy Technology Data Exchange (ETDEWEB)



    A procedure for code verification by the Method of Manufactured Solutions (MMS) is presented. Although the procedure requires a certain amount of creativity and skill, we show that MMS can be applied to a variety of engineering codes which numerically solve partial differential equations. This is illustrated by detailed examples from computational fluid dynamics. The strength of the MMS procedure is that it can identify any coding mistake that affects the order of accuracy of the numerical method. A set of examples which use a blind-test protocol demonstrates the kinds of coding mistakes that can (and cannot) be exposed via the MMS code verification procedure. The principal advantage of the MMS procedure over traditional methods of code verification is that code capabilities are tested in full generality. The procedure thus results in a high degree of confidence that all coding mistakes which prevent the equations from being solved correctly have been identified.
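    The core of the MMS procedure can be illustrated with a minimal sketch (our own toy example, not drawn from the report's CFD cases): manufacture an exact solution, derive the forcing term analytically, feed it to the solver, and confirm that the observed order of accuracy matches the scheme's formal order — here for a second-order finite-difference solve of u'' = f.

```python
import math

def solve_poisson(f, ua, ub, n):
    """Second-order central-difference solve of u'' = f on [0, 1]
    with Dirichlet boundary values ua, ub, via the Thomas algorithm."""
    h = 1.0 / n
    # interior unknowns u_1..u_{n-1}: (u_{i-1} - 2 u_i + u_{i+1}) = h^2 f_i
    a = [1.0] * (n - 1)                  # sub-diagonal
    b = [-2.0] * (n - 1)                 # diagonal
    c = [1.0] * (n - 1)                  # super-diagonal
    d = [h * h * f(i * h) for i in range(1, n)]
    d[0] -= ua                           # fold boundary values into RHS
    d[-1] -= ub
    for i in range(1, n - 1):            # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)                  # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return [ua] + u + [ub]

# Manufactured solution u(x) = sin(pi x)  =>  forcing f(x) = -pi^2 sin(pi x)
exact = lambda x: math.sin(math.pi * x)
f = lambda x: -math.pi ** 2 * math.sin(math.pi * x)

def max_error(n):
    u = solve_poisson(f, exact(0.0), exact(1.0), n)
    return max(abs(u[i] - exact(i / n)) for i in range(n + 1))

# Halving h should cut the error by ~4 for a correct second-order scheme;
# a coding mistake that degrades the order would break this ratio.
ratio = max_error(20) / max_error(40)
```

    The blind-test idea in the abstract amounts to deliberately seeding bugs and checking which ones destroy this observed convergence ratio.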

  20. Data report on the Waste Isolation Pilot Plant Small-Scale Seal Performance Test, Series F grouting experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, E.H. [Sandia National Labs., Albuquerque, NM (United States); Dale, T.F.; Van Pelt, R.S. [INTERA, Inc., Austin, TX (United States)


    SSSPT-F was designed to evaluate sealing materials at WIPP. It demonstrated: (1) the ability to practically and consistently produce ultrafine cementitious grout at the grouting site, (2) successful, consistent, and efficient injection and permeation of the grout into fractured rock at the repository horizon, (3) the ability of the grout to penetrate and seal microfractures, and (4) the procedures and equipment used to inject the grout. Techniques to assess the effectiveness of the grout in reducing the gas transmissivity of the fractured rock were also evaluated. These included gas-flow/tracer testing, post-grout coring, pre- and post-grout downhole televiewer logging, slab displacement measurements, and increased loading on jacks during grout injection. Pre- and post-grout diamond drill core was obtained for use in ongoing evaluations of grouting effectiveness, degradation, and compatibility. Diamond drill equipment invented for this test successfully prevented drill cuttings from plugging fractures in grout injection holes.

  1. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno


    The project "Verification of the thermal design of electronic equipment" studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the "Cool Electronics" research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish as VTT Julkaisuja 824. 22 refs.
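    One elementary verification step such a methodology covers is a steady-state junction-temperature check against a component rating. The series thermal-resistance model below is standard; the numeric values are hypothetical and purely illustrative.

```python
def junction_temp(t_ambient, power_w, r_jc, r_ca):
    """Steady-state junction temperature from a series thermal-resistance
    model: T_j = T_a + P * (R_jc + R_ca).  Units: degC, W, K/W."""
    return t_ambient + power_w * (r_jc + r_ca)

def thermal_margin(t_ambient, power_w, r_jc, r_ca, t_j_max):
    """Margin between the rated junction limit and the predicted value;
    a negative margin flags a failed thermal-design verification."""
    return t_j_max - junction_temp(t_ambient, power_w, r_jc, r_ca)

# Hypothetical component: 2 W dissipation, R_jc = 5 K/W, R_ca = 20 K/W,
# 40 degC ambient, 125 degC rated junction limit.
margin = thermal_margin(40.0, 2.0, 5.0, 20.0, 125.0)
```

    In a real verification process the computed margin would be cross-checked against the qualification-test measurements the report calls for, not used on its own.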

  2. State of the Art: Signature Biometrics Verification

    Directory of Open Access Journals (Sweden)

    Nourddine Guersi


    Full Text Available This paper presents a comparative analysis of the performance of three estimation algorithms: Expectation Maximization (EM), Greedy EM Algorithm (GEM) and Figueiredo-Jain Algorithm (FJ), based on Gaussian mixture models (GMMs) for signature biometrics verification. The simulation results have shown significant performance achievements. The test performance of EER=5.49% for "EM", EER=5.04% for "GEM" and EER=5.00% for "FJ" shows that the behavioral information scheme of signature biometrics is robust and has a discriminating power, which can be explored for identity authentication.
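    The EM fitting of a Gaussian mixture that underlies all three algorithms can be sketched in one dimension (a generic textbook EM loop, not the paper's implementation; the data and all names are invented):

```python
import math
import random

def em_gmm_1d(data, n_iter=50):
    """Fit a two-component 1D Gaussian mixture with plain EM.
    Returns (weights, means, variances)."""
    # crude initialisation: split the sorted data in half
    srt = sorted(data)
    half = len(srt) // 2
    mu = [sum(srt[:half]) / half, sum(srt[half:]) / (len(srt) - half)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each x
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk + 1e-6
    return w, mu, var

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(300)] + \
       [random.gauss(5.0, 1.0) for _ in range(300)]
w, mu, var = em_gmm_1d(data)
```

    In a verification setting, one GMM would be fitted per enrolled signer and a test signature scored by its likelihood under that model; GEM and FJ differ mainly in how components are added or pruned during fitting.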

  3. Achievement report for fiscal 1998 on the development of superconductor power application technology. 2. Research and development of superconducting wire and superconductive power generator, research of total system, research and development of refrigeration system, and verification test; 1998 nendo chodendo denryoku oyo gijutsu kaihatsu seika hokokusho. 2. Chodendo senzai no kenkyu kaihatsu, chodendo hatsudenki no kenkyu kaihatsu, total system no kenkyu, reito system no kenkyu kaihatsu, jissho shiken

    Energy Technology Data Exchange (ETDEWEB)



    For the slow excitation response type power generator, the rotor and stator of a 70,000 kW-class model are combined and subjected to an on-site verification test, with good results. The rotor is disassembled for inspection, and its members are found to be sound, without any problem in terms of mechanical strength. For the quick excitation response type, a 70,000 kW model is experimentally built and subjected to an on-site verification test after a rotation and excitation test in the factory, and the pilot machine concept design is reviewed. In the study of a total system, efforts continue on the review of the model machine test method, improvement of generator design and analytical methods, development of operating methods, and assessment of the effect of its introduction into the power system. Since a He-refrigerated system is required to exhibit high reliability for application to power equipment and to be capable of continuous long-period operation, a system whose constituents have enhanced reliability and which includes an appropriate redundant system is developed, and a verification study is under way which will continue for more than 10,000 hours. An oil-free low-temperature turbo refrigerator is also described. The latest quick excitation response type rotor is also tested for verification. (NEDO)

  4. Transcutaneous sacral nerve stimulation for intraoperative verification of internal anal sphincter innervation. (United States)

    Kauff, D W; Moszkowski, T; Wegner, C; Heimann, A; Hoffmann, K-P; Krüger, T B; Lang, H; Kneist, W


    The current standard for pelvic intraoperative neuromonitoring (pIONM) is based on intermittent direct nerve stimulation. This study investigated the potential use of transcutaneous sacral nerve stimulation for non-invasive verification of pelvic autonomic nerves. A consecutive series of six pigs underwent low anterior rectal resection. For transcutaneous sacral nerve stimulation, an array of ten electrodes (cathodes) was placed over the sacral foramina (S2 to S4). Anodes were applied on the back, right and left thigh, lower abdomen, and intra-anally. Stimulation using the novel method and the current standard was performed at different phases of the experiments under electromyography of the autonomically innervated internal anal sphincter (IAS). A transcutaneous stimulation-induced increase in IAS activity could be observed in each animal under specific cathode-anode configurations. Of 300 tested configurations, 18 exhibited a change in IAS activity correlated with intentional autonomic nerve damage. The damage resulted in a significant decrease of the relative area under the curve of the IAS frequency spectrum (P<.001). Comparison of the IAS spectra under transcutaneous and direct stimulation revealed no significant difference (after rectal resection: median 5.99 μV•Hz vs 7.78 μV•Hz, P=.12; after intentional nerve damage: median -0.27 μV•Hz vs 3.35 μV•Hz, P=.29). Non-invasive selective transcutaneous sacral nerve stimulation could be used for verification of IAS innervation. © 2017 John Wiley & Sons Ltd.
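    The outcome measure — the area under the curve of a frequency spectrum — can be sketched as follows (a generic illustration with invented signal parameters, not the study's EMG processing pipeline): a drop in signal activity shows up directly as a drop in spectral AUC.

```python
import math

def spectrum_auc(signal, fs, f_lo, f_hi):
    """Area under the magnitude spectrum between f_lo and f_hi (Hz),
    from a plain O(n^2) DFT; stands in for the frequency-spectrum
    AUC measure used in the study."""
    n = len(signal)
    auc = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = -sum(signal[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            auc += math.hypot(re, im) * fs / n   # bin magnitude * bin width
    return auc

fs, n = 200.0, 400
t = [i / fs for i in range(n)]
active = [math.sin(2 * math.pi * 10.0 * x) for x in t]   # 10 Hz tone
quiet = [0.1 * a for a in active]                        # attenuated activity
auc_active = spectrum_auc(active, fs, 5.0, 15.0)
auc_quiet = spectrum_auc(quiet, fs, 5.0, 15.0)
```

    Because the DFT is linear, the attenuated trace yields a proportionally smaller AUC, which is the kind of decrease the study correlated with nerve damage.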

  5. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R


    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  6. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti


    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
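    The monitoring idea — facts from an expanding trace plus forward chaining over Horn clauses in implication form — can be sketched as follows. This is a deliberately naive propositional sketch with an invented example property, not the authors' FLTL monitor construction.

```python
def forward_chain(facts, rules):
    """Exhaustive forward chaining: `rules` holds (body, head) Horn
    clauses in implication form; fire until no new fact is derivable."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)
                changed = True
    return facts

def monitor(trace, rules, violation="VIOLATION"):
    """Single-state monitor over a finite but expanding trace: assert each
    event as a fact, chain, and report the first step with a violation."""
    facts = set()
    for step, event in enumerate(trace):
        facts.add(event)
        facts = forward_chain(facts, rules)
        if violation in facts:
            return step
    return None

# Invented property: a run must never contain both test and flight modes.
rules = [
    (("mode_test", "selftest_ok"), "test_passed"),
    (("mode_test", "mode_flight"), "VIOLATION"),
]
first_bad = monitor(["boot", "mode_test", "selftest_ok", "mode_flight"], rules)
```

    The monitor keeps a single growing fact set and a fixed rule base, mirroring the single-state, fixed-rule-count property the abstract claims, though the real system compiles FLTL formulas into such rules.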

  7. Formal verification of mathematical software (United States)

    Sutherland, D.


    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  8. case series

    African Journals Online (AJOL)


    Conclusions: The concept of “case series” is not well defined in the literature and does not reflect a specific research design. We suggest that a case series should have more than four patients, while four patients or fewer should be reported individually as case reports. Key words: Case report, case series, concept analysis, ...

  9. Summary of Adsorption Capacity and Adsorption Kinetics of Uranium and Other Elements on Amidoxime-based Adsorbents from Time Series Marine Testing at the Pacific Northwest National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Gill, Gary A. [Pacific Northwest National Lab. (PNNL), Sequim, WA (United States). Marine Sciences Lab.; Kuo, Li-Jung [Pacific Northwest National Lab. (PNNL), Sequim, WA (United States). Marine Sciences Lab.; Strivens, Jonathan E. [Pacific Northwest National Lab. (PNNL), Sequim, WA (United States). Marine Sciences Lab.; Wood, Jordana R. [Pacific Northwest National Lab. (PNNL), Sequim, WA (United States). Marine Sciences Lab.; Schlafer, Nicholas J. [Pacific Northwest National Lab. (PNNL), Sequim, WA (United States). Marine Sciences Lab.; Janke, Christopher J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Das, Sadananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mayes, Richard [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Saito, Tomonori [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Suree S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tsouris, Constantinos [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wai, Chien M. [Univ. of Idaho, Moscow, ID (United States); LCW Supercritical Technologies, Seattle, WA (United States); Pan, Horng-Bin [Univ. of Idaho, Moscow, ID (United States)


    The Pacific Northwest National Laboratory (PNNL) has been conducting marine testing of uranium adsorbent materials for the Fuel Resources Program, Department of Energy, Office of Nuclear Energy (DOE-NE) beginning in FY 2012. The marine testing program is being conducted at PNNL’s Marine Sciences Laboratory (MSL), located at Sequim Bay, along the coast of Washington. One of the main efforts of the marine testing program is the determination of adsorption capacity and adsorption kinetics for uranium and selected other elements (e.g. vanadium, iron, copper, nickel, and zinc) for adsorbent materials provided primarily by Oak Ridge National Laboratory (ORNL), but also includes other Fuel Resources Program participants. This report summarizes the major marine testing results that have been obtained to date using time series sampling for 42 to 56 days using either flow-through column or recirculating flume exposures. The major results are highlighted in this report, and the full data sets are appended as a series of Excel spreadsheet files. Over the four year period (2012-2016) that marine testing of amidoxime-based polymeric adsorbents was conducted at PNNL’s Marine Science Laboratory, there has been a steady progression of improvement in the 56-day adsorbent capacity from 3.30 g U/kg adsorbent for the ORNL 38H adsorbent to the current best performing adsorbent prepared by a collaboration between the University of Tennessee and ORNL to produce the adsorbent SB12-8, which has an adsorption capacity of 6.56 g U/kg adsorbent. This nearly doubling of the adsorption capacity in four years is a significant advancement in amidoxime-based adsorbent technology and a significant achievement for the Uranium from Seawater program. The achievements are evident when compared to the several decades of work conducted by the Japanese scientists beginning in the 1980’s (Kim et al., 2013). The best adsorbent capacity reported by the Japanese scientists was 3.2 g U/kg adsorbent for a
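    Adsorbent time-series data of this kind are commonly summarized with a one-site ligand-saturation uptake model. As a hedged illustration (the parameter values below are invented round numbers chosen to land near the reported 56-day capacities, not fitted PNNL results):

```python
def one_site_uptake(t_days, c_max, k_half):
    """One-site ligand-saturation model for adsorbent uptake:
    C(t) = c_max * t / (k_half + t), where k_half is the time (days)
    to reach half of the saturation capacity c_max (g U/kg)."""
    return c_max * t_days / (k_half + t_days)

# Hypothetical parameters for illustration only (not fitted values):
c_max, k_half = 9.0, 21.0
c56 = one_site_uptake(56.0, c_max, k_half)   # capacity after a 56-day exposure
```

    With such a fit, the standard 56-day capacity and the extrapolated saturation capacity can both be read off the same curve, which is how time-series exposures are reduced to the single capacity numbers quoted above.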

  10. Pairwise Identity Verification via Linear Concentrative Metric Learning. (United States)

    Zheng, Lilei; Duffner, Stefan; Idrissi, Khalid; Garcia, Christophe; Baskurt, Atilla


    This paper presents a study of metric learning systems on pairwise identity verification, including pairwise face verification and pairwise speaker verification, respectively. These problems are challenging because the individuals in training and testing are mutually exclusive, and also because of the likely setting of limited training data. For such pairwise verification problems, we present a general framework of metric learning systems and employ the stochastic gradient descent algorithm as the optimization solution. We have studied both similarity metric learning and distance metric learning systems, of either a linear or shallow nonlinear model, under both restricted and unrestricted training settings. Extensive experiments demonstrate that with limited training pairs, learning a linear system on similar pairs only is preferable due to its simplicity and superiority, i.e., it generally achieves competitive performance on both the Labeled Faces in the Wild face dataset and the NIST speaker dataset. It is also found that a pretrained deep nonlinear model helps to improve the face verification results significantly.
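    A linear similarity metric trained by stochastic gradient descent on labeled pairs can be sketched as follows (a toy diagonal bilinear similarity with a hinge loss on synthetic "identities"; this is our own minimal illustration, not the paper's model):

```python
import random

def train_similarity(pairs, dim, epochs=200, lr=0.05):
    """SGD on a diagonal bilinear similarity s(x, y) = sum_i w_i x_i y_i.
    `pairs` holds (x, y, label) with label +1 for a genuine pair and
    -1 for an impostor pair; we minimise the margin loss
    max(0, 1 - label * s(x, y))."""
    w = [0.0] * dim
    for _ in range(epochs):
        random.shuffle(pairs)
        for x, y, label in pairs:
            s = sum(w[i] * x[i] * y[i] for i in range(dim))
            if label * s < 1.0:            # margin violated -> gradient step
                for i in range(dim):
                    w[i] += lr * label * x[i] * y[i]
    return w

def score(w, x, y):
    return sum(w[i] * x[i] * y[i] for i in range(len(w)))

random.seed(1)
# toy identities: feature 0 encodes identity, feature 1 is pure noise
def sample(identity):
    return [identity + random.gauss(0.0, 0.1), random.gauss(0.0, 1.0)]

pairs = []
for _ in range(100):
    a = random.choice([-1.0, 1.0])
    pairs.append((sample(a), sample(a), 1))        # genuine pair
    pairs.append((sample(a), sample(-a), -1))      # impostor pair
w = train_similarity(pairs, dim=2)
```

    After training, the identity-bearing feature dominates the learned weights, so thresholding the similarity score separates genuine from impostor pairs; the paper's point is that such simple linear systems remain competitive when training pairs are scarce.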

  11. Use of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments Across International Borders -Test/QA Plan (United States)

    The Environmental Technology Verification (ETV) – Environmental and Sustainable Technology Evaluations (ESTE) Program conducts third-party verification testing of commercially available technologies that may accomplish environmental program management goals. In this verification...

  12. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka


    Full Text Available The methodology of system requirements verification presented in this paper is a proposal of a practical procedure for reducing some deficiencies of requirements specification. The main problem considered is how to create a complete description of the system requirements without such deficiencies. Verification of the initially defined requirements is based on coloured Petri nets. These nets are useful for testing properties of system requirements such as completeness, consistency and optimality. An example of the lift controller is presented.
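    Checking a requirements property by exploring the reachable markings of a Petri net can be sketched as follows. This uses an ordinary place/transition net rather than a coloured net, and an invented toy cabin-controller model, so only the flavour of the approach is shown.

```python
def enabled(m, t):
    """A transition t = (pre, post) is enabled when every pre-place
    holds enough tokens."""
    pre, _ = t
    return all(m.get(p, 0) >= n for p, n in pre.items())

def fire(m, t):
    """Fire t: consume pre-tokens, produce post-tokens."""
    pre, post = t
    m2 = dict(m)
    for p, n in pre.items():
        m2[p] = m2[p] - n
    for p, n in post.items():
        m2[p] = m2.get(p, 0) + n
    return m2

def key(m):
    return tuple(sorted((p, n) for p, n in m.items() if n))

def reachable(m0, transitions):
    """Exhaustive search of the marking graph from m0."""
    seen = {key(m0): m0}
    stack = [m0]
    while stack:
        m = stack.pop()
        for t in transitions:
            if enabled(m, t):
                m2 = fire(m, t)
                if key(m2) not in seen:
                    seen[key(m2)] = m2
                    stack.append(m2)
    return list(seen.values())

# Invented toy net: a cabin is either at a floor or moving, never both.
transitions = [
    ({"at_floor": 1, "request": 1}, {"moving": 1}),   # depart on a request
    ({"moving": 1}, {"at_floor": 1}),                 # arrive at a floor
]
m0 = {"at_floor": 1, "request": 2}
markings = reachable(m0, transitions)
```

    A consistency requirement such as "the cabin is never simultaneously at a floor and moving" becomes an invariant checked over every reachable marking, which is the kind of property analysis coloured Petri net tools automate.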

  13. A Formal Approach to the Verification of Networks on Chip

    Directory of Open Access Journals (Sweden)

    Schmaltz Julien


    Full Text Available The current technology allows the integration on a single die of complex systems-on-chip (SoCs) that are composed of manufactured blocks (IPs), interconnected through specialized networks on chip (NoCs). IPs have usually been validated by diverse techniques (simulation, test, formal verification) and the key problem remains the validation of the communication infrastructure. This paper addresses the formal verification of NoCs by means of a mechanized proof tool, the ACL2 theorem prover. A metamodel for NoCs has been developed and implemented in ACL2. This metamodel satisfies a generic correctness statement. Its verification for a particular NoC instance is reduced to discharging a set of proof obligations for each one of the NoC constituents. The methodology is demonstrated on a realistic and state-of-the-art design, the Spidergon network from STMicroelectronics.

  14. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.


    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  15. A series of low-altitude aerial radiological surveys of selected regions within Areas 3, 5, 8, 9, 11, 18, and 25 at the Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    Colton, D.P.


    A series of low-altitude aerial radiological surveys of selected regions within Areas 3, 5, 8, 9, 11, 18, and 25 of the Nevada Test Site was conducted from December 1996 through June 1999. The surveys were conducted for the US Department of Energy by the Remote Sensing Laboratory, located in Las Vegas, Nevada, and maintained and operated by Bechtel Nevada. The flights were conducted at a nominal altitude of 15 meters above ground level along a set of parallel flight lines spaced 23 meters apart. The purpose of these low-altitude surveys was to measure, map, and define the areas of americium-241 activity. The americium contamination will be used to determine the areas of plutonium contamination. Americium-241 activity was detected within 8 of the 11 regions. The three regions where americium-241 was not detected were in the inactive Nuclear Rocket Development Station complex in Area 25, which encompassed the Test Cell A and Test Cell C reactor test stands and the Reactor Maintenance Assembly and Disassembly facility.

  16. A microsatellite panel for triploid verification in the abalone Haliotis ...

    African Journals Online (AJOL)

    A method for ploidy verification of triploid and diploid Haliotis midae was developed using molecular microsatellite markers. In all, 30 microsatellite loci were tested in control populations. A final microsatellite multiplex consisting of seven markers was optimised and a complete protocol is reported. This protocol was ...

  17. Algebraic verification of a distributed summation algorithm


    Groote, Jan Friso; Springintveld, J.G.


    In this note we present an algebraic verification of Segall's Propagation of Information with Feedback (PIF) algorithm. This algorithm serves as a nice benchmark for verification exercises (see [2, 13, 8]). The verification is based on the methodology presented in [7] and demonstrates its applicability to distributed algorithms.

  18. Gender verification of female Olympic athletes. (United States)

    Dickinson, Barry D; Genel, Myron; Robinowitz, Carolyn B; Turner, Patricia L; Woods, Gary L


    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Problems include invalid screening tests, failure to understand the problems of intersex, the discriminatory singling out of women based only on laboratory results, and the stigmatization and emotional trauma experienced by individuals screened positive. Genuine sex-impostors have not been uncovered by laboratory-based genetic testing; however, gender verification procedures have resulted in substantial harm to a number of women athletes born with relatively rare genetic abnormalities. Individuals with sex-related genetic abnormalities raised as females have no unfair physical advantage and should not be excluded or stigmatized, including those with 5-alpha-steroid-reductase deficiency, partial or complete androgen insensitivity, and chromosomal mosaicism. In 1990, the International Amateur Athletics Federation (IAAF) called for ending genetic screening of female athletes and in 1992 adopted an approach designed to prevent only male impostors from competing. The IAAF recommended that the "medical delegate" have the ultimate authority in all medical matters, including the authority to arrange for the determination of the gender of the competitor if that approach is judged necessary. The new policy advocated by the IAAF, and conditionally adopted by the International Olympic Committee, protects the rights and privacy of athletes while safeguarding fairness of competition, and the American Medical Association recommends that it become the permanent approach.

  19. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)



    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
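    The top-level integration of per-technology results can be illustrated by the simplest possible combination rule: the probability that at least one subsystem detects an event. This assumes independence between technologies, which a real IVSEM analysis need not, and the numbers below are hypothetical.

```python
def combined_detection(probs):
    """Probability that at least one of several independent sensor
    technologies detects an event: 1 - prod(1 - p_i)."""
    miss = 1.0
    for p in probs:
        miss *= (1.0 - p)
    return 1.0 - miss

# Hypothetical per-technology detection probabilities for a single event
# (e.g. seismic, infrasound, radionuclide):
p = combined_detection([0.6, 0.5, 0.3])
```

    Even mediocre individual subsystems combine into a much stronger integrated system under this rule, which is the synergy effect the model lets users explore quantitatively.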

  20. Formal Verification ...

    Indian Academy of Sciences (India)

    followed by a detailed layout of the various components of the system. The next step is the coding phase, which usually also includes testing of the individual modules that are coded. Coding is followed by testing of the components and successful ...