WorldWideScience

Sample records for filters development verification

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: IN-DRAIN TREATMENT DEVICE. HYDRO INTERNATIONAL UP-FLO™ FILTER

    Science.gov (United States)

    Verification testing of the Hydro International Up-Flo™ Filter with one filter module and CPZ Mix™ filter media was conducted at the Penn State Harrisburg Environmental Engineering Laboratory in Middletown, Pennsylvania. The Up-Flo™ Filter is designed as a passive, modular filtr...

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, BAGHOUSE FILTRATION PRODUCTS, TETRATEC PTFE PRODUCTS, TETRATEX 6212 FILTER SAMPLE

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, FILTRATION GROUP, AEROSTAR FP-98 MINIPLEAT V-BANK FILTER

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the AeroStar FP-98 Minipleat V-Bank Filter air filter for dust and bioaerosol filtration manufactured by Filtration Group. The pressure drop across the filter was 137 Pa clean and 348 Pa ...

  4. DNN Filter Bank Cepstral Coefficients for Spoofing Detection

    DEFF Research Database (Denmark)

    Yu, Hong; Tan, Zheng-Hua; Zhang, Yiming

    2017-01-01

    With the development of speech synthesis techniques, automatic speaker verification systems face the serious challenge of spoofing attacks. To improve the reliability of speaker verification systems, we develop a new filter bank-based cepstral feature, deep neural network (DNN) filter bank...... cepstral coefficients, to distinguish between natural and spoofed speech. The DNN filter bank is automatically generated by training a filter bank neural network (FBNN) using natural and synthetic speech. By adding restrictions on the training rules, the learned weight matrix of the FBNN is band-limited...... and sorted by frequency, similar to a normal filter bank. Unlike a manually designed filter bank, the learned filter bank has different filter shapes in different channels, which can capture the differences between natural and synthetic speech more effectively. The experimental results on the ASVspoof...
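    The pipeline this record describes — filter-bank energies followed by cepstral analysis — can be sketched in a few lines. The triangular filters below are a hand-built stand-in (the paper learns the filter shapes with an FBNN); all parameter values are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def dct2(x):
    # orthonormal DCT-II, used to decorrelate log filter-bank energies
    n = len(x)
    k = np.arange(n)
    basis = np.cos(np.pi / n * (np.arange(n)[:, None] + 0.5) * k[None, :])
    coef = basis.T @ x * np.sqrt(2.0 / n)
    coef[0] /= np.sqrt(2.0)
    return coef

def filterbank_cepstra(power_spectrum, fb_weights, n_ceps=13):
    """Cepstral features from a (possibly learned) filter-bank weight matrix.

    fb_weights: (n_filters, n_fft_bins) non-negative matrix; in the paper's
    FBNN this matrix is learned and constrained to be band-limited -- here we
    simply treat it as given.
    """
    energies = fb_weights @ power_spectrum        # per-channel energies
    log_e = np.log(np.maximum(energies, 1e-10))   # dynamic-range compression
    return dct2(log_e)[:n_ceps]                   # cepstral coefficients

# toy example: 20 triangular filters over 257 FFT bins
n_filt, n_bins = 20, 257
centers = np.linspace(10, n_bins - 10, n_filt)
bins = np.arange(n_bins)
fb = np.maximum(0.0, 1.0 - np.abs(bins[None, :] - centers[:, None]) / 10.0)
spec = np.abs(np.fft.rfft(np.random.randn(512))) ** 2
ceps = filterbank_cepstra(spec, fb)
print(ceps.shape)  # (13,)
```

    Replacing the hand-built `fb` matrix with a learned, band-limited weight matrix is the paper's contribution; the surrounding log/DCT machinery is standard cepstral processing.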

  5. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report the results of the software qualification.

  6. The development of the spatially correlated adjustment wavelet filter for atomic force microscopy data.

    Science.gov (United States)

    Sikora, Andrzej; Rodak, Aleksander; Unold, Olgierd; Klapetek, Petr

    2016-12-01

    This paper presents a novel approach to the practical use of a 2D wavelet filter for removing artifacts from atomic force microscopy measurement results. Additional data, such as the summed photodiode signal map, are used to identify the areas requiring processing, to optimize the filtering settings, and to verify the performance of the process. This approach allows an average user to adjust the filtering parameters, whereas the straightforward method requires expertise in the field. The procedure was implemented as a function in the Gwyddion software. Examples of filtering phase imaging and Electrostatic Force Microscopy measurement results are presented. Because the wavelet filter can remove local artifacts, its efficiency is superior to a comparable approach based on a 2D Fast Fourier Transform (2D FFT) filter. Copyright © 2016 Elsevier B.V. All rights reserved.
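    The core idea — wavelet-domain detail suppression applied only where an auxiliary signal map flags artifacts — can be illustrated with a minimal one-level Haar transform. This is a hedged sketch, not the Gwyddion implementation; the mask, threshold, and test image are made up:

```python
import numpy as np

def haar2(img):
    # one-level 2D Haar transform (image sides must be even)
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    return (a + b + c + d) / 4, (a - b + c - d) / 4, (a + b - c - d) / 4, (a - b - c + d) / 4

def ihaar2(ll, lh, hl, hh):
    # exact inverse of haar2
    out = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll - lh + hl - hh
    out[1::2, 0::2] = ll + lh - hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out

def masked_wavelet_clean(img, mask, thresh):
    """Zero small detail coefficients, then keep the filtered result only
    where `mask` (derived from an auxiliary map, e.g. the summed photodiode
    signal) flags an artifact; elsewhere the original data are untouched."""
    ll, lh, hl, hh = haar2(img)
    lh, hl, hh = [np.where(np.abs(band) < thresh, 0.0, band) for band in (lh, hl, hh)]
    return np.where(mask, ihaar2(ll, lh, hl, hh), img)

img = np.zeros((8, 8))
img[3, 3] = 1.0                      # a local spike artifact
mask = np.zeros((8, 8), dtype=bool)
mask[2:5, 2:5] = True                # region flagged by the auxiliary signal
out = masked_wavelet_clean(img, mask, thresh=0.3)
print(img[3, 3], out[3, 3])          # spike is damped inside the mask
```

    Restricting the blend to the masked region is what makes the filter "spatially correlated": data outside the flagged areas pass through untouched, which a plain 2D FFT filter cannot guarantee.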

  7. The development of the spatially correlated adjustment wavelet filter for atomic force microscopy data

    Energy Technology Data Exchange (ETDEWEB)

    Sikora, Andrzej, E-mail: sikora@iel.wroc.pl [Electrotechnical Institute, Division of Electrotechnology and Materials Science, M. Skłodowskiej-Curie 55/61, 50-369 Wrocław (Poland); Rodak, Aleksander [Faculty of Electronics, Wrocław University of Technology, Janiszewskiego 11/17, 50-372 Wrocław (Poland); Unold, Olgierd [Institute of Computer Engineering, Control and Robotics, Faculty of Electronics, Wrocław University of Technology, Janiszewskiego 11/17, 50-372 Wrocław (Poland); Klapetek, Petr [Czech Metrology Institute, Okružní 31, 638 00 Brno (Czech Republic)

    2016-12-15

    This paper presents a novel approach to the practical use of a 2D wavelet filter for removing artifacts from atomic force microscopy measurement results. Additional data, such as the summed photodiode signal map, are used to identify the areas requiring processing, to optimize the filtering settings, and to verify the performance of the process. This approach allows an average user to adjust the filtering parameters, whereas the straightforward method requires expertise in the field. The procedure was implemented as a function in the Gwyddion software. Examples of filtering phase imaging and Electrostatic Force Microscopy measurement results are presented. Because the wavelet filter can remove local artifacts, its efficiency is superior to a comparable approach based on a 2D Fast Fourier Transform (2D FFT) filter. - Highlights: • A novel approach to a 2D wavelet-based filter for atomic force microscopy is shown. • An additional AFM measurement signal is used to adjust the filter. • Efficient removal of artifacts caused by local interference phenomena is presented.

  8. The development of the spatially correlated adjustment wavelet filter for atomic force microscopy data

    International Nuclear Information System (INIS)

    Sikora, Andrzej; Rodak, Aleksander; Unold, Olgierd; Klapetek, Petr

    2016-01-01

    This paper presents a novel approach to the practical use of a 2D wavelet filter for removing artifacts from atomic force microscopy measurement results. Additional data, such as the summed photodiode signal map, are used to identify the areas requiring processing, to optimize the filtering settings, and to verify the performance of the process. This approach allows an average user to adjust the filtering parameters, whereas the straightforward method requires expertise in the field. The procedure was implemented as a function in the Gwyddion software. Examples of filtering phase imaging and Electrostatic Force Microscopy measurement results are presented. Because the wavelet filter can remove local artifacts, its efficiency is superior to a comparable approach based on a 2D Fast Fourier Transform (2D FFT) filter. - Highlights: • A novel approach to a 2D wavelet-based filter for atomic force microscopy is shown. • An additional AFM measurement signal is used to adjust the filter. • Efficient removal of artifacts caused by local interference phenomena is presented.

  9. Development of evaluation method for hydraulic behavior in Venturi scrubber for filtered venting

    International Nuclear Information System (INIS)

    Horiguchi, Naoki; Nakao, Yasuhiro; Kaneko, Akiko; Abe, Yutaka; Yoshida, Hiroyuki

    2016-01-01

    Filtered venting systems have been installed so that nuclear power plants in Japan can be restarted after the Fukushima Daiichi nuclear disaster. A Venturi scrubber is the main component of one such system. To evaluate the decontamination performance of the Venturi scrubber for filtered venting, a mechanistic evaluation method for the hydrodynamic behavior is important, and our objective in this paper is to develop such a method. We observed the behavior experimentally under adiabatic (air-water) conditions, developed a numerical simulation code based on a one-dimensional two-fluid model, and verified and validated the code by comparing the two sets of results in terms of superficial gas velocity, static pressure, superficial liquid velocity, droplet ratio, and droplet diameter in the Venturi scrubber. We confirmed that the code can evaluate these parameters with the following accuracy: superficial gas velocity within +30%, static pressure in the throat within ±10%, superficial liquid velocity within ±80%, droplet diameter within ±30%, and droplet ratio within -50%. (author)

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF MOBILE SOURCE EMISSIONS CONTROL DEVICES: CLEAN DIESEL TECHNOLOGIES FUEL-BORNE CATALYST WITH MITSUI/PUREARTH CATALYZED WIRE MESH FILTER

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Fuel-Borne Catalyst with Mitsui/PUREarth Catalyzed Wire Mesh Filter manufactured by Clean Diesel Technologies, Inc. The technology is a platinum/cerium fuel-borne catalyst in commerci...

  11. Design and experimental verification of a dual-band metamaterial filter

    Science.gov (United States)

    Zhu, Hong-Yang; Yao, Ai-Qin; Zhong, Min

    2016-10-01

    In this paper, we present the design, simulation, and experimental verification of a dual-band free-standing metamaterial filter operating in a frequency range of 1 THz-30 THz. The proposed structure consists of periodically arranged composite air holes, and exhibits two broad and flat transmission bands. To clarify the effects of the structural parameters on both resonant transmission bands, three sets of experiments are performed. The first resonant transmission band shows a shift towards higher frequency when the side width w1 of the main air hole is increased. In contrast, the second resonant transmission band displays a shift towards lower frequency when the side width w2 of the sub-holes is increased, while the first resonant transmission band is unchanged. The measured results indicate that these resonant bands can be modulated individually by simply optimizing the relevant structural parameter (w1 or w2) for the required band. In addition, these resonant bands merge into a single resonant band with a bandwidth of 7.7 THz when w1 and w2 are optimized simultaneously. The structure proposed in this paper adopts different resonant mechanisms for transmission at different frequencies and thus offers a method to achieve a dual-band and low-loss filter. Project supported by the Doctorate Scientific Research Foundation of Hezhou University, China (Grant No. HZUBS201503), the Promotion of the Basic Ability of Young and Middle-aged Teachers in Universities Project of Guangxi Zhuang Autonomous Region, China (Grant No. KY2016YB453), and the Guangxi Colleges and Universities Key Laboratory of Symbolic Computation, Engineering Data Processing and Mathematical Support Autonomous Discipline Project of Hezhou University, China (Grant No. 2016HZXYSX01).

  12. Development of Real Time Implementation of 5/5 Rule based Fuzzy Logic Controller Shunt Active Power Filter for Power Quality Improvement

    Science.gov (United States)

    Puhan, Pratap Sekhar; Ray, Pravat Kumar; Panda, Gayadhar

    2016-12-01

    This paper presents the effectiveness of a 5/5-rule fuzzy logic controller used in conjunction with an indirect control technique to enhance power quality in a single-phase system. An indirect current controller in conjunction with the fuzzy logic controller is applied to the proposed shunt active power filter to estimate the peak reference current and capacitor voltage. Current-controller-based pulse width modulation (CCPWM) is used to generate the switching signals of the voltage source inverter. Various simulation results are presented to verify the good behaviour of the shunt active power filter (SAPF) with the proposed two-level hysteresis current controller (HCC). For verification of the shunt active power filter in real time, the proposed control algorithm has been implemented on a laboratory-developed setup on the dSPACE platform.
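    A two-level hysteresis current controller of the kind mentioned in this record can be sketched as a toy simulation: the inverter output switches between +Vdc and -Vdc whenever the current error leaves a fixed band. The DC-link voltage, filter inductance, band width, and time step below are illustrative assumptions, not the paper's laboratory parameters:

```python
import numpy as np

def hysteresis_track(i_ref, v_grid, vdc, L, band, dt):
    """Two-level hysteresis current control of an inverter through an
    L filter: di/dt = (switched voltage - grid voltage) / L."""
    i = 0.0
    state = 1  # +Vdc applied initially
    out = np.empty_like(i_ref)
    for k in range(len(i_ref)):
        err = i_ref[k] - i
        if err > band:
            state = 1        # current too low -> apply +Vdc to raise it
        elif err < -band:
            state = -1       # current too high -> apply -Vdc to lower it
        i += (state * vdc - v_grid[k]) / L * dt
        out[k] = i
    return out

t = np.arange(0, 0.02, 1e-6)                    # one 50 Hz cycle, 1 us step
i_ref = 5.0 * np.sin(2 * np.pi * 50 * t)        # reference compensation current
v_grid = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)
i_out = hysteresis_track(i_ref, v_grid, vdc=400.0, L=5e-3, band=0.2, dt=1e-6)
print(np.max(np.abs(i_out - i_ref)))            # stays near the hysteresis band
```

    The tracking error stays close to the band width plus one per-step current increment, which is why the band (and hence the switching frequency) is the key tuning knob of an HCC.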

  13. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF MICROBIOLOGICAL AND PARTICULATE CONTAMINANTS IN DRINKING WATER : SEPARMATIC™ FLUID SYSTEMS DIATOMACEOUS EARTH PRESSURE TYPE FILTER SYSTEM MODEL 12P-2

    Science.gov (United States)

    The verification test of the Separmatic™ DE Pressure Type Filter System Model 12P-2 was conducted at the UNH Water Treatment Technology Assistance Center (WTTAC) in Durham, New Hampshire. The source water was finished water from the Arthur Rollins Treatment Plant that was pretr...

  15. Development of film dosimetric measurement system for verification of RTP

    International Nuclear Information System (INIS)

    Chen Yong; Bao Shanglian; Ji Changguo; Zhang Xin; Wu Hao; Han Shukui; Xiao Guiping

    2007-01-01

    Objective: To develop a novel film dosimetry system based on a general-purpose laser scanner in order to verify patient-specific radiotherapy treatment plans (RTPs) in three-dimensional adaptable radiotherapy (3D ART) and intensity-modulated radiotherapy (IMRT). Methods: Advanced methods, including film saturated development, wavelet filtering with multi-resolution thresholds, and discrete Fourier reconstruction, are employed in this system to reduce artifacts, noise, and distortion introduced by digitizing film with a general-purpose scanner. A set of coefficients derived from Monte Carlo (MC) simulation is adopted to correct the film's over-response to low-energy scattered photons. Newly emerging criteria, including the γ index and the Normalized Agreement Test (NAT) method, are employed to quantitatively evaluate the agreement between 2D dose distributions measured by the films and those calculated by the treatment planning system (TPS), so as to obtain straightforward presentations, displays, and results with high accuracy and reliability. Results: Radiotherapy doses measured by the developed system agree within 2% with those measured by an ionization chamber and the VeriSoft Film Dosimetry System, and the quantitative evaluation indexes are within 3%. Conclusions: The developed system can accurately measure radiotherapy dose and provide reliable quantitative evaluation for RTP dose verification. (authors)
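    The γ index mentioned in this record combines a dose-difference criterion and a distance-to-agreement criterion into one pass/fail number per point. A minimal 1D version, assuming a global-normalization 3%/3 mm criterion and made-up dose profiles, might look like:

```python
import numpy as np

def gamma_index_1d(dose_eval, dose_ref, x, dose_tol=0.03, dist_tol=3.0):
    """Per-point 1D gamma index: gamma <= 1 means the evaluated dose agrees
    with the reference within the combined dose/distance criterion."""
    dd = np.abs(x[None, :] - x[:, None]) / dist_tol                  # distance term
    dD = (dose_eval[:, None] - dose_ref[None, :]) / (dose_tol * dose_ref.max())
    return np.sqrt(dd**2 + dD**2).min(axis=1)   # best agreement over ref points

x = np.linspace(0, 100, 101)                    # positions in mm
ref = np.exp(-((x - 50) / 20) ** 2)             # reference dose profile
meas = 1.01 * np.exp(-((x - 50.5) / 20) ** 2)   # 1% scaled, 0.5 mm shifted
g = gamma_index_1d(meas, ref, x)
print((g <= 1.0).mean())                        # gamma pass rate
```

    The 2D film-versus-TPS comparison in the record is the same computation with two spatial coordinates in the distance term; production tools also interpolate the reference grid rather than searching only measured points.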

  16. Development of the code for filter calculation

    International Nuclear Information System (INIS)

    Gritzay, O.O.; Vakulenko, M.M.

    2012-01-01

    This paper describes a calculation method commonly used in the Neutron Physics Department to develop a new neutron filter or to improve an existing one. This calculation is the first step of the traditional filter development procedure. It allows easy selection of the qualitative and quantitative composition of a composite filter in order to obtain a filtered neutron beam with given parameters.
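    The first step described here — choosing a composite filter's contents so that the transmitted beam has the desired properties — reduces to an exponential attenuation sum over the components. A sketch with an invented data layout and made-up cross sections (real calculations use evaluated cross-section libraries on a fine energy grid):

```python
import numpy as np

def transmission(components, energy_sigma):
    """Neutron transmission of a composite filter: T(E) = exp(-sum_i N_i * sigma_i(E) * t_i).

    components:   list of (material, number_density_per_cm3, thickness_cm)
    energy_sigma: dict material -> microscopic cross sections (barns) on a
                  common energy grid (purely illustrative values below)
    """
    total = 0.0
    for mat, n_dens, thick in components:
        sigma_cm2 = energy_sigma[mat] * 1e-24   # barn -> cm^2
        total = total + n_dens * sigma_cm2 * thick
    return np.exp(-total)

# toy two-component filter on a 3-point energy grid; the middle point mimics
# a cross-section minimum (the "window" a filter is designed around)
sig = {"Si": np.array([2.0, 0.5, 2.5]), "S": np.array([1.5, 0.8, 1.0])}
comps = [("Si", 5.0e22, 10.0), ("S", 3.9e22, 2.0)]
T = transmission(comps, sig)
print(T)   # highest transmission at the cross-section minimum
```

    Filter design then becomes an optimization over the component list: maximize transmission in the window while suppressing it everywhere else.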

  17. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: A method that predicts PET images on the basis of an analytical filtering approach for proton range verification has been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against the GATE/GEANT4 Monte Carlo simulation codes. Methods: We performed two experiments to validate the β+-isotope predictions of the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of the predicted β+-yields as a function of irradiated proton energy. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The filtered β+-yield distributions from the analytical model are compared with the MC-simulated β+-yields in the proximal and distal fall-off ranges. Results: First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range difference between the filtered and MC-simulated β+-yields at the distal fall-off region is within 1.5 mm for all materials used. These findings validate the usefulness of the analytical filtering model for range verification of proton therapy with GATE Monte Carlo simulations. However, there is a larger discrepancy between the filtered prediction and the MC-simulated β+-yields using the GATE code, especially in the proximal region; this discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the large discrepancies observed between the MC-simulated and predicted β+-yield distributions, the study proves the effectiveness of the analytical filtering model for proton range verification using GATE Monte Carlo simulations.

  18. Development of circular filters for active facilities

    International Nuclear Information System (INIS)

    Pratt, R.P.

    1986-01-01

    An assessment of problems associated with remote handling, changing and disposal of filters suggested that significant improvements to filtration systems could be made if circular geometries were adopted in place of conventional systems. Improved systems have been developed and are now available for a range of applications and air flow rates. Where primary filters are installed within the active cell or cave, circular filters incorporating a lip seal have been developed which enable the filters to be sealed into the facility without recourse to clamping. For smaller cells, a range of push-through filter change systems have been developed, the principal feature being that the filter is passed into the housing from the clean side, but transferred from the housing directly into the cell for subsequent disposal. For plant room applications, circular bag change canister systems have been developed which ease the sealing and bag change operation. Such systems have a rated air flow of up to 3000 m³/h whilst still allowing ultimate disposal via the 200 litre waste drum route without prior volume reduction of the filter inserts. (author)

  19. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  20. Development of requirements tracking and verification technology for the NPP software

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-12-30

    Searched and analyzed the technology of requirements engineering in the areas of the aerospace and defense industries, the medical industry, and the nuclear industry. Summarized the status of tools for software design and requirements management. Analyzed the software design methodology for NPP safety software. Developed the design requirements for the requirements tracking and verification system. Developed the background technology to design the prototype tool for requirements tracking and verification.

  1. Development of requirements tracking and verification technology for the NPP software

    International Nuclear Information System (INIS)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-01-01

    Searched and analyzed the technology of requirements engineering in the areas of the aerospace and defense industries, the medical industry, and the nuclear industry. Summarized the status of tools for software design and requirements management. Analyzed the software design methodology for NPP safety software. Developed the design requirements for the requirements tracking and verification system. Developed the background technology to design the prototype tool for requirements tracking and verification.

  2. Development of Genetic Markers for Triploid Verification of the Pacific Oyster,

    Directory of Open Access Journals (Sweden)

    Jung-Ha Kang

    2013-07-01

    Full Text Available The triploid Pacific oyster, which is produced by mating tetraploid and diploid oysters, is favored by the aquaculture industry because of its better flavor and firmer texture, particularly during the summer. However, tetraploid oyster production is not feasible in all oysters; the development of tetraploid oysters is ongoing in some oyster species. A method for ploidy verification is therefore necessary for this endeavor, as well as for ploidy verification in aquaculture farms and in the natural environment. In this study, a method for ploidy verification of triploid and diploid oysters was developed using multiplex polymerase chain reaction (PCR) panels containing primers for molecular microsatellite markers. Two microsatellite multiplex PCR panels, each consisting of three markers, were developed using previously developed microsatellite markers that were optimized for performance. Both panels were able to verify the ploidy levels of 30 triploid oysters with 100% accuracy, illustrating the utility of microsatellite markers as a tool for verifying the ploidy of individual oysters.
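    The underlying logic of microsatellite ploidy verification is simple to state in code: a diploid can show at most two distinct alleles at a locus, so observing a third allele at any locus indicates triploidy. The marker names and allele sizes below are made up for illustration:

```python
def call_ploidy(genotypes):
    """Crude ploidy call from multiplex microsatellite genotypes.

    genotypes: dict marker -> tuple of observed allele sizes for one oyster.
    A true triploid may still look diploid at loci where two of its three
    allele copies are identical, which is why real panels combine several
    informative markers.
    """
    max_alleles = max(len(set(alleles)) for alleles in genotypes.values())
    if max_alleles >= 3:
        return "triploid"
    return "diploid (or uninformative)"

diploid = {"M1": (152, 158), "M2": (201, 201), "M3": (177, 181)}
triploid = {"M1": (152, 158, 160), "M2": (201, 205), "M3": (177, 177, 181)}
print(call_ploidy(diploid))   # diploid (or uninformative)
print(call_ploidy(triploid))  # triploid
```

    Multiplexing three markers per PCR panel, as in the record, raises the chance that at least one locus is informative for any given individual.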

  3. Efficient Development and Verification of Safe Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2013-01-01

    the monitoring process; hydraulic absorbers as dampers to dissipate the energy of oscillations in railway electric equipment; development of train fare calculation and adjustment systems using VDM++; efficient development and verification of safe railway control software; and evolution of the connectivity...

  4. An Unattended Verification Station for UF6 Cylinders: Development Status

    International Nuclear Information System (INIS)

    Smith, E.; McDonald, B.; Miller, K.; Garner, J.; March-Leuba, J.; Poland, R.

    2015-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by advanced centrifuge technologies and the growth in separative work unit capacity at modern centrifuge enrichment plants. These measures would include permanently installed, unattended instruments capable of performing the routine and repetitive measurements previously performed by inspectors. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide independent verification of the declared relative enrichment, U-235 mass and total uranium mass of all declared cylinders moving through the plant, as well as the application and verification of a ''Non-destructive Assay Fingerprint'' to preserve verification knowledge of the contents of each cylinder throughout its life in the facility. As the IAEA's vision for a UCVS has evolved, Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory have been developing and testing candidate non-destructive assay (NDA) methods for inclusion in a UCVS. Modeling and multiple field campaigns have indicated that these methods are capable of assaying relative cylinder enrichment with a precision comparable to or substantially better than today's high-resolution handheld devices, without the need for manual wall-thickness corrections. In addition, the methods interrogate the full volume of the cylinder, thereby offering the IAEA a new capability to assay the absolute U-235 mass in the cylinder, and much-improved sensitivity to substituted or removed material. Building on this prior work, and under the auspices of the United States Support Programme to the IAEA, a UCVS field prototype is being developed and tested. This paper provides an overview of: a) hardware and software design of the prototypes, b) preparation

  5. Technology development for producing nickel metallic filters

    International Nuclear Information System (INIS)

    Hubler, C.H.

    1990-01-01

    A technology for producing metallic filters was developed by the Instituto de Engenharia Nuclear (IEN, Brazilian CNEN) to provide the Instituto de Pesquisas Energeticas e Nucleares (IPEN, Brazilian CNEN) with nickel alloy filters used in the filtration of uranium hexafluoride. The experiments carried out to produce truncated-cone nickel filters by powder metallurgy are described. (M.C.K.)

  6. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    Science.gov (United States)

    1989-01-01

    The design and verification requirements are defined that are appropriate to hardware at the detail, subassembly, component, and engine levels, and these requirements are correlated to the development demonstrations which provide verification that design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  7. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
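    A filter bank-based fingerprint representation of the kind described builds oriented Gabor responses over the image and summarizes them into a feature vector. The following is a hedged, miniature sketch (not the actual FingerCode-style algorithm, which sectorizes responses around a core point); all parameters and the toy "ridge pattern" are invented:

```python
import numpy as np

def gabor_kernel(size, theta, freq, sigma):
    """Real Gabor kernel: an oriented ridge detector, the building block of
    filter-bank fingerprint representations (as opposed to minutiae)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def filterbank_features(img, n_orient=8, freq=0.1, sigma=4.0, size=17):
    """Variance of each oriented Gabor response as a tiny feature vector."""
    feats = []
    for k in range(n_orient):
        kern = gabor_kernel(size, np.pi * k / n_orient, freq, sigma)
        # FFT-based convolution (circular boundary; fine for a sketch)
        resp = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kern, img.shape)))
        feats.append(resp.var())
    return np.array(feats)

# toy "ridge pattern": horizontal stripes at exactly the filter frequency
y = np.arange(64)[:, None] * np.ones((1, 64))
img = np.cos(2 * np.pi * 0.1 * y)
f = filterbank_features(img)
print(np.argmax(f))   # strongest response at the orientation matching the ridges
```

    Matching then compares these fixed-length vectors directly (e.g. by Euclidean distance), which avoids the unregistered-minutiae alignment problem the abstract points out.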

  8. Environmental Technology Verification: Baghouse Filtration Products--TDC Filter Manufacturing, Inc., SB025 Filtration Media

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  9. Development of filters and housings for use on active plant

    International Nuclear Information System (INIS)

    Hackney, S.; Pratt, R.P.

    1983-01-01

    New designs of housings for conventional HEPA filters have been developed and are now in use, and a further design is planned for future use. The main features to be developed are the engineering of double-door systems to replace bag posting and other methods of filter changing which expose personnel to hazardous environments, and the addition of a secondary containment to reduce the role of the gasket seal in the filtration efficiency. Also under development are circular-geometry filters of HEPA standard which offer significant advantages over rectangular filters for applications requiring remote shielded change facilities. Two types of filter construction are being evaluated: conventional radial-flow cartridge filters and spiral-wound, axial-flow filters. The application of circular filters for primary filter systems on active plant is in hand. A push-through change system has been developed for a new cell facility under construction at Harwell. Existing rectangular filters on a high-activity cell are being replaced with clusters of small cartridge filters to overcome changing and disposal problems. A similar system using 1700 m³/h filters for large-volume off-gas treatment is also being studied. A remote-change shielded filter installation is being developed for use in high alpha, beta, gamma extract systems. The design incorporates large cartridge filters in sealed drums with remote transfer and connection to ductwork in the facility. A novel application of double-lid technology removes the need for separate shut-off dampers and enables the drums to be sealed for all transfer operations.

  10. Current Status of Aerosol Generation and Measurement Facilities for the Verification Test of Containment Filtered Venting System in KAERI

    Energy Technology Data Exchange (ETDEWEB)

Kim, Sung Il; An, Sang Mo; Ha, Kwang Soon; Kim, Hwan Yeol [KAERI, Daejeon (Korea, Republic of)]

    2016-05-15

In this study, the design of the aerosol generation and measurement systems is explained, present circumstances are described, and the aerosol test plan is given. The Containment Filtered Venting System (CFVS) is one of the safety features that reduces the amount of fission products released into the environment by depressurizing the containment. Since the Chernobyl accident, regulatory agencies in several European countries such as France, Germany, and Sweden have demanded the installation of the CFVS. Moreover, a feasibility study on the CFVS was also performed in the U.S. After the Fukushima accident, there is a need in Korea to improve containment venting or to install a depressurization facility. As part of a Ministry of Trade, Industry and Energy (MOTIE) project, KAERI has been conducting the integrated performance verification test of the CFVS. As part of the test, aerosol generation and measurement systems were designed to simulate fission product behavior. To perform the integrated verification test of the CFVS, the aerosol generation and measurement system was designed and manufactured. The component operating conditions were determined in consideration of severe accident conditions. The test will be performed under normal conditions at first, and then under severe conditions of high pressure and high temperature. Difficulties that could disturb the test are expected, such as thermophoresis on the pipe and vapor condensation on the aerosol.

  11. Single-Phase LLCL-Filter-based Grid-Tied Inverter with Low-Pass Filter Based Capacitor Current Feedback Active damper

    DEFF Research Database (Denmark)

    Liu, Yuan; Wu, Weimin; Li, Yun

    2016-01-01

The capacitor-current-feedback active damping method is attractive for high-order-filter-based high-power grid-tied inverters when the grid impedance varies within a wide range. In order to improve the system control bandwidth and attenuate the high-order grid background harmonics by using the quasi… In this paper, a low-pass filter is proposed to be inserted in the capacitor current feedback loop of an LLCL-filter-based grid-tied inverter, together with a digital proportional and differential compensator. The detailed theoretical analysis is given. For verification, simulations on a 2 kW/220 V/10 kHz LLCL…

  12. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    Science.gov (United States)

    2017-07-13

By Autumn R Kulaga, Kathryn L Loftis, and Eric Murray (US Army Research Laboratory). Approved for public release; distribution is…

  13. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
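The optimal compensation scheme mentioned in this record can be illustrated with a small sketch; the dollar amounts follow the widely reported MIT winning strategy in the Red Balloon Challenge (finder $2000, each recruiter up the chain half as much), and the chain length is arbitrary:

```python
# Sketch of the geometric referral-payment scheme: each person up the
# referral chain receives half of what the person below them received.
# The total payout along any chain is then bounded by twice the
# finder's reward, regardless of chain length.
def referral_payments(finder_reward, chain_length):
    """Payment to each person up the referral chain, halving per level."""
    return [finder_reward / 2 ** k for k in range(chain_length)]

payments = referral_payments(2000.0, 5)
print(payments)       # [2000.0, 1000.0, 500.0, 250.0, 125.0]
print(sum(payments))  # 3875.0, below the 2 * 2000 bound
```

The bound holds because the payments form a geometric series with ratio 1/2.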

  14. Face identification with frequency domain matched filtering in mobile environments

    Science.gov (United States)

    Lee, Dong-Su; Woo, Yong-Hyun; Yeom, Seokwon; Kim, Shin-Hwan

    2012-06-01

    Face identification at a distance is very challenging since captured images are often degraded by blur and noise. Furthermore, the computational resources and memory are often limited in the mobile environments. Thus, it is very challenging to develop a real-time face identification system on the mobile device. This paper discusses face identification based on frequency domain matched filtering in the mobile environments. Face identification is performed by the linear or phase-only matched filter and sequential verification stages. The candidate window regions are decided by the major peaks of the linear or phase-only matched filtering outputs. The sequential stages comprise a skin-color test and an edge mask filtering test, which verify color and shape information of the candidate regions in order to remove false alarms. All algorithms are built on the mobile device using Android platform. The preliminary results show that face identification of East Asian people can be performed successfully in the mobile environments.
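A minimal numpy sketch of the frequency-domain matched filtering step this record describes, using the phase-only variant; the array sizes, noise level, and embedded target are invented for illustration, not taken from the paper:

```python
import numpy as np

def phase_only_matched_filter(scene, template):
    """Correlate a scene with a template using a phase-only matched
    filter: the template spectrum is normalized to unit magnitude,
    which sharpens the correlation peak used to pick candidate windows."""
    F = np.fft.fft2(scene)
    T = np.fft.fft2(template, s=scene.shape)  # zero-pad to scene size
    H = np.conj(T) / (np.abs(T) + 1e-12)      # phase-only filter
    return np.real(np.fft.ifft2(F * H))

rng = np.random.default_rng(0)
scene = rng.normal(0.0, 0.1, (64, 64))       # weak background noise
patch = rng.normal(0.0, 1.0, (8, 8))         # stand-in for a face template
scene[20:28, 30:38] += patch                 # embed the target
corr = phase_only_matched_filter(scene, patch)
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)  # major peak at the embedded target's top-left corner, (20, 30)
```

The major peaks of this output would then define the candidate window regions passed to the verification stages.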

  15. Monte Carlo filters for identification of nonlinear structural dynamical ...

    Indian Academy of Sciences (India)

The theory of Kalman filtering provides one of … expansion (appendix B contains a reasonably self-contained account of how such expansions …) … Shinozuka M, Ghanem R 1995 Structural system identification II: experimental verification.

  16. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  17. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    Science.gov (United States)

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  19. ETV TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS GLASFLOSS INDUSTRIES EXCEL FILTER, MODEL SBG24242898

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Excel Filter, Model SBG24242898 air filter for dust and bioaerosol filtration manufactured by Glasfloss Industries, Inc. The pressure drop across the filter was 82 Pa clean and 348 Pa...

  20. Developing a NASA strategy for the verification of large space telescope observatories

    Science.gov (United States)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  1. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    Energy Technology Data Exchange (ETDEWEB)

Matloch, L.; Vaccaro, S.; Couland, M.; De Baere, P.; Schwalbach, P. [Euratom, Communaute europeenne de l'energie atomique - CEEA (European Commission (EC))]

    2015-07-01

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)

  2. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  3. Voice preprocessing system incorporating a real-time spectrum analyzer with programmable switched-capacitor filters

    Science.gov (United States)

    Knapp, G.

    1984-01-01

As part of a speaker verification program for BISS (Base Installation Security System), a test system is being designed with a flexible preprocessing system for the evaluation of voice spectrum/verification algorithm related problems. The main part of this report covers the design, construction, and testing of a voice analyzer with 16 integrating real-time frequency channels ranging from 300 Hz to 3 kHz. The bandpass filter response of each channel is programmable by NMOS switched-capacitor quad filter arrays. Presently, the accuracy of these units is limited to moderate precision by the finite steps of programming. However, repeatability of characteristics between filter units and sections seems to be excellent for the implemented fourth-order Butterworth bandpass responses. We obtained a 0.1 dB linearity error of signal detection and measured a signal-to-noise ratio of approximately 70 dB. The preprocessing system discussed includes the preemphasis filter design, gain normalizer design, and data acquisition system design, as well as test results.
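The fourth-order Butterworth bandpass response implemented by each channel can be sketched as below; the 300-470 Hz channel edges are illustrative values within the analyzer's 300 Hz to 3 kHz range, not the actual channel spacing:

```python
import numpy as np

# Magnitude response of one programmable bandpass channel, modeled as a
# 4th-order Butterworth bandpass: a 2nd-order lowpass prototype mapped
# through the standard transformation s -> (s^2 + w0^2) / (B * s).
def butter_bp_mag(f, f_lo, f_hi, n=2):
    w, w_lo, w_hi = (2 * np.pi * np.asarray(x) for x in (f, f_lo, f_hi))
    w0 = np.sqrt(w_lo * w_hi)        # center frequency (geometric mean)
    B = w_hi - w_lo                  # bandwidth
    x = (w ** 2 - w0 ** 2) / (B * w)  # lowpass-prototype variable
    return 1.0 / np.sqrt(1.0 + x ** (2 * n))

f = np.array([300.0, np.sqrt(300.0 * 470.0), 470.0])
mag = butter_bp_mag(f, 300.0, 470.0)
print(np.round(20 * np.log10(mag), 2))  # about -3 dB at edges, 0 dB at center
```

At the band edges the prototype variable equals ±1, which is exactly the -3 dB point of a Butterworth response of any order.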

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES SL-3 RING PANEL

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the High Efficiency Mini Pleat air filter for dust and bioaerosol filtration manufactured by Columbus Industries. The pressure drop across the filter was 142 Pa clean and 283 Pa dust load...

  5. Multidimensional filter banks and wavelets research developments and applications

    CERN Document Server

    Levy, Bernard

    1997-01-01

Multidimensional Filter Banks and Wavelets: Research Developments and Applications brings together in one place important contributions and up-to-date research results in this area, and serves as an excellent reference, providing insight into some of the most important research issues in the field.

  6. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

Verification is necessary work in developing a reliable expert system; it is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the knowledge base verification of such systems takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge base of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, which is a tool for editing, modeling, and simulating colored Petri nets. The developed tool uses the enhanced colored Petri net as its modeling method. By applying this tool to the knowledge base of a nuclear power plant, it was confirmed that the tool can successfully check most of the anomalies that can occur in a knowledge base.
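One class of anomaly that such knowledge-base verification targets, circular inference chains, can be sketched without any Petri-net machinery; the rule names below are invented for illustration:

```python
# Toy rule base mapping each conclusion to its premises. A circular
# chain (a conclusion that transitively depends on itself) is one of
# the anomalies a knowledge-base verification tool must flag.
rules = {
    "alarm": ["high_temp"],
    "high_temp": ["sensor_a"],
    "pump_trip": ["alarm", "pump_trip_latch"],
    "pump_trip_latch": ["pump_trip"],   # forms a cycle with pump_trip
}

def nodes_on_cycles(rules):
    """Return conclusions that can (transitively) depend on themselves."""
    def reachable(start):
        seen, stack = set(), list(rules.get(start, []))
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(rules.get(n, []))
        return seen
    return sorted(n for n in rules if n in reachable(n))

print(nodes_on_cycles(rules))  # ['pump_trip', 'pump_trip_latch']
```

A real tool such as the one described would additionally check for unreachable, redundant, and conflicting rules, which this sketch omits.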

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - TETRATEC PTFE TECHNOLOGIES TETRATEX 8005

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  8. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1999-12-31

In this paper, a prototype of the Requirement Tracking and Verification System (RTVS) for a distributed control system was implemented and tested. The RTVS is a software design and verification tool. Its main functions are the management, tracking, and verification of the software requirements listed in the documentation of the DCS. The DCS software design procedures and document interfaces were analyzed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)
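The requirement-tracking function described in this record can be sketched minimally as a coverage check; the requirement and design identifiers below are invented:

```python
# Minimal sketch of requirements tracking: flag any requirement that no
# design element claims to satisfy. A tool like the RTVS would extract
# these links from the DCS design documentation instead of literals.
requirements = {"REQ-1", "REQ-2", "REQ-3"}
design_links = {"DCS-A": ["REQ-1"], "DCS-B": ["REQ-1", "REQ-3"]}

covered = {r for reqs in design_links.values() for r in reqs}
untraced = sorted(requirements - covered)
print(untraced)  # ['REQ-2'] has no design element tracing to it
```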

  9. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1998-12-31

In this paper, a prototype of the Requirement Tracking and Verification System (RTVS) for a distributed control system was implemented and tested. The RTVS is a software design and verification tool. Its main functions are the management, tracking, and verification of the software requirements listed in the documentation of the DCS. The DCS software design procedures and document interfaces were analyzed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

10. DEVELOPMENT OF AN ADHESIVE CANDLE FILTER SAFEGUARD DEVICE

    International Nuclear Information System (INIS)

    John P. Hurley; Ann K. Henderson; Jan W. Nowok; Michael L. Swanson

    2002-01-01

    In order to reach the highest possible efficiencies in a coal-fired turbine-based power system, the turbine should be directly fired with the products of coal conversion. Two main types of systems employ these turbines: those based on pressurized fluidized-bed combustors and those based on integrated gasification combined cycles. In both systems, suspended particulates must be cleaned from the gas stream before it enters the turbine so as to prevent fouling and erosion of the turbine blades. To produce the cleanest gas, barrier filters are being developed and are in use in several facilities. Barrier filters are composed of porous, high-temperature materials that allow the hot gas to pass but collect the particulates on the surface. The three main configurations of the barrier filters are candle, cross-flow, and tube filters. Both candle and tube filters have been tested extensively. They are composed of coarsely porous ceramic that serves as a structural support, overlain with a thin, microporous ceramic layer on the dirty gas side that serves as the primary filter surface. They are highly efficient at removing particulate matter from the gas stream and, because of their ceramic construction, are resistant to gas and ash corrosion. However, ceramics are brittle and individual elements can fail, allowing particulates to pass through the hole left by the filter element and erode the turbine. Preventing all failure of individual ceramic filter elements is not possible at the present state of development of the technology. Therefore, safeguard devices (SGDs) must be employed to prevent the particulates streaming through occasional broken filters from reaching the turbine. However, the SGD must allow for the free passage of gas when it is not activated. Upon breaking of a filter, the SGD must either mechanically close or quickly plug with filter dust to prevent additional dust from reaching the turbine. 
Production of a dependable rapidly closing autonomous mechanical…

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS:AAF INTERNATIONAL, PERFECTPLEAT ULTRA, 175-102-863

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the PerfectPleat Ultra 175-102-863 air filter for dust and bioaerosol filtration manufactured by AAF International. The pressure drop across the filter was 112 Pa clean and 229 Pa dust lo...

  12. Developing topic-specific search filters for PubMed with click-through data.

    Science.gov (United States)

    Li, J; Lu, Z

    2013-01-01

    Search filters have been developed and demonstrated for better information access to the immense and ever-growing body of publications in the biomedical domain. However, to date the number of filters remains quite limited because the current filter development methods require significant human efforts in manual document review and filter term selection. In this regard, we aim to investigate automatic methods for generating search filters. We present an automated method to develop topic-specific filters on the basis of users' search logs in PubMed. Specifically, for a given topic, we first detect its relevant user queries and then include their corresponding clicked articles to serve as the topic-relevant document set accordingly. Next, we statistically identify informative terms that best represent the topic-relevant document set using a background set composed of topic irrelevant articles. Lastly, the selected representative terms are combined with Boolean operators and evaluated on benchmark datasets to derive the final filter with the best performance. We applied our method to develop filters for four clinical topics: nephrology, diabetes, pregnancy, and depression. For the nephrology filter, our method obtained performance comparable to the state of the art (sensitivity of 91.3%, specificity of 98.7%, precision of 94.6%, and accuracy of 97.2%). Similarly, high-performing results (over 90% in all measures) were obtained for the other three search filters. Based on PubMed click-through data, we successfully developed a high-performance method for generating topic-specific search filters that is significantly more efficient than existing manual methods. All data sets (topic-relevant and irrelevant document sets) used in this study and a demonstration system are publicly available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/downloads/CQ_filter/
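The term-selection step described in this record can be sketched with a smoothed log-odds score; the toy documents and the exact scoring formula are assumptions for illustration (the paper selects terms statistically against a background set, but this abstract does not give its precise statistic):

```python
import math
from collections import Counter

# Score each term by how much more often it occurs in the topic-relevant
# (clicked) documents than in the topic-irrelevant background set, using
# an add-0.5 smoothed log-odds ratio. Documents are invented examples.
relevant = ["renal dialysis nephrology", "kidney nephrology renal failure"]
background = ["stock market prices", "kidney beans recipe"]

def term_counts(docs):
    return Counter(t for d in docs for t in d.split())

rel, bg = term_counts(relevant), term_counts(background)
n_rel, n_bg = sum(rel.values()), sum(bg.values())

def log_odds(term):
    p = (rel[term] + 0.5) / (n_rel + 1.0)   # smoothed relevant frequency
    q = (bg[term] + 0.5) / (n_bg + 1.0)     # smoothed background frequency
    return math.log(p / q)

scores = sorted(rel, key=log_odds, reverse=True)
print(scores[:3])  # topic terms such as 'renal'/'nephrology' rank first
```

The top-ranked terms would then be combined with Boolean operators and tuned against benchmark datasets, as the abstract describes.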

  13. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    Science.gov (United States)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are confronted with the crowd-sourced data. The available data and investigation period span June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing for a categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
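The categorical verification from a 2x2 contingency table can be sketched as follows; the counts are invented, and only the hit rate is explicitly named in the abstract (the other two scores are standard companions in forecast verification):

```python
# Reports and radar detections are reduced to binary "hail"/"no hail"
# events, tallied in a 2x2 contingency table (hits, misses, false
# alarms, correct negatives), and summarized with categorical scores.
def verification_scores(hits, misses, false_alarms, correct_negatives):
    pod = hits / (hits + misses)                 # hit rate / prob. of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

pod, far, csi = verification_scores(hits=42, misses=8,
                                    false_alarms=14, correct_negatives=936)
print(pod, far, round(csi, 3))  # 0.84 0.25 0.656
```

Note that correct negatives do not enter these three scores; they matter for scores such as accuracy, which is dominated by the many "no hail" cases.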

  14. Characteristics of Quoit filter, a digital filter developed for the extraction of circumscribed shadows, and its applications to mammograms

    International Nuclear Information System (INIS)

    Isobe, Yoshiaki; Ohkubo, Natsumi; Yamamoto, Shinji; Toriwaki, Jun-ichiro; Kobatake, Hidefumi.

    1993-01-01

This paper presents a newly developed filter called the Quoit filter, which detects circumscribed shadows (concentric circular isolated images), like typical cancer regions. The Quoit filter is based on mathematical morphology and is found to have the following interesting properties. (1) The output of this filter can be expressed analytically when the input image is assumed to be a concentric circular model (the output is predictable for typical inputs). (2) The filter has the ability to selectively reconstruct the original isolated models mentioned in (1) when it is applied sequentially twice. The filter was tested on the detection of cancer regions in X-ray mammograms; for 12 cancer mammograms, it achieved a true-positive cancer detection rate of 100%. (author)
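As a hedged sketch only (the published Quoit filter's exact morphological definition is not reproduced in this abstract), the following center-versus-ring detector captures the idea of responding strongly to concentric, circular isolated shadows:

```python
import numpy as np

# Hypothetical center-vs-ring detector in the spirit of the Quoit
# filter: the response at each pixel is the maximum over a small
# central disk minus the maximum over a surrounding ring, so an
# isolated bright circular blob yields a strong positive output.
def ring_center_response(img, r_in=2, r_out=4):
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    yy, xx = np.mgrid[-r_out:r_out + 1, -r_out:r_out + 1]
    d = np.hypot(yy, xx)
    center, ring = d <= r_in, (d > r_in) & (d <= r_out)
    for y in range(r_out, h - r_out):
        for x in range(r_out, w - r_out):
            win = img[y - r_out:y + r_out + 1, x - r_out:x + r_out + 1]
            out[y, x] = win[center].max() - win[ring].max()
    return out

img = np.zeros((32, 32))
img[14:17, 14:17] = 1.0                      # small isolated bright blob
resp = ring_center_response(img)
peak = np.unravel_index(np.argmax(resp), resp.shape)
print(peak, resp[peak])  # strongest response at the blob center (15, 15)
```

Extended bright structures excite both the center and the ring, so only isolated circumscribed shadows produce a large response.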

  15. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do not perform well in this setting. In this work we compare the performance of different noise reduction methods under different noise conditions in terms of speaker verification when the text is known and the system is trained on clean data (mis-matched conditions). We furthermore propose a new approach based on dictionary-based noise reduction and compare it to the baseline methods.

  16. Experience with HEPA filters at United States nuclear installations

    International Nuclear Information System (INIS)

    Bellamy, R.R.

    1977-01-01

Part 50 of Title 10 of the United States Code of Federal Regulations requires that a number of atmosphere cleanup systems be included in the design of commercial nuclear power plants to be licensed in the United States. These filtering systems are to contain high efficiency particulate air (HEPA) filters for removal of radioactive particulate matter generated during normal and accident conditions. Recommendations for the design, testing and maintenance of the filtering systems and HEPA filter components are contained in a number of United States Nuclear Regulatory Commission documents and industry standards. This paper will discuss this published guidance available to designers of filtering systems and the plant operators of U.S. commercial nuclear power plants. The paper will also present a survey of published reports of experience with HEPA filters, failures and possible causes for the failures, and other abnormal occurrences pertaining to HEPA filters installed in U.S. nuclear power installations. A discussion will be included of U.S. practices for qualification of HEPA filters before installation, and verification of continued performance capability at scheduled intervals during operation.

  17. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  18. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  19. Development of membrane filters with nanostructured porous layer by coating of metal nanoparticles sintered onto a micro-filter

    International Nuclear Information System (INIS)

    Park, Seok Joo; Park, Young Ok; Lee, Dong Geun; Ryu, Jeong In

    2008-01-01

    The membrane filter with an adhered nanostructured porous layer was made by heat treatment after deposition of nanoparticle-agglomerates, sintered in the aerosol phase, onto a conventional micron-fibrous metal filter as a substrate. The Sintered-Nanoparticle-Agglomerates-coated NanoStructured porous layer Membrane Filter (SNA-NSMF), whose filtration performance is improved compared with conventional metal membrane filters, was developed by adhesion of nanoparticle-agglomerates of dendrite structure sintered onto the micron-fibrous metal filter. The size of the dendritic nanoparticle-agglomerates decreased with increasing sintering temperature because the agglomerates shrank. When the shrunken nanoparticle-agglomerates were deposited onto the conventional micron-fibrous metal filter and heat treated, the pore size of the nanostructured porous layer decreased. Therefore, pressure drops of SNA-NSMFs increased from 0.3 to 0.516 kPa and filtration efficiencies increased remarkably from 95.612 to 99.9993%.
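One way to compare the quoted efficiency/pressure-drop pairs on a single scale is the filter quality factor qF = −ln(1 − η)/ΔP, a standard figure of merit that the abstract itself does not compute; the sketch below simply plugs in the quoted endpoint values.

```python
import math

def quality_factor(efficiency, dp_kpa):
    """Filter quality factor qF = -ln(penetration) / pressure drop (1/kPa)."""
    penetration = 1.0 - efficiency
    return -math.log(penetration) / dp_kpa

# Endpoint values quoted in the abstract.
qf_low = quality_factor(0.95612, 0.3)      # 95.612 % efficiency at 0.3 kPa
qf_high = quality_factor(0.999993, 0.516)  # 99.9993 % efficiency at 0.516 kPa
```

Despite the higher pressure drop, the coated filter roughly doubles the quality factor, i.e. the efficiency gain outweighs the flow-resistance penalty on this metric.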

  20. Optimal Design of High-Order Passive-Damped Filters for Grid-Connected Applications

    DEFF Research Database (Denmark)

    Beres, Remus Narcis; Wang, Xiongfei; Blaabjerg, Frede

    2016-01-01

    Harmonic stability problems caused by the resonance of high-order filters in power electronic systems are ever increasing. The use of passive damping does provide a robust solution to address these issues, but at the price of reduced efficiency due to the presence of additional passive components… Hence, a new method is proposed in this paper to optimally design the passive damping circuit for LCL filters and LCL filters with multi-tuned LC traps. In short, the optimization problem reduces to the proper choice of the multi-split capacitors or inductors in the high-order filter. Compared to existing… filter resonance. The passive filters are designed, built and validated both analytically and experimentally for verification.

  1. Mechanical design and qualification of IR filter mounts and filter wheel of INSAT-3D sounder for low temperature

    Science.gov (United States)

    Vora, A. P.; Rami, J. B.; Hait, A. K.; Dewan, C. P.; Subrahmanyam, D.; Kirankumar, A. S.

    2017-11-01

    The next generation Indian Meteorological Satellite will carry a Sounder instrument with a filter wheel subsystem measuring Ø260 mm and carrying 18 filters arranged in three concentric rings. These filters, made from germanium, are used to separate spectral channels in the IR band. The filter wheel is required to be cooled to 214 K and rotated at 600 rpm. This paper discusses the challenges faced in the mechanical design of the filter wheel: mainly the filter mount design to protect the brittle germanium filters from failure under stresses at very low temperature, the compactness of the wheel and casings for improved thermal efficiency, survival under vibration loads, and material selection to keep the weight low. Properties of titanium, Kovar, Invar and aluminium are considered for the design. The mount has been designed to accommodate both thermal and dynamic loading without introducing significant aberrations into the optics or incurring permanent alignment shifts. Detailed finite element analysis of the mounts was carried out for stress verification. Results of the qualification tests are discussed for the given temperature range of 100 K and vibration loads of 12 g in sine and 11.8 grms in random at mount level. Results of the filter wheel qualification as mounted in the Electro Optics Module (EOM) are also presented.

  2. Development of the clearance level verification evaluation system. 2. Construction of the clearance data management system

    International Nuclear Information System (INIS)

    Kubota, Shintaro; Usui, Hideo; Kawagoshi, Hiroshi

    2014-06-01

    Clearance is defined as the removal of radioactive materials or radioactive objects within authorized practices from any further regulatory control by the regulatory body. In Japan, the clearance level and a procedure for its verification have been introduced under the laws and regulations, and solid clearance wastes inspected by the national authority can be handled and recycled as normal waste. Most such wastes are generated from the dismantling of nuclear facilities, so the Japan Atomic Energy Agency (JAEA) has been developing the Clearance Level Verification Evaluation System (CLEVES) as a convenient tool. The Clearance Data Management System (CDMS), which is a part of CLEVES, has been developed to support measurement, evaluation, and the preparation and recording of documents for clearance level verification. In addition, validation of the evaluation results of the CDMS was carried out by inputting data from actual clearance activities in the JAEA. Clearance level verification is easily carried out by using the CDMS for clearance activities. (author)

  3. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    Science.gov (United States)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned at the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.

  4. Development of a noise filter for radiation thickness gagemeter

    International Nuclear Information System (INIS)

    Jee, C. W.; Kim, Y. T.; Lee, H. H.

    1995-01-01

    The objective of this study is to develop a filter which attenuates sensor noise in the radiation thickness gagemeters of the fifth stand of TCM No. 1 in the Pohang steel works. The thickness control loop for the fifth stand is modelled as a system for filter design, where the system input is the speed control input and the system output is the gagemeter output. In the design of the filter, the system is first described by an ARMAX (AutoRegressive Moving-Average with auXiliary input) model, whose parameters are estimated using a recursive least squares method. Secondly, the estimated ARMAX model is transformed into an observer canonical state-space form. Thirdly, Kalman filtering is applied to obtain optimal estimates of the state and hence of the thickness measurements of the steel strips. In addition, a separate low-pass filter is designed which is directly applicable to the gagemeter outputs. Finally, the designed filter algorithms are implemented and tested on a VMEbus board computer under the VxWorks real-time operating system. (author)
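The identification stage described above can be sketched with a recursive least squares (RLS) estimator. For brevity the sketch uses a first-order ARX model with a forgetting factor and synthetic data; the paper's actual ARMAX structure and plant data are not reproduced.

```python
import numpy as np

def rls_identify(y, u, lam=0.99):
    """Recursive least squares for a first-order ARX model
    y[k] = a*y[k-1] + b*u[k-1] + e[k]."""
    theta = np.zeros(2)            # parameter estimates [a, b]
    P = np.eye(2) * 1000.0         # large initial covariance
    for k in range(1, len(y)):
        phi = np.array([y[k - 1], u[k - 1]])          # regressor vector
        gain = P @ phi / (lam + phi @ P @ phi)        # RLS gain
        theta = theta + gain * (y[k] - phi @ theta)   # innovation update
        P = (P - np.outer(gain, phi @ P)) / lam       # covariance update
    return theta

# Synthetic plant with known parameters a = 0.8, b = 0.5 and small noise.
rng = np.random.default_rng(0)
u = rng.standard_normal(2000)
y = np.zeros(2000)
for k in range(1, 2000):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1] + 0.01 * rng.standard_normal()

a_hat, b_hat = rls_identify(y, u)
```

With low measurement noise the estimates converge close to the true (a, b); the fitted model could then be put into observer canonical form for Kalman filtering, as the abstract describes.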

  5. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. The book outlines both theoretical and practical issues.

  6. Development and evaluation of a cleanable high efficiency steel filter

    International Nuclear Information System (INIS)

    Bergman, W.; Larsen, G.; Weber, F.; Wilson, P.; Lopez, R.; Valha, G.; Conner, J.; Garr, J.; Williams, K.; Biermann, A.; Wilson, K.; Moore, P.; Gellner, C.; Rapchun, D.; Simon, K.; Turley, J.; Frye, L.; Monroe, D.

    1993-01-01

    We have developed a high efficiency steel filter that can be cleaned in-situ by reverse air pulses. The filter consists of 64 pleated cylindrical filter elements packaged into a 610 x 610 x 292 mm aluminum frame and has 13.5 m² of filter area. The filter media consists of a sintered steel fiber mat using 2 μm diameter fibers. We conducted an optimization study for filter efficiency and pressure drop to determine the filter design parameters of pleat width, pleat depth, outside diameter of the cylinder, and the total number of cylinders. Several prototype cylinders were then built and evaluated in terms of filter cleaning by reverse air pulses. The results of these studies were used to build the high efficiency steel filter. We evaluated the prototype filter for efficiency and cleanability. The DOP filter certification test showed the filter has a passing efficiency of 99.99% but a failing pressure drop of 0.80 kPa at 1,700 m³/hr. Since we were not able to achieve a pressure drop less than 0.25 kPa, the steel filter does not meet all the criteria for a HEPA filter. Filter loading and cleaning tests using AC Fine dust showed the filter could be repeatedly cleaned by reverse air pulses. The next phase of the prototype evaluation consisted of installing the unit and support housing in the exhaust duct work of a uranium grit blaster for a field evaluation at the Y-12 Plant in Oak Ridge, TN. The grit blaster is used to clean the surface of uranium parts and generates a cloud of UO₂ aerosols. We used a 1,700 m³/hr slip stream from the 10,200 m³/hr exhaust system

  7. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety-critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  8. The medline UK filter: development and validation of a geographic search filter to retrieve research about the UK from OVID medline.

    Science.gov (United States)

    Ayiku, Lynda; Levay, Paul; Hudson, Tom; Craven, Jenny; Barrett, Elizabeth; Finnegan, Amy; Adams, Rachel

    2017-07-13

    A validated geographic search filter for the retrieval of research about the United Kingdom (UK) from bibliographic databases had not previously been published. To develop and validate a geographic search filter to retrieve research about the UK from OVID medline with high recall and precision. Three gold standard sets of references were generated using the relative recall method. The sets contained references to studies about the UK which had informed National Institute for Health and Care Excellence (NICE) guidance. The first and second sets were used to develop and refine the medline UK filter. The third set was used to validate the filter. Recall, precision and number-needed-to-read (NNR) were calculated using a case study. The validated medline UK filter demonstrated 87.6% relative recall against the third gold standard set. In the case study, the medline UK filter demonstrated 100% recall, 11.4% precision and a NNR of nine. A validated geographic search filter to retrieve research about the UK with high recall and precision has been developed. The medline UK filter can be applied to systematic literature searches in OVID medline for topics with a UK focus. © 2017 Crown copyright. Health Information and Libraries Journal © 2017 Health Libraries Group. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
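The three reported metrics are simple ratios over record counts. In the sketch below the counts are hypothetical, chosen only so that the outputs match the case-study figures quoted in the abstract (100% recall, 11.4% precision, NNR of nine).

```python
def filter_metrics(relevant_retrieved, total_relevant, total_retrieved):
    """Recall, precision and number-needed-to-read (NNR) for a search filter."""
    recall = relevant_retrieved / total_relevant
    precision = relevant_retrieved / total_retrieved
    nnr = total_retrieved / relevant_retrieved  # records read per relevant hit
    return recall, precision, nnr

# Hypothetical counts: all 8 relevant records retrieved among 70 results.
recall, precision, nnr = filter_metrics(8, 8, 70)
```

NNR is simply the reciprocal of precision, which is why a low-precision filter can still be usable when the absolute number of retrieved records is small.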

  9. Development of the quickmix injector for in-situ filter testing

    International Nuclear Information System (INIS)

    Costigan, G.; Loughborough, D.

    1993-01-01

    In-situ filter testing is routinely carried out on nuclear ventilation plant to assess the effectiveness of installed filter systems. Ideally the system is tested by introducing a sub-micron aerosol upstream of the filter in such a way as to present a uniform challenge to the whole of the upstream filter face. Samples are withdrawn from upstream and downstream of the filter, and the respective concentrations are used to calculate the system (or filter) efficiency. These requirements are documented in the Atomic Energy Code of Practice, AECP 1054. The Filter Development Section at Harwell Laboratory has been investigating methods of improving the accuracy and reliability of the in-situ filter test over the past ten years. The programme has included the evaluation of devices used to mix the aerosol and of multi-point samplers to obtain representative aerosol samples. This paper reports the results of laboratory trials on the "QUICKMIX" injector developed and patented by Harwell. The Quickmix injector is designed to mix the test aerosol with the air stream and thereby reduce the duct length required to produce uniform concentrations. The injector has been tested in ducts ranging from 150 mm diameter to 610 mm square, at air velocities up to 26 m/s. Upstream mixing lengths required to achieve a ± 10% concentration variation about the mean were reduced to between 2 and 5 duct diameters, with a very small pressure drop. This simple, compact device is being installed in new and existing plants in the UK to improve the accuracy and reliability of in-situ filter testing. Some examples of plant applications are given, together with some of the first results from operating plant

  10. Development of filters for exhaust air or off-gas cleaning

    International Nuclear Information System (INIS)

    Wilhelm, J.

    1988-01-01

    The activities of the 'Laboratorium fuer Aerosolphysik und Filtertechnik II' of the 'Kernforschungszentrum Karlsruhe' concentrate on the development of filters for cleaning nuclear and conventional exhaust air and off-gas. Originally, these techniques were intended for application in nuclear facilities only; their application to conventional gas purification, however, has led to a reorientation of research and development projects. Examples include the use of the multi-way sorption filter for radioiodine removal in nuclear power plants, for flue-gas purification in heating power plants, and for off-gas cleaning in the chemical industry. The improvement of HEPA filters and the development of metal fibre filters have led to components which can be used at high humidity and moisture as well as at high temperatures and increased differential pressure. The experience gained in high-efficiency filtration of nuclear airborne particles is being applied in investigations of the removal of conventional submicron pollutant particles. A technique for radioiodine removal and improved removal of airborne particles has been developed for use in the future reprocessing plant; thus a maximum removal efficiency can be achieved and optimum waste management is made possible. The components resulting from these activities and their use for off-gas cleaning in the Wackersdorf reprocessing plant (WAW) are described. (orig.) [de]

  11. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

    A Monte Carlo burnup calculation code system has been developed to accurately evaluate the various quantities required in the back-end field. For verification of the code system, analyses were performed using the nuclide compositions measured, under the Actinide Research in a Nuclear Element (ARIANE) program, for fuel rods in assemblies irradiated in a commercial BWR in the Netherlands. The code system developed in this paper has been verified through analyses of MOX and UO2 fuel rods. This system makes it possible to reduce the large margin assumed in the present criticality analysis for LWR spent fuels. (J.P.N.)

  12. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  13. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  14. Developing particulate thin filter using coconut fiber for motor vehicle emission

    Science.gov (United States)

    Wardoyo, A. Y. P.; Juswono, U. P.; Riyanto, S.

    2016-03-01

    The number of motor vehicles in Indonesia has increased sharply from year to year, with the increment reaching 22% per annum. Motor vehicles produce particulate emissions of different sizes at high concentrations, depending on the type of vehicle, fuel, and engine capacity. Motor particle emissions not only contribute significantly to atmospheric particles but are also adverse to human health. A filter is needed to reduce these particle emissions. This study aimed to develop a thin filter using coconut fiber to reduce particulate emissions from motor vehicles. The filter was made of coconut fibers that were ground into powder and mixed with glue. The filter was tested by measuring the particle concentration coming directly out of the vehicle exhaust and the particle concentration after passing through the filter. The efficiency of the filter was calculated from the ratio of the particle concentration entering the filter to the concentration after passing through it. The results showed a filter efficiency of more than 30%. The efficiency increases sharply when a number of the filters are arranged in parallel.
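The efficiency calculation described above can be sketched as follows. The sketch uses the standard single-pass definition η = (C_in − C_out)/C_in, which may differ from the exact ratio the authors used, and adds an idealized multi-stage formula; all concentrations below are fabricated for illustration.

```python
def collection_efficiency(c_in, c_out):
    """Single-pass collection efficiency from upstream/downstream
    particle concentrations."""
    return (c_in - c_out) / c_in

def stacked_efficiency(eta_single, n):
    """Overall efficiency of n identical stages, assuming each stage's
    penetration acts independently (an idealization)."""
    return 1.0 - (1.0 - eta_single) ** n

eta1 = collection_efficiency(1000.0, 700.0)  # 30 % for one filter
eta3 = stacked_efficiency(eta1, 3)           # ~65.7 % for three stages
```

Under the independence assumption, stacking three 30%-efficient elements raises the overall efficiency to about 66%, consistent with the abstract's observation that multiple filters sharply improve efficiency.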

  15. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and influence factors for verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  16. Design, Development, and Automated Verification of an Integrity-Protected Hypervisor

    Science.gov (United States)

    2012-07-16

    also require considerable manual effort. For example, the verification of the seL4 operating system [45] required several man-years of effort. In... Winwood. seL4: formal verification of an OS kernel. In Proc. of SOSP, 2009. [46] K. Kortchinsky. Cloudburst: A VMware guest to host escape story.

  17. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    The analysis becomes more complicated when considering the shape and phase of the ground below the seawater; therefore, different approaches are required to analyze the behavior of tsunami precisely. This paper introduces on-going code development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH), together with its verification against some practice simulations. The newly developed Lagrangian mesh-free SPH code covers the equations of motion and the heat conduction equation so far, and verification of each model is complete. In addition, parallel computation using GPUs is now possible, and a GUI has been prepared. Users can change the input geometry or input values to simulate various conditions and geometries. The SPH method has large advantages and potential for modeling free surfaces, highly deformable geometry, and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be much extended, including to molten fuel behavior in severe accidents.
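In SPH, field quantities are smoothed over neighboring particles by a kernel function. A minimal sketch of the density summation with the common cubic spline kernel is given below; the 2D normalization and the brute-force neighbor loop are one conventional textbook choice, not necessarily what the SNU code uses.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 2D cubic spline SPH kernel W(r, h) with support radius 2h
    and normalization 10 / (7 * pi * h^2)."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    elif q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0  # outside the compact support

def density(positions, masses, h):
    """SPH density summation: rho_i = sum_j m_j * W(|r_i - r_j|, h)."""
    n = len(positions)
    rho = np.zeros(n)
    for i in range(n):
        for j in range(n):
            r = np.linalg.norm(positions[i] - positions[j])
            rho[i] += masses[j] * cubic_spline_kernel(r, h)
    return rho

# Two well-separated particles: each only "sees" itself.
rho = density(np.array([[0.0, 0.0], [10.0, 0.0]]), np.array([1.0, 1.0]), 1.0)
```

In practice the O(n²) double loop is replaced by a neighbor search (cell lists or trees), and the same kernel-weighted summation pattern carries over to the momentum and heat conduction equations.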

  18. DEVELOPMENT OF AG-1 SECTION FI ON METAL MEDIA FILTERS - 9061

    International Nuclear Information System (INIS)

    Adamson, D.; Waggoner, C.A.

    2008-01-01

    Development of a metal media standard (FI) for ASME AG-1 (Code on Nuclear Air and Gas Treatment) has been under way for almost ten years. This paper provides a brief history of the development process of this section and a detailed overview of its current content and status. At least twice, dramatic changes have been made to the scope of the document in response to feedback from the full Committee on Nuclear Air and Gas Treatment (CONAGT). Development of the proposed section has required resolving several difficult issues associated with scope, namely filtering efficiency, operating conditions (media velocity, pressure drop, etc.), qualification testing, and quality control/acceptance testing. A proposed version of Section FI is currently undergoing final revisions prior to being submitted for balloting. The section covers metal media filters with filtering efficiencies ranging from medium (less than 99.97%) to high (99.97% and greater). Two different types of high efficiency filters are addressed: units intended to be direct replacements for Section FC fibrous glass HEPA filters, and units that will be placed into newly designed systems capable of supporting greater static pressures and differential pressures across the filter elements. Direct replacements of FC HEPA filters in existing systems will be required to meet qualification and testing requirements equivalent to those contained in Section FC. A series of qualification and quality assurance test methods has been identified for the range of filtering efficiencies covered by this proposed standard. Performance characteristics of sintered metal powder and sintered metal fiber media differ dramatically with respect to parameters such as differential pressure and rigidity of the media. Wide latitude will be allowed for owner specification of performance criteria for filtration units that will be placed into newly designed systems. Such allowances will permit use of the most

  19. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  20. Development of a computerized portal verification scheme for pelvic treatment fields

    International Nuclear Information System (INIS)

    Nie, K.; Yin, F.-F.; Gao, Q.; Brasacchio, R.

    1996-01-01

    Purpose/Objective: At present, treatment verification between portal and reference images is performed on the basis of features manually identified by the radiation oncologist, which is both time-consuming and potentially error-prone. There is a demand for a computerized verification procedure in clinical application. The purpose of this study is to develop a computerized portal verification scheme for pelvic treatment fields. Materials/Methods: The automated verification system involves image acquisition, image feature extraction, feature matching between reference and portal images, and quantitative evaluation of patient setup. Electronic portal images with a matrix size of 256 x 256 and 12-bit gray levels were acquired using a liquid-matrix electronic portal imaging device. Simulation images were acquired by digitizing simulation films with a TV camera into images with a 256 x 256 matrix and 8-bit gray levels. Initially, a Canny edge detector is applied to identify the field edges, and an elliptic Fourier transformation is used to correlate the size and shape information between the reference and portal field edges. Several measures describing the field shape, size and orientation can be calculated from the transformation coefficients. The quantitative information regarding the relative shifts, rotation and magnification factor between portal and reference field edges can then be determined from these measures. Next, the pelvic brim, which is typically used as the landmark for radiation treatment verification, is identified by a pyramid search process with double snakes, proceeding from an initial global area to a final local area. A snake is an active, energy-minimizing spline model guided by external constraint forces and influenced by image forces that pull it toward features such as lines and edges. The search range is limited to the region between the two snakes. A Sobel edge detector and a wavelet transformation approach are used to generate a series of image forces at
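The shift and magnification step can be illustrated with a much simpler moment-based comparison of binary field masks. This is a stand-in for the elliptic Fourier approach in the abstract (which also recovers rotation and shape measures); the masks and numbers below are fabricated for illustration.

```python
import numpy as np

def field_shift_and_scale(ref_mask, portal_mask):
    """Estimate translation and magnification between reference and portal
    field apertures from binary masks, using zeroth/first image moments."""
    def centroid_and_area(mask):
        ys, xs = np.nonzero(mask)
        return np.array([xs.mean(), ys.mean()]), len(xs)
    c_ref, a_ref = centroid_and_area(ref_mask)
    c_por, a_por = centroid_and_area(portal_mask)
    shift = c_por - c_ref                   # centroid displacement (x, y) in pixels
    magnification = np.sqrt(a_por / a_ref)  # linear scale factor from area ratio
    return shift, magnification

# Reference: a 20x20 square field. Portal: a 30x30 square, displaced.
ref = np.zeros((256, 256), bool); ref[100:120, 100:120] = True
por = np.zeros((256, 256), bool); por[103:133, 105:135] = True
shift, mag = field_shift_and_scale(ref, por)
```

Here the centroid displacement gives the setup shift and the square root of the area ratio gives the magnification; elliptic Fourier descriptors extend the same idea to irregular field shapes and additionally yield the rotation.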

  1. Filter testing and development for prolonged transuranic service and waste reduction

    International Nuclear Information System (INIS)

    Geer, J.A.; Buttedahl, O.I.; Skaats, C.D.; Terada, K.; Woodard, R.W.

    1977-02-01

    The life of High Efficiency Particulate Air (HEPA) filters used in transuranic service is influenced greatly by the gaseous and particulate matter to which the filters are exposed. The most severe conditions encountered at Rocky Flats are at the ventilation systems serving the plutonium recovery operations in Bldg. 771. A project of filter testing and development for prolonged transuranic service and waste reduction was formally initiated at Rocky Flats on July 1, 1975. The project is directed toward improving filtration methods which will prolong the life of HEPA filter systems without sacrificing effectiveness. Another important aspect of the project is to reduce the volume of HEPA filter waste shipped from the plant for long-term storage. Progress to September 30, 1976, is reported

  2. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    AFRL-RV-PS-TR-2018-0008. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Contract number: FA9453-15-1-0315. Program element number: 62601F. Author(s): Norman Fitz-Coy. Project number: 4846. Task number: PPM00015968. Work unit number: EF125135.

  3. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of an a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: reason(s) why a posteriori verification is to be performed; scope and objectives for the level of verification selected; development products to be used for the review; availability and use of user experience; and actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  4. The development of advanced instrumentation and control technology -The development of verification and validation technology for instrumentation and control in NPPs-

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Ham, Chang Sik; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Lee, Jang Soo; Um, Heung Sub; Kim, Jang Yul; Ryoo, Chan Hoh; Joo, Jae Yoon; Song, Soon Ja

    1995-07-01

We collected and analyzed domestic and international codes, standards, and guidelines in order to develop a highly reliable software verification and validation methodology suited to our actual situation. The work comprises three major parts: construction of the framework for a highly reliable software development environment, establishment of a highly reliable software development methodology, and study of the basic technology related to safety-critical software. These three parts are tightly coupled to each other to achieve self-reliant software verification and validation technology for digital I and C in NPPs. The configuration of the hardware and software was partly carried out using the requirements developed in the first stage for the development of the I and C test facility. On the hardware side, an expanded interface using the VXI bus and its driving software were completed. The main program for math and modelling and the supervisor program for instructions were developed. 27 figs, 22 tabs, 69 refs. (Author)

  6. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  7. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as smooth as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal; e.g., criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places are considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. The crucial questions are which offers the best practical possibilities to perform the measurements effectively and which is the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  8. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  9. Development and Implementation of Cgcre Accreditation Program for Greenhouse Gas Verification Bodies

    International Nuclear Information System (INIS)

    Fermam, Ricardo Kropf Santos; De Queiroz, Andrea Barroso Melo Monteiro

    2016-01-01

An organizational innovation is defined as the implementation of a new organizational method in a firm's business practices, in the organization of its workplace, or in its external relations. This work illustrates a Cgcre innovation by presenting the development process of a greenhouse gas verification body in Brazil under the Brazilian accreditation body, the General Coordination for Accreditation (Cgcre). (paper)

  10. Development of acid-resistant HEPA filter components

    International Nuclear Information System (INIS)

    Terada, K.; Woodard, R.W.; Buttedahl, O.I.

    1981-01-01

Laboratory and in-service tests of various HEPA filter media and separators were conducted to establish their relative resistances to HNO3-HF vapors. Filter medium of glass fiber with a Nomex additive and aluminum separators with an epoxy-vinyl coating performed quite well in the acid environment in the laboratory and in prototype filters placed in service in a plenum at Rocky Flats. Proprietary filters with new designs and/or components were also tested in service, with generally good results.

  11. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  12. The development rainfall forecasting using kalman filter

    Science.gov (United States)

    Zulfi, Mohammad; Hasan, Moh.; Dwidja Purnomo, Kosala

    2018-04-01

Rainfall forecasting is very interesting for agricultural planning. Rainfall information is useful for making decisions about plans for planting certain commodities. In this study, rainfall is forecast with the ARIMA and Kalman filter methods. The Kalman filter method expresses a time series model in linear state-space form to determine future forecasts, and it uses a recursive solution to minimize error. The rainfall data in this research were clustered by K-means clustering, and the Kalman filter method was implemented to model and forecast rainfall in each cluster. We used ARIMA(p,d,q) to construct a state space for the Kalman filter model, so we have four groups of data and one model in each group. In conclusion, the Kalman filter method is better than the ARIMA model for rainfall forecasting in each group, as the error of the Kalman filter method is smaller than that of the ARIMA model.
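The recursive predict/correct cycle described in the abstract can be sketched with a minimal one-dimensional (local-level) Kalman filter. This is a stand-in for the ARIMA-derived state space used in the paper; the variances `q` and `r` and the rainfall values are hypothetical:

```python
# Minimal 1-D Kalman filter (local-level model), a hedged sketch of the
# recursive forecasting scheme described in the abstract. The process and
# measurement variances q, r and the rainfall series are illustrative only.

def kalman_forecast(y, q=1.0, r=4.0):
    x, p = y[0], 1.0          # initial state estimate and its variance
    for z in y[1:]:
        # predict step: local-level model, state transition = identity
        p += q
        # update step: fold in the new observation, minimizing error recursively
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)      # corrected state estimate
        p *= (1.0 - k)        # corrected variance
    return x                  # one-step-ahead forecast = last filtered level

rain = [120.0, 95.0, 143.0, 110.0, 128.0]   # hypothetical monthly rainfall (mm)
print(round(kalman_forecast(rain), 2))
```

The same recursion generalizes to the vector case used for a full ARIMA(p,d,q) state space, with matrices replacing the scalars.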

  13. SWAAM code development, verification and application to steam generator design

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes developed by Argonne National Laboratory to analyze the effects of sodium/water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and to predict the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The theoretical foundations and numerical treatments on which the codes are based are discussed, followed by a description of code capabilities and limitations, verification of the codes by comparison with experiment, and applications to steam generator and IHTS design. (author). 25 refs, 14 figs

  14. Estimation of aircraft aerodynamic derivatives using Extended Kalman Filter

    OpenAIRE

    Curvo, M.

    2000-01-01

    Design of flight control laws, verification of performance predictions, and the implementation of flight simulations are tasks that require a mathematical model of the aircraft dynamics. The dynamical models are characterized by coefficients (aerodynamic derivatives) whose values must be determined from flight tests. This work outlines the use of the Extended Kalman Filter (EKF) in obtaining the aerodynamic derivatives of an aircraft. The EKF shows several advantages over the more traditional...
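A common way to obtain such coefficients with the EKF is to append the unknown derivative to the state vector and estimate it jointly with the state. The sketch below does this for a hypothetical one-coefficient model x' = θx; the dynamics, noise levels, and true value θ = -0.5 are illustrative assumptions, not the aircraft model of the paper:

```python
import numpy as np

# Hedged sketch of EKF parameter estimation: the unknown coefficient theta
# (a stand-in for one aerodynamic derivative) is appended to the state and
# estimated jointly. Dynamics and noise levels are illustrative assumptions.

def ekf_estimate(ys, dt=0.1, q=1e-6, r=1e-2):
    s = np.array([ys[0], 0.0])                  # augmented state [x, theta]
    P = np.diag([r, 1.0])                       # initial covariance
    Q = np.diag([q, q])                         # process noise
    H = np.array([[1.0, 0.0]])                  # only x is measured
    for y in ys[1:]:
        x, th = s
        s = np.array([x + dt * th * x, th])     # Euler propagation, theta constant
        F = np.array([[1.0 + dt * th, dt * x],  # Jacobian of the propagation
                      [0.0, 1.0]])
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                     # innovation covariance
        K = P @ H.T / S                         # Kalman gain
        s = s + (K * (y - s[0])).ravel()        # measurement update
        P = (np.eye(2) - K @ H) @ P
    return s[1]                                 # estimated coefficient

ys = np.exp(-0.5 * np.arange(0.0, 10.0, 0.1))  # noise-free response, theta = -0.5
print(ekf_estimate(ys))
```

With noise-free data the estimate settles near the discretized value of θ; with real flight-test noise the same recursion applies, with `r` matched to the sensors.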

  15. Development of active porous medium filters based on plasma textiles

    International Nuclear Information System (INIS)

    Kuznetsov, Ivan A.; Saveliev, Alexei V.; Rasipuram, Srinivasan; Kuznetsov, Andrey V.; Brown, Alan; Jasper, Warren

    2012-01-01

Inexpensive, flexible, washable, and durable materials that serve as antimicrobial filters and self-decontaminating fabrics are needed to provide active protection to people in areas regularly exposed to various biohazards, such as hospitals and bio research labs working with pathogens. Airlines and cruise lines need such material to combat the spread of infections. In households these materials can be used in HVAC filters to fight indoor pollution, which is especially dangerous to people suffering from asthma. Efficient filtering materials are also required in areas contaminated by other types of hazardous dust particulates, such as nuclear dust. The primary idea that guided the undertaken study is that a microplasma-generating structure can be embedded in a textile fabric to generate a plasma sheath ("plasma shield") that kills bacterial agents coming in contact with the fabric. The research resulted in the development of a plasma textile that can be used for producing new types of self-decontaminating garments, fabrics, and filter materials, capable of activating a plasma sheath that would filter, capture, and destroy any bacteriological agent deposited on its surface. This new material relies on the unique antimicrobial and catalytic properties of cold (room temperature) plasma that is benign to people and does not cause thermal damage to many polymer textiles, such as Nomex and polypropylene. The uniqueness of cold plasma as a disinfecting agent lies in the inability of bacteria to develop resistance to plasma exposure, as they can for antibiotics. Plasma textiles could thus be utilized for microbial destruction in active antimicrobial filters (for continuous decontamination and disinfection of large amounts of air) as well as in self-decontaminating surfaces and antibacterial barriers (for example, for creating local antiseptic or sterile environments around wounds and burns).

  18. Design and development of laser eye protection filter

    International Nuclear Information System (INIS)

    Ahmed, K; Khan, A N; Rauf, A; Gul, A; Aslam, M

    2013-01-01

Laser-based devices have long been used for horizontal and vertical distance measurement in the avionics and surveillance industries. These devices operate with a pulsed Nd:YAG laser at 1064 nm, a wavelength that raises the risk of eye exposure for personnel to unexpected levels. In this paper, eye-protection filters for the 1064 nm wavelength were developed with soft (ZnS) and hard (TiO2) coating materials using the thin-film vacuum coating technique. The damage threshold of the filter is 0.2 J/cm2. Transmission characteristics are measured and discussed. The optical damage threshold for the eye (5 × 10−6 J/cm2) at various distances is also simulated.
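Taking the two fluences quoted above at face value, the attenuation such a filter must provide can be estimated with the standard optical-density relation OD = log10(H_in / H_safe). Treating the filter damage threshold as the incident fluence and the eye damage threshold as the safe fluence is our illustrative assumption:

```python
import math

# Back-of-envelope check using the fluences quoted in the abstract: the
# optical density a 1064 nm filter must provide to attenuate an incident
# fluence H_in below a safe fluence H_safe is OD = log10(H_in / H_safe).
# Using 0.2 J/cm^2 (filter damage threshold) as H_in and 5e-6 J/cm^2
# (eye damage threshold) as H_safe is an illustrative assumption.

def required_od(h_in, h_safe):
    return math.log10(h_in / h_safe)

od = required_od(0.2, 5e-6)
print(f"required optical density at 1064 nm ≈ {od:.1f}")  # ≈ 4.6
```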

  19. Progress on the development of NbZr Radio frequency band reject filters

    International Nuclear Information System (INIS)

    Hudak, J.J.; Alper, M.; Cotte, D.; Gardner, C.G.; Harvey, A.

    1983-01-01

This chapter reports on the design and testing of a tunable superconducting filter element fabricated from Nb25%Zr, having a transition temperature of 11 K. The filter element will serve as a component in a multielement filter bank to be cooled to less than 10 K by a two-stage Gifford-McMahon refrigerator. A radio frequency (RF) interference rejection system composed of a set of tunable superconducting filter elements is being developed to supplement conventional interference rejection techniques. The thermal loading performance of the 8.5 K Gifford-McMahon refrigerator is found to exceed 2 watts at 10 K on the second stage with a 10 watt load on the first stage. A superconducting filter bank consisting of tunable narrow-band RF filters applied to strong interfering signals can be used to match the dynamic range of the RF signal environment to that of the receiving system.

  20. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

This is the '94 annual report of the CANDU advanced fuel verification test project. This report describes the out-of-pile hydraulic tests at the CANDU hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows: (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) pressure drop tests at reactor operating conditions, (b) strength test during reload under static load, (c) impact test during reload under impact load, (d) endurance test for verification of fuel integrity over the lifetime. (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) critical channel power sensitivity analysis, (b) CANDU-6 channel flow analysis, (c) flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author).

  2. Autonomic networking-on-chip bio-inspired specification, development, and verification

    CERN Document Server

    Cong-Vinh, Phan

    2011-01-01

Despite the growing mainstream importance and unique advantages of autonomic networking-on-chip (ANoC) technology, Autonomic Networking-On-Chip: Bio-Inspired Specification, Development, and Verification is among the first books to evaluate research results on formalizing this emerging NoC paradigm, which was inspired by the human nervous system. The FIRST Book to Assess Research Results, Opportunities, & Trends in "BioChipNets". The third book in the Embedded Multi-Core Systems series from CRC Press, this is an advanced technical guide and reference composed of contributions from prominent re...

  3. Suppression of narrow-band interference in a PN spread-spectrum receiver using a CTD-based adaptive filter

    Science.gov (United States)

    Saulnier, G. J.; Das, P.; Milstein, L. B.

    1984-11-01

Analytical results have shown that adaptive filtering can be a powerful tool for the rejection of narrow-band interference in a spread-spectrum receiver. However, the complexity of adaptive filtering hardware has hindered the experimental verification of these results. This paper describes a new adaptive filter architecture for implementing the Widrow-Hoff LMS algorithm while using only two multipliers regardless of filter order. This hardware simplification is achieved through the use of a burst processing technique. A 16-tap version of this adaptive filter, constructed using charge-transfer devices (CTDs), is used to suppress a single-tone jammer in a direct-sequence spread-spectrum receiver. Probability-of-error measurements demonstrating the effectiveness of the adaptive filter for suppressing the single-tone jammer, along with simulation results for the optimal Wiener-Hopf filter, are presented and discussed.
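The Widrow-Hoff LMS algorithm named above can be sketched as an adaptive line enhancer: an FIR predictor driven by a delayed copy of the received signal locks onto the predictable narrow-band jammer, and subtracting its prediction leaves the wide-band PN signal. The tap count, step size, and signals below are illustrative, not the paper's CTD hardware parameters:

```python
import math, random

# Hedged sketch of the Widrow-Hoff LMS algorithm as an adaptive line enhancer
# for narrow-band interference suppression. All parameters are illustrative.

def lms_notch(x, taps=16, mu=0.001, delay=1):
    w = [0.0] * taps
    out = []
    for n in range(len(x)):
        ref = [x[n - delay - k] if n - delay - k >= 0 else 0.0
               for k in range(taps)]                          # delayed reference
        y = sum(wi * ri for wi, ri in zip(w, ref))            # jammer estimate
        e = x[n] - y                                          # de-jammed residual
        w = [wi + 2 * mu * e * ri for wi, ri in zip(w, ref)]  # Widrow-Hoff update
        out.append(e)
    return out

random.seed(0)
pn = [random.choice([-1.0, 1.0]) for _ in range(4000)]           # PN chips
jam = [5.0 * math.sin(0.3 * math.pi * n) for n in range(4000)]   # tone jammer
rx = [p + j for p, j in zip(pn, jam)]
e = lms_notch(rx)
p_in = sum(v * v for v in rx[2000:]) / 2000     # power before filtering
p_out = sum(v * v for v in e[2000:]) / 2000     # residual power after convergence
print(round(p_in, 1), round(p_out, 1))
```

Because the PN chips are unpredictable from past samples while the tone is not, the predictor cancels only the jammer; the residual power approaches the PN power plus a small misadjustment term.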

  4. SWAAM-code development and verification and application to steam generator designs

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes which were developed by Argonne National Laboratory to analyze the effects of sodium-water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The paper discusses the theoretical foundations and numerical treatments on which the codes are based, followed by a description of code capabilities and limitations, verification of the codes and applications to steam generator and IHTS designs. 25 refs., 14 figs

  5. Development, characterization, and modeling of a tunable filter camera

    Science.gov (United States)

    Sartor, Mark Alan

    1999-10-01

This paper describes the development, characterization, and modeling of a Tunable Filter Camera (TFC). The TFC is a new multispectral instrument with electronically tuned spectral filtering and low-light-level sensitivity. It represents a hybrid between hyperspectral and multispectral imaging spectrometers that incorporates advantages from each, addressing issues such as complexity, cost, lack of sensitivity, and adaptability. These capabilities allow the TFC to be applied to low-altitude video surveillance for real-time spectral and spatial target detection and image exploitation. Described herein are the theory and principles of operation for the TFC, which includes a liquid crystal tunable filter, an intensified CCD, and a custom apochromatic lens. The results of proof-of-concept testing and the characterization of two prototype cameras are included, along with a summary of the design analyses for the development of a multiple-channel system. A significant result of this effort was the creation of a system-level model, which was used to facilitate development and predict performance. It includes models for the liquid crystal tunable filter and intensified CCD. Such modeling was necessary in the design of the system and is useful for evaluation of the system in remote-sensing applications. Also presented are characterization data from component testing, which included quantitative results for linearity, signal-to-noise ratio (SNR), and radiometric response. These data were used to help refine and validate the model. For a pre-defined source, the spatial and spectral response and the noise of the camera system can now be predicted. The innovation that sets this development apart is the fact that this instrument has been designed for integrated, multi-channel operation for the express purpose of real-time detection/identification in low-light-level conditions. Many of the requirements for the TFC were derived from this mission. In order to provide

  6. A Cherenkov viewing device for used-fuel verification

    International Nuclear Information System (INIS)

    Attas, E.M.; Chen, J.D.; Young, G.J.

    1990-01-01

A Cherenkov viewing device (CVD) has been developed to help verify declared inventories of used nuclear fuel stored in water bays. The device detects and amplifies the faint ultraviolet Cherenkov glow from the water surrounding the fuel, producing a real-time visible image on a phosphor screen. Quartz optics, a UV-pass filter and a microchannel-plate image-intensifier tube serve to form the image, which can be photographed or viewed directly through an eyepiece. Normal fuel bay lighting does not interfere with the Cherenkov light image. The CVD has been successfully used to detect anomalous PWR, BWR and CANDU (CANada Deuterium Uranium: registered trademark) fuel assemblies in the presence of normal-burnup assemblies stored in used-fuel bays. The latest version of the CVD, known as Mark IV, is being used by inspectors from the International Atomic Energy Agency for verification of light-water power-reactor fuel. Its design and operation are described, together with plans for further enhancements of the instrumentation. (orig.)

  7. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  8. An Investigation into Solution Verification for CFD-DEM

    Energy Technology Data Exchange (ETDEWEB)

    Fullmer, William D. [National Energy Technology Lab. (NETL), AECOM, Morgantown, WV (United States); Musser, Jordan [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2017-10-01

This report presents a study of the convergence behavior of the computational fluid dynamics-discrete element method (CFD-DEM), specifically the National Energy Technology Laboratory's (NETL) open source MFiX code (MFiX-DEM) with a diffusion-based particle-to-continuum filtering scheme. In particular, this study focused on determining whether the numerical method has a solution in the high-resolution limit where the grid size is smaller than the particle size. To address this uncertainty, fixed particle beds of two primary configurations were studied: i) fictitious beds where the particles are seeded with a random particle generator, and ii) instantaneous snapshots from a transient simulation of an experimentally relevant problem. Both problems considered a uniform inlet boundary and a pressure outflow. The CFD grid was refined from a few particle diameters down to 1/6th of a particle diameter. The pressure drop between two vertical elevations, averaged across the bed cross-section, was considered as the system response quantity of interest. A least-squares regression method was used to extrapolate the grid-dependent results to an approximate “grid-free” solution in the limit of infinite resolution. The results show that the diffusion-based scheme does yield a converging solution. However, the convergence is more complicated than encountered in simpler, single-phase flow problems, showing strong oscillations and, at times, oscillations superimposed on top of globally non-monotonic behavior. The challenging convergence behavior highlights the importance of using at least four grid resolutions in solution verification problems so that (over-determined) regression-based extrapolation methods may be applied to approximate the grid-free solution. The grid-free solution is very important in solution verification and VVUQ exercises in general, as the difference between it and the reference solution largely determines the numerical uncertainty. By testing
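The regression-based extrapolation described above can be sketched as an over-determined least-squares fit of f(h) = f0 + a·h^p to the grid-dependent results, with f0 taken as the approximate "grid-free" solution. The data below are synthetic placeholders, not the report's pressure-drop results:

```python
import numpy as np

# Hedged sketch of regression-based extrapolation to a "grid-free" solution:
# fit f(h) = f0 + a*h^p to results f_i at grid sizes h_i and report f0.
# The grid sizes and responses are synthetic, with exact order p = 2.

def grid_free(h, f):
    h, f = np.asarray(h), np.asarray(f)
    best = None
    for p in np.arange(0.2, 4.01, 0.01):        # scan the apparent order p
        A = np.column_stack([np.ones_like(h), h ** p])
        coef, *_ = np.linalg.lstsq(A, f, rcond=None)
        r = float(np.sum((A @ coef - f) ** 2))  # residual for this candidate p
        if best is None or r < best[0]:
            best = (r, coef[0], p)
    return best[1], best[2]                     # extrapolated f0, apparent order

# at least four resolutions, as the report recommends
h = [1.0, 0.5, 0.25, 0.125]
f = [3.0 + 0.8 * hh ** 2 for hh in h]           # pretend grid-dependent results
f0, p = grid_free(h, f)
print(round(f0, 3), round(p, 2))
```

With four or more resolutions the fit is over-determined, so the residual also indicates how well the assumed power-law model describes the observed convergence.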

  9. H-/H∞ structural damage detection filter design using an iterative linear matrix inequality approach

    International Nuclear Information System (INIS)

    Chen, B; Nagarajaiah, S

    2008-01-01

    The existence of damage in different members of a structure can be posed as a fault detection problem. It is also necessary to isolate structural members in which damage exists, which can be posed as a fault isolation problem. It is also important to detect the time instants of occurrence of the faults/damage. The structural damage detection filter developed in this paper is a model-based fault detection and isolation (FDI) observer suitable for detecting and isolating structural damage. In systems, possible faults, disturbances and noise are coupled together. When system disturbances and sensor noise cannot be decoupled from faults/damage, the detection filter needs to be designed to be robust to disturbances as well as sensitive to faults/damage. In this paper, a new H-/H∞ and iterative linear matrix inequality (LMI) technique is developed and a new stabilizing FDI filter is proposed, which bounds the H∞ norm of the transfer function from disturbances to the output residual and simultaneously does not degrade the component of the output residual due to damage. The reduced-order error dynamic system is adopted to form bilinear matrix inequalities (BMIs), then an iterative LMI algorithm is developed to solve the BMIs. The numerical example and experimental verification demonstrate that the proposed algorithm can successfully detect and isolate structural damage in the presence of measurement noise
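
    As a much-simplified illustration of observer-based residual generation (a plain Luenberger observer designed by pole placement, not the paper's H-/H∞ iterative LMI design; the model matrices and fault signal are invented):

```python
import numpy as np
from scipy.signal import place_poles

# discrete-time structural model with an additive fault entering state 1
A = np.array([[0.9, 0.2], [-0.1, 0.8]])
C = np.array([[1.0, 0.0]])
# observer gain from pole placement on the dual system (A^T, C^T)
L = place_poles(A.T, C.T, [0.3, 0.4]).gain_matrix.T

x, xhat = np.zeros(2), np.zeros(2)
residual = []
for k in range(120):
    y = (C @ x).item()
    r = y - (C @ xhat).item()        # output residual
    residual.append(abs(r))
    xhat = A @ xhat + L.ravel() * r  # observer update
    fault = np.array([0.5, 0.0]) if k >= 50 else np.zeros(2)
    x = A @ x + fault                # plant update (no input, fault only)

print(max(residual[:51]), residual[51])  # zero before the fault, nonzero after
```

    The residual stays at zero until the fault appears and then jumps, which is the basic detection mechanism; the paper's contribution is making that residual simultaneously insensitive to disturbances and sensitive to damage.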

  10. Further development and verification of the calculating programme Felix for the simulation of criticality excursions

    International Nuclear Information System (INIS)

    Weber, J.; Denk, W.

    1985-01-01

    An improved version of the FELIX programme was applied to verify excursion experiments 01, 03 through 07, and 13. Agreement between the experiments and the calculations was good. Aspects of the programme requiring further development are identified. (orig.) [de

  11. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Samuel, D [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Testa, M; Park, Y [Massachusetts General Hospital, Boston, MA (United States); Schneider, R; Moteabbed, M [General Hospital, Boston, MA (United States); Janssens, G; Prieels, D [Ion Beam Applications, Louvain-la-neuve, Brabant Wallon (Belgium); Orban de Xivry, J [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Lu, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Bentefour, E

    2014-06-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvements. In-vivo beam range verification, in particular, could enable new treatment techniques; one of these, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals produced by very low proton beam currents. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system for in-vivo dosimetry and for range verification was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: The in-vivo dose measurements agreed with TLD and EBT film results and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm in the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.

  12. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    International Nuclear Information System (INIS)

    Samuel, D; Testa, M; Park, Y; Schneider, R; Moteabbed, M; Janssens, G; Prieels, D; Orban de Xivry, J; Lu, H; Bentefour, E

    2014-01-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvements. In-vivo beam range verification, in particular, could enable new treatment techniques; one of these, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals produced by very low proton beam currents. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system for in-vivo dosimetry and for range verification was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: The in-vivo dose measurements agreed with TLD and EBT film results and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm in the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients

  13. Determining the Accuracy of Crowdsourced Tweet Verification for Auroral Research

    Directory of Open Access Journals (Sweden)

    Nathan A. Case

    2016-12-01

    Full Text Available The Aurorasaurus project harnesses volunteer crowdsourcing to identify sightings of an aurora (the “northern/southern lights”) posted by citizen scientists on Twitter. Previous studies have demonstrated that aurora sightings can be mined from Twitter with the caveat that there is a large background level of non-sighting tweets, especially during periods of low auroral activity. Aurorasaurus attempts to mitigate this, and thus increase the quality of its Twitter sighting data, by using volunteers to sift through a pre-filtered list of geolocated tweets to verify real-time aurora sightings. In this study, the current implementation of this crowdsourced verification system, including the process of geolocating tweets, is described and its accuracy (which, overall, is found to be 68.4%) is determined. The findings suggest that citizen science volunteers are able to accurately filter out unrelated, spam-like, Twitter data but struggle when filtering out somewhat related, yet undesired, data. The citizen scientists particularly struggle with determining the real-time nature of the sightings, so care must be taken when relying on crowdsourced identification.
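
    The headline accuracy is simply the fraction of volunteer verdicts that agree with expert ground truth; a toy sketch with invented labels:

```python
# volunteer verdicts vs. expert ground truth for candidate aurora tweets
# (invented labels for illustration only)
records = [
    ("aurora", "aurora"), ("aurora", "not-aurora"), ("not-aurora", "not-aurora"),
    ("not-aurora", "not-aurora"), ("aurora", "aurora"), ("not-aurora", "aurora"),
]
correct = sum(verdict == truth for verdict, truth in records)
accuracy = correct / len(records)
print(f"accuracy = {accuracy:.1%}")  # 4 of 6 correct
```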

  14. Development of anti-debris filter for WWER-440 working fuel assembly

    International Nuclear Information System (INIS)

    Kolosovsky, V.; Aksyonov, P.; Kukushkin, Y.; Molchanov, V.; Kolobaev, A.

    2006-01-01

    Mechanical damaging of the fuel rod claddings caused by debris is one of the main reasons for fuel assembly failures. The paper focuses on the program and results of experimental and design activities carried out by Russian organizations relating to the development and investigation of operational characteristics of anti-debris filters for WWER-440 working fuel assemblies. Lead working fuel assemblies equipped with anti-debris filters have been loaded in the core of Kola-2 NPP. The results obtained can be used for making the decision concerning the application of anti-debris filter for WWER-440 working fuel assemblies with the purpose of enhancing their debris-resistance properties. (authors)

  15. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  16. Visualization of Instrumental Verification Information Details (VIVID): code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code is described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
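
    The core Richardson-extrapolation calculation that such a verification code automates can be sketched for three grids refined by a constant ratio r (a textbook illustration, not the VIVID implementation):

```python
import math

def richardson(f1, f2, f3, r):
    """Observed order of accuracy p and Richardson-extrapolated value from
    three solutions: f1 on the finest grid, f2 on a grid r times coarser,
    and f3 on a grid r**2 times coarser."""
    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)
    f_extrap = f1 + (f1 - f2) / (r**p - 1)   # estimate of the exact solution
    return p, f_extrap

# manufactured example: exact value 1.0, second-order error on grids h, 2h, 4h
p, f_extrap = richardson(1.0 + 0.01, 1.0 + 0.04, 1.0 + 0.16, r=2.0)
print(round(p, 6), round(f_extrap, 6))   # ≈ 2.0 and ≈ 1.0
```

    The gap between f1 and f_extrap then serves as the discretization-error estimate for the finest-grid solution.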

  17. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  18. Rectifier Filters

    Directory of Open Access Journals (Sweden)

    Y. A. Bladyko

    2010-01-01

    Full Text Available The paper contains a definition of a smoothing factor that is suitable for any rectifier filter. Formulae for complex smoothing factors have been developed for simple and complex passive filters. The paper states the conditions under which the calculation formulae and filters are applicable
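
    As a concrete illustration of a smoothing factor, the standard textbook result for a single passive LC section is q ≈ ω²LC − 1 (this specific formula and the component values are assumptions for illustration, not necessarily the paper's generalized definition):

```python
import math

def lc_smoothing_factor(L, C, f_ripple):
    """Smoothing factor q = (input ripple factor)/(output ripple factor) of a
    single passive LC section: q ≈ w**2 * L * C - 1, valid when the load
    impedance is much larger than the capacitor reactance 1/(w*C)."""
    w = 2.0 * math.pi * f_ripple
    return w**2 * L * C - 1.0

# 100 Hz ripple (full-wave rectified 50 Hz mains), L = 5 H, C = 100 uF
q = lc_smoothing_factor(5.0, 100e-6, 100.0)
print(round(q, 1))   # ≈ 196.4
```

    For cascaded, mutually buffered sections the individual smoothing factors approximately multiply, which is why complex filters call for the composite formulae the paper develops.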

  19. Development of an elution device for ViroCap virus filters.

    Science.gov (United States)

    Fagnant, Christine Susan; Toles, Matthew; Zhou, Nicolette Angela; Powell, Jacob; Adolphsen, John; Guan, Yifei; Ockerman, Byron; Shirai, Jeffry Hiroshi; Boyle, David S; Novosselov, Igor; Meschke, John Scott

    2017-10-19

    Environmental surveillance of waterborne pathogens is vital for monitoring the spread of diseases, and electropositive filters are frequently used for sampling wastewater and wastewater-impacted surface water. Viruses adsorbed to electropositive filters require elution prior to detection or quantification. Elution is typically facilitated by a peristaltic pump, although this requires a significant startup cost and does not include biosafety or cross-contamination considerations. These factors may pose a barrier for low-resource laboratories that aim to conduct environmental surveillance of viruses. The objective of this study was to develop a biologically enclosed, manually powered, low-cost device for effectively eluting viruses from electropositive ViroCap™ virus filters. The elution device described here utilizes a non-electric bilge pump, instead of an electric peristaltic pump or a positive pressure vessel. The elution device also fully encloses liquids and aerosols that could contain biological organisms, thereby increasing biosafety. Moreover, all elution device components that are used in the biosafety cabinet are autoclavable, reducing cross-contamination potential. This device reduces costs of materials while maintaining convenience in terms of size and weight. With this new device, there is little sample volume loss due to device inefficiency, similar virus yields were demonstrated during seeded studies with poliovirus type 1, and the time to elute filters is similar to that required with the peristaltic pump. The efforts described here resulted in a novel, low-cost, manually powered elution device that can facilitate environmental surveillance of pathogens through effective virus recovery from ViroCap filters while maintaining the potential for adaptability to other cartridge filters.

  20. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  1. Status on development and verification of reactivity initiated accident analysis code for PWR (NODAL3)

    International Nuclear Information System (INIS)

    Peng Hong Liem; Surian Pinem; Tagor Malem Sembiring; Tran Hoai Nam

    2015-01-01

    A coupled neutronics thermal-hydraulics code NODAL3 has been developed based on the nodal few-group neutron diffusion theory in 3-dimensional Cartesian geometry for typical pressurized water reactor (PWR) static and transient analyses, especially for reactivity initiated accidents (RIA). The spatial variables are treated by using a polynomial nodal method (PNM), while for the neutron dynamics solver the adiabatic and improved quasi-static methods are adopted. A simple single-channel thermal-hydraulics module and its steam table are implemented into the code. Verification work on static and transient benchmarks was conducted to assess the accuracy of the code. For the static benchmark verification, the IAEA-2D, IAEA-3D, BIBLIS and KOEBERG light water reactor (LWR) benchmark problems were selected, while for the transient benchmark verification, the OECD NEACRP 3-D LWR Core Transient Benchmark and the NEA-NSC 3-D/1-D PWR Core Transient Benchmark (Uncontrolled Withdrawal of Control Rods at Zero Power) were used. Excellent agreement of the NODAL3 results with the reference solutions and other validated nodal codes was confirmed. (author)
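
    The static k-eigenvalue problem that such benchmarks exercise can be illustrated with a toy one-group, one-dimensional finite-difference solver and power iteration (a drastic simplification of NODAL3's polynomial nodal method; the cross-sections below are invented):

```python
import numpy as np

# one-group, 1-D slab reactor: -D*phi'' + Sa*phi = (1/k)*nuSf*phi, phi = 0 at walls
D, Sa, nuSf, a = 1.0, 0.07, 0.08, 100.0    # invented data; slab width a in cm
N = 200                                     # interior mesh points
h = a / (N + 1)
main = np.full(N, 2.0 * D / h**2 + Sa)      # finite-difference loss operator
off = np.full(N - 1, -D / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

phi, k = np.ones(N), 1.0
for _ in range(500):                        # power iteration on A*phi = (1/k)*F*phi
    phi_new = np.linalg.solve(A, nuSf * phi / k)
    k *= phi_new.sum() / phi.sum()
    phi = phi_new / np.linalg.norm(phi_new)

k_exact = nuSf / (Sa + D * (np.pi / a)**2)  # analytic bare-slab eigenvalue
print(round(k, 5), round(k_exact, 5))       # both ≈ 1.12697
```

    Production codes solve the same eigenvalue problem with multiple energy groups, 3-D nodal spatial coupling and thermal-hydraulic feedback, but the fixed-point structure is the same.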

  2. The measurement of X-rays radiation temperature with a new developed filter-fluorescence spectroscopy

    International Nuclear Information System (INIS)

    Zhang Chuanfei; Lin Libin; Lou Fuhong; Peng Taiping

    2001-01-01

    The author introduces how to measure the energy spectra of X-rays with filter-fluorescence spectroscopy. The design principle and structure of the newly developed double diaphragms and the five-channel filter-fluorescence spectrometer are described. The parameters of the spectrometer, optimized by a numerical method, are given. The filter-fluorescence spectrometer, designed according to the Rousseau balance principle, improves the signal-to-noise ratio

  3. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  4. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development, before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  5. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Science.gov (United States)

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on spoiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  6. Filter Selection for Optimizing the Spectral Sensitivity of Broadband Multispectral Cameras Based on Maximum Linear Independence.

    Science.gov (United States)

    Li, Sui-Xian

    2018-05-07

    Previous research has shown the effectiveness of selecting filter sets from among a large set of commercial broadband filters by a vector analysis method based on maximum linear independence (MLI). However, the traditional MLI approach is suboptimal due to the need to predefine the first filter of the selected filter set to be the maximum ℓ₂ norm among all available filters. An exhaustive imaging simulation with every single filter serving as the first filter is conducted to investigate the features of the most competent filter set. From the simulation, the characteristics of the most competent filter set are discovered. Besides minimization of the condition number, the geometric features of the best-performed filter set comprise a distinct transmittance peak along the wavelength axis of the first filter, a generally uniform distribution for the peaks of the filters and substantial overlaps of the transmittance curves of the adjacent filters. Therefore, the best-performed filter sets can be recognized intuitively by simple vector analysis and just a few experimental verifications. A practical two-step framework for selecting optimal filter set is recommended, which guarantees a significant enhancement of the performance of the systems. This work should be useful for optimizing the spectral sensitivity of broadband multispectral imaging sensors.
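
    The traditional MLI selection that the paper takes as its starting point can be sketched as a greedy procedure (a hypothetical implementation; the seeding rule shown is the max-ℓ₂-norm convention the paper identifies as suboptimal, and the candidate transmittances are random stand-ins):

```python
import numpy as np

def mli_select(F, k):
    """Greedy maximum-linear-independence selection of k filters from the
    columns of F (rows = wavelength samples, columns = candidate filter
    transmittances). Seeds with the max-l2-norm filter, then repeatedly adds
    the filter with the largest component outside the span of the selection."""
    chosen = [int(np.argmax(np.linalg.norm(F, axis=0)))]
    for _ in range(k - 1):
        Q, _ = np.linalg.qr(F[:, chosen])   # orthonormal basis of current span
        resid = F - Q @ (Q.T @ F)           # components outside that span
        scores = np.linalg.norm(resid, axis=0)
        scores[chosen] = -1.0               # never re-pick a chosen filter
        chosen.append(int(np.argmax(scores)))
    return chosen

rng = np.random.default_rng(0)
F = rng.random((31, 20))                    # 20 candidates on 31 spectral bands
sel = mli_select(F, 4)
print(sel, round(float(np.linalg.cond(F[:, sel])), 2))
```

    The paper's point is that sweeping the seed filter rather than fixing it to the max-ℓ₂-norm column, and then comparing condition numbers, finds better-conditioned sets.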

  7. Filter Selection for Optimizing the Spectral Sensitivity of Broadband Multispectral Cameras Based on Maximum Linear Independence

    Directory of Open Access Journals (Sweden)

    Sui-Xian Li

    2018-05-01

    Full Text Available Previous research has shown the effectiveness of selecting filter sets from among a large set of commercial broadband filters by a vector analysis method based on maximum linear independence (MLI). However, the traditional MLI approach is suboptimal due to the need to predefine the first filter of the selected filter set to be the maximum ℓ2 norm among all available filters. An exhaustive imaging simulation with every single filter serving as the first filter is conducted to investigate the features of the most competent filter set. From the simulation, the characteristics of the most competent filter set are discovered. Besides minimization of the condition number, the geometric features of the best-performed filter set comprise a distinct transmittance peak along the wavelength axis of the first filter, a generally uniform distribution for the peaks of the filters and substantial overlaps of the transmittance curves of the adjacent filters. Therefore, the best-performed filter sets can be recognized intuitively by simple vector analysis and just a few experimental verifications. A practical two-step framework for selecting optimal filter set is recommended, which guarantees a significant enhancement of the performance of the systems. This work should be useful for optimizing the spectral sensitivity of broadband multispectral imaging sensors.

  8. Person-Specific Face Detection in a Scene with Optimum Composite Filtering and Colour-Shape Information

    Directory of Open Access Journals (Sweden)

    Seokwon Yeom

    2013-01-01

    Full Text Available Face detection and recognition have wide applications in robot vision and intelligent surveillance. However, face identification at a distance is very challenging because long-distance images are often degraded by low resolution, blurring and noise. This paper introduces a person-specific face detection method that uses a nonlinear optimum composite filter and subsequent verification stages. The filter's optimum criterion minimizes the sum of the output energy generated by the input noise and the input image. The composite filter is trained with several training images under long-distance modelling. The candidate facial regions are provided by the filter's outputs of the input scene. False alarms are eliminated by subsequent testing stages, which comprise skin colour and edge mask filtering tests. In the experiments, images captured by a webcam and a CCTV camera are processed to show the effectiveness of the person-specific face detection system at a long distance.
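
    One classical "optimum composite filter" formulation, the minimum average correlation energy (MACE) filter, can serve as a stand-in sketch for the correlation-filter stage described above (the paper's actual optimum criterion differs, and the training images here are random placeholders):

```python
import numpy as np

def mace_filter(train_imgs):
    """Minimum Average Correlation Energy (MACE) composite filter in the
    frequency domain: fixes the correlation value at zero shift to 1 for
    every training image while minimizing average correlation-plane energy."""
    X = np.stack([np.fft.fft2(im).ravel() for im in train_imgs], axis=1)
    D = np.mean(np.abs(X) ** 2, axis=1) + 1e-12   # average power spectrum (diagonal)
    Dinv_X = X / D[:, None]
    u = np.ones(X.shape[1])                        # desired zero-shift peak values
    h = Dinv_X @ np.linalg.solve(X.conj().T @ Dinv_X, u)
    return h.reshape(train_imgs[0].shape)

rng = np.random.default_rng(1)
train = [rng.random((16, 16)) for _ in range(3)]
H = mace_filter(train)
# the constrained zero-shift correlation value should be 1 for each image
peaks = [abs(np.vdot(np.fft.fft2(im).ravel(), H.ravel())) for im in train]
print([round(p, 6) for p in peaks])   # all ≈ 1.0
```

    In a detection system such a filter is correlated against the input scene, and local peaks above a threshold become the candidate facial regions passed to the colour and edge verification stages.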

  9. The development of a HEPA filter with improved dust holding characteristics

    International Nuclear Information System (INIS)

    Dyment, J.; Hamblin, C.

    1995-01-01

    A limitation of the HEPA filters used in the extract systems of nuclear facilities is their relatively low capacity for captured dust. The costs associated with the disposal of a typical filter mean that there are clear incentives to extend filter life. The work described in this report represents the initial stages in the development of a filter incorporating a medium that enhances its dust holding capacity. Experimental equipment was installed to enable the dust loading characteristics of candidate media to be compared with those of the glass fibre based papers currently used in filter construction. These tests involved challenging representative samples of the media with an air stream containing a controlled concentration of thermally generated sodium chloride particles. The dust loading characteristics of the media were then compared in terms of the rate of increase in pressure differential. A number of "graded density" papers were subsequently identified which appeared to offer significant improvements in dust holding. In the second phase of the programme, deep-pleat filters (1,700 m³ h⁻¹) incorporating graded density papers were manufactured and tested. Improvements of up to 50% were observed in their capacity for the sub-micron sodium chloride test dust. Smaller differences (15%) were measured when a coarser, carbon black, challenge was used. This is attributed to the differences in the particle sizes of the two dusts
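
    The comparison metric, the rate of increase in pressure differential with dust load, amounts to comparing fitted slopes (the numbers below are illustrative stand-ins, not the report's data):

```python
import numpy as np

# pressure differential (Pa) vs. areal dust challenge (g/m^2) for two media
# (illustrative numbers only)
load = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
dp_standard = np.array([250.0, 330.0, 415.0, 505.0, 600.0])
dp_graded   = np.array([250.0, 300.0, 352.0, 406.0, 462.0])

slope_std = np.polyfit(load, dp_standard, 1)[0]   # Pa per g/m^2
slope_grd = np.polyfit(load, dp_graded, 1)[0]
gain = 1.0 - slope_grd / slope_std
print(f"{gain:.0%} lower rate of pressure rise")
```

    A lower slope translates directly into a longer interval before the filter reaches its terminal pressure drop and must be replaced.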

  10. Cleanup Verification Package for the 118-F-1 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    E. J. Farris and H. M. Sulloway

    2008-01-10

    This cleanup verification package documents completion of remedial action for the 118-F-1 Burial Ground on the Hanford Site. This burial ground is a combination of two locations formerly called Minor Construction Burial Ground No. 2 and Solid Waste Burial Ground No. 2. This waste site received radioactive equipment and other miscellaneous waste from 105-F Reactor operations, including dummy elements and irradiated process tubing; gun barrel tips, steel sleeves, and metal chips removed from the reactor; filter boxes containing reactor graphite chips; and miscellaneous construction solid waste.

  11. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliy Yu. Meltsov

    2012-05-01

    Full Text Available This paper presents the results of the development of one of the modules of a system for the verification of parallel algorithms that is used to verify the inference engine. This module is designed to build the specification of requirements whose feasibility on the algorithm must be proved (tested).

  12. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for sequence diagrams of power plants has been developed. The system's main function is to verify correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (Interlock Block Diagrams) in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatches between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)
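
    At its core, verifying that a sequence diagram realizes the same logic as its IBD is a Boolean-equivalence check, which for small interlocks can be done exhaustively (a toy sketch; the interlock expressions are invented):

```python
from itertools import product

def equivalent(f, g, n_inputs):
    """Exhaustively compare two Boolean functions over all input
    combinations; return the first mismatching assignment, or None."""
    for bits in product([False, True], repeat=n_inputs):
        if f(*bits) != g(*bits):
            return bits
    return None

# interlock logic from an IBD vs. the logic realized by a sequence diagram
ibd      = lambda trip, perm, manual: (trip or manual) and perm
sequence = lambda trip, perm, manual: (trip and perm) or manual  # faulty wiring

mismatch = equivalent(ibd, sequence, 3)
print(mismatch)   # (False, False, True): manual command bypasses the permissive
```

    A production system works symbolically rather than by enumeration, which also lets it point at the specific sub-circuit responsible for the mismatch.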

  13. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to respond effectively to the abnormality, and mitigate its consequences. To aid in the development of this new system, a standardized Verification and Validation (V and V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V and V phases, from concept to operation and maintenance, each with specific V and V tasks to be performed to ensure a quality end product. Four documents are used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V and V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. Lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle.

  14. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  15. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Fingerprint changes associated with hand dermatitis are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted; all patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model derivation group and a model validation group. The predictive model was derived using multiple logistic regression, and validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of one major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
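    The derived decision rule can be transcribed directly as a sketch (the function and argument names are illustrative, not from the paper):

```python
def verification_risk(dystrophy_pct, long_horizontal, long_vertical):
    """Risk stratification following the derived criteria:
    major criterion (dystrophy area >= 25%) -> verification almost always fails;
    both minor criteria -> high risk; one minor criterion -> low risk;
    no criteria met -> fingerprint almost always passes."""
    if dystrophy_pct >= 25:
        return "fail"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:
        return "high risk"
    if minors == 1:
        return "low risk"
    return "pass"
```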

  16. Design of adaptive filter amplifier in UV communication based on DSP

    Science.gov (United States)

    Lv, Zhaoshun; Wu, Hanping; Li, Junyu

    2016-10-01

    To address the weak received signal in UV communication, we designed a high-gain, continuously adjustable adaptive filter amplifier. After setting the overall technical targets and analyzing the working principle of the signal amplifier, we used one LMH6629MF chip and two AD797BN chips to realize three-stage cascade amplification, and a TMS320VC5509A DSP to implement the digital filtering. Design and verification with Multisim, Protel 99SE and CCS show that the amplifier achieves continuously adjustable, distortion-free amplification from 1000 to 10000 times, with a gain error of <=4% over the 1000-10000x range; the equivalent input noise voltage of the amplification circuit is <=6 nV/√Hz over 30-45 kHz; and the adaptive filtering function is realized. The design provides a theoretical reference and technical support for UV weak-signal processing.
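    The paper's DSP firmware is not given; as a minimal sketch of the adaptive-filtering stage, the following LMS (least-mean-squares) filter removes a strong interferer from a weak signal in a noise-cancellation arrangement. Frequencies, tap count and step size are illustrative assumptions, not the paper's design values:

```python
import numpy as np

def lms(reference, desired, n_taps=8, mu=0.01):
    """Least-mean-squares adaptive filter: adapts n_taps weights so the
    filtered reference tracks the desired signal; returns the error signal
    (the 'cleaned' output in a noise-cancellation arrangement)."""
    w = np.zeros(n_taps)
    out = np.zeros(len(desired))
    for i in range(n_taps, len(desired)):
        x = reference[i - n_taps:i][::-1]      # most recent sample first
        y = w @ x
        e = desired[i] - y
        w += 2 * mu * e * x                    # LMS weight update
        out[i] = e
    return out

# Toy check: cancel a strong 50 Hz interferer mixed into a weak 1 kHz tone
fs = 8000
t = np.arange(fs) / fs
tone = 0.1 * np.sin(2 * np.pi * 1000 * t)      # the wanted weak signal
hum = np.sin(2 * np.pi * 50 * t)               # interferer (also the reference)
cleaned = lms(hum, tone + hum, n_taps=16, mu=0.005)
```

After the filter converges, the interferer is suppressed and the residual output is dominated by the weak tone.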

  17. Development and verification of the neutron diffusion solver for the GeN-Foam multi-physics platform

    International Nuclear Information System (INIS)

    Fiorina, Carlo; Kerkar, Nordine; Mikityuk, Konstantin; Rubiolo, Pablo; Pautz, Andreas

    2016-01-01

    Highlights: • Development and verification of a neutron diffusion solver based on OpenFOAM. • Integration in the GeN-Foam multi-physics platform. • Implementation and verification of acceleration techniques. • Implementation of isotropic discontinuity factors. • Automatic adjustment of discontinuity factors. - Abstract: The Laboratory for Reactor Physics and Systems Behaviour at the PSI and the EPFL has been developing in recent years a new code system for reactor analysis based on OpenFOAM®. The objective is to supplement available legacy codes with a modern tool featuring state-of-the-art characteristics in terms of scalability, programming approach and flexibility. As part of this project, a new solver has been developed for the eigenvalue and transient solution of multi-group diffusion equations. Several features distinguish the developed solver from other available codes, in particular: object oriented programming to ease code modification and maintenance; modern parallel computing capabilities; use of general unstructured meshes; possibility of mesh deformation; cell-wise parametrization of cross-sections; and arbitrary energy group structure. In addition, the solver is integrated into the GeN-Foam multi-physics solver. The general features of the solver and its integration with GeN-Foam have already been presented in previous publications. The present paper describes the diffusion solver in more details and provides an overview of new features recently implemented, including the use of acceleration techniques and discontinuity factors. In addition, a code verification is performed through a comparison with Monte Carlo results for both a thermal and a fast reactor system.
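    The GeN-Foam solver handles multi-group diffusion on general unstructured meshes; as a drastically reduced illustration of the same eigenvalue problem, the sketch below solves a one-group, one-dimensional slab by power iteration and checks the result against the analytic bare-slab value. All cross-section data are hypothetical:

```python
import numpy as np

# One-group, 1-D slab diffusion eigenvalue problem solved by power iteration,
# a toy analogue of the multi-group solver described above (hypothetical data).
D, sig_a, nu_sig_f, L, n = 1.0, 0.07, 0.08, 100.0, 200
h = L / (n + 1)

# Finite-difference loss operator: -D d2/dx2 + sig_a, zero-flux boundaries
main = 2 * D / h**2 + sig_a
off = -D / h**2
A = np.diag([main] * n) + np.diag([off] * (n - 1), 1) + np.diag([off] * (n - 1), -1)

phi = np.ones(n)
k = 1.0
for _ in range(500):
    src = nu_sig_f * phi / k
    phi_new = np.linalg.solve(A, src)
    k *= phi_new.sum() / phi.sum()   # update eigenvalue from fission-source ratio
    phi = phi_new

# Analytic bare-slab value: k = nu*Sigma_f / (Sigma_a + D * B^2), B = pi/L
k_analytic = nu_sig_f / (sig_a + D * (np.pi / L) ** 2)
```

A production solver replaces the dense solve with sparse linear algebra and adds the acceleration techniques mentioned in the abstract, but the outer power iteration is the same.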

  18. Development of nuclear standard filter elements for PWR plant

    International Nuclear Information System (INIS)

    Weng Minghui; Wu Jidong; Gu Xiuzhang; Zhang Jinghua

    1988-11-01

    Model FRX-5 and FRX-10 nuclear-standard filter elements are used for fluid clarification in the chemical and volume control system (CVCS), boron recycle system (BRS), spent fuel pit cooling system (SFPCS) and steam generator blowdown system (SGBS) of the Qinshan Nuclear Power Plant. These filter elements collect radioactive contaminants, resin fragments and other impurities. The core of each filter element consists of polypropylene frames and a paper filter medium bonded with resin. A variety of filter papers were tested for optimization, and the flow rate and overall performance were measured under simulated conditions. The results showed that the performance and lifetime meet the design requirements. The filter elements are simple to manufacture, inexpensive, and easy to dispose of as waste. Some filter elements have already been produced and put into operation

  19. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  20. Development of Test Protocols for International Space Station Particulate Filters

    Science.gov (United States)

    Vijayakumar, R.; Green, Robert D.; Agui, Juan H.

    2015-01-01

    Air quality control on the International Space Station (ISS) is a vital requirement for maintaining a clean environment for the crew and the hardware. This becomes a serious challenge in pressurized space compartments, since no outside-air ventilation is possible and a larger particulate load is imposed on the filtration system due to the lack of gravitational settling. The ISS Environmental Control and Life Support System (ECLSS) uses a filtration system that has been in use for over 14 years and has proven to meet this challenge. The heart of this system is a traditional High-Efficiency Particulate Air (HEPA) filter configured to interface with the rest of the life support elements and provide effective cabin filtration. The filter element for this system, the Bacteria Filter Element (BFE), has a non-standard cross-section with a length-to-width ratio (L/W) of 6.6. A filter test setup was designed and built to meet industry testing standards. A CFD analysis was performed to determine the optimal duct geometry and flow configuration, and both a screen and a flow straightener were added to the test duct design to improve flow uniformity; face velocity profiles were subsequently measured for confirmation. Flow quality and aerosol mixing assessments show that the duct flow is satisfactory for the intended leak testing. Preliminary leak testing was performed on two different ISS filters, one with known perforations and one with limited use, and the results confirmed that the testing methods and photometer instrument are sensitive enough to detect and locate compromised sections of an ISS BFE. Given the engineering constraints in designing spacecraft life support systems, it is anticipated that non-industry-standard filters will be required in future designs. This work is focused on developing test protocols for the ISS BFE filters, but the methodology is general enough to be extended to other present and future spacecraft filters.

  1. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 


  3. Development of discrete-time H∞ filtering method for time-delay compensation of rhodium incore detectors

    International Nuclear Information System (INIS)

    Park, Moon Kyu; Kim, Yong Hee; Cha, Kune Ho; Kim, Myung Ki

    1998-01-01

    A method is described for developing an H∞ filtering approach for the dynamic compensation of the self-powered neutron detectors normally used as fixed incore instruments. The H∞ norm of the filter transfer matrix is used as the optimization criterion in the worst-case estimation-error sense. The filter is modeled in discrete time, and the filter gains are optimized with respect to the noise attenuation level of the H∞ setting. By introducing the Bounded Real Lemma, the conventional algebraic Riccati inequalities are converted into Linear Matrix Inequalities (LMIs), and the filter design problem is then solved within a convex optimization framework using LMIs. The simulation results show that remarkable improvements are achieved in filter response time and filter design efficiency.

  4. A method for reducing energy dependence of thermoluminescence dosimeter response by means of filters

    International Nuclear Information System (INIS)

    Bapat, V.N.

    1980-01-01

    This work describes the application of the method of partial surface shielding for reducing the energy dependence of the X-ray and γ-ray response of a dosimeter containing a CaSO4:Dy thermoluminescent phosphor mixed with KCl, in pellet form. Results are given of approximate computations of filter combinations that accomplish this aim, together with experimental verification. Incorporation of the described filter combination makes it possible to use this relatively sensitive dosimeter for environmental radiation monitoring. A similar approach could be applied to any type of dosimeter in the form of a thin pellet or wafer. (author)

  5. Dissolution Model Development: Formulation Effects and Filter Complications

    DEFF Research Database (Denmark)

    Berthelsen, Ragna; Holm, Rene; Jacobsen, Jette

    2016-01-01

    This study describes various complications related to sample preparation (filtration) during development of a dissolution method intended to discriminate among different fenofibrate immediate-release formulations. Several dissolution apparatus and sample preparation techniques were tested. The fl....... With the tested drug–formulation combination, the best in vivo–in vitro correlation was found after filtration of the dissolution samples through 0.45-μm hydrophobic PTFE membrane filters....

  6. The development of a HEPA filter with improved dust holding characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Dyment, J.; Hamblin, C.

    1995-02-01

    A limitation of the HEPA filters used in the extract systems of nuclear facilities is their relatively low capacity for captured dust. The costs associated with the disposal of a typical filter mean that there are clear incentives to extend filter life. The work described in this report is the initial stage in the development of a filter incorporating a medium with enhanced dust holding capacity. Experimental equipment was installed to enable the dust loading characteristics of candidate media to be compared with those of the glass-fibre-based papers currently used in filter construction. These tests involved challenging representative samples of the media with an air stream containing a controlled concentration of thermally generated sodium chloride particles. The dust loading characteristics of the media were then compared in terms of the rate of increase in pressure differential. A number of "graded density" papers were subsequently identified which appeared to offer significant improvements in dust holding. In the second phase of the programme, deep-pleat filters (1,700 m³ h⁻¹) incorporating graded density papers were manufactured and tested. Improvements of up to 50% were observed in their capacity for the sub-micron sodium chloride test dust. Smaller differences (15%) were measured when a coarser carbon black challenge was used. This is attributed to the differences in the particle sizes of the two dusts.

  7. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for the verification of IMRT (intensity-modulated radiation therapy) plans used in the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology and of the deployment of the IMRT planning system CORVUS 6.0 and the MIMIC (multilamellar intensity-modulated collimator) device, together with the overall process of verifying the created plan. The aim of verification is, in particular, good control of the functions of the MIMIC and evaluation of the overall reliability of IMRT planning. (author)

  8. DEVELOPMENT OF AN INNOVATIVE LASER SCANNER FOR GEOMETRICAL VERIFICATION OF METALLIC AND PLASTIC PARTS

    DEFF Research Database (Denmark)

    Carmignato, Simone; De Chiffre, Leonardo; Fisker, Rune

    2008-01-01

    and plastic parts. A first prototype of the novel measuring system has been developed, using laser triangulation. The system, besides ensuring the automatic reconstruction of complete surface models, has been designed to guarantee user-friendliness, versatility, reliability and speed. The paper focuses mainly...... on the metrological aspects of the system development. Details are given on procedures and artefacts developed for metrological performance verification and traceability establishment. Experimental results from measurements on metallic and plastic parts show that the system prototype is capable of performing...

  9. Working Group 2: Future Directions for Safeguards and Verification, Technology, Research and Development

    International Nuclear Information System (INIS)

    Zykov, S.; Blair, D.

    2013-01-01

    For traditional safeguards it was recognized that the hardware presently available is, in general, addressing adequately fundamental IAEA needs, and that further developments should therefore focus mainly on improving efficiencies (i.e. increasing cost economies, reliability, maintainability and user-friendliness, keeping abreast of continual advancements in technologies and of the evolution of verification approaches). Specific technology areas that could benefit from further development include: -) Non-destructive measurement systems (NDA), in particular, gamma-spectroscopy and neutron counting techniques; -) Containment and surveillance tools, such as tamper indicating seals, video-surveillance, surface identification methods, etc.; -) Geophysical methods for design information verification (DIV) and safeguarding of geological repositories; and -) New tools and methods for real-time monitoring. Furthermore, the Working Group acknowledged that a 'building block' (or modular) approach should be adopted towards technology development, enabling equipment to be upgraded efficiently as technologies advance. Concerning non-traditional safeguards, in the area of satellite-based sensors, increased spatial resolution and broadened spectral range were identified as priorities. In the area of wide area surveillance, the development of LIDAR-like tools for atmospheric sensing was discussed from the perspective of both potential benefits and certain limitations. Recognizing the limitations imposed by the human brain in terms of information assessment and analysis, technologies are needed that will enable the more effective utilization of all information, regardless of its format and origin. The paper is followed by the slides of the presentation. (A.C.)

  10. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

    Full Text Available This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer by the acoustic method. A classification and systematization of the physical principles, detection methods and tests of partial discharge in power transformers is given at the beginning of the paper, and the most important mathematical models, features, algorithms and practical problems that affect measurement accuracy are highlighted. The paper then describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. Verification and measurement accuracy are demonstrated both by computer simulation and against experimental results available in the literature.
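    The paper's specific non-iterative algorithm is not reproduced in the abstract; a common non-iterative formulation linearizes the acoustic arrival-time equations by subtracting a reference sensor's range equation, which the sketch below illustrates. The sensor layout, source position and sound speed are illustrative assumptions:

```python
import numpy as np

# Non-iterative acoustic source location from arrival times: subtracting the
# range equation of sensor 0 from the others makes the system linear in the
# unknowns (x, y, z, t0). Geometry and sound speed are illustrative.
v = 1413.0                             # assumed sound speed in transformer oil, m/s
sensors = np.array([[0, 0, 0], [2, 0, 0], [0, 1.5, 0], [2, 1.5, 2],
                    [0, 0, 2], [2, 1.5, 0]], dtype=float)

# Synthetic measurement: a hypothetical PD source and emission time t0
source, t0 = np.array([1.2, 0.7, 1.1]), 3.0e-4
times = t0 + np.linalg.norm(sensors - source, axis=1) / v

s0, t_0 = sensors[0], times[0]
A = np.hstack([2 * (sensors[1:] - s0),
               (-2 * v**2 * (times[1:] - t_0))[:, None]])
b = (np.sum(sensors[1:]**2, axis=1) - np.sum(s0**2)
     - v**2 * (times[1:]**2 - t_0**2))
est, *_ = np.linalg.lstsq(A, b, rcond=None)   # est = [x, y, z, t0]
```

With at least five sensors the least-squares solve recovers the source position and emission time in a single step, with no iteration.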

  11. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool combines the document analysis method for knowledge acquisition with the ECPN matrix analysis method for knowledge verification, enabling knowledge engineers to carry out their tasks consistently from knowledge acquisition through knowledge verification.
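    The ECPN matrix analysis method is specific to the paper; as a generic illustration of matrix-based knowledge-base verification, the sketch below detects circular rule chains in a hypothetical alarm rule base from its adjacency matrix:

```python
import numpy as np

# Rules encoded as an adjacency matrix R: R[i][j] = 1 if the conclusion of
# rule i is a premise of rule j (hypothetical 4-rule alarm knowledge base).
R = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 0],   # rule 2 feeds rule 0 -> circular chain 0-1-2
              [0, 0, 0, 0]], dtype=int)

# Transitive closure by boolean matrix products: reach[i][j] = 1 means rule j
# is derivable from rule i through some chain of rule firings.
reach = R.copy()
for _ in range(len(R)):
    reach = ((reach + reach @ R) > 0).astype(int)

# A nonzero diagonal entry marks a rule that can re-derive itself: a cycle.
circular = [i for i in range(len(R)) if reach[i, i]]
```

The same closure matrix also exposes unreachable rules (all-zero rows and columns), another standard anomaly checked during knowledge base verification.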

  12. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is often painstakingly slow, but from time to time 'windows of opportunity' - moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  13. Development of a double-layered ceramic filter for aerosol filtration at high-temperatures: the filter collection efficiency.

    Science.gov (United States)

    de Freitas, Normanda L; Gonçalves, José A S; Innocentini, Murilo D M; Coury, José R

    2006-08-25

    The performance of double-layered ceramic filters for aerosol filtration at high temperatures was evaluated in this work. The filtering structure was composed of two layers: a thin granular membrane deposited on a reticulate ceramic support of high porosity. The goal was to minimize the high pressure drop inherent in granular structures without sacrificing their high collection efficiency for small particles. The reticulate support was produced by ceramic replication of polyurethane foam substrates of 45 and 75 pores per inch (ppi). The filtering membrane was prepared by depositing a thin layer of granular alumina-clay paste on one face of the support. The filters had their permeability and fractional collection efficiency analyzed for the filtration of an airborne suspension of phosphatic rock at temperatures ranging from ambient to 700 °C. Results revealed that the collection efficiency decreased with gas temperature and increased with filtration time. The support layer also influenced the collection efficiency: the 75 ppi support was more effective than the 45 ppi one. Particle collection efficiency dropped considerably for particles below 2 µm in diameter; the maximum collection occurred for particle diameters of approximately 3 µm, decreasing again for diameters between 4 and 8 µm. This trend was successfully represented by the proposed correlation, which is based on the classical mechanisms acting on particle collection. Inertial impaction appears to be the predominant collection mechanism, with particle bouncing/re-entrainment acting as detachment mechanisms.

  14. Verification of gamma knife based fractionated radiosurgery with newly developed head-thorax phantom

    International Nuclear Information System (INIS)

    Bisht, Raj Kishor; Kale, Shashank Sharad; Natanasabapathi, Gopishankar; Singh, Manmohan Jit; Agarwal, Deepak; Garg, Ajay; Rath, Goura Kishore; Julka, Pramod Kumar; Kumar, Pratik; Thulkar, Sanjay; Sharma, Bhawani Shankar

    2016-01-01

    Objective: The purpose of this study is to verify Gamma Knife Extend™ system (ES) based fractionated stereotactic radiosurgery with a newly developed head-thorax phantom. Methods: Phantoms are extensively used to measure radiation dose and verify treatment plans in radiotherapy. A human upper-body-shaped phantom with thorax was designed to simulate fractionated stereotactic radiosurgery using the Extend™ system of the Gamma Knife. The central component of the phantom aids in performing the radiological precision test, dosimetric evaluation and treatment verification. A hollow right circular cylindrical space of diameter 7.0 cm was created at the centre of this component to hold various dosimetric devices using suitable adaptors. The phantom is made of poly(methyl methacrylate) (PMMA), a transparent thermoplastic material. Two sets of disk assemblies were designed to place dosimetric films in (1) horizontal (xy) and (2) vertical (xz) planes, and specific cylindrical adaptors were designed to place a thimble ionization chamber inside the phantom for point dose recording along the xz axis. EBT3 Gafchromic films were used to analyze and map the radiation field. The focal precision test was performed using a 4 mm collimator shot in the phantom to check the radiological accuracy of treatment. The phantom head position within the Extend™ frame was estimated using encoded aperture measurement of the repositioning check tool (RCT). For treatment verification, the phantom with inserts for film and ion chamber was scanned in the reference treatment position using an X-ray computed tomography (CT) machine, and the acquired stereotactic images were transferred into Leksell GammaPlan (LGP). A patient treatment plan with a hypo-fractionated regimen was delivered, and identical fractions were compared using EBT3 films and in-house MATLAB codes. Results: RCT measurement showed an overall positional accuracy of 0.265 mm (range 0.223 mm-0.343 mm). Gamma index analysis across fractions exhibited close agreement between LGP and film
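    The gamma index analysis used for the film comparison can be sketched in one dimension. The criteria (global 3%/3 mm) are the common clinical defaults, and the dose profiles below are toy data, not the study's measurements:

```python
import numpy as np

def gamma_1d(ref, eval_, dx, dose_tol=0.03, dist_tol=3.0):
    """Global 1-D gamma index (e.g. 3%/3 mm): for each reference point, the
    minimum combined dose-difference / distance-to-agreement metric over all
    evaluated points. gamma <= 1 counts as passing."""
    x = np.arange(len(ref)) * dx
    d_norm = dose_tol * ref.max()          # global dose normalization
    g = np.empty(len(ref))
    for i in range(len(ref)):
        dd = (eval_ - ref[i]) / d_norm
        dr = (x - x[i]) / dist_tol
        g[i] = np.sqrt(dd**2 + dr**2).min()
    return g

ref = np.exp(-((np.arange(100) - 50.0) ** 2) / 200.0)   # toy dose profile
shifted = np.roll(ref, 1)                               # 1 mm spatial shift
g = gamma_1d(ref, shifted, dx=1.0)
pass_rate = np.mean(g <= 1.0)
```

A 1 mm shift passes comfortably under a 3 mm distance-to-agreement criterion; clinical tools extend the same metric to 2-D film planes and 3-D dose grids.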

  15. Development and evaluation of a HEPA filter for increased strength and resistance to elevated temperature

    International Nuclear Information System (INIS)

    Gilbert, H.; Bergman, W.; Fretthold, J.K.

    1992-01-01

    We have developed an improved HEPA filter with increased strength and resistance to elevated temperature, to improve the reliability of HEPA filters under accident conditions. The improvements to the HEPA filter consist of a silicone rubber sealant and a new HEPA medium reinforced with a glass cloth. Several prototype filters were built and evaluated for temperature and pressure resistance and for resistance to rough handling. The temperature resistance test consisted of exposing the HEPA filter to 1,000 scfm at 700 degrees F for five minutes. The pressure resistance test consisted of exposing the HEPA filter to a differential pressure of 10 in. w.g. using a water-saturated air flow at 95 degrees F. For the rough handling test, we used a vibrating machine designated the Q110. DOP filter efficiency tests were performed before and after each of the environmental tests. In addition to following the standard practice of using a separate new filter for each environmental test, we also subjected the same filter to the elevated temperature test followed by the pressure resistance test. The efficiency test results show that the improved HEPA filter is significantly better than the standard HEPA filter
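For context, a DOP efficiency test compares the aerosol concentration downstream of the filter with that upstream. The arithmetic can be sketched as follows; the particle counts below are invented for illustration, not measurements from this report.

```python
def filter_efficiency(upstream, downstream):
    """Percent efficiency and percent penetration from aerosol
    concentrations (e.g., DOP particle counts per unit volume measured
    up- and downstream of the filter)."""
    if upstream <= 0:
        raise ValueError("upstream concentration must be positive")
    penetration = downstream / upstream
    return 100.0 * (1.0 - penetration), 100.0 * penetration

# Hypothetical readings: 1e6 particles/L upstream, 25 downstream.
eff, pen = filter_efficiency(1e6, 25.0)

# The usual HEPA acceptance criterion is >= 99.97 % efficiency at 0.3 um.
meets_hepa = eff >= 99.97
```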

  16. Development and verifications of fast reactor fuel design code ''Ceptar''

    International Nuclear Information System (INIS)

    Ozawa, T.; Nakazawa, H.; Abe, T.

    2001-01-01

    Annular fuel is very beneficial for fast reactors because it accommodates both high power and high burn-up. Concerning the irradiation behavior of annular fuel, most annular pellets irradiated to high burn-up showed shrinkage of the central hole due to deformation and restructuring of the pellets. It is necessary to predict the shrinkage of the central hole during irradiation precisely, because it has a great influence on power-to-melt. In this paper, an outline of the CEPTAR code (Calculation code to Evaluate fuel pin stability for annular fuel design), developed to meet this need, is presented. In this code, the radial profile of fuel density is computed using a void migration model, and the inner diameter follows from the law of conservation of mass. For the mechanical analysis, the fuel and cladding deformation caused by thermal expansion, swelling and creep is computed by a stress-strain analysis using the plane-strain approximation. In addition, CEPTAR can take into account the effect of the Joint-Oxide-Gain (JOG) observed in the fuel-cladding gap of high burn-up fuel. The JOG acts to decrease fuel swelling and to improve the gap conductance through the deposition of solid fission products. Based on post-irradiation data on PFR annular fuel, we developed an empirical model for the JOG. For code verification, the thermal and mechanical data obtained from various irradiation tests and post-irradiation examinations were compared with the predictions of this code. In this study, the INTA (instrumented test assembly) test in JOYO, PTM (power-to-melt) tests in JOYO, EBR-II, FFTF and an MTR at the Harwell laboratory, and post-irradiation examinations on a number of PFR fuels were used as verification data. (author)

  17. On the Kalman Filter error covariance collapse into the unstable subspace

    Directory of Open Access Journals (Sweden)

    A. Trevisan

    2011-03-01

    Full Text Available When the Extended Kalman Filter is applied to a chaotic system, the rank of the error covariance matrices, after a sufficiently large number of iterations, reduces to N+ + N0, where N+ and N0 are the number of positive and null Lyapunov exponents. This is due to the collapse of the solution of the full Extended Kalman Filter into the unstable and neutral tangent subspace. Therefore the solution is the same as that obtained by confining the assimilation to the space spanned by the Lyapunov vectors with non-negative Lyapunov exponents. Theoretical arguments and numerical verification are provided to show that the asymptotic state and covariance estimates of the full EKF and of its reduced form, with assimilation in the unstable and neutral subspace (EKF-AUS), are the same. The consequences of these findings for applications of Kalman-type filters to chaotic models are discussed.
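The collapse can be reproduced with a toy linear analog: dynamics with one expanding and one contracting direction and no model error (Q = 0), so the Kalman analysis covariance loses rank along the stable direction and only the unstable direction (N+ = 1) retains uncertainty. The matrices and noise level below are illustrative choices, not from the paper.

```python
import numpy as np

# One unstable (|eigenvalue| > 1) and one stable direction; Q = 0.
A = np.diag([2.0, 0.5])   # Lyapunov exponents ln 2 and -ln 2, so N+ = 1
H = np.eye(2)             # observe both components
R = 0.1 * np.eye(2)       # observation-noise covariance
P = np.eye(2)             # initial analysis covariance

for _ in range(200):
    P = A @ P @ A.T                                 # forecast step (Q = 0)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    P = (np.eye(2) - K @ H) @ P                     # analysis covariance

eigvals = np.sort(np.linalg.eigvalsh(P))[::-1]
# eigvals[0] stays finite (unstable direction); eigvals[1] decays to ~0,
# i.e., the covariance rank collapses onto the unstable subspace.
```

For this decoupled system the unstable-direction variance converges to the fixed point p = 3r/4 = 0.075, while the stable-direction variance shrinks geometrically to zero.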

  18. Whole-core thermal-hydraulic transient code development and verification for LMFBR analysis

    International Nuclear Information System (INIS)

    Spencer, D.R.

    1979-04-01

    Predicted performance during both steady-state and transient reactor operation determines the steady-state operating limits on LMFBRs. Unnecessary conservatism in performance predictions will not contribute to safety, but will restrict the reactor to more conservative, less economical steady-state operation. The most general method for reducing analytical conservatism in LMFBRs without compromising safety is to develop, validate and apply more sophisticated computer models to the limiting performance analyses. The purpose of the on-going Natural Circulation Verification Program (NCVP) is to develop and validate computer codes to analyze natural circulation transients in LMFBRs, and thus replace unnecessary analytical conservatism with demonstrated calculational capability

  19. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  20. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
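The interval estimates mentioned above can be obtained by natural interval extension: evaluating the expression with interval operands yields a guaranteed, though possibly loose, enclosure of the function's range on a box. A minimal Python sketch of the idea follows; the authors' library is C++ and far more complete, so this is only an illustration of the principle.

```python
class Interval:
    """Minimal interval arithmetic: the natural interval extension of an
    expression encloses (possibly with overestimation) its true range."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        return Interval(self.lo + other.lo, self.hi + other.hi)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        products = [a * b for a in (self.lo, self.hi)
                    for b in (other.lo, other.hi)]
        return Interval(min(products), max(products))
    __rmul__ = __mul__

def f(x):
    return x * x + 2 * x   # a toy benchmark expression

box = Interval(-1.0, 2.0)
enc = f(box)               # interval enclosure of f on [-1, 2]
```

Here the true range of f on [-1, 2] is [-1, 8]; the natural extension returns the wider enclosure [-4, 8] because `x * x` treats the two occurrences of x as independent, which is exactly the kind of overestimation interval-based verification must tolerate.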

  1. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  2. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. Recently Ohio State University received a grant from the Department of Energy's Special Research Grant Program to utilize the methodologies developed for the Operator Advisor for Heavy Water Reactor (HWR) malfunction root cause diagnosis. To aid in the development of this new system, a standardized Verification and Validation (V&V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V&V phases from concept to operation and maintenance. Each phase has specific V&V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V&V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle. 10 refs., 1 fig

  3. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available This article analyses the reliability and verification problems of widespread electric power system (EPS) simulation tools, all of which are based on numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time over an unlimited range with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for verifying any EPS simulation tool.
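In practice, checking a simulation tool against quasi-steady-state SCADA data amounts to flagging points where the simulated values fall outside a tolerance of the measurements. A minimal sketch of such a check is given below; the bus names, per-unit voltages and the 1% tolerance are all hypothetical, not values from the article.

```python
def verify_against_scada(simulated, measured, rel_tol=0.01):
    """Flag buses where the simulated quasi-steady-state value deviates
    from the SCADA measurement by more than rel_tol (fractional)."""
    failures = {}
    for bus, v_meas in measured.items():
        dev = abs(simulated[bus] - v_meas) / abs(v_meas)
        if dev > rel_tol:
            failures[bus] = dev
    return failures

# Hypothetical per-unit bus voltages.
scada = {"bus1": 1.000, "bus2": 0.985, "bus3": 1.020}
sim   = {"bus1": 1.002, "bus2": 0.971, "bus3": 1.019}
bad = verify_against_scada(sim, scada)   # only bus2 exceeds 1 %
```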

  4. The Development of a Microbial Challenge Test with Acholeplasma laidlawii To Rate Mycoplasma-Retentive Filters by Filter Manufacturers.

    Science.gov (United States)

    Folmsbee, Martha; Lentine, Kerry Roche; Wright, Christine; Haake, Gerhard; Mcburnie, Leesa; Ashtekar, Dilip; Beck, Brian; Hutchison, Nick; Okhio-Seaman, Laura; Potts, Barbara; Pawar, Vinayak; Windsor, Helena

    2014-01-01

    Mycoplasma are bacteria that can penetrate 0.2 and 0.22 μm rated sterilizing-grade filters and even some 0.1 μm rated filters. Primary applications for mycoplasma filtration include large-scale mammalian and bacterial cell culture media and serum filtration. The Parenteral Drug Association recognized the absence of standard industry test parameters for testing and classifying 0.1 μm rated filters for mycoplasma clearance and formed a task force to formulate consensus test parameters. The task force established some test parameters by common agreement, based upon general industry practices, without the need for additional testing. However, the culture medium and incubation conditions for generating test mycoplasma cells varied from filter company to filter company, and this was recognized as a serious gap by the task force. Standardization of the culture medium and incubation conditions required collaborative testing both in commercial filter company laboratories and in an independent laboratory (Table I). The use of consensus test parameters will facilitate the ultimate cross-industry goal of standardizing 0.1 μm filter claims for mycoplasma clearance. However, it is still important to recognize that filter performance will depend on the actual conditions of use. Therefore, end users should consider, using a risk-based approach, whether process-specific evaluation of filter performance may be warranted for their application. Mycoplasma are small bacteria that have the ability to penetrate sterilizing-grade filters. Filtration of large-scale mammalian and bacterial cell culture media is an example of an industry process where effective filtration of mycoplasma is required. The Parenteral Drug Association recognized the absence of industry-standard test parameters for the evaluation of mycoplasma clearance filters by filter manufacturers and formed a task force to formulate such a consensus among manufacturers. The use of standardized test parameters by filter manufacturers

  5. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  6. Development of a liquid filter testing technique using radioisotope

    International Nuclear Information System (INIS)

    Kumar, Surender; Ramarathinam, K.; Khan, A.A.

    1979-01-01

    Efficient removal of suspended matter from liquids has always been in demand in industry, as a process requirement and for the recovery of suspended materials. In the nuclear industry, filters are required to remove fine suspended matter from water in reactors, effluent treatment plants, fuel reprocessing plants, etc. The filters are used to maintain clarity and to limit the activity level to a minimum. In effluent treatment plants, low-level liquid waste is discharged to the environment after active suspended matter is removed by filters. Various types of liquid filters are available in the market to meet the demands of different industries. These filters must be evaluated for their effectiveness in removing particulate matter from liquids. The filters are evaluated using several techniques, such as gravimetric analysis, turbidity measurement, and direct counting of particles using optical and electronic instruments. All these techniques have their own advantages and disadvantages. Counting of radioactive particles using radiation counters is a simple and sensitive technique. It involves the neutron activation of selected test powders, which are dispersed in the liquid and led through the test filter; the upstream and downstream concentrations are measured using a GM counter. This technique was found to be consistent and reproducible in the low, middle and high ranges of efficiency. The selection of a test powder, its activation, and its use for evaluating liquid filters are dealt with. (auth.)
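The efficiency arithmetic behind the technique is straightforward: compare net count rates of the activated test powder up- and downstream of the filter after background subtraction. A minimal sketch follows; the count rates and background are made-up values, not data from the paper.

```python
def removal_efficiency(upstream_cpm, downstream_cpm, background_cpm=0.0):
    """Filter removal efficiency (%) from GM-counter count rates of the
    neutron-activated test powder measured up- and downstream of the
    filter, with background subtracted from both."""
    up = upstream_cpm - background_cpm
    down = downstream_cpm - background_cpm
    if up <= 0:
        raise ValueError("net upstream count rate must be positive")
    return 100.0 * (1.0 - max(down, 0.0) / up)

# Hypothetical count rates (counts per minute).
eff = removal_efficiency(upstream_cpm=12040.0, downstream_cpm=340.0,
                         background_cpm=40.0)
```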

  7. Development of nuclear safety class filter elements with long life and high quality

    International Nuclear Information System (INIS)

    Zhang Jinghua

    2009-04-01

    This paper describes the development of long-life, high-quality nuclear safety class filter elements used for collecting radioactive contaminants, resin fragments and impurities in the primary systems of NPPs. Filter elements made of glass fibre are used for PWRs, and paper elements for PHWRs. During the research, a series of optimization tests was performed for the selection of the filter material and the improvement of the binder. The flow rate and comprehensive performance were measured under simulated conditions. The results show that the application requirements of operating NPPs can be met, and the reliability and safety of the frame were also verified. The comprehensive performance of the filter elements is equivalent to that of similar overseas products. The products have been used in operating NPPs. (authors)

  8. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance, efficiency and the various methods used, such as similarity, analysis, inspection and test, that are applicable to satisfying the verification requirements.

  9. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The necessity of verifying software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  10. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  11. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  12. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  14. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    Science.gov (United States)

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)

  15. MCNP5 development, verification, and performance

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2003-01-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  16. MCNP5 development, verification, and performance

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Laboratory (United States)

    2003-07-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  17. Beam properties and stability of a flattening-filter free 7 MV beam--An overview

    International Nuclear Information System (INIS)

    Dzierma, Yvonne; Licht, Norbert; Nuesken, Frank; Ruebe, Christian

    2012-01-01

    Purpose: Several works have recently focused on flattening-filter-free (FFF) beams of linear accelerators of various companies (in particular, Varian and Elekta), but no overview as yet exists for the flattening-filter free 7XU beam (Siemens Artiste). Methods: Dosimetric properties of the 7XU beam were measured in May and September 2011. We present depth dose curves and beam profiles, output factors, and MLC transmission and assess the stability of the measurements. The 7XU beam was commissioned in the Pinnacle³ treatment planning system (TPS), and modeling results including the spectrum are presented. Results: The percent depth dose curve of the 7XU beam is similar to the flat 6X beam line, with a slightly smaller surface dose. The beam profiles show the characteristic shape of flattening-filter free beams, with deviations between measurements of generally less than 1%. The output factors of the 7XU beam decrease more slowly than for the 6X beam. The MLC transmission is comparable but slightly less for the 7XU beam. The 7XU beam can be adequately modeled by the Pinnacle³ TPS, with successful dosimetric verification. The spectrum of the 7XU beam has lower photon fluence up to approximately 2.5 MeV and higher fluence beyond, with a slightly higher mean energy. Conclusions: The 7XU beam has been commissioned for clinical use after successful modeling, stability checks, and dosimetric verification.

  18. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2014-10-15

    Although FPGAs give higher performance than PLCs (Programmable Logic Controllers), the platform change from PLC to FPGA forces all PLC software engineers to give up the experience, knowledge and practices accumulated over decades and to start FPGA-based hardware development from scratch. We have researched a solution to this problem that reduces the risk while preserving that experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates FBD programs into behavior-preserving Verilog programs. In general, PLCs are usually designed with an FBD, while FPGAs are described with an HDL (Hardware Description Language) such as Verilog or VHDL. Once the PLC designer has designed the FBD programs, FBDtoVerilog translates the FBD into Verilog mechanically. The designers, therefore, need not consider the rest of the FPGA development process (e.g., synthesis and place-and-route) and can preserve their accumulated experience and knowledge. Even if we believe that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly, since it is used in nuclear power plants, among the most safety-critical systems. While the designer develops the FPGA software from the FBD program translated by the translator, other translation tools, such as the synthesis tool and the place-and-route tool, are also involved; this paper also aims to verify these rigorously and thoroughly. There are several verification techniques for the correctness of a translator, but they are hard to apply because of their prohibitive cost and execution time. Instead, this paper uses an indirect verification technique, demonstrating the correctness of the translator by co-simulation. We intend to prove correctness only for the specific inputs under development for a target I&C system, not for all possible input cases.

  19. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom

    2014-01-01

    Although FPGAs give higher performance than PLCs (Programmable Logic Controllers), the platform change from PLC to FPGA forces all PLC software engineers to give up the experience, knowledge and practices accumulated over decades and to start FPGA-based hardware development from scratch. We have researched a solution to this problem that reduces the risk while preserving that experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates FBD programs into behavior-preserving Verilog programs. In general, PLCs are usually designed with an FBD, while FPGAs are described with an HDL (Hardware Description Language) such as Verilog or VHDL. Once the PLC designer has designed the FBD programs, FBDtoVerilog translates the FBD into Verilog mechanically. The designers, therefore, need not consider the rest of the FPGA development process (e.g., synthesis and place-and-route) and can preserve their accumulated experience and knowledge. Even if we believe that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly, since it is used in nuclear power plants, among the most safety-critical systems. While the designer develops the FPGA software from the FBD program translated by the translator, other translation tools, such as the synthesis tool and the place-and-route tool, are also involved; this paper also aims to verify these rigorously and thoroughly. There are several verification techniques for the correctness of a translator, but they are hard to apply because of their prohibitive cost and execution time. Instead, this paper uses an indirect verification technique, demonstrating the correctness of the translator by co-simulation. We intend to prove correctness only for the specific inputs under development for a target I&C system, not for all possible input cases

  20. Neutron absorbers and detector types for spent fuel verification using the self-interrogation neutron resonance densitometry

    International Nuclear Information System (INIS)

    Rossa, Riccardo; Borella, Alessandro; Labeau, Pierre-Etienne; Pauly, Nicolas; Meer, Klaas van der

    2015-01-01

The Self-Interrogation Neutron Resonance Densitometry (SINRD) is a passive non-destructive assay (NDA) technique proposed for the direct measurement of ²³⁹Pu in a spent fuel assembly. The insertion of neutron detectors wrapped with different neutron-absorbing materials, or neutron filters, into the central guide tube of a PWR fuel assembly is envisaged to measure the neutron flux in the energy region close to the 0.3 eV resonance of ²³⁹Pu. In addition, the measurement of the fast neutron flux is foreseen. This paper focuses on the determination of the Gd and Cd neutron filter thicknesses that maximize the detection of neutrons within the resonance region. Moreover, several detector types are compared to identify the optimal configuration and to assess the expected total neutron counts that can be obtained with SINRD measurements. Results from Monte Carlo simulations showed that thicknesses of 0.1–0.3 mm for the Gd filter and 0.5–1.0 mm for the Cd filter ensure the optimal conditions. Moreover, a ²³⁹Pu fission chamber is better suited to measure neutrons close to the 0.3 eV resonance and has the highest sensitivity to ²³⁹Pu, in comparison with a ²³⁵U fission chamber, a ³He proportional counter, and a ¹⁰B proportional counter. The use of a thin Gd filter and a thick Cd filter is suggested for the ²³⁹Pu and ²³⁵U fission chambers to increase the total counts achieved in a measurement, while a thick Gd filter and a thin Cd filter are envisaged for the ³He and ¹⁰B proportional counters to increase the sensitivity to ²³⁹Pu. We concluded that an optimization process that takes into account measurement time, filter thickness, and detector size is needed to develop a SINRD detector that can meet the requirement for an efficient verification of spent fuel assemblies
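The filter-thickness trade-off discussed in the abstract can be illustrated with a toy slab-attenuation calculation. The exponential transmission law T = exp(-Σt) is standard, but the macroscopic cross-section values below are rough placeholders (not evaluated nuclear data), and the candidate thicknesses simply echo the ranges reported above.

```python
import math

# Toy illustration of the SINRD filter trade-off: a thicker filter rejects
# neutrons below its cut-off energy more completely, but also attenuates the
# wanted signal in the window around the 0.3 eV resonance of 239Pu.
# The cross sections below are placeholders, NOT evaluated nuclear data.

def transmission(sigma_per_mm, thickness_mm):
    """Neutron transmission through a slab filter: T = exp(-Sigma * t)."""
    return math.exp(-sigma_per_mm * thickness_mm)

# Assumed macroscopic cross sections (1/mm) in two coarse energy bins.
SIGMA = {"Gd": {"below_cutoff": 30.0, "window": 1.5},
         "Cd": {"below_cutoff": 6.0, "window": 0.4}}

for material, candidates in [("Gd", [0.1, 0.2, 0.3]), ("Cd", [0.5, 0.75, 1.0])]:
    for t in candidates:
        t_below = transmission(SIGMA[material]["below_cutoff"], t)  # leakage
        t_window = transmission(SIGMA[material]["window"], t)       # signal
        print(f"{material} {t:4.2f} mm: window T={t_window:.2f}, "
              f"below-cutoff T={t_below:.3f}")
```

The printed table makes the trade-off concrete: increasing thickness drives the below-cut-off leakage toward zero while steadily eroding the resonance-window signal, which is why the abstract pairs thin and thick filters differently for count-rate versus sensitivity goals.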

  1. Neutron absorbers and detector types for spent fuel verification using the self-interrogation neutron resonance densitometry

    Energy Technology Data Exchange (ETDEWEB)

    Rossa, Riccardo, E-mail: rrossa@sckcen.be [SCK-CEN, Belgian Nuclear Research Centre, Boeretang, 200, B2400 Mol (Belgium); Université libre de Bruxelles, Ecole polytechnique de Bruxelles, Service de Métrologie Nucléaire (CP 165/84), Avenue F.D. Roosevelt, 50, B1050 Brussels (Belgium); Borella, Alessandro, E-mail: aborella@sckcen.be [SCK-CEN, Belgian Nuclear Research Centre, Boeretang, 200, B2400 Mol (Belgium); Labeau, Pierre-Etienne, E-mail: pelabeau@ulb.ac.be [Université libre de Bruxelles, Ecole polytechnique de Bruxelles, Service de Métrologie Nucléaire (CP 165/84), Avenue F.D. Roosevelt, 50, B1050 Brussels (Belgium); Pauly, Nicolas, E-mail: nipauly@ulb.ac.be [Université libre de Bruxelles, Ecole polytechnique de Bruxelles, Service de Métrologie Nucléaire (CP 165/84), Avenue F.D. Roosevelt, 50, B1050 Brussels (Belgium); Meer, Klaas van der, E-mail: kvdmeer@sckcen.be [SCK-CEN, Belgian Nuclear Research Centre, Boeretang, 200, B2400 Mol (Belgium)

    2015-08-11

The Self-Interrogation Neutron Resonance Densitometry (SINRD) is a passive non-destructive assay (NDA) technique proposed for the direct measurement of ²³⁹Pu in a spent fuel assembly. The insertion of neutron detectors wrapped with different neutron-absorbing materials, or neutron filters, into the central guide tube of a PWR fuel assembly is envisaged to measure the neutron flux in the energy region close to the 0.3 eV resonance of ²³⁹Pu. In addition, the measurement of the fast neutron flux is foreseen. This paper focuses on the determination of the Gd and Cd neutron filter thicknesses that maximize the detection of neutrons within the resonance region. Moreover, several detector types are compared to identify the optimal configuration and to assess the expected total neutron counts that can be obtained with SINRD measurements. Results from Monte Carlo simulations showed that thicknesses of 0.1–0.3 mm for the Gd filter and 0.5–1.0 mm for the Cd filter ensure the optimal conditions. Moreover, a ²³⁹Pu fission chamber is better suited to measure neutrons close to the 0.3 eV resonance and has the highest sensitivity to ²³⁹Pu, in comparison with a ²³⁵U fission chamber, a ³He proportional counter, and a ¹⁰B proportional counter. The use of a thin Gd filter and a thick Cd filter is suggested for the ²³⁹Pu and ²³⁵U fission chambers to increase the total counts achieved in a measurement, while a thick Gd filter and a thin Cd filter are envisaged for the ³He and ¹⁰B proportional counters to increase the sensitivity to ²³⁹Pu. We concluded that an optimization process that takes into account measurement time, filter thickness, and detector size is needed to develop a SINRD detector that can meet the requirement for an efficient verification of spent fuel assemblies.

  2. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize the additional work that needs to be done, and additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  3. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

Full Text Available The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
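The two evaluation modes the abstract describes, exact value-plus-gradient at a point and guaranteed bounds over a box, can be sketched from a single expression by overloading arithmetic. The Python classes below are illustrative transliterations of the idea, not the library's actual API.

```python
# One expression, two arithmetics: dual numbers give the value and gradient
# at a point (forward-mode AD); interval arithmetic gives guaranteed bounds
# over a box.  Names here are illustrative, not the C++ library's API.

class Dual:
    """Forward-mode AD value: (value, tuple of partial derivatives)."""
    def __init__(self, val, grad):
        self.val, self.grad = val, tuple(grad)
    def __add__(self, o):
        return Dual(self.val + o.val, [a + b for a, b in zip(self.grad, o.grad)])
    def __mul__(self, o):
        # Product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val,
                    [self.val * b + o.val * a for a, b in zip(self.grad, o.grad)])

class Interval:
    """Closed interval [lo, hi]; outward rounding is omitted for brevity."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))

def benchmark(x, y):
    """A single description, evaluated under either arithmetic: f = x*y + x*x."""
    return x * y + x * x

# Point evaluation at (2, 3): f = 10, df/dx = y + 2x = 7, df/dy = x = 2.
d = benchmark(Dual(2.0, (1.0, 0.0)), Dual(3.0, (0.0, 1.0)))
print(d.val, d.grad)

# Interval estimate over the box [0,1] x [1,2]: f is guaranteed to lie in [0, 3].
i = benchmark(Interval(0.0, 1.0), Interval(1.0, 2.0))
print(i.lo, i.hi)
```

Cross-checking the interval bounds against sampled point evaluations is exactly the kind of automatic consistency test that can expose mistakes in benchmark descriptions.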

  4. A 4 MV flattening filter-free beam: commissioning and application to conformal therapy and volumetric modulated arc therapy

    International Nuclear Information System (INIS)

    Stevens, S W; Rosser, K E; Bedford, J L

    2011-01-01

Recent studies have indicated that radiotherapy treatments undertaken on a flattening filter-free (FFF) linear accelerator have a number of advantages over treatments undertaken on a conventional linear accelerator. In addition, 4 MV photon beams may give improved isodose coverage for some treatment volumes at air/tissue interfaces, compared to when utilizing the clinical standard of 6 MV photons. In order to investigate these benefits, FFF beams were established on an Elekta Beam Modulator linear accelerator for 4 MV photons. Commissioning beam data were obtained for open and wedged fields. The measured data were then imported into a treatment planning system and a beam model was commissioned. The beam model was optimized to improve dose calculations at shallow, clinically relevant depths. Following verification, the beam model was utilized in a treatment planning study, including volumetric modulated arc therapy, for a selection of lung, breast/chest wall and larynx patients. Increased dose rates of around 800 MU min⁻¹ were recorded for open fields (relative to 320 MU min⁻¹ for filtered open fields) and reduced head scatter was inferred from output factor measurements. Good agreement between planned and delivered dose was observed in verification of treatment plans. The planning study indicated that with a FFF beam, equivalent (and in some cases improved) isodose profiles could be achieved for small lung and larynx treatment volumes relative to 4 MV filtered treatments. Furthermore, FFF treatments with wedges could be replicated using open fields together with an 'effective wedge' technique and isocentre shift. Clinical feasibility of a FFF beam was therefore demonstrated, with beam modelling, treatment planning and verification being successfully accomplished.

  5. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  6. Baumot BA-B Diesel Particulate Filter with Pre-Catalyst (ETV Mobile Source Emissions Control Devices) Verification Report

    Science.gov (United States)

    The Baumot BA-B Diesel Particulate Filter with Pre-Catalyst is a diesel engine retrofit device for light, medium, and heavy heavy-duty diesel on-highway engines for use with commercial ultra-low-sulfur diesel (ULSD) fuel. The BA-B particulate filter is composed of a pre-catalyst ...

  7. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

Solenoid operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient design verification method for SOVs to design engineers who otherwise depend on experience and experiment during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications: all design constraints are defined in the first step of the design, and the detailed design procedure is then presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method for verifying this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new solenoid models. We believe that this verification process is novel and useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.
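The kind of theoretical check described above, verifying the design against a safety factor on the attraction force, can be sketched with the standard single-gap reluctance-force formula. The formula assumes an ideal iron circuit with one air gap, and all numbers below are illustrative, not the paper's design values.

```python
import math

# Back-of-the-envelope solenoid design check: compare the magnetic
# attraction force against the required actuation force and report the
# safety factor.  Assumes an ideal magnetic circuit with a single air gap;
# all parameter values are illustrative.

MU0 = 4e-7 * math.pi  # permeability of free space (H/m)

def attraction_force(turns, current_a, gap_m, pole_area_m2):
    """F = (N*I)^2 * mu0 * A / (2 * g^2) for a single-gap magnetic circuit."""
    mmf = turns * current_a  # magnetomotive force (ampere-turns)
    return (mmf ** 2) * MU0 * pole_area_m2 / (2.0 * gap_m ** 2)

def safety_factor(f_attract, f_required):
    """Design margin: how many times the required force the coil delivers."""
    return f_attract / f_required

f = attraction_force(turns=1200, current_a=0.5, gap_m=1e-3, pole_area_m2=8e-5)
sf = safety_factor(f, f_required=9.0)  # required force: spring + pressure load
print(f"attraction force = {f:.1f} N, safety factor = {sf:.2f}")
```

A safety factor comfortably above 1 at the largest working air gap is the acceptance criterion such a check would enforce before prototyping.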

  8. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    already awash in fissile material and is increasingly threatened by the possible consequences of illicit trafficking in such material. The chemical field poses fewer problems. The ban on chemical weapons is a virtually complete post-Cold War regime, with state-of-the-art concepts and procedures of verification resulting from decades of negotiation. The detection of prohibited materials and activities is the common goal of the nuclear and chemical regimes for which the most intrusive and intensive procedures are activated by the three organizations. Accounting for the strictly peaceful application of dual-use items constitutes the bulk of the work of the inspectorates at the IAEA and the OPCW. A common challenge in both fields is the advance of science and technology in the vast nuclear and chemical industries and the ingenuity of some determined proliferators to deceive by concealing illicit activities under legitimate ones. Inspection procedures and technologies need to keep up with the requirement for flexibility and adaptation to change. The common objective of the three organizations is to assemble and analyze all relevant information in order to conclude reliably whether a State is or is not complying with its treaty obligations. The positive lessons learned from the IAEA's verification experience today are valuable in advancing concepts and technologies that might also benefit the other areas of WMD verification. Together with the emerging, more comprehensive verification practice of the OPCW, they may provide a useful basis for developing common standards, which may in turn help in evaluating the cost-effectiveness of verification methods for the Biological and Toxin Weapons Convention and other components of a WMD control regime

  9. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations; the five verification groups thus reflect the main areas of recent development. The present report focuses on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases are reported in Appendix A (verification cases) and Appendix B (validation cases)
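The verification pattern described here, comparing a numerical solution with an analytical solution for an idealized case, can be shown in miniature with steady one-dimensional Darcy flow through a homogeneous column, whose analytical head profile is linear. The grid size and boundary heads below are illustrative and not taken from the DarcyTools verification cases.

```python
# Verification against an analytical solution: steady 1-D Darcy flow with
# fixed heads at both ends of a homogeneous column.  The numerical scheme
# (Jacobi iteration on the 1-D Laplace equation) should reproduce the
# linear analytical head profile to within iteration tolerance.

def solve_heads(n, h_left, h_right, iters=20000):
    """Jacobi iteration for d2h/dx2 = 0 with Dirichlet boundary heads."""
    h = [h_left] + [0.0] * (n - 2) + [h_right]
    for _ in range(iters):
        h = [h[0]] + [(h[i - 1] + h[i + 1]) / 2 for i in range(1, n - 1)] + [h[-1]]
    return h

def analytic_heads(n, h_left, h_right):
    """Exact solution: head varies linearly between the boundary values."""
    return [h_left + (h_right - h_left) * i / (n - 1) for i in range(n)]

n, hl, hr = 11, 10.0, 4.0
numeric = solve_heads(n, hl, hr)
exact = analytic_heads(n, hl, hr)
max_err = max(abs(a - b) for a, b in zip(numeric, exact))
print(f"max |numeric - analytic| = {max_err:.2e}")
```

Reporting the maximum discrepancy against the closed-form solution, as done here, is the basic measure a verification case of this kind records.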

  10. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, among others: precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet, which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
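Two of the verification items listed above, precision and carryover, reduce to short calculations with commonly used formulas: within-run precision as a coefficient of variation from replicates, and carryover from three high samples followed by three low samples. The data below are made up for illustration, not from the paper.

```python
# Sketch of two analyzer verification calculations (illustrative data):
# within-run precision as CV% from replicate results, and carryover% from
# a run of three high samples (h1..h3) followed by three low samples (l1..l3).
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation of replicate results, in percent."""
    return 100.0 * stdev(replicates) / mean(replicates)

def carryover_percent(high, low):
    """Carryover% = (l1 - l3) / (h3 - l3) * 100."""
    return 100.0 * (low[0] - low[2]) / (high[2] - low[2])

wbc_replicates = [6.1, 6.0, 6.2, 6.1, 6.0]  # 10^9/L, same sample rerun
print(f"WBC precision: CV = {cv_percent(wbc_replicates):.1f}%")

high_runs, low_runs = [98.0, 97.5, 98.2], [2.6, 2.5, 2.5]
print(f"carryover = {carryover_percent(high_runs, low_runs):.2f}%")
```

Whether the resulting CV and carryover values are acceptable is then judged against the verification limits the laboratory specialist has chosen, which is exactly the discretionary step the abstract highlights.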

  11. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations; the five verification groups thus reflect the main areas of recent development. The present report focuses on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases are reported in Appendix A (verification cases) and Appendix B (validation cases)

  12. The intractable cigarette 'filter problem'.

    Science.gov (United States)

    Harris, Bradford

    2011-05-01

When lung cancer fears emerged in the 1950s, cigarette companies initiated a shift in cigarette design from unfiltered to filtered cigarettes. Both the ineffectiveness of cigarette filters and the tobacco industry's misleading marketing of the benefits of filtered cigarettes have been well documented. However, during the 1950s and 1960s, American cigarette companies spent millions of dollars to solve what the industry identified as the 'filter problem'. These extensive filter research and development efforts suggest a phase of genuine optimism among cigarette designers that cigarette filters could be engineered to mitigate the health hazards of smoking. This paper explores the early history of cigarette filter research and development in order to elucidate why and when seemingly sincere filter engineering efforts devolved into manipulations in cigarette design to sustain cigarette marketing and mitigate consumers' concerns about the health consequences of smoking. Relevant word and phrase searches were conducted in the Legacy Tobacco Documents Library online database, Google Patents, and media and medical databases including ProQuest, JSTOR, Medline and PubMed. 13 tobacco industry documents were identified that track prominent developments involved in what the industry referred to as the 'filter problem'. These reveal a period of intense focus on the 'filter problem' that persisted from the mid-1950s to the mid-1960s, featuring collaborations between cigarette producers and large American chemical and textile companies to develop effective filters. In addition, the documents reveal how cigarette filter researchers' growing scientific knowledge of smoke chemistry led to increasing recognition that filters were unlikely to offer significant health protection. One of the primary concerns of cigarette producers was to design cigarette filters that could be economically incorporated into the massive scale of cigarette production. The synthetic plastic cellulose acetate

  13. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    Science.gov (United States)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limitation of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, to cross-check the contour image against the target layout, post-OPC verification solutions continue to be developed: methods for generating contours and matching them to target structures, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors while excluding false errors is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can cause the OPC engineer to miss real defects and may, at the least, delay time to market. In this paper, we studied methods for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. In this paper, we suggest increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model
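The final-CD biasing idea described above can be sketched as a pitch-dependent lookup applied to drawn metal widths before the coverage check. The bias table, geometry numbers, and rule structure below are all hypothetical, chosen only to show the mechanism.

```python
# Illustrative sketch of pitch-dependent final-CD biasing: before the
# contact/via coverage check, each drawn metal width is adjusted by an
# etch bias looked up from its pitch, so the check runs against the
# expected final metal shape.  All numbers here are hypothetical.

# Rule table: (upper pitch bound in nm, CD bias in nm added to drawn width).
BIAS_RULES = [(100, -8.0), (200, -5.0), (400, -2.0), (float("inf"), 0.0)]

def etch_bias(pitch_nm):
    """Look up the etch bias for a metal line from its pitch to the neighbor."""
    for max_pitch, bias in BIAS_RULES:
        if pitch_nm <= max_pitch:
            return bias

def final_width(drawn_width_nm, pitch_nm):
    """Expected post-etch width of the metal line."""
    return drawn_width_nm + etch_bias(pitch_nm)

def covers_contact(metal_width_nm, contact_size_nm, enclosure_nm):
    """1-D coverage check: final metal must enclose the contact on each side."""
    return metal_width_nm >= contact_size_nm + 2 * enclosure_nm

# A dense line loses more CD after etch, so the same drawn width can fail
# the coverage check that an isolated line passes.
for pitch in (90, 350):
    w = final_width(drawn_width_nm=64.0, pitch_nm=pitch)
    ok = covers_contact(w, contact_size_nm=50.0, enclosure_nm=4.0)
    print(f"pitch {pitch} nm -> final width {w} nm, coverage ok: {ok}")
```

Running the check on biased widths rather than drawn widths is what lets such a flow flag only the dense-pitch sites as real coverage errors instead of reporting every borderless contact.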

  14. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  15. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    International Nuclear Information System (INIS)

    Betsill, J.D.; Gruebel, R.D.

    1995-09-01

The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a systems approach, reinforcing the need to use compatible components to provide user-friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search covered present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums

  16. Inorganic UV filters

    Directory of Open Access Journals (Sweden)

    Eloísa Berbel Manaia

    2013-06-01

Full Text Available Nowadays, concern over skin cancer has been growing, especially in tropical countries where the incidence of UVA/B radiation is higher. The correct use of sunscreen is the most efficient way to prevent the development of this disease. The ingredients of sunscreen can be organic and/or inorganic sun filters. Inorganic filters present some advantages over organic filters, such as photostability, non-irritability and broad-spectrum protection. Nevertheless, inorganic filters have a whitening effect in sunscreen formulations owing to their high refractive index, decreasing their esthetic appeal. Many techniques have been developed to overcome this problem, and among them the use of nanotechnology stands out. The amount of nanomaterial in use is estimated to increase from 2000 tons in 2004 to a projected 58,000 tons in 2020. In this context, this article aims to analyze critically both the different features of the production of inorganic filters (synthesis routes proposed in recent years) and the permeability, safety and other characteristics of the new generation of inorganic filters.

  17. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among the existing open-source model verification engines, the MODUS toolset produces the model-verification inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project
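The formal model checking that MODUS builds on can be illustrated in miniature: exhaustive breadth-first exploration of a finite transition system, checking that no reachable state violates a safety property. The toy mutual-exclusion model below is illustrative only and is unrelated to the MODUS toolset's actual input format or engines.

```python
# Miniature explicit-state model checker: BFS over all reachable states of
# a finite transition system, returning a violating state (counterexample)
# or None if the safety property holds everywhere reachable.
from collections import deque

def successors(state):
    """Transition relation for two processes cycling idle -> trying ->
    critical -> idle; a process may enter 'critical' only if the other
    process is not already there."""
    p, q = state
    step = {"idle": "trying", "trying": "critical", "critical": "idle"}
    out = []
    if not (step[p] == "critical" and q == "critical"):
        out.append((step[p], q))
    if not (step[q] == "critical" and p == "critical"):
        out.append((p, step[q]))
    return out

def check_safety(initial, safe):
    """Breadth-first search over reachable states."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not safe(state):
            return state  # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None  # property holds in every reachable state

mutual_exclusion = lambda s: s != ("critical", "critical")
print("counterexample:", check_safety(("idle", "idle"), mutual_exclusion))
```

Production engines add state-space reduction and richer temporal logics, but the core question answered, "is any reachable state unsafe, and if so, show one", is the same.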

  18. Development of off-gas filters for reprocessing plants. Development and construction of an off-gas filter system for large reprocessing plants. Off-gas section of the resolver test stand of the IHCh

    International Nuclear Information System (INIS)

    Furrer, J.; Kaempffer, R.; Wilhelm, J.G.; Pfauter, C.; Jannakos, K.; Apenberg, W.; Lange, W.; Mendel, W.; Potgeter, G.; Zabel, G.

    1976-01-01

The testing of the highly impregnated iodine sorption material AC 6120 was continued in the laboratory under simulated conditions of a 1,500 t/a uranium reprocessing plant. The influence of NO in nitrogen as the carrier gas on the removal efficiency of the sorption material was examined in particular. Several experiments on the iodine removal efficiency of the material AC 6120 were carried out in the original off-gas of the French reprocessing plant SAP Marcoule, with the filter system installed on the one hand directly behind the dissolver and on the other hand behind the iodine desorption column. The first iodine filter developed at LAF II was installed in the off-gas line of the dissolver in the Karlsruhe reprocessing plant. The filter system for the dissolver off-gas handling test rig of the IHCh was specified and ordered from an engineering firm. The concept of the prototype off-gas filter system was selected, and a lock and transport system allowing filter replacement was designed and submitted for testing. Five alternative solutions were set up in order to find the appropriate filter concept; the method of selection was based on the evaluation of performance criteria. According to the selected solution, a filter drum was designed and constructed. The lock of the filter system has been designed and realized, and preliminary tests have been made. (orig.) [de

  19. On Applicability of Tunable Filter Bank Based Feature for Ear Biometrics: A Study from Constrained to Unconstrained.

    Science.gov (United States)

    Chowdhury, Debbrota Paul; Bakshi, Sambit; Guo, Guodong; Sa, Pankaj Kumar

    2017-11-27

In this paper, an overall framework is presented for person verification using ear biometrics, with a tunable filter bank as the local feature extractor. The tunable filter bank, based on a half-band polynomial of 14th order, extracts distinct features from ear images while maintaining its frequency selectivity property. To advocate the applicability of the tunable filter bank to ear biometrics, recognition tests have been performed on the available constrained databases AMI, WPUT and IITD and the unconstrained database UERC. Experiments have been conducted applying the tunable-filter-based feature extractor to subparts of the ear, with four and six subdivisions of the ear image. Analyzing the experimental results, it has been found that the tunable filter moderately succeeds in distinguishing ear features on par with the state-of-the-art features used for ear recognition. Accuracies of 70.58%, 67.01%, 81.98%, and 57.75% have been achieved on the AMI, WPUT, IITD, and UERC databases, with the Canberra distance as the underlying measure of separation. The performance indicates that the tunable filter is a candidate for recognizing humans from ear images.
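The verification step above compares feature vectors by Canberra distance. A minimal sketch of that measure and a threshold-based accept/reject decision follows; the feature vectors are made-up toy values, not real ear features.

```python
# Canberra distance and a simple verification decision: accept the claimed
# identity if the probe's feature vector is close enough to the enrolled
# template.  Feature values and threshold are illustrative only.

def canberra(u, v):
    """Canberra distance: sum of |u_i - v_i| / (|u_i| + |v_i|), skipping
    coordinates where both entries are zero."""
    total = 0.0
    for a, b in zip(u, v):
        denom = abs(a) + abs(b)
        if denom:
            total += abs(a - b) / denom
    return total

def verify(probe, template, threshold):
    """Accept the claimed identity if the probe is close enough."""
    return canberra(probe, template) <= threshold

enrolled = [0.8, 0.1, 0.4, 0.3]
genuine_probe = [0.7, 0.1, 0.5, 0.3]
impostor_probe = [0.1, 0.9, 0.1, 0.8]
print(verify(genuine_probe, enrolled, threshold=0.5))   # True
print(verify(impostor_probe, enrolled, threshold=0.5))  # False
```

Because each coordinate's difference is normalized by the coordinate magnitudes, Canberra distance is sensitive to relative rather than absolute differences, one reason it is a common choice for comparing feature vectors of this kind.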

  20. Development and verification of a reciprocating test rig designed for investigation of piston ring tribology

    DEFF Research Database (Denmark)

    Pedersen, Michael Torben; Imran, Tajammal; Klit, Peder

    2009-01-01

    This paper describes the development and verification of a reciprocating test rig, which was designed to study the piston ring tribology. A crank mechanism is used to generate a reciprocating motion for a moving plate, which acts as the liner. A stationary block acting as the ring package is loaded......, which is suitable for the study of piston ring tribology....

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: PHASE 1-ADI PILOT TEST UNIT NO. 2002-09 WITH MEDIA G2®

    Science.gov (United States)

    Integrity verification testing of the ADI International Inc. Pilot Test Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8...

  2. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

In the implementation of all arms control agreements, accurate verification is essential. In setting a course for verifying compliance with a given treaty, whether the NPT or the CWC, one must make a technical comparison of existing information-gathering capabilities against the constraints in an agreement. Then it must be decided whether this level of verifiability is good enough. Generally, the policy standard of 'effective verification' includes the ability to detect significant violations, with high confidence, in sufficient time to respond effectively with policy adjustments or other responses, as needed. It is at this juncture that verification approaches have traditionally diverged: nuclear safeguards requirements have taken one path, while chemical verification methods have pursued another. However, recent technological advances have brought a number of changes affecting verification, and lately their pace has been accelerating. First, all verification regimes have more and better information as a result of new kinds of sensors, imagery, and other technologies. Second, the verification provisions in agreements have also advanced, to include on-site inspections, portal monitoring, data exchanges, and a variety of transparency, confidence-building, and other cooperative measures. Together these developments translate into a technological overlap between certain institutional verification measures, such as the NPT's safeguards requirements under the IAEA and the CWC's verification provisions under the OPCW. Hence, a priority of international treaty-implementing organizations is exploring the development of a synergistic and coordinated approach to WMD policy making that takes into account the existing inter-linkages between nuclear, chemical, and biological weapons issues. Specific areas of coordination include harmonizing information systems and information exchanges and the shared application of scientific mechanisms, as well as collaboration on technological developments

  3. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of the FMCT verification provisions. This paper will explore the general concerns on FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Moreover, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns.

  4. Knowing How Good Our Searches Are: An Approach Derived from Search Filter Development Methodology

    Directory of Open Access Journals (Sweden)

    Sarah Hayman

    2015-12-01

Full Text Available Objective – Effective literature searching is of paramount importance in supporting evidence based practice, research, and policy. Missed references can have adverse effects on outcomes. This paper reports on the development and evaluation of an online learning resource, designed for librarians and other interested searchers, presenting an evidence based approach to enhancing and testing literature searches. Methods – We developed and evaluated the set of free online learning modules for librarians called Smart Searching, suggesting the use of techniques derived from search filter development undertaken by the CareSearch Palliative Care Knowledge Network and its associated project Flinders Filters. The searching module content has been informed by the processes and principles used in search filter development. The self-paced modules are intended to help librarians and other interested searchers test the effectiveness of their literature searches, provide evidence of search performance that can be used to improve searches, as well as to evaluate and promote searching expertise. Each module covers one of four techniques, or core principles, employed in search filter development: (1) collaboration with subject experts; (2) use of a reference sample set; (3) term identification through frequency analysis; and (4) iterative testing. Evaluation of the resource comprised ongoing monitoring of web analytics to determine factors such as numbers of users and geographic origin; a user survey conducted online elicited qualitative information about the usefulness of the resource. Results – The resource was launched in May 2014. Web analytics show over 6,000 unique users from 101 countries (at 9 August 2015). Responses to the survey (n=50) indicated that 80% would recommend the resource to a colleague. Conclusions – An evidence based approach to searching, derived from search filter development methodology, has been shown to have value as an online learning
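The frequency-analysis technique listed as core principle (3) can be sketched in a few lines: count how many documents in a reference sample set contain each term, then keep the terms shared by most of the set as candidate search terms. The sample records, stopword list, and threshold below are illustrative assumptions, not material from the Smart Searching modules themselves.

```python
from collections import Counter
import re

# Hypothetical reference sample set: titles of known-relevant records
# (made-up strings, not from the CareSearch project).
gold_set = [
    "Palliative care needs of patients with advanced cancer",
    "Symptom management in terminal illness and palliative medicine",
    "End of life care and palliative support for heart failure",
]

STOPWORDS = {"of", "with", "and", "in", "for", "the"}

def term_frequencies(documents):
    """Count how many documents each term appears in (document frequency)."""
    counts = Counter()
    for doc in documents:
        terms = set(re.findall(r"[a-z]+", doc.lower())) - STOPWORDS
        counts.update(terms)
    return counts

freqs = term_frequencies(gold_set)
# Candidate filter terms are those appearing in at least two sample records.
candidates = [t for t, n in freqs.most_common() if n >= 2]
print(candidates)  # ['palliative', 'care']
```

In an actual filter-development workflow the candidate list would then feed the iterative-testing step (core principle 4): run the draft search, measure recall against the sample set, and adjust.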

  5. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
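As a rough numerical sketch of how reference-laboratory remeasurements might be used to estimate systematic error (the figures and the simple bias/spread decomposition below are illustrative assumptions, not data from this record): the mean of the facility-minus-reference differences estimates the systematic (bias) error, while their spread estimates the random error component.

```python
import statistics

# Hypothetical paired measurements of the same items: the facility's
# declared values vs. remeasurements by a reference laboratory (kg U).
facility  = [10.12, 25.40, 8.31, 40.05, 15.22]
reference = [10.02, 25.21, 8.25, 39.80, 15.10]

# Per-item differences (facility minus reference).
diffs = [f - r for f, r in zip(facility, reference)]

# Mean difference -> systematic (bias) error of the facility's system;
# sample standard deviation of the differences -> random error component.
bias = statistics.mean(diffs)
random_err = statistics.stdev(diffs)
print(f"bias = {bias:.3f} kg, random = {random_err:.4f} kg")
```

A consistently positive bias of this kind would be exactly the sort of finding that feeds into accepting or rejecting an inventory declaration.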

  6. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  7. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  8. Development of evaluation and performance verification technology for radiotherapy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Jang, S. Y.; Kim, B. H. and others

    2005-02-15

No matter how strongly its importance is emphasized, the exact assessment of the absorbed doses administered to patients treated with radiotherapy for diseases such as the recently soaring incidence of malignant tumors is the most important factor. In reality, several cases of patients over-exposed during radiotherapy have become very serious social issues. In particular, the development of a technology to exactly assess the high doses and high energies generated by radiation generators and irradiation equipment (in general, doses administered to patients in radiotherapy are very large, about three times higher than lethal doses) is a pressing task to be promptly addressed. Over fifty medical centers in Korea operate radiation generators and irradiation equipment for radiotherapy. However, neither are the legal and regulatory systems to implement a quality assurance program sufficiently stipulated, nor are qualified personnel who could run a program to maintain the quality assurance and control of those generators and equipment sufficiently employed in the medical facilities. To overcome these deficiencies, a quality assurance program such as those developed in technically advanced countries should be developed to exactly assess the doses administered to patients and to establish the necessary procedures for maintaining the continuing performance of the machines and equipment used for radiotherapy. The QA program and procedures should ensure proper calibration of the machines and equipment and definitely establish the safety of patients in radiotherapy. In this study, a methodology for the verification and evaluation of radiotherapy doses is developed, and several accurate measurements, evaluations of the doses delivered to patients and verification of the performance of the therapy machine and equipment are
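A dose-verification check of the kind such a QA program performs can be sketched as a tolerance comparison between measured and planned dose. The 3% tolerance used below is a commonly cited QA figure, and all numbers are illustrative assumptions rather than values from this report.

```python
# Hedged sketch of a per-measurement dose verification check: flag any
# delivered-dose measurement deviating from the planned dose by more
# than a fractional tolerance.
PLANNED_DOSE_GY = 2.00   # planned dose per fraction (illustrative)
TOLERANCE = 0.03         # 3% fractional tolerance (common QA figure)

measurements_gy = [1.98, 2.02, 2.09, 1.95]  # made-up measured doses

def within_tolerance(measured, planned=PLANNED_DOSE_GY, tol=TOLERANCE):
    """True if the relative deviation from the plan is within tolerance."""
    return abs(measured - planned) / planned <= tol

flags = [within_tolerance(m) for m in measurements_gy]
print(flags)  # the 2.09 Gy reading (4.5% deviation) fails the 3% check
```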

  9. The history of ceramic filters.

    Science.gov (United States)

    Fujishima, S

    2000-01-01

    The history of ceramic filters is surveyed. Included is the history of piezoelectric ceramics. Ceramic filters were developed using technology similar to that of quartz crystal and electro-mechanical filters. However, the key to this development involved the theoretical analysis of vibration modes and material improvements of piezoelectric ceramics. The primary application of ceramic filters has been for consumer-market use. Accordingly, a major emphasis has involved mass production technology, leading to low-priced devices. A typical ceramic filter includes monolithic resonators and capacitors packaged in unique configurations.

  10. Laboratory for filter testing

    Energy Technology Data Exchange (ETDEWEB)

    Paluch, W.

    1987-07-01

    Filters used for mine draining in brown coal surface mines are tested by the Mine Draining Department of Poltegor. Laboratory tests of new types of filters developed by Poltegor are analyzed. Two types of tests are used: tests of scale filter models and tests of experimental units of new filters. Design and operation of the test stands used for testing mechanical properties and hydraulic properties of filters for coal mines are described: dimensions, pressure fluctuations, hydraulic equipment. Examples of testing large-diameter filters for brown coal mines are discussed.

  11. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision 1. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files

  12. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    Kim, S.-H.; Kim, K. K.; Chang, M. H.; Kang, C. S.; Park, G.-C.

    2002-01-01

An advanced PWR with a rated thermal power of 330 MW has been developed at the Korea Atomic Energy Research Institute (KAERI) for a dual purpose: seawater desalination and electricity generation. The conceptual design of SMART (System-Integrated Modular Advanced ReacTor) with a desalination system was already completed in March of 1999. The basic design for the integrated nuclear desalination system is currently underway and will be finished by March of 2002. The SMART co-generation plant with the MED seawater desalination process is designed to supply forty thousand (40,000) tons of fresh water per day and ninety (90) MW of electricity to an area with a population of approximately one hundred thousand (100,000) or an industrialized complex. This paper describes advanced design features adopted in the SMART design and also introduces the design and engineering verification program. In the beginning stage of the SMART development, top-level requirements for safety and economics were imposed on the SMART design features. To meet the requirements, highly advanced design features enhancing the safety, reliability, performance, and operability are introduced in the SMART design. The SMART consists of proven KOFA (Korea Optimized Fuel Assembly), helical once-through steam generators, a self-controlled pressurizer, control element drive mechanisms, and main coolant pumps in a single pressure vessel. In order to enhance safety characteristics, innovative design features adopted in the SMART system are low core power density, large negative Moderator Temperature Coefficient (MTC), high natural circulation capability and integral arrangement to eliminate large break loss of coolant accidents, etc. The progression of emergency situations into accidents is prevented with a number of advanced engineered safety features such as passive residual heat removal system, passive emergency core cooling system, safeguard vessel, and passive containment over-pressure protection. The preliminary

  13. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

Highlights: • An example on life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence and ensure reliability of a nuclear FPGA-based safety system, life cycle processes of disciplined specification and implementation of the design, as well as verification and validation (V&V) against regulations, are needed. A specific example on how to conduct the life cycle development process and V&V on an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for the PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and provide input data to the under-test FPGA-based CHR protection system and a verified C code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  14. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  15. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  16. Development of built-in debris-filter bottom nozzle for PWR fuel assemblies

    International Nuclear Information System (INIS)

    Juntaro Shimizu; Kazuki Monaka; Masaji Mori; Kazuo Ikeda

    2005-01-01

Mitsubishi Heavy Industries, Ltd. (MHI) has worked to improve the anti-debris capability of the bottom nozzle for a PWR fuel assembly. The current debris filter bottom nozzle (DFBN), having 4 mm diameter flow holes, can capture only debris larger than the flow-hole inner diameter. MHI has completed the development of the built-in debris-filter bottom nozzle, a new debris-filter concept for high burnup (55 GWd/t assembly-average burnup). The built-in debris-filter bottom nozzle consists of blades and a nozzle body. The blades, made from Inconel strip, are embedded and welded on the grooved top surface of the bottom nozzle adapter plate. Each flow hole is divided by a blade, reducing the size of debris that can pass through. Because the blades block the coolant flow, an increase in the pressure loss of the nozzle was anticipated; however, by adjusting the relation between the blade and the taper shape of the flow hole, the pressure loss has been successfully maintained at a satisfactory level. Although grooves are cut in the nozzle plate, additional skirts on the four sides of the nozzle compensate for the structural strength. (authors)

  17. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work...... is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from...... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components....

  18. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase 1) and ISMS implementation (Phase 2) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  19. Unified and Modular Modeling and Functional Verification Framework of Real-Time Image Signal Processors

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2016-01-01

Full Text Available In the VLSI industry, image signal processing algorithms are developed and evaluated using software models before implementation of RTL and firmware. After the finalization of the algorithm, software models are used as a golden reference model for the image signal processor (ISP) RTL and firmware development. In this paper, we describe the unified and modular modeling framework of image signal processing algorithms used for different applications such as ISP algorithm development, reference for hardware (HW) implementation, reference for firmware (FW) implementation, and bit-true certification. The universal verification methodology (UVM) based functional verification framework of image signal processors using software reference models is described. Further, IP-XACT based tools for automatic generation of functional verification environment files and model map files are described. The proposed framework is developed both with host interface and with core using the virtual register interface (VRI) approach. This modeling and functional verification framework is used in real-time image signal processing applications including cellphones, smart cameras, and image compression. The main motivation behind this work is to propose an efficient, reusable, and automated framework for modeling and verification of image signal processor (ISP) designs. The proposed framework shows better results and significant improvement is observed in product verification time, verification cost, and quality of the designs.
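The bit-true comparison at the heart of such a framework can be illustrated with a minimal scoreboard-style sketch: the golden software reference model and the device under test are driven with the same stimulus and their outputs compared exactly. The 3-tap box filter and all names below are hypothetical stand-ins, not the paper's actual ISP or UVM code.

```python
import random

# Golden software reference model: a stand-in "ISP" stage, here a
# 3-tap integer box filter over a line of pixels.
def golden_model(pixels):
    return [(pixels[i - 1] + pixels[i] + pixels[i + 1]) // 3
            for i in range(1, len(pixels) - 1)]

def dut(pixels):
    # Imagine this wraps the RTL/firmware output; identical here,
    # so the bit-true check passes.
    return golden_model(pixels)

# Scoreboard: drive both with the same stimulus and count mismatches.
random.seed(0)
stimulus = [random.randrange(256) for _ in range(64)]
mismatches = sum(g != d
                 for g, d in zip(golden_model(stimulus), dut(stimulus)))
print("bit-true match" if mismatches == 0 else f"{mismatches} mismatches")
```

In a real UVM environment the same comparison happens continuously in a scoreboard component fed by monitors on the DUT interfaces.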

  20. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Directory of Open Access Journals (Sweden)

    Byeong Hak Kim

    2017-12-01

Full Text Available Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.
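For context, the two-point correction (TPC) baseline that the paper identifies as limited can be sketched as follows: a per-pixel gain and offset are computed from flat-field frames at two reference levels, mapping every pixel onto a common radiometric scale. The 2x2 sensor counts and reference levels below are made-up illustrative values.

```python
# Flat-field responses of a tiny 2x2 sensor to two uniform sources
# (illustrative counts, not real sensor data).
cold = [[100.0, 110.0], [ 95.0, 105.0]]   # response to uniform cold source
hot  = [[200.0, 215.0], [190.0, 212.0]]   # response to uniform hot source
T_COLD, T_HOT = 20.0, 40.0                 # reference radiometric levels

def two_point_correct(raw, cold, hot, t_cold, t_hot):
    """Apply per-pixel gain/offset so both references map to T_COLD/T_HOT."""
    out = []
    for i in range(len(raw)):
        row = []
        for j in range(len(raw[0])):
            gain = (t_hot - t_cold) / (hot[i][j] - cold[i][j])
            offset = t_cold - gain * cold[i][j]
            row.append(gain * raw[i][j] + offset)
        out.append(row)
    return out

# A uniform scene comes out uniform after correction:
print(two_point_correct(cold, cold, hot, T_COLD, T_HOT))  # all pixels 20.0
```

TPC's limitation, which motivates SBNUC and ultimately BRANF, is that the gain/offset pair is fixed at calibration time and cannot track noise that drifts during operation.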

  1. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-01-01

Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970

  2. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications.

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-12-27

Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  3. Development and evaluation of hot filters

    International Nuclear Information System (INIS)

    Thexton, H.E.

    1975-01-01

    High temperature, high flow filtration removes radioactive particles from the primary coolant, as well as inactive particles before they can become activated. Canadian experience with edge, graphite, and magnetic filters is described. (Author)

  4. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

Once FPGA (Field-Programmable Gate Array) designers design Verilog programs, the commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on HDL designs for correctness of functionality. Nuclear regulation authorities, however, require a more rigorous demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality', and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV,' which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of the RPS in a Korean nuclear power plant in order to show the efficiency of the proposed formal verification technique and implemented translator. It
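The behavioral-equivalence idea can be illustrated with a toy check: a behavioral specification (a 1-bit full adder) against a gate-level "netlist" version, compared over every input vector. This is a hand-rolled exhaustive sketch under invented names, not the paper's VIS/BLIF-MV flow.

```python
from itertools import product

# Behavioral specification of a 1-bit full adder.
def spec(a, b, cin):
    s = (a + b + cin) % 2
    cout = (a + b + cin) // 2
    return s, cout

# Gate-level version, as a synthesizer might emit (XOR/AND/OR structure).
def netlist(a, b, cin):
    p = a ^ b
    s = p ^ cin
    cout = (a & b) | (p & cin)
    return s, cout

# Exhaustive check over all 2^3 input vectors -- feasible for a toy;
# industrial tools such as VIS reach the same verdict via BDD/SAT
# techniques without enumerating the input space.
equivalent = all(spec(*v) == netlist(*v) for v in product([0, 1], repeat=3))
print(equivalent)  # True
```

If the two disagreed on any vector, the check would pinpoint a synthesis (or modeling) fault that simulation might miss.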

  5. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

Once FPGA (Field-Programmable Gate Array) designers design Verilog programs, the commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on HDL designs for correctness of functionality. Nuclear regulation authorities, however, require a more rigorous demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality', and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV,' which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of the RPS in a Korean nuclear power plant in order to show the efficiency of the proposed formal verification technique and implemented translator. It

  6. OPTIMIZATION OF ADVANCED FILTER SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Newby; G.J. Bruck; M.A. Alvin; T.E. Lippert

    1998-04-30

    Reliable, maintainable and cost effective hot gas particulate filter technology is critical to the successful commercialization of advanced, coal-fired power generation technologies, such as IGCC and PFBC. In pilot plant testing, the operating reliability of hot gas particulate filters has been periodically compromised by process issues, such as process upsets and difficult ash cake behavior (ash bridging and sintering), and by design issues, such as cantilevered filter elements damaged by ash bridging, or excessively close packing of filtering surfaces resulting in unacceptable pressure drop or filtering surface plugging. This test experience has focused the issues and has helped to define advanced hot gas filter design concepts that offer higher reliability. Westinghouse has identified two advanced ceramic barrier filter concepts that are configured to minimize the possibility of ash bridge formation and to be robust against ash bridges should they occur. The ''inverted candle filter system'' uses arrays of thin-walled, ceramic candle-type filter elements with inside-surface filtering, and contains the filter elements in metal enclosures for complete separation from ash bridges. The ''sheet filter system'' uses ceramic, flat plate filter elements supported from vertical pipe-header arrays that provide geometry that avoids the buildup of ash bridges and allows free fall of the back-pulse released filter cake. The Optimization of Advanced Filter Systems program is being conducted to evaluate these two advanced designs and to ultimately demonstrate one of the concepts at pilot scale. In the Base Contract program, the subject of this report, Westinghouse has developed conceptual designs of the two advanced ceramic barrier filter systems to assess their performance, availability and cost potential, and to identify technical issues that may hinder the commercialization of the technologies. A plan for the Option I, bench

  7. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  8. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1999-04-01

    A PHWR advanced fuel named the CANFLEX fuel has been developed through a KAERI/AECL joint program. The KAERI-made fuel bundle was tested at the KAERI Hot Test Loop for the performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI has also been developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in CANDU reactors. For the purpose of proving the safety of the RU handling techniques and appraising the feasibility of CANFLEX-RU fuel fabrication in the near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs

  9. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D. [and others

    1999-04-01

    A PHWR advanced fuel named the CANFLEX fuel has been developed through a KAERI/AECL joint program. The KAERI-made fuel bundle was tested at the KAERI Hot Test Loop for the performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI has also been developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in CANDU reactors. For the purpose of proving the safety of the RU handling techniques and appraising the feasibility of CANFLEX-RU fuel fabrication in the near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs.

  10. Battery algorithm verification and development using hardware-in-the-loop testing

    Science.gov (United States)

    He, Yongsheng; Liu, Wei; Koch, Brain J.

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate battery algorithm future development and improvement, and to reduce hybrid/electric vehicle system development time and costs.
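The SOC predictions exercised on such an HIL rig are typically anchored by current integration. A minimal coulomb-counting sketch (not the embedded GM algorithm; cell capacity and the current profile below are made-up numbers) looks like:

```python
def soc_coulomb_count(soc0, currents_a, dt_s, capacity_ah):
    """Integrate cell current (positive = discharge) to update state of charge.

    soc0        -- initial SOC as a fraction (0.0 .. 1.0)
    currents_a  -- sequence of current samples in amperes
    dt_s        -- sample period in seconds
    capacity_ah -- rated cell capacity in ampere-hours
    """
    soc = soc0
    for i in currents_a:
        soc -= i * dt_s / (capacity_ah * 3600.0)  # Ah -> ampere-seconds
    return max(0.0, min(1.0, soc))  # clamp to the physical range
```

For example, discharging a 2 Ah cell at 2 A for 1800 s removes 1 Ah, so the SOC drops from 1.0 to 0.5. Production algorithms combine this integration with voltage-based correction to bound drift, which is exactly the behavior an HIL test can probe against measured cell data.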

  11. Battery algorithm verification and development using hardware-in-the-loop testing

    Energy Technology Data Exchange (ETDEWEB)

    He, Yongsheng [General Motors Global Research and Development, 30500 Mound Road, MC 480-106-252, Warren, MI 48090 (United States); Liu, Wei; Koch, Brain J. [General Motors Global Vehicle Engineering, Warren, MI 48090 (United States)

    2010-05-01

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate battery algorithm future development and improvement, and to reduce hybrid/electric vehicle system development time and costs. (author)

  12. The intractable cigarette ‘filter problem’

    Science.gov (United States)

    2011-01-01

    Background When lung cancer fears emerged in the 1950s, cigarette companies initiated a shift in cigarette design from unfiltered to filtered cigarettes. Both the ineffectiveness of cigarette filters and the tobacco industry's misleading marketing of the benefits of filtered cigarettes have been well documented. However, during the 1950s and 1960s, American cigarette companies spent millions of dollars to solve what the industry identified as the ‘filter problem’. These extensive filter research and development efforts suggest a phase of genuine optimism among cigarette designers that cigarette filters could be engineered to mitigate the health hazards of smoking. Objective This paper explores the early history of cigarette filter research and development in order to elucidate why and when seemingly sincere filter engineering efforts devolved into manipulations in cigarette design to sustain cigarette marketing and mitigate consumers' concerns about the health consequences of smoking. Methods Relevant word and phrase searches were conducted in the Legacy Tobacco Documents Library online database, Google Patents, and media and medical databases including ProQuest, JSTOR, Medline and PubMed. Results 13 tobacco industry documents were identified that track prominent developments involved in what the industry referred to as the ‘filter problem’. These reveal a period of intense focus on the ‘filter problem’ that persisted from the mid-1950s to the mid-1960s, featuring collaborations between cigarette producers and large American chemical and textile companies to develop effective filters. In addition, the documents reveal how cigarette filter researchers' growing scientific knowledge of smoke chemistry led to increasing recognition that filters were unlikely to offer significant health protection. One of the primary concerns of cigarette producers was to design cigarette filters that could be economically incorporated into the massive scale of cigarette

  13. Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.

    Science.gov (United States)

    Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David

    2013-12-01

    Recently, a new biometric identifier, namely the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without greatly increasing the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.
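The fusion idea above can be sketched as a binary switch between the two matching distances. The switching criterion and threshold below are hypothetical stand-ins, not the authors' exact rule; the point is only the structure of score-level adaptive binary fusion:

```python
def fuse_distances(d_before, d_after, shrink_thresh=0.6):
    """Adaptively pick one of two matching distances (smaller = better match).

    d_before -- distance of the raw query against the template
    d_after  -- distance after dictionary-based reconstruction of the query
    """
    # If reconstruction shrank the distance markedly, this looks like a
    # genuine match distorted by finger pose, so trust the reduced score.
    if d_after < shrink_thresh * d_before:
        return d_after
    # Otherwise keep the original distance, so impostor scores are not
    # pulled down by the reconstruction step.
    return d_before
```

A genuine match whose distance drops from 1.0 to 0.4 after reconstruction is scored at 0.4, while an impostor whose distance barely changes (1.0 to 0.9) keeps its original score.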

  14. The Learner Verification of Series r: The New Macmillan Reading Program; Highlights.

    Science.gov (United States)

    National Evaluation Systems, Inc., Amherst, MA.

    National Evaluation Systems, Inc., has developed curriculum evaluation techniques, in terms of learner verification, which may be used to help the curriculum-development efforts of publishing companies, state education departments, and universities. This document includes a summary of the learner-verification approach, with data collected about a…

  15. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42mm miniature replica step gauge developed for optical scanner verification. Errors quantification and optimization of CT system set-up in terms of resolution and measurement accuracy are fundamental for use of CT scanning...

  16. Tunable Multiband Microwave Photonic Filters

    Directory of Open Access Journals (Sweden)

    Mable P. Fok

    2017-11-01

    The increasing demand for multifunctional devices, the use of cognitive wireless technology to solve the frequency resource shortage problem, and the capabilities and operational flexibility necessary to meet an ever-changing environment result in an urgent need for multiband wireless communications. A spectral filter is an essential part of any communication system, and in the case of multiband wireless communications, tunable multiband RF filters are required for channel selection, noise/interference removal, and RF signal processing. Unfortunately, it is difficult for RF electronics to achieve both tunable and multiband spectral filtering. Recent advances in microwave photonics have proven it to be a promising candidate for solving various challenges in RF electronics, including spectral filtering; however, the development of multiband microwave photonic filtering still faces many difficulties due to the limited scalability and tunability of existing microwave photonic schemes. In this review paper, we first discuss the challenges faced by multiband microwave photonic filters; we then review recent techniques that have been developed to tackle these challenges and have led to promising developments in tunable microwave photonic multiband filters. The successful design and implementation of tunable microwave photonic multiband filters facilitates the vision of dynamic multiband wireless communications and radio-frequency signal processing for commercial, defense, and civilian applications.

  17. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference.
The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  18. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.

  19. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design with the regulations related to a product is necessary. For this, this study presents a new method for the verification of product design using regulation knowledge base and Web services. Regulation knowledge base consisting of product ontology and rules was built with a hybrid technique combining ontology and programming languages. Web service for design verification was developed ensuring the flexible extension of knowledge base. By virtue of two technical features, design verification is served to various products while the change of system architecture is minimized.
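The record above describes encoding regulation rules against product data. A minimal rule-based verification sketch (the attribute names and limits below are invented for illustration, not from the paper's knowledge base) is:

```python
# Each regulation rule is a named predicate over a product record (a dict).
RULES = {
    "max_weight_kg": lambda p: p["weight_kg"] <= 30.0,
    "has_safety_label": lambda p: p.get("safety_label", False),
}

def verify_design(product):
    """Return the names of all regulation rules the product design violates."""
    return [name for name, rule in RULES.items() if not rule(product)]
```

A compliant design yields an empty violation list; a 40 kg product with no safety label fails both rules. In the paper this checking sits behind a Web service over an ontology-based knowledge base, so the rule set can grow without changing the system architecture.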

  20. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design with the regulations related to a product is necessary. For this, this study presents a new method for the verification of product design using regulation knowledge base and Web services. Regulation knowledge base consisting of product ontology and rules was built with a hybrid technique combining ontology and programming languages. Web service for design verification was developed ensuring the flexible extension of knowledge base. By virtue of two technical features, design verification is served to various products while the change of system architecture is minimized.

  1. Octennial History of the Development and Quality of High-Efficiency Filters for the US Atomic Energy Program

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, H. [United States Atomic Energy Commission, Washington, DC (United States)

    1968-12-15

    Two facilities are operated for the US Atomic Energy Commission in its program to ensure the uniform quality of commercially manufactured high-efficiency particulate filters. The filter-testing program was started in January 1960 after it was realized that the commercial fire-resistant product incorporated deficiencies of manufacture. The record of testing for quality assurance by the two facilities and an analysis of factors governing the quality of filters are presented. The analysis is complemented with a description of efforts, made in the course of the filter testing, to improve the design of the filter for efficiency and reliability. The fire-resistant (HEPA) filter of 1959 was inadequate. The inadequacy of the filter, now judged by reflection, was brought about by the intensive accelerated efforts to replace and preclude, wherever possible in the US atomic energy program, use of filters made of combustible materials. This desire for fire resistance of filters has proliferated widely among other members of the international atomic energy family. The intensive AEC effort caused US industry to produce filters of fire-resistant design but without the opportunity for development of manufacturing technology adequate for ensuring reliability of the filter. The state of the fire-resistant filter today is in sharp contrast to the 1959 filter. The testing program, coupled with a program for continuing improvement of the filter, has resulted in the effective removal of radioactive aerosols at atomic energy installations on a consistent and dependable basis. (author)

  2. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...

  3. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.
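The likelihood-ratio idea in this record can be sketched with toy one-dimensional Gaussian models (the means, variances, and threshold below are invented; real systems estimate per-user and background densities from training data over full feature vectors):

```python
import math

def gauss_pdf(x, mean, var):
    """Density of a 1-D Gaussian at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def likelihood_ratio(x, user_mean, user_var, bg_mean, bg_var):
    """LR = p(x | claimed user) / p(x | background population)."""
    return gauss_pdf(x, user_mean, user_var) / gauss_pdf(x, bg_mean, bg_var)

def verify(x, user_mean=0.0, user_var=1.0, bg_mean=3.0, bg_var=4.0, threshold=1.0):
    """Accept the identity claim when the likelihood ratio exceeds the threshold."""
    return likelihood_ratio(x, user_mean, user_var, bg_mean, bg_var) >= threshold
```

A sample near the user model (x = 0.0) is accepted, while one near the background model (x = 3.0) is rejected; moving the threshold trades false accepts against false rejects.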

  4. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  5. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
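The Cone of Influence reduction named above keeps only the variables on which the checked property transitively depends. A generic explicit sketch of the standard technique (not the authors' customized version) over a variable dependency map:

```python
def cone_of_influence(deps, property_vars):
    """Compute the set of variables relevant to a property.

    deps          -- maps each variable to the set of variables its
                     next-state function reads
    property_vars -- variables mentioned by the property to check
    """
    cone, frontier = set(), set(property_vars)
    while frontier:
        v = frontier.pop()
        if v not in cone:
            cone.add(v)
            # Anything v depends on can influence the property too.
            frontier |= deps.get(v, set())
    return cone
```

Variables outside the cone can be dropped from the model before model checking, which is what makes the state space tractable for large PLC programs.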

  6. 24 CFR 1000.128 - Is income verification required for assistance under NAHASDA?

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Is income verification required for assistance under NAHASDA? 1000.128 Section 1000.128 Housing and Urban Development Regulations Relating to... § 1000.128 Is income verification required for assistance under NAHASDA? (a) Yes, the recipient must...

  7. MACCS2 development and verification efforts

    International Nuclear Information System (INIS)

    Young, M.; Chanin, D.

    1997-01-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the sigma-y and sigma-z plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses
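Enhancement (1) above supplies plume expansion parameters as a table over downwind distance. The table-lookup idea can be sketched as linear interpolation between user-supplied points (the distances and sigma values below are made-up, and MACCS2's actual interpolation scheme may differ):

```python
def sigma_lookup(x_km, table):
    """Interpolate a plume expansion parameter at downwind distance x_km.

    table -- list of (downwind_distance_km, sigma_m) pairs, sorted by distance
    """
    if x_km <= table[0][0]:
        return table[0][1]  # clamp below the first tabulated distance
    for (x0, s0), (x1, s1) in zip(table, table[1:]):
        if x0 <= x_km <= x1:
            # Linear interpolation within the bracketing interval.
            return s0 + (s1 - s0) * (x_km - x0) / (x1 - x0)
    return table[-1][1]  # clamp beyond the last tabulated distance
```

This replaces a fixed analytic formula for plume spread with whatever site-specific curve the analyst tabulates.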

  8. A Proof-checked Verification of a Real-Time Communication Protocol

    NARCIS (Netherlands)

    Polak, I.

    We present an analysis of a protocol developed by Philips to connect several components of an audio-system. The verification of the protocol is carried out using the timed I/O-automata model of Lynch and Vaandrager. The verification has been partially proof-checked with the interactive proof

  9. FULL SCALE REGENERABLE HEPA FILTER DESIGN USING SINTERED METAL FILTER ELEMENTS

    International Nuclear Information System (INIS)

    Gil Ramos; Kenneth Rubow; Ronald Sekellick

    2002-01-01

    A Department of Energy funded contract involved the development of porous metal as a HEPA filter, and the subsequent design of a full-scale regenerable HEPA filtration system (RHFS). This RHFS could replace the glass fiber HEPA filters currently being used on the high level waste (HLW) tank ventilation system with a system that would be moisture tolerant, durable, and cleanable in place. The contract originated in a 1996 investigation at the Savannah River Technology Center (SRTC) regarding the use of porous metal as a HEPA filter material. The contract was divided into Phases I, IIA and IIB. Phase I of the contract evaluated simple filter cylinders in a simulated high level waste (HLW) environment and the ability to clean and regenerate the filter media after fouling. Upon the successful completion of Phase I, Phase IIA was conducted, which included lab scale prototype testing and design of a full-scale system. The work completed under Phase IIA included development of a full-scale system design, development of a filter media meeting the HEPA filtration efficiency that would also be regenerable using prescribed cleaning procedures, and the testing of a single element system prototype at Savannah River. All contract objectives were met. The filter media selected was a nickel material already under development at Mott, which met the HEPA filtration efficiency standard. The Mott nickel media met and exceeded the HEPA requirement, providing 99.99% removal against a requirement of 99.97%. Double open-ended elements of this media were provided to the Savannah River Test Center for HLW simulation testing in the single element prototype filter. These elements performed well and further demonstrated the practicality of a metallic media regenerable HEPA filter system. An evaluation of the manufacturing method on many elements demonstrated the reproducibility needed to meet the HEPA filtration requirement.
The full-scale design of the Mott RHFS incorporated several important

  10. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Handwritten signatures are broadly utilized for personal verification in financial institutions, which ensures the necessity for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which to judge a signature verification tool in banking and other financial institutions.
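The final verification step described above compares extracted features against a stored template. A generic distance-based sketch (the toy feature vectors and threshold are hypothetical; the paper works with full SURF/SIFT descriptor sets rather than a single short vector):

```python
import math

def euclidean(u, v):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def verify_signature(features, template, threshold=0.5):
    """Accept the signature when its features lie close enough to the
    enrolled template for the claimed customer."""
    return euclidean(features, template) <= threshold
```

A query whose features nearly match the enrolled template is accepted; a distant one is rejected. Tuning the threshold trades forgery acceptance against genuine-signature rejection.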

  11. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    done applying conventional methods where requirements and designs are described using natural language, diagrams and pseudo code, and the verification of requirements has been done by code inspection and non-exhaustive testing. These techniques are not sufficient, leading to errors and an in-effective...... for Strategic Research. The work is affiliated with a number of partners: DTU Compute, DTU Transport, DTU Management, DTU Fotonik, Bremen University, Banedanmark, Trafikstyrelsen, DSB, and DSB S-tog. More information about RobustRails project is available at http://www.dtu.dk/subsites/robustrails/English.aspx...

  12. VBMC: a formal verification tool for VHDL programs

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-01-01

The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into subparts and implementing each subpart in hardware and/or software as appropriate. With increasing use of programmable devices like FPGA, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since functional bugs in such hardware subsystems used in safety critical C and I systems have disastrous consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This paper describes an indigenously developed software tool named VBMC (VHDL Bounded Model Checker) for mathematically proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts a hardware design as a VHDL program file, a functional property in PSL, and a verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason for the violation. In case of satisfaction, the proof holds for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)
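The core idea of bounded model checking can be illustrated independently of VHDL. The sketch below is not VBMC: it explicitly enumerates every execution path of a toy transition system up to the bound and returns a counterexample trace when the property fails, whereas a real tool encodes the unrolled design symbolically for a SAT/SMT solver.

```python
def bmc(init, transitions, prop, bound):
    """Explicit-state bounded model checking: explore every path of
    length <= bound from the initial states and return a counterexample
    trace violating prop, or None if prop holds within the bound."""
    frontier = [[s] for s in init]
    for _ in range(bound + 1):
        extended = []
        for path in frontier:
            if not prop(path[-1]):
                return path                     # counterexample trace
            extended += [path + [t] for t in transitions(path[-1])]
        frontier = extended
    return None                                 # holds up to the bound

# Toy design: a modulo-8 counter; property: "the counter never hits 7".
step = lambda s: [(s + 1) % 8]
print(bmc([0], step, lambda s: s != 7, 5))   # None: unreachable in 5 steps
print(bmc([0], step, lambda s: s != 7, 7))   # [0, 1, 2, 3, 4, 5, 6, 7]
```

As in VBMC, a None result is only a proof up to the stated bound, not for unbounded executions.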

  13. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50... were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  14. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases.

  15. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

Out-pile tests with a full scale fuel assembly are to verify the design and to evaluate the performance of the final products. HTL for the hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m³/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow induced vibration test. FAMeCT can perform the bending and vibration tests. The verification of the developed facilities was carried out by comparing against reference data for the fuel assembly which were obtained at the Westinghouse Co. The compared data showed good agreement within uncertainties. FRETONUS, a high temperature and high pressure fretting wear simulator, was developed and performance tested. A performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. The technology of turbulent flow analysis and finite element analysis by computation was developed. From the establishment of out-pile test facilities for full scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  16. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.; Ryabov, Yu.

    2003-01-01

Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components depending on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators who are not aware of system functionality details. It is therefore necessary to help the operator to control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components and also for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed in the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. The test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about component hierarchy and dependencies, and allowing the operator to verify the fun...

  17. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  18. Remotely operated top loading filter housing

    International Nuclear Information System (INIS)

    Ross, M.J.; Carter, J.A.

    1989-01-01

A high-efficiency particulate air (HEPA) filter system was developed for the Fuel Processing Facility at the Idaho Chemical Processing Plant. The system utilizes commercially available HEPA filters and allows in-cell filters to be maintained using operator-controlled remote handling equipment. The remote handling tasks include transport of filters before and after replacement, removal and replacement of the filter from the housing, and filter containment.

  19. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

This paper reports on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper.

  20. Selection vector filter framework

    Science.gov (United States)

    Lukac, Rastislav; Plataniotis, Konstantinos N.; Smolka, Bogdan; Venetsanopoulos, Anastasios N.

    2003-10-01

We provide a unified framework of nonlinear vector techniques outputting the lowest ranked vector. The proposed framework constitutes a generalized filter class for multichannel signal processing. A new class of nonlinear selection filters is based on robust order-statistic theory and the minimization of the weighted distance function to other input samples. The proposed method can be designed to perform a variety of filtering operations, including previously developed filtering techniques such as the vector median, basic vector directional filter, directional distance filter, weighted vector median filters and weighted directional filters. A wide range of filtering operations is guaranteed by the filter structure with two independent weight vectors for the angular and distance domains of the vector space. In order to adapt the filter parameters to varying signal and noise statistics, we also provide generalized optimization algorithms that take advantage of weighted median filters and the relationship between the standard median filter and the vector median filter. Thus, we can deal with both statistical and deterministic aspects of the filter design process. It will be shown that the proposed method has the required properties, such as the capability of modelling the underlying system in the application at hand, robustness with respect to errors in the model of the underlying system, the availability of the training procedure and, finally, the simplicity of filter representation, analysis, design and implementation. Simulation studies also indicate that the new filters are computationally attractive and have excellent performance in environments corrupted by bit errors and impulsive noise.
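As a concrete instance of the selection-filter idea, the vector median filter outputs the window sample that minimises the aggregated distance to all other samples. Below is a minimal NumPy sketch using the unweighted Euclidean distance; the paper's framework generalises this with separate weight vectors for the distance and angular domains.

```python
import numpy as np

def vector_median(window):
    """Return the window sample minimising the sum of Euclidean
    distances to all other samples (the lowest-ranked vector)."""
    w = np.asarray(window, dtype=float)
    # Pairwise distance matrix between all samples in the window.
    dists = np.linalg.norm(w[:, None, :] - w[None, :, :], axis=2)
    return w[dists.sum(axis=1).argmin()]

# A 3x3 window of RGB pixels with one impulsive outlier.
window = [[10, 10, 10]] * 8 + [[255, 0, 128]]
print(vector_median(window))   # [10. 10. 10.]: the outlier is rejected
```

Because the output is always one of the input samples, no artificial colors are introduced, which is why such selection filters behave well under impulsive noise.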

  1. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    OpenAIRE

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting requirements set by CATS consortium based on requirements in Euro NCAP AEB protocols regarding accuracy, repeatability and reproducibility using the developed test hardware. For the cases where verification t...

  2. A knowledge-base verification of NPP expert systems using extended Petri nets

    International Nuclear Information System (INIS)

    Kwon, Il Won; Seong, Poong Hyun

    1995-01-01

The verification phase of a knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assumed error incidence to be limited to rule pairs only. In addition, we consider certainty factors in checking, because most knowledge bases include them.
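A flavour of what such knowledge-base checking involves can be given with a toy example. The sketch below is not COKEP: it only detects redundant and contradictory rule pairs with a naive string negation, whereas the tool described above handles chained errors and certainty factors via extended Petri nets.

```python
def check_rules(rules):
    """Flag simple anomalies among (premise, conclusion) rule pairs:
    redundancy (identical rules) and contradiction (same premise,
    conclusion versus its negation, written 'not X')."""
    neg = lambda c: c[4:] if c.startswith("not ") else "not " + c
    issues = []
    for i, (p1, c1) in enumerate(rules):
        for p2, c2 in rules[i + 1:]:
            if p1 == p2 and c1 == c2:
                issues.append(("redundant", p1, c1))
            elif p1 == p2 and c2 == neg(c1):
                issues.append(("contradictory", p1, c1))
    return issues

rules = [("high_temp", "open_valve"),
         ("high_temp", "open_valve"),       # duplicate rule
         ("high_temp", "not open_valve")]   # conflicting rule
print(check_rules(rules))
```

Chained errors (a contradiction reachable only through several rule firings) require propagating markings through the net rather than inspecting pairs, which is exactly the gap COKEP addresses.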

  3. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    Science.gov (United States)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  4. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: ADI INTERNATIONAL INC. ADI PILOT TEST UNIT NO. 2002-09 WITH MEDIA G2®; PHASE II

    Science.gov (United States)

    Verification testing of the ADI International Inc. Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8, 2003 through May 28,...

  6. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression test that is needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and human resources and to prevent wasting time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel)

  7. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    International Nuclear Information System (INIS)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression test that is needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and human resources and to prevent wasting time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel)
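The essence of an automated non-regression test can be sketched in a few lines (shown here in Python rather than the VBA/Excel implementation the authors used): run the code on fixed cases, record baseline results once, and report any later deviation.

```python
import json
import os
import tempfile

def non_regression(code, cases, baseline_path):
    """Run `code` on every case and compare against stored baselines.
    On the first run the baselines are recorded; afterwards any
    deviating case name is reported as a regression."""
    results = {name: code(inp) for name, inp in cases.items()}
    if not os.path.exists(baseline_path):
        with open(baseline_path, "w") as f:
            json.dump(results, f)
        return []                       # first run: baselines recorded
    with open(baseline_path) as f:
        baseline = json.load(f)
    return [name for name in cases if results[name] != baseline[name]]

cases = {"case1": 2.0, "case2": 4.0}
path = os.path.join(tempfile.mkdtemp(), "baseline.json")
non_regression(lambda x: x ** 2, cases, path)         # record baselines
print(non_regression(lambda x: x ** 2, cases, path))  # []: no regression
print(non_regression(lambda x: x * 2, cases, path))   # ['case2']: regression
```

A real NRT for a system code would compare whole output files or selected plot variables, typically within numerical tolerances rather than by exact equality.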

  8. Development of the neutron filters for JET gamma-ray cameras

    International Nuclear Information System (INIS)

Soare, S.; Curuia, M.; Anghel, M.; Constantin, M.; David, E.; Kiptily, V.; Prior, P.; Edlington, T.; Griph, S.; Krivchenkov, Y.; Popovichev, S.; Riccardo, V.; Syme, B.; Thompson, V.; Murari, A.; Zoita, V.; Bonheure, G.; Le Guern

    2007-01-01

The JET gamma-ray camera diagnostics have already provided valuable information on the gamma-ray imaging of fast ion evaluation in JET plasmas. The JET Gamma-Ray Cameras (GRC) upgrade project deals with the design of appropriate neutron/gamma-ray filters ('neutron attenuators'). The main design parameter was the neutron attenuation factor. The two design solutions, which were finally chosen and developed at the level of scheme design, consist of: a) one quasi-crescent shaped neutron attenuator (for the horizontal camera) and b) two quasi-trapezoid shaped neutron attenuators (for the vertical one). Various neutron-attenuating materials have been considered (lithium hydride with natural isotopic composition and 6 Li enriched, light and heavy water, polyethylene). Pure light water was finally chosen as the attenuating material for the JET gamma-ray cameras. FEA methods used to evaluate the behaviour of the filter casings under the loadings (internal hydrostatic pressure, torques) have proven the stability of the structure. (authors)

  9. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

The project 'Verification of the thermal design of electronic equipment' studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the 'Cool Electronics' research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  10. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  11. Development of aerosol decontamination factor evaluation method for filtered containment venting system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae bong; Kim, Sung Il; Jung, Jaehoon; Ha, Kwang Soon; Kim, Hwan Yeol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

Fission products would be released from molten corium pools which are relocated into the lower plenum of the reactor pressure vessel, onto the concrete pit, and into the core catcher. In addition, steam, hydrogen and noncondensable gases such as CO and CO2 are generated during the core damage progression due to loss of coolant and the molten core-concrete interaction. Consequently, the pressure inside the containment could increase continuously. Filtered containment venting is one action to prevent an uncontrolled release of radioactive fission products caused by an overpressure failure of the containment. After the Fukushima-Daiichi accident, which demonstrated containment failure, many countries have come to consider the implementation of a filtered containment venting system (FCVS) on nuclear power plants where these are not currently applied. In general, evaluation of an FCVS is conducted to determine the decontamination factor under several conditions (aerosol diameter, submergence depth, water temperature, gas flow rate, steam flow rate, pressure, operating time, ...). It is essential to quantify the mass concentration before and after the FCVS to obtain the decontamination factor. This paper presents the development of the evaluation facility for a filtered containment venting system at KAERI and an experimental investigation of aerosol removal performance. The decontamination factor of the FCVS is determined by filter measurements. The result of the aerosol size distribution measurement shows the aerosol removal performance as a function of aerosol size.
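The decontamination factor itself is a simple ratio of aerosol mass (or mass concentration) measured upstream and downstream of the filter. A minimal sketch with illustrative numbers, not measurements from the paper:

```python
def decontamination_factor(mass_upstream, mass_downstream):
    """Decontamination factor (DF): ratio of aerosol mass sampled
    upstream of the filter stage to that sampled downstream.
    A DF of 2000 means 99.95 % of the aerosol mass is retained."""
    return mass_upstream / mass_downstream

# Hypothetical filter measurements in mg (not values from the paper):
df = decontamination_factor(500.0, 0.25)
print(df)               # 2000.0
print(1.0 - 1.0 / df)   # retained mass fraction
```

In practice each DF value is reported together with the test conditions (aerosol size, submergence depth, flow rates, and so on) listed above, since the removal performance varies strongly with them.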

  12. Further development of the cleanable steel HEPA filter, cost/benefit analysis, and comparison with competing technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Lopez, R.; Wilson, K. [Lawrence Livermore National Lab., CA (United States)] [and others

    1997-08-01

We have made further progress in developing a cleanable steel fiber HEPA filter. We fabricated a pleated cylindrical cartridge using commercially available steel fiber media that is made with 1 µm stainless steel fibers and sintered into sheet form. Test results at the Department of Energy (DOE) Filter Test Station at Oak Ridge show the prototype filter cartridge has 99.99% efficiency for 0.3 µm dioctyl phthalate (DOP) aerosols and a pressure drop of 1.5 inches. Filter loading and cleaning tests using AC Fine dust showed the filter could be repeatedly cleaned using reverse air pulses. Our analysis of commercially optimized filters suggests that cleanable steel HEPA filters need to be made from steel fibers less than 1 µm, and preferably 0.5 µm, to meet the standard HEPA filter requirements in production units. We have demonstrated that 0.5 µm steel fibers can be produced using the fiber bundling and drawing process. The 0.5 µm steel fibers are then sintered into small filter samples and tested for efficiency and pressure drop. Test results on the sample showed a penetration of 0.0015% at 0.3 µm and a pressure drop of 1.15 inches at 6.9 ft/min (3.5 cm/s) velocity. Based on these results, steel fiber media can easily meet the requirements of 0.03% penetration and 1.0 inch of pressure drop by using fewer fibers in the media. A cost analysis of the cleanable steel HEPA filter shows that, although the steel HEPA filter costs much more than the standard glass fiber HEPA filter, it has the potential to be very cost effective because of the high disposal costs of contaminated HEPA filters. We estimate that the steel HEPA filter will save an average of $16,000 over its 30 year life. The additional savings from the clean-up costs resulting from ruptured glass HEPA filters during accidents were not included but make the steel HEPA filter even more cost effective. 33 refs., 28 figs., 1 tab.

  13. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates the V and V process followed, document submission requirements in each stage, V and V activities, the checklist used for reviews in each stage, and reports

  14. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  15. In-situ testing of HEPA filters in the nuclear Karlsruhe filter system

    International Nuclear Information System (INIS)

    Ohlmeyer, M.; Stotz, W.

    1977-01-01

Nuclear plant operators and filter manufacturers are endeavouring to improve environmental protection by intensifying process control and/or improving filter quality. In-situ testing is an important element in these efforts since it represents a direct means of checking the success or otherwise of a particular development. The arrangements for in-situ testing should satisfy the following minimum requirements: the staff should not be exposed to risk during the test; the test method should be objective and reproducible as well as being as sensitive as possible; the test method should permit detection of individual leaks in the filter system so that they can be remedied as efficiently as possible; the test equipment should not necessitate modifications to the extract systems or plant construction; the test should be simple and capable of being carried out with a minimum of effort and equipment. GfK has developed the 'Nuclear-Karlsruhe' filter housing in accordance with these principles. This housing permits in-situ testing similar to the DIN 24184 visual oil-fog test or the DOP test. External visual checks on the general condition of the filter are also possible. A safe system of filter changing, with a specially designed plastic bag attachment at an accessible height, considerably increases the degree of protection of operating personnel

  16. Development of DC active filter for high magnetic field stable power supply

    International Nuclear Information System (INIS)

    Wang Lei; Liu Xiaoning

    2008-01-01

    A DC active filter (DAF) with very low current ripple has been developed for the stable power supply system of a high magnetic field device, using PWM and parallel active power filter techniques. With the PWM control technique, the required DAF current can be obtained and the current ripple compensated by monitoring the load voltage; the current ripple becomes very low by adjusting the load voltage. Simulation and analysis show that this system responds quickly to the reference and is effective in suppressing harmonics, especially the low-order harmonics. The feasibility of the proposed scheme is proved on equipment built in the laboratory. (authors)
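The ripple-cancellation principle described in this abstract can be sketched numerically. This is only an illustrative model, not the paper's design: the supply current, harmonic content, and sample rate below are invented, and the ideal DAF is reduced to subtracting the measured ripple from the supply current.

```python
import numpy as np

fs = 100_000                                   # sample rate, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)

i_dc = 1000.0                                  # desired DC load current, A (assumed)
ripple = 5.0 * np.sin(2 * np.pi * 300 * t) \
       + 2.0 * np.sin(2 * np.pi * 600 * t)     # low-order harmonics, A (assumed)
i_supply = i_dc + ripple

# Idealized DAF: estimate the ripple as the deviation from the mean (DC)
# component and inject the opposite current in parallel with the load.
ripple_est = i_supply - i_supply.mean()
i_load = i_supply - ripple_est

print(f"ripple before: {ripple.std():.3f} A rms")
print(f"ripple after:  {(i_load - i_load.mean()).std():.3e} A rms")
```

In the real DAF the compensating current is generated by a PWM inverter from the monitored load voltage rather than computed offline, but the cancellation arithmetic is the same.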

  17. OPTIMIZATION OF ADVANCED FILTER SYSTEMS; TOPICAL

    International Nuclear Information System (INIS)

    R.A. Newby; G.J. Bruck; M.A. Alvin; T.E. Lippert

    1998-01-01

    Reliable, maintainable and cost effective hot gas particulate filter technology is critical to the successful commercialization of advanced, coal-fired power generation technologies, such as IGCC and PFBC. In pilot plant testing, the operating reliability of hot gas particulate filters has periodically been compromised by process issues, such as process upsets and difficult ash cake behavior (ash bridging and sintering), and by design issues, such as cantilevered filter elements damaged by ash bridging, or excessively close packing of filtering surfaces resulting in unacceptable pressure drop or filtering surface plugging. This test experience has focused the issues and has helped to define advanced hot gas filter design concepts that offer higher reliability. Westinghouse has identified two advanced ceramic barrier filter concepts that are configured to minimize the possibility of ash bridge formation and to be robust against ash bridges should they occur. The "inverted candle filter system" uses arrays of thin-walled, ceramic candle-type filter elements with inside-surface filtering, and contains the filter elements in metal enclosures for complete separation from ash bridges. The "sheet filter system" uses ceramic, flat plate filter elements supported from vertical pipe-header arrays that provide geometry that avoids the buildup of ash bridges and allows free fall of the back-pulse released filter cake. The Optimization of Advanced Filter Systems program is being conducted to evaluate these two advanced designs and to ultimately demonstrate one of the concepts at pilot scale. In the Base Contract program, the subject of this report, Westinghouse has developed conceptual designs of the two advanced ceramic barrier filter systems to assess their performance, availability and cost potential, and to identify technical issues that may hinder the commercialization of the technologies.
A plan for the Option I, bench-scale test program has also been developed based

  18. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near-minimal probability-of-error decision algorithm.

  19. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  20. Development and Verification of Tritium Analyses Code for a Very High Temperature Reactor

    International Nuclear Information System (INIS)

    Oh, Chang H.; Kim, Eung S.

    2009-01-01

    A tritium permeation analyses code (TPAC) has been developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in VHTR systems, including integrated hydrogen production systems. The MATLAB SIMULINK software package was used for development of the code. The TPAC is based on mass balance equations for tritium-containing species and various forms of hydrogen (i.e., HT, H2, HTO, HTSO4, and TI), coupled with a variety of tritium source, sink, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of HT and H2 through pipes, vessels, and heat exchangers was considered as the main tritium transport path. In addition, electrolyzer and isotope exchange models were developed for analyzing hydrogen production systems, including both high-temperature electrolysis and the sulfur-iodine process. The TPAC has unlimited flexibility in system configuration and provides easy drag-and-drop model building through a graphical user interface. Verification of the code has been performed by comparisons with analytical solutions and with experimental data based on the Peach Bottom reactor design. Preliminary results calculated with a former tritium analysis code, THYTAN, which was developed in Japan and adopted by the Japan Atomic Energy Agency, were also compared with the TPAC solutions. This report contains descriptions of the basic tritium pathways, theory, a simple user guide, verifications, sensitivity studies, sample cases, and code tutorials. Tritium behavior in a very high temperature reactor/high-temperature steam electrolysis system has been analyzed with the TPAC based on the reference indirect parallel configuration proposed by Oh et al. (2007).
This analysis showed that only 0.4% of tritium released from the core is transferred to the product hydrogen
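The kind of lumped species mass balance that TPAC solves can be illustrated with a single-compartment sketch. All rate constants below except the tritium decay constant are invented placeholders, not VHTR design values.

```python
# Single-compartment tritium inventory N(t) in the primary coolant:
#   dN/dt = S - (decay + purification + leakage + permeation) * N
S = 1.0e-3            # tritium source rate, Ci/s (assumed)
lam_decay = 1.78e-9   # decay constant, 1/s (12.3 y half-life)
lam_purif = 1.0e-5    # purification removal, 1/s (assumed)
lam_leak  = 1.0e-7    # coolant leakage, 1/s (assumed)
lam_perm  = 5.0e-6    # permeation through heat-exchanger walls, 1/s (assumed)
lam_total = lam_decay + lam_purif + lam_leak + lam_perm

# Explicit-Euler integration with 1 h steps over ~5 years.
dt, n_steps = 3600.0, 24 * 365 * 5
N = 0.0
for _ in range(n_steps):
    N += dt * (S - lam_total * N)

N_ss = S / lam_total  # analytical steady-state inventory for comparison
print(f"inventory after 5 y: {N:.4g} Ci (steady state {N_ss:.4g} Ci)")
```

The real code couples many such compartments (core, loops, hydrogen plant) and species, but each one reduces to a balance of this form.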

  1. Filters in nuclear facilities

    International Nuclear Information System (INIS)

    Berg, K.H.; Wilhelm, J.G.

    1985-01-01

    The topics of the nine papers given include the behavior of HEPA filters during exposure to air flows of high humidity as well as of high differential pressure, the development of steel-fiber filters suitable for extreme operating conditions, and the occurrence of various radioactive iodine species in the exhaust air from boiling water reactors. In an introductory presentation the German view of the performance requirements to be met by filters in nuclear facilities as well as the present status of filter quality assurance are discussed. (orig.) [de

  2. Nanofiber Filters Eliminate Contaminants

    Science.gov (United States)

    2009-01-01

    With support from Phase I and II SBIR funding from Johnson Space Center, Argonide Corporation of Sanford, Florida tested and developed its proprietary nanofiber water filter media. Capable of removing more than 99.99 percent of dangerous particles like bacteria, viruses, and parasites, the media was incorporated into the company's commercial NanoCeram water filter, an inductee into the Space Foundation's Space Technology Hall of Fame. In addition to its drinking water filters, Argonide now produces large-scale nanofiber filters used as part of the reverse osmosis process for industrial water purification.

  3. Verification and validation as an integral part of the development of digital systems for nuclear applications

    International Nuclear Information System (INIS)

    Straker, E.A.; Thomas, N.C.

    1983-01-01

    The nuclear industry's current attitude toward verification and validation (V and V) has been shaped by the experience gained to date. On the basis of this experience, V and V can be applied effectively as an integral part of digital system development for nuclear electric power applications. An overview of a typical approach for integrating V and V with system development is presented. This approach represents a balance between V and V as applied in the aerospace industry and the standard practice commonly applied within the nuclear industry today

  4. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check that the calibration of their instruments is maintained at the installations where they are used, avoiding the use of equipment whose response to neutron beams is inadequate

  5. Neutron spectrometric methods for core inventory verification in research reactors

    International Nuclear Information System (INIS)

    Ellinger, A.; Filges, U.; Hansen, W.; Knorr, J.; Schneider, R.

    2002-01-01

    Under the safeguards of the Non-Proliferation Treaty, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate, and the inspection methods are continually being improved. In this context, the Core Inventory Verification method is being developed as an indirect method for verifying the core inventory and checking the declared operation of research reactors

  6. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  7. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    Science.gov (United States)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  8. BAGHOUSE FILTRATION PRODUCTS VERIFICATION TESTING, HOW IT BENEFITS THE BOILER BAGHOUSE OPERATOR

    Science.gov (United States)

    The paper describes the Environmental Technology Verification (ETV) Program for baghouse filtration products developed by the Air Pollution Control Technology Verification Center, one of six Centers under the ETV Program, and discusses how it benefits boiler baghouse operators. A...

  9. Development of laundry drainage treatment system with ceramic ultra filter

    International Nuclear Information System (INIS)

    Kanda, Masanori; Kurahasi, Takafumi

    1995-01-01

    A compact laundry drainage treatment system (UF system hereafter) with a ceramic ultra filter membrane (UF membrane hereafter) has been developed to reduce radioactivity in laundry drainage from nuclear power plants. The UF membrane is made of sintered fine ceramic with 0.01 μm pores, resulting in a durable, heat-resistant, and corrosion-resistant porous ceramic filter medium. A cross-flow system, in which laundry drainage is filtered as it flows across the UF membrane, is used as the filtration method. This method produces less caking than other methods. The UF membrane is back-washed at regular intervals with permeated water to minimize caking of the filter. Together, the UF membrane and the cross-flow system provide long, stable filtration. The ceramic UF membrane is strong enough to concentrate suspended solids in laundry drainage up to a weight concentration of 10%. The final concentrated laundry drainage can be treated in an incinerator. The performance of the UF system was checked using radioactive laundry drainage; the decontamination factor of the UF system was 25 or more. The laundry drainage treatment capacity and concentration ratio of the UF system, as well as the service life of the UF membrane, were also checked by examination using simulated non-radioactive laundry drainage. Even when laundry drainage was concentrated 1000 times, the UF system showed good permeated water quality and permeated water flux. (author)
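The reported figures imply a simple mass balance. The feed volume, feed solids fraction, and feed activity below are assumed values chosen only to illustrate the arithmetic; the 10% solids limit, the 1000-fold concentration, and DF = 25 come from the abstract.

```python
# Back-of-the-envelope mass balance for the cross-flow UF system above.
V_feed = 10.0     # m^3 of laundry drainage per batch (assumed)
w_feed = 0.0001   # feed suspended-solids weight fraction, 0.01% (assumed)
w_final = 0.10    # final concentrate solids fraction (from the abstract)

# Solids are retained by the membrane, so (ignoring density changes)
# V_feed * w_feed = V_conc * w_final:
V_conc = V_feed * w_feed / w_final
concentration_ratio = V_feed / V_conc

DF = 25.0         # decontamination factor reported for the UF system
A_feed = 1.0e5    # feed activity concentration, Bq/L (assumed)
A_permeate = A_feed / DF

print(f"concentrate volume: {V_conc * 1000:.1f} L (x{concentration_ratio:.0f})")
print(f"permeate activity:  {A_permeate:.0f} Bq/L")
```

With the assumed 0.01% feed solids, concentrating to 10% solids reproduces the 1000-fold volume reduction quoted in the abstract.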

  10. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA's Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies through performance verification.

  11. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by both forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  12. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  13. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  14. HEPA Filter Vulnerability Assessment

    International Nuclear Information System (INIS)

    GUSTAVSON, R.D.

    2000-01-01

    This assessment of High Efficiency Particulate Air (HEPA) filter vulnerability was requested by the USDOE Office of River Protection (ORP) to satisfy a DOE-HQ directive to evaluate the effect of filter degradation on the facility authorization basis assumptions. Within the scope of this assessment are ventilation system HEPA filters that are classified as Safety-Class (SC) or Safety-Significant (SS) components that perform an accident mitigation function. The objective of the assessment is to verify whether HEPA filters that perform a safety function during an accident are likely to perform as intended to limit release of hazardous or radioactive materials, considering factors that could degrade the filters. Filter degradation factors considered include aging, wetting of filters, exposure to high temperature, exposure to corrosive or reactive chemicals, and exposure to radiation. Screening and evaluation criteria were developed by a site-wide group of HVAC engineers and HEPA filter experts from published empirical data. For River Protection Project (RPP) filters, the only degradation factor that exceeded the screening threshold was filter aging. A subsequent evaluation of the effect of aging on filter strength was conducted, and the results were compared with the performance required to meet the conditions assumed in the RPP Authorization Basis (AB). It was found that the reduction in filter strength due to aging does not affect the filter performance requirements as specified in the AB. A portion of the HEPA filter vulnerability assessment is being conducted by the ORP and is not part of the scope of this study: the ORP is assessing the existing policies and programs relating to maintenance, testing, and change-out of HEPA filters used for SC/SS service. This document presents the results of a HEPA filter vulnerability assessment conducted for the River Protection Project as requested by the DOE Office of River Protection.

  15. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  16. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  17. High efficiency steel filters for nuclear air cleaning

    International Nuclear Information System (INIS)

    Bergman, W.; Conner, J.; Larsen, G.; Lopez, R.; Turner, C.; Vahla, G.; Violet, C.; Williams, K.

    1991-01-01

    The authors have, in cooperation with industry, developed high-efficiency filters made from sintered stainless-steel fibers for use in several air-cleaning applications in the nuclear industry. These filters were developed to overcome the failure modes of present high-efficiency particulate air (HEPA) filters. HEPA filters are made from glass paper and glue, and they may fail when they get hot or wet or when they are overpressured. In developing the steel filters, the authors first evaluated the commercially available stainless-steel filter media made from sintered powder and sintered fiber. The sintered-fiber media performed much better than the sintered-powder media, and the best media had the smallest fiber diameter. Using the best media, prototype filters were then built for venting compressed gases and evaluated in an automated filter tester

  18. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference.
The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  19. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal year 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for each drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by 'smart' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings.

  20. Enhanced Bank of Kalman Filters Developed and Demonstrated for In-Flight Aircraft Engine Sensor Fault Diagnostics

    Science.gov (United States)

    Kobayashi, Takahisa; Simon, Donald L.

    2005-01-01

    In-flight sensor fault detection and isolation (FDI) is critical to maintaining reliable engine operation during flight. The aircraft engine control system, which computes control commands on the basis of sensor measurements, operates the propulsion systems at the demanded conditions. Any undetected sensor faults, therefore, may cause the control system to drive the engine into an undesirable operating condition. It is critical to detect and isolate failed sensors as soon as possible so that such scenarios can be avoided. A challenging issue in developing reliable sensor FDI systems is to make them robust to changes in engine operating characteristics due to degradation with usage and other faults that can occur during flight. A sensor FDI system that cannot appropriately account for such scenarios may result in false alarms, missed detections, or misclassifications when such faults do occur. To address this issue, an enhanced bank of Kalman filters was developed, and its performance and robustness were demonstrated in a simulation environment. The bank of filters is composed of m + 1 Kalman filters, where m is the number of sensors being used by the control system and, thus, in need of monitoring. Each Kalman filter is designed on the basis of a unique fault hypothesis so that it will be able to maintain its performance if a particular fault scenario, hypothesized by that particular filter, takes place.
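The m + 1 filter-bank idea can be sketched with a toy scalar example: each filter ignores the sensor it hypothesizes as failed, and the hypothesis whose filter accumulates the smallest normalized innovation sum (WSSR) is accepted. The system model, noise levels, and fault magnitude below are invented for illustration and are not from the NASA work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: one scalar "engine state" x measured by m = 3 sensors.
# Hypothesis None = "no fault"; hypothesis j = "sensor j failed",
# implemented here by simply ignoring that sensor's measurement.
m, n_steps, q, r = 3, 200, 0.01, 0.1
bias_sensor, bias = 1, 0.8        # sensor 1 develops a constant bias

def run_filter(exclude, z_hist):
    """Scalar Kalman filter (random-walk state model) that skips sensor
    `exclude`; returns the accumulated normalized innovation sum (WSSR)."""
    idx = [i for i in range(m) if i != exclude]
    x_est, P, wssr = 0.0, 1.0, 0.0
    for z in z_hist:
        P += q                     # predict: x_est unchanged, covariance grows
        for i in idx:              # sequential scalar measurement updates
            S = P + r              # innovation covariance
            nu = z[i] - x_est      # innovation
            wssr += nu * nu / S
            K = P / S
            x_est += K * nu
            P *= 1 - K
    return wssr

# Simulate the "true" engine state and the (partly faulty) measurements.
x_true, z_hist = 0.0, []
for _ in range(n_steps):
    x_true += np.sqrt(q) * rng.standard_normal()
    z = x_true + np.sqrt(r) * rng.standard_normal(m)
    z[bias_sensor] += bias         # the fault hypothesized by filter 1
    z_hist.append(z)

scores = {h: run_filter(h, z_hist) for h in [None, 0, 1, 2]}
isolated = min(scores, key=scores.get)
print("isolated faulty sensor:", isolated)
```

The filter that correctly excludes the biased sensor sees only zero-mean innovations and therefore the lowest WSSR, which isolates the fault; a production system would use a full aircraft-engine model and decision thresholds rather than a simple minimum.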

  1. Filtering Photogrammetric Point Clouds Using Standard LIDAR Filters Towards DTM Generation

    Science.gov (United States)

    Zhang, Z.; Gerke, M.; Vosselman, G.; Yang, M. Y.

    2018-05-01

    Digital Terrain Models (DTMs) can be generated from point clouds acquired by laser scanning or by photogrammetric dense matching. During the last two decades, much effort has been devoted to developing robust filtering algorithms for airborne laser scanning (ALS) data. With the quality of point clouds from dense image matching (DIM) steadily improving, the research question that arises is whether those standard Lidar filters can be used to filter photogrammetric point clouds as well. Experiments are implemented to filter two dense matching point clouds with different noise levels. Results show that the standard Lidar filter is robust to random noise. However, artefacts and blunders often appear in the DIM points due to low contrast or poor texture in the images, and filtering is erroneous in these locations. Filtering DIM points pre-processed by a ranking filter brings a higher Type II error (i.e. non-ground points labelled as ground points) but a much lower Type I error (i.e. bare-ground points labelled as non-ground points). Finally, the potential DTM accuracy that can be achieved with DIM points is evaluated. Two DIM point clouds, derived by Pix4Dmapper and SURE, are compared. On grassland, dense matching generates points higher than the true terrain surface, which results in incorrectly elevated DTMs. The application of the ranking filter leads to a reduced bias in the DTM height, but a slightly increased noise level.
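A minimal 1D variant of the morphological (opening-based) ground filters used for ALS data can illustrate the labelling discussed above; the terrain profile, window size, and height threshold are invented for illustration and are not the parameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic profile: points every 0.5 m along 100 m of gently sloping
# ground, with one 20 m wide, 5 m high non-ground object (a "building").
x = np.arange(0, 100, 0.5)
ground = 0.02 * x
z = ground + 0.03 * rng.standard_normal(x.size)
z[80:120] += 5.0

def morphological_ground_filter(z, window=25, dh_max=0.5):
    """Label a point as ground if it lies within dh_max of the local
    minimum surface (a crude 1D morphological opening)."""
    n = z.size
    opened = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        opened[i] = z[lo:hi].min()
    return z - opened <= dh_max   # boolean ground mask

mask = morphological_ground_filter(z)
print("points labelled ground:", int(mask.sum()), "of", z.size)
print("all building points rejected:", bool((~mask[80:120]).all()))
```

With DIM points, below-terrain blunders would corrupt the local-minimum surface and cause the Type I errors described above, which is why a ranking (percentile) pre-filter helps.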

  2. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: converting nuclear-weapons production complexes, eliminating and monitoring nuclear-weapons delivery systems, disabling and destroying nuclear warheads, demilitarizing or non-military utilization of special nuclear materials, and inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  3. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  4. Development of activated charcoal impregnated air sampling filter media : their characteristics and use

    International Nuclear Information System (INIS)

    Khan, A.A.; Ramarathinam, K.; Gupta, S.K.; Deshingkar, D.S.; Kishore, A.G.

    1975-01-01

    Because of its low maximum permissible concentration in air, air-borne radioiodine must be accurately monitored in contaminated air streams, in the working environment and handling facilities, before release to the environment from nuclear facilities. Activated-charcoal-impregnated air sampling filter media are found to be most suitable for monitoring air-borne iodine-131. Because this method is simple and gives reproducible assessments of air-borne radioactive iodine, work on the development of such media was undertaken in order to find a suitable substitute for imported activated-charcoal-impregnated air sampling filter media. Eight different media of this type were developed, evaluated and compared with two imported media. The most suitable medium, which was found to perform even better than the imported media, is recommended for use in air-borne iodine sampling. (author)

  5. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, model checking is not yet common in industry, as it typically requires formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  6. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    Science.gov (United States)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the

  7. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    International Nuclear Information System (INIS)

    Zhao, J; Hu, W; Xing, Y; Wu, X; Li, Y

    2016-01-01

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. In the log files we can get the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y direction respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve safety of clinical treatments.
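
    The gamma analysis used above can be sketched as a brute-force 2D global gamma computation; the synthetic Gaussian dose distributions, 1 mm grid spacing, and 10% low-dose cut-off below are illustrative assumptions, not the clinical data of the study:

```python
import numpy as np

# Minimal 2D global gamma analysis with a 3%/3 mm criterion.
def gamma_pass_rate(ref, ev, dose_tol=0.03, dist_tol=3.0, spacing=1.0):
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx] * spacing
    dmax = ref.max()
    search = int(np.ceil(dist_tol / spacing)) + 1
    passed = total = 0
    for i in range(ny):
        for j in range(nx):
            if ref[i, j] < 0.1 * dmax:        # skip low-dose region
                continue
            total += 1
            i0, i1 = max(0, i - search), min(ny, i + search + 1)
            j0, j1 = max(0, j - search), min(nx, j + search + 1)
            dd = (ev[i0:i1, j0:j1] - ref[i, j]) / (dose_tol * dmax)
            dr = np.sqrt((yy[i0:i1, j0:j1] - i * spacing) ** 2 +
                         (xx[i0:i1, j0:j1] - j * spacing) ** 2) / dist_tol
            if np.sqrt(dd ** 2 + dr ** 2).min() <= 1.0:
                passed += 1
    return passed / total

# Synthetic Gaussian "spot": evaluated dose shifted 1 mm and scaled by 1%.
y, x = np.mgrid[0:40, 0:40]
ref = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 50.0)
ev = 1.01 * np.exp(-((x - 21) ** 2 + (y - 20) ** 2) / 50.0)
print(f"gamma passing rate (3%/3mm): {gamma_pass_rate(ref, ev):.1%}")
```

    A 1 mm shift with a 1% dose scaling sits well inside the 3%/3 mm tolerance, so the passing rate is high; a larger focus-position error, as reported for the proton beams above, would pull it down.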

  8. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, J; Hu, W [Fudan University Shanghai Cancer Center, Shanghai, Shanghai (China); Xing, Y [Fudan University Shanghai Proton and Heavy Ion Center, Shanghai (China); Wu, X [Fudan University Shanghai Proton and Heavy Ion Center, Shanghai, Shanghai (China); Li, Y [Department of Medical Physics at Shanghai Proton and Heavy Ion Center, Shanghai, Shanghai (China)

    2016-06-15

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. In the log files we can get the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y direction respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve safety of clinical treatments.

  9. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  10. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  11. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and the cost-sensitive classifier was found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  12. A Cache System Design for CMPs with Built-In Coherence Verification

    Directory of Open Access Journals (Sweden)

    Mamata Dalui

    2016-01-01

    This work reports an effective design of a cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for verification of cache coherence in CMPs realizing a directory-based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA, referred to as single-length-cycle 2-attractor cellular automata (TACA), has been planted to detect inconsistencies in the cache line states of processors’ private caches. The TACA module captures the coherence status of the CMPs’ cache system and memorizes any inconsistent recording of cache line states during the processors’ references to a memory block. Theory has been developed to empower a TACA to analyse the cache state updates and then to settle to an attractor state, indicating a quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs’ processor pool ensures better efficiency in determining the inconsistencies by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic points to the fact that the overhead of the proposed coherence verification module is much less than that of conventional verification units and is insignificant with respect to the cost involved in the CMPs’ cache system.
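
    Independent of the TACA hardware described above, the coherence invariant such logic guards can be stated compactly; the MSI-style cache-line states below are an assumption for illustration, not the protocol of the paper:

```python
# Toy coherence invariant check (not the TACA hardware of the paper):
# in a directory-based MSI-style protocol, a memory block may be Modified
# in at most one private cache, and never Modified while Shared elsewhere.
def coherent(states):
    """states: list of per-core cache-line states, 'M', 'S', or 'I'."""
    m = states.count("M")
    s = states.count("S")
    return m <= 1 and not (m == 1 and s > 0)

assert coherent(["M", "I", "I"])
assert coherent(["S", "S", "I"])
assert not coherent(["M", "S", "I"])   # an inconsistent recording
assert not coherent(["M", "M", "I"])
print("all invariant checks passed")
```

    The hardware module memorizes violations of exactly this kind of invariant as state updates stream past, rather than recomputing it in software.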

  13. Box-particle intensity filter

    OpenAIRE

    Schikora, Marek; Gning, Amadou; Mihaylova, Lyudmila; Cremers, Daniel; Koch, Wolfgang; Streit, Roy

    2012-01-01

    This paper develops a novel approach for multi-target tracking, called the box-particle intensity filter (box-iFilter). The approach copes with unknown clutter and false alarms, and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic and data association uncertainty. The box-iFilter reduces the number of particles significantly, which improves the runtime considerably. The low particle number enables thi...

  14. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for the design and verification of Field Programmable Gate Array (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C systems design, development, verification and validation (V and V) are discussed, considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic designs for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms, used in FPGA-based reactor trip systems, is proposed. The results of applying test-based techniques to the assessment of FPGA electronic design algorithms for the reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  15. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification were described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  16. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  17. Top-down design and verification methodology for analog mixed-signal integrated circuits

    NARCIS (Netherlands)

    Beviz, P.

    2016-01-01

    This report introduces a novel Top-Down Design and Verification methodology for AMS integrated circuits. With the introduction of the new design and verification flow, more reliable and efficient development of AMS ICs is possible. The assignment incorporated the research on the

  18. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process on essential electrical single line drawings and panel schedules was established to benefit the operations lock-and-tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF.

  19. Independent verification: operational phase liquid metal breeder reactors

    International Nuclear Information System (INIS)

    Bourne, P.B.

    1981-01-01

    The Fast Flux Test Facility (FFTF) recently achieved 100-percent power and now is in the initial stages of operation as a test reactor. An independent verification program has been established to assist in maintaining stable plant conditions, and to assure the safe operation of the reactor. Independent verification begins with the development of administrative procedures to control all other procedures and changes to the plant configurations. The technical content of the controlling procedures is subject to independent verification. The actual accomplishment of test procedures and operational maneuvers is witnessed by personnel not responsible for operating the plant. Off-normal events are analyzed, problem reports from other operating reactors are evaluated, and these results are used to improve on-line performance. Audits are used to confirm compliance with established practices and to identify areas where individual performance can be improved

  20. Development and testing of a MEDLINE search filter for identifying patient and public involvement in health research.

    Science.gov (United States)

    Rogers, Morwenna; Bethel, Alison; Boddy, Kate

    2017-06-01

    Research involving the public as partners often proves difficult to locate, due to variations in the terms used to describe public involvement and the inability of medical databases to index this concept effectively. To design a search filter to identify literature where patient and public involvement (PPI) was used in health research. A reference standard of 172 PPI papers was formed. The references were divided into a development set and a test set. Search terms were identified from common words, phrases and synonyms in the development set. These terms were combined as a search strategy for MEDLINE via OvidSP, which was then tested for sensitivity against the test set. The resulting search filter was then assessed for sensitivity, specificity and precision using a previously published systematic review. The search filter was found to be highly sensitive (98.5%) in initial testing. When tested against results generated by a 'real-life' systematic review, the filter had a specificity of 81%; however, sensitivity dropped to 58%. Adjustments to the population group of terms increased the sensitivity to 73%. The PPI filter designed for MEDLINE via OvidSP could aid information specialists and researchers trying to find literature specific to PPI. © 2016 Health Libraries Group.
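
    The three reported metrics (sensitivity, specificity, precision) follow directly from counts of retrieved and relevant records; the counts below are illustrative assumptions, not the study's data:

```python
# Sensitivity, specificity, and precision of a bibliographic search filter,
# evaluated against a reference standard of known-relevant records.
def filter_metrics(retrieved, relevant, database_size):
    tp = len(retrieved & relevant)          # relevant records the filter found
    fn = len(relevant - retrieved)          # relevant records it missed
    fp = len(retrieved - relevant)          # irrelevant records it returned
    tn = database_size - tp - fn - fp       # irrelevant records it excluded
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "precision": tp / (tp + fp)}

relevant = set(range(100))                          # reference standard: 100 papers
retrieved = set(range(80)) | set(range(100, 140))   # filter finds 80, plus 40 noise
m = filter_metrics(retrieved, relevant, database_size=10_000)
print(m)  # sensitivity 0.80, specificity ~0.996, precision ~0.667
```

    The asymmetry seen in the study, high specificity but lower sensitivity against a real review, corresponds to a large true-negative count dominating specificity while missed relevant records drag sensitivity down.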

  1. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product, in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve the quality and control of drug products. This involved an increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  2. Hybrid Filter Membrane

    Science.gov (United States)

    Laicer, Castro; Rasimick, Brian; Green, Zachary

    2012-01-01

    Cabin environmental control is an important issue for a successful Moon mission. Due to the unique environment of the Moon, lunar dust control is one of the main problems that significantly diminishes the air quality inside spacecraft cabins. Therefore, this innovation was motivated by NASA's need to minimize the negative health impact that air-suspended lunar dust particles have on astronauts in spacecraft cabins. It is based on fabrication of a hybrid filter comprising nanofiber nonwoven layers coated on porous polymer membranes with uniform cylindrical pores. This design results in a high-efficiency gas particulate filter with low pressure drop and the ability to be easily regenerated to restore filtration performance. A hybrid filter was developed consisting of a porous membrane with uniform, micron-sized, cylindrical pore channels coated with a thin nanofiber layer. Compared to conventional filter media such as a high-efficiency particulate air (HEPA) filter, this filter is designed to provide high particle efficiency, low pressure drop, and the ability to be regenerated. These membranes have well-defined micron-sized pores and can be used independently as air filters with a discrete particle-size cut-off, or coated with nanofiber layers for filtration of ultrafine nanoscale particles. The filter has a thin design intended to facilitate filter regeneration by localized air pulsing. The main feature of this invention is the combination of a micro-engineered straight-pore membrane with nanofibers. The micro-engineered straight-pore membrane can be prepared with extremely high precision. Because the resulting membrane pores are straight and not tortuous like those found in conventional filters, the pressure drop across the filter is significantly reduced. The nanofiber layer is applied as a very thin coating to enhance filtration efficiency for fine nanoscale particles. Additionally, the thin nanofiber coating is designed to promote capture of

  3. Verification of the MOTIF code version 3.0

    International Nuclear Information System (INIS)

    Chan, T.; Guvanasen, V.; Nakka, B.W.; Reid, J.A.K.; Scheier, N.W.; Stanchell, F.W.

    1996-12-01

    As part of the Canadian Nuclear Fuel Waste Management Program (CNFWMP), AECL has developed a three-dimensional finite-element code, MOTIF (Model Of Transport In Fractured/porous media), for detailed modelling of groundwater flow, heat transport and solute transport in a fractured rock mass. The code solves the transient and steady-state equations of groundwater flow, solute (including one-species radionuclide) transport, and heat transport in variably saturated fractured/porous media. The initial development was completed in 1985 (Guvanasen 1985) and version 3.0 was completed in 1986. This version is documented in detail in Guvanasen and Chan (in preparation). This report describes a series of fourteen verification cases which have been used to test the numerical solution techniques and coding of MOTIF, as well as to demonstrate some of the MOTIF analysis capabilities. For each case the MOTIF solution has been compared with a corresponding analytical or independently developed alternate numerical solution. Several of the verification cases were included in Level 1 of the International Hydrologic Code Intercomparison Project (HYDROCOIN). The MOTIF results for these cases were also described in the HYDROCOIN Secretariat's compilation and comparison of results submitted by the various project teams (Swedish Nuclear Power Inspectorate 1988). It is evident from the graphical comparisons presented that the MOTIF solutions for the fourteen verification cases are generally in excellent agreement with known analytical or numerical solutions obtained from independent sources. This series of verification studies has established the ability of the MOTIF finite-element code to accurately model the groundwater flow and solute and heat transport phenomena for which it is intended. (author). 20 refs., 14 tabs., 32 figs
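
    The flavor of such a verification case, comparing a numerical solver against a known analytical solution, can be sketched as follows; this 1D transient-conduction example is an assumed stand-in for illustration, not one of the fourteen MOTIF cases:

```python
import numpy as np
from math import erfc

# Verification-case sketch: 1D transient conduction in a semi-infinite
# medium, explicit finite differences vs. the analytical solution
# T(x, t) = T0 * erfc(x / (2 * sqrt(alpha * t))).
alpha = 1.0e-6           # thermal diffusivity (m^2/s)
dx, dt = 0.001, 0.4      # grid spacing (m) and time step (s)
nx, nt = 200, 2500       # domain long enough to remain "semi-infinite"
r = alpha * dt / dx**2   # stability number, 0.4 < 0.5 so FTCS is stable

T = np.zeros(nx)
T[0] = 1.0               # unit temperature step applied at the boundary
for _ in range(nt):
    T[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])

t = nt * dt
x = np.arange(nx) * dx
analytic = np.array([erfc(xi / (2 * np.sqrt(alpha * t))) for xi in x])
err = np.max(np.abs(T - analytic))
print(f"max |numerical - analytical| = {err:.4f}")
```

    A verification case of this kind establishes that the discretization and coding reproduce a known solution before the code is trusted on problems with no analytical answer, which is exactly the role the fourteen MOTIF cases play.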

  4. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame
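
    The two dose comparisons described above (mean dose, and near-maximum dose D2) can be sketched on synthetic dose arrays; the dose values, array size, and the 1% systematic error below are assumptions for illustration:

```python
import numpy as np

# Near-maximum dose D2: the dose exceeded in only 2% of the volume.
def d2(dose):
    return np.percentile(dose, 98)

rng = np.random.default_rng(2)
planned = rng.uniform(1.8, 2.2, 10000)   # Gy, toy per-voxel target doses
reconstructed = planned * 1.01           # a 1% systematic overdose
print(f"mean diff: {reconstructed.mean() - planned.mean():.3f} Gy")
print(f"D2 diff:   {d2(reconstructed) - d2(planned):.3f} Gy")
```

    Comparing the mean dose in both volumes and D2 in the nontarget volume gives a per-frame check that is cheap enough to run inside the EPID acquisition interval, with the linac halted when a difference exceeds its tolerance.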

  5. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  6. Online 3D EPID-based dose verification: Proof of concept.

    Science.gov (United States)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. 
The complete processing of a single portal frame, including dose verification, took
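
The dose comparison described in these records (mean dose in the target and nontarget volumes, plus the near-maximum dose D2 in the nontarget volume receiving at least 10 cGy) can be sketched as follows. The 10 cGy threshold comes from the abstract; the array-based representation, the D2-as-98th-percentile convention, and the 5% tolerance are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def d2(dose, mask):
    """Near-maximum dose D2: dose received by the hottest 2% of voxels."""
    return np.percentile(dose[mask], 98)

def verify_dose(planned, reconstructed, target_mask, tolerance=0.05):
    """Compare planned and reconstructed 3D dose grids (in Gy).

    Checks the mean dose in the target volume, and the mean dose and D2
    in the nontarget volume receiving at least 10 cGy, as in the abstract.
    The 5% tolerance is an illustrative value. Returns True when every
    relative deviation stays within tolerance.
    """
    nontarget = (~target_mask) & (planned >= 0.10)  # >= 10 cGy = 0.10 Gy
    checks = [
        (planned[target_mask].mean(), reconstructed[target_mask].mean()),
        (planned[nontarget].mean(), reconstructed[nontarget].mean()),
        (d2(planned, nontarget), d2(reconstructed, nontarget)),
    ]
    return all(abs(r - p) / p <= tolerance for p, r in checks)
```

In an online setting this check would run once per acquired EPID frame, with the linac halted as soon as it returns False.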

  7. Raman laser spectrometer optical head: qualification model assembly and integration verification

    Science.gov (United States)

    Ramos, G.; Sanz-Palomino, M.; Moral, A. G.; Canora, C. P.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Santiago, A.; Gordillo, C.; Escribano, D.; Lopez-Reyes, G.; Rull, F.

    2017-08-01

    Raman Laser Spectrometer (RLS) is the Pasteur Payload instrument of the ExoMars mission, within the ESA's Aurora Exploration Programme, that will perform Raman spectroscopy for the first time on a planetary mission beyond Earth. RLS is composed of the SPU (Spectrometer Unit), iOH (Internal Optical Head), and ICEU (Instrument Control and Excitation Unit). The iOH focuses the excitation laser on the samples (excitation path) and collects the Raman emission from the sample (collection path, composed of a collimation system and a filtering system). Its original design let a high level of laser trace reach the detector, and although a certain level of laser trace was required for calibration purposes, this high level degraded the signal-to-noise ratio and confounded some Raman peaks. Therefore, after the breadboard campaign, some slight design modifications were implemented in order to fix the desired amount of laser trace, and after the fabrication and procurement of the commercial elements, the assembly and integration verification process was carried out. A brief description of the iOH design update for the engineering and qualification model (iOH EQM), as well as of the assembly process, is given in this paper. In addition, the results of the integration verification and of the first functional tests, carried out with the RLS calibration target (CT), are reported.

  8. Automated synthesis and verification of configurable DRAM blocks for ASIC's

    Science.gov (United States)

    Pakkurti, M.; Eldin, A. G.; Kwatra, S. C.; Jamali, M.

    1993-01-01

    A highly flexible embedded DRAM compiler is developed which can generate DRAM blocks in the range of 256 bits to 256 Kbits. The compiler is capable of automatically verifying the functionality of the generated DRAM modules. The fully automated verification capability is a key feature that ensures the reliability of the generated blocks. The compiler's architecture, algorithms, verification techniques and the implementation methodology are presented.

  9. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans...
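
The iterative data-flow analysis this record refers to can be illustrated at toy scale: the verifier tracks types, not values, on an abstract operand stack, and at control-flow merge points joins the incoming abstract stacks until a fixpoint is reached. The instruction set and type names below are invented for illustration and are not real JVM bytecode.

```python
# Toy bytecode verifier: abstract interpretation over types, not values.
# Instruction set and type names are illustrative, not real JVM bytecode.

def verify(code):
    """Return the abstract stack after executing `code`, or raise TypeError."""
    stack = []
    for op, *arg in code:
        if op == "push_int":
            stack.append("int")
        elif op == "push_ref":
            stack.append("ref")
        elif op == "iadd":                      # needs two ints on the stack
            if stack[-2:] != ["int", "int"]:
                raise TypeError("iadd on non-int operands")
            stack[-2:] = ["int"]
        elif op == "getfield":                  # needs an object reference
            if not stack or stack[-1] != "ref":
                raise TypeError("getfield on non-reference")
            stack[-1] = "int"
    return stack

def merge(s1, s2):
    """Join two abstract stacks at a merge point; differing slots become top."""
    assert len(s1) == len(s2), "stack height mismatch is a verification error"
    return [a if a == b else "conflict" for a, b in zip(s1, s2)]
```

A full verifier would repeatedly propagate `merge`d states along branch edges until nothing changes; the SSA-based approach of the paper replaces that iteration with a different program representation.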

  10. The role of the United Nations in the field of verification

    International Nuclear Information System (INIS)

    1991-01-01

    By resolution 43/81 B of 7 December 1988, the General Assembly requested the Secretary-General to undertake, with the assistance of a group of qualified governmental experts, an in-depth study of the role of the United Nations in the field of verification. In August 1990, the Secretary-General transmitted to the General Assembly the unanimously approved report of the experts. The report is structured in six chapters and contains a bibliographic appendix on technical aspects of verification. The Introduction provides a brief historical background on the development of the question of verification in the United Nations context, culminating with the adoption by the General Assembly of resolution 43/81 B, which requested the study. Chapters II and III address the definition and functions of verification and the various approaches, methods, procedures and techniques used in the process of verification. Chapters IV and V examine the existing activities of the United Nations in the field of verification, possibilities for improvements in those activities as well as possible additional activities, while addressing the organizational, technical, legal, operational and financial implications of each of the possibilities discussed. Chapter VI presents the conclusions and recommendations of the Group.

  11. Characterizing proton-activated materials to develop PET-mediated proton range verification markers

    Science.gov (United States)

    Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.

    2016-06-01

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils’ PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.
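
The linear model described in this record, with visibility score as the response and the other factors as covariates, can be sketched with an ordinary least-squares fit, then inverted for the foil volume giving an adequate score of 3. The design matrix, scores, and simplified covariate set (intercept, volume, dose, scan time) below are invented placeholders, not the study's data.

```python
import numpy as np

# Hypothetical observations; columns = [1, volume_mm3, dose_gy, scan_min].
# These numbers are illustrative placeholders, NOT the study's data.
X = np.array([[1, 10, 1, 20], [1, 30, 1, 20], [1, 50, 2, 30],
              [1, 20, 3, 30], [1, 40, 5, 40], [1, 50, 5, 40]], float)
scores = np.array([1.2, 2.0, 3.4, 2.5, 4.1, 4.6])

# Ordinary least squares: scores ~ b0 + b1*volume + b2*dose + b3*scan.
beta, *_ = np.linalg.lstsq(X, scores, rcond=None)

def volume_for_score(target, dose, scan, b=beta):
    """Invert the fitted plane for the foil volume giving `target` score."""
    return (target - b[0] - b[2] * dose - b[3] * scan) / b[1]
```

Solving `volume_for_score(3.0, dose, scan)` for each dose and scan-time combination mirrors how guideline foil volumes could be read off such a model.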

  12. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...

  13. Class 1E software verification and validation: Past, present, and future

    Energy Technology Data Exchange (ETDEWEB)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  14. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
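
The abstract does not reproduce the synchronization algorithm itself, but the flavor of Byzantine fault-tolerant averaging can be illustrated with an egocentric-mean step in the style of interactive-convergence algorithms: readings that differ from a processor's own clock by more than a window delta are replaced by its own value before averaging. This is a simplified sketch, not the algorithm verified in the paper.

```python
def egocentric_mean(own, readings, delta):
    """One round of interactive-convergence-style clock averaging (simplified).

    Readings farther than `delta` from this processor's own clock are
    treated as faulty and replaced by the processor's own value, which
    bounds the influence any single Byzantine clock can exert on the
    computed correction.
    """
    screened = [r if abs(r - own) <= delta else own for r in readings]
    return sum(screened) / len(screened)
```

With enough non-faulty clocks, each good processor's screened average stays close to the others', which is the convergence property such verifications establish.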

  15. Development of fast charge exchange recombination spectroscopy by using interference filter method in JT-60U

    International Nuclear Information System (INIS)

    Kobayashi, Shinji; Sakasai, Akira; Koide, Yoshihiko; Sakamoto, Yoshiteru; Kamada, Yutaka; Hatae, Takaki; Oyama, Naoyuki; Miura, Yukitoshi

    2003-01-01

    Recent developments and results of fast charge exchange recombination spectroscopy (CXRS) using the interference filter method are reported. In order to measure rapid changes of the ion temperature and rotation velocity during collapse or transition phenomena with high time resolution, two types of interference filter systems were applied to the CXRS diagnostics on the JT-60U Tokamak. One can determine the Doppler broadening and Doppler shift of the CXR emission using three interference filters having slightly different center wavelengths. A rapid estimation method for the temperature and rotation velocity without non-linear least-squares fitting is presented. The modification of the three-filter system enables us to improve the minimum time resolution to 0.8 ms, which is better than the 16.7 ms of the conventional CXRS system using the CCD detector in JT-60U. The other system, having seven wavelength channels, is newly fabricated to crosscheck the results obtained by the three-filter assembly, that is, to verify that the CXR emission forms a Gaussian profile under collapse phenomena. In an H-mode discharge having giant edge localized modes, the results obtained by the two systems are compared. The applicability of the three-filter system to the measurement of rapid changes in temperature and rotation velocity is demonstrated. (author)
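
Why three filters suffice without non-linear fitting: for a Gaussian emission line, the logarithm of the intensity is quadratic in wavelength, so three narrow-band samples fix the parabola analytically, and its vertex and curvature give the Doppler shift and broadening in closed form. The sketch below is an idealization (delta-like filter passbands, noise-free signals), not the diagnostic's actual signal processing.

```python
import math

def gaussian_line_from_three_filters(lams, intensities):
    """Recover the center (Doppler shift) and width (Doppler broadening)
    of a Gaussian line from three narrow-band filter signals.

    ln I(lam) is quadratic in lam for a Gaussian, so three samples
    determine the parabola a*lam**2 + b*lam + c analytically, with
    center = -b/(2a) and sigma = sqrt(-1/(2a)). Idealized: assumes
    delta-like passbands and noise-free intensities.
    """
    (l1, l2, l3), (y1, y2, y3) = lams, [math.log(i) for i in intensities]
    # Divided differences give the quadratic coefficients a and b directly.
    a = ((y3 - y1) / (l3 - l1) - (y2 - y1) / (l2 - l1)) / (l3 - l2)
    b = (y2 - y1) / (l2 - l1) - a * (l1 + l2)
    center = -b / (2 * a)            # line center -> rotation velocity
    sigma = math.sqrt(-1 / (2 * a))  # line width  -> ion temperature
    return center, sigma
```

Because the estimate is closed-form, it can keep up with sub-millisecond sampling, which is the point of the three-filter approach; the seven-channel system exists to check the Gaussian assumption this formula relies on.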

  16. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods differ both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are emphasized, and a classification of testing techniques for each method is considered. We also present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the numbers of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Methods of dynamic analysis, such as testing, monitoring and profiling, are presented and analyzed, and some kinds of tools that can be applied to software when using dynamic analysis methods are considered.
    Based on this work a conclusion is drawn describing the most relevant problems of these analysis techniques, methods for their solution and
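
The symbolic execution method discussed in this abstract can be illustrated at toy scale: variables hold symbolic expressions rather than concrete values, and each branch forks the analysis, recording the branch condition as a path constraint. The mini language and string-based constraints below are invented for illustration; a real engine would discharge the constraints with an SMT solver.

```python
# Toy symbolic executor over a mini language of nested tuples:
#   ("if", cond, then_prog, else_prog) | ("ret", expr)
# Conditions and expressions are plain strings over the symbolic input.

def execute(prog, path_cond=()):
    """Yield (path_constraints, symbolic_result) for every program path."""
    if prog[0] == "ret":
        yield list(path_cond), prog[1]
    else:  # ("if", cond, then_prog, else_prog)
        _, cond, then_p, else_p = prog
        yield from execute(then_p, path_cond + (cond,))
        yield from execute(else_p, path_cond + (f"not ({cond})",))

# Example program with two nested branches, hence three paths.
program = ("if", "x > 10",
           ("ret", "x - 10"),
           ("if", "x < 0", ("ret", "-x"), ("ret", "10 - x")))
```

Enumerating `execute(program)` yields one (path-condition, result) pair per path; a verifier would then check each path condition for feasibility and each result against the specification.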

  17. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    International Nuclear Information System (INIS)

    Luke, S.J.

    2011-01-01

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the

  18. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    Energy Technology Data Exchange (ETDEWEB)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the

  19. Analog filters in nanometer CMOS

    CERN Document Server

    Uhrmann, Heimo; Zimmermann, Horst

    2014-01-01

    Starting from the basics of analog filters and the poor transistor characteristics in nanometer CMOS, 10 high-performance analog filters developed by the authors in 120 nm and 65 nm CMOS are described extensively. Among them are gm-C filters, current-mode filters, and active filters for system-on-chip realization for Bluetooth, WCDMA, UWB, DVB-H, and LTE applications. For the active filters several operational amplifier designs are described. The book furthermore contains a review of the newest state of research on low-voltage low-power analog filters. To cover the topic of the book comprehensively, linearization issues and measurement methods for the characterization of advanced analog filters are introduced in addition. Numerous elaborate illustrations promote easy comprehension. This book will be of value to engineers and researchers in industry as well as scientists and Ph.D. students at universities. The book is also recommended for graduate students specializing in nanoelectronics, microelectronics ...

  20. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data ... standardized railway control systems ERTMS/ETCS Level 2. Experiments showed that the method can be used for specification, verification and validation of systems of industrial size.

  1. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses. Addendum

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  2. Graph-based specification and verification for aspect-oriented languages

    NARCIS (Netherlands)

    Staijen, T.

    2010-01-01

    Aspect-oriented software development aims at improving separation of concerns at all levels in the software development life-cycle, from architecture to code implementation. In this thesis we strive to develop verification methods specifically for aspect-oriented programming languages. For this

  3. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  4. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  5. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  6. Comparison of cryogenic low-pass filters

    Science.gov (United States)

    Thalmann, M.; Pernau, H.-F.; Strunk, C.; Scheer, E.; Pietsch, T.

    2017-11-01

    Low-temperature electronic transport measurements with high energy resolution require both effective low-pass filtering of high-frequency input noise and an optimized thermalization of the electronic system of the experiment. In recent years, elaborate filter designs have been developed for cryogenic low-level measurements, driven by the growing interest in fundamental quantum-physical phenomena at energy scales corresponding to temperatures in the few millikelvin regime. However, a single filter concept is often insufficient to thermalize the electronic system to the cryogenic bath and eliminate spurious high frequency noise. Moreover, the available concepts often provide inadequate filtering to operate at temperatures below 10 mK, which are routinely available now in dilution cryogenic systems. Herein we provide a comprehensive analysis of commonly used filter types, introduce a novel compact filter type based on ferrite compounds optimized for the frequency range above 20 GHz, and develop an improved filtering scheme providing adaptable broad-band low-pass characteristics for cryogenic low-level and quantum measurement applications at temperatures down to a few millikelvin.

  7. Comparison of cryogenic low-pass filters.

    Science.gov (United States)

    Thalmann, M; Pernau, H-F; Strunk, C; Scheer, E; Pietsch, T

    2017-11-01

    Low-temperature electronic transport measurements with high energy resolution require both effective low-pass filtering of high-frequency input noise and an optimized thermalization of the electronic system of the experiment. In recent years, elaborate filter designs have been developed for cryogenic low-level measurements, driven by the growing interest in fundamental quantum-physical phenomena at energy scales corresponding to temperatures in the few millikelvin regime. However, a single filter concept is often insufficient to thermalize the electronic system to the cryogenic bath and eliminate spurious high frequency noise. Moreover, the available concepts often provide inadequate filtering to operate at temperatures below 10 mK, which are routinely available now in dilution cryogenic systems. Herein we provide a comprehensive analysis of commonly used filter types, introduce a novel compact filter type based on ferrite compounds optimized for the frequency range above 20 GHz, and develop an improved filtering scheme providing adaptable broad-band low-pass characteristics for cryogenic low-level and quantum measurement applications at temperatures down to a few millikelvin.

  8. Program Verification with Monadic Second-Order Logic & Languages for Web Service Development

    DEFF Research Database (Denmark)

    Møller, Anders

    applications, this implementation forms the basis of a verification technique for imperative programs that perform data-type operations using pointers. To achieve this, the basic logic is extended with layers of language abstractions. Also, a language for expressing data structures and operations along...

  9. Development of an adaptive bilateral filter for evaluating color image difference

    Science.gov (United States)

    Wang, Zhaohui; Hardeberg, Jon Yngve

    2012-04-01

    Spatial filtering, which aims to mimic the contrast sensitivity function (CSF) of the human visual system (HVS), has previously been combined with color difference formulae for measuring color image reproduction errors. These spatial filters attenuate imperceptible information in images, unfortunately including high-frequency edges, which are believed to be crucial in the process of scene analysis by the HVS. The adaptive bilateral filter represents a novel approach that avoids the undesirable loss of edge information introduced by CSF-based filtering. The bilateral filter employs two Gaussian smoothing filters in different domains, i.e., the spatial domain and the intensity domain. We propose a method for determining its parameters, which are designed to adapt to the corresponding viewing conditions and to the quantity and homogeneity of information contained in an image. A series of perceptual experiments was conducted to evaluate the performance of our approach. The experimental sample images were reproduced with variations in six image attributes: lightness, chroma, hue, compression, noise, and sharpness/blurriness. Pearson's correlation values between the model-predicted image difference and the observed difference were employed to evaluate the performance and to compare it with that of spatial CIELAB and an image appearance model.
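
    The two-Gaussian structure described above can be sketched in a few lines (a minimal, unoptimized 2-D sketch, assuming a grayscale image held in a NumPy array; the parameter values are illustrative and not those proposed in the paper):

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=1.5, sigma_r=0.1):
    """Bilateral filter: the weight of each neighbor is the product of a
    spatial Gaussian (distance in the image plane) and a range Gaussian
    (difference in intensity), normalized per pixel."""
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    # The spatial kernel is fixed and can be precomputed once.
    ax = np.arange(-radius, radius + 1)
    dx, dy = np.meshgrid(ax, ax)
    spatial = np.exp(-(dx**2 + dy**2) / (2 * sigma_s**2))
    for i in range(h):
        for j in range(w):
            window = pad[i:i + 2*radius + 1, j:j + 2*radius + 1]
            # Range kernel: neighbors with very different intensity
            # get near-zero weight, which preserves edges.
            rng = np.exp(-(window - img[i, j])**2 / (2 * sigma_r**2))
            weights = spatial * rng
            out[i, j] = np.sum(weights * window) / np.sum(weights)
    return out
```

    Because the range kernel collapses across a strong edge, pixels on the far side contribute almost nothing, so the edge survives while low-amplitude noise within each region is smoothed.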

  10. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
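
    The Weibull transfer from elementary test data to a full-scale structure mentioned above can be illustrated numerically (a sketch of the textbook weakest-link form only, not the guideline's specific procedure; the Weibull modulus, characteristic strength and volume ratio are assumed, illustrative values):

```python
import math

def weibull_failure_prob(sigma, sigma0, m, v_ratio=1.0):
    """Weakest-link Weibull model: P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m).
    v_ratio = V/V0 scales coupon-level statistics to a larger component."""
    return 1.0 - math.exp(-v_ratio * (sigma / sigma0) ** m)

# Assumed values (not from the guideline): modulus m = 10 and a
# characteristic strength of 300 MPa at the reference coupon volume.
m, sigma0 = 10.0, 300.0
p_coupon = weibull_failure_prob(150.0, sigma0, m)                 # coupon at 150 MPa
p_full = weibull_failure_prob(150.0, sigma0, m, v_ratio=100.0)    # 100x the volume
```

    The larger component is more likely to contain a critical flaw, so at the same stress its failure probability is higher than the coupon's; this is the size effect that makes the statistical transfer step necessary for brittle materials.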

  11. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  12. Expert system verification and validation for nuclear power industry applications

    International Nuclear Information System (INIS)

    Naser, J.A.

    1990-01-01

    The potential for the use of expert systems in the nuclear power industry is widely recognized. The benefits of such systems include consistency of reasoning during off-normal situations when humans are under great stress, the reduction of times required to perform certain functions, the prevention of equipment failures through predictive diagnostics, and the retention of human expertise in performing specialized functions. The increased use of expert systems brings with it concerns about their reliability. Difficulties arising from software problems can affect plant safety, reliability, and availability. A joint project between EPRI and the US Nuclear Regulatory Commission is being initiated to develop a methodology for verification and validation of expert systems for nuclear power applications. This methodology will be tested on existing and developing expert systems. This effort will explore the applicability of conventional verification and validation methodologies to expert systems. The major area of concern will be certification of the knowledge base. This is expected to require new types of verification and validation techniques. A methodology for developing validation scenarios will also be studied

  13. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; tritium is a crucial safety issue in fission reactor systems, and a tool for analyzing its behavior is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS. BOTANIC was then verified using analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-dilution assumption, flexible application and the adoption of a distributed permeation model. Owing to these features, BOTANIC can analyze a wide range of tritium-level systems and achieves higher accuracy, as it is able to solve distributed models. The verification results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will be focused on total system verification

  14. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  15. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  16. Sensory Pollution from Bag Filters, Carbon Filters and Combinations

    DEFF Research Database (Denmark)

    Bekö, Gabriel; Clausen, Geo; Weschler, Charles J.

    2008-01-01

    by an upstream pre-filter (changed monthly), an EU7 filter protected by an upstream activated carbon (AC) filter, and EU7 filters with an AC filter either downstream or both upstream and downstream. In addition, two types of stand-alone combination filters were evaluated: a bag-type fiberglass filter...... that contained AC and a synthetic fiber cartridge filter that contained AC. Air that had passed through used filters was most acceptable for those sets in which an AC filter was used downstream of the particle filter. Comparable air quality was achieved with the stand-alone bag filter that contained AC...

  17. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    International Nuclear Information System (INIS)

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to 'hand' comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).

  18. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Crowell, Michael W [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).

  19. The high efficiency steel filters for nuclear air cleaning

    International Nuclear Information System (INIS)

    Bergman, W.; Larsen, G.; Lopez, R.; Williams, K.; Violet, C.

    1990-08-01

    We have, in cooperation with industry, developed high-efficiency filters made from sintered stainless-steel fibers for use in several air-cleaning applications in the nuclear industry. These filters were developed to overcome the failure modes in present high-efficiency particulate air (HEPA) filters. HEPA filters are made from glass paper and glue, and they may fail when they get hot or wet and when they are overpressured. In developing our steel filters, we first evaluated the commercially available stainless-steel filter media made from sintered powder and sintered fiber. The sintered-fiber media performed much better than sintered-powder media, and the best media had the smallest fiber diameter. Using the best media, we then built prototype filters for venting compressed gases and evaluated them in our automated filter tester. 12 refs., 20 figs

  20. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  1. FEFTRA {sup TM} verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)

    2013-12-15

    FEFTRA is a finite element program package developed at VTT for the analyses of groundwater flow in Posiva's site evaluation programme that seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of its verification. In 2006 a project was launched with the objective of reorganising all the material related to the existing verification cases and placing it into the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LaTeX document preparation system and added into the testing system in such a way that the whole test documentation (this report) can easily be generated in PostScript or PDF format. The current report is the updated version of the verification report published in 2007. At the moment the report mainly includes the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. the simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to analytical, semianalytical and/or other numerical solutions demonstrates the capability of FEFTRA to simulate such problems
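
    The kind of automated regression testing described above (recalculate each case, compare against approved results, summarize) can be sketched generically (a minimal sketch in Python, not the actual FEFTRA testing system; the case names, result quantities, tolerance and solver stand-in are illustrative):

```python
def run_regression(cases, compute, reference, rel_tol=1e-6):
    """Recalculate each verification case, compare the new result
    quantities against the old approved values, and build a summary."""
    summary = []
    for name in cases:
        new = compute(name)
        old = reference[name]
        # A case passes if every result quantity matches within tolerance.
        ok = all(abs(new[q] - old[q]) <= rel_tol * max(abs(old[q]), 1.0)
                 for q in old)
        summary.append((name, "PASS" if ok else "FAIL"))
    return summary

# Illustrative data: two cases with approved hydraulic-head results.
approved = {"steady_flow": {"head": 12.5}, "transient_flow": {"head": 9.81}}
def fake_solver(name):                       # stand-in for the real solver
    return {"steady_flow": {"head": 12.5},
            "transient_flow": {"head": 9.80}}[name]
report = run_regression(approved, fake_solver, approved)
```

    Keeping the approved results under version control alongside the cases, as the report describes, is what lets such a harness flag any change in the primary result quantities automatically.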

  2. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  3. Simulation for noise cancellation using LMS adaptive filter

    Science.gov (United States)

    Lee, Jia-Haw; Ooi, Lu-Ean; Ko, Ying-Hao; Teoh, Choe-Yung

    2017-06-01

    In this paper, the fundamental algorithm of noise cancellation, the Least Mean Square (LMS) algorithm, is studied and enhanced with an adaptive filter. A simulation of noise cancellation using the LMS adaptive filter algorithm is developed. The noise-corrupted speech signal and the engine noise signal are used as inputs to the LMS adaptive filter algorithm. The filtered signal is compared to the original noise-free speech signal in order to highlight the level of attenuation of the noise signal. The results show that the noise signal is successfully canceled by the developed adaptive filter. The difference between the noise-free speech signal and the filtered signal is calculated, and the outcome shows that the filtered signal approaches the noise-free speech signal as the adaptive filtering proceeds. The frequency range of the noise successfully canceled by the LMS adaptive filter algorithm is determined by performing a Fast Fourier Transform (FFT) on the signals. The LMS adaptive filter algorithm shows significant noise cancellation in the lower frequency range.
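
    The LMS update at the heart of such a noise canceller can be sketched in a few lines (a minimal single-channel sketch, assuming a reference input correlated with the noise corrupting the primary input; the step size, filter length and synthetic signals are illustrative, not those of the paper):

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    """Adaptive noise cancellation with the LMS algorithm.
    primary = speech + filtered noise, reference = correlated noise pickup.
    Returns the error signal, which converges toward the clean speech."""
    w = np.zeros(n_taps)                           # adaptive filter weights
    out = np.zeros(len(primary))
    for n in range(n_taps - 1, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]  # latest reference samples
        y = w @ x                                  # filter output = noise estimate
        out[n] = primary[n] - y                    # error = cleaned signal
        w += 2 * mu * out[n] * x                   # LMS weight update
    return out

# Illustrative signals: a sinusoidal "speech" buried in filtered white noise.
rng = np.random.default_rng(0)
t = np.arange(4000)
speech = np.sin(2 * np.pi * t / 50.0)
noise = rng.standard_normal(4000)
primary = speech + np.convolve(noise, [0.8, -0.4], mode="same")
cleaned = lms_cancel(primary, noise)
```

    Because the speech is uncorrelated with the reference noise, minimizing the error power drives the filter to model only the noise path, leaving the speech in the error signal.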

  4. Noise reduction with complex bilateral filter.

    Science.gov (United States)

    Matsumoto, Mitsuharu

    2017-12-01

    This study introduces a noise reduction technique that uses a complex bilateral filter. A bilateral filter is a nonlinear filter originally developed for images that can reduce noise while preserving edge information. It is an attractive filter and has been used in many applications in image processing. When it is applied to an acoustical signal, small-amplitude noise is reduced while the speech signal is preserved. However, a bilateral filter cannot handle noise with relatively large amplitudes owing to its innate characteristics. In this study, the noisy signal is transformed into the time-frequency domain and the filter is improved to handle complex spectra. The high-amplitude noise is reduced in the time-frequency domain via the proposed filter. The features and the potential of the proposed filter are also confirmed through experiments.
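
    The idea of applying bilateral weights to complex spectra can be sketched in one dimension (a sketch of the general idea only, not the paper's exact formulation: the range kernel here is assumed to compare magnitudes while the averaging acts on the complex values, so phase is retained; all parameters are illustrative):

```python
import numpy as np

def complex_bilateral_1d(spec, radius=3, sigma_s=2.0, sigma_r=0.5):
    """Bilateral smoothing of a 1-D complex spectrum: weights come from
    frequency-bin distance and magnitude difference, but are applied to
    the complex coefficients themselves."""
    n = len(spec)
    mag = np.abs(spec)
    out = np.zeros(n, dtype=complex)
    offsets = np.arange(-radius, radius + 1)
    spatial = np.exp(-offsets**2 / (2 * sigma_s**2))
    for k in range(n):
        idx = np.clip(k + offsets, 0, n - 1)       # clamp at the spectrum edges
        # Range kernel: bins with very different magnitude barely contribute,
        # so strong spectral peaks are preserved while low-level noise is smoothed.
        rng = np.exp(-(mag[idx] - mag[k])**2 / (2 * sigma_r**2))
        w = spatial * rng
        out[k] = np.sum(w * spec[idx]) / np.sum(w)
    return out
```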

  5. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography

  6. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1994-01-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography

  7. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a user's fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  8. SoC Design Approach Using Convertibility Verification

    Directory of Open Access Journals (Sweden)

    Basu Samik

    2008-01-01

    Compositional design of systems on chip from preverified components helps to achieve shorter design cycles and time to market. However, the design process is affected by the issue of protocol mismatches, where two components fail to communicate with each other due to protocol differences. Convertibility verification, which involves the automatic generation of a converter to facilitate communication between two mismatched components, is a collection of techniques to address protocol mismatches. We present an approach to convertibility verification using module checking. We use Kripke structures to represent protocols and temporal logic to describe the desired system behavior. A tableau-based converter generation algorithm is presented and shown to be sound and complete. We have developed a prototype implementation of the proposed algorithm and have used it to show that it can handle many classical protocol mismatch problems along with SoC problems. The initial idea for this approach to convertibility verification was presented at SLA++P '07, as in the work by Roopak Sinha et al., 2008.

  9. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and their changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on the various corrections and assumptions between the ex-core detectors and the core required by traditional physics testing programs. This method also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  10. Formal verification of reactor process control software using assertion checking environment

    International Nuclear Information System (INIS)

    Sharma, Babita; Balaji, Sowmya; John, Ajith K.; Bhattacharjee, A.K.; Dhodapkar, S.D.

    2005-01-01

    The Assertion Checking Environment (ACE) was developed in-house for carrying out formal (rigorous/mathematical) functional verification of embedded software written in MISRA C. MISRA C is an industrially sponsored safe subset of the C programming language and is well accepted in the automotive and aerospace industries. ACE uses a static assertion checking technique for the verification of MISRA C programs. First, the functional specifications of the program are derived in the form of pre- and post-conditions for each C function. These pre- and post-conditions are then introduced as assertions (formal comments) in the program code. The annotated C code is then formally verified using ACE. In this paper we present our experience of using ACE for the formal verification of the process control software of a nuclear reactor. The Software Requirements Document (SRD) contained textual specifications of the process control software. The SRD was used by the designers to draw logic diagrams, which were given as input to a code generator. The verification of the generated C code was done at two levels, viz. (i) verification against specifications derived from the logic diagrams, and (ii) verification against specifications derived from the SRD. In this work we checked approximately 600 functional specifications of the software, which has roughly 15000 lines of code. (author)
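
    The pre-/post-condition style of annotation that such a tool checks can be illustrated with a small sketch (shown here in Python with run-time asserts rather than in MISRA C with static checking; the clamp function and its contract are hypothetical, not taken from the reactor software):

```python
def clamp(value, lo, hi):
    """Clamp value into [lo, hi].
    Pre-condition:  lo <= hi
    Post-conditions: lo <= result <= hi, and result == value whenever
    value already lies inside the interval."""
    assert lo <= hi, "pre-condition violated: lo <= hi"
    result = min(max(value, lo), hi)
    assert lo <= result <= hi, "post-condition violated: result out of range"
    assert not (lo <= value <= hi) or result == value, \
        "post-condition violated: in-range value must pass through unchanged"
    return result
```

    A static assertion checker proves such conditions hold for every possible input; the run-time asserts above only check each individual call, which is the key difference between testing and the formal verification the paper describes.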

  11. Verification of classified fissile material using unclassified attributes

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-01-01

    This paper reports on the most recent efforts of US technical experts to explore verification by IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. They believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated

  12. Development of regeneration technique for diesel particulate filter made of porous metal; Kinzoku takotai DPF no saisei gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Yoro, K; Ban, S; Ooka, T; Saito, H; Oji, M; Nakajima, S; Okamoto, S [Sumitomo Electric Industries, Ltd., Osaka (Japan)

    1997-10-01

    We have developed a diesel particulate filter (DPF) in which porous metal is used for the filter because of its high thermal conductivity, and a radiation heater is used as the regeneration device because of its uniform thermal distribution. When high trapping efficiency is required, the filter must be thick; a thicker filter, however, is more difficult to regenerate because of the thermal distribution across its thickness. In order to improve regeneration efficiency, we used computer simulation to design the best filter-heater construction, one that achieves a uniform thermal distribution, and we confirmed good regeneration efficiency experimentally. 4 refs., 14 figs., 1 tab.

  13. Development of biomass in a drinking water granular active carbon (GAC) filter.

    Science.gov (United States)

    Velten, Silvana; Boller, Markus; Köster, Oliver; Helbing, Jakob; Weilenmann, Hans-Ulrich; Hammes, Frederik

    2011-12-01

    Indigenous bacteria are essential for the performance of drinking water biofilters, yet this biological component remains poorly characterized. In the present study we followed biofilm formation and development in a granular activated carbon (GAC) filter at pilot scale during the first six months of operation. GAC particles were sampled from four different depths (10, 45, 80 and 115 cm) and attached biomass was measured with adenosine tri-phosphate (ATP) analysis. The attached biomass accumulated rapidly on the GAC particles throughout all levels in the filter during the first 90 days of operation and maintained a steady state afterward. Vertical gradients of biomass density and growth rates were observed during start-up and also in steady state. During steady state, biomass concentrations ranged between 0.8 and 1.83 × 10^-6 g ATP/g GAC in the filter, and 22% of the influent dissolved organic carbon (DOC) was removed. Concomitant biomass production was about 1.8 × 10^12 cells/m^2·h, which represents a yield of 1.26 × 10^6 cells/μg. The bacteria assimilated only about 3% of the removed carbon as biomass. At one point during the operational period, a natural 5-fold increase in the influent phytoplankton concentration occurred. As a result, influent assimilable organic carbon concentrations increased, and suspended bacteria in the filter effluent increased 3-fold as the direct consequence of increased growth in the biofilter. This study shows that the combination of different analytical methods allows detailed quantification of the microbiological activity in drinking water biofilters. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    Kwon, K. C.; Park, G. Y.

    2006-01-01

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are preparation of the software planning documentation, verification of the software according to the software life cycle, software safety analysis, and software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of nuclear safety-critical software in a DRPS. (authors)

  15. VBMC: a formal verification tool for VHDL program

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-08-01

    The design of Control and Instrumentation (C and I) systems used in safety-critical applications such as nuclear power plants involves partitioning the overall system functionality into sub-parts and implementing each sub-part in hardware and/or software as appropriate. With the increasing use of programmable devices like FPGAs, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since functional bugs in such hardware subsystems used in safety-critical C and I systems have serious consequences, it is important to use rigorous reasoning to verify the functionality of the HDL models. This report describes the design of a software tool named VBMC (VHDL Bounded Model Checker), which proves or refutes functional properties of hardware designs described in VHDL. VBMC accepts a design as a VHDL program file, a functional property in PSL, and a verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample showing the reason for the violation. In the case of satisfaction, the proof holds only up to the verification bound. VBMC has been used for the functional verification of FPGA-based intelligent I/O boards developed at the Reactor Control Division, BARC. (author)
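    As a rough illustration of the bounded model checking idea VBMC implements (unroll the design for a fixed number of cycles, check the property at each step, and report a counterexample on failure), here is a toy Python sketch. The transition system and property are invented for illustration; the actual tool of course operates on VHDL designs and PSL properties.

```python
# Toy bounded model checking: unroll a transition system for `bound`
# cycles, check a safety property at every step, and return a
# counterexample trace if the property is violated within the bound.

def bmc(initial, transition, prop, bound):
    """Return None if `prop` holds for all states reachable within
    `bound` steps, else the trace ending in the violating state."""
    state, trace = initial, [initial]
    for _ in range(bound):
        if not prop(state):
            return trace              # property refuted within the bound
        state = transition(state)
        trace.append(state)
    return None if prop(state) else trace

# Hypothetical design: a 3-bit counter that wraps at 8.
step = lambda s: (s + 1) % 8
safe = lambda s: s != 6               # claim: the counter never reaches 6

print(bmc(0, step, safe, 4))          # holds up to 4 cycles -> None
print(bmc(0, step, safe, 10))         # refuted -> [0, 1, 2, 3, 4, 5, 6]
```

    Note that, as in VBMC, a "pass" here only guarantees the property up to the chosen bound, which is why the bound is a user-supplied input.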

  16. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1999-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Net (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. To bridge the gap between modeling with CPN and mathematical proof with PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  17. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Net (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. To bridge the gap between modeling with CPN and mathematical proof with PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  18. Spectral filtering for plant production

    Energy Technology Data Exchange (ETDEWEB)

    Young, R.E.; McMahon, M.J.; Rajapakse, N.C.; Becoteau, D.R.

    1994-12-31

    Research to date suggests that spectral filtering can be an effective alternative to chemical growth regulators for altering plant development; if properly implemented, it is nonchemical and environmentally friendly. Aqueous CuSO{sub 4} and CuCl{sub 2} solutions in channelled plastic panels have been shown to be effective filters, but they can be highly toxic if the solutions contact plants. Some studies suggest that spectral filtration limited to short end-of-day (EOD) intervals can also alter plant development. Future research should be directed toward confirming the influence of spectral filters and exposure times on a broader range of plant species and cultivars. Efforts should also be made to identify non-noxious alternatives to aqueous copper solutions and/or to incorporate these chemicals permanently into plastic films and panels that can be used in greenhouse construction. It would also be informative to study the impacts of spectral filters on insect and microbial populations in plant growth facilities. The economic impacts of spectral filtering techniques should be assessed for each delivery methodology.

  19. Dedusting and filtering technology; Entstaubungs- und Filtertechnik

    Energy Technology Data Exchange (ETDEWEB)

    Selck, S.; Stockmann, H.W.; Both, R. [Deutsche Montan Technologie GmbH, Essen (Germany). Gas and Fire Div.

    2004-07-01

    In the further development of filtration and dedusting technology during the last research period, the new occupational hygiene regulations concerning dust as well as ISO and EN standards have been taken into account, along with the new fire and explosion protection requirements for filter materials based on the test regulations for synthetic materials. The adoption of these new regulations prevents the continued use of the previously available high-efficiency filter materials in underground coal mines. The development of new filter materials was driven by the test regulations for synthetic materials, which address electrostatic behaviour and the soot and toxic gases formed when filter materials burn, both of which affect CO self-rescue filters. Although these requirements partially limit achievable filter efficiencies and air flows, all of them have been fulfilled at a high level of filter efficiency matching the present state of the art in occupational hygiene, as reported in the Silicosis Reports Vols. 20 and 21. (orig.)

  20. Development of an automated testing system for verification and validation of nuclear data

    International Nuclear Information System (INIS)

    Triplett, B. S.; Anghaie, S.; White, M. C.

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems, and performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory (LANL), in collaboration with the University of Florida, is developing a methodology to automate the process of nuclear data verification and validation. The International Criticality Safety Benchmark Experiment Project (ICSBEP) provides a set of criticality problems that may be used to evaluate nuclear data. This process tests a number of data libraries using cases from the ICSBEP benchmark set to demonstrate how automation of these tasks may reduce errors and increase efficiency. The process is driven by an integrated set of Python scripts. Material and geometry data may be read from an existing code input file to generate a standardized template, or the template may be generated directly by the user. The user specifies the desired precision and other vital problem parameters. The Python scripts generate input decks for multiple transport codes from these templates, run and monitor individual jobs, and parse the relevant output. This output can then be used to generate reports directly or be stored in a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required to obtain and compile calculation results. (authors)
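    The pipeline described (template → input decks → run jobs → parse output) might look roughly like the following Python sketch. The deck format, the `keff` output line, and all names here are hypothetical stand-ins for illustration, not the actual LANL scripts.

```python
# Hedged sketch of a template-driven verification pipeline: render a
# standardized input-deck template, then parse a quantity of interest
# out of the (fake) transport-code output.
from string import Template

DECK_TEMPLATE = Template(
    "title $case\nmaterial $material\nprecision $precision\n"
)

def generate_deck(case, material, precision):
    """Render one input deck from the standardized template."""
    return DECK_TEMPLATE.substitute(case=case, material=material,
                                    precision=precision)

def parse_output(text):
    """Pull the quantity of interest (here: k-eff) out of raw output."""
    for line in text.splitlines():
        if line.startswith("keff ="):
            return float(line.split("=")[1])
    raise ValueError("no k-eff found in output")

deck = generate_deck("ICSBEP-HEU-MET-FAST-001", "HEU", 1e-4)
assert "material HEU" in deck
print(parse_output("cycles done\nkeff = 0.99823\n"))  # -> 0.99823
```

    In the real system the parsed values would be written to a database or report; the benefit of the template approach is that one set of problem parameters can drive input decks for several transport codes at once.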

  1. Avoiding the Use of Exhausted Drinking Water Filters: A Filter-Clock Based on Rusting Iron

    Directory of Open Access Journals (Sweden)

    Arnaud Igor Ndé-Tchoupé

    2018-05-01

    Full Text Available Efficient but affordable water treatment technologies are currently sought to solve the prevalent shortage of safe drinking water. Adsorption-based technologies are at the front line of these efforts. Upon proper design, universally applied materials (e.g., activated carbons, bone chars, metal oxides) are able to quantitatively remove inorganic and organic pollutants as well as pathogens from water. Each water filter has a defined removal capacity and must be replaced when this capacity is exhausted. Operational experience has shown that it may be difficult to convince some low-skilled users to buy new filters after the predicted service life. This communication describes the quest to develop a filter-clock to encourage all users to change their filters after the designed service life. A brief discussion of such a filter-clock, based on the rusting of metallic iron (Fe0), is presented. Integrating such filter-clocks into the design of water filters is regarded as essential for safeguarding public health.

  2. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and it has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, give the group, partners and other Fluidity users an easy-to-use platform to collaborate, and allow the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions.

  3. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provides assistance to the subject...

  4. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. 
A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different from traditional experiments.

  5. Development of particle filters for ships; Udvikling af partikelfiltre til skibe

    Energy Technology Data Exchange (ETDEWEB)

    Jakobsen, O.; Norre Holm, J.; Koecks, M. [Teknologisk Institut, Aarhus (Denmark)

    2013-04-01

    The project has resulted in a well-functioning maritime particle filter which reduces particle emissions significantly. The visible smoke from the vessel's funnel, typically seen while manoeuvring in the harbour, is also reduced to a minimum. The system is constructed so that the exhaust gases can be bypassed around the filter unit, to ensure engine operation in case of filter clogging. The system has been provided with safety functions to prevent excessive exhaust gas back-pressure, and remote-controlled exhaust valves are fitted. Among the challenges in the project were the engine manufacturer's requirement to keep a low turbocharger back-pressure, the space conditions aboard the test vessel, and the achievement of sufficient temperatures for regeneration of the particle filter. To meet the requirement of low exhaust gas back-pressure, the filter housing was designed with space for twice as many monoliths as originally planned. In the funnel casing the original installations were removed to make space for the filter housing, and the system was extended with electrically controlled exhaust valves to ease the crew's daily operation. The regeneration issue was solved by mounting automatically controlled electric heating elements in the filter housing and by an ash extraction system. Regeneration is carried out by the crew in the evening, when the vessel lies in harbour after the last tour of the day. Before mounting the particle filter, measurements carried out aboard showed an expectedly high NO{sub x} level of 8.33 g/kWh, whereas the other emissions were lower than expected at first; HC and CO in particular were very low, and the particle mass (PM) also had a relatively low value of 0.22 g/kWh. After commissioning the particle filter, a significant reduction of 93% in the particle number (N) was observed. A reduction in N was

  6. 24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.

    Science.gov (United States)

    2010-04-01

    ... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...

  7. Why nuclear power generation must be developed? A many-faceted verification of its irreplaceable role

    International Nuclear Information System (INIS)

    Kawai, Yuichi; Oda, Toshiyuki

    1998-01-01

    Given the poor public acceptance right now, the future of nuclear power development is not necessarily bright. Yet, from the energy security aspect, the role of nuclear power, already responsible for about 30% of Japan's generated output, is far from negligible. Moreover, Japan could hardly meet the GHG reduction target under the Kyoto Protocol without carbon-free nuclear power generation. While Japan is required to deal with both energy security and global warming from now on, satisfying the two concurrently without nuclear power development is nearly impossible in practical terms. We have to consider calmly how nuclear power generation should be understood and treated in our effort to ensure energy supply and mitigate global warming. In this study, the need for nuclear power development was verified anew by reevaluating nuclear power generation from many facets: energy (electricity) supply and demand, environmental measures, energy security, and cost. Verification results showed: on supply and demand, the absence of nuclear power causes an electricity shortage during peak hours; on the environment, no GHG-free power source but nuclear currently has a sufficient supply capacity; on energy security, nuclear fuel procurement sources are diverse and located in relatively stable areas; on cost, a strong yen and cheap oil favor fossil fuels, while a weak yen and expensive oil favor nuclear power, though both depend on unpredictable elements that may drive their costs up, typically waste disposal costs for nuclear power and CO2 reduction costs for fossil fuels. With all these factors taken into consideration, the best mix of power sources should be worked out. From the verification results, we can conclude that nuclear power is an irreplaceable energy source for Japan. To prepare for growing electricity demand and protect the environment better, Japan has few choices but to increase the installed capacity of nuclear power generation in the years to come. (author)

  8. Filter arrays

    Science.gov (United States)

    Page, Ralph H.; Doty, Patrick F.

    2017-08-01

    The various technologies presented herein relate to a tiled filter array that can be used in connection with performance of spatial sampling of optical signals. The filter array comprises filter tiles, wherein a first plurality of filter tiles are formed from a first material, the first material being configured such that only photons having wavelengths in a first wavelength band pass therethrough. A second plurality of filter tiles is formed from a second material, the second material being configured such that only photons having wavelengths in a second wavelength band pass therethrough. The first plurality of filter tiles and the second plurality of filter tiles can be interspersed to form the filter array comprising an alternating arrangement of first filter tiles and second filter tiles.

  9. Particle filters for random set models

    CERN Document Server

    Ristic, Branko

    2013-01-01

    “Particle Filters for Random Set Models” presents coverage of state estimation of stochastic dynamic systems from noisy measurements, specifically sequential Bayesian estimation and nonlinear or stochastic filtering. The class of solutions presented in this book is based  on the Monte Carlo statistical method. The resulting  algorithms, known as particle filters, in the last decade have become one of the essential tools for stochastic filtering, with applications ranging from  navigation and autonomous vehicles to bio-informatics and finance. While particle filters have been around for more than a decade, the recent theoretical developments of sequential Bayesian estimation in the framework of random set theory have provided new opportunities which are not widely known and are covered in this book. These recent developments have dramatically widened the scope of applications, from single to multiple appearing/disappearing objects, from precise to imprecise measurements and measurement models. This book...
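    The bootstrap particle filter at the heart of such algorithms can be sketched in a few lines. The following Python toy, a 1-D random-walk state with Gaussian observation noise and invented numbers throughout, illustrates the predict-weight-resample cycle; it is far simpler than the random-set models the book treats.

```python
# Minimal bootstrap particle filter for a 1-D random-walk state with
# noisy direct observations. Each cycle: predict (propagate particles),
# update (weight by observation likelihood), estimate, resample.
import math
import random

def particle_filter(observations, n=500, proc_std=1.0, obs_std=1.0):
    rng = random.Random(0)                      # fixed seed for repeatability
    particles = [rng.gauss(0.0, 5.0) for _ in range(n)]   # broad prior
    estimates = []
    for z in observations:
        # predict: propagate each particle through the motion model
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        # update: weight particles by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # estimate: posterior mean
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # resample: draw particles in proportion to their weights
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

est = particle_filter([1.0, 1.2, 0.9, 1.1])
print(est[-1])  # should settle near the observations around 1.0
```

    The resampling step is what distinguishes the bootstrap filter from plain importance sampling: it discards low-weight particles so the sample set keeps tracking the posterior.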

  10. UV filters for lighting of plants

    Energy Technology Data Exchange (ETDEWEB)

    Doehring, T.; Koefferlein, M.; Thiel, S.; Seidlitz, H.K.; Payer, H.D. [GSF-Forschungszentrum fuer Umwelt und Gesundheit GmbH, Oberschleissheim (Germany)

    1994-12-31

    Different filter glasses are available whose absorption properties are suitable for gradual changes of the spectral UV-B illumination of artificial lighting. Using a suitable set of lamps and filter glasses, an acceptable simulation of the UV-B part of natural global radiation can be achieved. The ageing of these and other filter materials under the extreme UV radiation in the lamphouse of a solar simulator is presently unavoidable; this instability can be dealt with only by precise spectral monitoring and by replacing the filters accordingly. For this reason, it would be useful to attempt to develop real ozone filters that could replace the glass filters. In any case, chamber experiments require a careful selection of the filter material used and must be accompanied by continuous UV-B monitoring.

  11. The development of search filters for adverse effects of surgical interventions in medline and Embase.

    Science.gov (United States)

    Golder, Su; Wright, Kath; Loke, Yoon Kong

    2018-03-31

    Search filter development for adverse effects has tended to focus on retrieving studies of drug interventions. However, a different approach is required for surgical interventions. To develop and validate search filters for medline and Embase for the adverse effects of surgical interventions. Systematic reviews of surgical interventions where the primary focus was to evaluate adverse effect(s) were sought. The included studies within these reviews were divided randomly into a development set, evaluation set and validation set. Using word frequency analysis we constructed a sensitivity maximising search strategy and this was tested in the evaluation and validation set. Three hundred and fifty eight papers were included from 19 surgical intervention reviews. Three hundred and fifty two papers were available on medline and 348 were available on Embase. Generic adverse effects search strategies in medline and Embase could achieve approximately 90% relative recall. Recall could be further improved with the addition of specific adverse effects terms to the search strategies. We have derived and validated a novel search filter that has reasonable performance for identifying adverse effects of surgical interventions in medline and Embase. However, we appreciate the limitations of our methods, and recommend further research on larger sample sizes and prospective systematic reviews. © 2018 The Authors Health Information and Libraries Journal published by John Wiley & Sons Ltd on behalf of Health Libraries Group.

  12. Developing a Model Component

    Science.gov (United States)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a piece of Ground Servicing Equipment (GSE) supporting the Space Shuttle Orbiter. The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM Modules. The Simulation uses GSE models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink, and completed training for UNIX and Simulink. The dryer is a Catch-All replaceable-core filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system, and also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in Simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
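    The modeling step described, estimating the filter-dryer's effect on pressure and velocity from Bernoulli's equation, can be sketched outside Simulink as well. The following Python fragment is a hedged illustration with invented fluid numbers; it uses only the idealized Bernoulli/continuity relation for an area change, not the actual GSE model, which would also include empirical loss coefficients.

```python
# Idealized Bernoulli + continuity: pressure drop across a flow
# restriction (e.g. a filter-dryer core) for incompressible,
# frictionless flow. p_in + 0.5*rho*v_in^2 = p_out + 0.5*rho*v_out^2,
# with continuity giving v_out = v_in * a_in / a_out.

def bernoulli_dp(rho, v_in, a_in, a_out):
    """Return p_in - p_out (Pa) through an area change a_in -> a_out."""
    v_out = v_in * a_in / a_out
    return 0.5 * rho * (v_out ** 2 - v_in ** 2)

# Invented example: liquid-like fluid entering a core with half the area.
dp = bernoulli_dp(rho=1200.0, v_in=2.0, a_in=1e-4, a_out=5e-5)
print(dp)  # v_out = 4 m/s, so dp = 0.5 * 1200 * (16 - 4) = 7200 Pa
```

    A real component model would add a pressure-loss term for the desiccant core itself; the Bernoulli term above only captures the velocity change from the area restriction.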

  13. Cost/benefit evaluation of electrofibrous air filters

    International Nuclear Information System (INIS)

    Bergman, W.; Kuhl, W.; Biermann, A.; Lum, B.

    1986-01-01

    Experimental electric air filters based on the principle of superimposing an electric field on conventional fibrous air filters have been developed. The experimental electric filters described in this report include prefilters for use in glove boxes and in ventilation systems, recirculating air filters, electric HEPA filters, and high-efficiency, high-temperature air filters. In each case the large improvement in filter efficiency that occurs when a mechanical filter is electrified is demonstrated, as is a significant increase in the particle loading capacity of the filters in many of our evaluations. Both laboratory and field test results are presented. This paper also demonstrates that the performance of all of our electric filter designs except one can be matched by conventional mechanical air filters, usually at a lower cost. The one exception is the high-temperature, high-efficiency electric air filter; in that case there is no mechanical filter medium that can match the performance of the electric air filter. Our findings show that electric air filters are cost effective compared to mechanical air filters only when the performance of the mechanical air filter cannot be further improved by mechanical means. (author)

  14. A Novel Technique Using a Protection Filter During Fibrin Sheath Removal for Implanted Venous Access Device Dysfunction

    Energy Technology Data Exchange (ETDEWEB)

    Sotiriadis, Charalampos; Hajdu, Steven David [University Hospital of Lausanne, Cardiothoracic and Vascular Unit, Department of Radiology (Switzerland); Degrauwe, Sophie [University Hospital of Lausanne, Department of Cardiology (Switzerland); Barras, Heloise; Qanadli, Salah Dine, E-mail: salah.qanadli@chuv.ch [University Hospital of Lausanne, Cardiothoracic and Vascular Unit, Department of Radiology (Switzerland)

    2016-08-15

    With the increased use of implanted venous access devices (IVADs) for continuous long-term venous access, several techniques, such as percutaneous endovascular fibrin sheath removal, have been described to maintain catheter function. Most standard techniques do not capture the stripped fibrin sheath, which is subsequently released into the pulmonary circulation and may lead to symptomatic pulmonary embolism. The presented case describes an endovascular technique that includes stripping, capture, and removal of the fibrin sheath using a novel filter device. A 64-year-old woman presented with IVAD dysfunction. Stripping was performed using a snare coaxial to the filter to capture the fibrin sheath. The captured fragment was subsequently removed for visual and pathological verification. No immediate complication was observed and the patient was discharged the day of the procedure.

  15. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
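    The dose/distance-to-agreement criterion recommended above is commonly evaluated as a gamma index: a point passes when some reference point lies within the combined dose/distance tolerance ellipse. The following is a hedged 1-D Python sketch with invented dose values; clinical implementations work on 2-D/3-D dose grids with interpolation, so this is conceptual only.

```python
# 1-D gamma index for a dose/distance-to-agreement criterion
# (e.g. 3%/3 mm): gamma <= 1 means the evaluated point passes.
import math

def gamma_index(x_eval, d_eval, ref, dose_tol, dist_tol):
    """`ref` is a list of (position_mm, dose) reference points.
    Returns the gamma value for one evaluated point."""
    return min(
        math.sqrt(((x_eval - x) / dist_tol) ** 2 +
                  ((d_eval - d) / dose_tol) ** 2)
        for x, d in ref
    )

# Invented reference profile (normalized dose vs. position in mm):
reference = [(0.0, 1.00), (1.0, 1.01), (2.0, 1.05)]
g = gamma_index(1.5, 1.02, reference, dose_tol=0.03, dist_tol=3.0)
print(g <= 1.0)  # the point agrees within 3%/3 mm -> True
```

    The advantage over a plain dose-difference test is exactly the one the paper exploits: a quantitative, threshold-based pass/fail value that can be tracked over many patients to justify reducing per-patient verification.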

  16. Mobile filters in nuclear engineering

    International Nuclear Information System (INIS)

    Meuter, R.

    1979-01-01

    The need for filters with high efficiencies which may be used at any place originated in nuclear power plants. Filters of this type, called Filtermobil, have been developed by Sulzer. They have been used successfully in nuclear plants for several years. (orig.) [de]

  17. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    Full Text Available With the development of wireless technology, much data communication and processing is conducted on mobile devices with wireless connections. Because mobile devices will always be resource-poor relative to static ones, even as their absolute capabilities improve, they cannot perform some expensive computational tasks. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices outsource expensive computations to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn information about the message-signature pair being verified, which is undesirable, especially when the message contains secret information. In this paper, we study server-aided verification signatures with privacy, in which the message-signature pair to be verified is protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed, both of which are proved secure.

  18. FINGERPRINT VERIFICATION SYSTEM USING A GABOR FILTER BANK-BASED MATCHING METHOD

    Directory of Open Access Journals (Sweden)

    A.A. K. Oka Sudana

    2009-05-01

    Full Text Available Many cases of fraud occur in which someone, claiming to be someone else, attempts to access something. This situation challenges us to create a reliable and trustworthy system to verify a person's identity. A biometric system is an automatic recognition system based on the physiological characteristics and/or behaviour of a human being. One such biometric system is fingerprint recognition. A fingerprint verification system verifies a human fingerprint against a claimed identity: the input fingerprint image is compared with the reference fingerprint image stored in a database for the given identity. In this research, the Gabor filter bank-based fingerprint matching method is used to extract the image features. In application, a sample image is matched against the reference images in the database. Before matching, the sample image features are calculated using the aforementioned method. The matching process then finds the reference image with the shortest Euclidean distance to the sample image, and succeeds only if the sample image and the reference image are from the same person. The images used in this research are fingerprint images of 191 x 191 pixels. This research investigates the success rate and the execution time of the verification process. System testing shows that the Gabor filter bank method can provide a high success rate of 95%. The success rate depends on the quality of the fingerprint image and the accuracy of locating the reference (core) point; the similarity between the query image and the reference image is determined by the projection distance between the two images, and a threshold value is used to decide whether the image is valid or not.
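
    The matching step described above (shortest Euclidean distance between feature vectors, accepted against a threshold) can be sketched as follows. The feature extraction with the Gabor filter bank is omitted; the vectors, identities and threshold here are hypothetical.

```python
import math

# Sketch of the matching stage only: features are assumed to be fixed-length
# vectors (e.g. FingerCode-style features from a Gabor filter bank); the
# filtering and feature extraction steps are omitted. All data hypothetical.

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(query, reference_db, claimed_id, threshold):
    """Verify a claimed identity: accept if the shortest Euclidean distance
    between the query features and any enrolled reference vector for that
    identity is within the decision threshold."""
    best = min(euclidean(query, ref) for ref in reference_db[claimed_id])
    return best <= threshold, best
```

    The threshold trades false acceptances against false rejections; in practice it is tuned on a labelled validation set.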

  19. A Formal Verification Method of Function Block Diagram

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun; Jee, Eun Kyoung; Jeon, Seung Jae; Park, Gee Yong; Kwon, Kee Choon

    2007-01-01

    The Programmable Logic Controller (PLC), an industrial computer specialized for real-time applications, is widely used in diverse control systems in chemical processing plants, nuclear power plants and traffic control systems. As a PLC is often used to implement safety-critical embedded software, rigorous safety demonstration of PLC code is necessary. Function block diagram (FBD) is a standard application programming language for the PLC and is currently being used in the development of a fully digitalized reactor protection system (RPS), called the IDiPS, under the KNICS project. The verification of FBD programs is therefore a pressing problem of great importance. In this paper, we propose a formal verification method for FBD programs: we define FBD programs formally in compliance with IEC 61131-3, translate the programs into a Verilog model, and finally verify the model using the SMV model checker. To demonstrate the feasibility and effectiveness of this approach, we applied it to the IDiPS, which is currently being developed under the KNICS project. The remainder of this paper is organized as follows. Section 2 briefly describes Verilog and Cadence SMV. In Section 3, we introduce FBD2V, a tool implemented to support the proposed FBD verification framework. A summary and conclusion are provided in Section 4.

  20. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  1. Kernel-based noise filtering of neutron detector signals

    International Nuclear Information System (INIS)

    Park, Moon Ghu; Shin, Ho Cheol; Lee, Eun Ki

    2007-01-01

    This paper describes recently developed techniques for effective filtering of neutron detector signal noise. Three kinds of noise filters are proposed and their performance is demonstrated for the estimation of reactivity. The tested filters are based on the unilateral kernel filter, the unilateral kernel filter with adaptive bandwidth, and the bilateral filter, chosen to show their effectiveness in edge preservation. Filtering performance is compared with conventional low-pass and wavelet filters. The bilateral filter shows a remarkable improvement compared with the unilateral kernel and wavelet filters. The effectiveness and simplicity of the unilateral kernel filter with adaptive bandwidth is also demonstrated by applying it to the reactivity measurement performed during reactor start-up physics tests.
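
    The edge-preserving behaviour of a bilateral filter can be shown with a minimal pure-Python sketch on a 1-D signal; the parameters below are illustrative, not the authors' settings.

```python
import math

def bilateral_filter_1d(signal, radius=5, sigma_s=2.0, sigma_r=0.1):
    """Edge-preserving bilateral smoothing of a 1-D signal: each sample is
    replaced by a weighted mean of its neighbours, with weights that decay
    both with distance (sigma_s) and with amplitude difference (sigma_r),
    so sharp steps are preserved while in-band noise is smoothed."""
    out = []
    n = len(signal)
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)
                         - ((signal[i] - signal[j]) ** 2) / (2 * sigma_r ** 2))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out
```

    A plain low-pass or box filter would smear a step edge across several samples; here the range term suppresses the weight of samples on the far side of the edge, which is the property the paper exploits for reactivity transients.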

  2. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    Science.gov (United States)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    In the article the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a hardware-software system that provides automated verification and calibration. The hardware part of the system switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the configured algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors and compiles protocols. The system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic verification mode (without one). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of secondary-equipment controllers of metering units. Automated verification with this hardware-software system shortens verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
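
    The error-calculation and protocol-compilation step of such a system might be sketched as follows; the setpoints, tolerance and report layout are hypothetical and the communication with the calibrator and controller is omitted.

```python
# Sketch of the error-calculation and protocol step of an automated
# verification run. Setpoints, units and tolerance are hypothetical;
# I/O with the calibrator and controller is omitted.

def verify_channel(setpoints, readings, tolerance_pct):
    """Compare controller readings against calibrator reference setpoints,
    compute relative errors and compile a simple pass/fail protocol."""
    points = []
    for sp, rd in zip(setpoints, readings):
        err_pct = 100.0 * (rd - sp) / sp
        points.append({
            "setpoint": sp,
            "reading": rd,
            "error_pct": err_pct,
            "pass": abs(err_pct) <= tolerance_pct,
        })
    return {"points": points, "channel_pass": all(p["pass"] for p in points)}
```

    In a full system this routine would run once per commutated channel, and the per-point records would be rendered into the verification protocol document.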

  3. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  4. The construction of environments for development of test and verification technology -The development of advanced instrumentation and control technology-

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Ham, Chang Shick; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Kim, Jae Hee; Lee, Chang Soo

    1994-07-01

    Several problems were identified in digitalizing the I and C systems of NPPs. To resolve them, the work is divided into hardware and software. Hardware verification and validation analyzed common mode failure, the commercial grade dedication process, and electromagnetic compatibility. We reviewed codes and standards to establish consensus criteria among vendors and licensers. We then described the software licensing procedures under 10 CFR 50 and 10 CFR 52 of the United States Nuclear Regulatory Commission (NRC) and presented vendors' approaches to cope with the licensing barrier. Finally, we surveyed the technical issues related to developing and licensing high-integrity software for digital I and C systems. (Author)

  5. Arms control verification costs: the need for a comparative analysis

    International Nuclear Information System (INIS)

    MacLean, G.; Fergusson, J.

    1998-01-01

    The end of the Cold War era has presented practitioners and analysts of international non-proliferation, arms control and disarmament (NACD) the opportunity to focus more intently on the range and scope of NACD treaties and their verification. Aside from obvious favorable and well-publicized developments in the field of nuclear non-proliferation, progress also has been made in a wide variety of arenas, ranging from chemical and biological weapons, fissile material, conventional forces, ballistic missiles, to anti-personnel landmines. Indeed, breaking from the constraints imposed by the Cold War United States-Soviet adversarial zero-sum relationship that impeded the progress of arms control, particularly on a multilateral level, the post Cold War period has witnessed significant developments in NACD commitments, initiatives, and implementation. The goals of this project - in its final iteration - will be fourfold. First, it will lead to the creation of a costing analysis model adjustable for uses in several current and future arms control verification tasks. Second, the project will identify data accumulated in the cost categories outlined in Table 1 in each of the five cases. By comparing costs to overall effectiveness, the application of the model will demonstrate desirability in each of the cases (see Chart 1). Third, the project will identify and scrutinize 'political costs' as well as real expenditures and investment in the verification regimes (see Chart 2). And, finally, the project will offer some analysis on the relationship between national and multilateral forms of arms control verification, as well as the applicability of multilateralism as an effective tool in the verification of international non-proliferation, arms control, and disarmament agreements. (author)

  6. FPGA Design and Verification Procedure for Nuclear Power Plant MMIS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dongil; Yoo, Kawnwoo; Ryoo, Kwangki [Hanbat National Univ., Daejeon (Korea, Republic of)

    2013-05-15

    In this paper, it is shown that reliability can be ensured by performing verification steps based on the FPGA development methodology, thereby ensuring the safety of FPGA applications in the NPP MMIS. Currently, the PLC (Programmable Logic Controller) being developed is composed of an FPGA (Field Programmable Gate Array) and a CPU (Central Processing Unit). As the importance of the FPGA in the NPP (Nuclear Power Plant) MMIS (Man-Machine Interface System) has grown, research on FPGA verification has recently become increasingly concentrated.

  7. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Science.gov (United States)

    2010-07-01

    .... Successive mass determinations of each reference PM sample media (e.g., filter) must return the same value... individual test media (e.g., filter) mass readings occurring between the successive reference media (e.g., filter) mass determinations. You may reweigh these media (e.g., filter) in another weighing session. If...

  8. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Tugwell, Peter; Boers, Maarten; D'Agostino, Maria-Antonietta

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter requires that criteria be met to demonstrate that the outcome instrument meets...... the criteria for content, face, and construct validity. METHODS: Discussion groups critically reviewed a variety of ways in which case studies of current OMERACT Working Groups complied with the Truth component of the Filter and what issues remained to be resolved. RESULTS: The case studies showed...... that there is broad agreement on criteria for meeting the Truth criteria through demonstration of content, face, and construct validity; however, several issues were identified that the Filter Working Group will need to address. CONCLUSION: These issues will require resolution to reach consensus on how Truth...

  9. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes...... for defining core areas of measurement ("Filter 2.0 Core Areas of Measurement") was presented at OMERACT 11 to explore areas of consensus and to consider whether already endorsed core outcome sets fit into this newly proposed framework. METHODS: Discussion groups critically reviewed the extent to which case......, presentation, and clarity of the framework were questioned. The discussion groups and subsequent feedback highlighted 20 such issues. CONCLUSION: These issues will require resolution to reach consensus on accepting the proposed Filter 2.0 framework of Core Areas as the basis for the selection of Core Outcome...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: PROTOCOL FOR THE VERIFICATION OF GROUTING MATERIALS FOR INFRASTRUCTURE REHABILITATION AT THE UNIVERSITY OF HOUSTON - CIGMAT

    Science.gov (United States)

    This protocol was developed under the Environmental Protection Agency's Environmental Technology Verification (ETV) Program, and is intended to be used as a guide in preparing laboratory test plans for the purpose of verifying the performance of grouting materials used for infra...

  11. Digital Simulation of a Hybrid Active Filter - An Active Filter in Series with a Shunt Passive Filter

    OpenAIRE

    Sitaram, Mahesh I; Padiyar, KR; Ramanarayanan, V

    1998-01-01

    Active filters have long been in use for the filtering of power system load harmonics. In this paper, the digital simulation results of a hybrid active power filter system for a rectifier load are presented. The active filter is used for filtering higher order harmonics as the dominant harmonics are filtered by the passive filter. This reduces the rating of the active filter significantly. The DC capacitor voltage of the active filter is controlled using a PI controller.
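
    The DC capacitor voltage loop described above is a standard PI regulation; a minimal discrete-time sketch is given below. The gains, sample time and voltage levels are illustrative, not taken from the paper.

```python
# Minimal discrete-time PI controller with output clamping, of the kind
# used to regulate an active filter's DC capacitor voltage. Gains, sample
# time and setpoints below are illustrative, not from the paper.

def make_pi(kp, ki, dt, out_min=float("-inf"), out_max=float("inf")):
    integral = [0.0]  # integrator state kept in a closure

    def step(setpoint, measured):
        error = setpoint - measured
        integral[0] += error * dt
        out = kp * error + ki * integral[0]
        return min(max(out, out_min), out_max)  # simple output clamp

    return step
```

    In a hybrid filter, such a controller would be called once per switching period with the DC-link voltage reference and measurement, and its output would set the small active-power demand needed to hold the capacitor voltage while the passive stage absorbs the dominant harmonics.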

  12. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times and assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center, a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparison of forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
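
    The 2x2 contingency-table assessment mentioned for CME arrival forecasts can be sketched with the standard scores below; the counts used are illustrative, not MOSWOC data.

```python
# Standard 2x2 contingency-table verification scores for event forecasts
# such as CME arrival windows (hit = event forecast and observed, miss =
# observed but not forecast, etc.). Counts here are illustrative.

def contingency_scores(hits, false_alarms, misses, correct_negatives):
    h, f, m, c = hits, false_alarms, misses, correct_negatives
    n = h + f + m + c
    pod = h / (h + m)            # probability of detection
    far = f / (h + f)            # false alarm ratio
    # Heidke skill score: accuracy relative to chance agreement
    chance = ((h + m) * (h + f) + (c + m) * (c + f)) / n
    hss = (h + c - chance) / (n - chance)
    return {"POD": pod, "FAR": far, "HSS": hss}
```

    A perfect forecast set gives POD = 1, FAR = 0 and HSS = 1, while a forecast no better than chance gives HSS = 0; comparing HSS between two centres is one way to benchmark one set of CME arrival forecasts against another.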

  13. Determination of permeability of ultra-fine cupric oxide aerosol through military filters and protective filters

    Science.gov (United States)

    Kellnerová, E.; Večeřa, Z.; Kellner, J.; Zeman, T.; Navrátil, J.

    2018-03-01

    The paper evaluates the filtration and sorption efficiency of selected types of military combined filters and protective filters. The testing was carried out with the use of ultra-fine aerosol containing cupric oxide nanoparticles ranging in size from 7.6 nm to 299.6 nm. The measurements of nanoparticles were carried out using a scanning mobility particle sizer before and after the passage through the filter and a developed sampling device, at a particle number concentration of approximately 750,000 particles·cm⁻³. The basic parameters of permeability of ultra-fine aerosol passing through the tested material were evaluated, in particular particle size, efficiency of nanoparticle capture by the filter, permeability coefficient and overall filtration efficiency. Results indicate that the military filter and particle filters exhibited the highest aerosol permeability in the nanoparticle size range between 100-200 nm, while the MOF filters had the highest permeability in the range of 200 to 300 nm. The Filter Nuclear and the Health and Safety filter had 100% nanoparticle capture efficiency and were therefore the most effective. The obtained measurement results have shown that the filtration efficiency over the entire measured range of nanoparticles was sufficient; however, it differed between particle sizes.
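
    The permeability coefficient and capture efficiency evaluated in the paper follow directly from the number concentrations measured upstream and downstream of the filter; a sketch with hypothetical per-size-bin counts:

```python
# Per-bin permeability and capture efficiency from particle number
# concentrations measured upstream and downstream of a filter.
# The counts below are hypothetical, not the measured CuO data.

def filtration_metrics(upstream, downstream):
    permeability = [d / u for u, d in zip(upstream, downstream)]
    efficiency = [1.0 - p for p in permeability]          # per size bin
    overall = 1.0 - sum(downstream) / sum(upstream)       # whole range
    return {"permeability": permeability,
            "efficiency": efficiency,
            "overall_efficiency": overall}
```

    Reporting both per-bin and overall values matters because, as the paper notes, a filter can be adequate overall yet show a pronounced penetration window (here, around 100-300 nm depending on filter type).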

  14. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  15. 75 FR 4100 - Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations

    Science.gov (United States)

    2010-01-26

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-04] Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations AGENCY: Office of the Chief Information Officer... Following Information Title of Proposal: Enterprise Income Verification (EIV) System- Debts Owed to PHAs and...

  16. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides (H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238) were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  17. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code is an integral part of this process. This document identifies the work performed and documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete.

  18. Tools and Methods for RTCP-Nets Modeling and Verification

    Directory of Open Access Journals (Sweden)

    Szpyrka Marcin

    2016-09-01

    Full Text Available RTCP-nets are high-level Petri nets similar to timed colored Petri nets, but with a different time model and some structural restrictions. The paper deals with practical aspects of using RTCP-nets for modeling and verification of real-time systems. It contains a survey of software tools developed to support RTCP-nets. Verification of RTCP-nets is based on coverability graphs, which represent the set of reachable states in the form of a directed graph. Two approaches to verification of RTCP-nets are considered in the paper. The former is oriented towards states and is based on translation of a coverability graph into a nuXmv (NuSMV) finite state model. The latter is oriented towards transitions and uses the CADP toolkit to check whether requirements given as μ-calculus formulae hold for a given coverability graph. All presented concepts are discussed using illustrative examples.
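
    Verification here rests on enumerating reachable states. As a much simplified illustration, a breadth-first search over the markings of a plain place/transition net (with no time model and no colours, unlike real RTCP-nets, and assuming the state space is finite) can be sketched as:

```python
from collections import deque

# Simplified stand-in for building a reachability/coverability graph:
# BFS over the markings of a plain place/transition net. Transitions are
# (consume, produce) pairs of dicts keyed by place name; the example net
# in the test is hypothetical.

def reachable_markings(initial, transitions):
    def key(m):
        return tuple(sorted(m.items()))

    seen = {key(initial)}
    queue = deque([initial])
    edges = []
    while queue:
        m = queue.popleft()
        for t, (consume, produce) in enumerate(transitions):
            # a transition is enabled if every input place holds enough tokens
            if all(m.get(p, 0) >= k for p, k in consume.items()):
                m2 = dict(m)
                for p, k in consume.items():
                    m2[p] -= k
                for p, k in produce.items():
                    m2[p] = m2.get(p, 0) + k
                edges.append((key(m), t, key(m2)))
                if key(m2) not in seen:
                    seen.add(key(m2))
                    queue.append(m2)
    return seen, edges
```

    On such a graph, state-oriented checking (the nuXmv route) amounts to questions about the members of `seen`, while transition-oriented properties (the CADP/μ-calculus route) are questions about paths through `edges`.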

  19. Verification and accreditation schemes for climate change activities: A review of requirements for verification of greenhouse gas reductions and accreditation of verifiers—Implications for long-term carbon sequestration

    Science.gov (United States)

    Roed-Larsen, Trygve; Flach, Todd

    The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit for best practices. The UN Framework Convention for Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely with these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes of their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with the intent of providing a common understanding of all efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts of securing a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.

  20. Sustaining a verification regime in a nuclear weapon-free world. VERTIC research report no. 4

    International Nuclear Information System (INIS)

    Moyland, S. van

    1999-01-01

    Sustaining high levels of commitment to and enthusiasm for the verification regime in a nuclear weapon-free world (NWFW) would be a considerable challenge, but the price of failure would be high. No verification system for a complete ban on a whole class of weapons of mass destruction (WMD) has been in existence long enough to provide a precedent or the requisite experience. Nevertheless, lessons from the International Atomic Energy Agency's (IAEA) nuclear safeguards system are instructive. A potential problem over the long haul is the gradual erosion of the deterrent effect of verification that may result from the continual overlooking of minor instances of non-compliance. Flaws in the verification system must be identified and dealt with early lest they also corrode the system. To achieve this, the verification organisation's inspectors and analytical staff will need sustained support, encouragement, resources and training. In drawing attention to weaknesses, they must be supported by management and at the political level. The leaking of sensitive information, either industrial or military, by staff of the verification regime is a potential problem. 'Managed access' techniques should be constantly examined and improved. The verification organisation and states parties will need to sustain close co-operation with the nuclear and related industries. Frequent review mechanisms must be established, and states must invest time and effort to make them effective. Another potential problem is the withering of resources for sustained verification. Verification organisations tend to be pressured by states to cut, or at least cap, costs, even if the verification workload increases. The verification system must be as effective as knowledge and experience allow. The organisation will need continuously to update its scientific methods and technology. This requires in-house resources plus external research and development (R&D). Universities, laboratories and industry need incentives to

  1. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

Full Text Available With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes only a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application that overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and on our circuit system; they confirm that the proposed scheme provides excellent accuracy at low complexity. Moreover, we also propose a multiple-state solution to handle the heart-rate changes caused by sport; this should be the first work to address the issue of sport in ECG verification.
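
The mean-interval idea can be illustrated with a minimal sketch. All names, the R-peak positions and the 5% tolerance below are illustrative assumptions, not details taken from the paper:

```python
from statistics import mean

def mean_interval_template(r_peaks):
    """Reduce a beat sequence to its mean inter-beat (R-R) interval.
    r_peaks: sample indices of detected R peaks (peak detection not shown)."""
    intervals = [b - a for a, b in zip(r_peaks, r_peaks[1:])]
    return mean(intervals)

def verify(enrolled_template, candidate_peaks, tolerance=0.05):
    """Accept the candidate if its mean interval lies within a relative
    tolerance of the enrolled template (threshold is illustrative)."""
    candidate = mean_interval_template(candidate_peaks)
    return abs(candidate - enrolled_template) / enrolled_template <= tolerance

enrolled = mean_interval_template([10, 110, 208, 309, 410])  # ~100 samples/beat
print(verify(enrolled, [12, 114, 213, 312]))  # same rhythm -> True
```

A multiple-state variant, as proposed for sport, would keep several such templates per user (rest, exercise, recovery) and accept a match against any of them.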

  2. Advanced Filtering Techniques Applied to Spaceflight, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — IST-Rolla developed two nonlinear filters for spacecraft orbit determination during the Phase I contract. The theta-D filter and the cost based filter, CBF, were...

  3. Cast Steel Filtration Trials Using Ceramic-Carbon Filters

    Directory of Open Access Journals (Sweden)

    Lipowska B.

    2014-12-01

Full Text Available Trials of cast steel filtration have been conducted using two types of newly developed foam filters in which carbon was the phase binding the ceramic particles. In one filter the source of carbon was flake graphite and coal-tar pitch, while in the other graphite was replaced by a cheaper carbon precursor. The newly developed filters are fired at 1000°C, i.e. at a much lower temperature than the currently applied ZrO2-based filters. During the filtration trials the filters were subjected to the attack of a flowing metal stream at a temperature of 1650°C for 30 seconds.

  4. Verification of space weather forecasts at the UK Met Office

    Science.gov (United States)

    Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.

    2017-12-01

The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users, both to understand the performance of these forecasts and to identify strengths and weaknesses for further development. Met Office terrestrial near-real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves, reliability diagrams and rolling Ranked Probability Skill Scores (RPSSs), thus providing an understanding of forecast performance and skill. Results suggest that the MOSWOC-issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
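
The skill-versus-climatology comparison described above can be sketched with the Brier score, a single-event relative of the RPSS: a skill score is positive only when the forecast beats the climatological reference. The data below are invented for illustration:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probability and 0/1 outcome."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def skill_score(forecast_bs, reference_bs):
    """Positive when the forecast beats the reference (here, climatology)."""
    return 1.0 - forecast_bs / reference_bs

outcomes  = [1, 0, 0, 1, 0, 0, 0, 1]          # event observed or not
forecasts = [0.8, 0.1, 0.2, 0.7, 0.1, 0.3, 0.2, 0.6]
clim_p    = sum(outcomes) / len(outcomes)      # climatological base rate
climatology = [clim_p] * len(outcomes)

bs_f = brier_score(forecasts, outcomes)
bs_c = brier_score(climatology, outcomes)
print(round(skill_score(bs_f, bs_c), 3))       # > 0: beats climatology
```

A forecast that merely reproduces the recent base rate scores zero by construction, which is why a climatological benchmark is such a demanding baseline for rare events like large flares.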

  5. Dust characterisation for hot gas filters

    Energy Technology Data Exchange (ETDEWEB)

Dockter, B.; Erickson, T.; Henderson, A.; Hurley, J.; Kuehnel, V.; Katrinak, K.; Nowok, J.; O'Keefe, C.; O'Leary, E.; Swanson, M.; Watne, T. [University of North Dakota, Grand Forks, ND (United States). Energy and Environmental Research Center (UNDEERC)

    1998-03-01

Hot gas filtration to remove particulates from the gas flow upstream of the gas turbine is critical to the development of many advanced coal-fired power generation technologies, such as the Air Blown Gasification Cycle (ABGC), a hybrid gasification combined cycle being developed in the UK. Ceramic candle filters are considered the most promising technology for this purpose. Problems of mechanical failure and of 'difficult-to-clean' dusts causing high pressure losses across the filter elements need to be solved. The project investigated the behaviour of high-temperature filter dusts, and the factors determining the ease with which they can be removed from filters. The high-temperature behaviour of dusts from both combustion and gasification systems was investigated. Dust samples were obtained from full-scale demonstration and pilot-scale plant operating around the world. Dust samples were also produced from a variety of coals, and under several different operating conditions, on UNDEERC's pilot-scale reactor. Key factors affecting dust behaviour were examined, including: the rates at which tensile strength develops in dust cakes; the thermochemical equilibria pertaining under filtration conditions; dust adhesivity on representative filter materials; and the build-up and cleaning behaviour of dusts on representative filter candles. The results obtained confirmed the importance of dust temperature, dust cake porosity, cake liquid content, and particle size distribution in determining the strength of a dust cake. An algorithm was developed to indicate the likely sticking propensity of dusts as a function of coal and sorbent composition and combustion conditions. This algorithm was incorporated into a computer package which can be used to judge the degree of difficulty in filter cleaning that can be expected to arise in a real plant, based on operating parameters and coal analyses. 6 figs.

  6. Development of high efficiency filtered containment venting system by using AgX

    International Nuclear Information System (INIS)

    Narabayashi, Tadashi; Fujii, Yasuhiro; Chiba, Go; Tsuji, Masashi; Ishii, Tasuku

    2014-01-01

The Fukushima Daiichi NPP accident could have been terminated if sufficient accident countermeasures, such as waterproof doors and mobile power supplies, had been in place. Europe had already installed heat removal systems and filtered containment venting systems (FCVS) following the lessons of the TMI and Chernobyl accidents. Decay heat removal and CV spray cooling with FCVS are ensured by using mobile generators and heat exchangers to keep the ultimate heat sink available even in a natural disaster such as a large earthquake, a big tsunami or sudden flooding. In this paper we introduce a high-decontamination-factor FCVS that uses a silver zeolite named AgX, developed by Rasa Industries, Ltd. Hokkaido University has tested a wet-type FCVS using a venturi scrubber in a water pool, and a dry-type FCVS using a metallic filter for the first stage and AgX for the second stage. Since AgX needs superheated steam, the steam can be heated by a heat exchanger; this was confirmed by TRAC analysis. (author)

  7. 24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.

    Science.gov (United States)

    2010-04-01

    ... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...

  8. Blended particle filters for large-dimensional chaotic dynamical systems

    Science.gov (United States)

    Majda, Andrew J.; Qi, Di; Sapsis, Themistoklis P.

    2014-01-01

    A major challenge in contemporary data science is the development of statistically accurate particle filters to capture non-Gaussian features in large-dimensional chaotic dynamical systems. Blended particle filters that capture non-Gaussian features in an adaptively evolving low-dimensional subspace through particles interacting with evolving Gaussian statistics on the remaining portion of phase space are introduced here. These blended particle filters are constructed in this paper through a mathematical formalism involving conditional Gaussian mixtures combined with statistically nonlinear forecast models compatible with this structure developed recently with high skill for uncertainty quantification. Stringent test cases for filtering involving the 40-dimensional Lorenz 96 model with a 5-dimensional adaptive subspace for nonlinear blended filtering in various turbulent regimes with at least nine positive Lyapunov exponents are used here. These cases demonstrate the high skill of the blended particle filter algorithms in capturing both highly non-Gaussian dynamical features as well as crucial nonlinear statistics for accurate filtering in extreme filtering regimes with sparse infrequent high-quality observations. The formalism developed here is also useful for multiscale filtering of turbulent systems and a simple application is sketched below. PMID:24825886
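
The 40-dimensional Lorenz 96 model used as the stringent test bed above is straightforward to integrate; a minimal pure-Python sketch with RK4 time stepping is given below (the blended particle filter itself is not shown):

```python
def lorenz96_rhs(x, forcing=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, cyclic indices."""
    n = len(x)
    return [(x[(i + 1) % n] - x[(i - 2) % n]) * x[(i - 1) % n] - x[i] + forcing
            for i in range(n)]

def rk4_step(x, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = lorenz96_rhs(x)
    k2 = lorenz96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k1)])
    k3 = lorenz96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k2)])
    k4 = lorenz96_rhs([xi + dt * ki for xi, ki in zip(x, k3)])
    return [xi + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

# 40-variable state near the unstable fixed point x_i = F, with a small bump
# whose growth reflects the positive Lyapunov exponents mentioned above.
state = [8.0] * 40
state[0] += 0.01
for _ in range(500):                 # integrate 5 time units at dt = 0.01
    state = rk4_step(state, 0.01)
print(len(state))
```

In a filtering experiment this integrator would serve as the truth run and forecast model, with sparse noisy observations of `state` assimilated by the filter.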

  9. Development of a new linearly variable edge filter (LVEF)-based compact slit-less mini-spectrometer

    Science.gov (United States)

    Mahmoud, Khaled; Park, Seongchong; Lee, Dong-Hoon

    2018-02-01

This paper presents the development of a compact charge-coupled device (CCD) spectrometer. We describe the design, concept and characterization of a VNIR linear variable edge filter (LVEF)-based mini-spectrometer. The new instrument has been realized for operation in the 300 nm to 850 nm wavelength range and consists of a linear variable edge filter in front of a CCD array. Small size, light weight and low cost are achieved by using linearly variable filters, with no need for moving parts for wavelength selection as in commercial spectrometers on the market. This overview discusses the characteristics of the main components and the overall concept, together with the main advantages and limitations reported. Experimental characteristics of the LVEFs are described, as are the mathematical approach for obtaining the position-dependent slit function of the presented prototype spectrometer and the numerical deconvolution used for spectrum reconstruction. The performance of the prototype instrument is demonstrated by measuring the spectrum of a reference light source.

  10. Development and testing of a two stage granular filter to improve collection efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Rangan, R.S.; Prakash, S.G.; Chakravarti, S.; Rao, S.R.

    1999-07-01

A circulating bed granular filter (CBGF) with a single filtration stage was tested with a PFB combustor in the Coal Research Facility of BHEL R and D in Hyderabad during the years 1993--95. Filter outlet dust loading varied between 20--50 mg/Nm{sup 3} for an inlet dust loading of 5--8 g/Nm{sup 3}. The results were reported in Fluidized Bed Combustion-Volume 2, ASME 1995. Though the outlet consists predominantly of fine particulates below 2 microns, it is still beyond present day gas turbine specifications for particulate concentration. In order to enhance the collection efficiency, a two-stage granular filtration concept was evolved, wherein the filter depth is divided between two stages, accommodated in two separate vertically mounted units. The design also incorporates BHEL's scale-up concept of multiple parallel stages. The two-stage concept minimizes reentrainment of captured dust by providing clean granules in the upper stage, from where gases finally exit the filter. The design ensures that dusty gases come in contact with granules having a higher dust concentration at the bottom of the two-stage unit, where most of the cleaning is completed. A second filtration stage of cleaned granules is provided in the top unit (where the granules are returned to the system after dedusting), minimizing reentrainment. Tests were conducted to determine the optimum granule-to-dust ratio (G/D ratio), which decides the granule circulation rate required for the desired collection efficiency. The data bring out the importance of pre-separation and the limitation on inlet dust loading for any continuous system of granular filtration. Collection efficiencies obtained were much higher (outlet dust being 3--9 mg/Nm{sup 3}) than in the single stage filter tested earlier for similar dust loading at the inlet. The results indicate that two-stage granular filtration has a high potential for HTHT application with fewer risks as compared to other systems under development.
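
From the inlet and outlet dust loadings quoted above, the collection efficiencies of the single- and two-stage configurations follow directly (mid-range values are assumed here for illustration):

```python
def collection_efficiency(inlet_mg_per_nm3, outlet_mg_per_nm3):
    """Fraction of the inlet dust loading captured by the filter."""
    return 1.0 - outlet_mg_per_nm3 / inlet_mg_per_nm3

# Abstract figures: inlet 5-8 g/Nm3; outlet 20-50 mg/Nm3 (single stage)
# versus 3-9 mg/Nm3 (two stage). Mid-range values, converted to mg/Nm3:
single_stage = collection_efficiency(6500.0, 35.0)
two_stage    = collection_efficiency(6500.0, 6.0)
print(f"single: {single_stage:.4%}  two-stage: {two_stage:.4%}")
```

The comparison makes the point of the paper numerically: both configurations capture well over 99% of the dust, but the two-stage unit cuts the residual emission by roughly a factor of six.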

  11. Effective verification of confidentiality for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Minh Tri; Stoelinga, Mariëlle Ida Antoinette; Huisman, Marieke

    2014-01-01

    This paper studies how confidentiality properties of multi-threaded programs can be verified efficiently by a combination of newly developed and existing model checking algorithms. In particular, we study the verification of scheduler-specific observational determinism (SSOD), a property that

  12. GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES

    Science.gov (United States)

This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...

  13. Filter assembly for metallic and intermetallic tube filters

    Science.gov (United States)

    Alvin, Mary Anne; Lippert, Thomas E.; Bruck, Gerald J.; Smeltzer, Eugene E.

    2001-01-01

    A filter assembly (60) for holding a filter element (28) within a hot gas cleanup system pressure vessel is provided, containing: a filter housing (62), said filter housing having a certain axial length and having a peripheral sidewall, said sidewall defining an interior chamber (66); a one piece, all metal, fail-safe/regenerator device (68) within the interior chamber (66) of the filter housing (62) and/or extending beyond the axial length of the filter housing, said device containing an outward extending radial flange (71) within the filter housing for seating an essential seal (70), the device also having heat transfer media (72) disposed inside and screens (80) for particulate removal; one compliant gasket (70) positioned next to and above the outward extending radial flange of the fail-safe/regenerator device; and a porous metallic corrosion resistant superalloy type filter element body welded at the bottom of the metal fail-safe/regenerator device.

  14. Statistically-Efficient Filtering in Impulsive Environments: Weighted Myriad Filters

    Directory of Open Access Journals (Sweden)

    Juan G. Gonzalez

    2002-01-01

    Full Text Available Linear filtering theory has been largely motivated by the characteristics of Gaussian signals. In the same manner, the proposed Myriad Filtering methods are motivated by the need for a flexible filter class with high statistical efficiency in non-Gaussian impulsive environments that can appear in practice. Myriad filters have a solid theoretical basis, are inherently more powerful than median filters, and are very general, subsuming traditional linear FIR filters. The foundation of the proposed filtering algorithms lies in the definition of the myriad as a tunable estimator of location derived from the theory of robust statistics. We prove several fundamental properties of this estimator and show its optimality in practical impulsive models such as the α-stable and generalized-t. We then extend the myriad estimation framework to allow the use of weights. In the same way as linear FIR filters become a powerful generalization of the mean filter, filters based on running myriads reach all of their potential when a weighting scheme is utilized. We derive the “normal” equations for the optimal myriad filter, and introduce a suboptimal methodology for filter tuning and design. The strong potential of myriad filtering and estimation in impulsive environments is illustrated with several examples.
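
The myriad estimator described above minimises a sum of logarithmic terms in the linearity parameter K. The sketch below uses a coarse grid search over one common form of the weighted cost; the paper derives the exact "normal" equations instead:

```python
import math

def weighted_myriad(samples, weights, k, grid_steps=4000):
    """Grid-search approximation of the weighted myriad: the location beta
    minimising sum_i log(K^2 + w_i * (x_i - beta)^2).
    (One common form of the cost; names and resolution are illustrative.)"""
    lo, hi = min(samples), max(samples)
    best_beta, best_cost = lo, float("inf")
    for step in range(grid_steps + 1):
        beta = lo + (hi - lo) * step / grid_steps
        cost = sum(math.log(k * k + w * (x - beta) ** 2)
                   for x, w in zip(samples, weights))
        if cost < best_cost:
            best_beta, best_cost = beta, cost
    return best_beta

# An impulsive outlier (1000.0) barely moves the myriad, unlike the mean,
# illustrating robustness in alpha-stable-like environments.
data = [1.1, 0.9, 1.0, 1.2, 1000.0]
w = [1.0] * len(data)
print(weighted_myriad(data, w, k=1.0))
```

As K grows, the myriad approaches the weighted mean (the linear FIR case); as K shrinks, it behaves like a mode-type estimator, which is the tunability the abstract refers to.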

  15. Filter material charging apparatus for filter assembly for radioactive contaminants

    International Nuclear Information System (INIS)

    Goldsmith, J.M.; O'Nan, A. Jr.

    1977-01-01

    A filter charging apparatus for a filter assembly is described. The filter assembly includes a housing with at least one filter bed therein and the filter charging apparatus for adding filter material to the filter assembly includes a tank with an opening therein, the tank opening being disposed in flow communication with opposed first and second conduit means, the first conduit means being in flow communication with the filter assembly housing and the second conduit means being in flow communication with a blower means. Upon activation of the blower means, the blower means pneumatically conveys the filter material from the tank to the filter housing

  16. Development of gel-filter method for high enrichment of low-molecular weight proteins from serum.

    Directory of Open Access Journals (Sweden)

    Lingsheng Chen

Full Text Available The human serum proteome has been extensively screened for biomarkers. However, the large dynamic range of protein concentrations in serum and the presence of highly abundant, large-molecular-weight proteins make it difficult to identify and detect changes in the amount of low-molecular-weight proteins (LMW, molecular weight ≤ 30 kDa). Here, we developed a gel-filter method comprising four layers of tricine SDS-PAGE-based gels of different concentrations to block high-molecular-weight proteins and enrich LMW proteins. By utilizing this method, we identified 1,576 proteins (n = 2) from 10 μL of serum. Among them, 559 (n = 2) belonged to LMW proteins. Furthermore, this gel-filter method could identify 67.4% and 39.8% more LMW proteins than the representative methods of glycine SDS-PAGE and optimized-DS, respectively. By utilizing a SILAC-AQUA approach with labeled recombinant protein as an internal standard, the recovery rate for GST spiked into serum during treatment with the gel-filter, optimized-DS, and ProteoMiner was 33.1 ± 0.01%, 18.7 ± 0.01%, and 9.6 ± 0.03%, respectively. These results demonstrate that the gel-filter method offers a rapid, highly reproducible and efficient approach for screening biomarkers from serum through proteomic analyses.

  17. Development of an optimal filter substrate for the identification of small microplastic particles in food by micro-Raman spectroscopy.

    Science.gov (United States)

    Oßmann, Barbara E; Sarau, George; Schmitt, Sebastian W; Holtmannspötter, Heinrich; Christiansen, Silke H; Dicke, Wilhelm

    2017-06-01

When analysing microplastics in food, it is important for toxicological reasons to achieve clear identification of particles down to a size of at least 1 μm. One reliable optical analytical technique allowing this is micro-Raman spectroscopy. After isolation of particles via filtration, analysis is typically performed directly on the filter surface. In order to obtain high-quality Raman spectra, the material of the membrane filter should not show any interference, in terms of background or Raman signals, during spectrum acquisition. To facilitate automatic particle detection, membrane filters should also show specific optical properties. In this work, besides eight different commercially available membrane filters, three newly designed metal-coated polycarbonate membrane filters were tested against these requirements. We found that aluminium-coated polycarbonate membrane filters had ideal characteristics as a substrate for micro-Raman spectroscopy. Their spectrum shows no or minimal interference with particle spectra, depending on the laser wavelength. Furthermore, automatic particle detection can be applied when analysing the filter surface under dark-field illumination. With this new membrane filter, interference-free analysis of microplastics down to a size of 1 μm becomes possible, and an important size class of these contaminants can now be visualized and spectrally identified. Graphical abstract: A newly developed aluminium-coated polycarbonate membrane filter enables automatic particle detection and the generation of high-quality Raman spectra, allowing identification of small microplastics.

  18. MR image-guided portal verification for brain treatment field

    International Nuclear Information System (INIS)

    Yin Fangfang; Gao Qinghuai; Xie Huchen; Nelson, Diana F.; Yu Yan; Kwok, W. Edmund; Totterman, Saara; Schell, Michael C.; Rubin, Philip

    1998-01-01

Purpose: To investigate a method for the generation of digitally reconstructed radiographs directly from MR images (DRR-MRI) to guide a computerized portal verification procedure. Methods and Materials: Several major steps were developed to perform an MR image-guided portal verification procedure. Initially, a wavelet-based multiresolution adaptive thresholding method was used to segment the skin slice-by-slice in MR brain axial images. Some selected anatomical structures, such as target volume and critical organs, were then manually identified and were reassigned to relatively higher intensities. Interslice information was interpolated with a directional method to achieve comparable display resolution in three dimensions. Next, a ray-tracing method was used to generate a DRR-MRI image at the planned treatment position, the ray tracing being performed simply by summation of voxels along the ray. The skin and its relative positions were also projected onto the DRR-MRI and were used to guide the search for similar features in the portal image. A Canny edge detector was used to enhance the brain contour in both portal and simulation images. The skin in the brain portal image was then extracted using a knowledge-based searching technique. Finally, a Chamfer matching technique was used to correlate features between DRR-MRI and portal image. Results: The MR image-guided portal verification method was evaluated using a brain phantom case and a clinical patient case. Both DRR-CT and DRR-MRI were generated using CT and MR phantom images with the same beam orientation and then compared. The matching result indicated that the maximum deviation of internal structures was less than 1 mm. The segmented results for brain MR slice images indicated that a wavelet-based image segmentation technique provided a reasonable estimation for the brain skin. For the clinical patient case with a given portal field, the MR image-guided verification method provided an excellent match between
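
For an idealised parallel beam aligned with one image axis, the "summation of voxels along the ray" step reduces to a sum over that axis. A toy sketch (clinical DRRs use divergent beams, interpolation and intensity reassignment, none of which is shown):

```python
def drr_parallel(volume):
    """Digitally reconstructed radiograph for a parallel beam: each detector
    pixel is the sum of voxel intensities along its ray (the first axis)."""
    depth, rows, cols = len(volume), len(volume[0]), len(volume[0][0])
    return [[sum(volume[z][r][c] for z in range(depth))
             for c in range(cols)]
            for r in range(rows)]

# 2x2x2 toy volume: a bright 'structure' (value 9) in one corner, standing in
# for an anatomical feature reassigned to a higher intensity.
vol = [[[1, 1], [1, 9]],
       [[1, 1], [1, 9]]]
print(drr_parallel(vol))  # -> [[2, 2], [2, 18]]
```

The bright column dominates its projected pixel, which is the property that lets reassigned structures stand out in the DRR for subsequent edge detection and Chamfer matching.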

  19. Role of IVC Filters in Endovenous Therapy for Deep Venous Thrombosis: The FILTER-PEVI (Filter Implantation to Lower Thromboembolic Risk in Percutaneous Endovenous Intervention) Trial

    International Nuclear Information System (INIS)

    Sharifi, Mohsen; Bay, Curt; Skrocki, Laura; Lawson, David; Mazdeh, Shahnaz

    2012-01-01

Objectives: The purpose of this study was to evaluate the necessity of and recommend indications for inferior vena cava (IVC) filter implantation during percutaneous endovenous intervention (PEVI) for deep venous thrombosis (DVT). Background: PEVI has emerged as a powerful tool in the management of acute proximal DVT. Instrumentation of extensive fresh thrombus is potentially associated with iatrogenic pulmonary embolism (PE). The true frequency of this complication has not been studied in a randomized fashion. We evaluated IVC filter implantation during PEVI for DVT. Methods: A total of 141 patients with symptomatic proximal DVT undergoing PEVI for symptomatic DVT were randomized to receive an IVC filter (70 patients) or no filter (71 patients; control group). The anticoagulation and PEVI regimens were similar between the two groups. Patients who developed symptoms suggestive of PE underwent objective testing for PE. Results: PE developed in 1 of the 14 symptomatic patients in the filter group and 8 of the 22 in the control group (P = 0.048). There was no mortality in either group. Three patients (4.2%) in the control group had transient hemodynamic instability necessitating resuscitative efforts. Predictors of iatrogenic PE were found to be PE at admission; involvement of two or more adjacent venous segments with acute thrombus; inflammatory form of DVT (severe erythema, edema, pain, and induration); and vein diameter of ≥7 mm with preserved architecture. Conclusions: IVC filter implantation during PEVI reduces the risk of iatrogenic PE by eightfold without a mortality benefit. A selective approach may be exercised in filter implantation during PEVI.
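
The eightfold reduction quoted in the conclusions follows from the whole-group event counts (8 PEs among 71 controls versus 1 among 70 filter patients), while the quoted P value comes from the symptomatic subsets. A quick check of the risk ratio:

```python
def relative_risk(events_a, total_a, events_b, total_b):
    """Risk ratio of group A relative to group B."""
    return (events_a / total_a) / (events_b / total_b)

# Trial figures: 8 iatrogenic PEs in 71 controls (no filter) vs 1 in 70
# filter patients.
rr = relative_risk(8, 71, 1, 70)
print(round(rr, 1))  # -> 7.9, i.e. roughly the eightfold reduction reported
```

With a single event in the filter arm the ratio carries a wide confidence interval, which is consistent with the authors recommending a selective rather than universal filter strategy.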

  20. Role of IVC Filters in Endovenous Therapy for Deep Venous Thrombosis: The FILTER-PEVI (Filter Implantation to Lower Thromboembolic Risk in Percutaneous Endovenous Intervention) Trial

    Energy Technology Data Exchange (ETDEWEB)

    Sharifi, Mohsen, E-mail: seyedmohsensharifi@yahoo.com [Arizona Cardiovascular Consultants (United States); Bay, Curt [A.T. Still University, Arizona School of Health Sciences (United States); Skrocki, Laura; Lawson, David; Mazdeh, Shahnaz [Arizona Cardiovascular Consultants (United States)

    2012-12-15

Objectives: The purpose of this study was to evaluate the necessity of and recommend indications for inferior vena cava (IVC) filter implantation during percutaneous endovenous intervention (PEVI) for deep venous thrombosis (DVT). Background: PEVI has emerged as a powerful tool in the management of acute proximal DVT. Instrumentation of extensive fresh thrombus is potentially associated with iatrogenic pulmonary embolism (PE). The true frequency of this complication has not been studied in a randomized fashion. We evaluated IVC filter implantation during PEVI for DVT. Methods: A total of 141 patients with symptomatic proximal DVT undergoing PEVI for symptomatic DVT were randomized to receive an IVC filter (70 patients) or no filter (71 patients; control group). The anticoagulation and PEVI regimens were similar between the two groups. Patients who developed symptoms suggestive of PE underwent objective testing for PE. Results: PE developed in 1 of the 14 symptomatic patients in the filter group and 8 of the 22 in the control group (P = 0.048). There was no mortality in either group. Three patients (4.2%) in the control group had transient hemodynamic instability necessitating resuscitative efforts. Predictors of iatrogenic PE were found to be PE at admission; involvement of two or more adjacent venous segments with acute thrombus; inflammatory form of DVT (severe erythema, edema, pain, and induration); and vein diameter of ≥7 mm with preserved architecture. Conclusions: IVC filter implantation during PEVI reduces the risk of iatrogenic PE by eightfold without a mortality benefit. A selective approach may be exercised in filter implantation during PEVI.

  1. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  2. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  3. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept centres on automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis

  4. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach.
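
The quoted confidence limits suggest a simple acceptance check of the following form. The 3% / 6 cGy figures are taken from the abstract; the function name and pass/fail logic (pass if either limit is met) are our illustrative reading, not the authors' exact criterion:

```python
def within_tolerance(calc_dose_cgy, ref_dose_cgy, prescribed_dose_cgy,
                     pct_limit=3.0, abs_limit_cgy=6.0):
    """Pass if the independent calculation agrees with the reference within
    either the percentage limit (relative to the prescribed dose) or the
    absolute limit, mirroring the 3% / 6 cGy confidence limit quoted above."""
    diff = abs(calc_dose_cgy - ref_dose_cgy)
    return (diff / prescribed_dose_cgy * 100.0 <= pct_limit
            or diff <= abs_limit_cgy)

print(within_tolerance(196.0, 200.0, 200.0))  # 2% and 4 cGy  -> True
print(within_tolerance(185.0, 200.0, 200.0))  # 7.5% / 15 cGy -> False
```

For off-axis or low-dose points the abstract's relaxed limits would simply be passed in as `pct_limit=5.0, abs_limit_cgy=10.0`.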

  5. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables

  6. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  7. Metal fuel development and verification for prototype generation- IV Sodium- Cooled Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chan Bock; Cheon, Jin Sik; Kim, Sung Ho; Park, Jeong Yong; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U-Zr fuel is a driver for the initial core of the PGSFR, and U-transuranics (TRU)-Zr fuel will gradually replace U-Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U-Zr fuel, work on U-Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U-TRU-Zr fuel uses TRU recovered through pyroelectrochemical processing of spent PWR (pressurized water reactor) fuels, which contains highly radioactive minor actinides and chemically active lanthanide or rare earth elements as carryover impurities. An advanced fuel slug casting system, which can prevent vaporization of volatile elements through a control of the atmospheric pressure of the casting chamber and also deal with chemically active lanthanide elements using protective coatings in the casting crucible, was developed. Fuel cladding of the ferritic-martensitic steel FC92, which has higher mechanical strength at a high temperature than conventional HT9 cladding, was developed and fabricated, and is being irradiated in the fast reactor.

  8. Metal Fuel Development and Verification for Prototype Generation IV Sodium-Cooled Fast Reactor

    Directory of Open Access Journals (Sweden)

    Chan Bock Lee

    2016-10-01

    Full Text Available Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U–Zr fuel is a driver for the initial core of the PGSFR, and U–transuranics (TRU)–Zr fuel will gradually replace U–Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U–Zr fuel, work on U–Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U–TRU–Zr fuel uses TRU recovered through pyroelectrochemical processing of spent PWR (pressurized water reactor) fuels, which contains highly radioactive minor actinides and chemically active lanthanide or rare earth elements as carryover impurities. An advanced fuel slug casting system, which can prevent vaporization of volatile elements through a control of the atmospheric pressure of the casting chamber and also deal with chemically active lanthanide elements using protective coatings in the casting crucible, was developed. Fuel cladding of the ferritic–martensitic steel FC92, which has higher mechanical strength at a high temperature than conventional HT9 cladding, was developed and fabricated, and is being irradiated in the fast reactor.

  9. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    International Nuclear Information System (INIS)

    Joseph, Shijo; Sunderlin, William D; Verchot, Louis V; Herold, Martin

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed. (letter)

  10. Optimal search filters for renal information in EMBASE.

    Science.gov (United States)

    Iansavichus, Arthur V; Haynes, R Brian; Shariff, Salimah Z; Weir, Matthew; Wilczynski, Nancy L; McKibbon, Ann; Rehman, Faisal; Garg, Amit X

    2010-07-01

    EMBASE is a popular database used to retrieve biomedical information. Our objective was to develop and test search filters to help clinicians and researchers efficiently retrieve articles with renal information in EMBASE. We used a diagnostic test assessment framework because filters operate similarly to screening tests. We divided a sample of 5,302 articles from 39 journals into development and validation sets of articles. Information retrieval properties were assessed by treating each search filter as a "diagnostic test" or screening procedure for the detection of relevant articles. We tested the performance of 1,936,799 search filters made of unique renal terms and their combinations. The reference standard was manual review of each article. We calculated the sensitivity and specificity of each filter to identify articles with renal information. The best renal filters consisted of multiple search terms, such as "renal replacement therapy," "renal," "kidney disease," and "proteinuria," and the truncated terms "kidney," "dialy," "neph," "glomerul," and "hemodial." These filters achieved peak sensitivities of 98.7% (95% CI, 97.9-99.6) and specificities of 98.5% (95% CI, 98.0-99.0). The retrieval performance of these filters remained excellent in the validation set of independent articles. The retrieval performance of any search will vary depending on the quality of all search concepts used, not just renal terms. We empirically developed and validated high-performance renal search filters for EMBASE. These filters can be programmed into the search engine or used on their own to improve the efficiency of searching.
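The diagnostic-test framing above reduces to the usual 2×2 retrieval counts. A minimal sketch with toy data (not the study's corpus of 5,302 articles):

```python
# Treat a search filter as a screening test: relevant articles it retrieves
# are true positives, irrelevant ones it retrieves are false positives.
def filter_performance(retrieved, relevant, corpus):
    tp = len(retrieved & relevant)
    fn = len(relevant - retrieved)
    fp = len(retrieved - relevant)
    tn = len(corpus - retrieved - relevant)
    sensitivity = tp / (tp + fn)          # share of relevant articles found
    specificity = tn / (tn + fp)          # share of irrelevant articles rejected
    return sensitivity, specificity

corpus = set(range(100))                   # 100 articles in total
relevant = set(range(20))                  # 20 contain renal information
retrieved = set(range(18)) | {50, 51}      # filter finds 18, plus 2 false hits
sens, spec = filter_performance(retrieved, relevant, corpus)
print(sens, spec)  # 0.9 0.975
```

In the study, this pair of numbers was computed for every candidate filter on the development set, and the best filters were then re-checked on the held-out validation set.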

  11. Development and testing the modular fireproof fine filters on the basis of glass paper

    International Nuclear Information System (INIS)

    Rovnyj, S.I.; Glagolenko, Yu.V.; Pyatin, N.P.; Tranchuk, O.A.; Maksimov, V.E.; Afanas'eva, E.V.

    2006-01-01

    This paper describes a procedure for fabricating modified modular fine filters of glass paper (14 models) for trapping radioactive substances. The filters are made of glass paper, which ensures their fire resistance. The paper also describes the service-life testing procedure for the designed filters and an efficient procedure for extracting valuable components from the spent filters [ru]

  12. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  13. Development of a predictive model for 6 month survival in patients with venous thromboembolism and solid malignancy requiring IVC filter placement.

    Science.gov (United States)

    Huang, Steven Y; Odisio, Bruno C; Sabir, Sharjeel H; Ensor, Joe E; Niekamp, Andrew S; Huynh, Tam T; Kroll, Michael; Gupta, Sanjay

    2017-07-01

    Our purpose was to develop a predictive model for short-term survival (i.e., <6 months) following filter placement in patients with venous thromboembolism (VTE) and solid malignancy. Clinical and laboratory parameters were retrospectively reviewed for patients with solid malignancy who received a filter between January 2009 and December 2011 at a tertiary care cancer center. Multivariate Cox proportional hazards modeling was used to assess variables associated with 6 month survival following filter placement in patients with VTE and solid malignancy. Significant variables were used to generate a predictive model. 397 patients with solid malignancy received a filter during the study period. Three variables were associated with 6 month survival, including serum albumin [hazard ratio (HR) 0.496]; 6 month survival following filter placement can be predicted from these three patient variables. Our predictive model could be used to help physicians decide whether a permanent or retrievable filter may be more appropriate as well as to assess the risks and benefits of filter retrieval within the context of survival longevity in patients with cancer.

  14. Hot-Gas Filter Ash Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Swanson, M.L.; Hurley, J.P.; Dockter, B.A.; O'Keefe, C.A.

    1997-07-01

    Large-scale hot-gas filter testing over the past 10 years has revealed numerous cases of cake buildup on filter elements that has been difficult, if not impossible, to remove. At times, the cake can blind or bridge between candle filters, leading to filter failure. Physical factors, including particle-size distribution, particle shape, the aerodynamics of deposition, and system temperature, contribute to the difficulty in removing the cake, but chemical factors such as surface composition and gas-solid reactions also play roles in helping to bond the ash to the filters or to itself. This project is designed to perform the research necessary to determine the fuel-, sorbent-, and operations-related conditions that lead to blinding or bridging of hot-gas particle filters. The objectives of the project are threefold: (1) determine the mechanisms by which a difficult-to-clean ash is formed and how it bridges hot-gas filters; (2) develop a method to determine the rate of bridging based on analyses of the feed coal and sorbent, filter properties, and system operating conditions; and (3) suggest and test ways to prevent filter bridging.

  15. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun

    1999-01-01

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for system modeling and the Prototype Verification System (PVS) for mathematical verification. To support the flow from modeling in CPN to mathematical proof in PVS, an information extractor for CPN models has been developed in this work, together with a translator that converts the extracted information into the PVS specification language. Both the information extractor and the translator are programmed in ML, a higher-order functional language. This combined method has been applied to a protection system function of Wolsung NPP SDS2 (Steam Generator Low Level Trip). As a result of this application, the completeness and consistency of the requirement could be proven logically. In short, through this work an axiom- or lemma-based analysis method for CPN models is newly suggested in order to complement CPN analysis methods, and a guideline for the use of formal methods is proposed in order to apply them to NPP software verification and validation. (author). 9 refs., 15 figs

  16. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling...... and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized...... search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...
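The cooperating branch-and-bound behaviour described in the abstract can be sketched in miniature: several randomized searches share the best cost found so far and prune branches that cannot beat it. This is an illustration of the principle only, not Uppaal's implementation; the graph and function names are made up.

```python
import random

def swarm_search(graph, start, goal, seeds):
    """Find the minimal-cost path cost from start to goal in an acyclic
    weighted graph, using one randomized depth-first 'worker' per seed."""
    best = {"cost": float("inf")}          # incumbent shared by all workers

    def dfs(node, cost, rng):
        if cost >= best["cost"]:           # branch-and-bound pruning
            return
        if node == goal:
            best["cost"] = cost            # exchange an improved bound
            return
        edges = list(graph.get(node, []))
        rng.shuffle(edges)                 # randomized search strategy
        for nxt, weight in edges:
            dfs(nxt, cost + weight, rng)

    for seed in seeds:                     # sequential stand-in for a cluster
        dfs(start, 0, random.Random(seed))
    return best["cost"]

graph = {"a": [("b", 2), ("c", 5)], "b": [("goal", 9), ("c", 1)],
         "c": [("goal", 3)]}
print(swarm_search(graph, "a", "goal", seeds=[0, 1, 2, 3]))  # 6
```

In a real swarm the workers run concurrently and exchange incumbent costs over the cluster; here the seeds simply vary the exploration order, and the shared bound prunes later workers' searches.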

  17. Experimental investigation of in situ cleanable HEPA filter

    International Nuclear Information System (INIS)

    Adamson, D.J.

    1999-01-01

    The Westinghouse Savannah River Company, located at the Savannah River Site (SRS) in Aiken, South Carolina, is currently testing the feasibility of developing an in situ cleanable high efficiency particulate air (HEPA) filter system. Sintered metal filters are being tested for regenerability or cleanability under simulated conditions found in a high level waste (HLW) tank ventilation system. The filters are being challenged using materials found in HLW tanks: HLW simulated salt, HLW simulated sludge, and South Carolina road dust. Various cleaning solutions have been used to clean the filters in situ. The tanks are equipped with a ventilation system to maintain the tank contents at negative pressure to prevent the release of radioactive material to the environment. This system is equipped with conventional disposable glass-fiber HEPA filter cartridges. Removal and disposal of these filters is not only costly, but subjects site personnel to radiation exposure and possible contamination. A test apparatus was designed to simulate the ventilation system of an HLW tank with an in situ cleaning system. Test results indicate that the Mott sintered metal HEPA filter is suitable as an in situ cleanable or regenerable HEPA filter. Data indicate that high humidity or water did not affect filter performance, and the sintered metal HEPA filter was easily cleaned numerous times back to new-filter performance by an in situ spray system. The test apparatus allows the cleaning of the soiled HEPA filters to be accomplished without removing the filters from the process. This innovative system would eliminate personnel radiation exposure associated with removal of contaminated filters and the high costs of filter replacement and disposal. The results of these investigations indicate that an in situ cleanable HEPA filter system for radioactive and commercial use could be developed and manufactured.

  18. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    Seismic information systems are of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method for constructing function models and the IDEF1x method for constructing information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  19. Optional inferior vena caval filters: where are we now?

    LENUS (Irish Health Repository)

    Keeling, A N

    2008-08-01

    With the advent of newer optional/retrievable inferior vena caval filters, there has been a rise in the number of filters inserted globally. This review article examines the currently available approved optional filter models, outlines the clinical indications for filter insertion and examines the expanding indications. Additionally, the available evidence behind the use of optional filters is reviewed, the issue of anticoagulation is discussed and possible future filter developments are considered.

  20. Development of exhaust air filters for reprocessing plants

    International Nuclear Information System (INIS)

    Furrer, J.; Kaempffer, R.; Jannakos, K.; Apenberg, W.

    1975-01-01

    Investigations of the iodine loading capacity of highly impregnated iodine sorption material (AC 6120/H₁) for the GWA filters (GWA: reprocessing plant for 1,500 metric tons of uranium per year) have been continued for low NO₂ contents of the simulated dissolver off-gas from GWA. When fully loading AC 6120/H₁, a conversion of about 80% of the Ag⁺ of the impregnation to silver iodide was reached in experiments with 1% NO₂ in the carrier gas. Despite the consumption of a substantial portion of the impregnation, removal efficiencies > 99.99% were measured for a bed depth corresponding to a GWA filter stage. The test facility allowing examination of the behavior and capacity of the AC 6120/H₁ iodine sorption material under actual conditions at the SAP Marcoule reprocessing plant has been completed except for installation in the reprocessing plant. (orig.) [de]

  1. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained. - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities; defining a validated domain (a domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available. - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors; - the numerical solver correctly represents the physical mathematical model; - equations are solved correctly. The functional verification can be demonstrated through certification or a report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  2. Building a Simulated Environment for the Study of Multilateral Approaches to Nuclear Materials Verification

    International Nuclear Information System (INIS)

    Moul, R.; Persbo, A.; Keir, D.

    2015-01-01

    Verification research can be resource-intensive, particularly when it relies on practical or field exercises. These exercises can also involve substantial logistical preparations and are difficult to run in an iterative manner to produce data sets that can later be utilized in verification research. This paper presents the conceptual framework, methodology and preliminary findings from part of a multi-year research project led by VERTIC. The multi-component simulated environment that we have generated, using existing computer models for nuclear reactors and other components of fuel cycles, can be used to investigate options for future multilateral nuclear verification at a variety of locations and time points in a nuclear complex. We have constructed detailed fuel cycle simulations for two fictional, and very different, states. In addition to these mass-flow models, a 3-dimensional, avatar-based simulation of a nuclear facility is under development. We have also developed accompanying scenarios that provide the legal and procedural assumptions governing the process of our fictional verification solutions. These tools have all been produced using open source information and software. While these tools are valuable for research purposes, they can also play an important role in support of training and education in the field of nuclear materials verification, in a variety of settings and circumstances. (author)

  3. z-transform DFT filters and FFT's

    DEFF Research Database (Denmark)

    Bruun, G.

    1978-01-01

    The paper shows how discrete Fourier transformation can be implemented as a filter bank in a way which reduces the number of filter coefficients. A particular implementation of such a filter bank is directly related to the normal complex FFT algorithm. The principle developed further leads to types...... of DFT filter banks which utilize a minimum of complex coefficients. These implementations lead to new forms of FFT's, among which is a cos/sin FFT for a real signal which employs only real coefficients. The new FFT algorithms use only half as many real multiplications as does the classical FFT.
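Bruun's construction itself is involved, but the underlying idea, evaluating DFT outputs with recursive filters whose coefficients are real, can be illustrated with the closely related Goertzel filter, whose only in-loop multiplier is the real number 2·cos(2πk/N). This sketch shows the filter view of the DFT; it is not Bruun's algorithm.

```python
import cmath
import math

def goertzel(x, k):
    """Evaluate DFT bin k of x with a second-order recursive filter.
    The loop uses a single real coefficient; one complex multiply
    remains as a final correction step."""
    N = len(x)
    w = 2.0 * math.pi * k / N
    coeff = 2.0 * math.cos(w)              # the only in-loop multiplier (real)
    s_prev, s_prev2 = 0.0, 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return cmath.exp(1j * w) * s_prev - s_prev2

x = [0.0, 1.0, 2.0, 3.0]
direct = sum(x[n] * cmath.exp(-2j * math.pi * n / 4) for n in range(4))
print(abs(goertzel(x, 1) - direct) < 1e-9)  # True
```

Running one such filter per output bin is exactly a DFT filter bank; Bruun's contribution is a factorization that shares work between the bins so the whole bank needs far fewer real multiplications.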

  4. Very fast road database verification using textured 3D city models obtained from airborne imagery

    Science.gov (United States)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of roads. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on the time issue and availability of a geo-database for buildings, the urban terrain reconstruction procedure has semantic models
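The fusion step described above, mapping the two per-method distributions onto the three states correct/incorrect/unknown, can be sketched as a simple Dempster-Shafer mass assignment. Assigning the "not applicable" probability to the ignorance mass is a plausible reading of the abstract, not the authors' published implementation.

```python
def fuse(p_correct, p_applicable):
    """Map (object-state, model-applicability) probabilities to
    Dempster-Shafer masses on {correct, incorrect, unknown}: where the
    road model does not apply, its verdict carries no weight and the
    mass goes to 'unknown'."""
    return {
        "correct": p_correct * p_applicable,
        "incorrect": (1.0 - p_correct) * p_applicable,
        "unknown": 1.0 - p_applicable,      # ignorance: model says nothing
    }

verdict = fuse(p_correct=0.9, p_applicable=0.8)
print({k: round(v, 2) for k, v in verdict.items()})
# {'correct': 0.72, 'incorrect': 0.08, 'unknown': 0.2}
```

The three masses always sum to one, so a downstream operator can simply flag "unknown" road objects for manual inspection while accepting or rejecting the rest.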

  5. Utilization of plastics as transparent x-ray filter

    International Nuclear Information System (INIS)

    Masuda, Yathuhiko; Inui, Saburo; Kooda, Kazunao; Takiguchi, Kiyomi; Abe, Yoshinobu.

    1980-01-01

    An attempt has been made to develop transparent plastic filters containing heavy atoms that are identical to conventional aluminum or copper filters in their X-ray attenuating properties. These transparent filters can be fixed at the front of a conventional multilayer collimator without obstructing the optical indication of the X-ray field size. The recent increase in the diagnostic use of X-rays, and thus in patient exposure, has become a serious concern. To reduce such patient exposure, the ICRP has recommended the proper use of aluminum or copper filters matched to the applied tube potential. These filters are generally fixed at the X-ray tube window or used at the front of a multilayer collimator as added filters. In the former case, exchanging filters to select the one best suited to the applied tube potential is laborious; in the latter, the use of added filters also requires the field size to be confirmed before each radiograph. These inconveniences have at times resulted in improper use of the filters, even though proper filter selection is known to reduce patient exposure. The problem of reducing patient exposure by means of filtration therefore remains practically unsolved. To offer practical added filters free of the above-mentioned disadvantages of metal filters, we developed transparent added filters: transparent plastics loaded with heavy atoms so that their X-ray attenuating properties match those of aluminum or copper. (author)

  6. Wavelet-based verification of the quantitative precipitation forecast

    Science.gov (United States)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using the two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localizations and associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by the two indices for the scale and the localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially-localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object" oriented verification methods, as the latter tend to represent strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further developments of the wavelet-based methods, especially towards a goal of identifying a weak physical process contributing to forecast error, are also pointed out.
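The scale-and-location indexing described above can be demonstrated with the simplest wavelet, the Haar transform. The sketch below (toy 1-D "rain" fields of power-of-two length; not the COAMPS verification code) scores a forecast against observations scale by scale:

```python
def haar_levels(signal):
    """Multilevel 1-D Haar transform (length must be a power of two).
    Each detail coefficient is indexed by scale (list level) and by
    position within the level, i.e. localized in scale and space."""
    levels, approx = [], list(signal)
    while len(approx) > 1:
        pairs = [(approx[i], approx[i + 1]) for i in range(0, len(approx), 2)]
        levels.append([(a - b) / 2 for a, b in pairs])   # details at this scale
        approx = [(a + b) / 2 for a, b in pairs]         # coarser approximation
    return levels, approx[0]

def per_scale_error(forecast, observed):
    """Squared error of the detail coefficients, reported per scale."""
    f_levels, _ = haar_levels(forecast)
    o_levels, _ = haar_levels(observed)
    return [sum((f - o) ** 2 for f, o in zip(fd, od))
            for fd, od in zip(f_levels, o_levels)]

obs  = [0, 0, 4, 8, 8, 4, 0, 0]
fcst = [0, 0, 0, 4, 8, 8, 4, 0]   # same rain cell, displaced one grid cell
print(per_scale_error(fcst, obs))  # [8.0, 4.0, 4.0]
```

The displaced-but-otherwise-correct rain cell produces its largest error at the finest scale, which is exactly the kind of diagnosis that pointwise scores blur together.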

  7. Societal Verification: Intellectual Game or International Game-Changer

    International Nuclear Information System (INIS)

    Hartigan, Kelsey; Hinderstein, Corey

    2013-01-01

    Within the nuclear nonproliferation and arms control field, there is an increasing appreciation for the potential of open source information technologies to supplement existing verification and compliance regimes. While clearly not a substitute for on-site inspections or national technical means, it may be possible to better leverage information gleaned from commercial satellite imagery, international trade records and the vast amount of data being exchanged online and between publics (including social media) so as to develop a more comprehensive set of tools and practices for monitoring and verifying a state’s nuclear activities and helping judge compliance with international obligations. The next generation “toolkit” for monitoring and verifying items, facility operations and activities will likely include a more diverse set of analytical tools and technologies than are currently used internationally. To explore these and other issues, the Nuclear Threat Initiative has launched an effort that examines, in part, the role that emerging technologies and “citizen scientists” might play in future verification regimes. This paper will include an assessment of past proliferation and security “events” and whether emerging tools and technologies would have provided indicators concurrently or in advance of these actions. Such case studies will be instrumental in understanding the reliability of these technologies and practices and in thinking through the requirements of a 21st century verification regime. Keywords: Verification, social media, open-source information, arms control, disarmament.

  8. Anti-Aliasing filter for reverse-time migration

    KAUST Repository

    Zhan, Ge

    2012-01-01

We develop an anti-aliasing filter for reverse-time migration (RTM). It is similar to the traditional anti-aliasing filter used for Kirchhoff migration in that it low-pass filters the migration operator so that the dominant wavelength in the operator is greater than twice the trace sampling interval, except that it is applied to both primary and multiple reflection events. Instead of applying this filter to the data, as in the traditional RTM operation, we apply the anti-aliasing filter to the generalized diffraction-stack migration operator. This gives the same migration image as computed by anti-aliased RTM.
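The wavelength criterion above can be sketched numerically: the dominant wavelength v/f must exceed twice the trace spacing, which caps the usable temporal frequency. The hard spectral cutoff and all parameter values below are illustrative assumptions, not the paper's implementation (a production filter would use a tapered roll-off to avoid ringing).

```python
import numpy as np

def antialias_cutoff(velocity, trace_dx):
    """Highest temporal frequency whose wavelength velocity/f still
    exceeds two trace sampling intervals: v/f > 2*dx  =>  f < v/(2*dx)."""
    return velocity / (2.0 * trace_dx)

def lowpass(trace, dt, fmax):
    """Hard spectral cutoff above fmax (illustration only)."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    spec[freqs > fmax] = 0.0
    return np.fft.irfft(spec, n=len(trace))

# e.g. 2000 m/s medium velocity and 10 m trace spacing give a 100 Hz cap
fmax = antialias_cutoff(velocity=2000.0, trace_dx=10.0)
```

Energy above the cap is removed entirely, so a 150 Hz component in a trace sampled at 1 ms would be suppressed before migration.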

  9. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  10. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  11. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  12. Development and evaluation of a HEPA filter for increased strength and resistance to elevated temperature

    International Nuclear Information System (INIS)

    Gilbert, H.; Bergman, W.; Fretthold, J.K.

    1993-01-01

We have completed a preliminary study of an improved HEPA filter with increased strength and resistance to elevated temperature, intended to improve the reliability of the standard deep-pleated HEPA filter under accident conditions. The improvements consist of a silicone rubber sealant and a new HEPA medium reinforced with a glass cloth. Three prototype filters were built and evaluated for temperature and pressure resistance and resistance to rough handling. The temperature resistance test consisted of exposing the HEPA filter to 1,000 scfm (1,700 m³/hr) at 700 degrees F (371 degrees C) for five minutes. The pressure resistance test consisted of exposing the HEPA filter to a differential pressure of 10 in. w.g. (2.5 kPa) using a water-saturated air flow at 95 degrees F (35 degrees C). For the rough handling test, we used a vibrating machine designated the Q110. DOP filter efficiency tests were performed before and after each of the environmental tests. In addition to following the standard practice of using a separate new filter for each environmental test, we also subjected the same filter to the elevated temperature test followed by the pressure resistance test. The efficiency test results show that the improved HEPA filter is significantly better than the standard HEPA filter. Further studies are recommended to evaluate the improved HEPA filter and to assess its performance under more severe accident conditions.

  13. DEVELOPMENT OF ENRICHMENT VERIFICATION ASSAY BASED ON THE AGE AND 235U AND 238U ACTIVITIES OF THE SAMPLES

    International Nuclear Information System (INIS)

    AL-YAMAHI, H.; EL-MONGY, S.A.

    2008-01-01

Development of enrichment verification methods is the backbone of the nuclear materials safeguards skeleton. In this study, the 235U percentage of depleted, natural and very slightly enriched uranium samples was estimated based on the sample age and the measured activities of 235U and 238U. HpGe and NaI spectrometry were used for sample assay. An equation was derived to correlate the sample age and the 235U and 238U activities with the enrichment percentage (E%). The E% values calculated by the deduced equation agreed with the target E% values to within a bias of 0.58-1.75% in the case of HpGe measurements, and the correlation between them was very strong. The activity was also calculated from the measured sample count rate and the detection efficiency at the gamma energies of interest. The correlation between E% and the 235U activity was found to be strongly linear. The results obtained by NaI were less accurate than those obtained by HpGe; the bias in the case of NaI assay ranged from 6.398% to 22.8% for E% verification.
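The basic activity-to-mass step underlying such an assay can be sketched as below. This is a generic illustration using A = λN and standard half-lives; the authors' derived equation additionally involves the sample age (daughter ingrowth), which is omitted here, and all function names are hypothetical.

```python
import math

AVOGADRO = 6.02214076e23
SECONDS_PER_YEAR = 3.1557e7
HALF_LIFE_U235 = 7.04e8 * SECONDS_PER_YEAR    # s
HALF_LIFE_U238 = 4.468e9 * SECONDS_PER_YEAR   # s

def mass_from_activity(activity_bq, half_life_s, molar_mass_g):
    """A = lambda * N, so N = A / lambda; convert atoms to grams."""
    decay_const = math.log(2.0) / half_life_s
    atoms = activity_bq / decay_const
    return atoms * molar_mass_g / AVOGADRO

def enrichment_percent(a235_bq, a238_bq):
    """Mass-based enrichment from the two measured activities."""
    m235 = mass_from_activity(a235_bq, HALF_LIFE_U235, 235.044)
    m238 = mass_from_activity(a238_bq, HALF_LIFE_U238, 238.051)
    return 100.0 * m235 / (m235 + m238)
```

Feeding in activities consistent with natural uranium (about 0.711 g of 235U per 100 g of uranium) should recover an enrichment near 0.711 wt%.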

  14. The ''Nuclear-Karlsruhe'' air-filter system

    International Nuclear Information System (INIS)

    Berliner, P.; Ohlmeyer, M.; Stotz, W.

    1976-01-01

Increasing requirements for exhaust-air filter systems used in nuclear facilities induced the Gesellschaft fuer Kernforschung to develop the ''Nuclear-Karlsruhe'' HEPA filter system. This novel development has profited from experience gained in previous incidents as well as from maintenance and decontamination work performed with different HEPA filter systems. The proven ''Nuclear-Karlsruhe'' system takes equally into account the demands for optimum safety, maximum efficiency and economy, and is distinguished by the following features: (1) The air current is deflected by 180° in the casing. Deflection brings quite a number of improvements, results in a substantial reduction of space requirements, and avoids the dispersion of pollutants to the clean-air side. Besides, the HEPA filter is protected from damage by condensed particles or entrained foreign materials; (2) The ''Nuclear-Karlsruhe'' system allows gas-tight filter replacement. Special replacement collars have been provided at the casing, which allow the tight fastening of self-locking replacement bags; (3) In-place testing in the operating condition can be carried out very conveniently because the air is deflected. Minute leaks in the filter medium or in the filter gasket can be detected by the high-sensitivity visual oil-thread test, which makes leaks distinctly visible as oil-mist threads through a transparent front window provided on the clean-air side. The test takes only a few minutes and its sensitivity is hardly matched by any other technique; (4) The clamping mechanism is installed outside the casing, i.e. outside the polluted or aggressive media. The contact force is spring-loaded absolutely uniformly onto the circular filter gasket; (5) For practical and economic reasons the filter casings can be locked individually so as to be gas-tight; (6) The entire system is made of stainless or coated steel and metal parts which are corrosion- and fire-resistant. (author)

  15. Conjugated Molecules for the Smart Filtering of Intense Radiations

    Directory of Open Access Journals (Sweden)

    Danilo Dini

    2003-04-01

Full Text Available Abstract: The practical realization of smart optical filters, i.e. devices which change their optical transmission in a suitable way to keep a working state for a general light-sensitive element, can involve the use of conjugated molecules whose light absorption properties are light-intensity dependent (a nonlinear optical effect). The verification of optical limiting displayed by some particular conjugated molecules, e.g. phthalocyanines, is quite noteworthy and can be successfully exploited for the realization of such smart optical devices. In the present contribution the relevant molecular features of a phthalocyanine are analyzed with the aim of determining useful correlations between optical limiting performance and phthalocyanine chemical structure. In particular, the electronic nature of the substituent is considered as a key factor in the explanation of some observed optical limiting trends.
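A minimal sketch of the mechanism behind optical limiting: with an absorption coefficient that grows with intensity (β > 0), transmittance falls as input intensity rises, which is exactly the "smart filter" behavior described above. This is a generic two-term absorption model with illustrative parameters, not the phthalocyanine photophysics of the paper.

```python
def transmitted_intensity(i_in, alpha0, beta, length, steps=100000):
    """Euler integration of dI/dz = -(alpha0 + beta*I) * I through
    a sample of the given length (arbitrary units)."""
    dz = length / steps
    i = i_in
    for _ in range(steps):
        i -= (alpha0 + beta * i) * i * dz
    return i

def transmittance(i_in, alpha0=0.0, beta=1.0, length=1.0):
    """Output/input intensity ratio; decreases with i_in when beta > 0."""
    return transmitted_intensity(i_in, alpha0, beta, length) / i_in
```

With alpha0 = 0 the model has the closed form I(L) = I0 / (1 + beta·L·I0), so the numerical integration can be checked against it.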

  16. ESTRO ACROP guidelines for positioning, immobilisation and position verification of head and neck patients for radiation therapists

    Directory of Open Access Journals (Sweden)

    Michelle Leech

    2017-03-01

Full Text Available Background and purpose: Over the last decade, the management of locally advanced head and neck cancers (HNCs) has seen a substantial increase in the use of chemoradiation. These guidelines have been developed to assist Radiation TherapisTs (RTTs) in positioning, immobilisation and position verification for head and neck cancer patients. Materials and methods: A critical review of the literature was undertaken by the writing committee. Based on the literature review, a survey was developed to ascertain the current positioning, immobilisation and position verification methods for head and neck radiation therapy across Europe. The survey was translated into Italian, German, Greek, Portuguese, Russian, Croatian, French and Spanish. Guidelines were subsequently developed by the writing committee. Results: Results from the survey indicated that a wide variety of treatment practices and treatment verification protocols are currently in operation for head and neck cancer patients across Europe. The guidelines developed are based on the experience and expertise of the writing committee, remaining cognisant of the variations in imaging and immobilisation techniques currently used in Europe. Conclusions: These guidelines have been developed to provide RTTs with guidance on positioning, immobilisation and position verification of HNC patients. The guidelines will also provide RTTs with the means to critically reflect on their own daily clinical practice with this patient group. Keywords: Head and neck, Immobilisation, Positioning, Verification

  17. A Performance Comparison Between Extended Kalman Filter and Unscented Kalman Filter in Power System Dynamic State Estimation

    DEFF Research Database (Denmark)

    Khazraj, Hesam; Silva, Filipe Miguel Faria da; Bak, Claus Leth

    2016-01-01

Dynamic State Estimation (DSE) is a critical tool for analysis, monitoring and planning of a power system. The concept of DSE involves designing state estimation with Extended Kalman Filter (EKF) or Unscented Kalman Filter (UKF) methods, which can be used by wide area monitoring to improve... A non-linear state estimator is developed in MatLab to solve for the states by applying the unscented Kalman filter (UKF) and extended Kalman filter (EKF) algorithms. Finally, a DSE model is built for a 14-bus power system network to evaluate the proposed algorithms. This article will focus on comparing...
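The EKF/UKF distinction this abstract compares can be illustrated with a scalar unscented transform: propagating N(2, 1) through f(x) = x², first-order linearization (the EKF approach) misses the variance contribution to the propagated mean, while sigma points (the UKF approach) recover it exactly for this quadratic. This is a toy sketch, not the paper's 14-bus estimator; all names and parameter choices are illustrative.

```python
import math

def linearized_mean(f, mu):
    """EKF-style first-order propagation: the mean is just f(mu)."""
    return f(mu)

def unscented_mean(f, mu, var, alpha=1.0, kappa=2.0):
    """Scalar unscented transform: weighted sigma points through f."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)
    points = [mu, mu + spread, mu - spread]
    weights = [lam / (n + lam)] + [1.0 / (2.0 * (n + lam))] * 2
    return sum(w * f(p) for w, p in zip(weights, points))

f = lambda x: x * x
# For x ~ N(2, 1) the true E[x^2] is mu^2 + var = 5; linearization gives 4.
```

The same mean-shift error is what degrades EKF accuracy on strongly nonlinear power-flow equations, motivating the comparison in the paper.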

  18. Removing Pathogens Using Nano-Ceramic-Fiber Filters

    Science.gov (United States)

    Tepper, Frederick; Kaledin, Leonid

    2005-01-01

A nano-aluminum-oxide fiber only 2 nanometers in diameter was used to develop a ceramic-fiber filter. The fibers are electropositive and, when formulated into a filter material (NanoCeram(TradeMark)), attract electronegative particles such as bacteria and viruses. The ability to detect and then remove viruses as well as bacteria is of concern in space cabins, since they may be carried onboard by space crews. Moreover, an improved filter was desired that would polish the effluent from condensed moisture and wastewater, producing potable drinking water. A laboratory-size filter was developed that was capable of removing greater than 99.9999 percent of bacteria and viruses. Such removal was achieved at flow rates hundreds of times greater than those through ultraporous membranes that remove particles by sieving. Because the pore size of the new filter is rather large compared with ultraporous membranes, it was found to be more resistant to clogging. Additionally, a full-size cartridge is being developed that is capable of serving a full space crew. During this ongoing effort, research demonstrated that the filter medium is a very efficient adsorbent for DNA (deoxyribonucleic acid), RNA (ribonucleic acid), and endotoxins. Since the adsorption is based on the charge of the macromolecules, there is also a potential for separating proteins and other particulates on the basis of their charge differences. The separation of specific proteins is a major new thrust of biotechnology. The principal application of NanoCeram filters is based on their ability to remove viruses from water. The removal of more than 99.9999 percent of viruses was achieved by a NanoCeram polishing filter added to the effluent of an existing filtration device. NanoCeram is commercially available in laboratory-size filter discs and in the form of a syringe filter. The unique characteristic of the filter can be demonstrated by its ability to remove particulate dyes such as Metanyl yellow.

  19. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. An introduction of two level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  20. Model for optimising the execution of anti-spam filters

    Directory of Open Access Journals (Sweden)

    David Ruano-Ordás

    2016-12-01

Full Text Available During the last years, the combination of several filtering techniques for the development of anti-spam systems has gained enormous popularity. However, although the accuracy achieved by these models has increased considerably, their use has entailed the emergence of new challenges, such as the need to reduce the excessive use of computational resources, to increase filtering speed and to adjust the weights used for the combination of several filtering techniques. In order to achieve this goal we have refined several aspects, including: (i) the design and development of small technical improvements to increase the overall performance of the filter, (ii) the application of genetic algorithms to increase filtering accuracy and (iii) the use of scheduling algorithms to improve filtering throughput.
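The weighted combination such models optimize can be sketched minimally as below. The weights here are fixed illustrative values; in the paper they are the quantities tuned by the genetic algorithm, and all names are hypothetical.

```python
def combine_filters(scores, weights, threshold=0.5):
    """Weighted average of per-filter spam scores in [0, 1].

    Each entry of `scores` is one filtering technique's estimate that
    the message is spam; `weights` encodes how much each technique is
    trusted. Returns (label, combined score)."""
    weighted = sum(w * s for w, s in zip(weights, scores))
    score = weighted / sum(weights)
    return ("spam" if score >= threshold else "ham", score)

# Two filters flag the message, one does not; the trusted filters win.
label, score = combine_filters(scores=[0.9, 0.8, 0.1], weights=[2.0, 1.0, 1.0])
```

A weight-tuning step would then search weight vectors that maximize accuracy on a labelled corpus, which is the role the abstract assigns to genetic algorithms.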

  1. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  2. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  3. Experimental study of filter cake formation on different filter media

    International Nuclear Information System (INIS)

    Saleem, M.

    2009-01-01

Removal of particulate matter from gases generated in the process industry is important for product recovery as well as emission control. The dynamics of a filtration plant depend on operating conditions. The models that predict filter plant behaviour involve empirical resistance parameters which are usually derived from limited experimental data and are characteristic of the filter media and filter cake (dust deposited on the filter medium). Filter cake characteristics are affected by the nature of the filter media, process parameters and the mode of filter regeneration. Removal of dust particles from air is studied in a pilot-scale jet-pulsed bag filter facility closely resembling industrial filters. Limestone dust and ambient air are used in this study with two widely different filter media. All important parameters, like pressure drop, gas flow rate and dust settling, are recorded continuously at 1 s intervals. The data are processed for estimation of the resistance parameters, and the pressure drop rise on the test filter media is compared. Results reveal that the surface of the filter media has an influence on the pressure drop rise (concave pressure drop rise). A similar effect is produced by a partially jet-pulsed filter surface. Filter behaviour is also simulated using the estimated parameters and a simplified model and compared with the experimental results. Distribution of cake area load is therefore an important aspect of jet-pulse-cleaned bag filter modeling. Mean specific cake resistance remains nearly constant on thoroughly jet-pulse-cleaned membrane-coated filter bags. However, the trend cannot be confirmed without independent cake height and density measurements. Thus the results reveal the importance of independent measurements of cake resistance. (author)
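The empirical resistance parameters mentioned above typically enter the standard Darcy-law cake filtration relation, sketched below as a generic model (not the authors' estimation code; the numerical values are illustrative assumptions).

```python
def cake_pressure_drop(mu, velocity, r_medium, alpha, cake_load):
    """Darcy-law cake filtration model:

        dP = mu * v * (R_m + alpha * w)

    mu:        gas viscosity [Pa s]
    velocity:  filtration face velocity v [m/s]
    r_medium:  clean filter-medium resistance R_m [1/m]
    alpha:     mean specific cake resistance [m/kg]
    cake_load: areal cake mass w [kg/m^2]
    Returns the pressure drop in Pa."""
    return mu * velocity * (r_medium + alpha * cake_load)

# Illustrative values: air at ~20 C, 2 cm/s face velocity,
# moderate medium resistance and a thin limestone cake.
dp = cake_pressure_drop(mu=1.8e-5, velocity=0.02,
                        r_medium=1e10, alpha=1e11, cake_load=0.05)
```

Fitting R_m and alpha to measured dP-versus-w curves is what the pilot-plant data processing described in the abstract amounts to; a concave pressure-drop rise signals that alpha is not constant over the cycle.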

  4. Validation of search filters for identifying pediatric studies in PubMed.

    Science.gov (United States)

    Leclercq, Edith; Leeflang, Mariska M G; van Dalen, Elvira C; Kremer, Leontien C M

    2013-03-01

To identify and validate PubMed search filters for retrieving studies including children and to develop a new pediatric search filter for PubMed. We developed 2 different datasets of studies to evaluate the performance of the identified pediatric search filters, expressed in terms of sensitivity, precision, specificity, accuracy, and number needed to read (NNR). An optimal search filter will have a high sensitivity and high precision with a low NNR. In addition to the PubMed Limits: All Child: 0-18 years filter (in May 2012 renamed to PubMed Filter Child: 0-18 years), 6 search filters for identifying studies including children were identified: 3 developed by Kastner et al, 1 developed by BestBets, 1 by the Child Health Field, and 1 by the Cochrane Childhood Cancer Group. Three search filters (Cochrane Childhood Cancer Group, Child Health Field, and BestBets) had the highest sensitivity (99.3%, 99.5%, and 99.3%, respectively) but a lower precision (64.5%, 68.4%, and 66.6%, respectively) compared with the other search filters. Two Kastner search filters had a high precision (93.0% and 93.7%, respectively) but a low sensitivity (58.5% and 44.8%, respectively); they failed to identify many pediatric studies in our datasets. The search terms responsible for false-positive results in the reference dataset were determined. With these data, we developed a new search filter for identifying studies with children in PubMed with an optimal sensitivity (99.5%) and precision (69.0%). Existing search filters to identify studies including children either have a low sensitivity or a low precision with a high NNR. A new pediatric search filter with a high sensitivity and a low NNR has been developed. Copyright © 2013 Mosby, Inc. All rights reserved.
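The performance measures used to compare the filters all derive from a 2x2 classification table of retrieved versus relevant records. A short sketch with illustrative counts (not the study's data; in particular NNR is the reciprocal of precision):

```python
def search_filter_metrics(tp, fp, fn, tn):
    """Standard retrieval metrics from true/false positives/negatives."""
    sensitivity = tp / (tp + fn)          # share of pediatric studies found
    precision = tp / (tp + fp)            # share of retrieved that are pediatric
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    nnr = 1.0 / precision                 # number needed to read per relevant hit
    return {"sensitivity": sensitivity, "precision": precision,
            "specificity": specificity, "accuracy": accuracy, "nnr": nnr}

# Hypothetical filter: 90 pediatric studies retrieved, 10 missed,
# 10 irrelevant records retrieved, 890 correctly excluded.
m = search_filter_metrics(tp=90, fp=10, fn=10, tn=890)
```

The trade-off the abstract reports follows directly: raising sensitivity usually admits more false positives, which lowers precision and raises the NNR.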

  5. Evaluation of Alternative Filter Media for the Rotary Microfilter

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, M. R.; Herman, D. T.; Bhave, R.

    2011-11-09

The Savannah River Site is currently developing and testing several processes to treat high level radioactive liquid waste. Each of these processes has a solid-liquid separation process that limits its throughput. Savannah River National Laboratory researchers identified and tested the rotary microfilter as a technology to increase solid-liquid separation throughput. The authors believe the rotary microfilter throughput can be improved by using a better filter membrane. Previous testing showed that asymmetric filters composed of a ceramic membrane on top of a stainless steel support produced higher filter flux than 100% stainless steel symmetric filters in crossflow filter tests. Savannah River National Laboratory and Oak Ridge National Laboratory are working together to develop asymmetric ceramic-stainless steel composite filters and asymmetric 100% stainless steel filters to improve the throughput of the rotary microfilter. The Oak Ridge National Laboratory Inorganic Membrane Group fabricated samples of alternative filter membranes. In addition, Savannah River National Laboratory obtained samples of filter membranes from Pall, Porvair, and SpinTek. They tested these samples in a static test cell with feed slurries containing monosodium titanate and simulated sludge.

  6. Evaluation of Alternative Filter Media for the Rotary Microfilter

    International Nuclear Information System (INIS)

    Poirier, M. R.; Herman, D. T.; Bhave, R.

    2011-01-01

    The Savannah River Site is currently developing and testing several processes to treat high level radioactive liquid waste. Each of these processes has a solid-liquid separation process that limits its throughput. Savannah River National Laboratory researchers identified and tested the rotary microfilter as a technology to increase solid-liquid separation throughput. The authors believe the rotary microfilter throughput can be improved by using a better filter membrane. Previous testing showed that asymmetric filters composed of a ceramic membrane on top of a stainless steel support produced higher filter flux than 100% stainless steel symmetric filters in crossflow filter tests. Savannah River National Laboratory and Oak Ridge National Laboratory are working together to develop asymmetric ceramic-stainless steel composite filters and asymmetric 100% stainless steel filters to improve the throughput of the rotary microfilter. The Oak Ridge National Laboratory Inorganic Membrane Group fabricated samples of alternative filter membranes. In addition, Savannah River National Laboratory obtained samples of filter membranes from Pall, Porvair, and SpinTek. They tested these samples in a static test cell with feed slurries containing monosodium titanate and simulated sludge

  7. Model-Based Control of a Nonlinear Aircraft Engine Simulation using an Optimal Tuner Kalman Filter Approach

    Science.gov (United States)

    Connolly, Joseph W.; Csank, Jeffrey Thomas; Chicatelli, Amy; Kilver, Jacob

    2013-01-01

    This paper covers the development of a model-based engine control (MBEC) methodology featuring a self tuning on-board model applied to an aircraft turbofan engine simulation. Here, the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) serves as the MBEC application engine. CMAPSS40k is capable of modeling realistic engine performance, allowing for a verification of the MBEC over a wide range of operating points. The on-board model is a piece-wise linear model derived from CMAPSS40k and updated using an optimal tuner Kalman Filter (OTKF) estimation routine, which enables the on-board model to self-tune to account for engine performance variations. The focus here is on developing a methodology for MBEC with direct control of estimated parameters of interest such as thrust and stall margins. Investigations using the MBEC to provide a stall margin limit for the controller protection logic are presented that could provide benefits over a simple acceleration schedule that is currently used in traditional engine control architectures.

  8. MST Filterability Tests

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Burket, P. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Duignan, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-03-12

The Savannah River Site (SRS) is currently treating radioactive liquid waste with the Actinide Removal Process (ARP) and the Modular Caustic Side Solvent Extraction Unit (MCU). The low filter flux through the ARP has limited the rate at which radioactive liquid waste can be treated. Recent filter flux has averaged approximately 5 gallons per minute (gpm). Salt Batch 6 has had a lower processing rate and required frequent filter cleaning. Savannah River Remediation (SRR) has a desire to understand the causes of the low filter flux and to increase ARP/MCU throughput. In addition, at the time the testing started, SRR was assessing the impact of replacing the 0.1 micron filter with a 0.5 micron filter. This report describes testing of MST filterability to investigate the impact of filter pore size and MST particle size on filter flux, and testing of filter enhancers to attempt to increase filter flux. The authors constructed a laboratory-scale crossflow filter apparatus with two crossflow filters operating in parallel: one a 0.1 micron Mott sintered SS filter and the other a 0.5 micron Mott sintered SS filter. They also constructed a dead-end filtration apparatus to conduct screening tests with potential filter aids and body feeds, referred to as filter enhancers. The original baseline for ARP was 5.6 M sodium salt solution with a free hydroxide concentration of approximately 1.7 M. ARP has been operating with a sodium concentration of approximately 6.4 M and a free hydroxide concentration of approximately 2.5 M. SRNL conducted tests varying the concentration of sodium and free hydroxide to determine whether those changes had a significant effect on filter flux. The feed slurries for the MST filterability tests were composed of simple salts (NaOH, NaNO2, and NaNO3) and MST (0.2-4.8 g/L). The feed slurry for the filter enhancer tests contained simulated salt batch 6 supernate, MST, and filter enhancers.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES--PUREM NORTH AMERICA LLC, PMF GREENTEC 1004205.00.0 DIESEL PARTICULATE FILTER

    Science.gov (United States)

    The U.S. EPA has created the Environmental Technology Verification (ETV) program to provide high quality, peer reviewed data on technology performance to those involved in the design, distribution, financing, permitting, purchase, and use of environmental technologies. The Air Po...

  10. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Budzien, Joanne Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferguson, Jim Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Harwell, Megan Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hickmann, Kyle Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Israel, Daniel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Magrogan, William Richard III [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Singleton, Jr., Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Srinivasan, Gowri [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walter, Jr, John William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Woods, Charles Nathan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  11. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study... of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  12. Component Verification and Certification in NASA Missions

    Science.gov (United States)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  13. Advanced Technologies for Design Information Verification

    International Nuclear Information System (INIS)

    Watkins, Michael L.; Sheen, David M.; Rose, Joseph L.; Cumblidge, Stephen E.

    2009-01-01

    This paper discusses several technologies that have the potential to enhance facility design information verification. These approaches have shown promise in addressing the challenges associated with the verification of sub-component geometry and material composition for structures that are not directly accessible for physical inspection. A simple example is a pipe that extends into or through a wall or foundation. Both advanced electromagnetic and acoustic modalities will be discussed. These include advanced radar imaging, transient thermographic imaging, and guided acoustic wave imaging. Examples of current applications are provided. The basic principles and mechanisms of these inspection techniques are presented along with the salient practical features, advantages, and disadvantages of each technique. Other important considerations, such as component geometries, materials, and degree of access are also treated. The importance of, and strategies for, developing valid inspection models are also discussed. Beyond these basic technology adaptation and evaluation issues, important user interface considerations are outlined, along with approaches to quantify the overall performance reliability of the various inspection methods.

  14. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.
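The verification step described above amounts to comparing independently measured radionuclide activities against the generator's manifest and the site's acceptance limits. A minimal sketch of such a check, with hypothetical nuclides, activities, limits, and tolerance (none of these values come from the record):

```python
def verify_package(measured, manifest, limits, tolerance=0.20):
    """Flag nuclides whose measured activity disagrees with the declared
    manifest value by more than `tolerance` (fractional), or exceeds the
    site's waste acceptance limit."""
    findings = []
    for nuclide, activity in measured.items():
        declared = manifest.get(nuclide, 0.0)
        if declared and abs(activity - declared) / declared > tolerance:
            findings.append((nuclide, "manifest mismatch"))
        if activity > limits.get(nuclide, float("inf")):
            findings.append((nuclide, "exceeds acceptance limit"))
    return findings

# Hypothetical activities (Ci per package):
measured = {"Cs-137": 0.52, "Co-60": 0.10}
manifest = {"Cs-137": 0.50, "Co-60": 0.25}
limits = {"Cs-137": 1.0, "Co-60": 1.0}
print(verify_package(measured, manifest, limits))
# -> [('Co-60', 'manifest mismatch')]: 0.10 vs declared 0.25 is >20% off
```

A real system would of course derive `measured` from assay instrumentation and propagate measurement uncertainty, which this sketch omits.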

  15. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  16. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  17. A Verification Study on the Loop-Breaking Logic of FTREX

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2008-01-01

    The logical loop problem in fault tree analysis (FTA) has been solved by breaking circular logics either manually or automatically. The breaking of logical loops is one of the sources of uncertainty in fault tree analyses. A practical method that can verify fault tree analysis results was developed by Choi. The method can handle logical loop problems and has been implemented in a FORTRAN program called the VETA (Verification and Evaluation of fault Tree Analysis results) code. FTREX, a well-known fault tree quantifier developed by KAERI, has an automatic loop-breaking logic. To confirm the correctness of the loop-breaking logic of FTREX, some typical trees with complex loops were developed and applied in this study. This paper presents verification results for the loop-breaking logic tested with the VETA code.
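The loop-breaking idea can be illustrated on a toy gate graph: a cycle among gates (circular logic) is detected by depth-first traversal and broken by dropping the input that closes it. A minimal sketch, assuming gates are a dict mapping each gate to its inputs (this is a simplified stand-in; FTREX and VETA apply their own loop-breaking rules):

```python
def break_loops(gates):
    """Return a copy of the fault-tree gate graph with each logical loop
    broken by removing the back-edge that closes the cycle."""
    broken = {g: list(inputs) for g, inputs in gates.items()}
    visited, on_path = set(), set()

    def dfs(gate):
        on_path.add(gate)
        for inp in list(broken.get(gate, [])):
            if inp in on_path:              # back-edge: circular logic
                broken[gate].remove(inp)    # break the loop here
            elif inp not in visited:
                dfs(inp)
        on_path.discard(gate)
        visited.add(gate)

    for g in gates:
        if g not in visited:
            dfs(g)
    return broken

# G1 and G2 feed each other: a classic logical loop.
tree = {"G1": ["G2", "B1"], "G2": ["G1", "B2"], "B1": [], "B2": []}
print(break_loops(tree))  # the G2 -> G1 back-edge is removed
```

Because where the cycle is broken depends on traversal order, different tools can produce different (but logically defensible) acyclic trees, which is exactly why independent verification of the loop-breaking step is valuable.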

  18. EVALUATION OF ALTERNATIVE FILTER MEDIA FOR THE ROTARY MICROFILTER

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, M.; Herman, D.; Bhave, R.

    2011-09-13

    SRS is currently developing and testing several processes to treat high level radioactive liquid waste. These processes include the Integrated Salt Disposition Process (ISDP), the Salt Waste Processing Facility (SWPF), and the Small Column Ion Exchange Process (SCIX). Each of these processes has a solid-liquid separation process that limits its throughput. SRNL researchers identified and tested the rotary microfilter as a technology to increase solid-liquid separation throughput. The testing showed significant improvement in filter flux with the rotary microfilter over the baseline crossflow filter (i.e., 2.5-6.5X during scoping tests, as much as 10X in actual waste tests, and approximately 3X in pilot-scale tests). SRNL received funding from DOE EM-21, and subsequently DOE EM-31, to develop the rotary microfilter for high level radioactive service. The work has included upgrading the rotary microfilter for radioactive service, testing with simulated SRS waste streams, and testing it with simulated Hanford waste streams. While the filtration rate is better than that obtained during testing of crossflow filters, the authors believe the rotary microfilter throughput can be improved by using a better filter membrane. The rotary microfilter membrane is made of stainless steel (Pall PMM050). Previous testing, funded by DOE EM-21, showed that asymmetric filters composed of a ceramic membrane on top of a stainless steel support produced higher filter flux than 100% stainless steel symmetric filters in crossflow filter tests. In that testing, the Pall Accusep and Graver filters produced 13-21% larger filter flux than the baseline 0.1 µm Mott filter. While the improvement in flux is not as dramatic as the improvement of the rotary filter over a crossflow filter, a 13-21% increase could reduce the lifetime of a 30-year process by 4-6 years, with significant cost savings. Subsequent rotary filter testing showed the Pall PMM050 stainless steel filter membrane produced...

  19. Behavior of HEPA filters under high humidity airflows

    International Nuclear Information System (INIS)

    Ricketts, C.I.

    1992-10-01

    To help determine and improve the safety margins of High Efficiency Particulate Air (HEPA) filter units in nuclear facilities under possible accident conditions, the structural limits and failure mechanisms of filters in high-humidity airflows were established, and the fundamental physical phenomena underlying filter failure or malfunction in humid air were identified. Empirical models for increases in filter pressure drop with time in terms of the relevant airstream parameters were also developed. The weaknesses of currently employed humidity countermeasures used in filter protection are discussed, and fundamental explanations for reported filter failures in normal service are given. (orig./DG) [de
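Empirical models of the kind mentioned typically express pressure drop as a growing function of exposure time under a given airstream condition. A hypothetical exponential form, purely to illustrate the shape of such a model (the form and coefficients are illustrative, not fitted values from the report):

```python
import math

def pressure_drop(t_min, dp0=250.0, k=0.015):
    """Hypothetical HEPA filter pressure drop (Pa) versus time (min)
    under high-humidity airflow: dp(t) = dp0 * exp(k * t).
    dp0 and k are illustrative placeholders; the report fits its own
    empirical coefficients to the relevant airstream parameters."""
    return dp0 * math.exp(k * t_min)

for t in (0, 60, 120):
    print(f"t = {t:3d} min: dp = {pressure_drop(t):7.1f} Pa")
```

In a real model, k would itself depend on airstream humidity, velocity, and dust loading, which is where the empirical fitting effort lies.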

  20. Apparatus for Removing Remaining Adhesives of Filter

    International Nuclear Information System (INIS)

    Kang, Il Sik; Kim, Tae Kuk; Hong, Dae Seok; Ji, Young Yong; Ryu, Woo Seog

    2010-01-01

    A large number of ventilation filters are used in radiation areas, not only in nuclear power plants but also in other nuclear facilities. These spent ventilation filters are generated as radioactive waste and are composed of a steel frame, glass fiber media, and aluminum separators. During treatment, the spent filter is separated into the air-purification filter media and the frame. After separation, the filter media is collected in steel drums to reduce internal exposure, while the filter frame is treated further to remove adhesives so that as many frames as possible can be recycled, reducing waste and cost and improving working conditions. Usually, the adhesives are separated from the filter frame manually, which requires a lot of time and labor. The objective of this study is therefore to develop a motor-driven apparatus for removing the adhesives efficiently.